
Post a Comment On: C0DE517E

"Offline"

4 Comments
Blogger Unknown said...

Ray packets useless? Are you kidding?

February 4, 2009 at 1:53 AM

Blogger DEADC0DE said...

I don't see any good in ray packets; to me they're just an expression of the confused state that RTRT research is in nowadays.

Ok, that's totally unfair: researchers do a great job. They investigate both great and not-so-great ideas, and it's obvious that most of them turn out to be impractical; that's the whole point of research. For the practical stuff there's applied research done by software companies...

So why is raytracing cool (http://www.sci.utah.edu/rt08/talks/haines08ray.pdf)? We have to decide!

It can be cool because we can shoot arbitrary rays, integrate over paths, and do global illumination and a lot of cool effects easily... but for sure it's not cool for rendering stuff that exhibits a lot of coherency, because there rasterization easily kills it.

There was this naive idea that raytracing was cool because each ray was independent, and so it provided the opportunity for plenty of parallelism. I guess that for years rasterization gurus were laughing at that while developing GPU chips that achieved incredible results with exactly the opposite principle: parallelism is cool when there's a lot of coherency, because memory is painfully slow.

Now CPUs are following the "GPU path" too, so it's evident that streaming architectures are the way forward. And so RTRT research shifted towards them, trying to add coherency back into its incoherent queries... But then you get into the territory where rasterization wins easily, and so it does. Now with bizarre algorithms (so much for raytracing's simplicity) and a cluster of CPUs/GPUs you can do, more slowly, what rasterization did years ago... but hey, look at those totally cool chrome spheres!
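The coherency trade-off above is easy to see in code. A packet traversal amortizes each acceleration-structure node test across several rays, which only pays off when those rays agree on the path through the tree. Here's a minimal sketch of the idea, with hypothetical types not taken from any real tracer:

```cpp
#include <array>
#include <algorithm>

// Hypothetical minimal types for illustration only.
struct Ray  { float ox, oy, oz, dx, dy, dz; };
struct AABB { float min[3], max[3]; };

// Classic slab test for one ray against one box.
bool hitBox(const Ray& r, const AABB& b) {
    float tmin = 0.0f, tmax = 1e30f;
    const float o[3] = { r.ox, r.oy, r.oz };
    const float d[3] = { r.dx, r.dy, r.dz };
    for (int a = 0; a < 3; ++a) {
        float inv = 1.0f / d[a];
        float t0 = (b.min[a] - o[a]) * inv;
        float t1 = (b.max[a] - o[a]) * inv;
        if (inv < 0.0f) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
    }
    return tmin <= tmax;
}

// Packet version: the node is fetched from memory once for 4 rays, and
// the loop body is the kind of thing SIMD units like. Returns true if
// ANY ray hits, so a traversal descends once on behalf of the whole
// packet -- great when the rays are coherent, wasted work when each ray
// wants to go its own way (the incoherent secondary-ray case).
bool hitBoxPacket(const std::array<Ray, 4>& rays, const AABB& b) {
    bool any = false;
    for (const Ray& r : rays) any |= hitBox(r, b);
    return any;
}
```

The sketch keeps the per-ray test scalar for clarity; a real packet tracer would lay the four rays out in structure-of-arrays form and do the slab test with SIMD intrinsics, which is exactly where the "add coherency back" pressure comes from.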

Again, I'm being unfair: of course ray packets are interesting, and of course raytracing research on GPUs is too, because maybe we'll find some sort of middle ground, a good mapping of raytracing to the streaming paradigm.

But here and now, ray packets are hyped and useless. Ray tracing is still cool for offline rendering, yet I don't see much research in that anymore, and it's cool to hear about data structures geared towards SIMD and incoherent rays, that is to say, current processors and current raytracing applications. That's _useful_.

February 4, 2009 at 12:03 PM

Blogger Unknown said...

Wow, your reply could have been a full blog post :) Well, I'm in the process of becoming an RTRT guy myself, because of my graduation project. From practice, I know there are applications in the RTRTracer I'm working on where ray packets increase performance incredibly, and applications where they don't. Anyway, I believe things will get really interesting with Larrabee and hybrid rasterization/RTRT approaches. We'll see.

February 4, 2009 at 12:26 PM

Blogger DEADC0DE said...

That's cool: the more RTRT researchers there are, the more likely it is I'll see something that changes my mind on the status of the research :)

February 4, 2009 at 1:43 PM
