3D Raytracing Chip Shown at CeBIT

A Classic Raytraced Scene

One of the first scenes I ever raytraced, back in 1986 or so…

Slashdot is running a story on a German group that is showing a 3D raytracing chip at CeBIT. The group has apparently constructed an FPGA-based raytracing accelerator board, and is trying to convince companies like nVidia and ATI to adopt their techniques and build raytracing-based game accelerators.

I think their work is interesting, but I’m skeptical.

Raytracing is a very powerful and flexible technique, but it’s an awfully big hammer to crack nuts with. It does have one major advantage: the work it performs grows sublinearly with the size of the input scene, rather than linearly as with object-order algorithms like the Z-buffer. But the constants are remarkably high, and the fact that naive raytracing tends to probe the scene in fairly arbitrary ways makes caching of only limited utility. Looking at the test images they produce, I see lots of aliasing and lots of sharp shadows; skipping antialiasing and soft shadows keeps the number of rays per pixel low, which eases these problems significantly.
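To make the per-ray argument concrete, here is a minimal sketch in C. It is purely illustrative: the structures and names are my own invention and have nothing to do with this group’s actual hardware or papers. It shows the standard trick that makes per-ray work roughly logarithmic, namely a bounding-volume hierarchy in which a single box test can cull an entire subtree, and it also hints at why caching is hard, since neighboring rays can wander into very different parts of the tree.

/* Illustrative sketch only: bounding-volume hierarchy (BVH) traversal.
 * Each ray tests a node's bounding box; a miss culls the whole subtree,
 * so a ray typically touches O(log n) nodes rather than all n primitives.
 * None of this reflects the CeBIT group's actual design. */
#include <stddef.h>

typedef struct { double o[3], d[3]; } Ray;    /* origin and direction */
typedef struct { double lo[3], hi[3]; } AABB; /* axis-aligned bounding box */

typedef struct BVHNode {
    AABB box;
    struct BVHNode *left, *right;  /* NULL for leaf nodes */
    int first_prim, prim_count;    /* primitive range, used at leaves */
} BVHNode;

/* Standard slab test: does the ray hit the box within [tmin, tmax]? */
static int ray_hits_box(const Ray *r, const AABB *b, double tmin, double tmax)
{
    for (int axis = 0; axis < 3; ++axis) {
        double inv = 1.0 / r->d[axis];
        double t0 = (b->lo[axis] - r->o[axis]) * inv;
        double t1 = (b->hi[axis] - r->o[axis]) * inv;
        if (inv < 0.0) { double tmp = t0; t0 = t1; t1 = tmp; }
        if (t0 > tmin) tmin = t0;
        if (t1 < tmax) tmax = t1;
        if (tmax < tmin) return 0;  /* slab intervals no longer overlap */
    }
    return 1;
}

/* Placeholder: real ray/triangle or ray/sphere tests would go here. */
static void intersect_leaf(const Ray *r, int first_prim, int prim_count)
{
    (void)r; (void)first_prim; (void)prim_count;
}

/* Recursive traversal.  Whole subtrees are skipped with one box test,
 * which is where the sublinear per-ray cost comes from; but successive
 * rays may visit very different nodes, which is what hurts caching. */
void trace_ray(const Ray *r, const BVHNode *node)
{
    if (node == NULL || !ray_hits_box(r, &node->box, 0.0, 1e30))
        return;
    if (node->left == NULL && node->right == NULL) {
        intersect_leaf(r, node->first_prim, node->prim_count);
        return;
    }
    trace_ray(r, node->left);
    trace_ray(r, node->right);
}

This is only the per-ray half of the story, of course; building the hierarchy in the first place, especially for a dynamic scene, is its own problem.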

Ultimately, I remain interested, but unconvinced. I’ve seen at least one company start quickly out of the gate, only to be smacked by the cold hand of reality.

But hope springs eternal!

Gratuitous cool link from the thread: Castle Wolfenstein in 5K of Javascript.

4 thoughts on “3D Raytracing Chip Shown at CeBIT”

  1. Dan Lyke

    Oh, look, someone else who’s gotten hardware up to the point where it can render the hard-edged shadows and heavily aliased edges that software game engines were rendering circa 1995! In fact, I don’t even see any effective use of reflections.

    If someone wanted to blow me away with a hardware raytracing demo, they could do it with a couple of teapots over a checkerboard plane and concentrate on image quality. The fact that they didn’t do that says to me that they’re another bunch of upstarts who decided to solve the easy problems because they don’t yet know enough about the issues to solve the hard problems.

    I wish ’em luck and all, but I’m far more excited about what can be done with procedural textures in a Z-buffer-based system.

  2. mmp

    The guys behind this have done some of the most innovative research in real-time CPU-based ray tracing over the past few years. They are certainly not a bunch of upstarts solving easy problems. (Read their research papers if you’re interested.)

    I don’t buy the pro-ray tracing argument that it has sublinear time in the input scene size, for two reasons. First, you need to build acceleration structures, which is at best linear time, so the asymptotic time is really linear.

    Second, while naive Z-buffering is also linear, in practice people store their scenes in 3D data structures, frustum-cull them, and draw them front to back, using hardware occlusion queries to tell when they don’t need to keep sending down geometry that is actually hidden. Thus, in practice, it ends up having a similar sort of sublinear time complexity to ray tracing.

    There are lots of other good reasons that ray tracing is nice, but sublinear running time isn’t one of them.

  3. Dan Lyke

    I have deliberately ignored rendering for 5 years, but you can do a hell of a lot better than front-to-back Z-buffer for hiding, too. Especially if you can pre-process large portions of the scene (i.e., terrain and static elements).

    Which drops back to my skepticism based on those pictures. I’ve been hearing “hardware ray tracing” for a long time. I wrote my first ray tracer (in C) by stealing algorithms from a ray tracer implemented in assembly language on a TI 34010 (about a year after Mark’s picture above, I had an article in DDJ with a tiny renderer that did the classic sphere over a checkerboard). I’ve seen the Transputer phase come and go. I remember when the ART RenderDrive first appeared on the scene (and, frankly, was very surprised to find that it is still on the scene).

    None of those pictures show me why I’d want to use ray tracing. No reflections or refractions(!). No area lights or soft shadows (although, admittedly, when your texture filtering is that bad, hard shadows are kinda cool). If you’re going to show off ray tracing, screw bad game scenes; give me 9 chrome teapots in a checkerboard box.

    Now, I was a skeptic about reasonable price points on hardware for Z-buffers too, and I think it was 1997 when playing with a 3dfx Voodoo card convinced me that even though it was slower (in a per-triangle sense) than what you could do in software on a then-current Pentium, it was close, and the quality was better.

    But the vibe I’m getting from these pictures is roughly what we were doing in software in 1997.

    However, I’m getting back into CG, and I’ll check out their papers; thanks for the recommendation.

  4. Mark (post author)

    Thanks for the comments, Matt.

    You are of course correct on both counts.

    The problem with expressing your thoughts on the web is that there always seems to be someone smarter than you who knows more about things than you do, even when you think of yourself as an expert.
