I haven't done any graphics programming in way too long, so tonight I
finally decided to get around to playing with an idea for a
hardware-accelerated radiosity algorithm I've been pondering for a while.
These are two screenshots from the 6-hour hacking session, both with two
passes of my algorithm applied.
The algorithm is actually quite simple. Basically, for every vertex in the
scene, render a low-resolution version of the scene from its viewpoint.
Take the average of all the pixels and add in the surface's emission, and
there you have the light value. I played with various weighted averages to
try to account for angle of incidence and such, but I actually got the best
results by just taking an unweighted average of the pixels on a 128x128
image rendered with a 60-degree field of view.
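The per-vertex gathering pass described above can be sketched roughly like this. This is only an illustration, not the actual implementation: `render_from` stands in for the real GPU render-and-readback (a 128x128 framebuffer grab in the engine), and the vertex/emission data structures are made up for the example.

```python
import numpy as np

RES = 128  # low-resolution render target, as in the post

def render_from(vertex, scene):
    """Placeholder for rendering the scene from this vertex's viewpoint
    with a 60-degree field of view and reading back the pixels.
    Returns a (RES, RES, 3) float array of RGB values."""
    # In the real engine this would be a GPU render plus a framebuffer read.
    return scene.get(vertex, np.zeros((RES, RES, 3)))

def radiosity_pass(vertices, emission, scene):
    """One pass: each vertex's light value is the unweighted average of
    all pixels in its rendered view, plus the surface's own emission."""
    light = {}
    for v in vertices:
        pixels = render_from(v, scene)
        light[v] = pixels.reshape(-1, 3).mean(axis=0) + emission[v]
    return light
```

Running a second pass just means re-rendering the scene lit with the light values from the first pass and gathering again, which is what makes the indirect bounces show up.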
The algorithm is pretty slow. Each pass took about a minute and a half for
the top image (the bottom image was a lot faster, though I didn't time it).
However, once the lighting is computed, it renders at about 150 FPS on my
machine (Matrox G400 running under Linux on a Duron 850). Of course, after
the radiosity is calculated, the engine is a pure polygon-pusher, so the
framerate isn't really *that* impressive. :)
A fun thing about the implementation is that you get to watch the radiosity
calculations as they're going. I got pretty dizzy when it was working on
the spheres though. :)