flipCode - Tech File - Alex Champandard
Alex Champandard

E-Mail: alex@base-sixteen.com

   09/22/1999, Water Contest Entry: Rendering Raytraced Refraction with 3D Hardware

A few people asked me how I managed to get a 2D-looking distortion effect running through the 3D hardware acceleration. I promised to explain, so here we go :)

My approach was to focus all the work on the actual water, rather than the surrounding world. And for me, water is all about refraction and reflection. I also thought about caustics, but real caustics can't be done in real time, and the cheap hack looked lame, so I gave that idea up.

Firstly, I create a voxelised representation of the water. This is just a standard height map, just like most voxel landscape engines use. I hesitated a while over how to generate ripples in that buffer. The choice was between the famous blur-sharpen filter algorithm and a sum-of-sines function. The filter has the advantage of handling any number of ripples at the same time complexity. On the other hand, there are some nasty-looking instabilities in the water once the ripples should have faded out. But most importantly, the result is frame-rate dependent. So I opted for the sum-of-sines function. Its time complexity is a function of the number of ripples, but since using too many ripples looks chaotic, I can get away with only 4 ripples.

The actual function to generate the ripples is quite simple. It's a sine that fades with time and distance.

      amplitude = max_amplitude * distance_fade * time_fade * sin( time_scale )

Both fading parameters are contained within [0..1]. They are computed as linear functions of time and distance respectively. An exponential function also produces nice results, but it's slightly slower, and the difference is barely noticeable to the untrained eye.

Once the voxelised water representation is computed (64x64 by default), I calculate the normals at each point of the water. Increasing the mesh size is quite computationally expensive, since I need one square root to generate each normal.

Now this is the tricky part :) I need to render a textured plane that is distorted by the ripples above it. To do this I use ray tracing. For every corner of each 8x8-pixel block, I find the equation of the 3D line that passes through that pixel, then compute the intersection of that ray with the water plane. For better quality, the normal of the water at that point is found by bilinearly interpolating the 4 closest normals. Finally, I use the refraction equation to find the refracted ray's equation.

      Ai * sin( alpha ) = Bi * sin ( beta )

alpha is the angle from the normal to the incident ray,
beta is the angle from the normal to the refracted ray,
Ai and Bi are the refraction indices of the two substances involved, by default water (1.33) and air (1.00).

Another method I considered was to simply add the normal's X and Z components to the ray's. That's the principle most simple 2D distortion algorithms use. It also looks good and is much quicker to compute, but I wouldn't have had the satisfaction of coding the proper refraction :)
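For comparison, the cheap hack boils down to something like this (names purely illustrative):

```c
/* Simple 2D-style distortion: perturb the texture coordinates (or the
 * ray direction) by the normal's X and Z components, scaled by an
 * arbitrary strength factor. No physics involved. */
void cheap_distort(float u, float v, const float normal[3], float strength,
                   float *out_u, float *out_v)
{
    *out_u = u + normal[0] * strength;
    *out_v = v + normal[2] * strength;
}
```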

Once we have the refracted ray's equation, we simply find its intersection with the floor plane, and compute the texture coordinates. So we end up with texture coordinates every 8x8 pixels. I then use hardware acceleration to approximate the texture coordinates for the pixels in between. To do that I bypass OpenGL's projection capabilities, and draw 8x8 rectangles onto the screen, with the right texture coordinates applied to the 4 corners. The benefit of using the hardware here is that I can linearly interpolate the texture coordinates to approximate the true refraction, which is obviously much quicker. I added an option to change the refraction precision to 1x1, but that turns my realtime demo into an expensive raytracer that happens to use OpenGL for output.
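Conceptually, what the hardware does inside each 8x8 block is just interpolate the four corner texture coordinates across the rectangle. In the demo itself that's one textured screen-space quad per block; the software sketch below is my own, for illustration:

```c
/* Bilinear interpolation of the four corner texture coordinates of an
 * 8x8 block, for the pixel at offset (px, py) inside the block.
 * Corners are ordered: top-left, top-right, bottom-left, bottom-right. */
void interp_block_uv(const float cu[4], const float cv[4],
                     int px, int py, float *u, float *v)
{
    /* Fractional position inside the 8x8 block. */
    float fx = px / 8.0f, fy = py / 8.0f;

    /* Interpolate along the top and bottom edges, then between them. */
    float top_u = cu[0] + (cu[1] - cu[0]) * fx;
    float bot_u = cu[2] + (cu[3] - cu[2]) * fx;
    float top_v = cv[0] + (cv[1] - cv[0]) * fx;
    float bot_v = cv[2] + (cv[3] - cv[2]) * fx;

    *u = top_u + (bot_u - top_u) * fy;
    *v = top_v + (bot_v - top_v) * fy;
}
```

The hardware actually interpolates per triangle rather than per quad, but over these small blocks the difference is invisible.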

Now we have a screen filled with a texture refracted through the water. All that's left is to re-enable OpenGL's projection capabilities, and draw the ripples that reflect a nice environment map.

Now that's the theory. The code itself is pretty fast, but there is still a fair bit of room for improvement, like using a quadtree for the water and clipping it against the camera frustum. But I think that's overkill for a small demo :)

I've updated the demo since the voting... the contest was just an excuse to get that refraction coded, which I always wanted to do. I also had to add a completely new OpenGL subsystem, and fix that stupid bug that prevented me from having music in the entry I submitted for the contest. Anyway, it works now and sounds great... thanks a lot to Steve for the music :)

Download the demo here (336k).


  • This document may not be reproduced in any way without explicit permission from the author and flipCode. All Rights Reserved. Best viewed at a high resolution. The views expressed in this document are the views of the author and NOT necessarily of anyone else associated with flipCode.
