Reflections are a quick way to add a lot of visual interest to your work. Not just something shiny and sparkly, but real-time dynamic reflections that show changes in the surrounding world are always stunning.

Before continuing, make sure you've read the cubemap tutorial or already understand how cubemaps work.

So, given an object in the world, how do we make its surface reflective? What does that mean? It means that what we see on the surface is light coming from somewhere else in the world. Light is bouncing off this object as opposed to just lighting up the object itself.

Image of light bouncing off ball into eye

We draw the object during the fragment shader stage of the graphics pipeline, where we return a color for each pixel. So... how do we return a color that's based on whatever is around the object? Inside a fragment shader, we have no information about the world around us, and that points to the solution: give the shader information about the world around it.

Rendered Cubemaps

The color to return depends on how the world looks around the object. Wouldn't it be nice if we could create a cubemap of the world at the object's location? If we had that, we could reflect the camera's incoming view direction off the surface and sample the cubemap in the reflected direction, which would give us the correct color.

Image of the world and camera with ball and other object(s)?

Image from above, but camera shooting ray onto ball and labeling reflected vector

Cube-box render of the world above without the sphere

Image of that vector being used in a cubemap of the surrounding world
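The reflect-and-sample idea above fits in just a few lines of shader code. Here's a minimal GLSL fragment shader sketch; the variable names (v_worldPos, v_normal, u_cameraPos, u_skybox) are assumed names, and it assumes the vertex shader passes world-space position and normal.

```glsl
#version 330 core
in vec3 v_worldPos;            // world-space surface position from the vertex shader
in vec3 v_normal;              // world-space surface normal
uniform vec3 u_cameraPos;      // camera position in world space
uniform samplerCube u_skybox;  // the cubemap of the surrounding world
out vec4 fragColor;

void main() {
    // Incoming view direction: from the camera toward this surface point.
    vec3 I = normalize(v_worldPos - u_cameraPos);
    // Bounce it off the surface, then look up that direction in the cubemap.
    vec3 R = reflect(I, normalize(v_normal));
    fragColor = texture(u_skybox, R);
}
```

The built-in reflect(I, N) computes I - 2 * dot(N, I) * N, so the only real work is getting the view direction and normal into the same (world) space as the cubemap.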

So how do we create this cubemap? Well, game engines will provide this for us. Again though, what if we are making a demo and have to make these cubemaps ourselves? It's a bit tricky: we have to do multiple passes, rendering the scene once for each of the six cube faces.

There are a few approaches. We could render this cubemap once right before the demo runs, since the world probably isn't going to change (at least not significantly) during the demo. This is a static (unchanging) cubemap: it's built once and sampled as needed. We can move the camera around and it all looks beautiful!

Image of the 3D scene with a reflective sphere, showing the stuff around it

Now what if we change the color of the ambient light?

Image of the same above, with different light, but the ball is still reflecting light

The problem is that the cubemap isn't responding to the changing light. The cubemap was generated on startup and isn't being recreated every render frame. For a dynamic (adaptive, changing) cubemap, we need to re-render the cubemap every frame. It's really expensive to render the scene six extra times every single frame (and not easy with just a vertex and fragment shader). The ultimate solution is to change the render engine from shading to ray-casting. Ray-casting is a huge topic for a separate tutorial, though.

Brief Note on Rays

Basic rendering colors a model based on the lights nearby. We do that for many models and composite the results into a scene. What if instead we shot physical light-beams from the lights onto the models, the way it works in real life? Then whatever colors bounce off the objects and make it into the camera would be drawn. That's a lot of wasted work, though: most of the light rays will never enter the camera...

What if we shot rays of light out of the camera instead, and calculated the color based on whatever each ray hits? That's ray-casting in a nutshell. We shoot a single ray out for each pixel on the screen and then draw the accumulation of what it hits, shaded by the lights nearby.
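The one-ray-per-pixel idea can be sketched in a single GLSL fragment shader. This is just an illustration, not a full renderer: the hard-coded sphere, the directional light, and the u_resolution uniform are all assumptions.

```glsl
#version 330 core
uniform vec2 u_resolution;  // viewport size in pixels (assumed uniform)
out vec4 fragColor;

void main() {
    // One ray per pixel: camera at the origin looking down -Z.
    vec2 uv = (gl_FragCoord.xy / u_resolution) * 2.0 - 1.0;
    vec3 rayOrigin = vec3(0.0);
    vec3 rayDir = normalize(vec3(uv, -1.0));

    // Intersect a sphere at (0, 0, -3) with radius 1:
    // solve |rayOrigin + t * rayDir - center|^2 = r^2 for t.
    vec3 center = vec3(0.0, 0.0, -3.0);
    vec3 toCenter = center - rayOrigin;
    float b = dot(toCenter, rayDir);
    float disc = b * b - dot(toCenter, toCenter) + 1.0;

    if (disc < 0.0) {
        // The ray misses everything: draw the background.
        fragColor = vec4(0.0, 0.0, 0.0, 1.0);
    } else {
        float t = b - sqrt(disc);  // distance to the nearest hit
        vec3 hit = rayOrigin + t * rayDir;
        vec3 normal = normalize(hit - center);
        // Shade the hit point from the lights nearby (one directional light here).
        float diffuse = max(dot(normal, normalize(vec3(1.0, 1.0, 0.5))), 0.0);
        fragColor = vec4(vec3(diffuse), 1.0);
    }
}
```

Real ray-casting engines do this against the whole scene, with bounces, shadows, and more, which is exactly why it's a tutorial of its own.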


Refractions

Refraction is surprisingly just like the reflection above, but with one minor tweak. Instead of the ray from the camera bouncing off the model, it bends into the model. How much it bends is a physical property of the material the object is made of. For our purposes, though, we only need to change a single word of code. Change reflect to refract and that's really it!
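Here's what that one-word swap looks like, assuming the same kind of inputs a reflective shader would have (v_worldPos, v_normal, u_cameraPos, and the u_skybox cubemap are assumed names). The built-in refract takes one extra argument: the ratio of the indices of refraction of the two media.

```glsl
// Ratio of refractive indices: 1.0 / 1.52 is roughly air into glass.
// Pick the value for whatever material you're simulating.
vec3 I = normalize(v_worldPos - u_cameraPos);
vec3 R = refract(I, normalize(v_normal), 1.0 / 1.52);
fragColor = texture(u_skybox, R);
```

Note that refract can return the zero vector at grazing angles (total internal reflection), which simply samples one fixed cubemap texel; a fancier shader would fall back to reflect in that case.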

Image of the same reflective sphere, but now refractive


© Bitzawolf 2019