Hi everyone. For the past couple of weeks I have been mostly working on Monter’s movement system, but it’s still heavily WIP so I can’t write about it just yet. What I did finish though is a simple sky shading pass. It turned out quite nice because it gels well with Monter’s simple low-poly art style. So I’m going to write about that instead.
Considering how to draw a sky
The skybox has been a popular choice for many years. The technique packs a pre-rendered skydome into a cubemap, a texture with six faces. A cube geometry is then passed to the renderer with the cubemap texture applied.
However, most skybox images on the internet go for a realistic look, and they rarely pull it off; viewers can easily tell they are fake because they are static images. Since Monter is going for a minimalistic look, I decided to render the sky procedurally; a simple hemispherical gradient will do.
Setting it up
So instead of having some preset geometry represent the sky, I want to project rays out of the viewer and render the sky that way. It gives me finer control over what color to put at each pixel. This sky shading pass happens before anything else, so anything rendered in the scene will overwrite the sky pixels.
The first thing to do is to generate a view ray for each pixel on the screen. To do that, I first pass two triangles to the vertex shader to serve as my view plane, covering the entire screen. Here’s a visualization of the fullscreen quad made out of two triangles in OpenGL’s normalized device coordinates:
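For reference, the vertex shader for this pass can be nearly a no-op, since the quad is already specified in NDC. Here is a minimal sketch (the attribute name is my own):

#version 330 core
layout(location = 0) in vec2 NDCPos; // fullscreen-quad vertex, already in NDC

void main()
{
    // The quad spans [-1, 1] on both axes, so no transform is needed.
    gl_Position = vec4(NDCPos, 0.0, 1.0);
}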
I’m taking advantage of OpenGL’s rendering pipeline here by letting it rasterize these two primitives. After they get rasterized into a bunch of fragments (or pixels), I get to decide how to shade them.
For each pixel, I can take its coordinate on the screen and use it to project a view ray out through the view plane. Note that the following explanation uses a left-handed coordinate system.
First, I compute the view rays at all four corners of the screen. Here’s an illustration:
Imagine the camera is a single point viewing the sky through a plane. To know how big this plane is, we need to set the distance between the camera and the plane, which I call the depth; I set it to 1 here. The depth can actually be any positive value, because the rays will be normalized afterwards. The other important parameter is the FOV (field of view), an angle in radians that represents how wide the eye can see. Given the FOV and the depth, the dimensions of the view plane follow from basic trigonometry: half the plane’s height is depth * tan(FOV / 2), and its width comes from the screen’s aspect ratio.
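The corner-ray computation can be sketched as follows, done once per frame on the CPU or in the vertex shader. Depth, FOV and Aspect are my own names here, with FOV taken as the vertical field of view and Aspect as width over height:

// Half the plane's height comes from the right triangle formed by
// the depth and the half-angle of the vertical FOV.
float HalfHeight = Depth * tan(0.5 * FOV);
float HalfWidth  = HalfHeight * Aspect;

// Left-handed view space: +X right, +Y up, +Z forward.
vec3 TopLeft     = vec3(-HalfWidth,  HalfHeight, Depth);
vec3 TopRight    = vec3( HalfWidth,  HalfHeight, Depth);
vec3 BottomLeft  = vec3(-HalfWidth, -HalfHeight, Depth);
vec3 BottomRight = vec3( HalfWidth, -HalfHeight, Depth);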
With these four corner view rays computed, we can calculate the rest of the view rays at any arbitrary pixel location just by linearly interpolating the corner rays. Luckily we can again exploit the graphics hardware to do this for us. Lastly, they just have to be normalized to unit vectors.
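Concretely, each quad vertex can carry its corner ray as a vertex attribute, and the rasterizer interpolates it across the quad for free. A sketch, with names of my own:

// Vertex shader
layout(location = 1) in vec3 CornerRay;
out vec3 ViewRay;
// ... in main(): ViewRay = CornerRay; // rasterizer interpolates it per fragment

// Fragment shader
in vec3 ViewRay;
// ... in main(): vec3 Ray = normalize(ViewRay); // interpolation shortens the
//                vectors between corners, so renormalize per pixel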
After that, we have a set of view rays covering the entire screen, but they all point in the +Z direction. To point these rays in the player’s view direction, I multiply them by the inverse of the rotation component of the view matrix, which converts them from view space to world space.
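In GLSL this can be as simple as the following sketch, assuming ViewMatrix holds the camera’s view transform; since its rotation part is orthonormal, the inverse is just the transpose:

mat3 InvViewRotation = transpose(mat3(ViewMatrix)); // inverse of a pure rotation
vec3 WorldRay = normalize(InvViewRotation * ViewRay);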
Shading the sky
As I said before, I’d like to shade the skydome as a gradient that interpolates along the hemisphere hovering above the ground.
The skydome gradient:
Say we project each view ray out into the skydome, intersecting it at a point P. The height of P determines what color the pixel is: the closer P is to the ground, the whiter the pixel, and vice versa. (It doesn’t have to be white and blue; any two extreme colors will do, one representing the top of the skydome and the other its bottom.)
Since the view rays are normalized, the Y component of each ray lies in the range [-1, 1], which, after remapping to [0, 1], makes a suitable percentage value for the lerp() function. Furthermore, I can raise the percentage to an exponent for finer control over the gradient transition.
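Put together, the gradient shading might look like this sketch, where HorizonColor, ZenithColor and Exponent are my own tweakable names:

float T = 0.5 * (ViewRay.y + 1.0); // remap Y from [-1, 1] to [0, 1]
vec3 SkyColor = mix(HorizonColor, ZenithColor, pow(T, Exponent));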
A sky gradient after some tweaking:
Drawing the sun
Adding a sun to the sky shading pass is pretty simple since everything is already set up. All we need is the angle between the current view ray and the ray from the viewpoint to the sun, which can be measured with a dot product. If the dot product is greater than a certain threshold, we shade that pixel as part of the sun; otherwise it’s just part of the sky.
A cone for capturing the sun on the sky:
Here’s the sun drawn in the sky:
It works, but if you look closely, the edge of the sun is quite jagged.
Here’s the code that draws the sun with a jagged edge:
vec3 SkyColor = ComputeSkyColor();
if (dot(ViewRay, SunRay) > 0.999)
{
    SkyColor = SunColor;
}
This is equivalent to doing:
vec3 SkyColor = mix(ComputeSkyColor(), SunColor, step(0.999, dot(ViewRay, SunRay)));
Here we can replace step() with smoothstep() so that the band where the sun meets the sky is interpolated smoothly. It essentially anti-aliases the edge by blurring it.
vec3 SkyColor = mix(ComputeSkyColor(), SunColor, smoothstep(0.998, 0.999, dot(ViewRay, SunRay)));
As a result, we get a much fuzzier sun with no jaggies:
Finally, here’s the sky shading pass combined with the bloom effect:
I’d say it looks good enough for a first pass. :)