Moving the Heavens: An Artistic and Technical Look at the Skies of The Last of Us: https://www.youtube.com/watch?v=o66p1QDH7aI
In short, atmospheric scattering is the process by which light is scattered away as it travels from a light source to a point. The light arriving at the point is the product of the light at the source and the transmittance between the source and the point. Transmittance is determined by the optical depth (the atmospheric density integrated along the path between the source and the point) and the scattering constants: it is the exponential of the negative product of the optical depth and the scattering constants.
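Written out, with symbols of my choosing rather than the article's: let L_0 be the light at the source, ρ the atmospheric density, D(a, b) the optical depth between source a and point b, and β the scattering constants:

```latex
L(b) = L_0 \, T(a, b), \qquad
T(a, b) = e^{-\beta \, D(a, b)}, \qquad
D(a, b) = \int_a^b \rho(x)\,\mathrm{d}x
```

The minus sign in the exponent is what makes transmittance fall toward zero as optical depth grows.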
Atmospheric scattering is the basis for simulating sky color: the simulation integrates atmospheric scattering along the light's path from the sun to a view point in the atmosphere. Specifically, the sky color along a view direction from a view point in the atmosphere is the integral of the light in-scattered toward the view point at each sample on the view ray (which starts at the view point and extends along the view direction). The light at each sample is the sunlight in-scattered from the sun to that sample. Chaining this process with a ray-marching algorithm, the sky color for a given view direction from any view point in the atmosphere can be approximated in the following steps:
After the ray-marching algorithm has iterated over all samples along the view ray, the final sky color is obtained from the ninth step (ix).
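Since the nine-step breakdown itself isn't reproduced above, here is a minimal single-scattering ray-march sketch of the same idea. All constants and helper names (densityAt, opticalDepth, skyColor) are my assumptions, not the article's code:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };
static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 mul(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
static double len(Vec3 a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

static const double kGroundRadius = 6360e3;  // assumed planet radius (m)
static const double kTopRadius    = 6460e3;  // assumed top of atmosphere (m)
static const double kScaleHeight  = 8000.0;  // assumed Rayleigh scale height (m)
static const Vec3   kBeta = {5.8e-6, 13.5e-6, 33.1e-6};  // assumed Rayleigh constants (1/m)

// Atmospheric density falls off exponentially with altitude.
static double densityAt(Vec3 p) {
    return std::exp(-(len(p) - kGroundRadius) / kScaleHeight);
}

// Optical depth: density integrated along the segment a -> b, by marching.
static double opticalDepth(Vec3 a, Vec3 b, int steps) {
    Vec3 step = mul(add(b, mul(a, -1.0)), 1.0 / steps);
    double segLen = len(step), depth = 0.0;
    for (int i = 0; i < steps; ++i)
        depth += densityAt(add(a, mul(step, i + 0.5))) * segLen;
    return depth;
}

// Accumulate sunlight in-scattered at each sample on the view ray,
// attenuated along both the sun->sample and sample->eye paths.
static Vec3 skyColor(Vec3 eye, Vec3 viewDir, Vec3 sunDir, double rayLen) {
    const int kSamples = 16;
    double stepLen = rayLen / kSamples;
    Vec3 sum = {0, 0, 0};
    for (int i = 0; i < kSamples; ++i) {
        Vec3 p = add(eye, mul(viewDir, (i + 0.5) * stepLen));
        // Marching a fixed distance toward the sun stands in for finding
        // the exact exit point at the top of the atmosphere.
        double dSun = opticalDepth(p, add(p, mul(sunDir, kTopRadius - kGroundRadius)), 8);
        double dEye = opticalDepth(eye, p, 8);
        double rho  = densityAt(p);
        sum.x += rho * std::exp(-kBeta.x * (dSun + dEye)) * stepLen;
        sum.y += rho * std::exp(-kBeta.y * (dSun + dEye)) * stepLen;
        sum.z += rho * std::exp(-kBeta.z * (dSun + dEye)) * stepLen;
    }
    // Scale by the scattering constants; the phase function and the sun
    // intensity are omitted to keep the sketch short.
    return {sum.x * kBeta.x, sum.y * kBeta.y, sum.z * kBeta.z};
}

int main() {
    Vec3 eye = {0, kGroundRadius + 1.0, 0};
    Vec3 c = skyColor(eye, {0, 1, 0}, {0, 1, 0}, kTopRadius - kGroundRadius);
    std::printf("zenith color: %g %g %g\n", c.x, c.y, c.z);
}
```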
Figure 1: output of a C++ program simulating hand-drawn strokes.
Figure 2: reference hand-drawn strokes, from the movie “The Little Prince”.
So, it’s possible to “draw” crappy strokes by programming! Figure 1 is generated by a C++ program I wrote to simulate the strokes in Figure 2.
The idea is simple: define the size of a stroke by a width and a height, then randomly generate the stroke’s starting point and direction within that size. Finally, draw the stroke into an image by rasterizing the line. While drawing the stroke, jitter each pixel to be rasterized, and draw additional pixels stretching to the sides of the jittered pixel with a random width. The intensities of these pixels are randomized as well.
Figure 1 was generated by drawing 128 strokes sized 400×50 in an 800×600 image, as sketched below.
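Here is a minimal C++ sketch of that recipe, writing a grayscale PGM image; the exact jitter ranges, intensity ranges, and the PGM output are my choices, not necessarily those of the original program:

```cpp
#include <cmath>
#include <cstdio>
#include <cstdlib>
#include <vector>

int main() {
    const int W = 800, H = 600;             // image size, as in Figure 1
    const int strokeW = 400, strokeH = 50;  // stroke bounding size
    std::vector<unsigned char> img(W * H, 255);  // white background

    auto frand = [] { return std::rand() / (double)RAND_MAX; };

    for (int s = 0; s < 128; ++s) {  // 128 strokes, as in Figure 1
        // Random start point, keeping the stroke's bounds inside the image.
        int x0 = std::rand() % (W - strokeW);
        int y0 = strokeH + std::rand() % (H - 2 * strokeH);
        // Random direction within the stroke's width/height bounds.
        double dx = strokeW, dy = (frand() * 2.0 - 1.0) * strokeH;
        double steps = std::sqrt(dx * dx + dy * dy);
        for (int i = 0; i < (int)steps; ++i) {
            double t = i / steps;
            // Jitter the rasterized pixel.
            int px = x0 + (int)(dx * t) + std::rand() % 3 - 1;
            int py = y0 + (int)(dy * t) + std::rand() % 3 - 1;
            // Stretch extra pixels to both sides with a random width
            // and randomized intensities.
            int side = 1 + std::rand() % 3;
            for (int k = -side; k <= side; ++k) {
                int y = py + k;
                if (px < 0 || px >= W || y < 0 || y >= H) continue;
                unsigned char v = (unsigned char)(frand() * 160);
                if (v < img[y * W + px]) img[y * W + px] = v;  // darker wins
            }
        }
    }

    // Write a binary PGM so the result can be viewed directly.
    FILE* f = std::fopen("strokes.pgm", "wb");
    std::fprintf(f, "P5\n%d %d\n255\n", W, H);
    std::fwrite(img.data(), 1, img.size(), f);
    std::fclose(f);
    return 0;
}
```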
Human emotion is strongly driven by lighting. To evoke a particular emotion, getting the lighting right is the first step.
Color to Emotion:
Neuroscience:
Color:
As shown in the figure above, the color of the sky after sunset, from the apex to the center, is blue, orange, blue. With only the sun as a light source, there shouldn’t be blue at the apex: as shown in the following figure, sunlight is attenuated as the photons’ traveling distance increases (e.g., for the green-blue sun in the figure, the traveling distance increases from a to b), so its color shifts from blue toward red. My question is: why is there still blue at the apex? Are there light sources other than the sun?
Building on the definition of the coordinate system of a ray-marching point in the atmosphere from a previous article, this article discusses how to transform from one point’s space to another’s during ray marching.
Consider a point p whose coordinate in its own space is (0, r + h). Starting from p, if we move it to p_prime along a direction with vertical angle theta (measured in p’s space), we obtain p_prime expressed in p’s space. How, then, do we get p_prime’s coordinate and the direction’s vertical angle in p_prime’s own space?
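As a hedged sketch of the geometry, in my own notation (planet center at the origin of p’s space, r the planet radius, h the altitude, s the marched distance): p_prime’s altitude follows from its distance to the planet center, and its vertical angle is the angle between the direction and p_prime’s local up, p′/‖p′‖:

```latex
p' = p + s\,d, \qquad p = (0,\; r + h), \qquad d = (\sin\theta,\; \cos\theta) \\
h' = \lVert p' \rVert - r, \qquad
\cos\theta' = \frac{p' \cdot d}{\lVert p' \rVert}
```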
So that’s the basic idea. In practice (i.e., in a GLSL shader), the input is p, represented by a normalized altitude and a normalized vertical angle. The output should be p_prime, with its normalized altitude and normalized vertical angle in its own space. To do this, we need to:
Now let’s do 4) and 6).
In conclusion, the steps for computing p_prime’s normalized altitude and normalized vertical angle in its own space from p’s normalized altitude and normalized vertical angle are:
4), 1), 2), 3), 5).
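Since the numbered steps are summarized above only by their indices, here is one possible end-to-end implementation in C++, under stated assumptions: the normalized altitude is taken to mean h / H_atm (with H_atm the atmosphere thickness) and the normalized vertical angle theta / pi; the constants kR and kHatm are illustrative:

```cpp
#include <algorithm>
#include <cmath>

const double kR    = 6360e3;  // assumed planet radius (m)
const double kHatm = 100e3;   // assumed atmosphere thickness (m)
const double kPi   = 3.14159265358979323846;

// Input: p's normalized altitude and vertical angle, plus the marched
// distance s (meters). Output: p_prime's normalized altitude and
// normalized vertical angle in p_prime's own space.
void movePoint(double hNorm, double thetaNorm, double s,
               double& hNormOut, double& thetaNormOut) {
    double h     = hNorm * kHatm;    // de-normalize altitude
    double theta = thetaNorm * kPi;  // de-normalize vertical angle

    // p and the marching direction in p's space (planet center at origin).
    double py = kR + h;
    double dx = std::sin(theta), dy = std::cos(theta);

    // p_prime in p's space.
    double qx = s * dx, qy = py + s * dy;
    double qLen = std::sqrt(qx * qx + qy * qy);

    // Altitude: distance to the planet center minus the planet radius.
    double hPrime = qLen - kR;
    // Vertical angle: angle between the direction and p_prime's local up.
    double cosT = std::clamp((qx * dx + qy * dy) / qLen, -1.0, 1.0);
    double thetaPrime = std::acos(cosT);

    hNormOut     = hPrime / kHatm;
    thetaNormOut = thetaPrime / kPi;
}
```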
#1 In practice, what we have is shown on the left: a sky dome holding the sky vertices, and the position of the eye.
#2 What we want from #1 is shown on the right: the altitude and vertical angle ‘theta’ of the eye. So, how do we get from the left to the right?
#3 The simple answer to #2 is to take the y value of the eye position in the sky-dome space as the altitude, and to take the vertical angle of the eye ray (cast from the eye position, in the sky dome’s space, toward a sky vertex) as theta. This produces the following graph as the colors of the sky, and raises a problem:
When the eye is located at the sky-dome center, with a given field of view (fov), each fragment’s color is obtained by sampling at the fragment’s center. This gives detailed sky color when the eye looks toward the apex of the sky dome, and coarse sky color when it looks toward the center, because the ellipse-shaped sky is sampled evenly by same-sized fragments. The color detail of the sky decreases as the eye direction moves from the apex toward the center. The same applies when the eye is located above the sky-dome center.
One solution to enrich the color detail close to the center of the sky dome is to supersample the fragments near it, making the number of samples per fragment proportional to the angle between the eye ray and the vertical axis, i.e. the Y axis, as sketched below. However, this supersampling is optional: the quality of the sky color may already be good enough with the regular per-fragment sampling in GLSL.
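A tiny sketch of that rule, where the exact mapping from angle to sample count (and the cap of 16) is my assumption:

```cpp
#include <cmath>

// Samples per fragment grow with the angle between the (normalized) eye
// ray and the vertical Y axis: 1 at the apex, up to 16 near the center.
int samplesPerFragment(double eyeRayY) {
    const double kPi = 3.14159265358979323846;
    double angle = std::acos(eyeRayY);  // 0 toward the apex, pi/2 at the center
    double t = angle / (kPi / 2.0);     // normalize to [0, 1]
    if (t > 1.0) t = 1.0;               // below the horizon: cap at the maximum
    return 1 + static_cast<int>(t * 15.0);
}
```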
#4 With the discussion in #3, we can get the altitude and vertical angle of an eye ray by:
So far, the only problem left is computing the matrix discussed in #4.1.1. The known facts are the eye position X_e in world space, the sky-dome center X_s in world space, and the sky-dome radius r.
Then the eye position in the sky-dome space is (X_e – X_s) / r. The sky-dome space shares the same axis directions as the world space, so the transform can be expressed as the product of a scaling matrix S and a translation matrix T, i.e. ST, where:
T and S here are row-major. In GLSL the matrices must be converted to column-major, which gives

T =
[1, 0, 0, 0,
0, 1, 0, 0,
0, 0, 1, 0,
-x_s, -y_s, -z_s, 1]

S =
[1/r, 0, 0, 0,
0, 1/r, 0, 0,
0, 0, 1/r, 0,
0, 0, 0, 1]

and the product ST =
[1/r, 0, 0, 0,
0, 1/r, 0, 0,
0, 0, 1/r, 0,
-x_s/r, -y_s/r, -z_s/r, 1]
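Here is a sketch of building this column-major ST matrix on the CPU and applying it to a world-space point; the flat float array mirrors GLSL’s mat4 memory layout, and the function names are mine:

```cpp
// Column-major 4x4: element (row, col) lives at m[col * 4 + row],
// matching GLSL's mat4 memory layout.
void makeWorldToSkyDome(float xs, float ys, float zs, float r, float m[16]) {
    for (int i = 0; i < 16; ++i) m[i] = 0.0f;
    m[0] = m[5] = m[10] = 1.0f / r;                     // S: scale by 1/r
    m[12] = -xs / r; m[13] = -ys / r; m[14] = -zs / r;  // ST's translation column
    m[15] = 1.0f;
}

// Transform a world-space point (w = 1) into sky-dome space. For the eye
// position X_e this yields (X_e - X_s) / r; its y component is the
// altitude discussed in #3.
void toSkyDome(const float m[16], const float p[3], float out[3]) {
    for (int row = 0; row < 3; ++row)
        out[row] = m[0 + row] * p[0] + m[4 + row] * p[1]
                 + m[8 + row] * p[2] + m[12 + row];
}
```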
A first glance at the result: the graph shows the surface fitted to the optical depth of the blue channel, over a range of altitudes and of vertical angles deviating from the vertical axis:
I will add the Blender rendering result for the sky, together with the math equations.