
Thread starter · 5 months ago
I've wanted to do a proper engine analysis of The Legend of Zelda: Breath of the Wild for a long time, but I never had the time. Now that the Switch has a video recording feature, I figured it was the perfect moment to revisit the game and share my thoughts via videos uploaded to Twitter. I'll start with a summary of my findings, then break down each technical feature afterwards to make it easier to understand. I'll also try to avoid repetition: outlets like Digital Foundry have already analyzed some of the engine's features, and I won't cover those here. The purpose of this post is to expose more people to technical achievements in this game that others haven't bothered to investigate.

Anyway, here's a summary of the engine's features:

• Global Illumination
• Local Reflections (calculated via Fresnel reflection)
• Physically Based Rendering
• Emissive Materials / Area Lights
• Screen Space Ambient Occlusion
• Dynamic Wind Simulation System
• Real-time Cloud Formation (affected by wind)
• Rayleigh / Mie Scattering
• Full Volumetric Lighting
• Bokeh DOF and approximation of the Circle of Confusion
• Sky Occlusion and Dynamic Shadow Volumes
• Aperture-based Lens Flares
• Subsurface Scattering
• Dynamically Localized Lightning Illumination
• Per-pixel Sky Irradiance
• Fog Inscatter
• Particle Lights
• Puddle Formation and Evaporation

Global Illumination / Radiosity

First of all, I want to make it clear that every so-called "real-time" global illumination scheme is faked in some way.

So what exactly is radiosity? In 3D graphics rendering, it is a global illumination approximation of light bouncing between surfaces, transferring color information from one surface to another in the process. The more accurately light energy is transferred, the more bounces of reflected light must be calculated to transmit the appropriate colors. The engine in Breath of the Wild uses light probes scattered throughout the environment to collect color information from the surfaces near each probe. Nothing is actually simulating bounced light; it is an approximation of the dominant colors in a given region. The exact algorithm Breath of the Wild uses to calculate this information is unclear, but my best guess is spherical harmonics or something similar, based on the color averaging and the very localized light-energy transfer. Unlike Super Mario Odyssey, the light transfer in Breath of the Wild is not binary but graded. The lighting information calculated from the light probes appears to be tied to the LOD system at the same stage of the rendering pipeline, making it extremely efficient.
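Since the actual algorithm isn't known, here is a minimal Python sketch purely to illustrate the spherical-harmonics guess above: a probe averages the colors of nearby surfaces into low-order SH coefficients, and a receiving surface reads back more of a color the more directly it faces the direction that color came from. Every constant and function name here is my own assumption, not the game's.

```python
def probe_project(samples):
    """Average (direction, rgb) surface samples into order-1 SH coefficients:
    one ambient band plus three directional bands, per color channel."""
    coeffs = [[0.0] * 3 for _ in range(4)]  # [ambient, x, y, z] x [r, g, b]
    for (dx, dy, dz), rgb in samples:
        basis = (0.2821, 0.4886 * dx, 0.4886 * dy, 0.4886 * dz)
        for b in range(4):
            for c in range(3):
                coeffs[b][c] += basis[b] * rgb[c] / len(samples)
    return coeffs

def probe_evaluate(coeffs, normal):
    """Approximate the bounce color arriving at a surface facing `normal`."""
    nx, ny, nz = normal
    basis = (0.2821, 0.4886 * nx, 0.4886 * ny, 0.4886 * nz)
    return tuple(max(0.0, sum(basis[b] * coeffs[b][c] for b in range(4)))
                 for c in range(3))

# A red wall off to +x of the probe, and green grass below it (-y):
samples = [((1, 0, 0), (0.9, 0.1, 0.1)), ((0, -1, 0), (0.1, 0.8, 0.1))]
sh = probe_project(samples)
facing_wall = probe_evaluate(sh, (1, 0, 0))   # picks up mostly red
facing_away = probe_evaluate(sh, (-1, 0, 0))  # picks up almost none of it
```

This matches the observed behavior: color leaks onto surfaces oriented toward its source and fades with orientation, without any actual ray bounces being simulated.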

Observation tip: notice how the rocky cliffs receive green tones from the grass as the camera nears the area.

At first I assumed there might be spherical harmonic probes placed throughout the environment to collect color samples, because Link seems to pick up the base color of the environment as he moves through it. After further investigation, however, I now know that those base-color reflections were simply due to a lack of color variation in the environment. When I tested the global illumination in an area with many differently colored surfaces adjacent to one another, it became clear how the system works. Notice how Link's color is transferred to all surfaces facing the opposite direction when he touches the red wall. The same is true for the green wall opposite the red wall (although the effect is weaker there, because the probe is closer to the red wall, so the red wall's own color reflects more strongly). In fact, at any given point this happens in all directions: the ground transmits color upward, and any ceiling or colored surface directly above Link's head transmits color downward. The probe dynamically samples and transmits colors (we can assume this stands in for bounced light), because as the probe picks up new colors for new transfers it must sample them. Eventually the result stops changing, because the sample closest to the probe carries the dominant color regardless of further color shifts. The process is orderly, but very local and very fast: the probe has a limited sampling range and applies the results to materials in world space. Because of this efficiency, the probe can approximate the effect of many light bounces, but it only looks accurate in the area closest to the probe.
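The "nearest surface dominates" behavior described above can be sketched with a tiny weighted blend. Inverse-square weighting is my assumption here; the game's real falloff curve is unknown.

```python
def probe_blend(samples):
    """Blend sampled surface colors so the surface nearest the probe dominates."""
    total, out = 0.0, [0.0, 0.0, 0.0]
    for dist, rgb in samples:
        w = 1.0 / (dist * dist + 1e-6)  # assumed inverse-square weight
        total += w
        for c in range(3):
            out[c] += w * rgb[c]
    return tuple(c / total for c in out)

# Red wall 1 m from the probe, green wall 3 m away on the opposite side:
blended = probe_blend([(1.0, (0.9, 0.1, 0.1)), (3.0, (0.1, 0.8, 0.1))])
# The nearer red wall dominates the blended result, as observed in-game.
```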
This is a very important discovery. (Other materials are "dyed" red near the red wall, and "dyed" green near the green wall.) The global illumination really does approximate multiple bounces: a light probe at Link's head samples the colors of most materials in the environment, and each sampled color is then transmitted and reflected back in the opposite direction. Interestingly, the intensity appears to be influenced by the probe's closest surface and by the intensity of the reflected light. It may not seem obvious outdoors, but the global illumination looks great wherever there are multiple adjacent surfaces.

Local Reflections

One area that has bothered me ever since I started analyzing this game is local reflections. There were so many seeming inconsistencies that my theories flip-flopped at first. Now I can say with confidence that I have solved the mystery of how local reflections work. As it turns out, it's a three-pronged approach chosen based on the circumstances.

• Specular Lighting

Sunlight, skylight, lightning, and point light sources all fall into this category. At first I thought the same was true of shrines and towers (since they are self-luminous, I assumed they were area light sources), but that was ruled out when I saw the very revealing artifacts that shrines and towers display. Not all emissive materials can illuminate the environment, and shrines and towers belong to the group that cannot.

• Aperture Mapping

If that term sounds new to you, it may well be. Based on the game's text dump, the Breath of the Wild developers appear to have modeled their approach on Unreal Engine 4's SceneCapture2D reflections, and the environment is reflected this way. A virtual camera above Link's head (the "aperture", specifically) has a relatively small field of view, so as Link moves, the reflections shift (shown in real time) within their proper space until the aperture captures the environment again.
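The decoupling this implies (reflections shifted every rendered frame, while the capture itself refreshes only a few times per second) can be sketched as a simple schedule. The 30 fps and roughly 5 Hz figures are my own estimates from the measurements later in the post; the code only illustrates the scheduling, not the game's implementation.

```python
RENDER_FPS = 30   # the game's frame rate
CAPTURE_FPS = 5   # assumed capture refresh rate (measured at roughly 4-5 Hz)

def simulate(frames):
    """For each rendered frame, record which capture is being reused and how
    many frames of per-frame reprojection have been applied to it."""
    interval = RENDER_FPS // CAPTURE_FPS  # re-capture every 6 rendered frames
    capture_id, timeline = -1, []
    for frame in range(frames):
        if frame % interval == 0:
            capture_id = frame            # fresh scene capture this frame
        timeline.append((frame, capture_id, frame - capture_id))
    return timeline

t = simulate(13)  # stale captures drift up to 5 frames before being corrected
```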
You can see traces of this processing in the video below.

• Screen Space Reflections

Only materials with a laminated look use this model, and those are limited to shrines. A value in the gloss map tells the engine to use screen-space reflections for these materials only. They reflect everything on screen and are visible at any angle of incidence on these materials. However, these materials also use aperture mapping to reflect the environment, which was one source of my confusion: the incongruity of these reflections led me to wrong assumptions about the materials outside the shrines. Thankfully, that's now cleared up.

Observation tip: compare Link's reflection with that of the blue light. Link has to be on screen for his reflection to show, while the blue light doesn't. (Screen-space reflection + specular highlights.)

Mystery of local reflections solved! (Front-facing walls don't reflect, while side walls do.) So, shrine materials have an extra layer of gloss and reflection, but they use the same reflection model as exterior materials for environment reflections. No wonder it was so confusing! Glossy materials capture a reflection of everything in screen space (screen-space reflection). Non-glossy materials (almost all exterior materials) capture a 2D reflection of the scene using almost the same technique Unreal Engine 4 uses for ambient reflection capture. Basically, a virtual camera (with its own view and field of view) sits directly above Link's head, always facing the same horizon as the main camera regardless of Link's orientation (which allows for limited off-screen reflections). The captured image is then fed into the reflective material, like a live broadcast feed to a television. This means the image feed is projected in real time at whatever frame rate the game is running (30 fps).
This allows different elements of the material to update without waiting for a new capture. The capture itself, however, updates at a much lower rate (4 to 5 times per second). You can see this whenever the scene-capture camera moves from its absolute position: before the captured reflection updates, the currently captured image inside the material (water, for example) shifts in real time (30 fps) in whatever direction the camera moves. Once the material receives the updated capture, the reflection is corrected. This correction delay is what lets us measure the capture update rate from the material's behavior (4 to 5 updates per second). (The reflection of the bridge column is slightly delayed.) As you can see here, the out-of-date reflection still tracks Link's movements smoothly; there's no stutter. The reflection is then corrected when the new capture arrives. This works differently from a reflection map, which only updates the reflection when the map itself is updated: here, even when the captured reflection is clearly out of date, it still shifts position at 30 fps. You can see the FOV of the capture camera in the following GIFs (note how no color is reflected on the material beyond the edge of the capture camera's view of the horizon). Now it makes sense why all non-emissive materials show only Fresnel reflections: with this reflection technique, those are the only angles at which it works! I happened upon this arch and realized it was the perfect setting to measure the capture camera's field of view:
Let's do some basic trigonometry. I estimate the horizontal field of view to be about 115°. The reflection of the arch leaves the screen before Link passes through it, so we know it's definitely not a 180-degree field of view; if it were, the arch's reflection wouldn't produce a visual error like this. You can also see that when the camera is a few feet from the arch and perpendicular to it, the reflection is skewed in proportion to the field of view, which lets us observe its width and gauge the capture camera's relative horizontal FOV. I want to reiterate that this is only a rough estimate, so I might be off by about 10 degrees, but some angles are simply impossible with this setup, so by process of elimination we at least have an estimate.

Physically Based Rendering

Before anyone asks: no, this does not mean "materials that look physically correct". It is simply a way of building the 3D rendering pipeline in which every material (textured surface) interacts with light in its own way. This is what happens in the real world, which is why it's called physically based rendering: different materials cause light to behave differently, which is how we visually distinguish surfaces. Traditionally, the rendering pipeline relied on the artist's understanding of how light interacts with different real-world materials, with texture maps defined from that understanding. As a result there were many inconsistencies between textured surfaces and their real-world counterparts (understandably, since we can't expect an artist to have encyclopedic knowledge of every material in the real world). With PBR, the fundamental physics of light is part of the pipeline itself, and every textured surface is classified with unique properties that make light behave accordingly. Different surfaces can be placed under different lighting conditions and dynamic camera angles, and the way light interacts with them adjusts dynamically. Artists don't have to pre-define these interactions as they did in traditional workflows; everything is automatic.
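To make the "materials interact with light in their own way" idea concrete: one standard building block of PBR pipelines is a Fresnel term, commonly computed with Schlick's approximation. This is a generic textbook sketch, not the game's actual shader code.

```python
def schlick_fresnel(cos_theta, f0):
    """Schlick's approximation: reflectance climbs toward 100% at grazing
    angles; f0 is the material's base reflectance at normal incidence."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

head_on = schlick_fresnel(1.0, 0.04)   # dielectric (e.g. wood): ~4% head-on
grazing = schlick_fresnel(0.05, 0.04)  # same material near grazing: ~78%
metal = schlick_fresnel(1.0, 0.90)     # metals stay highly reflective head-on
```

This is why every material in the game shows some reflection at grazing angles, while only metals and glossy materials reflect noticeably when viewed head-on.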
Because of the efficiency of PBR, the developers were free to make a game in which all materials have unique properties that affect light. Breath of the Wild gives its PBR an artistic flair, so you might not even notice that the engine relies on such a pipeline, because the textures don't necessarily look realistic. However, it is clear from the BRDFs (bidirectional reflectance distribution functions) used on materials that the engine is built for PBR. For each dynamic light, the specular highlight (the light source itself showing up on a reflective surface) and the specular reflectivity/refraction are generated dynamically based on the angle of incidence (the angle of the incoming light relative to the surface normal) and the refractive index of whatever material the light interacts with (how much the material "bends" light when it strikes the surface). If the game used a traditional pipeline, there wouldn't be much difference between the specular highlights assigned to wood and to metal. In this game, specular highlights depend entirely on the material the light interacts with. Another key indicator of PBR in Breath of the Wild is the Fresnel (the "s" is silent) reflection on all materials. Most games using traditional pipelines don't even bother with Fresnel reflection, because it's better handled with PBR. As I explained earlier in the local reflections section, Fresnel reflection becomes visible at grazing angles (angles at which the incoming light is almost parallel to the surface the observer/camera is looking along). According to the Fresnel equations, all materials approach 100% reflectivity at grazing angles, but how effective that reflectivity looks depends on the roughness of the material. This lets programmers distinguish reflectivity from the refractive index. Some materials reflect light in all directions (diffuse materials).
Even at 100% reflectivity, 100% of the light may be reflected from the entire surface area, but not all of it is reflected in the same direction, so the light spreads out evenly and you don't see a specular reflection (a mirror image on the surface). Other materials reflect incoming light only in the mirror direction (specular materials), so you only see the reflection at the right angle, where nearly all of the light is reflected. A material's reflectivity is not always 100% even at grazing angles, which is why no material shows a perfect mirror reflection at a grazing angle, even in the real world. The clarity of the Fresnel reflection varies with the material producing it.

Observation tip: notice how the green light on the barrel's wood looks the same from all angles, while the same green light shifts in the reflection on the metal hoops (the metal rings around the barrel).

Emissive Materials / Area Lights

This one is easy. The material of a luminous object provides a unique light source that illuminates the environment in the same shape as the material itself. These are not point lights radiating in all directions, nor even simple directional lights shining one way. It is important to note that only global sources (sun/moon/lightning) cast shadows; however, the bidirectional reflectance distribution function still applies to every light source in the game.

Observation tip: notice the shape of the light cast by the flame sword. It matches the shape of the sword itself, and the intensity of the light depends on the distance between the sword and the surface it illuminates.

Screen Space Ambient Occlusion

In the real world, as light bounces around an environment, a certain amount of "ambient light" tints the environment and is completely diffuse.
If a shadow is the product of an object blocking direct sunlight, then ambient occlusion can be thought of as the product of crevices and corners blocking that ambient light. The scheme used in Breath of the Wild is called SSAO (screen-space ambient occlusion) because it computes the occlusion in screen space, which makes it view-dependent: the effect only applies to what the camera can currently see.

Observation tip: viewed from the front, look for the dark, noise-patterned shading in the crevices of walls. The same noise pattern outlines Link's silhouette from this angle.

Dynamic Wind Simulation System

This one surprised me; I had no idea it would be so robust. Basically, the physics system is tied into a wind simulation system. It is completely dynamic and affects different objects according to their weight. The most prominently affected objects are the grass and the procedurally generated clouds.

Observation tip: if you look closely, you can see how the directional flow of the grass and clouds matches the direction of the wind.

Real-time Cloud Formation

This game doesn't use a traditional skybox in any sense. The clouds are procedurally generated based on parameters set by the engine. They cast shadows in real time, and they receive lighting information based on the position of the sun in the sky. As far as I can tell, the clouds are treated as actual materials in the game. They're not volumetric clouds, so you won't see any light shafts through gaps or anything like that, but they're not skybox clouds either. They are also shaped by the wind system.

Observation tip: notice how the cloud particles in the sky randomly cluster together.

Rayleigh / Mie Scattering

In the real world, when sunlight reaches the Earth's atmosphere it is scattered by air molecules, producing our blue sky, because shorter blue wavelengths scatter more easily than other colors.
However, as the sun nears the horizon, its light has to pass through more of the atmosphere, so most of the blue light has been scattered away by the time it reaches the viewer's eyes, leaving the longer orange and red wavelengths. Breath of the Wild approximates this mathematically (I actually found the algorithm earlier this year in the text dump!). Apparently the algorithm also accounts for Mie scattering, which allows haze to appear in the sky. To be honest, if I hadn't seen the code in the text dump, I would never have guessed the game simulates this phenomenon, since the effect is easy to fake. After observing the sky's reflection in water, though, it all made sense: the scattered light bounces back into the environment in real time, which a simple skybox would make impossible.

Observation tip: notice how the different shades of orange and red in the sky reflect the same colors onto the environment. Although it isn't shown in the GIF, the scattered skylight also illuminates the environment and the water in other colors, depending on how the light is scattered.

Observation tip: notice how the color of the snow changes as the sun sets.

Observation tip: at the start of this GIF, the water shows at least five different reflections: the shrine (blue), the hill (green), the flag (black outline), the sky (orange), and the sun (pink).
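The wavelength dependence driving all of this is simple to state: Rayleigh scattering strength scales with 1/λ⁴. A quick sanity-check sketch (this is generic physics, not the game's actual code):

```python
def rayleigh_relative(wavelength_nm, reference_nm=550.0):
    """Relative Rayleigh scattering strength, proportional to 1/wavelength^4,
    so short blue wavelengths scatter several times more than long red ones."""
    return (reference_nm / wavelength_nm) ** 4

blue = rayleigh_relative(450.0)  # blue scatters strongly -> blue daytime sky
red = rayleigh_relative(650.0)   # red survives long paths -> red sunsets
```

At sunset the light's path through the atmosphere is much longer, so the strongly scattering blue is depleted before it reaches the eye, leaving the oranges and reds described above.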
