Encyclopedia of Color Science and Technology

2016 Edition
Editors: Ming Ronnier Luo

Rendering Natural Phenomena

Reference work entry
DOI: https://doi.org/10.1007/978-1-4419-8071-7_171

Definition

In computer graphics, rendering is a process of synthetically generating an image – or a sequence of images – of an object, based on its mathematical and possibly physical description. Natural phenomena are inherently very diverse, and therefore, their rendering is a very heterogeneous field with different paradigms and approaches.

As the name implies, the objects and phenomena of interest mainly have a natural origin. However, owing to the similar character of the underlying problems, rendering of artificial objects made of natural materials is often considered part of the field as well.

Although the main goal of rendering is the creation of images, obtaining the mathematical and physical description of the simulated entities (i.e., the input data) is usually not trivial. Because of this, rendering algorithms may require coupling with simulation methods, which provide the means to computationally generate the required data. Alternatively, acquired or even hand-modeled data can be used when simulation proves infeasible (e.g., too time-consuming).

Categories of Natural Phenomena

The richness of the natural environment of course implies a tremendous variety of phenomena and objects. This variety defies an obvious structure, but it can be roughly classified according to the physical characteristics of the phenomena as follows (see also the examples in Fig. 1):
Rendering Natural Phenomena, Fig. 1

Examples of sparse, fluid, and solid phenomena

  • Sparse phenomena, involving gases, aerosols, and vapors but also effects of electromagnetism or high-energy particles. These include various well-known meteorological phenomena like atmospheric scattering, clouds, fog, rainbows, or lightning; astronomical phenomena such as auroras, stars, nebulae, and other stellar bodies; and also smaller-scale phenomena like fire, smoke, and dust.

  • Fluid phenomena, most notably oceans and other large water bodies and the effects associated with their surfaces, such as waves and streams; also medium- and small-scale liquid substances, especially beverages like milk, fruit juices, and coffee; suspensions like blood, paints, and inks; and volcanic phenomena such as lava flows. In addition, especially from the perspective of simulation methodology, it is possible to regard fine-grained solid materials like sand, and partly solid substances such as gels, as fluid phenomena.

  • Solid objects and phenomena. On a large scale, primarily geological formations ranging from mountains to entire planets. On a medium scale, organic entities like vegetation, biological tissues, hair, and fur, and inorganic objects such as ice and rock formations, crystals, metals and their alloys, and man-made objects manufactured from these. Among small-scale solid objects, most effort has been focused on rendering precious gems. Additionally, the rendering of natural and artificial solid objects exhibiting a layered structure has attracted research attention, for example, coated or painted objects, oxidized and patinated metals, composite materials, gemstones, and many others.

From the perspective of rendering and also modeling, another important distinction can be made between phenomena and objects with well-definable geometry and opaque surfaces and those with a dominating volumetric character (either from the spatial or the optical point of view).

The first group will likely be modeled using geometric primitives and rendered with algorithms suited to distinguishing between discrete parts of the phenomenon, e.g., the object-air interface. The light interaction will primarily take place at these interfaces, mainly as reflection and refraction. Most opaque solids belong in this category, but so do some fluid substances, such as clear liquids.
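At such an interface, the direction of a refracted ray is given by Snell's law; in LaTeX notation,

    n_1 \sin\theta_1 = n_2 \sin\theta_2 ,

where n_1 and n_2 are the refractive indices of the two media and \theta_1, \theta_2 are the angles between the rays and the surface normal. The fraction of light that is reflected rather than refracted is in turn given by the Fresnel equations.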

On the other hand, the second group will likely utilize a volumetric representation (or a combination with a geometric one) and, naturally, volume-rendering methods. Materials with these properties are called participating media, and the dominant optical interactions here are scattering and absorption. These characteristics are inherent to virtually all sparse phenomena, but also to most fluid and many solid ones.
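Attenuation in a participating medium is commonly described by its transmittance, the fraction of light that traverses a distance d without being absorbed or scattered out of its path,

    T(d) = \exp\!\left( -\int_0^d \sigma_t(s) \, ds \right) ,

where the extinction coefficient \sigma_t = \sigma_a + \sigma_s sums the absorption and scattering coefficients of the medium.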

This distinction is naturally relative, without a sharp transition. As an example, a small grain of a stone may be translucent and exhibit an apparent volumetric structure, while a large piece appears opaque and interacts with light mostly by diffuse surface reflection (Fig. 2).
Rendering Natural Phenomena, Fig. 2

A small quartz crystal and a large piece of sandstone

Causes of Color

Another important aspect of natural phenomena from the perspective of color science is how they interact with light to produce color and overall appearance [1]:
  • Emission, due to incandescence (e.g., stellar radiation, volcanic activity, lightning), gas excitation (auroras, artificial gas lamps), or a combination of these (fire). The variation of color is primarily caused by the varying energy density (for incandescent sources, the temperature). Virtually all natural light originates in these processes.

  • Geometric causes, such as scattering (atmosphere, clouds, smoke, milk, and generally almost all substances exhibiting volumetric properties or diffuse reflection), dispersion (rainbows, sundogs, snow), diffraction (opals, thin filament-like objects such as spider webs and hair, carapaces of certain bugs), interference (single- or multilayered structures such as bubbles and certain gemstones and insects), and polarization (specular reflection from smooth surfaces). These interactions are usually elastic, i.e., they conserve energy, and the color variation is mainly caused by geometric configurations and relations.

  • Absorption and reemission, mostly in organic compounds (plant and animal tissues, natural and artificial pigments, dyes, and inks) and all metals, but also in certain minerals and semiconductors and occasionally in light liquids, most notably pure water. These interactions are usually inelastic, leading to energy loss described by the Beer-Bouguer law (see the relation after this list). Color variation is caused by the relative efficiency of reemission at different wavelengths.
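The Beer-Bouguer (Beer-Lambert) law mentioned above states that the intensity of light traversing a homogeneous absorbing medium decays exponentially with the traveled distance d,

    I(d, \lambda) = I_0(\lambda) \, e^{-\mu_a(\lambda) \, d} ,

and the wavelength dependence of the absorption coefficient \mu_a is precisely what gives such materials their color.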

Historical Overview

Synthesizing physically plausible images of natural phenomena has arguably been one of the primary foci of rendering from the beginning [2]. However, the complexity of most natural phenomena prevented their plausible rendering until the early 1980s, mostly because of the lack of theoretical understanding and computational power.

A significant milestone was reached with the introduction of the radiosity [3] and distributed (stochastic) ray tracing [4] algorithms in 1984, enabling physically based simulation of global illumination effects. A more general solution was presented in 1986 by James Kajiya in the form of the rendering equation [5] – a unified mathematical framework that enables the simulation of all effects that conform to geometric optics. These approaches were extended to the rendering of volumetric phenomena in the late 1980s and early 1990s.
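In its common form, the rendering equation expresses the radiance L_o leaving a surface point x in direction \omega_o as the sum of emitted and reflected radiance,

    L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (n \cdot \omega_i) \, d\omega_i ,

where f_r is the bidirectional reflectance distribution function (BRDF), L_i the incident radiance, and n the surface normal; the integral runs over the hemisphere \Omega above x.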

Algorithmic improvements have gone hand in hand with the evolution of computing hardware, making the rendering of image sequences and even entire movies feasible by the end of the 1980s. Probably the biggest breakthrough, however, came with the introduction of parallel programmable graphics processing units (GPUs) to the consumer market in 2001. Their programmability enables researchers and developers to produce specialized algorithms for simulating and rendering effects of diverse nature. Programmable GPUs quickly became widespread and enabled the development of algorithms and applications capable of simulating numerous natural phenomena, including video games and other interactive software.

Rendering Methodologies

The traditional view splits (realistic) rendering as a field into noninteractive (“offline”) rendering [2] and interactive rendering [6], with the former emphasizing quality and the latter speed. This applies to the rendering of natural phenomena just as well. The spectrum of possible approaches is, however, much more continuous than that and depends rather on the paradigm chosen for designing a particular rendering method (refer to Figs. 3 and 4).
Rendering Natural Phenomena, Fig. 3

A comparison of interactive (empirical versus physically based) methods for rendering atmospheric scattering and clouds. The empirical approaches are typically faster, but produce less dynamic and believable results and in some cases require additional artistic input (Image credit: Petr Kellnhofer, using the models of Sean O’Neil (top ‘sunset’ pair) and Niniane Wang (top ‘clouds’ pair))

Rendering Natural Phenomena, Fig. 4

A comparison of empirical and physically based reflectance models. Top row: The Lambert and the (physically based) Oren-Nayar models on a diffuse gray clay material. The Oren-Nayar model correctly produces a “flatter” distribution, which is caused by backscattering on the rough clay surface. Bottom row: A glossy object made of gold, reflecting an environment map. The Torrance-Sparrow model produces a more plausible reflection, which is mainly visible at grazing angles, where the Phong model incorrectly produces darker reflections in disagreement with the Fresnel law. In addition, the Phong model fails to reproduce the yellow hue of gold, since it does not handle conductors properly; the hue could only be obtained by additional manual tweaking

Empirical or phenomenological methods are generally top-down: their design starts by collecting observations of the phenomenon to be simulated and then proceeds to devise an algorithm that reproduces the phenomenon as faithfully as possible while keeping computational costs low. As such, empirical methods are used mainly in interactive applications, where the important factors are visual plausibility, robustness, and speed, while physical plausibility and radiometric accuracy are secondary. Notable examples include:
  • The Lambert reflectance model (1760) states that reflection from a sufficiently rough surface is perfectly diffuse, i.e., isotropic with regard to the observation direction. Despite building on a physically meaningful assumption (multiple scattering underneath the surface), and although many materials (such as uncoated paper) follow the model very closely, there is no ideal diffuse reflector.

  • The Phong reflectance model (1975) approximates reflection from glossy surfaces by a cosine lobe raised to an exponent proportional to the surface smoothness. This leads to a reflection in the form of a blurry circular patch that gets brighter and smaller with increasing smoothness and, in the limit, corresponds to the Dirac delta of a perfect mirror. Despite having no physical basis, the model has been widely adopted and is still used, mainly thanks to its simplicity. (Both this and the Lambert model are sketched in code after this list.)

  • Particle systems (1983) are ubiquitously used in both interactive and offline applications to model volumetric phenomena (such as clouds). In this sense, particles are small semitransparent entities, often modeled by mapping a texture onto a rectangular geometric primitive that always faces the observer. Used in sufficient numbers, they can mask their discrete nature while still being cheaper than the corresponding full volumetric representation (e.g., a 3D voxel grid). However, simulating global illumination effects in conjunction with particle systems requires additional effort, which might hinder their utilization; as a result, it is often necessary to design yet another empirical model to compute the illumination in the intended way, increasing the necessary amount of work.
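To make the two reflectance models above concrete, the following is a minimal sketch in Python; all names are illustrative, and the vectors are assumed to be normalized and to point away from the surface.

    import numpy as np

    def lambert(n, l, albedo):
        # Lambertian diffuse reflection: the outgoing radiance depends only
        # on the cosine between the surface normal n and light direction l.
        return (albedo / np.pi) * max(np.dot(n, l), 0.0)

    def phong(n, l, v, shininess):
        # Phong gloss: a cosine lobe around the mirror direction r, raised
        # to an exponent that grows with surface smoothness.
        r = 2.0 * np.dot(n, l) * n - l  # l mirrored about n
        return max(np.dot(r, v), 0.0) ** shininess

Increasing the shininess exponent narrows the lobe; in the limit it approaches the Dirac delta of a perfect mirror mentioned above.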

The main downsides of empirical methods are that they often require a lot of artistic supervision during the content creation process and that they seldom work outside the range of phenomena they were designed for.

Physically based methods proceed in the opposite, bottom-up fashion. Using physically meaningful data, these methods apply the laws of physics to render the target phenomenon. Consequently, rather than being centered around a particular simulated phenomenon, these methods share more principal similarities and are usually useful for rendering a whole class of phenomena to which the implemented laws and assumptions apply. On the downside, their computational cost is often much higher than that of empirical approaches. This is usually treated by applying Occam’s razor: simplifying the rendering algorithm in ways that decrease the simulation time without negatively affecting the result. Examples of such methods include:
  • The Torrance-Sparrow reflectance model (1967) simulates reflection from rough glossy surfaces (and as such is an alternative to the Phong model). The method considers the simulated surface to consist of microscopic facets oriented according to a statistical distribution (e.g., a Gaussian distribution); its general form is given after this list. Each facet is assumed to reflect light according to the Fresnel law, and, additionally, probabilistic shadowing by neighboring facets is taken into account. Although in some situations the model produces results similar to the Phong model, it is energy conserving, correctly handles objects made of conductors, and behaves plausibly at grazing angles, albeit being somewhat more difficult to understand.

  • Photon mapping (1996) [7] is a general framework for rendering global illumination effects. It is applicable to both surface and volume illumination, which makes it especially suitable for rendering natural phenomena. Its main idea lies in distributing the light energy by shooting and tracing small energy particles, photons. Every interaction of a photon with the simulated environment is recorded, and once all illumination energy has been distributed, the algorithm computes the energy density at the observed locations by local photon density estimation. Much work has been invested in improving the original technique, resulting in one of the most versatile global illumination algorithms to date.
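For reference, microfacet models of the Torrance-Sparrow type share the general BRDF form

    f_r(\omega_i, \omega_o) = \frac{D(h) \, G(\omega_i, \omega_o) \, F(\omega_i, h)}{4 \, (n \cdot \omega_i)(n \cdot \omega_o)} ,

where h is the half-vector between \omega_i and \omega_o, D the statistical distribution of facet normals, G the shadowing term, and F the Fresnel reflectance. Similarly, a common form of the density estimation step of photon mapping approximates the reflected radiance at x from the k nearest photons found within a radius r as

    L_r(x, \omega) \approx \sum_{p=1}^{k} f_r(x, \omega_p, \omega) \, \frac{\Delta\Phi_p}{\pi r^2} ,

with \Delta\Phi_p denoting the power carried by photon p arriving from direction \omega_p.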

Originally, physically based methods were used only in offline applications. Recently, however, with the increasing performance and flexibility of programmable hardware, the boundary between offline and interactive methods has blurred. Many physically based rendering algorithms (such as ray tracing or photon mapping) can today be implemented to run interactively, albeit with decreased rendering quality.

Finally, predictive methods represent a step further than physically based methods. In contrast to empirical approaches, their primary foci are physical correctness and radiometric accuracy. The utilized algorithms must be spectral and unbiased and must support all significant physical phenomena that occur in the simulated environment. They require measured or otherwise acquired input data, and the results usually need to be viewed under controlled conditions. The resulting methods typically require validation and are generally very slow, even compared to physically based approaches. As such, they are useful mainly in virtual prototyping applications, for instance, in the automotive industry, architecture, and gem processing.

Phenomena Simulation

As mentioned in the “Definition” section, in addition to rendering a phenomenon (and thereby producing an image of its momentary appearance), it is often necessary to obtain data about its spatial and temporal development prior to rendering. Virtually all natural phenomena are dynamic in some sense, and these dynamics can be considered in the short and the long term.

Short-term development usually stems from the character of the phenomenon itself and its internal dynamics. Its nature can be synthetic (e.g., condensation of water vapor leading to cloud formation), evolutionary (e.g., the flow of liquid particles in a stream), or destructive (e.g., the cracking or shattering of an iceberg). Capturing this behavior will often entail using discrete particle dynamics or continuum dynamics.
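As an illustration of the discrete particle approach, a minimal Python sketch of one simulation step follows; the velocity field wind is a hypothetical placeholder for data supplied, e.g., by a coarse fluid solver, and all names are illustrative.

    import numpy as np

    GRAVITY = np.array([0.0, -9.81, 0.0])

    def step_particles(pos, vel, wind, dt, drag=0.5):
        # Explicit Euler step: particles accelerate under gravity and are
        # dragged toward the local velocity of the surrounding medium.
        accel = GRAVITY + drag * (wind(pos) - vel)
        vel = vel + dt * accel
        pos = pos + dt * vel
        return pos, vel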

On the other hand, long-term development is usually related to interactions of a phenomenon with its surrounding environment [8], sometimes referred to as weathering (in case the process is destructive). Many forms of this behavior exist, for instance, terrain formation, soil erosion and cracking, oxidation and patination of metals, organic tissue decomposition, and others.

From the methodological point of view, approaches for simulating natural phenomena can again be divided into empirical and physically based. Very often, however, a combination of the two proves to be the best compromise. This is arguably because, while it is often very difficult to devise an empirical method that captures the entire complexity of a given phenomenon well, a complete physically based simulation of any larger system can easily become intractable. Consequently, many approaches simulate the global behavior of the target phenomenon in a physical, possibly simplified way (for instance, using a reduced resolution of the simulated structure) and then add the remaining details empirically. The most frequent way to do the latter is to use fractal functions or systems:
  • Terrain rendering methods usually generate the overall terrain morphology by simulating the orogenetic and erosive processes (or simply use satellite data) and add more detailed features by random fractal perturbations.

  • Rendering of plants frequently employs so-called L-systems to generate the plants and trees. L-systems are iterated functions, described by grammars, that imitate the branching of real plants (a minimal rewriting step is sketched in code after this list). Adding random perturbations to the system can produce plausible-looking plants in an unlimited number of variants that retain the overall character defined by the L-system.

  • Rendering of ocean waves can generate larger-scale waves with a fluid dynamics simulation, or synthesize them from trochoidal wave theory using measured wave frequency spectra, and then again add random fractal perturbations to break up disturbing repetitive patterns.
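A minimal Python sketch of the L-system rewriting mentioned above; the single rule below is a textbook-style branching example rather than a model of any particular plant.

    # Deterministic L-system: each iteration replaces every symbol by its
    # production (symbols without a rule, such as the turtle commands
    # '[', ']', '+', and '-', are copied unchanged).
    RULES = {"F": "F[+F]F[-F]F"}

    def rewrite(axiom, iterations):
        s = axiom
        for _ in range(iterations):
            s = "".join(RULES.get(c, c) for c in s)
        return s

    print(rewrite("F", 2))  # every pass adds one level of branching detail

The resulting string is then typically interpreted by a “turtle” that draws branch segments, pushing and popping its state at ‘[’ and ‘]’.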

Despite the convenience of fractal functions, their inherently iterative nature makes them difficult to use in some cases. Therefore, efforts have been made to design regular functions with fractal properties. The most successful of these is Perlin noise [9], which has become a cornerstone for generating many diverse natural phenomena and has fueled further development in this area (see the examples in Fig. 5). Since Perlin noise behaves as a regular function, it can be evaluated on the fly without additional storage, making it feasible even for interactive applications.
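The typical use of Perlin noise is to superpose several octaves into a fractal sum, as in Fig. 5 (top left). The following Python sketch assumes some band-limited base function noise2(x, y), such as an implementation of [9]; the name is illustrative.

    def fbm(noise2, x, y, octaves=5, lacunarity=2.0, gain=0.5):
        # Fractal sum ("fractional Brownian motion"): each octave doubles
        # the spatial frequency and halves the amplitude of the noise.
        value, amplitude, frequency = 0.0, 1.0, 1.0
        for _ in range(octaves):
            value += amplitude * noise2(x * frequency, y * frequency)
            amplitude *= gain
            frequency *= lacunarity
        return value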
Rendering Natural Phenomena, Fig. 5

Multiple superposed frequencies (octaves) of Perlin noise (top left) and examples of procedurally generated content – terrain, forest, and fire (Image credit: Franck Doassans (forest), Sergei Bolisov (fire))

It might, of course, be possible to acquire the data needed for rendering a given phenomenon, although in some cases this can prove difficult or impractical for quantitative reasons:
  • The phenomenon of interest might be too large and as a result produce overwhelming amounts of data. A good example is the acquisition of terrains, which, although today possible via satellite scanning, still does not produce data with sufficient resolution for some applications.

  • Dynamic phenomena are generally difficult to acquire, especially when the scanning process is slower than the rate at which the phenomenon changes significantly. Acquisition of flames and smoke is a good example here.

  • Scanning processes yield a limited number of instances of the target phenomenon. If many such instances are needed (e.g., in cloud rendering), the cost of the acquisition process might become prohibitive.

Similar to physically based simulation, however, these problems can in some cases be overcome by applying procedural perturbation techniques to the scanned data.

Future

Present knowledge in physics theoretically allows us to explain, and hence simulate, virtually all observable natural phenomena. The limiting factor in doing so is therefore computational resources. In the future, increasing memory density will enable us to work with larger natural systems, and the growing parallel computing power of CPUs and GPUs will allow the simulation and rendering of more complex phenomena. Especially in interactive applications, the current trend of favoring physically based approaches over empirical ones will most likely continue.

References

  1. Nassau, K.: The Physics and Chemistry of Color, 2nd edn. Wiley-Interscience, Hoboken (2001). ISBN 0471391069
  2. Pharr, M., Humphreys, G.: Physically Based Rendering, 2nd edn. Morgan Kaufmann, Burlington (2010). ISBN 0123750792
  3. Goral, C.M., Torrance, K.E., Greenberg, D.P., Battaile, B.: Modeling the interaction of light between diffuse surfaces. SIGGRAPH Comput. Graph. 18, 213–222 (1984)
  4. Cook, R.L., Porter, T., Carpenter, L.: Distributed ray tracing. SIGGRAPH Comput. Graph. 18, 137–145 (1984)
  5. Kajiya, J.T.: The rendering equation. SIGGRAPH Comput. Graph. 20, 143–150 (1986)
  6. Akenine-Möller, T., Haines, E., Hoffman, N.: Real-Time Rendering, 3rd edn. AK Peters, Natick (2008). ISBN 1568814240
  7. Jensen, H.W.: Global illumination using photon maps. In: Proceedings of the Eurographics Workshop on Rendering (EGWR), pp. 91–100, Porto (1996)
  8. Dorsey, J., Rushmeier, H., Sillion, F.: Digital Modeling of Material Appearance. Morgan Kaufmann, Burlington (2007). ISBN 0122211812
  9. Perlin, K.: An image synthesizer. SIGGRAPH Comput. Graph. 19, 287–296 (1985)

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Computer Graphics Department, Max-Planck-Institut für Informatik, Saarbrücken, Germany