SIGGRAPH 2019

SIGGRAPH 2019 was a fantastic event, full of inspiring people and astonishing technology. Unity and Epic Games had a large presence this time, and the work of Unity research scientist Eric Heitz was cited everywhere.

This is an informal report of the talks that I attended.

Real-time rendering in the industry

Advances in Real-Time Rendering in Games: Part I & II: This was a day-long course organized by Unity. It featured presentations by major players in the gaming industry, like Unity itself, Rockstar (Red Dead Redemption 2), EA/Frostbite, and Sony Santa Monica (God of War). Slides are slowly becoming available here; slides from previous years are available here. These weren’t just high-level descriptions, but deep dives with code snippets, equations and diagrams. What follows are my impressions and key takeaways. I confess that I didn’t understand much of the content, so I won’t try to reproduce or summarize it here; the abstracts here are a much better way to get a sense of each talk.

  • A Journey Through Implementing Multiscattering BRDFs and Area Lights (Ubisoft): I learned that SIGGRAPH has hosted a series of courses on physically based shading over the years, not necessarily focused on real time.

  • Leveraging Real-Time Ray Tracing to build a Hybrid Game Engine (Unity): Hybrid here means rasterization + ray tracing, and one of the presenter’s main points was that ray tracing alone is not always the best option; he even has a rather impassioned blog post here in which he tries to deflate the current ray tracing craze. Unity’s High Definition Render Pipeline (HDRP) is built on the Scriptable Render Pipeline, a C# API that allows users to build their own rendering pipelines; the Reality vs Illusion demo was produced with it. The demo intersperses real-time ray-traced animation with real footage and is absolutely gorgeous (at a later talk, a Unity engineer revealed that it has major flaws, especially when it comes to accurate reflections, despite its evident beauty); that’s Charles Bukowski there. Physically-based lighting is at the core of the HDRP and enables 5 kinds of “effects”: ambient occlusion, indirect diffuse, indirect specular for opaque object reflections, recursive tracing for transparent object reflections, and stochastic area shadows (a minimal sketch of the first of these appears after this list). For sampling-based integration, which is at the core of path tracing, he referred us to the work of Eric Heitz. The presenter walked us through each and every step of the render graph. There’s a video of the exact same presentation given at the Digital Dragons conference here (slides). Keywords: Ray binning, ray budget, multi-scattered BRDF, area lights and linearly transformed cosines (LTC).

  • A separate talk titled Leveraging Ray Tracing Hardware Acceleration in Unity delved into the details of the design of Unity’s HDRP. Slides aren’t available yet, though.

  • Strand-based Hair Rendering in Frostbite (EA): Frostbite is EA’s game engine. Rasterization only. Deep Opacity Maps is the state-of-the-art technique for rendering hair in real time; it addresses 3 challenges: single hair scattering, azimuthal scattering, and multiple scattering between hair strands (I have no more details on each). EA promised to release a series of technical blog posts over the coming months, starting here. Keywords: Deep opacity maps, hair cards, triangle strips.

  • Interactive Wind and Vegetation in ‘God of War’ (Sony Santa Monica): The presenter emphasized the iterative nature of the development of most complex systems, with GoW’s wind system being a prime example. The effects of wind (on vegetation as well as on the character’s hair, beard and clothes) and of the character’s interaction with its environment are subtle, but they create an immersive and believable experience for the player; emphasis on the word believable rather than realistic, as striving for realism in such complex systems is bound to hit diminishing returns with the current state of technology. An interesting point was the concept of debug visualization: little tools that the developer writes to visualize the behavior and effects of the system; slides 33 and 34 here contain video examples of debug visualization. Keywords: Wind vectors, wind motors, wind receivers (audio, cloth, particles, and meshes), wind mask, logarithmic binning, weather flow maps and flow flips, billboard clouds, cards and clusters, Beaufort scale, particle ribbon trails.

  • Multi-resolution Ocean Rendering in Crest Ocean System (Electric Square): Ocean simulation for Unity. This was a promising talk that disappointed: most other presenters used beautiful or interesting interactive visualizations or videos, whereas this one just talked over a few still images. Pirate Cove scene video. Repo.

  • Creating the Atmospheric World of Red Dead Redemption 2: A Complete and Integrated Solution (Rockstar): This was my favorite talk of the whole conference. The focus was on volumetric rendering of fog, clouds, rain clouds, lightning, and even rainbows (although rainbows, they said, are not physically-based) for vast natural environments. Voxelization is used to render the near distance and ray marching the far distance (a toy transmittance ray marcher appears after this list). The slides contain footnotes; check them out. Keywords: Ray marching, raymarch reconstruction, ray length, voxelization, volumetric scattering, transmittance, material extinction parameter, extinction volumes, Perlin and Worley noise, ambient light, checkerboard min/max depth downsampling.
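
Ambient occlusion, the first of the five ray-traced effects listed in the Unity talk above, is also the simplest example of the sampling-based integration the presenter pointed to. The Python sketch below is only my own minimal illustration of the idea, not Unity’s implementation: it averages cosine-weighted visibility rays against a hypothetical occluder_hit(origin, direction, max_dist) query that a real engine would answer with hardware ray tracing.

    import math, random

    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

    def normalize(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)

    def cosine_weighted_direction(normal):
        # Malley's method: sample a unit disk uniformly and project it onto the
        # hemisphere; the resulting directions have pdf cos(theta) / pi.
        u1, u2 = random.random(), random.random()
        r, phi = math.sqrt(u1), 2.0 * math.pi * u2
        x, y, z = r * math.cos(phi), r * math.sin(phi), math.sqrt(max(0.0, 1.0 - u1))
        # Build an orthonormal basis around the normal and go to world space.
        a = (1.0, 0.0, 0.0) if abs(normal[0]) < 0.9 else (0.0, 1.0, 0.0)
        t = normalize(cross(a, normal))
        b = cross(normal, t)
        return tuple(x * t[i] + y * b[i] + z * normal[i] for i in range(3))

    def ambient_occlusion(p, normal, occluder_hit, n_samples=64, max_dist=1.0):
        # AO = (1/pi) * integral of visibility * cos(theta) over the hemisphere.
        # With cosine-weighted sampling the cos/pi and pdf terms cancel, so the
        # estimator is simply the average visibility of the rays.
        hits = sum(1 for _ in range(n_samples)
                   if occluder_hit(p, cosine_weighted_direction(normal), max_dist))
        return 1.0 - hits / n_samples

    def wall_occluder(origin, direction, max_dist):
        # Hypothetical scene: an infinite occluding plane at x = 0.5.
        return direction[0] > 0.0 and (0.5 - origin[0]) / direction[0] < max_dist

    print(ambient_occlusion((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), wall_occluder))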
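
For the Rockstar talk above, the keywords ray marching, transmittance and extinction fit the classic Beer-Lambert setup: transmittance is the exponential of the negative extinction integrated along the ray. Here is a tiny Python sketch of that idea (mine, not Rockstar’s code), with a made-up height-based extinction() field standing in for their voxelized fog volumes.

    import math

    def extinction(p):
        # Hypothetical extinction field sigma_t(p): ground fog that thins with height.
        height = max(p[2], 0.0)
        return 0.8 * math.exp(-height / 10.0)

    def transmittance(origin, direction, max_dist, step=0.25):
        # Ray march: T = exp(-integral of sigma_t along the ray), approximated
        # with the midpoint rule at a fixed step size.
        optical_depth = 0.0
        t = 0.5 * step
        while t < max_dist:
            p = tuple(origin[i] + t * direction[i] for i in range(3))
            optical_depth += extinction(p) * step
            t += step
        return math.exp(-optical_depth)

    # How much of a distant light survives 50 meters of ground fog at eye height.
    print(transmittance((0.0, 0.0, 1.8), (1.0, 0.0, 0.0), max_dist=50.0))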

Open Problems in Real-Time Rendering: Part I & II: This was a day-long series of presentations in which prominent figures (Natalya Tatarchuk, Matt Pharr) and experts in the field discussed some of the problems that remain open in real-time rendering. Each year the focus is different; this year it was physically-based lighting under real-time constraints. Slides will appear here eventually. What follows are the talks that I attended.

  • The first ray-traced AAA games have arrived: Control and Shadow of the Tomb Raider, to name a couple. However, none of them is fully ray-traced: Q2VKPT (Q is for Quake, VK is for Vulkan, PT is for Path Tracing) is “the first playable game that is entirely raytraced and efficiently simulates fully dynamic lighting in real-time, with the same modern techniques as used in the movie industry… Q2VKPT is the first project to implement an efficient unified solution for all types of light transport: direct, scattered, and reflected light”. The code is open source.

  • Game Engine Design (Tatarchuk, Unity): Games now have 10x more geometry than before, and shader code is getting huge; the shader/material permutation compilation explosion is exacerbated by ray tracing. Artists are forced to limit the number of lights that participate in light transport. Texture LOD management and BVH management need to improve. Keywords: Ubershaders, Lambert lobes, shader kernels, artist graphs.

  • Path Tracing for Future Games (Pharr, NVIDIA): The talk started off with an important clarification: ray tracing and path tracing are not different names for the same thing. Path tracing is a “unified light transport algorithm based on random sampling (Monte Carlo integration)”; ray tracing is a “geometric algorithm that computes ray-based visibility”. He then explained path tracing in detail and gave examples of notable work out there (ray-traced Minecraft, ray-traced reflections in EA’s Battlefield V); a toy importance-sampling estimator illustrating the Monte Carlo part appears after this list. The open problems he discussed were: ray coherence/incoherence; better data structures for visibility; denoising; BSDFs and microfacets; sampling well under sample reprojection; does sorting rays really improve performance?; is there a ray tracing equivalent to conservative rasterization? He is especially interested in the last one. He praised 2 SIGGRAPH ’19 papers: Distributing Monte Carlo Errors as a Blue Noise in Screen Space by Permuting Pixel Seeds Between Frames (Heitz, paper) and Volume Path Guiding Based on Zero-Variance Random Walk Theory (Herholz, described elsewhere in this doc). Keywords: HDRi environment lighting Monte Carlo estimator, importance sampling estimator, importance sampling diffuse direct lighting, direct lighting estimator, environment map luminance, path guiding, point-sample distributions, progressive sample sequences, ray frusta, conservative occluders.

  • Scaling Light Complexity (Karis, Kelly, Epic Games): There is a limit to the number of direct dynamic lights that can be used; dynamic lighting can only scale when computation becomes sublinear in the number of lights (a froxel-style light binning sketch appears after this list). Baked light maps are being used instead (Precomputed Global Illumination, Frostbite, slides). A 2nd problem he discussed was that of maintaining more than 25 shadow methods in UE4; a single method can’t be used because most modern methods are supported only by the latest hardware. Keywords: HL2 basis, directional basis, spherical harmonics, froxels, proxy light, area lights, LOD with shadow proxies, shadow caching, dynamic lights, baked light maps.
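
To make the Monte Carlo vocabulary from Pharr’s talk concrete, here is the toy importance-sampling estimator referenced above (my own 1D illustration, not material from the talk): it integrates a peaked function once with uniform samples and once with a pdf that roughly follows the integrand, and prints the per-sample spread to show the variance reduction that motivates things like importance-sampled environment lighting.

    import random, statistics

    def f(x):
        # Peaked integrand on [0, 1]; the exact integral is 0.2.
        return x ** 4

    def uniform_estimates(n):
        # Plain Monte Carlo: X ~ U(0, 1), per-sample estimate f(X) / 1.
        return [f(random.random()) for _ in range(n)]

    def importance_estimates(n):
        # Importance sampling with p(x) = 3x^2 (CDF x^3, so X = U^(1/3)),
        # a pdf shaped roughly like f; per-sample estimate f(X) / p(X).
        out = []
        for _ in range(n):
            u = 1.0 - random.random()   # in (0, 1], so x is never exactly 0
            x = u ** (1.0 / 3.0)
            out.append(f(x) / (3.0 * x * x))
        return out

    n = 100_000
    for name, samples in (("uniform", uniform_estimates(n)),
                          ("importance", importance_estimates(n))):
        print(f"{name:>10}: estimate {statistics.fmean(samples):.4f}, "
              f"per-sample std dev {statistics.pstdev(samples):.4f}")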
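
As a crude picture of why the froxels mentioned in the Epic talk help with light scaling, here is the binning sketch mentioned above. It is my own simplification, using a plain world-space grid instead of real view-frustum froxels and none of Epic’s actual machinery: lights are binned into cells once per frame, and shading a point then only loops over the lights in its own cell, so per-pixel cost no longer grows with the total number of lights in the scene.

    from collections import defaultdict

    CELL = 4.0  # cell size in meters (arbitrary for this sketch)

    def cell_of(p):
        return tuple(int(c // CELL) for c in p)

    def bin_lights(lights):
        # Insert each point light into every cell its radius of influence touches.
        grid = defaultdict(list)
        for light in lights:
            lo = cell_of(tuple(c - light["radius"] for c in light["pos"]))
            hi = cell_of(tuple(c + light["radius"] for c in light["pos"]))
            for x in range(lo[0], hi[0] + 1):
                for y in range(lo[1], hi[1] + 1):
                    for z in range(lo[2], hi[2] + 1):
                        grid[(x, y, z)].append(light)
        return grid

    def shade(p, grid):
        # Only the lights binned into p's cell are considered.
        total = 0.0
        for light in grid[cell_of(p)]:
            d2 = sum((a - b) ** 2 for a, b in zip(p, light["pos"]))
            if d2 < light["radius"] ** 2:
                total += light["intensity"] / max(d2, 1e-4)
        return total

    lights = [{"pos": (i * 3.0, 0.0, 0.0), "radius": 2.0, "intensity": 1.0}
              for i in range(1000)]
    print(shade((10.0, 0.0, 0.0), bin_lights(lights)))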

Path tracing in production

  • This was a day-long course delivered in 2 parts: Modern Path Tracing and Making Movies. The course dove deep into the theory of path tracing and how it is used to create the shading effects we see in movies like Alita: Battle Angel, Spider-Man: Far From Home, and Lego Movie. I won’t summarize any of it here, because my understanding is limited and because those 2 linked documents explain everything in generous detail; see the distance-sampling sketch below for a taste of one of the keywords. Keywords: Path sampling framework, stereo techniques, next event estimation, Manuka micropolygon renderer, light hierarchy, acceleration structures for next event estimation, fitness and BSDF biasing, equiangular sampling, transmittance sampling, biased ray marching, null scattering.
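
Of the keywords above, transmittance sampling is the one with a one-line closed form in a homogeneous medium, so here is the small distance-sampling sketch promised above (my own illustration, not material from the course notes): drawing the scattering distance proportionally to transmittance gives t = -ln(1 - u) / sigma_t, and the script checks numerically that the samples behave as expected.

    import math, random

    def sample_free_flight(sigma_t, u):
        # Invert the CDF of p(t) = sigma_t * exp(-sigma_t * t):
        # u = 1 - exp(-sigma_t * t)  =>  t = -ln(1 - u) / sigma_t.
        return -math.log(1.0 - u) / sigma_t

    sigma_t = 0.5  # extinction coefficient of the homogeneous medium
    n = 200_000
    samples = [sample_free_flight(sigma_t, random.random()) for _ in range(n)]

    # The mean free path should be 1 / sigma_t = 2.0.
    print("empirical mean free path:", sum(samples) / n)

    # The fraction of samples beyond distance d should match transmittance exp(-sigma_t * d).
    d = 3.0
    print("empirical P(t > d):", sum(1 for t in samples if t > d) / n)
    print("analytic  T(d)    :", math.exp(-sigma_t * d))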

GPU architecture

  • Mesh Shading: Towards Greater Efficiency Of Geometry Processing (NVIDIA): NVIDIA presented mesh shading as the next major step in the evolution of the GPU. The presentation starts with a history of this evolution: fixed-function stages, followed by programmable vertex shaders, followed by compute shaders, and now mesh shaders. Two new shader stages that precede rasterization replace the old geometry stages: the task shader and the mesh shader (Turing GPUs only). The redesigned rendering pipeline comes with a new programming model that enables greater efficiency and control in geometry processing, opens up many new applications, and offloads much of the burden of LOD management from the CPU. The Vulkan & OpenGL CAD Mesh Shader Sample repo demonstrates the new programming model, and a meshlet-packing sketch appears below. NVIDIA’s Asteroids demo is a great example of LOD management taking place in task shaders rather than on the CPU. Adaptive GPU tessellation was demonstrated with the generation of terrain whose mesh was handled entirely by the GPU (repo). Keywords: adaptive GPU tessellation, meshlets, task and mesh shaders.
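
The meshlet in the keywords is the data structure at the heart of this new model: a small bundle of vertices and triangles that a single mesh-shader workgroup processes. Here is the meshlet-packing sketch mentioned above; it is a rough CPU-side illustration in Python (not NVIDIA’s sample code) that greedily packs an indexed triangle list into meshlets with caps on vertices and triangles, which is roughly the kind of preprocessing a mesh-shading pipeline performs before feeding geometry to the GPU.

    def build_meshlets(indices, max_verts=64, max_tris=126):
        # indices: flat triangle index list [a0, b0, c0, a1, b1, c1, ...].
        # Greedily pack consecutive triangles into meshlets, starting a new
        # meshlet whenever the vertex or triangle budget would be exceeded.
        # (64 vertices / 126 triangles are commonly cited meshlet budgets.)
        meshlets = []
        current = {"vertices": [], "triangles": []}
        vert_slot = {}  # global vertex index -> local slot in the current meshlet

        def flush():
            nonlocal current, vert_slot
            if current["triangles"]:
                meshlets.append(current)
            current = {"vertices": [], "triangles": []}
            vert_slot = {}

        for t in range(0, len(indices), 3):
            tri = indices[t:t + 3]
            new_verts = [v for v in tri if v not in vert_slot]
            if (len(current["vertices"]) + len(new_verts) > max_verts
                    or len(current["triangles"]) + 1 > max_tris):
                flush()
                new_verts = list(dict.fromkeys(tri))
            for v in new_verts:
                vert_slot[v] = len(current["vertices"])
                current["vertices"].append(v)
            current["triangles"].append(tuple(vert_slot[v] for v in tri))
        flush()
        return meshlets

    # A strip of 8 triangles over 10 vertices, with tiny budgets to force a split.
    strip = [0, 1, 2, 1, 3, 2, 2, 3, 4, 3, 5, 4, 4, 5, 6, 5, 7, 6, 6, 7, 8, 7, 9, 8]
    for i, m in enumerate(build_meshlets(strip, max_verts=6, max_tris=4)):
        print(f"meshlet {i}: {len(m['vertices'])} verts, {len(m['triangles'])} tris")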

Academic papers

Research papers across 32 different categories were presented; the Technical Papers Fast Forward session presents all of them briefly. I attended the Advanced Volume Rendering and Meshing sessions. I wish I had attended the Light Science (Optimal Multiple Importance Sampling, Kondapaneni, paper) and High Performance Rendering categories.

Advanced Volume Rendering

  • Volume Path Guiding Based on Zero-Variance Random Walk Theory (Herholz): This was a widely-cited paper; it was mentioned in a couple of other talks, including Matt Pharr’s Open Problems in Real-Time Rendering talk. Guided path tracing performs well in challenging scenes with poor lighting and participating media (dust in the air, in the example), with fewer speckles (“fireflies”) in the end result. The von Mises-Fisher probability distribution is at the core of the algorithm, and Matt Pharr confessed he was delighted by it (a small sampling sketch for it appears after this list). The author’s implementation will be made available soon; it can be requested at sebastian.herholz@gmail.com. Keywords: Guided and unguided path tracing, volumetric Monte Carlo, kd-tree spatial cache, von Mises-Fisher, guided Russian roulette and splitting.

  • A Null-Scattering Path Integral Formulation of Light Transport (Miller, paper): The slides of this talk contain a great introduction to path tracing through volumetric media (they are not public yet). They also include a beautiful render of light passing through a bunny-shaped smoke cloud. I have no more info. Keywords: Multiple importance sampling (MIS), scattering, transmittance, null-scattering path integrals, diffuse manifolds, color majorant, next event estimation.

  • Fractional Gaussian Fields for Modeling and Rendering of Spatially-Correlated Media (Guo, paper): No info. Keywords: Random fields, fractional integral operator, fractional Laplacian, autocovariance function, pink noise, extinction field, transport kernel, Open Shading Language.

  • Photon Surfaces for Robust, Unbiased Volumetric Density Estimation (Deng, paper): No info. Keywords: Photon mapping, participating media, 1D and 3D blur, photon plane.
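
Since the von Mises-Fisher distribution is the piece of the path guiding paper above that delighted Matt Pharr, here is the small sampling sketch promised there: a self-contained Python sampler for vMF directions on the sphere using the standard CDF inversion for the cosine around the mean direction. This is only my own illustration of the distribution, not the paper’s code.

    import math, random

    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

    def normalize(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)

    def sample_vmf(mu, kappa):
        # Draw a unit direction from a von Mises-Fisher distribution with mean
        # direction mu and concentration kappa (larger kappa = tighter lobe).
        u1, u2 = random.random(), random.random()
        # Invert the marginal CDF of w = cos(angle to mu), in a numerically stable form.
        w = 1.0 + math.log(u1 + (1.0 - u1) * math.exp(-2.0 * kappa)) / kappa
        phi = 2.0 * math.pi * u2  # uniform azimuth around mu
        s = math.sqrt(max(0.0, 1.0 - w * w))
        local = (s * math.cos(phi), s * math.sin(phi), w)
        # Rotate from the local frame (z axis = mu) into world space.
        a = (1.0, 0.0, 0.0) if abs(mu[0]) < 0.9 else (0.0, 1.0, 0.0)
        t = normalize(cross(a, mu))
        b = cross(mu, t)
        return tuple(local[0] * t[i] + local[1] * b[i] + local[2] * mu[i] for i in range(3))

    # Sanity check: the average of many samples should point roughly along mu.
    mu = normalize((1.0, 2.0, 0.5))
    acc = [0.0, 0.0, 0.0]
    for _ in range(50_000):
        d = sample_vmf(mu, kappa=20.0)
        acc = [acc[i] + d[i] for i in range(3)]
    print("mean direction of samples:", normalize(tuple(acc)))
    print("mu                       :", mu)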

Meshing

  • I noticed that nobody talks about the applications of meshing research and computational geometry in games and film; it is probably at the core of every game engine, though. The people who do talk about that are the folks in the 3D printing industry (Carbon Inc. gave a talk titled Computational Geometry and Software at Carbon) and those interested in simulation of physical phenomena.

  • Parametrization Quantization With Free Boundaries for Trimmed Quad Meshing (Lyon, paper): This technique produces smoother and nicer quad meshes. They said it’s suitable for building roof meshes in architectural visualization. Keywords: Quad meshes, singularities, quantized global parametrization, motorcycle graphs, patches, Dijkstra, T-junctions, integer grid maps. Image source: Paper.

  • TriWild: Robust Triangulation with Curve Constraints (Hu, paper, repo) and TetWild: Tetrahedral Meshing in the Wild (Hu, paper, repo): TriWild triangulates 2D images described by curved contours, achieving correct color diffusion and smoother boundaries. TetWild tetrahedralizes the interior of a 3D shape. Keywords: Triangulation, tetrahedralization, 2D, segment soups, linear and curved mesh generation, vertex smoothing, edge collapsing, envelope size. Image sources: Papers.

  • Finding Hexahedrizations for Small Quadrangulations of the Sphere (Verhetsel, paper): Hexahedrization is the subdivision of a volume into several hexahedra. I have no idea what it is used for, but it looks very cool. They have a website. Keywords: Hexahedrization, quad sections, tet meshes, quadrangulation. Image source: Paper.

  • Harmonic Triangulations (Alexa, paper): The interesting thing about this presentation was the way the author explained classical concepts of computational geometry (Delaunay triangulations, Voronoi diagrams, and convex hulls) and related them to one another. Keywords: Dirichlet energy, piecewise linear surfaces, Rippa’s theorem, bistellar flips, harmonic flips, pedal triangles and tetrahedra, harmonic flipping vs sliver exudation. Image source: Paper.

  • Navigating Intrinsic Triangulations (Sharp, Crane, paper): Mesh quality matters because geometry-processing algorithms may otherwise fail. This paper introduces a data structure called the signpost that allows a large set of existing mesh processing algorithms to succeed on poor-quality meshes. A very interesting aspect of the paper is that the triangles of the resulting triangulation are not necessarily planar (an intrinsic triangulation). This paper and others from CMU are the work of a group called the Geometry Collective. Keenan Crane has a great free book on Discrete Differential Geometry. A cotangent-weight refresher appears below. Keywords: Signpost data structure, triangle cotangent weights, geodesic paths, edge flips, Delaunay triangulation, intrinsic Delaunay refinement and vertex insertion, Steiner trees, adaptive mesh refinement. Image source: Paper.
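
Closing with the cotangent-weight refresher promised above: the triangle cotangent weights in the keywords are the standard ingredient of the discrete Laplace operator, w_ij = (cot(alpha) + cot(beta)) / 2 for the two angles opposite an interior edge, and one practical payoff of intrinsic Delaunay triangulations is that these weights stay non-negative. The Python sketch below is only a refresher on the formula, not code from the paper.

    import math

    def sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

    def cot_at(apex, i, j):
        # Cotangent of the angle at 'apex' in triangle (apex, i, j),
        # i.e. the angle opposite edge (i, j).
        u, v = sub(i, apex), sub(j, apex)
        return dot(u, v) / math.sqrt(dot(cross(u, v), cross(u, v)))

    def cotan_weight(vi, vj, opp_a, opp_b):
        # Weight of interior edge (vi, vj) whose two incident triangles have
        # third vertices opp_a and opp_b: w_ij = (cot(alpha) + cot(beta)) / 2.
        return 0.5 * (cot_at(opp_a, vi, vj) + cot_at(opp_b, vi, vj))

    # Example: the diagonal of a unit square split into two right triangles.
    vi, vj = (0.0, 0.0, 0.0), (1.0, 1.0, 0.0)        # the shared diagonal
    opp_a, opp_b = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)  # the two opposite corners
    print(cotan_weight(vi, vj, opp_a, opp_b))         # both angles are 90 deg -> weight 0.0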

Other