For this year's theme of "The Outer Limits," I was inspired to explore the small-scale end of the outer limits of our world. This photograph is one of several from Nikon's annual Photomicrography Competition that use a polarization microscope to produce an image. This type of microscope places a linear polarizer above and below the specimen (in crossed orientations) to filter the light passing through from below. The diatoms are composed of a translucent, birefringent material, which means they act as waveplates, or phase shifters. The bright colors are a result of the phase shifts incurred while passing through the diatoms, and are a function of the diatoms' thickness, orientation, and "birefringence" (the difference in index of refraction between the fast and slow transmission axes).
I decided to use dirt to simulate diatoms viewed through a simple polarization microscope.
These are the major features I implemented in my renderer.
Since rendering can be a rather slow process, parallelism was the first modification I made to dirt. First, I break the image into 16x16 "render blocks" (in pixel coordinates) and save them in a work list. I then detect the number of concurrent threads supported by the hardware and start that many worker threads. The threads continually pop render blocks off the work list until it is empty.
The number of threads can also be set via the command-line flag `-t / --nthreads`.
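The tile scheduler described above can be sketched in Python (dirt itself is C++; `render_block`, the dimensions, and the function names here are illustrative stand-ins, not dirt's actual API):

```python
import os
import threading
from queue import Queue, Empty

def make_blocks(width, height, block_size=16):
    """Split the image into block_size x block_size tiles (pixel coordinates)."""
    blocks = Queue()
    for y in range(0, height, block_size):
        for x in range(0, width, block_size):
            blocks.put((x, y, min(block_size, width - x), min(block_size, height - y)))
    return blocks

def render_parallel(width, height, render_block, nthreads=None):
    """Workers pop tiles off the shared work list until it is empty."""
    blocks = make_blocks(width, height, 16)
    nthreads = nthreads or os.cpu_count() or 1

    def worker():
        while True:
            try:
                block = blocks.get_nowait()
            except Empty:
                return  # work list exhausted; this thread is done
            render_block(*block)

    threads = [threading.Thread(target=worker) for _ in range(nthreads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

Using a thread-safe queue as the work list means no tile is rendered twice and threads stay busy until the frame is complete, regardless of how uneven the per-tile cost is.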
| Threads | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
|---|---|---|---|---|---|---|---|---|---|---|
| Time (s) | 36.78 | 24.16 | 21.17 | 20.16 | 19.14 | 18.14 | 17.13 | 16.13 | 16.11 | 16.11 |
| Block size | 1x1 | 2x2 | 4x4 | 8x8 | 16x16 | 32x32 | 100x100 |
|---|---|---|---|---|---|---|---|
| Time (s) | 16.61 | 16.64 | 16.61 | 16.64 | 16.63 | 16.62 | 17.13 |
I used Worley noise to create interesting background textures that look like organic cells. I followed the implementation here, which uses a 3x3 for loop to look up the feature points in the local neighborhood of cells and compute their distances to the shading point.
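A minimal 2-D sketch of the idea (the linked implementation differs in details such as the hash and distance metric):

```python
import math
import random

def feature_point(ix, iy, seed=0):
    """Deterministic pseudo-random feature point inside integer cell (ix, iy)."""
    rng = random.Random((ix * 73856093) ^ (iy * 19349663) ^ seed)
    return ix + rng.random(), iy + rng.random()

def worley(x, y, seed=0):
    """Distance from (x, y) to the nearest feature point in the 3x3 cell neighborhood."""
    ix, iy = math.floor(x), math.floor(y)
    nearest = math.inf
    for dy in (-1, 0, 1):          # the 3x3 loop over neighboring cells
        for dx in (-1, 0, 1):
            fx, fy = feature_point(ix + dx, iy + dy, seed)
            nearest = min(nearest, math.hypot(x - fx, y - fy))
    return nearest
```

Thresholding or color-mapping this distance field gives the cell-wall look: points near a feature point are dark "cell interiors," and ridges of large distance form the boundaries between cells.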
While there are many complex elements to a typical microscope, I decided to create a simple setup where light passes through a linear polarizer, the specimen, a crossed linear polarizer, an objective lens, and lands on the image plane. As in a real microscope, the image of the specimen is reversed horizontally and vertically on the image plane.
This diagram illustrates my microscope setup.
For the objective lens, I use a glass spherical cap. I implemented a spherical cap surface primitive that can be intersected with rays. One known issue of using lenses with spherical surfaces is spherical aberration, which means that light passing through the lens does not converge to a perfect focal point. To combat this, I followed the recommendations here and here, which suggest both using an aperture to cut off rays on the fringes of the lens that are highly convergent and moving the lens closer to the focal point than the lensmaker's equation would suggest.
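For reference, the lensmaker's equation that this correction deviates from is simple to state; in the thin-lens approximation, a spherical-cap lens is roughly plano-convex, which corresponds to one radius going to infinity (a sketch, not dirt's code):

```python
import math

def thin_lens_focal_length(n, r1, r2=math.inf):
    """Lensmaker's equation in the thin-lens approximation:
    1/f = (n - 1) * (1/r1 - 1/r2), with signed radii of curvature.
    r2 = inf models the flat side of a plano-convex (spherical cap) lens."""
    inv_r2 = 0.0 if math.isinf(r2) else 1.0 / r2
    return 1.0 / ((n - 1.0) * (1.0 / r1 - inv_r2))
```

For glass with n = 1.5 and a cap of radius 1, this predicts f = 2; the spherical-aberration correction then amounts to placing the lens somewhat closer to the focal point than this f.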
For the microscope camera, I created a camera similar to an orthographic camera. To generate rays, this microscope camera produces a ray origin within a pixel of the image plane. Unlike an orthographic camera, the ray direction is the direction from the ray origin to some focal point within the scene, which in our case is the focal point of the objective lens. In theory, rays should pass through the focal point, hit the objective lens, and become collimated. However, due to spherical aberration, this does not work out perfectly and the image becomes distorted toward the edges of the lens.
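Ray generation for this camera might look like the following sketch (the names and axis convention are illustrative, not dirt's actual interface):

```python
import math
import random

def microscope_ray(px, py, focal_point, image_plane_z, pixel_size, rng):
    """Sample a ray origin inside pixel (px, py) on the image plane, then aim
    the ray at the focal point of the objective lens."""
    ox = (px + rng.random()) * pixel_size
    oy = (py + rng.random()) * pixel_size
    origin = (ox, oy, image_plane_z)
    to_focus = tuple(f - o for f, o in zip(focal_point, origin))
    norm = math.sqrt(sum(c * c for c in to_focus))
    direction = tuple(c / norm for c in to_focus)
    return origin, direction
```

Every ray passes through the shared focal point rather than traveling parallel to the view axis, which is the only difference from an orthographic camera's ray generation.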
(Image grid: columns d = 3, 6, 9; rows: no lens / with lens / with lens + distance correction)
Fig 4. Some tests showing how the camera and lens work together. In all images, the image plane is located a focal distance (d = 2) away from the focal point. The distance of the "specimen" (checker texture) from the focal point is above each column. We can see that in all images, the texture is flipped horizontally and vertically, as we would expect.
Row 1: no lens. As the specimen moves away from the focal point, it grows smaller. Row 2: a hemispherical lens is located at its focal distance (d = 2). The rays toward the edges of the sphere are refracted too strongly, causing them to converge on the opposite side of the lens. Row 3: a hemispherical lens is located closer to the focal point (d = 1). The specimen is less distorted and barely shrinks at long distances.
Instead of shooting rays of light with RGB colors through a scene, spectral rendering shoots rays of light that each carry one wavelength. This allows us to incorporate realistic wavelength-dependent effects such as dispersion and phase shifts. In our path tracer, we pick a wavelength with a uniform pdf between 400nm and 700nm to shoot into the scene, which adds one dimension to the Monte Carlo estimate of radiance. This extra dimension manifests as color noise at low sample counts.
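The sampling step can be sketched as follows (the hypothetical `trace` stands in for tracing a single wavelength through the scene):

```python
import random

LAMBDA_MIN, LAMBDA_MAX = 400.0, 700.0  # visible range used by the path tracer (nm)

def sample_wavelength(rng):
    """Uniform wavelength sample and its pdf (1 / 300 per nm)."""
    lam = LAMBDA_MIN + (LAMBDA_MAX - LAMBDA_MIN) * rng.random()
    return lam, 1.0 / (LAMBDA_MAX - LAMBDA_MIN)

def estimate_radiance(trace, nsamples, rng):
    """Monte Carlo estimate of wavelength-integrated radiance:
    average trace(lam) / pdf over uniformly sampled wavelengths."""
    total = 0.0
    for _ in range(nsamples):
        lam, pdf = sample_wavelength(rng)
        total += trace(lam) / pdf
    return total / nsamples
```

Because each sample carries only one wavelength, neighboring pixels see different wavelengths at low sample counts, which is exactly the color noise mentioned above.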
To compare the results of my spectral path tracer with our old RGB path tracer, I ported PBRT's spectral upsampling scheme for RGB values to use on diffuse surfaces and light sources. To convert spectra to XYZ colors, I implemented the analytic functions from this paper.
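The analytic functions in question approximate the CIE 1931 color matching functions as sums of piecewise Gaussian lobes. A sketch of that form (the coefficients below follow my reading of that paper's multi-lobe fit and should be double-checked against it):

```python
import math

def _g(x, mu, s1, s2):
    """Piecewise Gaussian lobe with different widths on each side of the peak."""
    s = s1 if x < mu else s2
    return math.exp(-0.5 * ((x - mu) / s) ** 2)

def xyz_bar(lam):
    """Analytic approximation of the CIE 1931 color matching functions (lam in nm)."""
    x = (1.056 * _g(lam, 599.8, 37.9, 31.0)
         + 0.362 * _g(lam, 442.0, 16.0, 26.7)
         - 0.065 * _g(lam, 501.1, 20.4, 26.2))
    y = 0.821 * _g(lam, 568.8, 46.9, 40.5) + 0.286 * _g(lam, 530.9, 16.3, 31.1)
    z = 1.217 * _g(lam, 437.0, 11.8, 36.0) + 0.681 * _g(lam, 459.0, 26.0, 13.8)
    return x, y, z
```

Integrating a sampled spectrum against these three curves (and normalizing) yields the XYZ tristimulus values, which can then be converted to RGB for display.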
(Renders: RGB / spectral with constant IOR / spectral with wavelength-dependent IOR)
I also implemented blackbody lights to use in my final render. Blackbodies are idealized bodies that absorb all incident radiation and emit thermal radiation diffusely, e.g. stars and human bodies. Planck's law provides a simple formula for a blackbody's emitted radiance as a function of wavelength and temperature.
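Planck's law is straightforward to implement directly (a sketch; units are SI, with the wavelength argument in nanometers for convenience):

```python
import math

H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m / s)
KB = 1.380649e-23    # Boltzmann constant (J / K)

def planck(lam_nm, temperature):
    """Spectral radiance of a blackbody in W * sr^-1 * m^-3:
    B(lam, T) = (2 h c^2 / lam^5) / (exp(h c / (lam k T)) - 1)."""
    lam = lam_nm * 1e-9  # nm -> m
    return (2.0 * H * C * C / lam ** 5) / math.expm1(H * C / (lam * KB * temperature))
```

Evaluating this at the sampled wavelength gives each light's emitted radiance per path; hotter blackbodies are both brighter overall and peak at shorter (bluer) wavelengths.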
The bulk of what I implemented for my project is realistic polarization. Polarization is a property of light that describes how its waveform moves as it progresses along the light propagation direction. On a macroscopic scale, light may be linearly polarized (oscillating back and forth in a single plane), circularly polarized (drawing a circle as the waveform moves through the origin), or elliptically polarized (the general case). Bulk light may also be partially polarized or unpolarized, since it is composed of many individual photons with unique properties.
I generally followed this recent SIGGRAPH course for my implementation. I chose to represent polarized light in my path tracer using Mueller calculus, which can account for fully polarized, partially polarized, or unpolarized light, as opposed to Jones calculus, which can only account for fully polarized light. In brief, a Stokes 4-vector represents the polarization state of a ray of light, including its total radiance (S0) and its relative responses to a horizontal vs. vertical polarizer (S1), a +45° vs. –45° polarizer (S2), and a right-hand-circular vs. left-hand-circular polarizer (S3). A Mueller matrix represents an optical transformation of a Stokes vector, such as light passing through a linear polarizer. To represent these, I created two new data structures, StokesState and MuellerTransform. In addition to the Stokes vector and Mueller matrix, these two structures also hold the reference bases necessary to transform, add, and multiply Stokes vectors. Keeping track of these reference bases and transforming them correctly was a challenge.
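As a small illustration of the calculus (ignoring the reference-basis bookkeeping, since everything here shares one frame), this sketch builds the Mueller matrix of an ideal linear polarizer and applies it to a Stokes vector:

```python
import math

def linear_polarizer(theta):
    """Mueller matrix of an ideal linear polarizer with its transmission
    axis at angle theta from horizontal."""
    c, s = math.cos(2.0 * theta), math.sin(2.0 * theta)
    return [[0.5 * v for v in row] for row in
            [[1, c,     s,     0],
             [c, c * c, c * s, 0],
             [s, c * s, s * s, 0],
             [0, 0,     0,     0]]]

def apply(mueller, stokes):
    """Transform a Stokes vector by a Mueller matrix (matrix-vector product)."""
    return [sum(m * s for m, s in zip(row, stokes)) for row in mueller]
```

Unpolarized light (1, 0, 0, 0) through a horizontal polarizer loses half its radiance and comes out fully horizontally polarized; a second polarizer at angle theta then attenuates it by cos² theta (Malus's law), so crossed polarizers extinguish it entirely.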
(Renders: unpolarized light through a linear polarizer / polarized light through a linear polarizer / unpolarized light through crossed linear polarizers)
(Visualizations: air-to-water Mueller matrices / water-to-air Mueller matrices)
(Renders: horizontal polarizer / +45° polarizer / vertical polarizer / –45° polarizer / horizontal + vertical crossed polarizers)
(Visualizations: rendered image / degree of polarization / orientation of linear polarization)
Note that red and green in the bottom row are swapped relative to the visualizations in the paper, but follow the paper's text instructions. In Sec. 5.3, the authors say to map negative S1 (vertical polarization) to red and positive S1 (horizontal polarization) to green; however, this disagrees with their Fig. 5 and Fig. 6, where horizontal polarization is red and vertical polarization is green.
(Visualizations: rendered image / degree of polarization / orientation of linear polarization)
As previously mentioned, there is another framework for polarization called Jones calculus. I tested my Mueller waveplate calculations against a Jones matrix formulation for validation.
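A sketch of that cross-check for the simplest case (fast axis horizontal, everything in one shared frame, and using the Stokes convention S3 = 2 Im(Ex* Ey)):

```python
import cmath
import math

def retarder_mueller(delta):
    """Mueller matrix of a linear retarder with horizontal fast axis and
    retardance delta: a rotation by delta in the S2-S3 plane."""
    c, s = math.cos(delta), math.sin(delta)
    return [[1, 0, 0,  0],
            [0, 1, 0,  0],
            [0, 0, c, -s],
            [0, 0, s,  c]]

def retarder_jones(delta):
    """Equivalent Jones matrix (global phase dropped): delay the y component."""
    return [[1.0, 0.0], [0.0, cmath.exp(1j * delta)]]

def jones_to_stokes(ex, ey):
    """Convert a Jones vector to a Stokes vector."""
    cross = ex.conjugate() * ey
    return [abs(ex) ** 2 + abs(ey) ** 2,
            abs(ex) ** 2 - abs(ey) ** 2,
            2.0 * cross.real,
            2.0 * cross.imag]

def mat_vec(m, v):
    return [sum(a * b for a, b in zip(row, v)) for row in m]
```

Feeding +45° linearly polarized light, Stokes (1, 0, 1, 0), through both formulations should produce identical Stokes vectors for every retardance; at delta = π/2 (a quarter-wave plate) both yield circularly polarized light.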
(Waveplate renders at thickness t = 40000, 50000, 60000, 70000, 80000 nm)