Rendering star fields in 3D

November 4, 2023

Pictures of stars often have these pretty lens-flare type patterns. This post will go into why they look the way they do, and how to recreate this effect in realtime 3D rendering. The short answer is diffraction spikes and point spread functions.

JWST star field showing several stars with familiar 6-pointed spokes. There is also an orange nebula visible.
NASA, ESA, CSA. Image Processing: Joseph DePasquale (STScI); NASA's James Webb Space Telescope on Flickr
JWST star field showing a few stars with more striking 6-pointed spokes. There is also a long streaking nebula visible.
ESA/Webb, NASA, CSA, T. Ray (Dublin); NASA's James Webb Space Telescope on Flickr
Hubble star field showing many different stars in reds and blues, showing 4-pointed spokes.
Core of Omega Centauri; NASA, ESA, and the Hubble SM4 ERO Team; NASA Website

Except for Sol, all stars visible in the sky are so far away that they are generally much smaller than a pixel in size. It seems like they should be only a single bright pixel in the image, but instead we get these patterns.

The patterns are caused by diffraction: because light behaves like a wave, it spreads out as it passes through different parts of a telescope. This spreading distributes energy in a pattern that's different for every optical system, and is described by the point spread function (PSF).

This effect also happens in cameras, microscopes, human eyes, and all other optical systems. The spreading pattern varies slightly with the wavelength of light, which causes subtle chromatic aberration in color images. It's related to phenomena like bloom, lens flare, and glare, which have their own techniques for simulating in computer graphics.

For a circular aperture pinhole camera, the pattern you get is called the Airy disk, and it looks like this:

Diagram showing concentric circles decreasing in brightness further out.
Airy disk pattern for a 9 mm wide pinhole, showing a point source at 610nm (red). An arcsecond is 1/60th of an arcminute, which is 1/60th of a degree. The result is displayed in a log scale so that more of the detail is visible - in a real image you would only ever see tiny parts of this.

Non-circular apertures cause diffraction spikes, which are lines that emanate from the light source. For a hexagon shaped aperture, the pattern looks like this:

Diagram showing 6 spikes emanating from a central bright spot
There are lines going out in 6 directions. These lines are perpendicular to the sides of the hexagon forming the aperture. The lines are dotted. There are also small circular shapes spread out all around.

If the hexagon pattern looks familiar, it should: both the James Webb Space Telescope and many consumer cameras have hexagonal diffraction spikes.

The number of spikes corresponds to the number of edges in the shape forming the aperture. In general there are twice as many spikes as edges, but spikes can overlap each other: a hexagon produces 12 spikes, but because opposite edges are parallel, they overlap in pairs and only 6 are visible. A pentagon, which has no parallel edges, shows all 10 of its spikes.

The pattern is essentially the squared magnitude of the 2D Fourier transform of the aperture shape (and anything else obstructing the light). The Fourier transform of a hard edge is a sum of an infinite series of waves, which in a 2D Fourier transform shows up as spikes radiating from the center of the image.
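You can see this with a quick numerical experiment. Here's a minimal numpy sketch (a toy model with arbitrary sizes, ignoring wavelength and the rest of the optics): build a binary hexagon mask, take its 2D FFT, and look at the squared magnitude on a log scale:

import numpy as np

n = 512
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]

# A regular hexagon is the intersection of 3 slabs rotated 60 degrees apart.
hexagon = np.ones((n, n), dtype=bool)
for a in np.deg2rad([0, 60, 120]):
    hexagon &= np.abs(x * np.cos(a) + y * np.sin(a)) < 0.3

# Far-field diffraction pattern is proportional to |FFT of the aperture|^2.
pattern = np.abs(np.fft.fftshift(np.fft.fft2(hexagon))) ** 2
log_pattern = np.log10(pattern + 1e-12)  # log scale makes the faint spikes visible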

Imperfect or dirty lenses and mirrors cause additional distortions. This shows up as an uneven and asymmetrical pattern.

Some consumer cameras have apertures formed by a set of 6 curved blades, which move to make the aperture larger or smaller, controlling how much light the camera takes in. This produces a slightly different shape than a straight-edged hexagonal aperture.

Similar to the hexagon aperture, but the spikes get wider as they go out.
An aperture defined using 6 circles overlapped with slight offsets from each other. This is a simplified model, a real camera has lenses.

The James Webb Space Telescope has an array of hexagonal mirror segments. These mirrors on their own create a more complex pattern, and the telescope's 3 support struts affect the pattern as well.

Image from Wikipedia; Wikimedia Commons

The Hubble telescope's pattern is close to an Airy disk, but with 4 diffraction spikes caused by its internal support struts.

Recreating the effect

The most common approach for rendering stars is to have a single image of a twinkling star, and scale it up or down depending on the star's brightness. Sometimes more than one image is used, blended together. While this does work, it doesn't produce as visually interesting an image, in my opinion.

I recently made my own take on this in the form of a Godot addon for rendering stars. It's available here: https://github.com/tiffany352/godot-starlight

Near field: Earth-like planet, orbiting a star in the Hyades cluster. Mid field: 330,000 stars from the Gaia Catalog of Nearby Stars being rendered in real time. Deep field: Milky Way skybox based on Gaia data.

Stars are rendered as billboards (a square that always faces the camera) with a fixed size. This fixed size is very large, covering about 90 degrees of field of view.

Outside the shader, each star's RGB color is calculated from its effective temperature, using black body radiation. This color varies through red, orange, yellow, white, then blue. I found this function and converted it to GDScript for this.
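As an illustration, here's one widely used curve fit for black body color (Tanner Helland's approximation), sketched in Python; the actual GDScript version may differ:

import math

def blackbody_rgb(kelvin):
    # Curve fit valid roughly from 1000 K to 40000 K.
    t = kelvin / 100.0
    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * (t - 60) ** -0.1332047592
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    # Clamp to [0, 255] and normalize to [0, 1] for use as a color.
    clamp = lambda v: min(max(v, 0.0), 255.0) / 255.0
    return (clamp(r), clamp(g), clamp(b))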

Inside the vertex shader, these steps are performed:

  1. Find the distance from the camera to the star, in light years.
  2. Calculate how much the star should be dimmed based on the inverse square law.
  3. Multiply the star's brightness, measured in multiples of that of Sol, by the dimming factor.
  4. Multiply this with the color of the star. This final color is passed to the fragment shader.

In the fragment shader, the PSF texture is sampled and multiplied by the color from the previous step.
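Put together, the per-star math is small. Here's a sketch of it in Python (the real version runs in the shaders; the names are illustrative):

import numpy as np

def star_rgb(star_pos_ly, cam_pos_ly, luminosity_sol, base_rgb):
    # 1. Distance from the camera to the star, in light years.
    d = np.linalg.norm(np.asarray(star_pos_ly) - np.asarray(cam_pos_ly))
    # 2. Dimming factor from the inverse square law.
    dimming = 1.0 / (d * d)
    # 3-4. Brightness in multiples of Sol, dimmed, then tinted by the star's
    # black body color. The fragment shader multiplies this by the PSF texel.
    return np.asarray(base_rgb) * luminosity_sol * dimming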

Drawing the full PSF texture for every star would cause far too much overdraw. So the PSF texture is cropped based on how bright the star is; far-away stars only use a tiny portion of the texture.

I found that diffraction spikes decrease in brightness following the inverse square law with distance from the center of the PSF texture, so by scaling the brightness and taking its square root I can find the size the texture should be cropped to. I found the constants for this by trial and error.
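A sketch of that crop computation (the scale constant here is a placeholder, not the addon's actual tuned value):

import math

def psf_crop_fraction(brightness, scale=0.05):
    # Spikes fall off roughly as 1/r^2 from the PSF center, so the radius at
    # which they drop below a visibility threshold grows as the square root
    # of brightness. Clamp to 1.0, i.e. the full texture.
    return min(scale * math.sqrt(brightness), 1.0)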

Creating PSF textures

There are two paths for this: optical simulation, and hand-authoring. Early on when I was experimenting with this, I tried extracting PSF textures out of photos, but this doesn't work: PSF textures need way more dynamic range than camera sensors can capture.

PSF textures for some telescopes can be obtained in the FITS image format from various astronomy-related sources. FITS is an image format used in astronomy and some other fields, and it's not readily ingested by game engines. I used a Python script to convert from FITS to OpenEXR, which Godot can import. So far all the PSF textures I've found online have been for telescopes and microscopes, none for cameras.
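The conversion can be only a few lines. A sketch using astropy and imageio (filenames are placeholders, and writing EXR through imageio requires an EXR-capable backend, such as its FreeImage plugin):

import numpy as np
from astropy.io import fits
import imageio.v3 as iio

# Read the PSF image out of the FITS file's primary HDU.
psf = fits.getdata("psf.fits").astype(np.float32)

# OpenEXR stores floating point pixels, preserving the huge dynamic range.
iio.imwrite("psf.exr", psf)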

These images are extremely high dynamic range. Like 30 exposure stops. You will either need to store them as floating point textures, which are quite costly in terms of VRAM, or do a transform to make them fit in 24-bit color.

This transform involves multiplying every pixel by the square of its distance from the center of the texture, then reversing the transform in the shader. This makes the diffraction spikes roughly constant brightness, while the Airy disk falls off more or less linearly. You still lose a lot of color information if you store this as 24-bit color, but it's at least usable.
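A minimal numpy sketch of baking that transform, assuming a square single-channel PSF (the shader divides the sampled value by the same r² to undo it):

import numpy as np

def bake_falloff(psf):
    # psf: 2D float array with the point source at the center.
    h, w = psf.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r_sq = (xx - w / 2.0) ** 2 + (yy - h / 2.0) ** 2
    # Multiply by squared distance from the center. Spikes that fall off as
    # ~1/r^2 become roughly constant, so they survive 8 bits per channel.
    # (A real bake would special-case the center pixel, where r is 0.)
    return psf * r_sq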

Here's an example with the JWST PSF, with both the inverse square transform and a square root applied to every pixel value (in the range [0, 1]):

The JWST PSF with the inverse square and square root transforms applied.

For hand authoring, this transform is necessary. Otherwise, the PSF textures have far too much dynamic range to edit by hand.

I authored this fairly normal-looking pixel art texture using Paint.NET:

The hand-authored pixel art PSF texture.

Some tips for making your own:

  1. The diffraction spikes should be constant brightness, because the inverse square falloff will be applied when processing the texture.
  2. I added chromatic aberration in several places based on what I've seen in real PSF textures, and because it looks cool. Add more than you think you'll want, as a lot of it will be blown out to white when rendered.
  3. The rings around the outside of the central bright area are meant to simulate some very faint lines that can be seen in the JWST PSF.
  4. There is a gentle gradient that goes from dark grey to pure black. Many real PSFs have a similar glow, which falls off faster than the inverse square law. I did this using a simple radial gradient.

I edited the shader to apply the inverse square falloff at runtime instead of baking it into the texture. I also added a pixel art antialiasing step. The result was this:

To the right of the brightest star, the Hyades cluster is visible roughly 100 lightyears away.

I really like the variation in the shapes of stars that this approach adds. The shape smoothly changes as you get closer to a star, which feels very natural.

Generating PSFs using Poppy


Poppy is a Python library for simulating PSFs of optical systems. It's mainly targeted at optics for telescopes, but it can also be used for simulating optics of microscopes, cameras, and other devices. Several of the images I shared in this post were generated using it.

I have a Jupyter notebook here which contains several of the examples: https://github.com/tiffany352/godot-starlight/blob/main/docs/poppy psfs.ipynb
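For reference, a minimal Poppy run looks roughly like this (the aperture size, pixel scale, and wavelength are arbitrary example values):

import poppy

osys = poppy.OpticalSystem()
# Hexagonal aperture with 1 cm sides (arbitrary).
osys.add_pupil(poppy.HexagonAperture(side=0.01))
osys.add_detector(pixelscale=0.01, fov_arcsec=10.0)

# Monochromatic PSF at 610 nm; the result is an astropy FITS HDUList.
psf = osys.calc_psf(610e-9)
poppy.display_psf(psf)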

Using real star catalog data

Star catalog data can add a lot of believability and visual interest to your renders, which can be hard to get with procedurally generated stars. The data is openly accessible online and can be processed using astronomy tools.

In the screenshots I showed before, I used the Gaia Catalog of Nearby Stars (GCNS). GCNS is a slice of the Gaia catalog containing stars within 100 parsecs (326 lightyears), of which there are about 330,000. The authors estimate this is 92% of all stars inside this sphere. Around 1000 stars were excluded for being too bright (such as Sirius and Alpha Cen A), while others were too dim to have been detected.

Gaia is a space telescope from ESA which scans the entire sky. Its data was used to produce a catalog of 1.5 billion stars, with parallax and spectral information. There's also a free app called Gaia Sky which you can use to visualize it.

I used GCNS because it contains all of the stars near the Hyades cluster, even very dim ones.

Another good option is the AT-HYG database. It contains 2.5 million stars with a lot of useful information about each one. It's also conveniently distributed as a CSV file.

For use with star rendering, any catalog you choose needs to have at least three things:

  • Spectra: Effective temperature, or some metric that can be used to calculate it, such as BP/RP spectrographic data.
  • Magnitude: You'll need absolute magnitude, or something it can be calculated from, such as apparent magnitude plus distance.
  • Distance: Distance, or parallax from which it can be calculated. You'll need it if you plan to render stars from any vantage point other than Earth.

Typically the data for these catalogs comes in some format like FITS tables, VOTable, or another astronomy-oriented format. It usually contains a lot of information that you don't really need, and the information you do need often needs to be extracted from other fields. So it makes sense to process this data before importing it into a game engine.
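As an example of that extraction: distance hides inside the parallax column, and Cartesian positions hide inside RA/Dec plus distance. A sketch with astropy (the column values here are made up):

import astropy.units as u
from astropy.coordinates import SkyCoord

# Gaia reports parallax in milliarcseconds; distance in parsecs is its inverse.
parallax_mas = 7.5
dist_pc = 1000.0 / parallax_mas

coord = SkyCoord(ra=45.0 * u.deg, dec=-16.0 * u.deg, distance=dist_pc * u.pc)
# Cartesian position in light years, ready for a game engine.
x, y, z = coord.cartesian.xyz.to(u.lyr).value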

To see how I processed the GCNS data, check out this Jupyter notebook. (Download)

magnitude,temperature,x,y,z
11.61,3468.0,11.27021,7.54892,-66.66381
6.08,4797.0,4.1014,8.09092,-36.2716
5.91,5315.0,-46.21263,84.37781,14.272
... 330,000 more entries ...
The result is a nice CSV file. Download full CSV

I loaded the data into Godot using a custom import plugin. There are some quirks here, mainly that you can't use the .csv extension, because Godot will try to create a translation table from it; even if you try to change the import mode away from translation table, Godot will just crash. So you need to use .txt or similar instead. The import plugin I used creates a PackedArray out of each column of the CSV, for small size on disk and quick loading.

At runtime I calculate luminosity from absolute magnitude using the formula pow(10.0, (4.74 - magnitude) / 2.5), where 4.74 is the Sun's absolute bolometric magnitude, and then pass the data to the starlight addon.

\log_{10} L = \frac{4.74 - M}{2.5}

I also added some code that only renders stars with an apparent magnitude of less than 20. This drops the number of rendered stars from 330,000 down to 29,000. If you want to simulate stars visible to the naked eye, you'll use a cutoff more like 6.5 and end up with something like 10,000 stars (depending on where you place the viewer).

The vantage point I use isn't Earth, so I have to recalculate the apparent magnitude from the absolute magnitude. This can be done by adding log10(dist) * 5 - 5 to the absolute magnitude, where the distance is in parsecs. Or to put it in math, where m is apparent magnitude, M is absolute magnitude, and d_pc is distance in parsecs:

m = M + 5 \log_{10} d_{pc} - 5
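A sketch of both the recalculation and the cutoff (the star records here are made up; 20 is the limit from the text):

import math

def apparent_magnitude(abs_mag, dist_pc):
    # Distance modulus: m = M + 5*log10(d_pc) - 5.
    return abs_mag + 5.0 * math.log10(dist_pc) - 5.0

stars = [{"abs_mag": 11.61, "dist_pc": 20.7}, {"abs_mag": 6.08, "dist_pc": 11.4}]
# Cull stars too faint to contribute any visible pixels.
visible = [s for s in stars
           if apparent_magnitude(s["abs_mag"], s["dist_pc"]) < 20.0]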

Side note about redshift

When I first read about redshift it seemed like it would be really hard to render accurately, because game engines don't do multispectral rendering. It turns out, though, that there's a neat solution for the case of black body emission: a redshifted black body spectrum is still a black body spectrum, just with a shifted temperature:

T_{recv} = \frac{T_{emit}}{1 + z}

Where z is the redshift value, which can be calculated with this formula:

1 + z = \frac{1 + \frac{v}{c} \cos \theta_r}{\sqrt{1 - \frac{v^2}{c^2}}}

Where v is the velocity of the star relative to the observer, c is the speed of light, and θ_r is the angle of the velocity vector. An angle of 0° means the star is moving away, ±90° means it is moving sideways, and 180° means it is moving towards the observer.

In a game engine context, you can replace the cos θ_r term with a dot product of the velocity vector and the direction of the star relative to the observer: v · p.

// v is in c units, so a unit vector is traveling at exactly the speed of light.
float redshift_temperature(float temp, vec3 vec, vec3 p) {
	// negate so that the input is the observer's velocity relative to the star
	// instead of the other way around.
	vec = -vec;
	// since v will be immediately squared anyway, use the squared length instead.
	float v_sq = dot(vec, vec);
	// combines the v/c and cos terms into one by not normalizing the input vector.
	float v_cos_theta = dot(vec, normalize(p));
	// c in these units is equal to 1, so c^2 cancels out.
	return temp / (1.0 + v_cos_theta) * sqrt(1.0 - v_sq);
}
Moving forwards and to the right at 0.9c. Everything on the left is redshifted, everything on the right is blueshifted.