
Temporal Anti-Aliasing #3663

@superdump

Description


Aliasing occurs in an image (i.e. spatial aliasing) when a low-resolution sampling is taken of a higher-resolution representation.

The better-known aliasing along triangle edges, produced as they are sampled during rasterisation, is addressed by MSAA (Multi-Sample Anti-Aliasing).

As texture detail has increased, a similar problem has arisen with, let's say, the contents of triangle surfaces: a high-resolution texture on a distant model is sampled at a much lower resolution in order to shade an individual screen fragment. This is addressed by mip mapping, which involves sampling from appropriately-filtered, downscaled versions of the full-resolution texture, so that when the model is far away, samples are taken from a lower-resolution level. Filtering such as trilinear and anisotropic filtering further shapes how those mip levels are sampled.
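
The mip selection described above can be sketched as follows. This is an illustrative scalar version of the standard calculation a GPU performs per fragment (the function and parameter names are assumptions, not a real API): the screen-space derivatives of the texture coordinates, measured in texels, determine how many texels one fragment covers, and the log2 of that footprint picks the mip level.

```rust
// Sketch of per-fragment mip level selection (illustrative, not a real API).
// `ddx`/`ddy` are the texture-coordinate derivatives between horizontally
// and vertically adjacent fragments, expressed in texels.
fn mip_level(ddx: (f32, f32), ddy: (f32, f32)) -> f32 {
    let len_x = (ddx.0 * ddx.0 + ddx.1 * ddx.1).sqrt();
    let len_y = (ddy.0 * ddy.0 + ddy.1 * ddy.1).sqrt();
    // The larger footprint picks the mip: level 0 is the full-resolution
    // texture, and each +1 halves the resolution.
    len_x.max(len_y).log2().max(0.0)
}

fn main() {
    // Distant model: one fragment spans 8 texels, so mip level 3 is used.
    assert!((mip_level((8.0, 0.0), (0.0, 8.0)) - 3.0).abs() < 1e-5);
    // Close-up model: roughly one texel per fragment samples mip level 0.
    assert_eq!(mip_level((1.0, 0.0), (0.0, 1.0)), 0.0);
    println!("ok");
}
```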

As part of physically-based lighting models, specular highlights, where light reflects off a shiny surface viewed from a specific angle, also cause aliasing: the bright spot fills an entire screen fragment where it would perhaps have been a tiny speck in the analogue world.

Modern renderers solve this in varying ways, but Temporal Anti-Aliasing has emerged as the leading solution in the game industry. It accumulates samples over multiple frames (as the 'temporal' in its name suggests), matching up the samples for the current frame with their positions in the previous frame, and, in my opinion very interestingly, jittering (moving around) the camera position. The camera jitter means that even when the camera is 'still' it is technically not, so the technique can still address the aliasing it aims to remove. It also has a useful side effect: instead of sampling the scene only at the positions directly in line with the screen fragments, the technique samples at many positions close to and around them. It therefore gathers more information than 'basic' non-jittered rendering, so Temporal Anti-Aliasing techniques can also be used as a sort of spatio-temporal super-sampling, which allows for upscaling as part of the process.

TAA's temporal filtering also enables a number of sparse sampling techniques that would look noisy without it: the mathematics of blue noise and low-discrepancy sequences, applied to the sampling patterns, produce excellent results from fewer samples. This can be applied to ambient occlusion and shadow techniques, among others.
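
The two core pieces described above, a low-discrepancy jitter sequence for the camera and an exponential blend of the current frame into the accumulated history, can be sketched like this. The names and the Halton(2, 3) sequence are illustrative assumptions (a common choice in TAA implementations), not Bevy's API:

```rust
// Radical inverse in a given base; Halton(2, 3) pairs are a common
// low-discrepancy choice for sub-pixel camera jitter in TAA.
fn radical_inverse(mut index: u32, base: u32) -> f32 {
    let inv_base = 1.0 / base as f32;
    let (mut result, mut fraction) = (0.0, inv_base);
    while index > 0 {
        result += (index % base) as f32 * fraction;
        index /= base;
        fraction *= inv_base;
    }
    result
}

// Sub-pixel camera offset in [-0.5, 0.5) for frame `n`.
fn jitter(n: u32) -> (f32, f32) {
    (radical_inverse(n + 1, 2) - 0.5, radical_inverse(n + 1, 3) - 0.5)
}

// Blend the newly rendered colour into the history buffer; `alpha` is
// typically around 0.1, so the history dominates and noise averages out.
fn accumulate(history: f32, current: f32, alpha: f32) -> f32 {
    history * (1.0 - alpha) + current * alpha
}

fn main() {
    // The jitter stays within half a pixel of the unjittered position.
    for n in 0..16 {
        let (x, y) = jitter(n);
        assert!(x.abs() <= 0.5 && y.abs() <= 0.5);
    }
    // Repeated accumulation of a constant input converges to that input.
    let mut c = 0.0;
    for _ in 0..200 {
        c = accumulate(c, 1.0, 0.1);
    }
    assert!((c - 1.0).abs() < 1e-3);
    println!("ok");
}
```

The small `alpha` is the trade-off at the heart of TAA: a lower value averages out more noise but makes the history slower to react, which is where the ghosting artifacts discussed below come from.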

Another solution could be this: https://www.jcgt.org/published/0010/02/02/paper.pdf. It uses the derivative of the normal to detect where aliasing might happen and boosts the surface roughness there to reduce potential aliasing.
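
A minimal scalar sketch of that roughness-boosting idea, not the paper's exact filter (the function name, the variance estimate, and the 0.25 weight are illustrative assumptions): the screen-space derivatives of the normal give a variance estimate, and adding that variance to the squared roughness widens the specular lobe so the highlight covers the fragment instead of aliasing.

```rust
// Hypothetical scalar version of specular anti-aliasing via roughness
// boosting: estimate how much the surface normal varies across a
// fragment and widen the specular lobe accordingly.
fn boosted_roughness(roughness: f32, normal_ddx: f32, normal_ddy: f32) -> f32 {
    // Screen-space variance of the normal; large derivatives mean the
    // highlight would alias, so the roughness is increased to cover it.
    let variance = 0.25 * (normal_ddx * normal_ddx + normal_ddy * normal_ddy);
    (roughness * roughness + variance).sqrt().min(1.0)
}

fn main() {
    // A flat surface keeps its authored roughness.
    assert!((boosted_roughness(0.2, 0.0, 0.0) - 0.2).abs() < 1e-6);
    // A rapidly varying normal pushes the roughness up.
    assert!(boosted_roughness(0.2, 1.0, 1.0) > 0.2);
    println!("ok");
}
```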

Some downsides and complications of TAA: 'averaging' pixels over time causes smearing / blurring / ghosting artifacts; reprojecting fragments from the current frame into the history buffer requires motion vectors and runs into occlusion/disocclusion problems (a point on a surface becoming blocked or revealed between frames); and motion vectors must also be obtained for animated vertices whose positions are not simply defined by the vertex buffer and model transform. These are solvable, if complicated.
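
The reprojection and disocclusion handling above can be sketched as follows. This is an illustrative outline (the names, the depth threshold, and the `Option` return are assumptions), and real implementations additionally clamp the history colour to the current frame's neighbourhood to further limit ghosting:

```rust
// Reproject a fragment into the previous frame and decide whether its
// history sample is usable. Returns the previous-frame UV, or `None`
// when the history must be rejected (causing a fallback to the
// current frame's colour for that fragment).
fn reproject(
    uv: (f32, f32),
    motion: (f32, f32),
    depth: f32,
    prev_depth: f32,
) -> Option<(f32, f32)> {
    let prev_uv = (uv.0 - motion.0, uv.1 - motion.1);
    // Outside the previous frame: nothing to reuse.
    if prev_uv.0 < 0.0 || prev_uv.0 > 1.0 || prev_uv.1 < 0.0 || prev_uv.1 > 1.0 {
        return None;
    }
    // A large depth mismatch means this point was occluded or newly
    // revealed between frames; blending its history would ghost.
    if (depth - prev_depth).abs() > 0.01 {
        return None;
    }
    Some(prev_uv)
}

fn main() {
    // A static, visible point reuses its history.
    assert_eq!(reproject((0.5, 0.5), (0.0, 0.0), 0.3, 0.3), Some((0.5, 0.5)));
    // A disoccluded point (depth changed sharply) rejects its history.
    assert_eq!(reproject((0.5, 0.5), (0.0, 0.0), 0.3, 0.8), None);
    // A point that moved in from off-screen has no history at all.
    assert_eq!(reproject((0.05, 0.5), (0.2, 0.0), 0.3, 0.3), None);
    println!("ok");
}
```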

Solution

Metadata

Labels: A-Rendering (Drawing game state to the screen), C-Feature (A new feature, making something new possible)
