For better or worse, temporal anti-aliasing – or TAA – has become a defining element of image quality in today’s games, but is it a blessing, a curse, or both? Whichever way you slice it, it’s here to stay. So what is it, why do so many games use it, and what’s with all the blur? There was a time when TAA didn’t exist at all – so what anti-aliasing methods were used back then, and why aren’t they used any more?
For over a decade, from the late 90s until circa 2010, the best anti-aliasing you could get was SSAA – super-sample anti-aliasing. The principle is remarkably straightforward: to remove the jaggies, you deploy GPU resources to render the image at a much higher resolution, then downscale. 8x SSAA on a 1080p screen effectively rendered internally at 8K (!), while 4x SSAA downscaled from 4K instead. It’s the brute-force method, delivering a stable image with little sub-pixel break-up and pristine edges. When the PS4 Pro and Xbox One X arrived, Sony and Microsoft didn’t just sell gamers on the 4K dream, but also on the image quality benefits of super-sampling: rendering at a higher resolution and then downscaling for 1080p displays.
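The downscale step at the heart of SSAA is, in its simplest form, just a box filter: each output pixel is the average of an N×N block of higher-resolution samples. Here is a minimal NumPy sketch of that resolve step – an illustration of the idea, not any particular engine's implementation, and real GPUs do this in hardware with gamma-correct blending:

```python
import numpy as np

def ssaa_downscale(hi_res: np.ndarray, factor: int) -> np.ndarray:
    """Box-filter resolve: average each factor x factor block of samples
    into one output pixel. hi_res is (H, W, C) with H, W divisible by factor."""
    h, w, c = hi_res.shape
    blocks = hi_res.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# Illustrative only: a 4K-sized buffer resolved down to 1080p
# (2x per axis, i.e. four samples averaged per output pixel).
hi = np.random.rand(2160, 3840, 3).astype(np.float32)
lo = ssaa_downscale(hi, 2)  # shape (1080, 1920, 3)
```

Averaging the sub-samples is exactly why SSAA softens jagged edges: an edge pixel that was half-covered in the high-resolution buffer ends up as a blended intermediate colour in the output.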
However, the brute-force approach demands a tremendous amount of GPU resources. Take Crysis 3 on an RTX 3070: native 1080p runs at nearly 200 frames per second without anti-aliasing and without being CPU-limited. Enabling 4x SSAA takes that down to 70fps, dropping to just 19fps at 8x SSAA.