Adaptive Image Space Shading for Motion and Defocus Blur

By Karthik Vaidyanathan¹, Robert Toth¹, Marco Salvi¹, Solomon Boulos², Aaron Lefohn¹
¹Intel Corporation, ²Stanford University

We present a novel anisotropic sampling algorithm for image space shading that builds upon recent advancements in decoupled sampling for stochastic rasterization pipelines. First, we analyze the frequency content of a pixel in the presence of motion and defocus blur. We use this analysis to derive bounds for the spectrum of a surface defined over a two-dimensional, motion-aligned shading space. Second, we present a simple algorithm that uses the new frequency bounds to reduce the number of shaded quads and the size of the decoupling cache by 2X and 16X respectively, while largely preserving image detail and minimizing additional aliasing.
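As a rough illustration of the idea (not the derivation or implementation from the paper), the sketch below computes an anisotropic shading sample spacing for a surface quad from its screen-space motion vector and circle-of-confusion radius: the shading grid is aligned with the motion direction, and the sample spacing along that axis grows with the blur extent. All type names, constants, and scale factors are hypothetical placeholders.

```cpp
// Minimal sketch of motion-aligned, anisotropic shading rates.
// This is NOT the paper's exact frequency analysis; it only illustrates the
// overall idea that blur lets the shading sample spacing grow along the
// blurred direction. ShadingRate, computeShadingRate and the constants
// below are illustrative assumptions.
#include <algorithm>
#include <cmath>

struct ShadingRate {
    float dirX, dirY;     // unit vector of the motion-aligned shading axis
    float spacingAlong;   // sample spacing along the motion axis (pixels)
    float spacingAcross;  // sample spacing across the motion axis (pixels)
};

// motionX/motionY: screen-space motion over the exposure interval (pixels)
// cocRadius:       circle-of-confusion radius from defocus (pixels)
ShadingRate computeShadingRate(float motionX, float motionY, float cocRadius)
{
    const float kMinSpacing = 1.0f;  // never shade denser than once per pixel
    const float kMaxSpacing = 4.0f;  // assumed clamp: one shade per 4x4 block
    const float kBlurScale  = 0.25f; // assumed mapping from blur extent to spacing

    float speed = std::sqrt(motionX * motionX + motionY * motionY);

    ShadingRate rate;
    if (speed > 1e-4f) {
        rate.dirX = motionX / speed;  // align the shading grid with the motion
        rate.dirY = motionY / speed;
    } else {
        rate.dirX = 1.0f;             // no motion: fall back to the screen axes
        rate.dirY = 0.0f;
    }

    // Motion blur low-pass filters the signal only along the motion axis,
    // while defocus blur is isotropic and relaxes both axes.
    float alongBlur  = speed + 2.0f * cocRadius;
    float acrossBlur = 2.0f * cocRadius;

    rate.spacingAlong  = std::clamp(kMinSpacing + kBlurScale * alongBlur,
                                    kMinSpacing, kMaxSpacing);
    rate.spacingAcross = std::clamp(kMinSpacing + kBlurScale * acrossBlur,
                                    kMinSpacing, kMaxSpacing);
    return rate;
}
```

In a decoupled-sampling pipeline, a per-quad spacing like this would determine how coarsely shading samples are placed in the decoupling cache, which is where the reported reductions in shaded quads and cache size come from.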

Preprint Paper: Adaptive Image Space Shading for Motion and Defocus Blur [PDF 14.6 MB], Video [MP4 133 MB]

Presented at High Performance Graphics 2012
Adaptive Image Space Shading for Motion and Defocus Blur. Karthik Vaidyanathan, Robert Toth, Marco Salvi, Solomon Boulos, Aaron Lefohn. In Proceedings of High Performance Graphics, pp. 13-21. 2012.
The definitive version is available at http://diglib.eg.org/

Additional Resources

Intel Rendering Research