Graphics Pipelines for Next Generation Mixed Reality Systems
The Pipelines project is an EPSRC-funded investigation into new graphics techniques and hardware. Our goal is to re-purpose and re-architect the current graphics pipeline to better support the next generation of AR and VR systems. These new systems will require far greater display resolutions and frame rates than traditional TVs and monitors, resulting in greatly increased computational cost and bandwidth requirements. By developing new end-to-end graphics systems, we plan to make rendering for these displays practical and power-efficient.
A major focus of the project thus far has been on improving efficiency by rendering or transmitting only the content in each frame that a user can perceive. More detail on our work in this direction is given on the page for our paper Beyond Blur.
To peripheral vision, a pair of physically different images can look the same. Such pairs are metamers relative to each other, just as physically different spectra of light are perceived as the same color. We propose a real-time method to compute such ventral metamers for foveated rendering where, in particular for near-eye displays, the largest part of the framebuffer maps to the periphery. This improves in quality over state-of-the-art foveation methods that blur the periphery. Work in vision science has established that peripheral stimuli are ventral metamers if their statistics are similar. Existing methods, however, require a costly optimization process to find such metamers. To this end, we propose a novel type of statistics particularly well-suited for practical real-time rendering: smooth moments of steerable filter responses. These can be extracted from images in time constant in the number of pixels, in parallel over all pixels, using a GPU. Further, we show that they can be compressed effectively and transmitted at low bandwidth. Finally, computing realizations of those statistics can again be performed in constant time and in parallel. This enables a new level of quality for foveated applications such as remote rendering, level-of-detail and Monte Carlo denoising. In a user study, we show that human task performance increases and foveation artifacts are less suspicious when using our method compared to common blurring.
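The core idea — pooled moment statistics of oriented filter responses, computed with per-pixel cost independent of pooling size, then realized as a matching image — can be illustrated with a minimal sketch. This is not the paper's GPU implementation: the derivative-of-Gaussian bank stands in for a proper steerable pyramid, the Gaussian pooling and the noise-based realization step are simplified illustrations, and all function names are our own.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def oriented_filter_responses(img, n_orient=4, sigma=2.0):
    """Responses of a small oriented derivative-of-Gaussian bank
    (an illustrative stand-in for steerable filters)."""
    gy, gx = np.gradient(gaussian_filter(img, sigma))
    thetas = np.pi * np.arange(n_orient) / n_orient
    # Steering property: a directional derivative is a linear
    # combination of the x- and y-derivatives.
    return [np.cos(t) * gx + np.sin(t) * gy for t in thetas]

def smooth_moments(resp, pool_sigma=8.0):
    """Local (Gaussian-pooled) first and second moments of one response.
    Each blur is a separable filter, so the cost per pixel does not
    grow with image size."""
    m1 = gaussian_filter(resp, pool_sigma)         # local mean
    m2 = gaussian_filter(resp * resp, pool_sigma)  # local power
    var = np.maximum(m2 - m1 * m1, 0.0)            # local variance
    return m1, var

def realize(m1, var, rng):
    """Draw a noise field whose local mean and variance match the
    target statistics (a crude, per-band realization step)."""
    return m1 + np.sqrt(var) * rng.standard_normal(m1.shape)

rng = np.random.default_rng(0)
img = rng.random((128, 128))
responses = oriented_filter_responses(img)
stats = [smooth_moments(r) for r in responses]
metamer_band = realize(*stats[0], rng)
```

In a real renderer, each of these stages maps naturally to a fragment or compute shader pass, which is what makes the per-pixel, constant-time claim practical on a GPU.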
This project is funded by the EPSRC/UKRI project EP/T01346X.