NVIDIA researchers demonstrate thin holographic VR glasses

A team of scientists from NVIDIA Research and Stanford has published a new paper presenting a pair of thin holographic VR glasses. The displays can show true holographic content, addressing the vergence-accommodation conflict. Although the research prototypes demonstrating the principles have a much smaller field of view, the researchers say a 120° field of view would be readily achievable with the right components.

In the paper, released ahead of this year’s SIGGRAPH 2022 conference, the team of scientists from NVIDIA Research and Stanford demonstrates a near-eye VR display that can show flat images or holograms in a compact form factor. The paper also examines the interrelated variables in the system that govern key display characteristics such as field of view, eye-box size, and eye relief. In addition, the researchers investigate various algorithms for rendering the image with the best visual quality.

Commercially available VR headsets have not slimmed down significantly over the years, mainly due to optical limitations. Most VR headsets use a single display and a simple lens. For the lens to focus light from the display into the eye, it must sit some distance from the display; any closer and the image goes out of focus.
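As a rough illustration of the constraint (a textbook thin-lens sketch, not an analysis from the paper): a simple magnifier only forms the enlarged virtual image VR needs when the display sits at or just inside the lens’s focal length, so the optical stack can hardly be thinner than that focal length, typically a few centimeters in today’s headsets.

$$\frac{1}{d_o} + \frac{1}{d_i} = \frac{1}{f}, \qquad d_o \lesssim f \;\Rightarrow\; \text{magnified virtual image},$$

so the display must sit roughly a focal length $f$ behind the lens, and headset depth is at least on that order before adding the display panel and housing.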

Closing this gap between the lens and the display would unlock form factors that are currently impossible for VR headsets; understandably, a great deal of research and development has gone into exploring how this could be done.

In the newly published NVIDIA-Stanford paper, Holographic Glasses for Virtual Reality, the team shows that it built a holographic display using a spatial light modulator combined with a waveguide instead of a traditional lens.

The team built both a large benchtop model, to demonstrate the underlying methods and to experiment with different image-rendering algorithms for optimal display quality, and a compact wearable model to demonstrate the form factor. The photos of the compact glasses do not include the electronics that drive the display (miniaturizing that part of the system is beyond the scope of the study).

You may remember that some time ago Meta Reality Labs published its own paper on compact, glasses-sized VR headsets. While that work involves holograms (used to form the system’s lenses), it is not a “holographic display,” meaning it does not solve the vergence-accommodation conflict common to most VR displays.

The NVIDIA-Stanford researchers, on the other hand, write that their holographic glasses system is a true holographic display (thanks to its spatial light modulator), which they tout as a key advantage of their approach. The team notes that the system can also show typical flat images, which, as in today’s VR headsets, can be rendered as a stereoscopic pair.

Photo courtesy of NVIDIA Research

What’s more, the Holographic Glasses design is just 2.5mm thick across the entire display stack, far thinner than the 9mm-thick Reality Labs design (which was already impressively thin!).

As with any good paper, the NVIDIA-Stanford team is quick to point out the limitations of its work.

First, the wearable system has a tiny 22.8° field of view and an equally tiny 2.3mm eye-box. Both are far too small for a practical VR headset.

Photo courtesy of NVIDIA Research

However, the researchers write that the limited field of view is largely due to an experimental combination of new components that are not optimized to work together. They explain that drastically expanding the field of view is largely a matter of selecting complementary components.

“[…] this [system’s field of view] was mainly limited by the size of the available [spatial light modulator] and the focal length of the GP lens, both of which could be improved with different components. For example, the focal length could be halved without significantly increasing the total thickness by stacking two identical GP lenses with a circular polarizer [Moon et al. 2020]. With a 2-inch SLM and a 15mm focal length GP lens, a monocular FOV of up to 120° could be achieved.”
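As a back-of-the-envelope check (our own arithmetic, not a formula quoted from the paper), the field of view of a panel of width $w_{\mathrm{SLM}}$ viewed through a lens of focal length $f$ is roughly $2\arctan(w_{\mathrm{SLM}}/2f)$:

$$\mathrm{FOV} \approx 2\arctan\!\left(\frac{w_{\mathrm{SLM}}}{2f}\right) = 2\arctan\!\left(\frac{50.8\ \mathrm{mm}}{2 \times 15\ \mathrm{mm}}\right) \approx 119^{\circ},$$

which is consistent with the 120° figure the researchers cite for a 2-inch SLM and a 15mm focal length GP lens.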

As for the 2.3mm eye-box (the volume within which the rendered image is visible), it is definitely too small for practical use. However, the researchers write that they experimented with a straightforward way to enlarge it.

They show that with the addition of eye-tracking, the eye-box can be dynamically extended to 8mm by steering the angle of the light sent into the waveguide. Granted, 8mm is still a very tight eye-box and may be too small for practical use, given how interpupillary distance and the way glasses rest on the head vary from one wearer to the next.
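A rough sketch of why steering helps (our own illustration; the exact geometry depends on waveguide details not covered here): tilting the illumination that converges toward the eye by a small angle $\Delta\theta$ shifts the focused eye-box laterally by roughly

$$\Delta x \approx f \tan(\Delta\theta),$$

so with a focal length on the order of 15–25mm, each degree of steering moves the eye-box by about 0.26–0.44mm, and an eye tracker only needs a modest steering range to follow the pupil across several millimeters.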

But there are variables in the system that can be adjusted to trade off key display characteristics such as eye-box size and field of view. In their work, the researchers map out the relationships between these variables, giving clear insight into the trade-offs needed to achieve different outcomes.

Photo courtesy of NVIDIA Research

As they show, eye-box size is directly tied to the pixel pitch (the distance between pixels) of the spatial light modulator, while field of view is tied to the overall size of the spatial light modulator. They also show the constraints linking eye relief and diffraction angle, assuming an eye relief of less than 20mm (which the researchers consider the upper limit for a true “glasses” form factor).
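For readers curious about the optics behind the first relationship (a standard diffraction argument, not a quote from the paper): an SLM with pixel pitch $p$ can only diffract light of wavelength $\lambda$ over a limited angle, which caps the width of the eye-box a lens of focal length $f$ can form:

$$\theta_{\max} \approx \sin^{-1}\!\left(\frac{\lambda}{2p}\right), \qquad w_{\mathrm{eyebox}} \approx 2f\tan(\theta_{\max}) \approx \frac{\lambda f}{p} \ \ \text{(small angles)}.$$

With illustrative numbers (not the prototype’s actual specifications), green light at $\lambda \approx 532$nm, a 4µm pixel pitch, and a 20mm focal length give an eye-box of only about 2.7mm, which is why finer pixel pitches, or eye-tracked steering, are needed to enlarge it.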

An analysis of this “design trade space,” as they call it, is a key part of the paper.

“With our design and experimental prototypes, we hope to stimulate new directions in research and engineering towards ultra-thin, all-day VR displays with form factors comparable to conventional glasses,” they write.

The paper is credited to researchers Jonghyun Kim, Manu Gopakumar, Suyeon Choi, Yifan Peng, Ward Lopes, and Gordon Wetzstein.
