Stanford and NVIDIA researchers shrink VR headsets to regular glasses

Don't get your hopes up just yet: it is not a commercial product.
Ameya Paleja
NVIDIA VR headset

Jonghyun Kim/YouTube 

  • Conventional VR headsets are bulky because they use an older optical design.
  • NVIDIA used a concept called "pancake lenses" to make the headset's optics as thin as 2.5 mm.
  • This is the first time 3D images have been displayed using pancake lenses.

Researchers at Stanford University and NVIDIA teamed up to tackle one of the biggest challenges facing virtual reality (VR): bulky headsets. In a new research paper, the team showed how headsets could be slimmed down to the thickness of a pair of regular-looking glasses, a press release from the company said.

Since Mark Zuckerberg declared his intention to focus on developing the metaverse last year, the world has been riding new waves of technologies such as VR. Central to the metaverse is the creation of a new digital world where people can interact, work, and entertain themselves, and the VR headset is an indispensable part of that interaction.

Why are VR headsets so bulky?

For years, companies working on VR headsets have followed similar designs, in which a lens magnifies an image displayed a short distance behind it. This requires the display and the lens to be spaced apart, which ends up making the headsets bulky.
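The thin-lens equation makes the trade-off above concrete: a simple magnifier only produces an enlarged virtual image when the display sits inside the lens's focal length, which forces a centimeters-deep gap between the two. The sketch below is purely illustrative; the focal length and distance are assumed example values, not specs of any real headset discussed in the article.

```python
def virtual_image_distance(focal_length_mm: float, display_distance_mm: float) -> float:
    """Solve the thin-lens equation 1/f = 1/d_o + 1/d_i for the image distance d_i.

    For a magnifier, the display sits inside the focal length (d_o < f),
    so d_i comes out negative: a magnified virtual image on the display's side.
    """
    return 1.0 / (1.0 / focal_length_mm - 1.0 / display_distance_mm)


# Assumed example: a 40 mm focal-length lens with the display 35 mm away.
d_i = virtual_image_distance(40.0, 35.0)
magnification = abs(d_i) / 35.0

print(f"virtual image at {d_i:.0f} mm (negative = virtual), magnified {magnification:.0f}x")
```

That roughly 35 mm display-to-lens gap, plus housing, is what gives conventional headsets their depth; the design described in this article collapses the whole optical stack to 2.5 mm.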

Experiencing the metaverse in these headsets might be great, but from an ease-of-use perspective, they fail miserably. If the metaverse is indeed to become the future of the internet, as it is touted to be, the mode of experience needs to be much simpler, like a pair of glasses, and that is what NVIDIA plans to deliver.

How did NVIDIA manage to do this?

NVIDIA's solution is a concept called pancake lenses. While the company cannot take credit for originating the concept, it has managed to get it to work with three-dimensional (3D) images.

Previous attempts by other research groups only managed to get pancake lenses to work with two-dimensional (2D) images. A few years ago, those were significant achievements, but given the immersive experiences that VR tech has repeatedly promised, anything short of 3D no longer suffices.

NVIDIA and its collaborating team of researchers from Stanford managed to do this while also reducing the distance between the lens and the display. To achieve the latter, they used a "phase-only spatial light modulator (SLM)" which is lit by a coherent light source and "creates a small image behind the device", the researchers say in their recently published paper.

Another feat of this design is reducing the device's weight to a mere two ounces (60 g), against the roughly one pound (503 g) of Meta's Quest headset. That said, the headset is not without its limitations. The field of view (FOV) on the prototype is much smaller than that of the commercially available headsets of today.

Also, the slimmer headset requires an accurate measurement of the user's pupil, which is impractical for a product meant to be made in the millions and shipped worldwide. An infrared gaze tracker could do this job, but it would also increase the device's size. Nor do the ribbon-like structures on these new designs make for an attractive-looking product.

So, there is a fair bit of work to be done before the product becomes commercially available. If you would like to learn more about how it was built, the paper's abstract is reproduced below.

Abstract

We present Holographic Glasses, a holographic near-eye display system with an eyeglasses-like form factor for virtual reality. Holographic Glasses are composed of a pupil-replicating waveguide, a spatial light modulator, and a geometric phase lens to create holographic images in a lightweight and thin form factor. The proposed design can deliver full-color 3D holographic images using an optical stack of 2.5 mm thickness. A novel pupil-high-order gradient descent algorithm is presented for the correct phase calculation with the user's varying pupil size. We implement benchtop and wearable prototypes for testing. Our binocular wearable prototype supports 3D focus cues and provides a diagonal field of view of 22.8° with a 2.3 mm static eye box and additional capabilities of dynamic eye box with beam steering, while weighing only 60 g excluding the driving board.
