Meta research suggests that VR’s most transformative gains in telepresence and visual realism may come from improvements in screen brightness and dynamic range.
On Meta CTO Andrew Bosworth’s podcast, the company’s head of display systems research spoke of the huge gap between the 100 nits delivered by Meta’s industry-leading Quest 2 headset and the more than 20,000 nits delivered by the Starburst research prototype. The latter can even match bright indoor lighting, far outperforming today’s best high dynamic range (HDR) televisions, which peak at around 1,000 nits.
Douglas Lanman, Meta’s top display researcher, referred to this gap as what “we want most, but are least able to deliver right now.” At 5 to 6 pounds, with heat sinks, a powerful light source and bulky optics, the Starburst prototype is so heavy that it has to be hung from above and held against the face with handles. While we know that Sony’s PlayStation VR 2 display will bring HDR to consumer VR for the first time, its exact brightness and dynamic range are unknown.
“You said you feel your eye react to it in a certain way,” Meta researcher Nathan Matsuda told Tested’s Norman Chan when he tried Starburst. “We know there are a variety of perceptual cues that you get from that expanded luminance, and some of that is due to work that was done for the television and cinema display industry. But of course, if you have a more immersive display device like this, where you have a wide field of view, binocular parallax and so on, we don’t know if the perceptual responses actually carry straight over from the previous work that was done with TVs. So one of the reasons we built this to begin with is so we can start to unravel where those differences are, where the thresholds might be where you feel like you’re looking at a real light rather than an image of a light, which will eventually lead us to being able to build devices that content creators can then use to produce content that takes advantage of this full range.”
For those who missed it, Meta this week offered an unprecedented look at its prototype VR headset research, paired with an announced goal of one day passing the “visual Turing test.” Passing the test would mean creating a VR headset with images indistinguishable from reality. On Bosworth’s podcast, Boz to the Future, Lanman described the challenges of bringing VR displays to this goal in terms of four areas: resolution, varifocal, distortion correction and HDR, the last of which he described as arguably the most challenging to fully achieve.
In this [Starburst] prototype that we’ve built, you’re watching a sunset… And when we want to talk about presence, you feel like you’re there. You are on Maui, watching the sun go down, and the hairs on the back of your neck rise.
So this is the one we want most, but are least able to deliver right now. Where we are is just conducting studies to determine: what would work? How can we change the rendering engine? How could we change the optics and displays to give us this? But high dynamic range, that’s the fourth, arguably the king of them all.
The Starburst prototype, pictured below, demonstrated an implementation of extremely bright imagery in high dynamic range (HDR) VR, which Meta CEO Mark Zuckerberg described as “perhaps the most important dimension of all.”
While Starburst’s brightness greatly enhances the sense of presence and realism, the current prototype would be “extremely impractical” to ship as a product, as Zuckerberg put it. If you haven’t dived in yet, we highly recommend taking the time to watch Tested’s full video above and listen to the podcast with Lanman and Bosworth embedded below. As Meta’s CTO said, “the prototypes give you the ability to reason about the future, which is super helpful because it allows us to focus.”
We also reached out to Norman Chan at Tested, as his exclusive look at the hardware prototypes and the comment he made to Zuckerberg that Starburst was “the demo I didn’t want to launch” suggest HDR will likely become a critical area for improvement in future HMDs. Where the gap between the angular resolution of Quest 2 and the “retinal” resolution of the Butterscotch prototype is 3x, the gap between the brightness of Starburst and a Quest 2 is almost 200x, meaning there is a far wider gap in brightness and dynamic range to close before VR can match “pretty much any indoor environment,” as Lanman said of Starburst.
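The brightness gap cited above can be sanity-checked with simple arithmetic. A minimal sketch, using only the nit figures quoted in this article (100 nits for Quest 2, 20,000 nits for Starburst, roughly 1,000 nits for today's best HDR TVs):

```python
# Peak brightness figures as quoted in the article (nits = cd/m^2).
quest2_nits = 100        # Meta Quest 2
starburst_nits = 20_000  # Starburst research prototype
hdr_tv_nits = 1_000      # approximate peak of today's best HDR TVs

# Ratio between Starburst and Quest 2 -- the "almost 200x" gap.
brightness_gap = starburst_nits / quest2_nits
print(f"Starburst vs Quest 2: {brightness_gap:.0f}x")

# For comparison, Starburst also exceeds the brightest HDR TVs by ~20x.
tv_gap = starburst_nits / hdr_tv_nits
print(f"Starburst vs HDR TV: {tv_gap:.0f}x")
```

This is why the article calls the brightness gap (200x) far wider than the 3x angular-resolution gap between Quest 2 and the Butterscotch prototype.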
“The qualitative benefits of HDR were noticeable in the Starburst prototype demo I tried, even though the headset screen was far from retinal resolution,” Chan wrote to us. “It’s going to be a big technical challenge to get to about 20,000 nits in a consumer headset, but I could see incremental improvements in luminance through efficiency in display transmission. What excites me is that producing HDR images isn’t computationally taxing – there is so much existing media with built-in HDR metadata that will benefit on HDR VR headsets. I can’t wait to replay some of my favorite VR games remastered for HDR!”
UploadVR News Writer Harry Baker contributed to this report.