The current generation of VR is great, but it's far from perfect. One way VR falls short of the real world is the fixed focus distance.
Despite the illusion of depth from the stereo images you're looking at, each image is essentially flat: it sits at a fixed optical distance from your face, with focus determined by the headset and the game engine rather than by your own eyes.
This mismatch can cause eye strain and other discomfort, and it constrains developers, who have to keep content within a visual sweet spot at a roughly constant focus distance or risk making their users uncomfortable.
Oculus Research is showing off a solution to this problem
at SIGGRAPH this year. They call it a "focal surface display," and have
already built a working prototype.
The technology puts something called
a spatial light modulator (SLM) in between the screen and the headset's
eyepiece lenses, and the SLM actively bends light to give a scene a 3D
contour.
The edges of this "focal surface" aren't very sharp, so the system
also applies some color-processing tricks to compensate for the resulting distortion.
This technique, combined with eye tracking, allows you to
look around a scene in VR and focus naturally on different areas at
different perceived distances.
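To make the optics a little more concrete, here's a rough sketch (my own illustration, not anything from the Oculus work) of the kind of pattern a phase-only SLM can display to behave like a thin lens of a chosen focal length. The idea is that a lens imparts the phase profile φ(x, y) = −k(x² + y²)/(2f), wrapped to [0, 2π); in a focal surface display, different regions of the image would presumably get different focal powers driven by scene depth. The function name and every parameter value below are made up for illustration.

```python
import math

def lens_phase(width, height, focal_m, wavelength_m=532e-9, pixel_pitch_m=8e-6):
    """Phase profile of a thin lens, phi(x, y) = -k (x^2 + y^2) / (2 f),
    wrapped to [0, 2*pi) the way a phase-only SLM would display it.

    All parameter values are illustrative guesses, not real SLM specs:
    532 nm is a green laser wavelength, 8 um a typical SLM pixel pitch.
    """
    k = 2 * math.pi / wavelength_m  # wavenumber of the illumination
    phase = []
    for row in range(height):
        y = (row - height / 2) * pixel_pitch_m  # physical y offset from center
        phase_row = []
        for col in range(width):
            x = (col - width / 2) * pixel_pitch_m  # physical x offset from center
            phi = -k * (x * x + y * y) / (2 * focal_m)
            phase_row.append(phi % (2 * math.pi))  # wrap into one 2*pi cycle
        phase.append(phase_row)
    return phase
```

Calling `lens_phase(64, 64, focal_m=0.5)` yields a 64×64 grid of phase values in [0, 2π): zero at the center pixel, growing quadratically toward the edges and wrapping around, which is exactly the concentric-ring pattern of a Fresnel lens.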
Oculus Research says it's "a long way
out" from a consumer version of this technology, but it's exciting to
see how far along they already are in solving this problem.
Also, I’m
very excited to learn that there’s such a thing as a “spatial light modulator” and I have so many questions.
What do you think?