Is it a good practice to include a way to calibrate VR scale in a VR dream?
"How do I?" is a solved problem in this case - this is more of a "Should I?"
Unless a VR headset has hardware to measure the distance between the user's pupils - and as far as I am aware, the PlayStation VR headset doesn't - the separation in scene space between the VR left camera and the VR right camera must be based on an assumed average pupil separation.
If the user's pupil separation is wider than that, the user will see what a person with narrower pupil separation than their own would see - the scene will appear large.
If the user's pupil separation is narrower than that, the user will see what a person with wider pupil separation than their own would see - the scene will appear small.
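The relationship described above can be sketched with a simplified stereo model: perceived world scale is roughly the ratio of the user's actual pupil separation to the camera separation the scene was rendered with. The function and values below are illustrative, not anything from the PlayStation VR SDK or Dreams itself:

```python
def perceived_scale(user_ipd_mm: float, camera_separation_mm: float) -> float:
    """Approximate factor by which the scene appears scaled.

    Simplified model: if the left/right cameras are closer together than
    the user's actual pupils, the disparity matches a smaller viewer, so
    everything looks proportionally larger (and vice versa).
    """
    return user_ipd_mm / camera_separation_mm

# A user with 70 mm pupil separation viewing a scene rendered for an
# assumed 63 mm average sees the world larger than intended (ratio > 1);
# a 58 mm user sees it smaller (ratio < 1).
print(perceived_scale(70.0, 63.0))
print(perceived_scale(58.0, 63.0))
```

This is only a first-order approximation; real perceived scale also depends on convergence, lens distortion, and other depth cues.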
Considering this, should a VR dream include a way to calibrate the VR scale? For example, a 1m x 1m x 1m cube 3m in front of the camera, a Controller Sensor with a control wired via some logic to the VR scale tweak of the active camera, and an instruction to the user to adjust the VR scale until the cube *looks like* a 1m x 1m x 1m cube 3m in front of the camera?
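Outside of Dreams' gadget logic, the proposed calibration could be sketched in ordinary code like this. All names here are hypothetical; in Dreams the equivalent would be a Controller Sensor wired through logic to the camera's VR scale tweak, not a script:

```python
def calibrate_vr_scale(scale: float, step: float, presses) -> float:
    """Apply a stream of 'up'/'down' adjustments to the VR scale value.

    'presses' stands in for Controller Sensor input events; the scale is
    clamped to a plausible range so the user can't wander off into
    extreme values while eyeballing the reference cube.
    """
    for press in presses:
        if press == "up":
            scale += step
        elif press == "down":
            scale -= step
        scale = max(0.5, min(2.0, scale))  # clamp to a sane adjustment range
    return scale

# User nudges the scale up twice and down once in 5% steps while
# comparing the reference cube against its stated 1m x 1m x 1m size.
print(calibrate_vr_scale(1.0, 0.05, ["up", "up", "down"]))
```

The clamp range and step size are arbitrary choices for the sketch; a real calibration screen would tune both.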
I don't think this is how it's generally handled. The difference would likely be fairly negligible, or have more to do with focus than with how large the scene feels? I'm not sure.
Also... I personally don't know what 1m looks like to that degree of precision, so I probably wouldn't be able to tell whether I should adjust it or not.