"How do I?" is a solved problem in this case - this is more of a "Should I?"
Unless a VR headset has hardware to measure the distance between the user's pupils (their interpupillary distance, or IPD) - and as far as I am aware, the PlayStation VR headset doesn't - the separation in scene space between the VR left camera and the VR right camera must be based on an assumed average IPD.
If the user's IPD is wider than that assumed value, they will see imagery rendered for someone with a narrower IPD than their own - the scene will appear large.
If the user's IPD is narrower than that assumed value, they will see imagery rendered for someone with a wider IPD than their own - the scene will appear small.
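For what it's worth, the effect above can be sketched as a simple ratio. This is my own back-of-envelope model, not anything from the Dreams toolset: it assumes stereo disparity for a point at a given distance scales with camera separation, so matching that disparity to a viewer with a different real IPD rescales the whole scene by real IPD divided by assumed IPD.

```python
def apparent_scale(real_ipd_mm: float, assumed_ipd_mm: float) -> float:
    """Hypothetical model: factor by which the scene appears scaled
    for a viewer whose real IPD differs from the assumed camera separation."""
    return real_ipd_mm / assumed_ipd_mm

# Wider-than-assumed IPD: factor > 1, scene looks large.
print(apparent_scale(70.0, 63.0))
# Narrower-than-assumed IPD: factor < 1, scene looks small.
print(apparent_scale(58.0, 63.0))
```

Under this model, a VR scale tweak is effectively adjusting the assumed IPD until the ratio lands at 1 for the actual viewer.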
Considering this, should a VR dream include a way to calibrate the VR scale? For example, a 1m x 1m x 1m cube 3m in front of the camera, a Controller Sensor with a control wired via some logic to the VR scale tweak of the active camera, and an instruction to the user to adjust the VR scale until the cube *looks like* a 1m x 1m x 1m cube 3m in front of the camera?