I see your point -- but at least in the current iteration, the HMD doesn't let the user view the actual outside world directly with their eyes (there is no optical pass-through). What the user sees and focuses on is whatever the HMD presents on the internal 4K displays, one for each eye.
Similarly, from the perspective of others around, they don't see the user's real eyes -- just a simulation of them. So any "eye contact" -- even with the best execution of the technology, with high fidelity and low latency -- will still be something different from real eye contact. I'm no purist, but that's difficult to accept at this point for in-person interactions. We have already gotten used to it for FaceTime / video calls.
Something like Google Glass might be easier to assimilate, maybe?