
This is pretty funny but also instructive. The key thing, which I’ve been saying for a while now, is bad input. The user struggles to do basic things because the input method of VR is terrible. Until that is solved, there will be no Metaverse.
It just doesn’t matter how good displays get or how comfortable headsets feel. People can’t DO anything useful in the Metaverse other than swing their arms or inaccurately shoot stuff. Literally everything else about the medium is bullshit until that problem is fixed.
What that means is that platform companies need to go back to the drawing board and rethink controls. The split-gamepad paradigm has been tried repeatedly, but it’s just too crude.
One avenue to explore would be to develop a deliberately abstract control method instead. The core assumption of most VR control is that it must be realistic, the next stage of an evolution from mouse to touchscreen to 3D.
But mouse and touch are both supporting methods of interaction; the primary input method on computers and phones is the keyboard. Keyboards are massively abstract but powerful, and so we learn to use them. VR is missing an equivalently powerful abstract input method.
I think a lot of why interface design has moved toward gestures and realism is about overcoming learning and accessibility gaps, which is important work. But in VR you see the limits of how far that line of thinking can go. Abstract methods require learning a skill, but still:
Skill adoption is not necessarily a barrier as long as a motivating use case exists. Gamers learn the skill of using gamepads, navigating 3D spaces from 2D screens, and internalizing button/action relationships. Like learning to drive a car, this is quite hard at first. But it’s motivated by fun.
Old Palm Pilot users had to learn a modified alphabet (Graffiti) so that its stylus input could be interpreted. But we did, because once learned, that device was goddamn magic in 1998.
VR probably needs something similar. Note I don’t mean something in-OS, like a better soft-keyboard implementation. I mean a physical control device. For example, maybe something as abstract as a braille-style keyboard on your forearm.
For example: gloves with contact points on each finger which, when pressed in combination, act like function keys, performing reliable actions regardless of application.
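That glove idea is essentially a chorded keyboard: each finger combination maps to one global action. A minimal sketch of that mapping, where every name, bitmask value, and chord-to-action pairing is a hypothetical illustration (not any real device’s API):

```python
# Hypothetical chorded-glove decoder. Each finger contact is one bit;
# a "chord" is the bitwise OR of all fingers currently touching their
# contact points. The chord table and action names are invented.
THUMB, INDEX, MIDDLE, RING, PINKY = 1, 2, 4, 8, 16

CHORD_ACTIONS = {
    THUMB | INDEX: "confirm",
    THUMB | MIDDLE: "cancel",
    INDEX | MIDDLE: "open_menu",
    THUMB | INDEX | MIDDLE: "teleport",
}

def decode_chord(pressed_fingers):
    """Map the set of pressed fingers to a global action, or None."""
    mask = 0
    for finger in pressed_fingers:
        mask |= finger
    return CHORD_ACTIONS.get(mask)

print(decode_chord({THUMB, INDEX}))  # prints: confirm
```

The point of the sketch is that the mapping lives below the application layer, so "confirm" means confirm everywhere, the way Ctrl+C means copy everywhere.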
For example: a physical tablet you hold (think iPad mini size) that is tracked and mirrored in VR, displaying contextual controls and a soft keyboard. On the other hand, a thimble on your finger that supports hover detection when interacting with the tablet.
And so on. My point is not that any one of these solutions is The Way. It’s to encourage Metaverse developers and engineers to move beyond the assumption that the secret to success lies in better fidelity of gestures (or hand tracking, etc.), because that’s all a dead end.
What we need to make the Metaverse work is a physical input method for doing fast, robust, skillful actions inside it, rather than flailing about and watching passive content. The real world will remain better for all of the above until we get over this “interface hump.”