This virtual reality app requires the use of an Oculus Rift DK2 and a Rift-mounted Leap Motion Controller (using the VR Developer Mount or unofficial alternative).
(GitHub) Source Available.
(YouTube) VR Keyboard.
This is a demonstration of a touch-type virtual keyboard in VR, using Leap Motion hand tracking. Leap Motion's tracking software was updated recently, and I was curious to see whether it was accurate enough to enable this kind of text input, since text input in VR is currently difficult. It isn't the most efficient method: in testing I managed between 9 and 10 WPM, nothing like my 90 on a meatspace keyboard, but if your hands are all you've got, it's better than nothing.
The design of this keyboard is very deliberate. Most work went into the buttons themselves. I set out to solve the following problems:
- Ensuring that only the targeted button is activated
- Feedback on button selection
- Feedback on button throw distance
- Feedback on button activation
The first problem, ensuring that the targeted button is the one that triggers, is accomplished primarily by the wide spacing between buttons. The trigger volume is also a capsule, but a very thin one: the fingertip itself is fairly wide, so a skinny collision volume is still easy for the fingertip to occlude, yet makes double-selection difficult. This, on top of the wide spacing, pretty much guarantees it will never happen. Buttons also only respond to finger directions that are perpendicular to their face, which prevents a user from pushing their palm into the keyboard and slamming several keys at once.
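The project's actual Unity code isn't reproduced here, but the perpendicularity check can be sketched as a dot-product test between the finger's direction and the button's inward normal. This is a minimal illustration in Python; the function names and the 30° tolerance are my own assumptions, not the app's real values:

```python
import math

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _normalize(v):
    length = math.sqrt(_dot(v, v))
    return tuple(x / length for x in v)

def finger_can_press(finger_dir, button_normal, max_angle_deg=30.0):
    """A finger may press only if it points roughly along the button's
    inward normal, i.e. perpendicular to the button face."""
    cos_angle = _dot(_normalize(finger_dir), _normalize(button_normal))
    return cos_angle >= math.cos(math.radians(max_angle_deg))
```

A palm sliding sideways across the keyboard produces finger directions nearly parallel to the button faces, so a check like this rejects those contacts outright.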
For button selection, the user is given three channels of feedback: glowing rays shoot out of the button's surface, and the outline of the button does a ping-pong style color tween from highlight color back to normal. The third channel was actually a happy accident of the way the throw mechanism works: when you hover over the button, the face 'pops' to its least-depressed position instantly.
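A ping-pong tween just mirrors an interpolation parameter back and forth so the color oscillates between the two endpoints. A rough Python sketch of the idea (Unity users would reach for `Mathf.PingPong`; the `speed` value and function names here are illustrative assumptions):

```python
def ping_pong(t, length=1.0):
    """Mirror t back and forth inside [0, length]."""
    t = t % (2 * length)
    return length - abs(t - length)

def lerp_color(a, b, t):
    """Linearly interpolate between two RGB tuples."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def outline_color(time, normal, highlight, speed=2.0):
    """Oscillate the outline between its normal and highlight colors."""
    return lerp_color(normal, highlight, ping_pong(time * speed))
```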
Buttons have a property known as 'throw distance': how far a button must be pressed inwards before it activates. Here the user gets plenty of feedback as well. When the fingertip is within range, the button's face moves to the point along the button axis where the fingertip lies, and once it gets pushed past the throw distance, the button activates. This distance is illustrated with a 'shadow' mesh that has the same outline as the button itself, but grey and opaque. Audio feedback is also given: when the press begins, a whining sound fades in and rises in pitch, letting the user know intuitively that there is a clear escalation to pushing the button further inwards.
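The throw mechanics amount to projecting the fingertip onto the button's press axis, clamping the face to that depth, and firing once the depth passes the throw distance. Here is a hedged Python sketch; the names, units, and the linear pitch mapping are my own assumptions rather than the project's actual code:

```python
import math

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _normalize(v):
    length = math.sqrt(_dot(v, v))
    return tuple(x / length for x in v)

def update_button(fingertip, face_origin, press_axis, throw_distance):
    """Track the fingertip along the press axis, activate past the
    throw distance, and escalate the whine's pitch as the press deepens."""
    axis = _normalize(press_axis)
    offset = tuple(f - o for f, o in zip(fingertip, face_origin))
    depth = max(0.0, _dot(offset, axis))        # how far the finger pushes in
    face_offset = min(depth, throw_distance)    # face follows the fingertip
    activated = depth >= throw_distance         # past the threshold -> key fires
    pitch = 1.0 + face_offset / throw_distance  # whine pitch rises with depth
    return face_offset, activated, pitch
```

Tying the pitch directly to the same depth value that drives the face position is what makes the escalation feel continuous rather than like a separate effect.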
For the last one, button activation: when the press passes the threshold, two things happen. An activation sound plays, and the outline mesh of the button is cloned and quickly scaled outwards.
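The cloned-outline burst is a simple scale-up-and-fade animation. A minimal sketch of that effect, with the duration and end scale as illustrative guesses:

```python
def outline_burst(elapsed, duration=0.15, end_scale=1.6):
    """Animate a cloned outline: scale up and fade out over `duration`
    seconds; once done, the clone would be destroyed."""
    t = min(elapsed / duration, 1.0)
    scale = 1.0 + (end_scale - 1.0) * t  # grow outwards
    alpha = 1.0 - t                      # fade as it grows
    done = t >= 1.0
    return scale, alpha, done
```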