Exclusive: Validating an Experimental Shortcut Interface with Flaming Arrows & Paper Planes

Have Leap Motion and a VR headset? Try the demo yourself.


Last time, we detailed our initial explorations of single-handed shortcut systems. After some experimentation, we converged on a palm-up pinch to open a four-way rail system. Today we’re excited to share the second half of our design exploration along with a downloadable demo on the Leap Motion Gallery.

Guest Article by Barrett Fox & Martin Schubert

Barrett is the Lead VR Interactive Engineer for Leap Motion. Through a mix of prototyping, tool and workflow building, and a user-driven feedback loop, Barrett has been pushing, prodding, lunging, and poking at the boundaries of computer interaction.

Martin is Lead Virtual Reality Designer and Evangelist for Leap Motion. He has created multiple experiences such as Weightless, Geometric, and Mirrors, and is currently exploring how to make the virtual feel more tangible.

Barrett and Martin are part of the elite Leap Motion team presenting substantive work in VR/AR UX in innovative and engaging ways.

We found the shortcuts system comfortable, reliable, and fast to use. It also felt embodied and spatial since the system didn’t require users to look at it to use it. Next it was time to put it to the test in a real-world setting. How would it hold up when we were actually trying to do something else with our hands?

We discussed a few types of potential use cases:

#1. Direct abstract commands. In this scenario, the system could be used to directly trigger abstract commands. For example, in a drawing application either hand could summon the shortcut system – left to undo, right to redo, forward to zoom in, or backward to zoom out.

#2. Direct contextual commands. What if one hand could choose an action to take upon an object being held by the other hand? For example, picking up an object with your left hand and using your right hand to summon the shortcut system – forward to duplicate the object in place, backward to delete it, or left/right to change its material.

#3. Tool adjustments. The system could also be used to adjust various parameters of a currently active tool or ability. For example, in the same drawing application your dominant hand might have the ability to pinch to draw in space. The same hand could summon the shortcut system and translate left/right to decrease/increase brush size.

#4. Mode switching. Finally, the system could be used to switch between different modes or tools. Again in a drawing application, each hand could use the shortcut system to switch between free hand direct manipulation, a brush tool, an eraser tool, etc. Moreover, by independently tool-switching with each hand, we could quickly equip interesting combinations of tools.

Of these options, we felt that mode switching would test our system the most thoroughly. By designing a set of modes or abilities that required diverse hand movements, we could validate that the shortcuts system wouldn’t get in the way while still being quickly and easily accessible.

Mode Switching and Pinch Interactions

In thinking about possible abilities we’d like to be able to switch between, we kept returning to pinch-based interactions. Pinching, as we discussed in our last blog post, is a very powerful bare-handed interaction for a few reasons:

  • It’s a gesture that most people are familiar with and can perform with minimal ambiguity, making it simple for new users to execute successfully.
  • It’s a low-effort action, requiring only movement of your thumb and index fingers. As a result, it’s suitable for high-frequency interactions.
  • Its success is well-defined for the user, who gets self-haptic feedback when finger and thumb make contact.

However, having an ability triggered by pinching does have drawbacks, as false triggers are common. For this reason, having a quick and easy system to enable, disable, and switch between pinch abilities turned out to be very valuable. This led us to design a set of pinch powers to test our shortcut system.

Pinch Powers!

We designed three pinch powers, leaving one shortcut direction free as an option to disable all pinch abilities and use free hands for regular direct manipulation. Each pinch power would encourage a different type of hand movement to test whether the shortcut system would still function as intended. We wanted to create powers that were compelling to use individually but could also be combined into interesting pairs, taking advantage of each hand’s ability to switch modes independently.

The Plane Hand

For our first power, we used pinching to drive a very common action: throwing. Looking to the physical world for inspiration, we found that paper plane throwing was a very expressive action with an almost identical base motion. By pinching and holding to spawn a new paper plane, then moving your hand and releasing, we could calculate the average velocity of your pinched fingers over a certain number of frames prior to release and feed that into the plane as a launch velocity.
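The velocity averaging can be sketched in a few lines. This is an illustrative Python toy, not the demo’s actual code (which runs in a game engine); the window size, frame rate, and all names are assumptions:

```python
from collections import deque

WINDOW = 5          # number of recent frames to average over (hypothetical)
DT = 1.0 / 90.0     # frame time, assuming a 90 Hz tracking update

# Buffer of recent pinch positions (x, y, z); oldest entries fall off automatically.
positions = deque(maxlen=WINDOW + 1)

def record(pinch_pos):
    """Call once per frame while the pinch is held."""
    positions.append(pinch_pos)

def launch_velocity():
    """Average velocity across the buffered frames, fed to the plane on release."""
    if len(positions) < 2:
        return (0.0, 0.0, 0.0)
    steps = len(positions) - 1
    first, last = positions[0], positions[-1]
    return tuple((l - f) / (steps * DT) for f, l in zip(first, last))
```

Averaging over several frames rather than using the final frame alone smooths out tracking jitter at the moment of release, which would otherwise send planes off in unintended directions.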

Using this first ability together with the shortcuts system revealed a few conflicts. A common way to hold your hand while pinching a paper plane is with your palm facing up and slightly inwards with your pinky furthest away from you. This fell into the gray area between the palm direction angles defined as ‘facing away from the user’ and ‘facing toward the user’. To avoid false positives, we adjusted the thresholds slightly until the system was not triggered accidentally.
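A palm-direction check like the one described can be sketched as a cone test on the palm normal. The threshold angle and every name here are hypothetical, standing in for whatever values the demo settled on after tuning:

```python
import math

# Hypothetical threshold: the palm counts as 'facing toward the user' only when
# the angle between the palm normal and the to-user direction is under this cone.
TOWARD_USER_MAX_ANGLE = 35.0  # degrees, tightened to keep the plane-throw pose out

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _normalize(v):
    m = math.sqrt(_dot(v, v))
    return tuple(x / m for x in v)

def palm_faces_user(palm_normal, to_user_dir):
    """True when the palm normal falls inside the threshold cone."""
    cos_angle = max(-1.0, min(1.0, _dot(_normalize(palm_normal), _normalize(to_user_dir))))
    return math.degrees(math.acos(cos_angle)) < TOWARD_USER_MAX_ANGLE
```

Narrowing the cone shrinks the gray area so the palm-up-and-slightly-inward throwing pose no longer registers as ‘facing toward the user.’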

To recreate the aerodynamics of a paper plane, we used two different forces. The first added force is upwards, relative to the plane, determined by the magnitude of the plane’s current velocity. This means a faster throw produces a stronger lifting force.

The other force is a little less realistic but helps make for more seamless throws. It takes the current velocity of a plane and adds torque to bring its forward direction, or nose, in line with that velocity. This means a plane thrown sideways will correct its forward heading to match its movement direction.
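The two forces can be sketched roughly like this. It’s a Python toy with made-up coefficients; the demo itself would apply the equivalent force and torque through its physics engine each frame:

```python
import math

LIFT_COEFF = 0.4   # hypothetical lift strength per unit of speed
ALIGN_RATE = 5.0   # hypothetical gain for the nose-to-velocity corrective torque

def _magnitude(v):
    return math.sqrt(sum(x * x for x in v))

def _scale(v, s):
    return tuple(x * s for x in v)

def lift_force(plane_up, velocity):
    """Lift along the plane's local up, proportional to speed:
    a faster throw produces a stronger lifting force."""
    return _scale(plane_up, LIFT_COEFF * _magnitude(velocity))

def alignment_torque(forward, velocity):
    """Torque (cross product of nose direction and travel direction) that
    swings the plane's nose toward its current velocity."""
    fx, fy, fz = forward
    vx, vy, vz = velocity
    cross = (fy * vz - fz * vy, fz * vx - fx * vz, fx * vy - fy * vx)
    return _scale(cross, ALIGN_RATE)
```

When the nose already points along the velocity, the cross product vanishes and no corrective torque is applied, so straight throws fly undisturbed.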

With these aerodynamic forces in play, even small variations in throwing angle and direction resulted in a wide variety of plane trajectories. Planes would curve and arc in surprising ways, encouraging users to try overhanded, underhanded, and side-angled throws.

In testing, we found that during these expressive throws, users often rotated their palms into poses which would unintentionally trigger the shortcut system. To solve this we simply disabled the ability to open the shortcut system while pinching.

Besides these fixes for palm direction conflicts, we also wanted to test a few solutions to minimize accidental pinches. We experimented with putting an object – a small ‘pinch egg’ – in a user’s pinch point whenever they had a pinch power enabled. The intention was to signal to the user that the pinch power was ‘always on.’ When combined with glowing fingertips and audio feedback driven by pinch strength, this seemed successful in reducing the likelihood of accidental pinches.

We also added a short scaling animation to planes as they spawned. If a user released their pinch before the plane was fully scaled up the plane would scale back down and disappear. This meant that short unintentional pinches wouldn’t spawn unwanted planes, further reducing the accidental pinch issue.
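The gating effect of the scaling animation boils down to a hold-time threshold. A minimal sketch, with a hypothetical spawn duration:

```python
SPAWN_TIME = 0.25  # hypothetical seconds for a plane to scale up fully

def plane_scale(held_time):
    """Plane grows linearly from 0 to full size over SPAWN_TIME."""
    return min(held_time / SPAWN_TIME, 1.0)

def should_spawn(held_time):
    """Only a pinch held past the full scale-up actually produces a plane;
    releasing earlier scales it back down and discards it."""
    return plane_scale(held_time) >= 1.0
```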

The Bow Hand

For our second ability we looked at the movement of pinching, pulling back, and releasing. This movement was used most famously on touchscreens as the central mechanic of Angry Birds and more recently adapted to three dimensions in Valve’s The Lab: Slingshot.

Virtual slingshots have a great sense of physicality. Pulling back on a sling and seeing it lengthen while hearing the elastic creak gives a visceral sense of the potential energy of the projectile, satisfyingly realized when launched. For our purposes, since we could pinch anywhere in space and pull back, we decided to use something a little more lightweight than a slingshot: a tiny retractable bow.

Pinching expands the bow and attaches the bowstring to your pinched fingers. Pulling away from the original pinch position in any direction stretches the bowstring and notches an arrow. The longer the stretch, the greater the launch velocity on release. Again we found that, while using the bow, users rotated their hands into poses where their palm direction would accidentally trigger the shortcut system. Once again, we simply disabled the ability to open the shortcut system, this time while the bow was expanded.

To minimize accidental arrows spawning from unintentional pinches, we again employed a slight delay after pinching before notching a new arrow. However, rather than being time-based like the plane spawning animation, this time we defined a minimum distance from the original pinch point. Once that distance is reached, a new arrow spawns and notches.
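The distance gate is a one-liner in spirit. A sketch, with a hypothetical pull-back threshold:

```python
import math

NOTCH_DISTANCE = 0.05  # hypothetical meters the pinch must pull back before an arrow notches

def _distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def arrow_notched(pinch_start, pinch_now):
    """An arrow spawns only once the pinch has pulled far enough from its origin,
    so brief accidental pinches never notch one."""
    return _distance(pinch_start, pinch_now) >= NOTCH_DISTANCE
```

A distance gate suits the bow better than a time gate: pulling back is already part of the gesture, so the delay costs the user nothing.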

The Time Hand

For our last ability, we initially looked at the movement of pinching and rotating as a means of controlling time. The idea was to pinch to spawn a clock and then rotate the pinch to turn a clock hand, dialing the time scale down or back up. In testing, however, we found that this kind of pinch rotation actually only had a small range of motion before becoming uncomfortable.

Since there wasn’t much value in having a very small range of time-scale adjustment, we decided to simply make it a toggle instead. For this ability, we replaced the pinch egg with a clock that sits in the user’s pinch point. At normal speed the clock ticks along quite quickly, with the longer hand completing a full rotation each second. Upon pinching, the clock time is slowed to one-third normal speed, the clock changes color, and the longer hand slows to complete a full rotation in one minute. Pinching the clock again restores time to normal speed.
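State-wise, the clock reduces to a simple toggle. A minimal sketch, with the one-third slow-down from the description (names are illustrative):

```python
NORMAL_SPEED = 1.0
SLOW_SPEED = 1.0 / 3.0  # one-third normal speed, per the demo's description

class TimeClock:
    """Pinching the clock toggles the global time scale between normal and slowed."""

    def __init__(self):
        self.slowed = False

    def pinch(self):
        """Each pinch flips the state; returns the new time scale to apply."""
        self.slowed = not self.slowed
        return self.time_scale()

    def time_scale(self):
        return SLOW_SPEED if self.slowed else NORMAL_SPEED
```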

Continued on Page 2: Mixing & Matching



