Massless is developing a stylus designed specifically for high-precision VR input. We got to check out a prototype version of the device this week at GDC 2018.

While game-oriented VR controllers are the norm for VR input today, Massless hopes to bring another option to the market for use cases which benefit from greater precision, like CAD. Controllers like Oculus’ Touch and HTC’s Vive wands are quite precise, but they are articulated primarily by our wrists, and miss out on the fine-grain control that comes from our fingers—when you write on a piece of paper, notice how much more your fingers control the movement than your wrist does. This precision is amplified by the fact that the tabletop surface acts as an anchor for your finger movements. Massless has created a tracked stylus with the goal of bringing the precision of writing implements into virtual reality, with a focus on enterprise use cases.

Image courtesy Massless

At GDC I saw a working, 3D-printed prototype of the Massless Pen in action alongside the Oculus Rift headset. The system uses a separate camera, aligned with the Rift’s sensor, to track the tip of the stylus. With the stylus held in my left hand, and a Touch controller in my right, a simple demo application placed me into an empty room where I could see the tip of the pen moving around in front of me. I could draw in the air by holding a button on the Touch controller while waving the stylus, and I could use the controller’s stick to adjust the size of the stroke.

Photo by Road to VR

Using the Massless Pen felt a lot like drawing in the air with an app like Tilt Brush, but I was also able to write tiny letters quite easily. Without a specific task comparison or an objective means of measurement between controller and stylus, though, it’s tough to assess the precision of the pen just by playing with it, other than to say that it feels at least as precise as the Touch and Vive controllers.


Since the ‘action’ of writing in real life is initiated ‘automatically’ when your writing implement touches the writing medium, it felt a little awkward to have to press a button (especially on my other hand) in order to initiate strokes. Of course, the Massless Pen itself could have a button on it (so at least it might feel a little more natural since the stroke initiation would happen in the same hand as the writing action), but the company says they’ve steered away from that because the action of pressing a button on the pen itself would cause it to move slightly, working against the precision they are attempting to maintain.

Photo by Road to VR

If you’ve ever used one of the countless trigger-activated laser-pointer interfaces in VR, you’ll know this is a fair point: pointing with a laser and then pulling the controller’s trigger to initiate an action causes the laser to move significantly (especially as the movement is amplified by leverage). Using my other hand to initiate strokes felt odd at first, but I feel fairly confident it would begin to feel natural over time, especially considering that many professional digital artists use drawing tablets, drawing on one surface (the tablet) while watching it appear on another (the monitor).

Inside the demo I could see the white outline of a frustum projected from a virtual representation of the Rift sensor in front of me. The outline was a visual representation of the trackable area of the Massless Pen’s own sensor, and it was relatively narrow compared to the Rift’s own tracking volume. If I moved the stylus outside the edge of the outline, it would stop tracking until I brought it back into view. As Massless continues to refine the product, I hope the company prioritizes growing the trackable area to be more comparable to that of the headset and controllers it’s used with.

While the Massless Pen prototype I used has full positional tracking, it currently lacks rotational tracking, meaning it can only create strokes from a single point and can’t yet support strokes that would benefit from tilt information, though the company plans to support rotation eventually.

Photo by Road to VR

More so than drawing in the air, I’m interested in VR stylus input because of what it could mean for text input handwritten on an actual surface (rather than arbitrary strokes in the air); history bred the stylus for this use case, and it could become a key tool for productivity in VR. Drawing broad strokes in the air is nice, but writing benefits greatly from using the writing surface as an anchor for your hand, allowing your dexterous fingers to do the precision work; for anything but coarse annotations, if you’re planning to write in VR, it should be done against a real surface.

To see what that might be like with the Massless Pen, I tried my hand at writing ‘on’ the surface of the table I was sitting at. After sketching a few lines (as if trying to color in a shape), I leaned down to see how consistently the lines aligned with the flat surface of the table. I was surprised at the flatness of the overall sketched area (which suggests fairly precise, well-calibrated tracking), but did note that the shape of the individual lines showed regular bits of tiny jumpiness (suggesting local jitter). Granted, this is to be expected—Massless says it hasn’t yet added ‘surface sensing’ to the pen (though it plans to), which could reasonably eliminate jitter entirely when writing on a real surface, since the pen would have a binary understanding of whether or not it is touching a surface, and could use that information to ‘lock’ the stroke to one plane.
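Massless hasn’t described how its surface sensing would be implemented; as a rough sketch of the lock-to-plane idea, the snippet below fits a least-squares plane to jittery pen-tip samples and projects each sample onto that plane. The function names and the noise model here are illustrative assumptions, not Massless’ method.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through 3D points: returns (centroid, unit normal)."""
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value of the
    # centered point cloud is the direction of least variance: the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def lock_to_plane(points, centroid, normal):
    """Remove jitter by projecting each pen-tip sample onto the fitted plane."""
    offsets = (points - centroid) @ normal       # signed distance of each point
    return points - np.outer(offsets, normal)    # subtract out-of-plane component

# Simulated stroke drawn on a z = 0 tabletop, with tracking jitter in z
rng = np.random.default_rng(0)
stroke = np.column_stack([
    np.linspace(0.0, 0.1, 50),               # x: pen moving across the table
    0.02 * np.sin(np.linspace(0, 6, 50)),    # y: wiggly handwriting motion
    rng.normal(0.0, 0.001, 50),              # z: jitter off the surface
])
c, n = fit_plane(stroke)
flat = lock_to_plane(stroke, c, n)           # stroke now lies on one plane
```

In a real product the fit step would be unnecessary: a contact sensor would tell the pen which known surface it is touching, and only the projection step would run.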

The Massless Pen is interesting for in-air input, but since the stylus was born for writing on real surfaces, I hope the company increases its focus in that area, and allows 3D drawing and data manipulation to evolve as a natural, secondary extension of handwritten VR input.


Ben is the world's most senior professional analyst solely dedicated to the XR industry, having founded Road to VR in 2011—a year before the Oculus Kickstarter sparked a resurgence that led to the modern XR landscape. He has authored more than 3,000 articles chronicling the evolution of the XR industry over more than a decade. With that unique perspective, Ben has been consistently recognized as one of the most influential voices in XR, giving keynotes and joining panel and podcast discussions at key industry events. He is a self-described "journalist and analyst, not evangelist."
  • Sponge Bob

    that’s not just one extra camera

    they use two cameras at opposite ends of a rigid bar and do triangulation for precise position calculation

    (a very standard way of optical tracking existing long before VR)

    makes 4 cameras in total (two for rift and touch and two more for stylus)

    nice :)

    • For drawing on a surface, why could the existing rift sensors not track the pen instead of a touch controller? The pen could house the leds for tracking and as you are using this on a surface then occlusion should be minimized? Of course if you want to use it like a spray gun at any position then it will get occluded by your own hand somewhat.

      • Sponge Bob

        they perfectly can …in theory

in practice, however, since rift cameras are not fixed to some rigid structure, complete re-calibration has to be done every time the cameras are moved

but even more seriously, oculus (facebook) does not allow low level access to their runtime internals and LED controls (i am not a hardcore vr developer but that would be my best guess), making third-party hardware integration very problematic if not impossible

        • Ahh good point, I hope they release their tracking SDK to the public then.

          • We’ve tried that in a few experimental stylus configurations using Nano-embedded crystalline processing units, however the occlusion vectors inverted when crosspoints increased by a factor of 3. Ngonning became the major issue that ended any hopes of a hardware solution without increasing the bit hex of our current architecture. That in itself required another team of specialists surrounding frequency spectrum pegging in order to create pattern algorithms to contain these new transmatic snapshots that we can use to anchor these vectors in its’ 3d space.

          • Sponge Bob


            R u high ?

          • brandon9271

            Nuh uh! You can’t triple stamp a double stamp!


  • Lucidfeuer

I was a step away from registering, then realised they don’t have a single live video, which makes them not trustworthy at all. A Virtual Pen is precisely the thing I need for VR today, but no demo of the actual input on screen is a very bad sign.

    • Sponge Bob

      And how much would you pay for this?
      Do you need just precise tip position or rotation (tilt) info too?
      Is there killer app for this?

      • Lucidfeuer

        Depending on how well this is conceived as a virtual pen, anywhere from 100€ to 400€ if it’s excellent.

I need a fucking image of the thing working, not just people talking PR bullshit with a prototype.

        • Sponge Bob

          Sixense had fine live demos of their tracking tech like 4 years ago and still no product

          • Lucidfeuer

Good thing I was never interested in Sixense as a product, but yeah, I’m surprised this wasn’t released given how concrete it was, although I doubt it would have been successful.

  • GigaSora

    Oculus is already researching this themselves. I’d just wait till they make their own. It’ll likely be better.

    • Sponge Bob

That’s just a piece of academic research
Single-camera based pose estimation
Sort of like DK2 tracking

  • Foreign Devil

I had my hopes up… but if you are using the Oculus Touch controller’s trigger for pressure sensitivity, it renders the point of the stylus useless. That trigger cannot give good, controllable pressure-sensitivity results. If we are going to sculpt or draw in VR, we need some way to have fine control over “pressure” so we can draw thick to thin, or sculpt with variable amounts of pressure… otherwise it’s like trying to draw or sculpt with a potato in hand.

    • Sponge Bob

      Tilt then for line thickness or pressure?

    • Sponge Bob

      and fine pressure control has to be part of stylus itself
      tilt might work to some extent but it’s unnatural

  • MSFX

Tried this a while ago at a meetup and really don’t see the point; why would people in CAD want to use this? Personally, I feel they’re making a problem to solve.


    • Sponge Bob

      And what would people use for 3d CAD in VR ?
      Touch controllers?

      • bkydcmpr

        in the virtual world, tools should be virtual. you pick a tool from the virtual toolbox, the system should “know” what tool you are gripping, no stylus required.

    • Lucidfeuer

      Could you detail it more please? How straight-forward was the usage, how precise was the tracking and use, and what are the problems or lacks you mention?

  • beestee

    This would benefit greatly from a tracked clipboard. Draw on the clipboard surface, align the clipboard to where you want to paste that detail or sketch, then click a button on the pen or clipboard to paste a copy of what was drawn on the surface to that location. This would also require an elegant inferred snaps system.

    You could also use the clipboard to copy info in a similar way, just align it near the objects that you want to copy and click the button to put the objects on the clipboard.

    If this was using inside out tracking, the clipboard could use QR + gyro for registration and the clipboard itself could track the pen.

    • brandon9271

      For what you’re talking about, a wireless Wacom tablet with a tracker would be ideal. It could use the super accurate tracking and pressure sensitivity the Wacom already has and also be visible in VR.

    • Sponge Bob

      that’s 2D drawing .. not 3D

      • beestee

        Did I say the clipboard surface had to be flat? :p

      • Moe Curley

        If you read the article it in large part focuses on the stylus being used on a flat surface.

    • Moe Curley

I’m sure they give you the ability to rotate and transpose the drawing to align it with any surface, including your real-world desk work surface.

  • Moe Curley

    Good article but I don’t think you meant to write “This PRECESSION is amplified by” in the first paragraph. I know how it is. I can’t even make a comment without editing it 20 times. It’s amazing that anyone can write articles or longer pieces with as few errors as you do.