Toyota recently revealed T-HR3, the company’s third-generation humanoid robot. Designed primarily as an experiment to explore new technologies to make robots more physically capable, T-HR3 demonstrates a new “remote maneuvering system” that not only mirrors a user’s movements to the robot, but lets them see and interact with the world through the ‘eyes’ and arms of the robot using a robotic exoskeleton, an HTC Vive headset, and a pair of Vive Trackers.

Controlled via what Toyota calls a “Master Maneuvering System,” T-HR3 allows the entire body of the robot to be operated by a person thanks to wearable controls that the company says mirror the user’s head, hand, arm and foot movements.

Both the robot itself and the Master Maneuvering System contain a series of motors, reduction gears and torque sensors connected to each joint. A total of 16 controls command 29 individual robot body parts, making for what the company calls “a smooth, synchronized user experience.”
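For the curious, a master-slave setup like the one described can be sketched in a few lines. This is purely illustrative and not Toyota's actual control code; everything beyond the 16-control / 29-joint figure from the article (the joint mapping, function names, and torque limit) is an assumption.

```python
# Illustrative sketch (NOT Toyota's implementation) of a master-slave
# "mirroring" loop: master-side joint readings drive the robot's joints,
# while robot-side torque readings are reflected back to the operator's
# exoskeleton as force feedback.

NUM_MASTER_CONTROLS = 16   # torque-sensing controls worn by the operator
NUM_ROBOT_JOINTS = 29      # individually actuated robot body parts

# Hypothetical mapping: each robot joint follows one master control
# (some controls drive several joints, hence 16 -> 29).
JOINT_TO_CONTROL = [j % NUM_MASTER_CONTROLS for j in range(NUM_ROBOT_JOINTS)]

def mirror_step(master_angles, robot_torques, torque_limit=5.0):
    """One control tick: compute robot joint targets and operator feedback."""
    assert len(master_angles) == NUM_MASTER_CONTROLS
    assert len(robot_torques) == NUM_ROBOT_JOINTS

    # Robot side: each joint tracks its assigned master control's angle.
    joint_targets = [master_angles[JOINT_TO_CONTROL[j]]
                     for j in range(NUM_ROBOT_JOINTS)]

    # Operator side: sum the torques of the joints each control drives,
    # clamped so the exoskeleton never pushes back on the user too hard.
    feedback = [0.0] * NUM_MASTER_CONTROLS
    for j, tau in enumerate(robot_torques):
        feedback[JOINT_TO_CONTROL[j]] += tau
    feedback = [max(-torque_limit, min(torque_limit, f)) for f in feedback]

    return joint_targets, feedback
```

The two-way flow (angles out, torques back) is what distinguishes force-feedback teleoperation from simple remote control, and it's the part most relevant to VR haptics.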

Toyota is positioning the robot as the next logical step toward the ultimate goal of creating a friendly assistant capable of helping people in a variety of settings, including the home and medical facilities, and more dangerous places like construction sites, disaster-stricken areas and outer space. As investment in telepresence-controlled humanoid robots grows though, there are bound to be a number of happy side effects for VR users, like better force-feedback haptics and full immersion rigs that could equally be used to control VR avatars. Because projects like these are still in prototyping, we’ll just have to wait and see what happens during the inevitable rise of our robotic companions. In any case, we’ll be here reporting (until the journo-bots take our jobs, that is).


If you want to see T-HR3 in action, it will be featured at the upcoming International Robot Exhibition 2017 from November 29th through December 2nd at Tokyo Big Sight.


  • Ombra Alberto

    I had already seen this thing. Pussy !!

    • NooYawker


      • Ombra Alberto

        Cool … hahah sorry

        • NooYawker

Big difference. You must have some kind of rogue autocorrect LOL.

  • Firestorm185

    Cool stuff! Just need faster robots and we’ll be set for the takeover! xD

  • Cl

What was that movie where they wore the VR headsets and thought they were in a game killing aliens, but in reality they were controlling war robots killing people?

  • Foreign Devil

Remote controlling robots in first-person perspective… I could see this being most useful in dangerous or strenuous situations, whether it be decontaminating nuclear accident sites, fire rescue, bomb detection, dangerous or strenuous factory production work, or warfare.

    • yag

      Yes that’s a much simpler solution than semi-autonomous robots (see DARPA Grand Challenge).

  • I am quite skeptical of the balance aspects of this. It looks like the person is sitting in a standing chair; what would happen if they lifted both legs? I feel like telepresence is best done by not replicating actual foot movement, but focusing on the upper body and using Boston Dynamics-style robots for the bottom, etc.

  • brandon9271

“a smooth, synchronized user experience.” And I bet $1,000 it’ll be used for porn at some point..

  • Joe Black

    Yeah see.. Avatar robots like this could have solved Fukushima’s problems EZ.

    • yag

The tricky problem is that robots (and electronic stuff in general) have a very short life expectancy under strong radiation, even with radiation-hardened circuits.

      • Joe Black

Yeah, in a high-radiation environment shielding is crucial, but I believe in the case of Fukushima the valve that needed to be closed was in an area where robotics could work, except that autonomy was not advanced enough.

  • yag

Great stuff, but the motion-to-photon latency must be huge… for hard stomachs only!

  • brubble