

Mind-controlled robot avatars inch towards reality


#1 Asrokhel



  • 1,027 posts
  • Joined: 05-April 12
  • OS: Windows 8 Pro x64 (testing to see if I keep it or go back to Windows 7)

Posted 14 November 2012 - 02:14

Researchers at the CNRS-AIST Joint Robotics Laboratory (a collaboration between France's Centre National de la Recherche Scientifique and Japan's National Institute of Advanced Industrial Science and Technology) are developing software that allows a person to drive a robot with their thoughts alone. The technology could one day give a paralyzed patient greater autonomy through a robotic agent or avatar.

The system requires that a patient concentrate their attention on a symbol displayed on a computer screen (such as a flashing arrow). An electroencephalography (EEG) cap outfitted with electrodes reads the electrical activity in their brain, which is interpreted by a signal processor. Finally, the desired command is sent to the robot to carry out.
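Roughly, that loop can be sketched in a few lines. This is a minimal illustration of the idea only, not the CNRS-AIST lab's actual software; the symbol names, band-power inputs, and confidence threshold are all assumptions for the sketch:

```python
# Illustrative sketch of an attention-driven BCI command pipeline.
# The EEG cap's signal processor is abstracted as a dict of per-symbol
# "attention scores" (e.g. band power at each symbol's flicker rate).

COMMANDS = {
    "left_arrow": "turn_left",
    "right_arrow": "turn_right",
    "up_arrow": "walk_forward",
}

def classify_attention(scores):
    """Pick the on-screen symbol with the strongest signal."""
    return max(scores, key=scores.get)

def to_robot_command(scores, threshold=0.5):
    """Map the attended symbol to a preset robot action, or None
    if no symbol stands out clearly enough from the noise."""
    symbol = classify_attention(scores)
    if scores[symbol] < threshold:
        return None  # no clear intent; robot keeps doing what it's doing
    return COMMANDS[symbol]
```

For example, a strong reading on the left arrow (`{"left_arrow": 0.9, "right_arrow": 0.2, "up_arrow": 0.1}`) would yield `"turn_left"`, while weak readings across the board yield no command at all, which is why the robot only ever executes whole preset actions rather than continuous motion.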

The system does not provide direct fine-grain motor control: the robot is simply performing a preset action such as walking forward, turning right or left, and so on. The robot's artificial intelligence, developed over several years at the lab, allows it to perform more delicate tasks such as picking up an object from a table without needing human input. In this scenario, the robot's camera images are parsed by object recognition software, allowing the patient to choose one of the objects on a table by focusing their attention on it.
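One plausible way to wire object recognition into the same attention mechanism is to tag each detected object with its own flashing marker, so the user selects a target the same way they select an arrow. The scheme and frequencies below are an assumption for illustration, not the lab's documented design:

```python
# Hypothetical scheme: each object the robot's camera recognises gets a
# distinct flicker frequency, so the patient can pick one by attending
# to its marker; the robot's own AI then handles the grasp autonomously.

def assign_flicker_tags(objects, base_hz=8.0, step_hz=2.0):
    """Map each detected object label to a unique flicker frequency."""
    return {label: base_hz + i * step_hz for i, label in enumerate(objects)}

def grasp_selected(objects, attended_label):
    """The user's choice arrives from the EEG side as a label; the
    fine motor control of picking it up needs no further human input."""
    if attended_label not in objects:
        return None
    return f"grasp({attended_label})"
```

So two recognised objects, say a cup and a can, would flash at 8 Hz and 10 Hz respectively, and focusing on the cup's marker would trigger a single autonomous `grasp(cup)` action.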

With training, the user can direct the robot's movements and pick up beverages or other objects in their surroundings. The system can be seen in use in the DigInfo video at the bottom of the page.

This is similar to, but more sophisticated than, earlier projects: one involving Honda's ASIMO robot in 2006, and another at the University of Washington in 2007.

A different but more direct approach would be to track a patient's eye movements. Recent research conducted at the Université Pierre et Marie Curie-Paris enabled cursive writing on a computer screen through eye movement alone. The same technology could allow a patient to move a cursor and select from a multitude of action icons without going through the EEG middleman. The hitch is that, in some circumstances, eye movement isn't possible or can't be tracked reliably due to eye conditions. In that case, brain implants may be the way to go.

No matter how you slice it, researchers aren't giving up, and with further progress robot avatars may cease being the stuff of science fiction. No doubt patients would feel empowered and liberated by this technology, but it will be a while before it can be implemented, and the robots being deployed will likely look more like Toyota's recently unveiled Human Support Robot than advanced bipedal robots.



#2 68k


    Neowinian Senior

  • 2,460 posts
  • Joined: 20-January 10
  • Location: Australia

Posted 14 November 2012 - 02:27

I can see this having a military use too.