"Beaming" technology allows for remote human/rat communications

By Ben Coxworth

November 1, 2012

A rat interacts with a small robot, which represents a remotely-located human

Earlier this week, we reported on the “beaming” telepresence system being developed by the EU Commission’s Community Research and Development Information Service (CORDIS). Once developed, the system should allow users to virtually experience being in a remote location by seeing, hearing and even feeling that location through the sensory inputs of a robot located there. That robot, in turn, would relay the user’s speech and movements to the people at that location. Now, two of the CORDIS partners have put an interesting slant on the technology – they’ve used it to let people interact with rats.

The experiment was conducted in Spain by University College London and the University of Barcelona. It involved a person in a room on the University of Barcelona’s Mundet campus, and a rat in a cage 12 kilometers (7.5 miles) away in Barcelona’s Bellvitge neighborhood. Sensors at each location tracked the movements of both the person and the rat, and transmitted them in real time to the other location.

The rat was remotely represented by a computer-generated human avatar, which the person in Mundet saw via a head-mounted virtual reality display – whatever the rat did in Bellvitge, its animated human representative appeared to do in Mundet.

A human test subject, seeing the rat represented by an on-screen human avatar

The person’s movements, meanwhile, were used to control an actual rat-sized wheeled robot in Bellvitge, which was in the cage with the rat. Whatever direction the person moved in Mundet, the robot did likewise in Bellvitge. Using a tray of food attached to the front of the robot, the person attempted to lure the rat from one side of its cage to the other.
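For readers curious how such a two-way mapping might be organized in software, here is a minimal, hypothetical sketch: each site tracks a position, and every update copies the person’s tracked motion onto the cage-sized robot while mirroring the rat’s tracked motion onto the human-sized avatar. This is not the researchers’ actual code; all names, classes and scale factors are assumptions made purely for illustration.

```python
# Hypothetical sketch of the bidirectional "beaming" mapping described above.
# NOT the actual BEAMING project software; names and structures are invented
# solely to illustrate swapping tracked motion between the two sites.

from dataclasses import dataclass


@dataclass
class Pose:
    """A 2D floor position at either site, in meters."""
    x: float
    y: float


def scale_pose(pose: Pose, factor: float) -> Pose:
    """Scale a pose, e.g. to map room-scale human movement onto a rat-sized cage."""
    return Pose(pose.x * factor, pose.y * factor)


def beaming_update(person_pose: Pose, rat_pose: Pose,
                   room_to_cage: float, cage_to_room: float) -> tuple[Pose, Pose]:
    """One tracking frame: the person's motion drives the robot in the cage,
    and the rat's motion drives the human avatar in the VR room."""
    robot_target = scale_pose(person_pose, room_to_cage)   # person -> rat-sized robot
    avatar_target = scale_pose(rat_pose, cage_to_room)     # rat -> human-sized avatar
    return robot_target, avatar_target


if __name__ == "__main__":
    # Example frame: person stands 2 m into the room, rat sits 0.3 m into its cage.
    robot, avatar = beaming_update(Pose(2.0, 1.0), Pose(0.3, 0.1),
                                   room_to_cage=0.2, cage_to_room=5.0)
    print("robot target in cage:", robot)
    print("avatar target in room:", avatar)
```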

So, what was the point? University College London’s Prof. Mandayam Srinivasan explained, “The process demonstrated here not only shows the range of our technology, but also provides a new tool for scientists, explorers or others to visit distant and alien places without themselves being placed in any kind of danger, and importantly, to be able to see animal behavior in a totally new way – as if it were the behavior of humans.”

A paper on the research was published today in the journal PLOS ONE. Part of the experiment can be seen in the video below.

Source: University College London

About the Author
Ben Coxworth is an experienced freelance writer, videographer and television producer whose interest in all forms of innovation is particularly fanatical when it comes to human-powered transportation, film-making gear, environmentally friendly technologies and anything that's designed to go underwater. He lives in Edmonton, Alberta, where he spends a lot of time going over the handlebars of his mountain bike, hanging out in off-leash parks, and wishing the Pacific Ocean wasn't so far away.

 
