Posts tagged MIT

Cynthia Breazeal: The rise of personal robots

In this great TED talk, Cynthia Breazeal expands further on one of the recurring themes of Robots and Avatars – the increase in personal and domestic-use robots and the implications this may have for young people in particular. As a grad student, Breazeal wondered why we were using robots on Mars, but not in our living rooms. The key, she realized: training robots to interact with people. Now she dreams up and builds robots that teach, learn — and play. Watch for amazing demo footage of a new interactive game for kids.

Cynthia Breazeal founded and directs the Personal Robots Group at MIT’s Media Lab. Her research focuses on developing the principles and technologies for building personal robots that are socially intelligent—that interact and communicate with people in human-centric terms, work with humans as peers, and learn from people as an apprentice.

She has developed some of the world’s most famous robotic creatures, ranging from small hexapod robots to highly expressive humanoids, including the social robot Kismet and the expressive robot Leonardo. Her recent work investigates the impact of social robots on helping people of all ages to achieve personal goals that contribute to quality of life, in domains such as physical performance, learning and education, health, and family communication and play over distance.

Virtual/Physical Play Robot

Here is an interesting article from Gizmag about the Playtime Computing System developed by MIT Media Laboratory’s Personal Robots Group. The system blends robotic and virtual interfaces and is currently designed for children between the ages of 4 and 6. This blend also allows the device to be used for telepresence, enabling play and learning to occur in real time across continents.

Children playing with the Playtime Computing system

As Alphabot passes through a hole in the display panel, it appears to continue its journey through the virtual world projected onto the panels. Image Source: Gizmag/MIT

In an increasingly tech-centric world, keeping kids interested in learning can be an uphill battle. With teaching that involves play recently attracting some powerful supportive voices, students from MIT’s Media Lab have developed a system which merges technology and play to stimulate young minds. The Playtime Computing system uses infrared emitters and tracking cameras to monitor the position of a special robot within a play area. As the bot disappears into a hole in a panel, it appears to continue its journey into a virtual world projected onto the walls.

The Playtime Computing system developed by MIT Media Laboratory’s Personal Robots Group is aimed at children between 4 and 6 years old and gets them up and about instead of sitting around and getting bored – a hot topic at the moment given Michelle Obama’s Let’s Move campaign. It also allows for early experimentation with such things as symbolic reasoning and social roles.

The system is made up of three panels with projectors behind them, and a set of four ceiling projectors for sending images to the play area floor. Alphabot, a cube-shaped robot with infrared emitters at its corners, is tracked by ceiling-mounted cameras. A virtual landscape is projected onto the panels and floor to blur the barriers between reality and the artificially-created world. To further add to the illusion, as Alphabot disappears into a hole in the panel and some robotic foliage closes behind, the image projected onto the panel appears to show it continuing its journey into the virtual world.
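The physical-to-virtual handoff described above can be sketched in a few lines of Python. This is purely illustrative – the names, coordinates and thresholds below are invented, not taken from the Playtime Computing implementation, which tracks the robot's infrared corner emitters with ceiling-mounted cameras:

```python
HOLE_POSITION = (0.0, 2.5)   # assumed panel-hole coordinates in metres (hypothetical)
HANDOFF_RADIUS = 0.15        # how close the bot must get to count as "entering" the wall


def distance(a, b):
    """Euclidean distance between two (x, y) floor positions."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5


class PlaytimeTracker:
    """Follows the robot via ceiling-camera IR fixes; when the bot reaches
    the hole in the panel, hands it off to the projected virtual world."""

    def __init__(self):
        self.mode = "physical"  # the robot starts in the real play area

    def update(self, ir_fix):
        """ir_fix: (x, y) position from the corner IR emitters, or None
        when the cameras have lost the robot (e.g. it went through the wall)."""
        if self.mode == "physical" and (
            ir_fix is None or distance(ir_fix, HOLE_POSITION) < HANDOFF_RADIUS
        ):
            # Robot entered the hole: continue its journey on the projected panels
            self.mode = "virtual"
        return self.mode
```

The key design point is that the projection only needs a mode switch plus the last known position: once the cameras lose the emitters near the hole, the on-screen animation takes over from that point.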

A set of RFID-tagged wooden alphabet letters or symbols such as musical notes was also created so that the children can stick them onto Alphabot’s face. Placing letters onto the bot results in its face changing color to match, with musical notes causing music to be played through its onboard speakers. As the robot disappears into the virtual world beyond the panel, the symbol placed by the kids will also continue through to the animated version.
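The RFID behaviour amounts to a lookup from tag to response. Again as a hedged sketch – the tag names, colours and sound files here are made up for illustration, not the group's actual tables – the mapping might look like this:

```python
# Hypothetical mapping from RFID-tagged symbols to Alphabot's responses.
LETTER_COLOURS = {"A": "red", "B": "blue", "C": "green"}          # letter -> face colour
NOTE_SOUNDS = {"quarter_note": "c4.wav", "half_note": "e4.wav"}   # note -> audio clip


def on_symbol_placed(symbol):
    """Return the action Alphabot takes when a tagged piece is attached."""
    if symbol in LETTER_COLOURS:
        return ("set_face_colour", LETTER_COLOURS[symbol])
    if symbol in NOTE_SOUNDS:
        return ("play_sound", NOTE_SOUNDS[symbol])
    # The prototype had no defined behaviour for unexpected combinations,
    # which is exactly what the young testers ran into (see below).
    return ("ignore", None)
```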

International playtime

The fun needn’t stop with just one play room, however. “One of the things we’re really excited about is having two of these spaces, one here and maybe one in Japan, and when the robot goes into the virtual world here, it comes out of the virtual world in Japan,” explained the group’s Adam Setapen. “So that kind of fits in with that one-reality concept, that there’s one robot, and whether it’s physical or virtual is based on the state of the robot in the Playtime Computing system.”

Of course, kids being kids, the young prototype testers crammed lots of different symbols onto the bot, which it wasn’t developed to handle. They also expected other objects placed in the hole to appear on the screen. Future developments of the system may well take such things in stride, with children perhaps being able to send a favorite toy into the virtual world.

Maybe it would also be interesting to see how they would deal with a digital twin!

Another aspect of the system is the Creation Station, a table-top computer where youngsters can arrange objects or draw pictures. Whatever is on the Station is recreated on the panels via the projectors.

The researchers also kitted out the playful system testers with baseball caps sporting infrared emitters. This allowed the system to keep track of the kids as well as Alphabot, which could enable such things as interaction with the computer-animated robot in future versions. If the team can develop the system to operate using something like Microsoft’s Kinect gaming technology, then players could be tracked without having to rely on infrared-equipped clothing.

The team says that the current prototype was put together using off-the-shelf parts at a cost of just a few hundred dollars, and believes that mass production for home use is a viable possibility.

Source: Gizmag

Ever heard of a Robotar?

Presenting as part of the Robots and Avatars Collaborative Futures Panel at the Kinetica Art Fair 2010, Professor Noel Sharkey coined the phrase “Robotars”, citing the example of physical military drones operating in war zones yet controlled by operators in the Nevada desert. He explained how “virtual reality is coming into play in a new way, which you could call ‘Real Virtuality’ – you’re looking at VR in a cocoon, where you can smell, touch and so on.”

MIT’s recent MeBot – a semi-autonomous robotic avatar that gives people a richer way to interact remotely with an audience than phone or video conferencing allows – brings this idea from military spheres into the personal domain. The MeBot is designed to convey its user’s gestures, head movements and proxemics, and in doing so its designers aim to expand the capabilities of mobile and wireless communication. Initial experiments showed that users felt more psychologically involved in the remote interaction, particularly because of the basic embodiment that the robot allows.

Check out this video to see the MeBot in action:

[http://www.youtube.com/watch?v=aME2aeIzbQo]