Posts tagged robotar

 

Virtual/Physical Play Robot

Here is an interesting article from Gizmag about the Playtime Computing system developed by MIT Media Laboratory’s Personal Robots Group. The system blends robotic and virtual interfaces and is currently designed for children between the ages of 4 and 6. This blend also allows the device to be used for telepresence, enabling play and learning to occur in real time across continents.


As Alphabot passes through a hole in the display panel, it appears to continue its journey through the virtual world projected onto the panels. Image Source: Gizmag/MIT

In an increasingly tech-centric world, keeping kids interested in learning can be an uphill battle. With play-based teaching recently attracting some powerful supportive voices, students from MIT’s Media Lab have developed a system that merges technology and play to stimulate young minds. The Playtime Computing system uses infrared emitters and tracking cameras to monitor the position of a special robot within a play area. As the bot disappears into a hole in a panel, it appears to continue its journey into a virtual world projected onto the walls.

The Playtime Computing system developed by MIT Media Laboratory’s Personal Robots Group is aimed at children between 4 and 6 years old and lets them get up and about instead of sitting around and getting bored – a hot topic at the moment, given Michelle Obama’s Let’s Move campaign. It also allows for early experimentation with concepts such as symbolic reasoning and social roles.

The system is made up of three panels with projectors behind them, plus a set of four ceiling projectors that send images to the play area floor. Alphabot, a cube-shaped robot with infrared emitters at its corners, is tracked by ceiling-mounted cameras. A virtual landscape is projected onto the panels and floor to blur the barrier between reality and the artificially created world. To further the illusion, as Alphabot disappears into a hole in the panel and some robotic foliage closes behind it, the image projected onto the panel appears to show it continuing its journey into the virtual world.
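The article doesn’t go into implementation detail, but the tracking loop it describes is easy to picture: ceiling cameras detect the corner emitters, a calibration maps camera pixels to floor positions, and crossing the panel plane triggers the handoff to the projected world. Here is a minimal sketch of that idea in Python; the homography values, hole position and blob coordinates are all invented for illustration and are not from the MIT system.

```python
import numpy as np

# Hypothetical 3x3 homography mapping ceiling-camera pixels to floor
# coordinates in metres; in a real rig this would come from calibration.
H = np.array([[0.004, 0.0,   -1.2],
              [0.0,   0.004, -0.9],
              [0.0,   0.0,    1.0]])

PANEL_X = 1.5  # floor x-coordinate of the panel hole (assumed)

def pixel_to_floor(u, v):
    """Project one detected IR-blob centroid into floor coordinates."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w

def track(blob_centroids):
    """Estimate the robot's centre from its four corner emitters and
    report whether it has crossed to the 'virtual' side of the panel."""
    pts = [pixel_to_floor(u, v) for u, v in blob_centroids]
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    return (cx, cy), cx > PANEL_X

# One example frame: pixel centroids of Alphabot's four corner emitters.
centre, in_virtual_world = track([(640, 360), (700, 360), (640, 420), (700, 420)])
print(centre, in_virtual_world)
```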

A set of RFID-tagged wooden alphabet letters and symbols such as musical notes was also created so that the children can stick them onto Alphabot’s face. Placing letters onto the bot causes its face to change color to match, while musical notes cause music to play through its onboard speakers. As the robot disappears into the virtual world beyond the panel, the symbol placed by the kids also carries through to the animated version.
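A plausible way to structure this symbol behaviour is a simple lookup from tag ID to action. The sketch below is ours, not MIT’s code: the tag IDs and the print stand-ins for the robot’s face and speaker calls are entirely hypothetical.

```python
# Hypothetical RFID tag IDs mapped to Alphabot behaviours; the real
# system's tag scheme and robot API are not described in the article.
TAG_ACTIONS = {
    "tag-letter-A": ("set_face_color", "red"),
    "tag-letter-B": ("set_face_color", "blue"),
    "tag-note-C4":  ("play_note", 261.63),   # middle C, in Hz
}

def on_tag_read(tag_id):
    """Dispatch one scanned tag to the behaviour it should trigger."""
    action = TAG_ACTIONS.get(tag_id)
    if action is None:
        return                       # unknown or unsupported tag
    verb, arg = action
    if verb == "set_face_color":
        print(f"face -> {arg}")      # stand-in for the face-colour call
    elif verb == "play_note":
        print(f"play {arg} Hz")      # stand-in for the onboard speaker

on_tag_read("tag-note-C4")
```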

International playtime

The fun needn’t stop with just one play room, however. “One of the things we’re really excited about is having two of these spaces, one here and maybe one in Japan, and when the robot goes into the virtual world here, it comes out of the virtual world in Japan,” explained the group’s Adam Setapen. “So that kind of fits in with that one-reality concept, that there’s one robot, and whether it’s physical or virtual is based on the state of the robot in the Playtime Computing system.”
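Setapen’s “one reality” idea boils down to an invariant: there is exactly one robot, and it is in exactly one state at a time. A toy sketch of that state machine, with all state names invented here:

```python
from enum import Enum

class RobotState(Enum):
    PHYSICAL_LOCAL = 1    # embodied in the local play space
    VIRTUAL = 2           # travelling through the projected world
    PHYSICAL_REMOTE = 3   # embodied in the linked play space abroad

state = RobotState.PHYSICAL_LOCAL   # the single shared state

def enter_hole():
    """Robot drives through the panel hole: hand off to the projection."""
    global state
    state = RobotState.VIRTUAL

def arrive_remote():
    """The linked site's robot emerges: re-embodied on the far side."""
    global state
    state = RobotState.PHYSICAL_REMOTE
```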

Of course, kids being kids, the young prototype testers crammed lots of different symbols onto the bot, which it wasn’t developed to handle. They also expected other objects placed in the hole to appear on the screen. Future developments of the system may well take such things in stride, with children perhaps being able to send a favorite toy into the virtual world.

Maybe it would also be interesting to see how they would deal with a digital twin!

Another aspect of the system is the Creation Station, a table-top computer where youngsters can arrange objects or draw pictures. Whatever is on the Station is recreated on the panels via the projectors.

The researchers also kitted out the young system testers with baseball caps sporting infrared emitters. This allowed the system to keep track of the kids as well as Alphabot, which could make possible such things as interaction with the computer-animated robot in future versions. If the team can develop the system to operate using something like Microsoft’s Kinect gaming technology, then players could be tracked without having to rely on infrared-equipped clothing.

The team says that the current prototype was put together using off-the-shelf parts at a cost of just a few hundred dollars, and believes that mass production for home use is a viable possibility.

Source: Gizmag

 

Robonaut Tweets

We have already blogged about Robotars – humanoid robots that are controlled virtually from a remote location – and NASA’s efforts in this field are developing further with their Robonaut 2.

At the recent Artificial Intelligence Lunch Debate, a diverse group of experts discussed the implications of this sort of blended reality, particularly in relation to sensory feedback technology, which gives users a more heightened and tactile experience and provides new, more tangible ways of behaving through and with new representational forms.

Commenting on the problems with traditional understandings of artificial intelligence at the Lunch Debate in June, Professor Noel Sharkey suggested that with robots and avatars we should not be saying “I think therefore I am” but instead, “I feel therefore I am”.

Daily Galaxy has a great article on Robonaut 2, reproduced below:

NASA’s Robonaut 2, or R2, is getting ready to work on the International Space Station in November, but it’s already tweeting about preparations under the account @AstroRobonaut.

The humanoid robot — complete with a head, arms and an upper torso — will be the first dexterous humanoid robot in space, and in one of its first tweets, alluding to 2001: A Space Odyssey, it assures its followers: “No, no relation to Hal. Don’t know if I’d want to admit to having him on my family tree if I was. Definitely don’t condone his actions.” It has also tweeted that it’s not related to Boba Fett.

Is this another vivid sign that we have entered the dawn of the age of post-biological intelligence?

Although there are already several robots in space — including the famous, now AI-enhanced Mars Rovers, which have been zipping around the red planet for years — NASA and G.M. have created the first human-like robot to leave Earth.

The robot is called Robonaut 2, or R2 for short, and it weighs in at 300 pounds, with a head, torso and two fully functional arms. At first, R2 will be monitored in space to see how it performs in weightlessness, but NASA hopes to eventually use R2 to assist astronauts during space walks and to work alongside engineers in the space station.

In a joint news release, John Olson, director of NASA’s Exploration Systems Integration Office, said, “The partnership of humans and robots will be critical to opening up the solar system and will allow us to go farther and achieve more than we can probably even imagine today.”

According to researchers on the project, “Robonaut systems are the first humanoids specifically designed for space.”

Robonaut is a collaboration between the Robot Systems Technology Branch at the NASA Johnson Space Center and the US military’s Defense Advanced Research Projects Agency (DARPA) to build a robotic ‘astronaut equivalent’. Robonaut looks a bit like a human, with an upper torso, two arms and a head – all controlled by a human operator through telerobotic technologies. Robonaut was designed with the concept of creating a robot for tasks that ‘were not specifically designed for robots.’ In order for the Robonaut to complete these ‘human-like’ tasks, it is equipped with hands that are actually more dexterous than those of an astronaut in a pressurized spacesuit.

In 2004, the second generation of Robonaut gained mobility when engineers attached its body to a Segway Robotic Mobility Platform (RMP) commissioned by DARPA. Using virtual reality instruments, a human operator was immersed in the Robonaut’s actual environment and was able to perform remote operations.

According to researchers on Robonaut, “As the project matures with increased feedback to the human operator, the Robonaut system will approach the handling and manipulation capabilities of a suited astronaut.”

With more ‘haptic technology’, which uses sensory feedback to recreate the sense of touch, a user might wear gloves that allow them to ‘feel’ objects in a virtual world. You could examine the texture and weight of rocks, or even experience the crunch of icy Martian dirt.
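One standard way such gloves render contact is penalty-based haptics: push back on the user with a force proportional to how far their fingertip has penetrated a virtual surface. The stiffness and geometry below are illustrative only, not from any NASA device.

```python
STIFFNESS = 800.0   # N/m, a typical order of magnitude for haptic devices
SURFACE_Z = 0.0     # height of the virtual rock's surface, in metres

def feedback_force(fingertip_z):
    """Return the upward force (N) the glove should render."""
    penetration = SURFACE_Z - fingertip_z
    if penetration <= 0.0:
        return 0.0                   # not touching: render no force
    return STIFFNESS * penetration   # push back harder the deeper you press

print(feedback_force(-0.002))   # 2 mm into the surface -> 1.6 N
```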

Dr Grace Augustine’s Avatars on Pandora go well beyond current technologies. We’re not going to be growing any biological avatars for human explorers in the lab – but modern robotics is getting close to providing a ‘human’ experience through increased dexterity and mobility. Robotic avatars could allow humans to fully experience the environment of other worlds. Through the eyes of robotic avatars we could watch the sunrise over the rusty, red crater rims without having to “experience suffocation, the icy death of -200 degrees C on their skin or the sting of microscopic dust in their eyes.”

Even though NASA and others have come a long way in developing avatars, the technology still has a long way to go before we’re having adventures on Pandora-like planets. Perhaps more advanced civilizations on distant worlds have developed avatars just as good as those in the movie.

R2 will be a passenger on the Space Shuttle Discovery, which is scheduled to head to the space station in September.

Source Credits:

http://www.dailygalaxy.com/my_weblog/2010/08/robonaut-2-the-first-humanoid-tweeting-from-space.html

http://www.spacedaily.com/reports/Avatars_In_Space_999.html

http://bits.blogs.nytimes.com/2010/04/14/nasa-and-gm-robot-heading-to-space-station/?src=busln


 

Robot Resemblance

Little Island of Japan is a company that creates clone robots, and to date its robotic dolls have borne a close resemblance to celebrities as well as politicians, earning coverage on TV shows and in worldwide news.

For those who want a robotic avatar of themselves, it will take around three months from your order for the robot to be built and delivered right to your doorstep. These robots come with built-in sensors to detect when people are nearby, and are fully capable of waving their hands and saying a simple “Hello”. Each robot stands 70 cm tall and will set you back a cool $2,200 after currency conversion.

Source: Ubergizmo

 

Anybots – Work Anywhere

One of the central questions of Robots and Avatars is what it would be like to collaborate with a robot in the workplace. Further, we are exploring the implications this would have for how we present ourselves to our colleagues in both physical and virtual space.

We wonder how it will be possible to envisage robots as colleagues, and are incredibly excited by the potential in a hybrid between robots and avatars – ‘Robotars’, as Prof. Noel Sharkey calls them – which we think will help us push forward the possibilities for new and blended methods of work, play and collaboration in 10–15 years’ time.

Anybots, a California-based company that makes telepresence robots, today announced the launch of QB, the first professional-quality mobile proxy robot. QB is the first in a line of Anybots made to connect people and locations. Accessible from any web browser, QB represents you throughout the workplace from wherever you are.
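Anybots hasn’t published its control protocol, but “accessible from any web browser” suggests something like a persistent WebSocket carrying drive commands and returning acknowledgements alongside the video stream. The sketch below is purely hypothetical: the endpoint and message format are invented for illustration.

```python
import asyncio
import json

import websockets  # pip install websockets

# Invented endpoint and message format: Anybots' real protocol is not
# public, so this only illustrates the browser-to-robot idea.
ROBOT_URI = "ws://qb.example.com/control"

async def drive(forward, turn):
    """Send one velocity command to the robot and wait for its reply."""
    async with websockets.connect(ROBOT_URI) as ws:
        await ws.send(json.dumps({"forward": forward, "turn": turn}))
        print("robot replied:", await ws.recv())

# Would connect to the (hypothetical) robot: half speed ahead, no turning.
asyncio.run(drive(0.5, 0.0))
```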

Trevor Blackwell, Founder & CEO of Anybots, says:
“Remote-presence robots add a new layer to the engagement available for a distributed workforce. The global Internet is now fast enough for millions of people to be streaming live video and 4G cellular data will soon be deployed everywhere — so in very short order, web-based robotics will no longer be limited to facilities with Wi-Fi.”

Hyoun Park, Research Analyst at Aberdeen Group, says:
“By combining audiovisual telepresence with the freedom of robotic mobility and an easy-to-use remote control, Anybots has created a new level of remote presence. The QB telepresence robot provides the functionality needed for business processes without falling prey to the “uncanny valley” of discomfort associated with fully anthropomorphic robotic builds. QB could change the current model for remote interactions in research and development, corporate collaboration, retail, sales and customer service.”

 

NASA’s Robotic Avatars

“If every habitable world in the universe is unique, and the precise chemical conditions of a planet help shape the life that evolves there, then avatars could allow aliens to visit other worlds from the safety of their spaceship. Could it be that all the stories of alien encounters on Earth were really encounters with alien avatars? Maybe aliens don’t actually look like grey humanoids with large eyes and no noses. Instead, that haunting image may simply be what we look like to them.”

Astrobiology Magazine

At the Kinetica Art Fair Collaborative Futures Panel, Anna Hill (Creative Director of Space Synapse) explained that she is “…working on systems to get from space to Earth, and offer some sort of collaboration between the two.”

She offered some examples, including Remote Suit, a wearable system designed to share the experience of being in space with people on Earth, and the Symbiotic Sphere – a pod that gathers inspirational space data, including images, videos, sound and haptics from space, the idea being to give those who sit in it a sense of what it is like to be in space.

Anna outlined her vision of the future: “I can envisage a feminising of technology. I’m very interested in augmented learning and collective and systemic thinking – there will be fewer top-down organisations. And there’s a need for robots not to replace humans.”

NASA is no stranger to robotics, with more than 50 robotic spacecraft studying Earth and reaching throughout the solar system, from Mercury to Pluto and beyond. But their latest development in the field of ‘Telerobotics’ marks a new development in how robots and avatars could work together to facilitate more sophisticated unmanned space exploration.

“Tomorrow’s NASA space program will be different,” says Wallace Fowler of the University of Texas, a renowned expert in modeling and design of spacecraft, and planetary exploration systems. “Human space flight beyond Low Earth Orbit (LEO), beyond Earth’s natural radiation shields (the Van Allen belts), is dangerous. Currently, a human being outside the Van Allen belts could receive the NASA defined “lifetime dose” of galactic cosmic radiation within 200 days.”

The current robots used by NASA, however, are a long way off the vision proposed in the film Avatar, where human users truly ‘experience’ the environment they are placed in. This is where virtual reality environments begin to change things, as highlighted in the Daily Galaxy blog:

The Virtual Interactive Environment Workstation (VIEW) was an early virtual reality instrument developed at NASA Ames. It was a leap forward in true ‘immersion’ of the user in a virtual environment, and was one of the first systems to use a ‘data glove’. This glove measured and tracked how a user moved their fingers, allowing interaction with the virtual world.
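The essence of a data glove is a per-finger calibration from raw flex-sensor readings to joint angles. A minimal sketch of that mapping follows; the sensor range and angle limits are invented, and VIEW’s actual hardware details differ.

```python
RAW_STRAIGHT = 200   # sensor reading with the finger straight (assumed)
RAW_BENT = 900       # reading with the finger fully curled (assumed)
MAX_ANGLE = 90.0     # degrees of flexion at full curl

def finger_angle(raw):
    """Linearly interpolate a flex-sensor reading to a joint angle."""
    t = (raw - RAW_STRAIGHT) / (RAW_BENT - RAW_STRAIGHT)
    t = min(max(t, 0.0), 1.0)    # clamp readings outside the calibration
    return t * MAX_ANGLE

print(finger_angle(550))   # halfway reading -> 45.0 degrees
```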

Today, NASA uses 3D and virtual technologies for a number of public outreach and education projects. The technology can also be used for training purposes, allowing an astronaut to practice, say, walking on the surface of Mars. NASA is developing technologies that will allow a human explorer based on Earth, or in the relative safety of a space station or habitat, to actually experience exploration of a distant location. If the technology can be tied to robotic ‘avatars’ on a planetary surface in real time, the user would not simply experience a simulation of the world – they could directly participate in exploration and science as if they were there.

Closer to the exploration front, similar technologies are also being used in NASA’s most avatar-like experiment of all – the Robonaut. According to researchers on the project, “Robonaut systems are the first humanoids specifically designed for space.”

Robonaut is a collaboration between the Robot Systems Technology Branch at the NASA Johnson Space Center and the US military’s Defense Advanced Research Projects Agency (DARPA) to build a robotic ‘astronaut equivalent’. Robonaut looks a bit like a human, with an upper torso, two arms and a head – all controlled by a human operator through telerobotic technologies. Robonaut was designed with the concept of creating a robot for tasks that ‘were not specifically designed for robots.’ In order for the Robonaut to complete these ‘human-like’ tasks, it is equipped with hands that are actually more dexterous than those of an astronaut in a pressurized spacesuit.

In 2004, the second generation of Robonaut gained mobility when engineers attached its body to a Segway Robotic Mobility Platform (RMP) commissioned by DARPA. Using virtual reality instruments, a human operator was immersed in the Robonaut’s actual environment and was able to perform remote operations.

More recently, NASA revealed the next generation of Robonaut, dubbed R2. General Motors has now joined on as a partner, and hopes that Robonaut will not only explore other worlds, but will help humans build safer cars. For more information on the R2 project, click here to see videos with some of the key researchers involved.

According to researchers on Robonaut, “As the project matures with increased feedback to the human operator, the Robonaut system will approach the handling and manipulation capabilities of a suited astronaut.”

With more ‘haptic technology’, which uses sensory feedback to recreate the sense of touch, a user might wear gloves that allow them to ‘feel’ objects in a virtual world. You could examine the texture and weight of rocks, or even experience the crunch of icy Martian dirt.

Dr Grace Augustine’s Avatars on Pandora go well beyond current technologies. We’re not going to be growing any biological avatars for human explorers in the lab – but modern robotics is getting close to providing a ‘human’ experience through increased dexterity and mobility. Robotic avatars could allow humans to fully experience the environment of other worlds. Through the eyes of robotic avatars we could watch the sunrise over the rusty, red crater rims without having to “experience suffocation, the icy death of -200 degrees C on their skin or the sting of microscopic dust in their eyes.”

Even though NASA and others have come a long way in developing avatars, the technology still has a long way to go before we’re having adventures on Pandora-like planets. Perhaps more advanced civilizations on distant worlds have developed avatars just as good as those in the movie.

Sources: Steve Boxer on the Robots and Avatars Collaborative Futures Panel, DailyGalaxy.com

 

Ever heard of a Robotar?

Presenting as part of the Robots and Avatars Collaborative Futures Panel at the Kinetica Art Fair 2010, Professor Noel Sharkey coined the phrase “Robotars”, citing the example of physical military drones operating in war zones yet controlled by operators in the Nevada desert. He explained how “Virtual Reality is coming into play in a new way, which you could call ‘Real Virtuality’ – you’re looking at VR in a cocoon, where you can smell, touch and so on.”

MIT’s recent MeBot – a semi-autonomous robotic avatar that gives people a richer way to interact remotely with an audience than phone and video conferencing allow – brings this idea from military spheres into the personal domain. The MeBot is designed to convey its user’s gestures, head movements and proxemics, and in doing so its designers aim to expand the capabilities of mobile and wireless communication. Initial experiments showed that users felt more psychologically involved in the remote interaction, particularly because of the basic embodiment that the robot allows.
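At its simplest, conveying head movement means mapping the operator’s tracked head pose onto the robot’s neck servos, clamped to the mechanism’s range. MIT’s control code isn’t public, so the ranges below are invented for illustration.

```python
SERVO_MIN, SERVO_MAX = -60.0, 60.0   # assumed neck range, in degrees

def clamp(angle):
    return max(SERVO_MIN, min(SERVO_MAX, angle))

def head_to_neck(operator_yaw, operator_pitch):
    """Map the operator's head pose onto the robot's pan/tilt neck."""
    return clamp(operator_yaw), clamp(operator_pitch)

print(head_to_neck(75.0, -10.0))   # -> (60.0, -10.0): yaw saturates
```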

Check out this video to see the MeBot in action:

http://www.youtube.com/watch?v=aME2aeIzbQo
