Robots and Avatars

Posts relating to all things that bring robots and avatars together, as well as posts relating to the overall project itself.


Outrace – Robots in Trafalgar Square

Outrace robots projecting into the air in Trafalgar Square

Credit: Outrace

This year, as part of the London Design Festival, the public were invited to take control of eight industrial robots on loan from Audi’s production line. OUTRACE is an installation by Clemens Weisshaar and Reed Kram that consists of six independent systems coordinated by one KWTC CONTROLLER. Messages were sent in by the public via a website and then processed by the system every 60 seconds.

By way of a powerful LED light source, positioned at the tool head of each robot, people’s messages were traced into the public space of Trafalgar Square. Long-exposure cameras captured the interactive light paintings and relayed them to the project website and social media platforms to be shared.
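For readers curious how such a pipeline might hang together, here is a minimal sketch in Python of the “one message per minute” idea described above: website submissions sit in a queue, a controller pops one every 60 seconds, turns it into a light path and shares the strokes out across the robot systems. All of the names, numbers and data structures below are illustrative assumptions, not details of the actual OUTRACE software.

```python
import time
from collections import deque

# Hypothetical queue of messages submitted through the project website.
submissions = deque(["HELLO LONDON", "ROBOTS AND AVATARS"])

NUM_SYSTEMS = 6      # the six independent robot systems mentioned above
CYCLE_SECONDS = 60   # one message processed every 60 seconds


def message_to_light_path(message):
    """Stand-in for the real path planner: turn each character into a
    stroke for the LED tool head to trace."""
    return [f"stroke:{char}" for char in message if not char.isspace()]


def dispatch_to_robots(path):
    """Share the strokes across the robot systems round-robin."""
    assignments = {system: [] for system in range(NUM_SYSTEMS)}
    for index, stroke in enumerate(path):
        assignments[index % NUM_SYSTEMS].append(stroke)
    return assignments


def run_controller(cycles):
    """Process one queued message per cycle, as the installation did."""
    for _ in range(cycles):
        if submissions:
            message = submissions.popleft()
            print(dispatch_to_robots(message_to_light_path(message)))
        time.sleep(CYCLE_SECONDS)


if __name__ == "__main__":
    run_controller(cycles=2)
```

The step this toy loop leaves out is the one the installation was really about: photographing the traced light path and pushing the image back to the website and social media.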

Robots and Avatars sent in a message to OUTRACE, which was shown at the rather unsociable time of 7.08am! Here is the video of the drawing – see if you can work out what we sent…

Digitalarti Magazine

Robots and Avatars features in a new and interactive edition of Digitalarti Magazine.

Digitalarti welcomes digital art pros, artists, festival organizers, journalists, collectors, galleries, institutions, digital art fans and all festival-goers around the world, and invites them to share experiences, information and artwork presentations, use the tools and databases, and have fun.

The site includes information and blogs about hundreds of digital art festivals, artists and places worldwide, including text, videos, pictures and much more.

Digitalarti is published by Digital Art International.


NASA’s Robotic Avatars

“If every habitable world in the universe is unique, and the precise chemical conditions of a planet helps shape the life that evolves there, then avatars could allow aliens to visit other worlds from the safety of their spaceship. Could it be that all the stories of alien encounters on Earth were really encounters with alien avatars? Maybe aliens don’t actually look like grey humanoids with large eyes and no noses. Instead, that haunting image may simply be what we look like to them.”

Astrobiology Magazine

At the Kinetica Art Fair Collaborative Futures Panel, Anna Hill (Creative Director of Space Synapse) explained that she is “…working on systems to get from space to Earth, and offer some sort of collaboration between the two.”

She offered some examples, including Remote Suit, a wearable system designed to share the experience of being in space with people on Earth, and the Symbiotic Sphere, a pod that gathers inspirational space data, including images, videos, sound and haptics, to give those who sit in it a sense of what it is like to be in space.

Anna outlined her vision of the future: “I can envisage a feminising of technology. I’m very interested in augmented learning and collective and systemic thinking – there will be fewer top-down organisations. And there’s a need for robots not to replace humans.”

NASA is no stranger to robotics, with more than 50 robotic spacecraft studying Earth and reaching throughout the solar system, from Mercury to Pluto and beyond. But its latest work in the field of ‘telerobotics’ marks a new step in how robots and avatars could work together to enable more sophisticated unmanned space exploration.

“Tomorrow’s NASA space program will be different,” says Wallace Fowler of the University of Texas, a renowned expert in modeling and design of spacecraft and planetary exploration systems. “Human space flight beyond Low Earth Orbit (LEO), beyond Earth’s natural radiation shields (the Van Allen belts), is dangerous. Currently, a human being outside the Van Allen belts could receive the NASA-defined ‘lifetime dose’ of galactic cosmic radiation within 200 days.”

The current robots used by NASA, however, are a long way from the vision proposed in the film Avatar, where human users truly ‘experience’ the environment they are placed in. This is where virtual reality environments begin to change things, as highlighted in the Daily Galaxy blog:

The Virtual Interactive Environment Workstation (VIEW) was an early virtual reality instrument developed at NASA Ames. It was a leap forward in true ‘immersion’ of the user in a virtual environment, and was the first system to use a ‘data glove’. This glove measured and tracked how a user moved their fingers, allowing interaction with the virtual world.
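The VIEW hardware itself is long retired, but the underlying idea of a data glove is easy to sketch: read one flex value per finger and map it onto the joint angles of a virtual hand. The sensor ranges, field names and example gesture below are made up for illustration and are not taken from the NASA system.

```python
from dataclasses import dataclass

# Illustrative flex-sensor range; a real glove reports raw sensor values
# that would need calibration per user.
SENSOR_MIN, SENSOR_MAX = 0.0, 1.0
MAX_BEND_DEGREES = 90.0


@dataclass
class VirtualHand:
    """Bend angles (in degrees) for a five-fingered virtual hand."""
    thumb: float = 0.0
    index: float = 0.0
    middle: float = 0.0
    ring: float = 0.0
    little: float = 0.0


def glove_to_hand(readings):
    """Map normalised flex readings (0 = straight, 1 = fully bent)
    onto bend angles of the virtual hand."""
    angles = {
        finger: min(max(value, SENSOR_MIN), SENSOR_MAX) * MAX_BEND_DEGREES
        for finger, value in readings.items()
    }
    return VirtualHand(**angles)


# One frame of (made-up) glove readings: roughly a pointing gesture.
frame = {"thumb": 0.6, "index": 0.05, "middle": 0.9, "ring": 0.9, "little": 0.85}
print(glove_to_hand(frame))
```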

Today, NASA uses 3D and virtual technologies for a number of public outreach and education projects. The technology can also be used for training purposes, allowing an astronaut to practice, say, walking on the surface of Mars. NASA is developing technologies that will allow a human explorer based on Earth, or in the relative safety of a space station or habitat, to actually experience exploration of a distant location. If the technology can be tied to robotic ‘avatars’ on a planetary surface in real time, the user would not simply experience a simulation of the world but could directly participate in exploration and science as if they were there.

Closer to the exploration front, similar technologies are also being used in NASA’s most avatar-like experiment of all – the Robonaut. According to researchers on the project, “Robonaut systems are the first humanoids specifically designed for space.”

Robonaut is a collaboration between the Robot Systems Technology Branch at the NASA Johnson Space Center and the US military’s Defense Advanced Research Projects Agency (DARPA) to build a robotic ‘astronaut equivalent’. Robonaut looks a bit like a human, with an upper torso, two arms and a head – all controlled by a human operator through telerobotic technologies. Robonaut was designed with the concept of creating a robot for tasks that ‘were not specifically designed for robots.’ In order for the Robonaut to complete these ‘human-like’ tasks, it is equipped with hands that are actually more dexterous than those of an astronaut in a pressurized spacesuit.

In 2004, the second generation of Robonaut gained mobility when engineers attached its body to a Segway Robotic Mobility Platform (RMP) commissioned by DARPA. Using virtual reality instruments, a human operator was immersed in the Robonaut’s actual environment and was able to perform remote operations.

More recently, NASA revealed the next generation of Robonaut, dubbed R2. General Motors has now joined as a partner, and hopes that Robonaut will not only explore other worlds but also help humans build safer cars. For more information on the R2 project, click here to see videos with some of the key researchers involved.

According to researchers on Robonaut, “As the project matures with increased feedback to the human operator, the Robonaut system will approach the handling and manipulation capabilities of a suited astronaut.”

With more ‘haptic technology’, which uses sensory feedback to recreate the sense of touch, a user might wear gloves that allow them to ‘feel’ objects in a virtual world. You could examine the texture and weight of rocks, or even experience the crunch of icy Martian dirt.
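As a rough illustration of that feedback loop, the sketch below uses a simple spring model: when the tracked fingertip sinks below a virtual surface, the glove’s actuators push back with a force proportional to how far it has penetrated. The stiffness value and surface height are invented for the example rather than taken from any real haptic device.

```python
# A minimal sketch of the idea behind haptic feedback: render a force
# proportional to how far the fingertip has penetrated a virtual surface.
SURFACE_HEIGHT = 0.0   # virtual ground plane, metres (illustrative)
STIFFNESS = 300.0      # spring constant in N/m (made-up value)


def feedback_force(fingertip_height):
    """Return the upward force (newtons) the glove's actuators would render."""
    penetration = SURFACE_HEIGHT - fingertip_height
    if penetration <= 0:
        return 0.0              # not touching the surface: no force
    return STIFFNESS * penetration


# Fingertip heights from a (hypothetical) tracker, metres above the surface.
for height in (0.02, 0.0, -0.005, -0.02):
    print(f"height {height:+.3f} m -> force {feedback_force(height):.1f} N")
```

The same loop, run at a high enough rate and fed richer surface models, is what lets a user ‘feel’ texture and resistance rather than just see them.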

Dr Grace Augustine’s Avatars on Pandora go well beyond current technologies. We’re not going to be growing any biological avatars for human explorers in the lab, but modern robotics is getting close to providing a ‘human’ experience through increased dexterity and mobility. Robotic avatars could allow humans to fully experience the environment of other worlds. Through the eyes of robotic avatars, explorers could watch the sunrise over rusty, red crater rims without having to “experience suffocation, the icy death of -200 degrees C on their skin or the sting of microscopic dust in their eyes.”

Even though NASA and others have come a long way in developing avatars, the technology still has a long way to go before we’re having adventures on Pandora-like planets. Perhaps more advanced civilizations on distant worlds have developed avatars just as good as those in the movie.

Sources: Steve Boxer on the Robots and Avatars Collaborative Futures Panel, DailyGalaxy.com

Is your virtual identity credible?

Go for a walk around Second Life and you might well bump into avatars wearing next to nothing, or you might find yourself talking to a human-sized dog. However, with many companies and organisations using virtual worlds as a new workspace, the way that employees present themselves as avatars is becoming increasingly important to consider. Analysts at Gartner Inc. predicted that by the end of 2013, 70% of companies will have set behavior guidelines and dress codes for employees who use remotely controlled online characters in business settings.

At the first Robots and Avatars Forum, Pear Urishima from Apple shared her vision of future jobs that go beyond the mere presence of avatars in the world of work. She placed emphasis on how we might manage the trustworthiness and credibility of avatars in the workspace, and suggested the need for ‘Virtual Identity Managers’ who might manage an individual’s representation online. Click here to find out more about the Robots and Avatars Forum.

How the iPhone can Reboot Education

How do you educate a generation of students eternally distracted by the internet, cellphones and video games? Easy. You enable them by handing out free iPhones — and then integrating the gadget into your curriculum.

To find out more, visit Wired: http://tiny.cc/kAITR

Documentation

Documentation team at the Robots and Avatars Forum

Below you can find the archive of the Scribble Live feed from the Robots and Avatars Forum, which was documented by live writers, video and photography.

Photos

Please see the gallery section to see photos from the event.
