Archive for March, 2010
At the Robots and Avatars Forum, Pear Urishima from Apple flagged up the use of the iPhone in healthcare, explaining how doctors could monitor patients’ statistics in real time right from their phones. She also showed images of how projections of x-rays and scans could be placed onto human bodies to allow doctors to operate more effectively and precisely. This introduction of the virtual into the health sector marks a significant development in how doctors will carry out their work in the future, and highlights the skills that the doctors of the future need to be learning today.
How will this increase in information benefit the specialised work that doctors and surgeons do? Will the role of the doctor or surgeon develop to become based solely around virtual interaction and avatars rather than the physical ‘hands on’ approach? These questions are pertinent at South Miami Hospital in the US, where this year just 19 surgeons will perform over 1,000 robotic surgeries.
Since the programme began in 2007 the hospital has become one of the key locations for using robots – known as the Da Vinci Surgical System – in surgical procedures. Dr. Jonathan Masel, a urologist in the Memorial Healthcare System who performs open, traditional laparoscopic and robotic surgery, is convinced the robot is the most precise.
“The more complex the procedure, the more I move to the robot. Its 3D optics are just like the movie Avatar.”
Even though there is still more work to be done in terms of scientific studies regarding the use of robots in surgery, to a layperson the developments are remarkable. The human surgeon sits at a computer console peering into a monitor that gives him or her a virtual view inside the patient’s body that is full-colour, three-dimensional and magnified 10 times. Across the room, the robot’s four massive arms wield delicate surgical instruments inside the patient, carrying out the surgeon’s instructions with space-age precision.
“The robot is better,” says Dr. Ricardo Estape, a gynecological surgeon at South Miami Hospital who helped start its robotic program. “You can see what you’re doing so much better than even with open surgery. You can’t stick your head in somebody’s pelvis with open surgery when you’re doing a radical hysterectomy.”
“The robot is amazing,” says Dr. Lynn Seto, a cardiac surgeon who performed 450 robotic heart surgeries at Cleveland Clinic in Ohio before South Miami recruited her to help start its robotic heart program. “The view is so good you actually think you’re inside the body.”
“If every habitable world in the universe is unique, and the precise chemical conditions of a planet helps shape the life that evolves there, then avatars could allow aliens to visit other worlds from the safety of their spaceship. Could it be that all the stories of alien encounters on Earth were really encounters with alien avatars? Maybe aliens don’t actually look like grey humanoids with large eyes and no noses. Instead, that haunting image may simply be what we look like to them.”
At the Kinetica Art Fair Collaborative Futures Panel, Anna Hill (Creative Director of Space Synapse) explained that she is “…working on systems to get from space to Earth, and offer some sort of collaboration between the two.”
She offered some examples, including Remote Suit, a wearable system designed to share the experience of being in space with people on Earth, and the Symbiotic Sphere – a pod which gathers inspirational space data including images, videos, sound and haptics from space, the idea being to give those who sit in it an idea of what it is like to be in space.
Anna outlined her vision of the future: “I can envisage a feminising of technology. I’m very interested in augmented learning and collective and systemic thinking – there will be fewer top-down organisations. And there’s a need for robots not to replace humans.”
NASA is no stranger to robotics, with more than 50 robotic spacecraft studying Earth and reaching throughout the solar system, from Mercury to Pluto and beyond. But its latest work in the field of ‘Telerobotics’ marks a new development in how robots and avatars could work together to facilitate more sophisticated unmanned space exploration.
“Tomorrow’s NASA space program will be different,” says Wallace Fowler of the University of Texas, a renowned expert in modeling and design of spacecraft, and planetary exploration systems. “Human space flight beyond Low Earth Orbit (LEO), beyond Earth’s natural radiation shields (the Van Allen belts), is dangerous. Currently, a human being outside the Van Allen belts could receive the NASA defined “lifetime dose” of galactic cosmic radiation within 200 days.”
The current robots used by NASA, however, are a long way off the vision proposed in the film Avatar, where human users truly ‘experience’ the environment they are placed in. This is where virtual reality environments begin to change things, as highlighted in the Daily Galaxy blog:
The Virtual Interactive Environment Workstation (VIEW) was an early virtual reality instrument developed at NASA Ames. It was a leap forward in true ‘immersion’ of the user in a virtual environment, and was one of the first systems to use a ‘data glove’. This glove measured and tracked how a user moved their fingers, allowing interaction with the virtual world.
Today, NASA uses 3D and virtual technologies for a number of public outreach and education projects. The technology can also be used for training purposes, allowing an astronaut to practise, say, walking on the surface of Mars. NASA is developing technologies that will allow a human explorer based on Earth, or in the relative safety of a space station or habitat, to actually experience exploration of a distant location. If the technology can be tied to robotic ‘avatars’ on a planetary surface in real time, the user would not simply experience a simulation of the world – but could directly participate in exploration and science as if they were there.
Closer to the exploration front, similar technologies are also being used in NASA’s most avatar-like experiment of all – the Robonaut. According to researchers on the project, “Robonaut systems are the first humanoids specifically designed for space.”
Robonaut is a collaboration between the Robot Systems Technology Branch at the NASA Johnson Space Center and the US military’s Defense Advanced Research Projects Agency (DARPA) to build a robotic ‘astronaut equivalent’. Robonaut looks a bit like a human, with an upper torso, two arms and a head – all controlled by a human operator through telerobotic technologies. Robonaut was designed with the concept of creating a robot for tasks that ‘were not specifically designed for robots.’ In order for the Robonaut to complete these ‘human-like’ tasks, it is equipped with hands that are actually more dexterous than those of an astronaut in a pressurized spacesuit.
In 2004, the second generation of Robonaut gained mobility when engineers attached its body to a Segway Robotic Mobility Platform (RMP) commissioned by DARPA. Using virtual reality instruments, a human operator was immersed in the Robonaut’s actual environment and was able to perform remote operations.
More recently, NASA revealed the next generation of Robonaut, dubbed R2. General Motors has now joined on as a partner, and hopes that Robonaut will not only explore other worlds, but will help humans build safer cars. For more information on the R2 project, click here to see videos with some of the key researchers involved.
According to researchers on Robonaut, “As the project matures with increased feedback to the human operator, the Robonaut system will approach the handling and manipulation capabilities of a suited astronaut.”
With more ‘haptic technology’ which uses sensory feedback to recreate the sense of touch, a user might wear gloves that allow them to ‘feel’ objects in a virtual world. You could examine the texture and weight of rocks, or even experience the crunch of icy martian dirt.
Dr Grace Augustine’s Avatars on Pandora go well beyond current technologies. We’re not going to be growing any biological avatars for human explorers in the lab – but modern robotics is getting close to providing a ‘human’ experience through increased dexterity and mobility. Robotic avatars could allow humans to fully experience the environment of other worlds. Through the eyes of robotic avatars we could watch the sunrise over the rusty, red crater rims without having to “experience suffocation, the icy death of -200 degrees C on their skin or the sting of microscopic dust in their eyes.”
Even though NASA and others have come a long way in developing avatars, the technology still has a long way to go before we’re having adventures on Pandora-like planets. Perhaps more advanced civilizations on distant worlds have developed avatars just as good as those in the movie.
Go for a walk around Second Life and you may well bump into avatars wearing next to nothing, or perhaps find yourself talking to a human-sized dog. However, with many companies and organisations using virtual worlds as a new workspace, the way that employees present themselves as avatars is becoming increasingly important to consider. Analysts Gartner Inc. predicted that by the end of 2013, 70% of companies will have set behaviour guidelines and dress codes for employees who use the remotely controlled online characters in business settings.
At the first Robots and Avatars Forum, Pear Urishima from Apple shared her visions of future jobs. Going beyond the presence of avatars in the future world of work, she placed emphasis on how we might manage the trustworthiness and credibility of avatars in the workspace, and suggested the need for ‘Virtual Identity Managers’, who might manage an individual’s representation online. Click here to find out more about the Robots and Avatars Forum.
How do you educate a generation of students eternally distracted by the internet, cellphones and video games? Easy. You enable them by handing out free iPhones — and then integrating the gadget into your curriculum.
To find out more visit Wired http://tiny.cc/kAITR
Presenting as part of the Robots and Avatars Collaborative Futures Panel at the Kinetica Art Fair 2010, Professor Noel Sharkey coined the phrase “Robatars”, citing the example of physical military drones operating in war-zones yet controlled by operators in the Nevada desert. He explained how “virtual reality is coming into play in a new way, which you could call ‘Real Virtuality’ – you’re looking at VR in a cocoon, where you can smell, touch and so on.”
MIT’s recent MeBot – a semi-autonomous robotic avatar that gives people a richer way to interact remotely with an audience than phone or video conferencing allows – brings this idea from military spheres into the personal domain. The MeBot is designed to convey its user’s gestures, head movements and proxemics, and in doing so its designers aim to expand the capabilities of mobile and wireless communication. Initial experiments showed that users felt more psychologically involved in the remote interaction, particularly because of the basic embodiment that the robot allows.
Check out this video to see the MeBot in action:
Robots and Avatars is an innovative and fascinating project exploring how young people will work and play with new representational forms of themselves and others in virtual and physical life in the next 10-15 years.
It examines multi-identity evolutions of today’s younger generations within the context of a world in which virtual and physical spaces are increasingly blended. A participatory web- and events-led programme with connected educational activities is taking place across 2010 and onwards, in the UK and internationally.
We will be posting the latest content relating to the many questions and issues that the Robots and Avatars programme explores. Covering wide-ranging areas including education, virtual worlds, robotics, the arts and health, this site is an invaluable resource for anyone interested in what our work and play spaces of the future will be like and what skills we might need to make the most of them.
Do check back here regularly to keep updated with the goings-on in the world of Robots and Avatars. You can also subscribe to our RSS feed here.
Robots and Avatars held a panel discussion at the Kinetica Art Fair in London on 6th February 2010, which looked at future collaboration with robots and avatars in work and play spaces.
The panel was made up of some fascinating experts from the digital, creative, academic and educational sectors, and included Professor Noel Sharkey (University of Sheffield), Ron Edwards (Ambient Performance), Ghislaine Boddington (body>data>space), Peter McOwan (Queen Mary University of London), Anna Hill (Space Synapse) and Michael Takeo Magruder (King’s Visualisation Lab, King’s College London).
Ghislaine Boddington introduced the event by talking about body>data>space’s work and how the Robots and Avatars programme will “look at robots and avatars in the future, and examine how young people will work and play with representative forms in both the virtual and physical worlds.” Peter McOwan revealed details of his work on a European project called LIREC: Living with Robots and Interactive Companions, and delved into human relationships with robots. Anna Hill from Space Synapse explored how earth-to-space collaborations work and emphasised the importance of a ‘feminised view of technology’. Michael Takeo Magruder talked about how we relate to avatars and shared his work at King’s Visualisation Lab within Second Life. Bringing the virtual into the workplace was the central theme of Ron Edwards’s presentation, as he explained his “enterprise-grade virtual worlds” that bring data into virtual training environments. Robots and Avatars stalwart and project champion Noel Sharkey wrapped up the presentations by talking about his new phrase “Robatars” – suggesting a hybrid between robots and avatars and challenging the ways in which we think of them now.
Click here to see further content from the Collaborative Futures Panel, including Steve Boxer’s full report on the Panel.