Posts tagged noel sharkey
Read Robots and Avatars’ project champion Noel Sharkey’s opinion column on the mindless use of robots in war in this week’s edition of the Guardian (3 December 2012).
‘The rational approach to the inhumanity of automating death by machines beyond the control of human handlers is to prohibit it
Are we losing our humanity by automating death?
Human Rights Watch (HRW) thinks so. In a new report, co-published with Harvard Law School’s International Human Rights Clinic, they argue the “case against killer robots”. This is not the stuff of science fiction. The killer robots they refer to are not Terminator-style cyborgs hellbent on destroying the human race. There is not even a whiff of Skynet.
These are the mindless robots I first warned Guardian readers about in 2007 – robots programmed to independently select targets and kill them. Five years on from that call for legislation, there is still no international discussion among state actors, and the proliferation of precursor technologies continues unchecked.’
We have already blogged about Robotars – humanoid robots that are controlled virtually from a remote location – and NASA’s efforts in this field are developing further with their Robonaut 2.
At the recent Artificial Intelligence Lunch Debate, the diverse group of experts discussed the implications of this sort of blended reality, particularly in relation to sensory feedback technology, which gives users a more heightened and tactile experience and provides new, more tangible ways of behaving through and with new representational forms.
Commenting on the problems with traditional understandings of artificial intelligence at the Lunch Debate in June, Professor Noel Sharkey suggested that with robots and avatars we should not be saying “I think therefore I am” but instead “I feel therefore I am”.
Daily Galaxy has a great article on Robonaut 2, reproduced below:
NASA’s Robonaut 2, or R2, is getting ready to work on the International Space Station in November, but it’s already tweeting about preparations under the account @AstroRobonaut.
The humanoid robot — complete with a head, arms and an upper torso — will be the first dexterous humanoid robot in space and it assures its followers in one of its first tweets alluding to 2001: A Space Odyssey that, “No, no relation to Hal. Don’t know if I’d want to admit to having him on my family tree if I was. [Definately] don’t condone his actions.” It also tweeted that it’s not related to Boba Fett.
Is this another vivid sign that we have entered the dawn of the age of post-biological intelligence?
Although there are already several robots in space — including the famous, now AI-enhanced Mars Rovers, which have been zipping around the red planet for years — NASA and G.M. have created the first human-like robot to leave Earth.
The robot is called Robonaut 2, or R2 for short, and it weighs in at 300 pounds, with a head, torso and two fully functional arms. At first, R2 will be monitored in space to see how it performs in weightlessness, but NASA hopes to eventually use R2 to assist astronauts during space walks and to work alongside engineers in the space station.
In a joint news release, John Olson, director of NASA’s Exploration Systems Integration Office, said, “The partnership of humans and robots will be critical to opening up the solar system and will allow us to go farther and achieve more than we can probably even imagine today.”
According to researchers on the project, “Robonaut systems are the first humanoids specifically designed for space.”
Robonaut is a collaboration between the Robot Systems Technology Branch at the NASA Johnson Space Center and the US military’s Defense Advanced Research Projects Agency (DARPA) to build a robotic ‘astronaut equivalent’. Robonaut looks a bit like a human, with an upper torso, two arms and a head – all controlled by a human operator through telerobotic technologies. Robonaut was designed with the concept of creating a robot for tasks that ‘were not specifically designed for robots.’ In order for the Robonaut to complete these ‘human-like’ tasks, it is equipped with hands that are actually more dexterous than those of an astronaut in a pressurized spacesuit.
In 2004, the second generation of Robonaut gained mobility when engineers attached its body to a Segway Robotic Mobility Platform (RMP) commissioned by DARPA. Using virtual reality instruments, a human operator was immersed in the Robonaut’s actual environment and was able to perform remote operations.
According to researchers on Robonaut, “As the project matures with increased feedback to the human operator, the Robonaut system will approach the handling and manipulation capabilities of a suited astronaut.”
With ‘haptic technology’, which uses sensory feedback to recreate the sense of touch, a user might wear gloves that allow them to ‘feel’ objects in a virtual world. You could examine the texture and weight of rocks, or even experience the crunch of icy Martian dirt.
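As a rough illustration of the idea, the snippet below sketches how simulated contact forces from a virtual world might be mapped to per-fingertip actuator intensities in a haptic glove. All function and parameter names here are hypothetical; real haptic SDKs and device drivers expose very different interfaces.

```python
# Hypothetical sketch only: mapping simulated contact forces to a haptic
# glove's vibration intensities. Names are illustrative, not a real API.

def force_to_intensity(force_newtons, max_force=10.0):
    """Clamp a simulated contact force into a 0.0-1.0 actuator intensity."""
    return max(0.0, min(force_newtons / max_force, 1.0))

def render_touch(contact_forces):
    """Convert per-fingertip forces into per-actuator intensity commands."""
    return {finger: force_to_intensity(f) for finger, f in contact_forces.items()}

# A rough virtual surface might press back with different forces per fingertip:
commands = render_touch({"thumb": 2.5, "index": 7.0, "pinky": 12.0})
print(commands)  # forces above max_force are clamped to full intensity (1.0)
```

The point of the clamp is simply that whatever the physics simulation reports, the actuator command stays in a safe, bounded range.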
Dr Grace Augustine’s Avatars on Pandora go well beyond current technologies. We’re not going to be growing any biological avatars for human explorers in the lab – but modern robotics is getting close to providing a ‘human’ experience through increased dexterity and mobility. Robotic avatars could allow humans to fully experience the environment of other worlds. Through the eyes of robotic avatars we could watch the sunrise over the rusty, red crater rims without having to “experience suffocation, the icy death of -200 degrees C on their skin or the sting of microscopic dust in their eyes.”
Even though NASA and others have come a long way in developing avatars, the technology still has a long way to go before we’re having adventures on Pandora-like planets. Perhaps more advanced civilizations on distant worlds have developed avatars just as good as those in the movie.
R2 will be a passenger on the Space Shuttle Discovery, which is scheduled to head to the space station in September.
We have just finished the first of our highlight videos which document the Lunch Debate on Artificial Intelligence, which took place on 28th June 2010. The Robots and Avatars Lunch Debates bring together a diverse and specialised group of professionals and experts to deepen the research and conversation around Robots and Avatars and ask ‘What sort of world are we educating our young people for?’
Lunch Debate #1 - Provocation by Professor Noel Sharkey
Provocation by Professor Noel Sharkey, University of Sheffield
Produced by body>data>space
Lunch Debate #1 - Artificial Intelligence – Highlights
At NESTA, June 28th 2010
Produced by body>data>space
Between June and October 2010, Robots and Avatars are hosting a series of Lunch Debates which bring together a diverse and specialised group of professionals and experts to deepen the research and conversation around Robots and Avatars and ask ‘What sort of world are we educating our young people for?’ The Lunch Debates help focus the overall theme of the programme into a series of specific areas, which include:
- Artificial Intelligence
- Behaviours and Ethics
- Health and Wellbeing
- Future Workplaces
Content from these debates will be shared shortly after each event on this site and will include video and writing.
Artificial Intelligence – its evolution in Robots and Avatars – this will be a highly topical debate on the illusions and realities of intelligence and cognition, free will and stand-alone decisions by artificial agents such as robots and avatars, blended robots/avatars (robotars), M2M interfaces, and communication developments of all types.
It will envision the involvement of a mix of robots, avatars, tele-presence and real-time presence in the workplace and examine the consequences of AI for future team spaces.
Provocateur – Professor Noel Sharkey BA PhD FIET, FBCS CITP FRIN FRSA – Professor of AI and Robotics / Professor of Public Engagement at the University of Sheffield (Department of Computer Science), Project Champion for Robots and Avatars.
Moderators – Ghislaine Boddington (Creative Director, body>data>space) and Benedict Arora (Programme Director Education, NESTA)
Presenting as part of the Robots and Avatars Collaborative Futures Panel at the Kinetica Art Fair 2010, Professor Noel Sharkey coined the phrase “Robotars”, citing the example of physical military drones operating in war-zones, yet controlled by operators in the Nevada desert. He explained how “Virtual Reality is coming into play in a new way, which you could call ‘Real Virtuality’ – you’re looking at VR in a cocoon, where you can smell, touch and so on.”
MIT’s recent MeBot – a semi-autonomous robotic avatar that gives people a richer way to interact remotely with an audience than phone or video conferencing allows – brings this idea from military spheres into the personal domain. The MeBot is designed to convey its user’s gestures, head movements and proxemics, and in doing so its designers aim to expand the capabilities of mobile and wireless communication. Initial experiments showed that users felt more psychologically involved in the remote interaction, particularly because of the basic embodiment that the robot allows.
Check out this video to see the MeBot in action:
Robots and Avatars held a panel discussion at the Kinetica Art Fair in London on 6th February 2010 which looked at future collaboration with robots and avatars in work and play space.
The panel was made up of some fascinating experts from the digital, creative, academic and educational sectors, and included Professor Noel Sharkey (University of Sheffield), Ron Edwards (Ambient Performance), Ghislaine Boddington (body>data>space), Peter McOwan (Queen Mary University of London), Anna Hill (Space Synapse) and Michael Takeo Magruder (King’s Visualisation Lab, King’s College London).
Ghislaine Boddington introduced the event by talking about body>data>space’s work and how the Robots and Avatars programme will “look at robots and avatars in the future, and examine how young people will work and play with representative forms in both the virtual and physical worlds.” Peter McOwan revealed details of his work on a European project called LIREC: Living with Robots and Interactive Companions and delved into human relationships with robots. Anna Hill from Space Synapse explored how earth-to-space collaborations work and emphasised the importance of a ‘feminised view of technology’. Michael Takeo Magruder talked about how we relate to avatars and shared his work at King’s Visualisation Lab, King’s College London, within Second Life. Bringing the virtual into the workplace was the central theme of Ron Edwards’s presentation, as he explained his “enterprise-grade virtual worlds” that bring data into virtual training environments. Robots and Avatars stalwart and project champion Noel Sharkey wrapped up the presentations by talking about his new phrase “Robotars” – suggesting a hybrid between robots and avatars and challenging the ways in which we think of them now.
Click here to see further content from the Collaborative Futures Panel, including Steve Boxer’s full report on the Panel.