Lunch Debates

Vodcast #6 – Professor Kevin Warwick on Cybernetics

 

Health and Wellbeing Lunch Debate

Participants at the Robots and Avatars Lunch Debate at NESTA

The recent lunch debate on Health and Wellbeing explored future scenarios for old age, medicine, care and the human body, asking: what sort of future should we be preparing our young people for? Robots and Avatars brought together a range of experts, including Professor Raymond Tallis (Emeritus Professor of Geriatric Medicine at the University of Manchester) and Professor Kevin Warwick (Professor of Cybernetics at Reading University), to discuss and debate the issue.

As part of the debate, participants envisioned care being administered by and through robots and avatars, the development of implants that would alter the way the brain works in order to cure many common diseases, greater life expectancy with a shorter period of ‘woe’ towards the end of people’s lives, and the possibility of self-diagnosis and treatment as a result of advances in medical technology.

Central to the debate were questions around the representation of humans in care scenarios. Some participants resolutely argued that there can be no replacement for human-to-human care, emphasising the importance of empathy. Others were keen to point to the relatively low uptake of new technologies such as telecare and mobile apps that help patients self-diagnose. Whilst participants’ personal trepidations about their own old age entered into the debate, it was also emphasised that a future of health and wellbeing in which robots and avatars play an increasingly important role is very unlikely to completely replace human-to-human contact; instead it would most likely serve to augment it. This area of the debate touched on many of the issues that Robots and Avatars has been exploring over the course of the Lunch Debate series, including the credibility of artificial intelligence, the need to address illusion within representational forms, and thinking about the ways in which we adopt new technologies.

Another key area of the debate focused on increased life expectancy, new ways of thinking about ‘old age’ and how we might go about changing perceptions now. According to the ‘most attractive model’ for the future of ageing put forward by Professor Raymond Tallis, today’s young people are expected to live longer and remain in better health for longer, significantly affecting the way the population ages. As such, it is clear that we will have to develop new ways of thinking about not just old age but age more generally. It is interesting to note that the word ‘teenager’ originated in the early 20th Century and has given rise to a complex set of ideas that strongly inform the ways we relate to, provide for and deal with 13-19 year olds. Now it is time for teenagers to start thinking about what they want to be called when they are fit and healthy in their 80s – and still with another 20 years to live.

We will be sharing video content from the Health and Wellbeing debate soon. To see video and reports from previous debates, click here.

Vodcast #4 – Thecla Schiphorst

Thecla Schiphorst is a Media Artist/Designer and Faculty Member in the School of Interactive Arts and Technology at Simon Fraser University in Vancouver, Canada.

On sensory technologies and the future of education. This vodcast can also be used as an introduction to sensory technologies for young people.

Vodcast #5 – Professor Raymond Tallis

Robots and Avatars Vodcast

Vodcast #3 – Professor Anna Craft

Professor of Education, University of Exeter and The Open University

On behaviours and ethics in education. With Professor Anna Craft, Professor of Education at the University of Exeter and the Open University.
robotsandavatars.net

 

Robonaut Tweets

We have already blogged about robotars – humanoid robots that are controlled virtually from a remote location – and NASA’s efforts in this field are developing further with their Robonaut 2.

At the recent Artificial Intelligence Lunch Debate, the diverse group of experts discussed the implications of this sort of blended reality, particularly in relation to the use of sensory feedback technology, which gives users a more heightened and tactile experience and provides new and more tangible ways of behaving through and with new representational forms.

Commenting on the problems with traditional understandings of artificial intelligence at the Lunch Debate in June, Professor Noel Sharkey suggested that with robots and avatars we should not be saying ‘I think therefore I am’ but instead, ‘I feel therefore I am’.

Daily Galaxy has a great article on the Robonaut 2, which is reproduced below:

NASA’s Robonaut 2, or R2, is getting ready to work on the International Space Station in November but it’s already tweeting about preparations under the account, @AstroRobonaut.

The humanoid robot — complete with a head, arms and an upper torso — will be the first dexterous humanoid robot in space and it assures its followers in one of its first tweets alluding to 2001: A Space Odyssey that, “No, no relation to Hal. Don’t know if I’d want to admit to having him on my family tree if I was. [Definately] don’t condone his actions.” It also tweeted that it’s not related to Boba Fett.

Is this another vivid sign that we have entered the dawn of the age of post-biological intelligence?

Although there are already several robots in space – including the famous, now AI-enhanced Mars Rovers, which have been zipping around the red planet for years – NASA and G.M. have created the first human-like robot to leave Earth.

The robot is called Robonaut 2, or R2 for short, and it weighs in at 300 pounds, with a head, torso and two fully functional arms. At first, R2 will be monitored in space to see how it performs in weightlessness, but NASA hopes to eventually use R2 to assist astronauts during space walks and to work alongside engineers in the space station.

In a joint news release, John Olson, director of NASA’s Exploration Systems Integration Office, said, “The partnership of humans and robots will be critical to opening up the solar system and will allow us to go farther and achieve more than we can probably even imagine today.”

According to researchers on the project, “Robonaut systems are the first humanoids specifically designed for space.”

Robonaut is a collaboration between the Robot Systems Technology Branch at the NASA Johnson Space Center and the US military’s Defense Advanced Research Projects Agency (DARPA) to build a robotic ‘astronaut equivalent’. Robonaut looks a bit like a human, with an upper torso, two arms and a head – all controlled by a human operator through telerobotic technologies. Robonaut was designed with the concept of creating a robot for tasks that ‘were not specifically designed for robots.’ In order for the Robonaut to complete these ‘human-like’ tasks, it is equipped with hands that are actually more dexterous than those of an astronaut in a pressurized spacesuit.

In 2004, the second generation of Robonaut gained mobility when engineers attached its body to a Segway Robotic Mobility Platform (RMP) commissioned by DARPA. Using virtual reality instruments, a human operator was immersed in the Robonaut’s actual environment and was able to perform remote operations.

According to researchers on Robonaut, “As the project matures with increased feedback to the human operator, the Robonaut system will approach the handling and manipulation capabilities of a suited astronaut.”

With more ‘haptic technology’, which uses sensory feedback to recreate the sense of touch, a user might wear gloves that allow them to ‘feel’ objects in a virtual world. You could examine the texture and weight of rocks, or even experience the crunch of icy Martian dirt.

Dr Grace Augustine’s Avatars on Pandora go well beyond current technologies. We’re not going to be growing any biological avatars for human explorers in the lab – but modern robotics are getting close to providing a ‘human’ experience through increased dexterity and mobility. Robotic avatars could allow humans to fully experience the environment of other worlds. Through the eyes of robotic avatars we could watch the sunrise over the rusty, red crater rims without having to “experience suffocation, the icy death of -200 degrees C on their skin or the sting of microscopic dust in their eyes.”

Even though NASA and others have come a long way in developing avatars, the technology still has a long way to go before we’re having adventures on Pandora-like planets. Perhaps more advanced civilizations on distant worlds have developed avatars just as good as those in the movie.

R2 will be a passenger on the Space Shuttle Discovery, which is scheduled to head to the space station in September.

Source Credits:

http://www.dailygalaxy.com/my_weblog/2010/08/robonaut-2-the-first-humanoid-tweeting-from-space.html

http://www.spacedaily.com/reports/Avatars_In_Space_999.html

http://bits.blogs.nytimes.com/2010/04/14/nasa-and-gm-robot-heading-to-space-station/?src=busln


Lunch Debate Highlights – Artificial Intelligence

We have just finished the first of our highlight videos, documenting the Lunch Debate on Artificial Intelligence, which took place on 28th June 2010. The Robots and Avatars Lunch Debates bring together a diverse and specialised group of professionals and experts to deepen the research and conversation around Robots and Avatars and to ask ‘What sort of world are we educating our young people for?’

Lunch Debate #1 – Provocation by Professor Noel Sharkey

Provocation by Professor Noel Sharkey, University of Sheffield
Produced by body>data>space

Lunch Debate #1 – Artificial Intelligence – Highlights

At NESTA, June 28th 2010
Produced by body>data>space

 

Robot Teachers

Some of the key themes of the recent Robots and Avatars Lunch Debates on Artificial Intelligence and Behaviours and Ethics concern the relationship of humans to robots and avatars. We have been asking how we might relate to these new representational forms in the future, and what the implications of this might be. Central to this debate have been questions of authenticity, credibility, trust and security, which have been highlighted because, ultimately, it is currently impossible for an inanimate object or a graphical representation to ‘love you back’. In short, the relationships that we think we might be having with robots or avatars tend to be deceptive and illusory.

One area where this seems to have particular implications is the use of service or domestic robots, which is set to be a major growth area in the near future. Below is a great article by blogger Donald Clark about robot teachers, which surveys some of the developments and issues around this sort of future.

Robot teachers – endlessly patient, CPD updates in seconds

Andrea Thomaz, right, and Nick DePalma in 2009 with Simon, a robot being developed at Georgia Tech.

The article in the New York Times will no doubt drive traditionalists to apoplexy, so there must be something in this report about the rise of robots in schools. Imagine a teacher that is endlessly patient, always teaches the correct skill in the correct way, provides lots of constructive feedback, can receive CPD updates in seconds, never gets ill and costs less than one month of a teacher’s salary. That’s the long-term promise.

What makes all this possible are advances in AI, motion tracking and language recognition. We have already seen what Microsoft have done with Natal in terms of speech, gesture and motion recognition. Plonk this into a robot and we’re really getting somewhere. The point in the short term is not to replace teachers but to take on SOME TEACHING TASKS.

The focus, for the moment, is on early years education, playing to the ‘cute’ factor. This makes sense. We could see significant advances in early numeracy, literacy and second language skills in schools with software that provides guidance superior to that of many teachers. In addition, they can be updated, wirelessly, in seconds and can even learn as they teach.

The basic premise is sound and was shown convincingly by Nass and Reeves in The Media Equation – we treat technology as human if it has the right affective behaviours: good timing, being polite, being co-operative and so on. This is how computer games work. With the right movement, sounds and behaviour, avatars and robots seem real. Tens of millions of people use and experience this phenomenon every day. There’s even a touch of this in your ATM, where you’d rather deal with a hole in the wall than a live teller.

Robots don’t get hangovers, don’t take holidays, never discriminate on grounds of gender, race or accent. They’re patient, scalable and consistent. The ideal teacher!

Robots & language learning

The initial trials have been in learning tasks for autism and English as a second language. S Korea sees English as a key skill in terms of growth, and its Institute of Science and Technology has developed Engey to teach English. Their goal is to have an effective robot that is better than the average teacher within 3-5 years. This is part of a general push in robotics that sees robots do things humans do in areas such as production, the military, health and education. Hundreds of robots are already in S Korea’s 8,400 kindergartens, and the plan is to have one in every kindergarten by 2013.

The University of California has been doing studies with a robot called RUBI, teaching Finnish. Initial studies show that the children do as well in tests on specific language tasks as children taught by real teachers. Retention was significantly better after 12 weeks, with a reduction in errors of over 25%. Another interesting finding is that the robots need not look like real people; in fact, high fidelity seems to be a little ‘creepy’ for kids (although for an amazingly lifelike robot, watch this).

CES 2010 featured a wonderful talking robot that follows you around the house and teaches you languages. It was remarkably sophisticated, with voice recognition, face recognition and picture recognition (show it a picture and it will say the word in your chosen language).

Robots & autism

In a collaborative Japanese/US research project, children with autism have been shown to respond positively to synchronised behaviour from a robot. This is used to move the child on to other types of social interaction. At the University of Connecticut, a French robot is being used with autistic children, using mimicry to establish trust. Have a look at Beatbot’s Keepon robot, designed for kids with autism.

Robots and personalised learning

Personalised learning can also be realised through one-on-one interaction, with the robot engaging in conversations and learning from the learner. Work of this kind has been going on at the Georgia Institute of Technology with a robot called Simon. Improvements in AI and natural language processing have led to results in the robotics world that promise one-to-one tuition in the future.

Robots & physical tasks

There’s also the teaching of physical tasks, such as setting a table, where Honda Labs have taught older children to complete the task without the aid of teachers. Robots can already complete physical manufacturing tasks well beyond the physical capability, speed and accuracy of a human. We’ve had 25 years of robotic surgery, with robots being used to operate at a distance, to perform unmanned surgery and to minimise invasiveness. In May 2006, the first unassisted robotic surgery conducted by an AI ‘doctor’ was performed on a 34-year-old male to correct heart arrhythmia. The results were rated as better than those of an above-average human surgeon. The machine had a database of 10,000 similar operations and so, in the words of its designers, was “more than qualified to operate on any patient.” The designers believe that robots could replace half of all surgeons within 15 years. In January 2009, the first all-robotic-assisted kidney transplant was performed in the US by Dr. Stuart Geffner. The same team performed eight more fully robotic-assisted kidney transplants over the next six months.

Conclusion

It is only natural that robots, which have taken over highly skilled tasks in manufacturing, should be considered for teaching. Automating repetitive, difficult and dangerous tasks has always been technology’s trump card. If we know one thing about teaching, it’s that it is difficult and demanding, leading to unnatural levels of stress and illness. If we can, at the very least, relieve the pressure on teachers, that is surely a noble aim. In their own way, simple robotic screen programmes like BBC Bitesize and e-learning have already automated a lot of education and training. Robots promise to personalise this process. Every passing month sees improvements in movement, gesture and language recognition, with the technology appearing in the games world by Christmas this year. I have no doubt that robo-teaching will be common in schools in my lifetime.

Source: http://donaldclarkplanb.blogspot.com/

Robots and Avatars Vodcasts

Robots and Avatars are producing a series of vodcasts which will be available to view on this site. They explore the themes of the programme, including Artificial Intelligence, Behaviours and Ethics, Health and Wellbeing and the Future Workplace, from the perspective of a diverse array of professionals and experts who share their expertise and insight in this series of interviews.

Vodcast #1 – Professor Noel Sharkey

University of Sheffield

On artificial intelligence and the future of work and place. With Professor Noel Sharkey, Professor of Robotics and Artificial Intelligence, University of Sheffield.
robotsandavatars.net

Vodcast #2 – Fiddian Warman

Artist/Director – Soda

On artificial intelligence and the future of work and place. With Fiddian Warman – Artist/Director – Soda.
www.robotsandavatars.net

Lunch Debates 2010

Between June and October 2010, Robots and Avatars are hosting a series of Lunch Debates which bring together a diverse and specialised group of professionals and experts to deepen the research and conversation around Robots and Avatars and ask ‘What sort of world are we educating our young people for?’ The Lunch Debates help focus the overall theme of the programme into a series of specific areas, which include:

  • Artificial Intelligence
  • Behaviours and Ethics
  • Health and Wellbeing
  • Future Workplaces

Content from these debates will be shared shortly after each event on this site and will include video and writing.

June 2010

Artificial Intelligence

Artificial Intelligence – its evolution in robots and avatars – this will be a highly topical debate on the illusions and realities of intelligence and cognition, free will and stand-alone decisions by agents such as robots and avatars, blended robots/avatars (robotars), M2M interfaces and communication developments of all types.

It will envision the involvement of a mix of robots, avatars, tele-presence and real-time presence in the workplace, and examine the consequences of AI for future team spaces.

Provocateur – Professor Noel Sharkey BA PhD FIET FBCS CITP FRIN FRSA – Professor of AI and Robotics / Professor of Public Engagement, University of Sheffield (Department of Computer Science), Project Champion for Robots and Avatars.

Moderators – Ghislaine Boddington (Creative Director, body>data>space) and Benedict Arora (Programme Director Education, NESTA)
