Posts by alex


Robonaut Tweets

We have already blogged about Robotars – humanoid robots that are controlled virtually from a remote location – and NASA’s efforts in this field are developing further with their Robonaut 2.

At the recent Artificial Intelligence Lunch Debate, a diverse group of experts discussed the implications of this sort of blended reality, particularly in relation to sensory feedback technology, which gives users a heightened, tactile experience and provides new, more tangible ways of behaving through and with new representational forms.

Commenting on the problems with traditional understandings of artificial intelligence at the Lunch Debate in June, Professor Noel Sharkey suggested that with robots and avatars we should not be saying “I think therefore I am” but instead “I feel therefore I am”.

Daily Galaxy has a great article on the Robonaut 2, which is reproduced below:

NASA’s Robonaut 2, or R2, is getting ready to work on the International Space Station in November but it’s already tweeting about preparations under the account, @AstroRobonaut.

The humanoid robot — complete with a head, arms and an upper torso — will be the first dexterous humanoid robot in space. In one of its first tweets, alluding to 2001: A Space Odyssey, it assured its followers: “No, no relation to Hal. Don’t know if I’d want to admit to having him on my family tree if I was. [Definately] don’t condone his actions.” It also tweeted that it’s not related to Boba Fett.

Is this another vivid sign that we have entered the dawn of the age of post-biological intelligence?

Although there are already several robots in space — including the famous, now AI-enhanced Mars rovers, which have been zipping around the red planet for years — NASA and G.M. have created the first human-like robot to leave Earth.

The robot is called Robonaut 2, or R2 for short, and it weighs in at 300 pounds, with a head, torso and two fully functional arms. At first, R2 will be monitored in space to see how it performs in weightlessness, but NASA hopes to eventually use R2 to assist astronauts during space walks and to work alongside engineers in the space station.

In a joint news release, John Olson, director of NASA’s Exploration Systems Integration Office, said, “The partnership of humans and robots will be critical to opening up the solar system and will allow us to go farther and achieve more than we can probably even imagine today.”

According to researchers on the project, “Robonaut systems are the first humanoids specifically designed for space.”

Robonaut is a collaboration between the Robot Systems Technology Branch at the NASA Johnson Space Center and the US military’s Defense Advanced Research Projects Agency (DARPA) to build a robotic ‘astronaut equivalent’. Robonaut looks a bit like a human, with an upper torso, two arms and a head – all controlled by a human operator through telerobotic technologies. Robonaut was designed with the concept of creating a robot for tasks that ‘were not specifically designed for robots.’ In order for the Robonaut to complete these ‘human-like’ tasks, it is equipped with hands that are actually more dexterous than those of an astronaut in a pressurized spacesuit.

In 2004, the second generation of Robonaut gained mobility when engineers attached its body to a Segway Robotic Mobility Platform (RMP) commissioned by DARPA. Using virtual reality instruments, a human operator was immersed in the Robonaut’s actual environment and was able to perform remote operations.

According to researchers on Robonaut, “As the project matures with increased feedback to the human operator, the Robonaut system will approach the handling and manipulation capabilities of a suited astronaut.”

With more ‘haptic technology’, which uses sensory feedback to recreate the sense of touch, a user might wear gloves that allow them to ‘feel’ objects in a virtual world. You could examine the texture and weight of rocks, or even experience the crunch of icy Martian dirt.
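As a rough illustration of how this kind of sensory feedback is commonly computed, the sketch below uses penalty-based haptic rendering: when the user’s fingertip penetrates a virtual surface, a restoring force proportional to the penetration depth is sent to the glove’s actuators. The constant and function name are illustrative assumptions, not taken from any NASA system.

```python
# Penalty-based haptic rendering: model a virtual surface as a stiff spring.
# STIFFNESS is a plausible order of magnitude for a stiff virtual wall.
STIFFNESS = 800.0  # newtons per metre of penetration

def feedback_force(penetration_m):
    """Restoring force (N) pushing the fingertip back out of the surface."""
    if penetration_m <= 0:  # fingertip is not touching the virtual object
        return 0.0
    return STIFFNESS * penetration_m

# Pressing 2 mm into a virtual rock produces roughly a 1.6 N push-back,
# which the glove's actuators would render against the fingertip.
```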

Dr Grace Augustine’s Avatars on Pandora go well beyond current technologies. We’re not going to be growing any biological avatars for human explorers in the lab – but modern robotics is getting close to providing a ‘human’ experience through increased dexterity and mobility. Robotic avatars could allow humans to fully experience the environment of other worlds. Through the eyes of robotic avatars we could watch the sunrise over the rusty, red crater rims without having to “experience suffocation, the icy death of -200 degrees C on their skin or the sting of microscopic dust in their eyes.”

Even though NASA and others have come a long way in developing avatars, the technology still has a long way to go before we’re having adventures on Pandora-like planets. Perhaps more advanced civilizations on distant worlds have developed avatars just as good as those in the movie.

R2 will be a passenger on the Space Shuttle Discovery, which is scheduled to head to the space station in September.

Source Credits:

http://www.dailygalaxy.com/my_weblog/2010/08/robonaut-2-the-first-humanoid-tweeting-from-space.html

http://www.spacedaily.com/reports/Avatars_In_Space_999.html

http://bits.blogs.nytimes.com/2010/04/14/nasa-and-gm-robot-heading-to-space-station/?src=busln


Lunch Debate Highlights – Artificial Intelligence

We have just finished the first of our highlight videos which document the Lunch Debate on Artificial Intelligence, which took place on 28th June 2010. The Robots and Avatars Lunch Debates bring together a diverse and specialised group of professionals and experts to deepen the research and conversation around Robots and Avatars and ask ‘What sort of world are we educating our young people for?’

Lunch Debate #1 – Provocation by Professor Noel Sharkey

Provocation by Professor Noel Sharkey, University of Sheffield
Produced by body>data>space

Lunch Debate #1 – Artificial Intelligence – Highlights

At NESTA, June 28th 2010
Produced by body>data>space

 

Robot Teachers

Some of the key themes of the recent Robots and Avatars Lunch Debates on Artificial Intelligence and Behaviours and Ethics concern the relationship of humans to robots and avatars. We have been asking how we might relate to these new representational forms in the future, and what the implications of this might be. Central to this debate have been questions of authenticity, credibility, trust and security, highlighted by the fact that, ultimately, it is currently impossible for an inanimate object or a graphical representation to ‘love you back’. In short, the relationships that we think we might be having with robots or avatars tend to be deceptive and illusory.

One area where this seems to have particular implications is the use of service or domestic robots, which is set to be a major growth area in the near future. Below is a great article by blogger Donald Clark about robot teachers, which surveys some of the developments and issues around this sort of future.

Robot teachers – endlessly patient, CPD updates in seconds

Andrea Thomaz, right, and Nick DePalma in 2009 with Simon, a robot being developed at Georgia Tech.

The article in the New York Times will no doubt drive traditionalists to apoplexy, so there must be something in this report about the rise of robots in schools. Imagine a teacher that is endlessly patient, always teaches the correct skill in the correct way, provides lots of constructive feedback, can receive CPD updates in seconds, never gets ill and costs less than one month of a teacher’s salary. That’s the long-term promise.

What makes all this possible are advances in AI, motion tracking and language recognition. We have already seen what Microsoft have done with Natal in terms of speech, gesture and motion recognition. Plonk this into a robot and we’re really getting somewhere. The point in the short term is not to replace teachers but to take over some teaching tasks.

The focus, for the moment, is on early years education, playing to the ‘cute’ factor. This makes sense. We could see significant advances in early numeracy, literacy and second language skills in schools with software that provides guidance superior to that of many teachers. In addition, they can be updated, wirelessly, in seconds and can even learn as they teach.

The basic premise is sound and was shown convincingly by Nass and Reeves in The Media Equation – we treat technology as human if it has the right affective behaviours: good timing, being polite, co-operative and so on. This is how computer games work. With the right movement, sounds and behaviour, avatars and robots seem real. Tens of millions use and experience this phenomenon every day. There’s even a touch of this in your ATM, where you’d rather deal with a hole in a wall than a live teller.

Robots don’t get hangovers, don’t take holidays, never discriminate on grounds of gender, race or accent. They’re patient, scalable and consistent. The ideal teacher!

Robots & language learning

The initial trials have been in learning tasks in autism and English as a second language. South Korea sees English as a key skill for growth, and its Institute of Science and Technology has developed Engey to teach English. Their goal is to have an effective robot that is better than the average teacher within 3–5 years. This is part of a general push in robotics that sees robots do things humans do in areas such as production, the military, health and education. Hundreds of robots are already in South Korea’s 8,400 kindergartens, and the plan is to have one in every kindergarten by 2013.

The University of California has been doing studies with a robot called RUBI, teaching Finnish. Initial studies show that on specific language tasks the children do as well in tests as children taught by real teachers. Retention was significantly better after 12 weeks, with a reduction in errors of over 25%. Another interesting finding is that the robots need not look like real people; in fact, high fidelity seems to be a little ‘creepy’ for kids (though for an amazingly lifelike robot, watch this).

CES 2010 featured a wonderful talking robot that follows you around the house and teaches you languages. It was remarkably sophisticated, with voice recognition, face recognition and picture recognition (show it a picture and it will say the word in your chosen language).

Robots & autism

In a collaborative Japanese/US research project, children with autism have been shown to respond positively to synchronised behaviour from a robot. This is used to move the child on to other types of social interaction. At the University of Connecticut, a French robot is being used with autistic children, employing mimicry to establish trust. Have a look at BeatBots’ Keepon robot, designed for kids with autism.

Robots and personalised learning

Personalised learning can also be realised through one-on-one interaction, with the robot engaging in conversations and learning from the learner. Work of this kind has been going on at the Georgia Institute of Technology with a robot called Simon. Improvements in AI and natural language processing have led to results in the robotic world that promise one-to-one tuition in the future.

Robots & physical tasks

There’s also the teaching of physical tasks, such as setting a table, where Honda Labs have taught older children to complete the task without the aid of teachers. Robots can already complete physical manufacturing tasks way beyond the physical capability, speed and accuracy of a human. We’ve had 25 years of robotic surgery, with robots being used to perform surgery at a distance, carry out unmanned surgery and minimise invasiveness. In May 2006, the first unassisted robotic surgery conducted by an AI doctor was performed on a 34-year-old male to correct heart arrhythmia. The results were rated as better than those of an above-average human surgeon. The machine had a database of 10,000 similar operations, and so, in the words of its designers, was “more than qualified to operate on any patient.” The designers believe that robots can replace half of all surgeons within 15 years. In January 2009, the first all-robotic-assisted kidney transplant was performed in the US by Dr. Stuart Geffner. The same team performed eight more fully robotic-assisted kidney transplants over the next six months.

Conclusion

It is only natural that robots, which have replaced highly skilled tasks in manufacturing, should be considered for teaching. Automating repetitive, difficult and dangerous tasks has always been technology’s trump card. If we know one thing about teaching, it’s that it is difficult and demanding, leading to unnatural levels of stress and illness. If we can, at the very least, relieve the pressure on teachers, that is surely a noble aim. In their own way, simple screen-based programmes like BBC Bitesize and e-learning have already automated a lot of education and training. Robots promise to personalise this process. Every passing month sees improvements in movement, gesture and language recognition, with the technology due to appear in the games world by Christmas this year. I have no doubt that robo-teaching will be common in schools in my lifetime.

Source: http://donaldclarkplanb.blogspot.com/

Robots and Avatars Vodcasts

Robots and Avatars are producing a series of vodcasts which will be available to view on this site. They explore the themes of the programme, including Artificial Intelligence, Behaviours and Ethics, Health and Wellbeing and the Future Workplace, from the perspective of a diverse array of professionals and experts who share their expertise and insight in this series of interviews.

Vodcast #1 – Professor Noel Sharkey

University of Sheffield

On artificial intelligence and the future of work and place. With Professor Noel Sharkey, Professor of Robotics and Artificial Intelligence, University of Sheffield.
robotsandavatars.net

Vodcast #2 – Fiddian Warman

Artist/Director-  Soda

On artificial intelligence and the future of work and place. With Fiddian Warman – Artist/Director – Soda.
www.robotsandavatars.net

Lunch Debates 2010

Between June and October 2010, Robots and Avatars are hosting a series of Lunch Debates which bring together a diverse and specialised group of professionals and experts to deepen the research and conversation around Robots and Avatars and ask ‘What sort of world are we educating our young people for?’ The Lunch Debates help focus the overall theme of the programme into a series of specific areas, which include:

  • Artificial Intelligence
  • Behaviours and Ethics
  • Health and Wellbeing
  • Future Workplaces

Content from these debates will be shared shortly after each event on this site and will include video and writing.

June 2010

Artificial Intelligence

Artificial Intelligence – its evolution in Robots and Avatars – this will be a highly topical debate on the illusions and realities of intelligence and cognition, free will and stand-alone decisions by non-human agents such as robots and avatars, blended robots/avatars (robotars), M2M interfaces and communication developments of all types.

It will envision the involvement of a mix of robots, avatars, telepresence and real-time presence in the workplace, and examine the consequences of AI for future team spaces.

Provocateur – Professor Noel Sharkey BA PhD FIET FBCS CITP FRIN FRSA – Professor of AI and Robotics / Professor of Public Engagement at the University of Sheffield (Department of Computer Science), Project Champion for Robots and Avatars.

Moderators – Ghislaine Boddington (Creative Director, body>data>space) and Benedict Arora (Programme Director Education, NESTA)

 

Want to find out about Avatars – make your own!

Robots and Avatars discusses and debates the future implications of using avatars within education and for young people: how they might or might not be best used to mediate identity, and how we can think about collaboration with them. But it is important not to forget that one of the best ways to find out about virtual presence is by making and using your own avatar. On the right is one I have just created!

The progression from experts having to create avatars to pretty much any user being able to experiment with virtual presence and virtual worlds has enabled a far greater integration of avatars, not just into our experience of using the web but also, into our everyday lives. The foremost environment for this is of course Second Life but avatars pop up all over the place – sometimes we don’t even realise that we are using them. Facets of more complex avatar identities found in Second Life and online gaming environments can be seen in much simpler terms on our Facebook profiles and Twitter accounts and more and more sites are asking you to create an ‘avatar’ as an important basis for communication via websites, in comment boxes and so on.

As security is a vital issue for students who wish to have an online presence, Robots and Avatars seeks to open out the discussion and create new models for learners around how to safely, creatively and intuitively empower them to make these decisions themselves. There is also a more playful and creative exercise in the actual creation of avatars themselves. Underpinning this is a consistent interplay between your ‘real’ self/identity and the virtual version you choose to put out there.

Below are a series of tools that you can use to create an avatar to express your identity while still retaining a degree of anonymity. These would be an excellent starting point for teachers and educators interested in integrating avatars into their lessons, as they allow simple, playful and creative engagement with virtual identity. Robots and Avatars is just putting the final touches to a series of workshops which explore these sorts of issues in more depth and with key experts and professionals. To find out more about this, check out our new education section.

Avatar Making Tools:

Osoq – A nice little tool to create an animated Avatar, plenty of options and works very well.

Simpsonize Yourself – Have you ever wanted to be in the Simpsons?

Doppel Me – Creates a very life-like Avatar in no time at all.

Build Your Wild Self – Something a little different. This allows you to build an avatar which is half human half animal.

LegoMan – Create a Lego version of yourself. Not as lifelike, obviously, and you need to take a screen grab to copy the image as there is no way to export it.

Meez – The most sophisticated tool which creates an animated avatar to use as your identity. There are a range of download options, if it cannot be embedded directly to your website you can download the file as an animated gif which can then be inserted as an image file.

Mikons – This site doesn’t allow you to create a personal avatar but rather a personalised icon (Mikon) which could be used to represent your students. Well worth a look if you are looking to create an online presence, logo etc.

Evolver – A new site that allows you to create a 3D avatar. It gives complete control over the look, and the avatar can be saved as a static image or an animated GIF. There is also a function to upload a real photo of yourself. This site also offers access to a 3D world.

HeroMaker – Create your own superhero avatar.

Voki – Allows you to create personalised speaking avatars, made by Oddcast, and use them on your blog, profile and in email messages.

Avatar Yourself – Oddcast is the leader in providing talking characters – a more sophisticated option than Voki. They produce tools for a range of marketing campaigns, which can be viewed as a collection by following the link. Simply upload a picture of yourself and begin.

Some sources from Web 2.0 in Education

 

Robot Resemblance

Little Island of Japan is a company that creates clone robots, and to date their robotic dolls have managed to bear a close resemblance to celebrities as well as politicians, having been highlighted in TV shows and worldwide news.

For those who want a robotic avatar of themselves, it will take around three months from placing an order for the robot to be built and delivered to your doorstep. These robots come with built-in sensors to detect when people are nearby, and are fully capable of waving their hands and saying a simple “Hello”. Each robot stands 70cm tall and will set you back a cool $2,200 after conversion.

Source: Ubergizmo

 

Anybots – Work Anywhere

One of the central questions of Robots and Avatars is what it would be like to collaborate with a robot in the workplace. Further, we are exploring what the implications of this would be for how we present ourselves to our colleagues in both physical and virtual space.

We wonder how it will be possible to envisage robots as colleagues, and are incredibly excited by the potential of a hybrid between robots and avatars – ‘Robotars’, as Prof. Noel Sharkey calls them – which we think will help push forward the possibilities for new and blended methods of work, play and collaboration in 10–15 years’ time.

Anybots, a California-based company that makes telepresence robots, today announced the launch of QB, the first professional-quality mobile proxy robot. QB is the first in a line of Anybots made to connect people and locations. Accessible from any web browser, QB represents you throughout the workplace from wherever you are.

Trevor Blackwell, Founder & CEO of Anybots, says:
“Remote-presence robots add a new layer to the engagement available for a distributed workforce. The global Internet is now fast enough for millions of people to be streaming live video and 4G cellular data will soon be deployed everywhere — so in very short order, web-based robotics will no longer be limited to facilities with Wi-Fi.”

Hyoun Park, Research Analyst, Aberdeen Group, says:
“By combining audiovisual telepresence with the freedom of robotic mobility and an easy-to-use remote control, Anybots has created a new level of remote presence. The QB telepresence robot provides the functionality needed for business processes without falling prey to the “uncanny valley” of discomfort associated with fully anthropomorphic robotic builds. QB could change the current model for remote interactions in research and development, corporate collaboration, retail, sales and customer service.”

Social Robots and Social Networks

(Image: Mixed Reality Lab/National University of Singapore)

A core theme of Robots and Avatars concerns how young people might negotiate their identities online in the future. For many, the multiple identities that virtual spaces create afford them a certain freedom. This brings with it empowerment and new possibilities for the ways that they craft their social spaces. The energy and openness that many young people show when talking about these questions should certainly be celebrated, but questions of online credibility, security and cyber-bullying must of course be discussed as well. Petimos, due to be launched later this year, are aimed at 7 to 10-year-olds and are designed to place checks on the processes of interacting online, particularly through social networks.

Children will only be able to accept new online “friends” if their Petimos are brought into physical contact first, to guard against cyberbullies and paedophiles masquerading as children. The devices work in conjunction with an online social network called Petimo-World in which they are represented by avatars. By squeezing their physical Petimos, or pressing buttons on them, children can send messages or “gifts” to their online friends.

Parents are notified each time a friend request is made and can block approaches that concern them, so children only see and interact online with the avatars of approved friends.
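The approval flow described above can be sketched as a small state machine: a friend request only succeeds if the two devices have been physically paired first and a parent then approves it. The class and method names below are illustrative assumptions, not Petimo’s actual software.

```python
class Petimo:
    """Toy model of the Petimo pairing-then-approval flow (names are illustrative)."""

    def __init__(self, owner):
        self.owner = owner
        self.paired = set()   # owners of devices met in the physical world
        self.friends = set()  # approved online friends

    def touch(self, other):
        """Physical contact pairs both devices with each other."""
        self.paired.add(other.owner)
        other.paired.add(self.owner)

    def request_friend(self, other, parent_approves):
        """A request succeeds only after physical pairing AND parental approval."""
        if other.owner not in self.paired:
            return "rejected: devices never met physically"
        if not parent_approves:
            return "blocked by parent"
        self.friends.add(other.owner)
        other.friends.add(self.owner)
        return "accepted"
```

The key design point is that the physical-pairing check runs first, so a stranger masquerading as a child online can never even reach the parental-approval step.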

“Internet and text-based communication is only a small part of human communication that we have evolved with,” says the device’s inventor, Adrian David Cheok at the National University of Singapore. “I want to use new media to help develop more natural human forms of communication. Petimo is one step in this direction.”

A study published in January by the California-based Kaiser Family Foundation found 18 per cent of 8 to 10-year-olds in the US use social networking sites. My Secret Circle, Yoursphere and FaceChipz are sites used by children in this age group. Other surveys suggest that as many as 3 in 10 children have been subjected to bullying while online.

In March, leading UK police officers called on social networking sites to place a standard “panic button” designed by the Child Exploitation and Online Protection Centre on all pages.

Parents who took part in tests of Petimo in Singapore this year said the need for children to physically meet those who they wished to interact with online helped ease their fears about the risks posed by strangers.

While the devices will initially only be used with Petimo-World, Cheok hopes that eventually they could be used to provide safer access to other social networks.

Jennifer Perry of E-Victims, a group in the UK that helps victims of online crime, says the system’s appeal might be limited if children get bored with the restricted content of Petimo-World. “Children young enough to be content with a walled garden approach and its limitations will probably be too young to be seriously interested in the chat element,” she says.

Source: New Scientist

 

Mobile Phone Robot

Canadian researchers trying to integrate robots into our lives have come up with a pair of dancing, crying mobile-phone ’bots. The robots, called Callo and Cally, are mobile phones with limbs.

Cally stands about 18cm high and walks, dances and mimics human behavior. Callo stands about 23cm tall, and his face, which is a cell phone display screen, shows human facial expressions when he receives text-messaged emotions. When he receives a smile emoticon, Callo stands on one leg, waves his arms and smiles. If he receives a frown, his shoulders slump and he will cry. If he gets an urgent message, or a really sad one, he’ll wave his arms frantically.
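Callo’s reactions, as described above, amount to a lookup from emoticon to gesture sequence. The sketch below shows that mapping; the behaviour names are illustrative guesses, since the actual Callo software has not been published.

```python
# Map each recognised emoticon to the gesture sequence Callo performs.
# Behaviour names are hypothetical labels for the motions described in the article.
EMOTICON_BEHAVIOURS = {
    ":)": ["stand_on_one_leg", "wave_arms", "smile"],
    ":(": ["slump_shoulders", "cry"],
    "!!": ["wave_arms_frantically"],  # urgent message
}

def react(message):
    """Return the gesture sequence for the first known emoticon in a text message."""
    for emoticon, behaviours in EMOTICON_BEHAVIOURS.items():
        if emoticon in message:
            return behaviours
    return ["idle"]  # no recognised emoticon: do nothing special
```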

PhD student Ji-Dong Yim and Prof Chris D. Shaw from the School of Interactive Arts and Technology at Simon Fraser University in Canada have collaborated to create a robot using a Nokia N82 combined with components from a Bioloid kit.

Along with the ability to move in preprogrammed patterns when receiving phone calls from different numbers, the robot is also capable of detecting human faces using OpenCV (Open Source Computer Vision). The robot uses wireless networking, text messaging and other interactive technologies to communicate human emotions. It’s a “simple avatar system”, according to Yim.

The robot’s face, which is actually a phone screen, registers text-messaged emotions as human-like facial expressions.

“When you move your robot, my robot will move the same, and vice versa, so that we can share emotional feelings using ‘physically smart’ robot phones,” he says in an SFU release.

More videos here.
