Virtual/Physical Play Robot

Here is an interesting article from Gizmag about the Playtime Computing System developed by MIT Media Laboratory’s Personal Robots Group. The system blends robotic and virtual interfaces and is currently designed for children between the ages of 4 and 6. This blend also allows the device to be used with telepresence, enabling play and learning to occur in real time across continents.

children playing with the playtime computing system

As Alphabot passes through a hole in the display panel, it appears to continue its journey through the virtual world projected onto the panels. Image Source: Gizmag/MIT

In an increasingly tech-centric world, keeping kids interested in learning can be an uphill battle. With teaching that involves play recently attracting some powerful supportive voices, students from MIT’s Media Lab have developed a system which merges technology and play to stimulate young minds. The Playtime Computing system uses infrared emitters and tracking cameras to monitor the position of a special robot within a play area. As the bot disappears into a hole in a panel, it appears to continue its journey into a virtual world projected onto the walls.

The Playtime Computing system developed by MIT Media Laboratory’s Personal Robots Group is aimed at children between 4 and 6 years old and allows them to get up and about instead of sitting around and getting bored, a hot topic at the moment given Michelle Obama’s Let’s Move campaign. It also allows for early experimentation in such things as symbolic reasoning and social roles.

The system is made up of three panels with projectors behind them, and a set of four ceiling projectors for sending images to the play area floor. Alphabot, a cube-shaped robot with infrared emitters at its corners, is tracked by ceiling-mounted cameras. A virtual landscape is projected onto the panels and floor to blur the barriers between reality and the artificially created world. To further add to the illusion, as Alphabot disappears into a hole in the panel and some robotic foliage closes behind it, the image projected onto the panel appears to show it continuing its journey into the virtual world.
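To make the tracking idea above concrete, here is a hypothetical sketch of how a robot's floor position could be recovered from ceiling-camera views of its corner-mounted infrared emitters: the emitters show up as bright blobs, their centroid gives the robot's pixel position, and a calibration maps pixels to floor coordinates. The function names, the simple scale-and-offset calibration, and all numbers are assumptions for illustration; a real system like Playtime Computing's would use a proper multi-camera calibration.

```python
# Hypothetical sketch: track a robot via IR blobs seen by an overhead camera.

def blob_centroid(blobs):
    """Average the (x, y) pixel coordinates of the detected IR blobs."""
    xs = [x for x, _ in blobs]
    ys = [y for _, y in blobs]
    return sum(xs) / len(blobs), sum(ys) / len(blobs)

def pixel_to_floor(px, py, metres_per_pixel=0.005, origin=(320, 240)):
    """Map a camera pixel to floor coordinates (metres from play-area centre).

    A simple scale/offset stands in for a full camera-to-floor calibration.
    """
    ox, oy = origin
    return (px - ox) * metres_per_pixel, (py - oy) * metres_per_pixel

# Four corner emitters seen by the camera as blob centres:
blobs = [(310, 230), (330, 230), (310, 250), (330, 250)]
cx, cy = blob_centroid(blobs)
print(pixel_to_floor(cx, cy))  # robot sitting at the play-area centre
```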

A set of RFID-tagged wooden alphabet letters or symbols such as musical notes was also created so that the children can stick them onto Alphabot’s face. Placing letters onto the bot results in its face changing color to match, with musical notes causing music to be played through its onboard speakers. As the robot disappears into the virtual world beyond the panel, the symbol placed by the kids will also continue through to the animated version.
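The symbol behaviour described above is essentially a lookup from an RFID tag to an action. Here is a minimal sketch of that dispatch, assuming invented tag IDs and action strings (the article does not describe the actual software interface):

```python
# Hypothetical mapping from RFID tag IDs to Alphabot behaviours:
# letters change the face colour, musical notes trigger playback.
SYMBOLS = {
    "tag-A": ("letter", "red"),
    "tag-B": ("letter", "blue"),
    "tag-note-1": ("note", "C4"),
}

def on_tag_read(tag_id):
    """Return the action the robot should take for a scanned symbol."""
    kind, value = SYMBOLS.get(tag_id, ("unknown", None))
    if kind == "letter":
        return f"set_face_colour:{value}"
    if kind == "note":
        return f"play_note:{value}"
    return "ignore"  # unrecognised tags are silently skipped

print(on_tag_read("tag-A"))       # set_face_colour:red
print(on_tag_read("tag-note-1"))  # play_note:C4
```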

International playtime

The fun needn’t stop with just one play room, however. “One of the things we’re really excited about is having two of these spaces, one here and maybe one in Japan, and when the robot goes into the virtual word here, it comes out of the virtual world in Japan,” explained the group’s Adam Setapen. “So that kind of fits in with that one-reality concept, that there’s one robot, and whether it’s physical or virtual is based on the state of the robot in the Playtime Computing system.”

Of course, kids being kids, the young prototype testers crammed lots of different symbols onto the bot, which it wasn’t developed to handle. They also expected other objects placed in the hole to appear on the screen. Future developments of the system may well take such things in stride, with children perhaps being able to send a favorite toy into the virtual world.

Maybe it would also be interesting to see how they would deal with a digital twin!

Another aspect of the system is the Creation Station, a table-top computer where youngsters can arrange objects or draw pictures. Whatever is on the Station is recreated on the panels via the projectors.

The researchers also kitted out the playful system testers with baseball caps sporting infrared emitters. This allowed the system to keep track of the kids as well as Alphabot, which could enable such things as interaction with the computer-animated robot in future versions. If the team can develop the system to operate using something like Microsoft’s Kinect gaming technology, then players could be tracked without having to rely on infrared clothing.

The team says that the current prototype was put together using off-the-shelf parts at a cost of just a few hundred dollars, and believes that mass production for home use is a viable possibility.

Source: Gizmag

Vodcast #4 – Thecla Schiphorst

Thecla Schiphorst is a Media Artist/Designer and Faculty Member in the School of Interactive Arts and Technology at Simon Fraser University in Vancouver, Canada.

On sensory technologies and the future of education. This vodcast can also serve as an introduction to sensory technologies for young people.

Vodcast #5 – Professor Raymond Tallis

 

Learning Experiences in Harris Academies

you can do it robot poster

The new school year is well underway and body>data>space and Robots and Avatars have been delivering learning experiences at both the Harris Academy South Norwood and Harris Academy Merton.

With the students at South Norwood we have been exploring telepresence (full-body, two-way video connections projected onto very large screens), and at Merton we have been experimenting with creating and modifying our own avatars in Second Life. We have also been running learning experiences which explore how social media and social networks can be used by young people to get their voices heard on the issues that really matter to them.

The students have been doing some great work, thinking about innovation and creatively experimenting with new technologies and the skills that will help them in their future work lives.  We hope to share some of the work done by the students as the term progresses.

 

iDiscover Learning Experiences

idiscover logo and flower image

Robots and Avatars are delivering a series of learning experiences as part of iDiscover which engage young people in the skill-sets, aptitudes, resources and methodologies they will require for work and play in the future.

The learning experiences explore issues of identity, communication and teamwork for the 21st century with young people. Through the blending of virtual/physical worlds we give young people the opportunity to investigate and play with their relationships to others in online virtual spaces.

We provide key creative trainers to deliver the sessions, who bring with them excellence in areas of collaboration, articulation, self-presentation and socialisation.

Learning experiences are offered in the areas of:

  • Avatars and Virtual Worlds
  • Robotics
  • Telematics
  • Virtual Physical Event Production & Management
  • Social Media

A full education pack is available on request.
body>data>space is a learning provider for NESTA’s iDiscover education programme, working with schools in London (The Harris Federation), Manchester and the Scottish Highlands, which helps young people develop the skills and attributes needed in an innovation-driven society.

 

Outrace – Robots in Trafalgar Square

outrace robots projecting into the air in trafalgar square

Credit: Outrace

This year, as part of the London Design Festival, the public were invited to take control of eight industrial robots on loan from Audi’s production line. OUTRACE is an installation by Clemens Weisshaar and Reed Kram that consists of six independent systems coordinated by one KWTC CONTROLLER. Messages were sent in by the public via a website and then processed by the system every 60 seconds.
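The submit-then-process-every-60-seconds flow described above is a classic fixed-interval batch queue. Here is a minimal sketch of that pattern, with all class and method names invented for illustration (the actual OUTRACE controller software is not documented in the article):

```python
# Hypothetical sketch of a fixed-interval message batcher: the public
# submits messages at any time, and the controller drains whatever is
# waiting on each 60-second tick to trace into the square.
from collections import deque

class MessageBatcher:
    def __init__(self, interval_s=60):
        self.interval_s = interval_s  # how often tick() is called
        self.queue = deque()

    def submit(self, message):
        """Queue a message submitted via the website."""
        self.queue.append(message)

    def tick(self):
        """Called once per interval; returns the batch to trace next."""
        batch = list(self.queue)
        self.queue.clear()
        return batch

batcher = MessageBatcher()
batcher.submit("HELLO LONDON")
batcher.submit("ROBOTS AND AVATARS")
print(batcher.tick())  # ['HELLO LONDON', 'ROBOTS AND AVATARS']
print(batcher.tick())  # [] (nothing new arrived)
```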

By way of a powerful LED light source, positioned at the tool head of each robot, people’s messages were traced into the public space of Trafalgar Square. Long-exposure cameras captured the interactive light paintings and relayed them to the project website and social media platforms to be shared.

Robots and Avatars sent in a message to OUTRACE which was shown at the rather unsociable time of 7.08am! Here is the video of the drawing – see if you can work out what we sent…

Robots and Avatars Vodcast

Vodcast #3 – Professor Anna Craft

Professor of Education, University of Exeter and The Open University

On behaviours and ethics in education. With Professor Anna Craft, Professor of Education at the University of Exeter and the Open University.
robotsandavatars.net

 

State of the UK Gaming Sector

TIGA logo

TIGA, the trade association representing the UK games industry, today released key findings from a new report, ‘State of the UK Video Games Development Sector’. The report is a comprehensive survey of 78 UK games development businesses and provides an accurate picture of games development in the UK.

The report covers areas such as industry profile; platforms and genres; self-publishing; in-game advertising; outsourcing; the cost of games development; customers and markets; the main obstacles to business growth and policies to promote growth. The report was supported by Train2Game.

Over the next week TIGA will be releasing a number of findings from different sections of the report. Today’s findings relate to the overall profile of the games development industry. To purchase a copy of the full report visit www.tiga.org.

Profile of the Games Development industry [report excerpts]:
• The average size of an independent developer is 51 staff. The average size of an independent developer who also publishes games is 45 staff. The average size of a publisher-owned studio is 245 staff.
• Games development businesses employ, on average, a workforce that is 88 per cent male and 12 per cent female.
• On average, 12 per cent of the UK games development workforce are non-UK citizens.
• The mean turnover of an independent development studio that develops games was £3,130,600. The equivalent figures for independent developers that also publish games and for in-house, publisher-owned studios were £4,055,000 and £15,500,000 respectively.
• The average UK games development business has been in operation for 7 years.
• On average, developers surveyed spent £570,800 to develop a game over the last year. This figure is based on the cost of developing games across all types of platform. There is a large difference between independent developers (£897,700), independent developers who also publish games (£133,700) and publisher-owned studios (£3,000,000).
• For 72 per cent of UK games developers surveyed, the USA constitutes one of their most important geographical markets. For 44 per cent of developers, the UK is regarded as one of their most important markets. 41 per cent cited the rest of the EU, excluding the UK, as one of their most vital markets.

Dr. Richard Wilson TIGA CEO stated: “The State of the UK Video Games Development Sector Report is intended to provide the games industry with an accurate set of data that can be used to shape a model of the sector as a whole. The report clearly showed the incredible diversity that exists in the development community from size of studio to location, genre of game and distribution method. Games development is a real UK success story, we have an immensely talented workforce and we are at the cutting edge of changes in technology and business practices.”

For more information visit www.tiga.org.


 

Robonaut Tweets

Robonaut

We have already blogged about Robotars – humanoid robots that are controlled virtually from a remote location – and NASA’s efforts in this field are developing further with their Robonaut 2.

At the recent Artificial Intelligence Lunch Debate, a diverse group of experts discussed the implications of this sort of blended reality, particularly in relation to sensory feedback technology, which gives users a more heightened and tactile experience and provides new and more tangible ways of behaving through and with new representational forms.

Commenting on the problems with traditional understandings of artificial intelligence at the Lunch Debate in June, Professor Noel Sharkey suggested that with robots and avatars we should not be saying “I think therefore I am” but instead, “I feel therefore I am”.

Daily Galaxy has a great article on Robonaut 2, which is reproduced below:

NASA’s Robonaut 2, or R2, is getting ready to work on the International Space Station in November but it’s already tweeting about preparations under the account, @AstroRobonaut.

The humanoid robot — complete with a head, arms and an upper torso — will be the first dexterous humanoid robot in space, and it assures its followers in one of its first tweets, alluding to 2001: A Space Odyssey, that, “No, no relation to Hal. Don’t know if I’d want to admit to having him on my family tree if I was. [Definitely] don’t condone his actions.” It also tweeted that it’s not related to Boba Fett.

Is this another vivid sign that we have entered the dawn of the age of post-biological intelligence?

Although there are already several robots in space — including the famous, now AI-enhanced Mars Rovers, which have been zipping around the red planet for years — NASA and G.M. have created the first human-like robot to leave Earth.

The robot is called Robonaut 2, or R2 for short, and it weighs in at 300 pounds, with a head, torso and two fully functional arms. At first, R2 will be monitored in space to see how it performs in weightlessness, but NASA hopes to eventually use R2 to assist astronauts during space walks and to work alongside engineers in the space station.

In a joint news release, John Olson, director of NASA’s Exploration Systems Integration Office, said, “The partnership of humans and robots will be critical to opening up the solar system and will allow us to go farther and achieve more than we can probably even imagine today.”

According to researchers on the project, “Robonaut systems are the first humanoids specifically designed for space.”

Robonaut is a collaboration between the Robot Systems Technology Branch at the NASA Johnson Space Center and the US military’s Defense Advanced Research Projects Agency (DARPA) to build a robotic ‘astronaut equivalent’. Robonaut looks a bit like a human, with an upper torso, two arms and a head – all controlled by a human operator through telerobotic technologies. Robonaut was designed with the concept of creating a robot for tasks that ‘were not specifically designed for robots.’ In order for the Robonaut to complete these ‘human-like’ tasks, it is equipped with hands that are actually more dexterous than those of an astronaut in a pressurized spacesuit.

In 2004, the second generation of Robonaut gained mobility when engineers attached its body to a Segway Robotic Mobility Platform (RMP) commissioned by DARPA. Using virtual reality instruments, a human operator was immersed in the Robonaut’s actual environment and was able to perform remote operations.

According to researchers on Robonaut, “As the project matures with increased feedback to the human operator, the Robonaut system will approach the handling and manipulation capabilities of a suited astronaut.”

With more ‘haptic technology’, which uses sensory feedback to recreate the sense of touch, a user might wear gloves that allow them to ‘feel’ objects in a virtual world. You could examine the texture and weight of rocks, or even experience the crunch of icy Martian dirt.

Dr Grace Augustine’s avatars on Pandora go well beyond current technologies. We’re not going to be growing any biological avatars for human explorers in the lab, but modern robotics is getting close to providing a ‘human’ experience through increased dexterity and mobility. Robotic avatars could allow humans to fully experience the environment of other worlds. Through the eyes of robotic avatars we could watch the sunrise over the rusty, red crater rims without having to “experience suffocation, the icy death of -200 degrees C on their skin or the sting of microscopic dust in their eyes.”

Even though NASA and others have come a long way in developing avatars, the technology still has a long way to go before we’re having adventures on Pandora-like planets. Perhaps more advanced civilizations on distant worlds have developed avatars just as good as those in the movie.

R2 will be a passenger on the Space Shuttle Discovery, which is scheduled to head to the space station in September.

Source Credits:

http://www.dailygalaxy.com/my_weblog/2010/08/robonaut-2-the-first-humanoid-tweeting-from-space.html

http://www.spacedaily.com/reports/Avatars_In_Space_999.html

http://bits.blogs.nytimes.com/2010/04/14/nasa-and-gm-robot-heading-to-space-station/?src=busln


Lunch Debate Highlights – Artificial Intelligence

We have just finished the first of our highlight videos which document the Lunch Debate on Artificial Intelligence, which took place on 28th June 2010. The Robots and Avatars Lunch Debates bring together a diverse and specialised group of professionals and experts to deepen the research and conversation around Robots and Avatars and ask ‘What sort of world are we educating our young people for?’

Lunch Debate #1 – Provocation by Professor Noel Sharkey

Provocation by Professor Noel Sharkey, University of Sheffield
Produced by body>data>space

Lunch Debate #1 – Artificial Intelligence – Highlights

At NESTA, June 28th 2010
Produced by body>data>space

Go to Top