
Posts by alex

Robots on the Timetable at the Hi-tech High School

From BBC News – read the full article here.

San Diego’s High Tech High integrates robot making and gaming right into the heart of its curriculum. “Daisy May”, a waist-high robot that scuttles around scooping balls off the ground and projecting them into a bin, was designed by students at the school and reached the semi-finals of an international competition.

The moment you walk into San Diego’s High Tech High you realise this is a school unlike most others.

Andrew Webb, BBC

David Berggren, High Tech High’s engineering instructor and the person responsible for integrating robots into the curriculum, explains that students “learn through doing, through experiencing, building and creating, not so much out of lecturing and testing”. They are able to balance their other subjects by learning how to “balance loads”, which he thinks reflects how we operate when applying skills learnt in school to future jobs and workplaces.

You can read the full article by Andrew Webb, technology reporter for BBC News, including videos of the students’ work, here on the BBC News website.

Digitalarti Magazine

Robots and Avatars features in a new and interactive edition of Digitalarti Magazine.

Digitalarti welcomes digital art pros, artists, festival organizers, journalists, collectors, galleries, institutions, digital art fans and festival-goers around the world, and invites them to share experiences, information and artwork presentations, use the tools and databases, and have fun.

The site includes information and blogs about hundreds of digital art festivals, artists and places worldwide, including text, videos, pictures and much more.

Digitalarti is published by Digital Art International.

Catwalk Robot

 

The Avatar Gaze

Don’t know if your friend in the virtual world is lying to you or not? Well, now avatars that can mimic our real-world eye movements can make it easier to spot if someone is telling the truth online.

Most virtual worlds, such as Second Life, are full of avatars with static or pre-programmed gazes. One way to make interactions feel more realistic is to reproduce a person’s eye movement on their avatar, said William Steptoe of University College London and colleagues.

Now research has found that real-world eye movement could make it easier to spot whether an avatar is telling the truth or not. The researchers asked 11 volunteers personal questions, such as to name their favourite book, and told them to lie in some of their answers. During the interviews, the volunteers wore eye-tracking glasses that recorded their blink rate, direction and length of gaze, and pupil dilation. Then, a second group of 27 people watched a selection of clips of avatars as they delivered the first group’s answers.

Some avatars had eye movements that mirrored those of the original volunteers, while others had no eye movement at all. The viewers were asked whether they believed the avatars were being truthful or lying. On average, they were able to identify 88 per cent of truths correctly when the avatars had eye movement, but only 70 per cent without.

Spotting lies was harder, but eye movement provided 48 per cent accuracy compared with 39 per cent without. It is unclear exactly how the eye movements help. However, the eye-tracking glasses did show that people tended to hold the gaze of the interviewer for longer when telling the truth than when lying.
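As a side note on the arithmetic, the 88 and 48 per cent figures are per-class detection rates (truths judged truthful, lies judged as lies) rather than a single overall accuracy score. Here is a minimal sketch in Python of how such rates could be computed from viewers’ verdicts – the toy data and variable names below are invented for illustration and are not taken from the study:

```python
# Minimal, hypothetical sketch (not the researchers' own analysis):
# compute separate detection rates for truthful and deceptive clips.

def detection_rates(answers, judgements):
    """answers: ground-truth label per clip ('truth' or 'lie').
    judgements: a viewer's verdict for the same clips ('truth' or 'lie')."""
    truths = [j for a, j in zip(answers, judgements) if a == "truth"]
    lies = [j for a, j in zip(answers, judgements) if a == "lie"]
    truth_rate = sum(j == "truth" for j in truths) / len(truths)
    lie_rate = sum(j == "lie" for j in lies) / len(lies)
    return truth_rate, lie_rate

# Imaginary verdicts for ten clips, half truthful and half deceptive.
answers    = ["truth"] * 5 + ["lie"] * 5
judgements = ["truth", "truth", "truth", "truth", "lie",
              "lie", "lie", "truth", "truth", "truth"]

truth_rate, lie_rate = detection_rates(answers, judgements)
print(f"truths spotted: {truth_rate:.0%}, lies spotted: {lie_rate:.0%}")
# -> truths spotted: 80%, lies spotted: 40%
```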

“Perhaps they were overcompensating,” New Scientist quoted Steptoe as saying. What’s more, their pupils dilated more when lying – something previous studies have linked with the greater cognitive load required for deception. “This is one of a small handful of cues that you can’t control,” said Steptoe.

Enhancing expressive features such as eye movement could eventually make avatar-mediated communication feel more trustworthy than online video, because only relevant visual cues need to be displayed, said Steptoe. The technology could help in business meetings held in virtual environments, or to enhance communication between people with social phobias, where face-to-face interaction can seem daunting, he added.

The results of the study will be presented at the 2010 Conference on Human Factors in Computing Systems in Atlanta, Georgia. (ANI)

Source: Science News

Everything is becoming more beautiful…

Jesse Schell has taught Game Design and led research projects at Carnegie Mellon’s Entertainment Technology Center since 2002, and he rounded off the 2010 Games Based Learning Conference with a live web-streamed keynote from the US. His provocation provided some fascinating insights into the future of an education based around gaming, online communication and virtual exploration.

Jesse suggested that ‘everything is becoming more beautiful’ and that we like it that way. Citing the cornerstone example of the iPhone, he emphasized how collaboration (in particular between artists and engineers) is central, if not essential, to making technology ‘more beautiful’ and hence more usable. Recognizing specialization, Schell suggested, is vital, and he proposed engineering situations where there is no choice but to work together as a framework to export from games development into education and virtual collaboration. He also explained how young people in particular expect the ability to customize not only their virtual environments but their real lives too.

Emphasizing that ‘people love sharing things – photos, music, knowledge’, Schell affirmed the Open Source movement, asserting: “that [the fact] Wikipedia works at all gives tremendous faith for the human race”. He also explained that we all want ‘real things’, and that young people want all of these things too. But how to translate this into the classroom? His provocation went on to suggest that educators often prefer standardization (e.g. textbooks) when perhaps they should be interested in customization, and that rather than withholding (individual work) they should look towards sharing.

Questions of ‘beauty’, customization, sharing and reality are central to the conversation generated by Robots and Avatars, which seeks to explore this discussion from a wide range of angles including education, the creative industries, the arts and academia.

Seven species of robot – Dennis Hong

At TEDxNASA, Dennis Hong introduces seven award-winning, all-terrain robots — like the humanoid, soccer-playing DARwIn and the cliff-gripping CLIMBeR — all built by his team at RoMeLa, Virginia Tech. Watch to the end to hear the five creative secrets to his lab’s incredible technical success.

Dennis Hong is the founder and director of RoMeLa — a Virginia Tech robotics lab that has pioneered several breakthroughs in robot design and engineering.

 

Surgical Robots

At the Robots and Avatars Forum, Pear Urishima from Apple flagged up the use of the iPhone in healthcare, explaining how doctors could monitor patients’ statistics in real time right from their phone. She also showed images of how projections of x-rays and scans could be placed onto human bodies to allow doctors to operate more effectively and precisely. This introduction of the virtual into the health sector marks a significant development in how doctors will carry out their work in the future and highlights the skills that the doctors of the future need to be learning today.

How will this increase in information benefit the specialised work that doctors and surgeons do? Will the role of the doctor or surgeon develop to become based solely around virtual interaction and avatars rather than the physical ‘hands on’ approach? These questions are pertinent at South Miami Hospital in the US, where this year just 19 surgeons will be performing over 1,000 robotic surgeries.

Since the programme began in 2007, the hospital has become one of the key locations for robotic surgical procedures, performed with a robot known as the Da Vinci Surgical System. Dr. Jonathan Masel, a urologist in the Memorial Healthcare System who operates by open, traditional laparoscopic and robotic methods, is convinced the robot is the most precise.

“The more complex the procedure, the more I move to the robot. Its 3D optics are just like the movie Avatar.”

Even though there is still more work to be done in terms of scientific studies regarding the use of robots in surgery, to a layperson the developments are remarkable. The human surgeon sits at a computer console peering into a monitor that gives him or her a virtual view inside the patient’s body that is full-color, three-dimensional and magnified 10 times. Across the room, the robot’s four massive arms wield delicate surgical instruments inside the patient, carrying out the surgeon’s instructions with space-age precision.

“The robot is better,” says Dr. Ricardo Estape, a gynecological surgeon at South Miami Hospital who helped start its robotic program. “You can see what you’re doing so much better than even with open surgery. You can’t stick your head in somebody’s pelvis with open surgery when you’re doing a radical hysterectomy.”

“The robot is amazing,” says Dr. Lynn Seto, a cardiac surgeon who performed 450 robotic heart surgeries at Cleveland Clinic in Ohio before South Miami recruited her to help start its robotic heart program. “The view is so good you actually think you’re inside the body.”

 

NASA’s Robotic Avatars

“If every habitable world in the universe is unique, and the precise chemical conditions of a planet helps shape the life that evolves there, then avatars could allow aliens to visit other worlds from the safety of their spaceship. Could it be that all the stories of alien encounters on Earth were really encounters with alien avatars? Maybe aliens don’t actually look like grey humanoids with large eyes and no noses. Instead, that haunting image may simply be what we look like to them.”

Astrobiology Magazine

At the Kinetica Art Fair Collaborative Futures Panel, Anna Hill (Creative Director of Space Synapse) explained that she is “…working on systems to get from space to Earth, and offer some sort of collaboration between the two.”

She offered some examples, including Remote Suit, a wearable system designed to share the experience of being in space with people on Earth, and the Symbiotic Sphere, a pod which gathers inspirational space data, including images, videos, sound and haptics, the idea being to give those who sit in it a sense of what it is like to be in space.

Anna outlined her vision of the future: “I can envisage a feminising of technology. I’m very interested in augmented learning and collective and systemic thinking – there will be fewer top-down organisations. And there’s a need for robots not to replace humans.”

NASA is no stranger to robotics, with more than 50 robotic spacecraft studying Earth and reaching throughout the solar system, from Mercury to Pluto and beyond. But their latest development in the field of ‘Telerobotics’ marks a new development in how robots and avatars could work together to facilitate more sophisticated unmanned space exploration.

“Tomorrow’s NASA space program will be different,” says Wallace Fowler of the University of Texas, a renowned expert in modeling and design of spacecraft, and planetary exploration systems. “Human space flight beyond Low Earth Orbit (LEO), beyond Earth’s natural radiation shields (the Van Allen belts), is dangerous. Currently, a human being outside the Van Allen belts could receive the NASA defined “lifetime dose” of galactic cosmic radiation within 200 days.”

The current robots used by NASA, however, are a long way off the vision proposed in the film Avatar, where human users truly ‘experience’ the environment they are placed in. This is where virtual reality environments begin to change things, as highlighted in the Daily Galaxy blog:

The Virtual Interactive Environment Workstation (VIEW) was an early virtual reality instrument developed at NASA Ames. It was a leap forward in true ‘immersion’ of the user in a virtual environment, and was one of the first systems to use a ‘data glove’. This glove measured and tracked how a user moved their fingers, allowing interaction with the virtual world.
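To illustrate the data-glove idea in the simplest possible terms, mapping per-finger flex readings onto the joint angles of a virtual hand might look like the sketch below. This is a hypothetical example, not NASA’s software: the sensor names, value ranges and gesture rule are all invented.

```python
# Hypothetical sketch of a data-glove pipeline: map per-finger flex readings
# (0.0 = straight, 1.0 = fully bent) onto joint angles of a virtual hand.
# All names, ranges and thresholds are invented for illustration.

FINGERS = ["thumb", "index", "middle", "ring", "little"]
MAX_BEND_DEG = 90.0  # assume each virtual finger curls up to 90 degrees

def flex_to_angles(flex_readings):
    """Convert normalised flex-sensor readings into joint angles in degrees."""
    return {finger: max(0.0, min(1.0, value)) * MAX_BEND_DEG
            for finger, value in flex_readings.items()}

def is_grabbing(angles, threshold_deg=60.0):
    """Treat the hand as 'grabbing' when all four long fingers are well bent."""
    return all(angles[f] >= threshold_deg for f in FINGERS[1:])

# One imaginary frame of glove data: a loosely closed fist.
frame = {"thumb": 0.4, "index": 0.8, "middle": 0.85, "ring": 0.9, "little": 0.7}
angles = flex_to_angles(frame)
print(angles)
print("grab gesture detected:", is_grabbing(angles))
```

A real system would stream frames like this many times per second and drive the avatar’s hand model with the resulting angles, which is what lets the wearer reach out and manipulate objects in the virtual scene.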

Today, NASA uses 3D and virtual technologies for a number of public outreach and education projects. The technology can also be used for training purposes, allowing an astronaut to practice, say, walking on the surface of Mars. NASA is developing technologies that will allow a human explorer based on Earth, or in the relative safety of a space station or habitat, to actually experience exploration of a distant location. If the technology can be tied to robotic ‘avatars’ on a planetary surface in real time, the user would not simply experience a simulation of the world – but could directly participate in exploration and science as if they were there.

Closer to the exploration front, similar technologies are also being used in NASA’s most avatar-like experiment of all – the Robonaut. According to researchers on the project, “Robonaut systems are the first humanoids specifically designed for space.”

Robonaut is a collaboration between the Robot Systems Technology Branch at the NASA Johnson Space Center and the US military’s Defense Advanced Research Projects Agency (DARPA) to build a robotic ‘astronaut equivalent’. Robonaut looks a bit like a human, with an upper torso, two arms and a head – all controlled by a human operator through telerobotic technologies. Robonaut was designed with the concept of creating a robot for tasks that ‘were not specifically designed for robots.’ In order for the Robonaut to complete these ‘human-like’ tasks, it is equipped with hands that are actually more dexterous than those of an astronaut in a pressurized spacesuit.

In 2004, the second generation of Robonaut gained mobility when engineers attached its body to a Segway Robotic Mobility Platform (RMP) commissioned by DARPA. Using virtual reality instruments, a human operator was immersed in the Robonaut’s actual environment and was able to perform remote operations.

More recently, NASA revealed the next generation of Robonaut, dubbed R2. General Motors has now joined on as a partner, and hopes that Robonaut will not only explore other worlds, but will help humans build safer cars. For more information on the R2 project, click here to see videos with some of the key researchers involved.

According to researchers on Robonaut, “As the project matures with increased feedback to the human operator, the Robonaut system will approach the handling and manipulation capabilities of a suited astronaut.”

With more ‘haptic technology’, which uses sensory feedback to recreate the sense of touch, a user might wear gloves that allow them to ‘feel’ objects in a virtual world. You could examine the texture and weight of rocks, or even experience the crunch of icy Martian dirt.

Dr Grace Augustine’s Avatars on Pandora go well beyond current technologies. We’re not going to be growing any biological avatars for human explorers in the lab – but modern robotics are getting close to providing a ‘human’ experience through increased dexterity and mobility. Robotic avatars could allow humans to fully experience the environment of other worlds. Through the eyes of robotic avatars we could watch the sunrise over the rusty, red crater rims without having to “experience suffocation, the icy death of -200 degrees C on their skin or the sting of microscopic dust in their eyes.”

Even though NASA and others have come a long way in developing avatars, the technology still has a long way to go before we’re having adventures on Pandora-like planets. Perhaps more advanced civilizations on distant worlds have developed avatars just as good as those in the movie.

Sources: Steve Boxer on the Robots and Avatars Collaborative Futures Panel, DailyGalaxy.com

Is your virtual identity credible?

Go for a walk around Second Life and you might well bump into avatars wearing next to nothing, or perhaps find yourself talking to a human-sized dog. However, with many companies and organisations using virtual worlds as a new workspace, the way that employees present themselves as avatars is becoming increasingly important to consider. Analysts at Gartner Inc. predicted that by the end of 2013, 70% of companies will have set behavior guidelines and dress codes for employees who use remotely controlled online characters in business settings.

At the first Robots and Avatars Forum, Pear Urishima from Apple shared her visions of future jobs. Going beyond the presence of avatars in the future world of work, she placed emphasis on how we might manage the trustworthiness and credibility of avatars in the workspace, and suggested the need for ‘Virtual Identity Managers’ who might manage an individual’s representation online. Click here to find out more about the Robots and Avatars Forum.

How the iPhone can Reboot Education

How do you educate a generation of students eternally distracted by the internet, cellphones and video games? Easy. You enable them by handing out free iPhones — and then integrating the gadget into your curriculum.

To find out more, visit Wired: http://tiny.cc/kAITR
