by
Guido de Croon
October 3, 2016
During an experiment performed on board the International Space Station (ISS), a small drone successfully taught itself to see distances using only one eye, scientists reported at the 67th International Astronautical Congress (IAC) in Guadalajara, Mexico.
Although humans effortlessly estimate distances with one eye, it is not clear how we learn this capability, nor how robots should learn the same. The experiment was designed in collaboration between the Advanced Concepts Team (ACT) of the European Space Agency (ESA), the Massachusetts Institute of Technology (MIT) and the Micro Air Vehicles lab (MAV-lab) of Delft University of Technology (TU Delft), and was the final step of a five-year research effort aimed at in-orbit testing of advanced artificial intelligence (AI) concepts.
The paper, “Self-supervised learning as an enabling technology for future space exploration robots: ISS experiments”, describes how during the experiment a drone navigated the ISS while recording stereo vision information about its surroundings from its two ‘eyes’ (cameras). It then learned the distances to the walls and obstacles it encountered, so that once the stereo vision camera was switched off, it could continue an autonomous exploratory behaviour using only one ‘eye’ (a single camera).
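The core idea can be illustrated with a minimal sketch, not the authors' actual code: while both cameras are running, every frame's stereo disparity yields a distance label for free, and a monocular model is regressed against those labels so it can take over when stereo is unavailable. The "texture density" cue and all numbers below are hypothetical, synthetic stand-ins.

```python
import numpy as np

# Pinhole stereo model: depth Z = f * B / disparity. While both cameras run,
# every frame yields a stereo distance that can serve as a training label.
def stereo_distance(disparity_px, focal_px=300.0, baseline_m=0.1):
    return focal_px * baseline_m / disparity_px

rng = np.random.default_rng(0)
disparity = rng.uniform(5.0, 60.0, size=200)   # observed disparities (px)
labels = stereo_distance(disparity)            # stereo-derived distances (m)

# Hypothetical monocular cue (e.g. apparent texture density) that, in this
# synthetic world, scales inversely with distance plus some sensor noise.
feature = 1.0 / labels + rng.normal(0.0, 0.005, 200)

# Self-supervised step: regress distance on the monocular cue using the
# stereo labels as supervision -- no human annotation involved.
X = np.column_stack([1.0 / feature, np.ones_like(feature)])
coef, *_ = np.linalg.lstsq(X, labels, rcond=None)

def mono_distance(f):
    """Monocular estimate, usable after the stereo camera is switched off."""
    return coef[0] / f + coef[1]

print(mono_distance(0.5))  # roughly 2 m for this synthetic cue
```

The point of the sketch is the supervision signal, not the model: the drone generates its own labels from stereo, so "learning to see with one eye" needs no external ground truth.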
While humans can close one eye and still tell whether a particular object is far away, in robotics this is widely considered extremely hard. “It is a mathematical impossibility to extract distances to objects from one single image, as long as one has not experienced the objects before,” says Guido de Croon from Delft University of Technology, one of the principal investigators of the experiment. “But once we recognise something to be a car, we know its physical characteristics and we may use that information to estimate its distance from us. A similar logic is what we wanted the drones to learn during the experiments.” In orbit, however, the drones had to do this in an environment without gravity, where no direction is privileged, an additional difficulty they had to overcome.
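De Croon's car example follows directly from the pinhole camera model: once an object's real size is known, a single image is enough. A tiny sketch, with purely hypothetical numbers (the focal length and car width are illustrative, not from the experiment):

```python
# Pinhole model: an object of known real width W that appears w pixels wide,
# seen by a camera with focal length f (in pixels), lies at Z = f * W / w.
def distance_from_known_size(real_width_m, pixel_width, focal_px):
    return focal_px * real_width_m / pixel_width

# Hypothetical numbers: a car ~1.8 m wide, imaged 90 px wide by a camera
# with a 600 px focal length.
print(distance_from_known_size(1.8, 90, 600.0))  # 12.0 m
```

This is exactly the prior knowledge a single image lacks on its own; the learning step supplies it from experience.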
The self-supervised learning algorithm developed and used during the in-orbit experiment was thoroughly tested at the TU Delft CyberZoo on quadrotors, proving its value and robustness.
“It was very exciting to see, for the first time, a drone in space learning using cutting edge AI methods,” added Dario Izzo who coordinated the scientific contribution from ESA’s Advanced Concepts Team.
“At ESA, and in particular here at the ACT, we have worked towards this goal for the past five years. In space applications, machine learning is not considered a reliable approach to autonomy: a ‘bad’ learning result may cause a catastrophic failure of the entire mission. Our approach, based on the self-supervised learning paradigm, has a high degree of reliability and improves the drone’s autonomy: a similar learning algorithm was successfully applied to self-driving cars, a task where reliability is also of paramount importance.”
The little drone that successfully learned “to see” was one of the SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellite) drones on board the ISS. The SPHERES are capable of rotation and translation in all directions. Twelve carbon dioxide thrusters are used for control and propulsion, and allow the satellites to manoeuvre with great precision in the zero gravity environment of the station. The MIT Space Systems Laboratory, in conjunction with NASA, DARPA, and Aurora Flight Sciences, developed and operates the SPHERES system to provide a safe and reusable zero gravity platform to test sensor, control and autonomy technologies for use in satellites. Developing these technologies is an enabler for new types of satellite systems.
The drone experiments on Earth were performed by Kevin van Hecke for his MSc thesis. He also went to the MIT Space Systems Lab to port the drone programs to the software required by the SPHERES: “It was my life-long dream to work on space technology, but that I would contribute to a learning robot in space even exceeds my wildest dreams!”
Indeed, the experiment seems to hold promise for the future: “This is a further step in our quest for truly autonomous space systems, increasingly needed for deep space exploration, complex operations, for reducing costs, and increasing capabilities and science opportunities,” comments Leopold Summerer, head of ESA’s Advanced Concepts Team.
TU Delft MAV-Lab: Guido de Croon, Laurens van der Maaten, Kevin van Hecke. Massachusetts Institute of Technology: Timothy P. Setterfield, Alvar Saenz-Otero. Advanced Concepts Team: Dario Izzo, Daniel Hennes.
Guido de Croon is an assistant professor at the Micro Air Vehicle lab of Delft University of Technology in the Netherlands.