TECHNICAL UNIVERSITY OF MUNICH

PRESS RELEASE

Mapping forests and inspecting ships

Spatial artificial intelligence: how drones navigate

  • For spatial AI, a drone must be able to establish its orientation in space and to generate a map of its surroundings.
  • Through neural networks, the system can learn to recognize 3D objects in space.
  • Prof. Stefan Leutenegger is currently using spatial AI in three research projects.

People are able to perceive their surroundings in three dimensions and can quickly spot potential danger in everyday situations. Drones have to learn this. Prof. Stefan Leutenegger refers to the intelligence needed for this task as ‘spatial artificial intelligence’, or spatial AI. This new approach will be used in mapping forests, inspecting ships and building walls.

In humans, it happens entirely automatically: they recognize objects and their characteristics, assess distances and hazards, and interact with other people. Stefan Leutenegger speaks of a coherent 3D representation of the surroundings that yields a uniform overall picture. Enabling a drone to distinguish between static and dynamic elements and to recognize other actors is one of the most important areas of research for the professor of machine learning in robotics at TUM, who is also the head of the innovation field artificial intelligence at the Munich Institute of Robotics and Machine Intelligence (MIRMI).

Spatial AI, step 1: Estimating the robot’s position in space and mapping its surroundings

Prof. Leutenegger uses spatial AI to provide a drone with the necessary on-board intelligence to fly through a forest without crashing into fine branches, to perform 3D printing or to inspect the cargo holds of tankers or freighters. Spatial AI is made up of several components that are adapted to specific tasks. It starts with the selection of sensors:

  • Computer vision: The drone uses one or two cameras to see its surroundings. For depth perception, two cameras are needed – just as humans need two eyes. Leutenegger uses two cameras and compares their images to recover depth (see the sketch after this list). There are also depth cameras that generate 3D images directly.
  • Inertial sensors: These sensors measure acceleration and angular velocity to detect the motion of bodies in space.
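
As a minimal illustration of the depth-from-two-cameras point above, the standard pinhole stereo relation turns the pixel offset between the two views into a distance. The focal length, camera baseline and disparity values below are made-up assumptions, not parameters of the drones described here:

    # Minimal sketch: recovering depth from two horizontally offset cameras.
    # Focal length, baseline and disparity values are illustrative assumptions.
    def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
        """Classic pinhole stereo relation: depth Z = f * B / d."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive; zero would mean an infinitely distant point")
        return focal_px * baseline_m / disparity_px

    # A feature that appears shifted by 40 pixels between the two views,
    # seen with a 700-pixel focal length and a 12 cm camera baseline,
    # lies roughly 2.1 metres away:
    print(depth_from_disparity(focal_px=700.0, baseline_m=0.12, disparity_px=40.0))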

“Visual and inertial sensors complement one another very well,” says Leutenegger. That is because merging their data yields a highly precise picture of a drone’s movements and its static surroundings. This enables the entire system to assess its own position in space, which is necessary for applications such as the autonomous deployment of robots. It also permits detailed, high-resolution mapping of the robot’s static surroundings – an essential requirement for avoiding obstacles. Initially, mathematical and probabilistic models are used without artificial intelligence. Leutenegger calls this the lowest level of spatial AI – an area he researched at Imperial College London before coming to TUM.
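
The following deliberately simplified sketch, written in Python with assumed numbers, illustrates this complementarity in one dimension: integrating an imperfect accelerometer on its own drifts away, while occasional camera-based position fixes pull the estimate back. The fixed blending weight merely stands in for the probabilistic weighting a real estimator would compute, and actual systems estimate the full six-degree-of-freedom pose rather than a single coordinate:

    # Toy 1-D example: a hovering drone that is really at position 0.
    DT = 0.005            # inertial sampling period: 200 Hz (assumed)
    ACCEL_BIAS = 0.05     # small uncorrected accelerometer bias in m/s^2 (assumed)
    VISION_WEIGHT = 0.5   # fixed blending weight standing in for a Kalman gain

    def simulate(use_vision: bool) -> float:
        """Return the estimated position after 10 seconds of hovering at 0 m."""
        pos, vel = 0.0, 0.0
        for step in range(2000):
            # Prediction from the inertial sensor (biased, hence drifting):
            vel += ACCEL_BIAS * DT
            pos += vel * DT
            # Correction from the camera, arriving only every 40th inertial sample:
            if use_vision and step % 40 == 0:
                visual_fix = 0.0                  # visual localization still sees position 0
                pos += VISION_WEIGHT * (visual_fix - pos)
        return pos

    print(f"inertial only: {simulate(use_vision=False):+.2f} m of drift")
    print(f"with vision:   {simulate(use_vision=True):+.2f} m of drift")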

Spatial AI, step 2: Neural networks for understanding surroundings

Artificial intelligence in the form of neural networks plays an important role in the semantic mapping of the area, which involves a deeper understanding of the robot’s surroundings. With deep learning, categories of information that are comprehensible to humans and clearly visible in an image can be captured and mapped digitally: neural networks recognize objects in 2D images, and this information is then carried over into a 3D map. The resources needed for this recognition depend on how many details must be captured for a specific task. Distinguishing a tree from the sky is easier than precisely identifying a tree or determining its state of health. For this kind of specialized image recognition, there is often not enough data for neural networks to learn from. One goal of Leutenegger’s research is therefore to develop machine learning methods that make efficient use of sparse training data and allow robots to keep learning while in operation. In a more advanced form of spatial AI, the aim will be to recognize objects, or parts of objects, even when they are in motion.
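
As a minimal sketch of the geometric step behind this, pixels that a segmentation network has labelled can be lifted into 3D with the pinhole camera model. The labels, depth values and camera intrinsics below are placeholder assumptions, and a real system fuses many such views into a single consistent map:

    import numpy as np

    FX = FY = 525.0          # focal lengths in pixels (assumed)
    CX, CY = 320.0, 240.0    # principal point of a 640x480 image (assumed)

    def backproject(u: int, v: int, depth_m: float) -> np.ndarray:
        """Pinhole model: pixel (u, v) seen at a given depth becomes a 3D point."""
        x = (u - CX) * depth_m / FX
        y = (v - CY) * depth_m / FY
        return np.array([x, y, depth_m])

    def labelled_point_cloud(depth: np.ndarray, labels: np.ndarray):
        """Turn a depth image plus per-pixel class labels into labelled 3D points."""
        points = []
        for v in range(depth.shape[0]):
            for u in range(depth.shape[1]):
                if depth[v, u] > 0:               # zero marks a missing depth reading
                    points.append((backproject(u, v, depth[v, u]), labels[v, u]))
        return points

    # Tiny synthetic example: a 2x2 image in which one pixel was labelled
    # "tree" (1) and the rest "sky" (0); only the tree pixel has valid depth.
    depth = np.array([[0.0, 0.0], [0.0, 3.0]])
    labels = np.array([[0, 0], [0, 1]])
    for point, label in labelled_point_cloud(depth, labels):
        print(point, "tree" if label == 1 else "sky")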

Current AI projects of the MIRMI professor: forest mapping, ship inspection, construction robotics

Spatial artificial intelligence is already being applied in three research projects:

  • Building walls: In construction robotics, a robot equipped with grippers (manipulators) is used. Its task in the SPAICR project, funded for four years by the Georg Nemetschek Institute, is to build and dismantle structures such as walls. The special challenge in the project, in which Prof. Leutenegger is collaborating with Prof. Kathrin Dörfler (TUM professor of digital fabrication), is to enable robots to work without motion tracking, in other words with no external infrastructure. In contrast to past research projects, which used a clearly marked space in a lab with orientation points, the goal is for the robot to operate with precision on any building site.
  • Digitizing the forest: In the EU project Digiforest, the University of Bonn, the University of Oxford, ETH Zürich, the Norwegian University of Science and Technology and TUM are working to develop “digital technology for sustainable forestry”. For that purpose, the forest needs to be mapped: Where is each tree located? How healthy is it? Are there diseases? Where does the forest need to be thinned out, and where is new planting needed? “The research will provide the forester with additional information for making decisions,” explains Prof. Leutenegger. TUM’s task: Prof. Leutenegger’s AI drones will fly autonomously through the forest and map it. They will have to find their way around the trees despite wind and small branches to produce a complete map of the wooded area.
  • Inspecting ships: In the EU project AUTOASSESS, the goal is to send drones into the interior of tankers and freighters to inspect the inside walls. They will be equipped with ultrasound sensors, among other instruments, to detect cracks. For this task the drones will need to be capable of flying autonomously in enclosed spaces with poor radio connectivity. In this application, too, motion tracking is not possible.

Spatial AI creates a basis for decisions

“We’re working to provide people in a wide range of fields with sufficient quantities of data to reach informed decisions,” says Prof. Leutenegger. He adds, however: “Our robots are complementary. They enhance the capabilities of humans and will relieve them of hazardous and repetitive tasks.”

Direct link to the article on tum.de.

Papers

Z. Landgraf, R. Scona, S. Leutenegger et al: SIMstack: A Generative Shape and Instance Model for Unordered Object Stacks; https://ieeexplore.ieee.org/document/9710412

S. Zhi, T. Laidlow et al: In-Place Scene Labelling and Understanding with Implicit Scene Representation; https://ieeexplore.ieee.org/document/9710936

G. Gallego, T. Delbrück, S. Leutenegger et al: Event-Based Vision: A Survey; https://ieeexplore.ieee.org/document/9138762

Pictures

Prof. Stefan Leutenegger with his high-tech drone in the lab

http://go.tum.de/780102

Final adjustments before the flight of one of Prof. Stefan Leutenegger’s AI-supported drones

http://go.tum.de/668050

A scientist from Prof. Stefan Leutenegger’s lab prepares an AI-supported drone for a flight in the lab.

http://go.tum.de/138684

Prof. Stefan Leutenegger and his team observe the flight of his AI-supported drone in the lab.

http://go.tum.de/904095

Further information

Prof. Stefan Leutenegger is a principal investigator and leader of the innovation field artificial intelligence at the Munich Institute of Robotics and Machine Intelligence (MIRMI). With MIRMI, TUM has established an integrative research center for science and technology to develop innovative and sustainable solutions for the central challenges of our time. The institute has leading expertise in central fields of robotics, perception and data science. More information: https://www.mirmi.tum.de/.

Scientific Contact:

Prof. Dr. Stefan Leutenegger

Professor for Machine Learning in Robotics

Sector Lead Artificial Intelligence at the Munich Institute of Robotics and Machine Intelligence (MIRMI)

Technical University of Munich

E-Mail: stefan.leutenegger@tum.de

Contact in TUM Corporate Communications Center:

Andreas Schmitz

Press Relations Robotics and Machine Intelligence

Tel. +49 89 289 18198

andreas.schmitz@tum.de

www.tum.de

The Technical University of Munich (TUM) is one of Europe’s leading research universities, with more than 600 professors, 50,000 students, and 11,000 academic and non-academic staff. Its focus areas are the engineering sciences, natural sciences, life sciences and medicine, combined with economic and social sciences. TUM acts as an entrepreneurial university that promotes talents and creates value for society. In doing so, it benefits from strong partners in science and industry. It is represented worldwide with the TUM Asia campus in Singapore as well as offices in Beijing, Brussels, Mumbai, San Francisco, and São Paulo. Nobel Prize winners and inventors such as Rudolf Diesel, Carl von Linde, and Rudolf Mößbauer have done research at TUM. In 2006, 2012, and 2019 it won recognition as a German "Excellence University." In international rankings, TUM regularly places among the best universities in Germany.
