The free-flying, spherical technology demonstrator with artificial intelligence (AI) showed off a number of its features during interactions with ESA astronaut Luca Parmitano. CIMON-2 started its journey to the ISS on 05 December 2019, launching with the CRS-19 supply mission from the Kennedy Space Centre in Cape Canaveral, Florida.
It is scheduled to stay on the ISS for up to three years. Just shy of two months after CIMON-2's successful first use, the project team has now received the analysis of the results.
A number of tests have now been carried out on CIMON-2, for example on its autonomous flight capabilities, voice-controlled navigation, and its ability to understand and complete various tasks.
It also managed to fly to a specific point in the ISS Columbus module for the first time. Thanks to its absolute navigation capability, CIMON-2 was able to follow verbal commands to move to a particular location, regardless of its starting position.
For example, while starting up its new hardware and software, Parmitano asked CIMON-2 to fly to the Biological Experiment Laboratory (Biolab) inside the Columbus module.
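Conceptually, this kind of name-based absolute navigation amounts to resolving a spoken destination to fixed coordinates in the module's reference frame. The sketch below is purely illustrative; the waypoint names and coordinates are invented for the example and are not CIMON's actual navigation software.

```python
# Hypothetical illustration of name-based absolute navigation:
# a spoken command is matched against named waypoints with fixed
# coordinates in the module frame. All values here are invented.

WAYPOINTS_M = {
    "biolab": (1.5, 0.4, 0.0),   # assumed Columbus-frame position, metres
    "hatch": (0.0, 0.0, 0.0),
}

def goal_for(command: str):
    """Map a spoken destination to absolute module-frame coordinates."""
    for name, position in WAYPOINTS_M.items():
        if name in command.lower():
            return position
    return None  # destination not recognised

print(goal_for("Fly to the Biolab"))  # → (1.5, 0.4, 0.0)
```

Because the goal is expressed in absolute coordinates rather than relative to the robot, the resulting target is the same wherever the assistant happens to be when the command is given.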
It was also given the task of taking photos and videos in the European ISS module on request – and then showing these to the astronaut.
Using these capabilities, CIMON-2 will be able to help with future scientific experiments on the ISS.
The microphones of the current version of the technology demonstrator are more sensitive than its predecessor’s (CIMON), and it has a more advanced sense of direction. Its AI capabilities and the stability of its complex software applications have also been significantly improved.
The degree of autonomy of the battery-powered assistant has been increased by around 30 per cent. Astronauts can also activate a feature that allows CIMON-2 to analyse the emotion in their speech and show empathy when interacting with them.
In addition, the project aims to research whether intelligent assistants such as CIMON could help reduce stress. As a partner and assistant, CIMON could support astronauts with their high workload of experiments and maintenance and repair work, thereby reducing their exposure to stress.
CIMON lays the foundations for social assistance systems that could reduce stress resulting from isolation or group dynamics during long-term missions. Such systems could also help to minimise similar problems back on Earth.
With the new, improved hardware and complex software working so well, the CIMON team from DLR, Airbus, IBM, Ludwig Maximilian University in Munich (LMU) and the ESA User Support Centre BIOTESC in Lucerne (Switzerland) is extremely satisfied with CIMON-2's performance.
This continued success of the CIMON project is yet another pioneering achievement in the use of AI in human space flight.
Developed and built in Germany, CIMON is a technology experiment to support astronauts and increase the efficiency of their work. CIMON is able to show and explain information and instructions for scientific experiments and repairs.
Voice-controlled access to documents and media is an advantage, as the astronauts can keep both hands free. It can also be used as a mobile camera to save astronaut crew time. In particular, CIMON could be used to perform routine tasks, such as documenting experiments, searching for objects and taking inventory.
CIMON can also see, hear, understand and speak. It orientates itself using its ‘eyes’ – a stereo camera and a high-resolution camera used for facial recognition – as well as two side-mounted cameras used for photo and video documentation.
Ultrasound sensors measure distances to prevent potential collisions. Its ‘ears’ consist of eight microphones to identify directions, and an additional directional microphone to improve voice recognition. Its ‘mouth’ is a loudspeaker that it can use to speak or play music.
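In principle, collision avoidance from ultrasound ranging can be as simple as halting motion whenever any sensor reports an obstacle inside a safety envelope. The sketch below shows that idea only; the stand-off distance and function names are assumptions, not CIMON's actual flight software.

```python
# Illustrative sketch of ultrasound-based collision gating.
# The 0.5 m safety envelope is an assumed value for the example.

SAFE_DISTANCE_M = 0.5

def collision_risk(ranges_m):
    """True if any ultrasound sensor reports an obstacle too close."""
    return any(r < SAFE_DISTANCE_M for r in ranges_m)

def clamp_velocity(commanded_m_s, ranges_m):
    """Zero the commanded velocity when an obstacle enters the envelope."""
    return 0.0 if collision_risk(ranges_m) else commanded_m_s

print(clamp_velocity(0.2, [1.2, 0.9, 0.4]))  # obstacle at 0.4 m → 0.0
```

A real system would slow smoothly and plan around obstacles rather than simply stopping, but the gating principle is the same.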
At the heart of the AI for language understanding is IBM Watson AI technology from IBM Cloud. CIMON has not been equipped with self-learning capabilities and requires active human instruction.
The AI used for autonomous navigation was provided by Airbus and is designed for movement planning and object recognition. Twelve internal rotors allow CIMON to move and rotate freely in all directions. This means it can turn towards the astronaut when addressed, nod and shake its head, and follow the astronaut – either autonomously or on command.
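Turning towards an astronaut when addressed can be pictured as a simple feedback loop: steer the heading towards the bearing estimated by the microphone array. The minimal proportional controller below is a sketch under that assumption; the gain and function names are invented and are not Airbus's motion-planning code.

```python
import math

# Hypothetical sketch: proportional yaw-rate command that turns the
# sphere towards an estimated speaker bearing. KP_YAW is an assumed gain.

KP_YAW = 0.8

def yaw_rate_command(current_heading_rad, speaker_bearing_rad):
    """Yaw-rate command proportional to the heading error, wrapped to [-pi, pi]."""
    error = math.atan2(
        math.sin(speaker_bearing_rad - current_heading_rad),
        math.cos(speaker_bearing_rad - current_heading_rad),
    )
    return KP_YAW * error
```

The `atan2` wrapping ensures the controller always turns the short way round, so a speaker just behind the robot's left shoulder produces a left turn rather than a near-full rotation to the right.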