This world-first demonstration showcases how artificial intelligence and robotics could work hand in hand with astronauts in future space exploration.
Dubbed “ICHIBAN”, short for Int-Ball2 CIMON Hovering Intelligences Building AI Network, the mission saw two independently developed robotic assistants – Japan’s Int-Ball2 and Europe’s Crew Interactive MObile companioN (CIMON) – communicate and collaborate in real time across different modules of the International Space Station (ISS).
It marks the first time robots from different international space agencies have coordinated their functions and completed a joint task in orbit.
During the demonstration, Japan Aerospace Exploration Agency (JAXA) astronaut Takuya Onishi, working from the Columbus European Laboratory, remotely operated Int-Ball2, which was stationed in Japan’s Kibo module.
The operation was coordinated using CIMON’s voice recognition AI, allowing Onishi to instruct Int-Ball2 via verbal commands. CIMON interpreted these instructions and location data, then translated them into navigation commands, directing Int-Ball2 to move through the Kibo module and locate a target item, all while live video from Int-Ball2’s onboard camera streamed back to CIMON for monitoring.
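Conceptually, the chain from spoken instruction to drone motion can be pictured as a small pipeline: an utterance is interpreted into a structured navigation command, which is then dispatched to the drone while its camera feed is monitored. The agencies have not published the actual interface, so every type, field and value in the Python sketch below is an assumption, offered only to illustrate the idea.

```python
from dataclasses import dataclass

# Hypothetical message types; the real ICHIBAN interface is not public.
@dataclass
class VoiceCommand:
    utterance: str          # e.g. "find the sample container in Kibo"

@dataclass
class NavigationCommand:
    target_module: str      # destination module, e.g. "Kibo"
    target_item: str        # object the drone should locate
    max_speed_m_s: float    # translation speed cap for safe in-cabin flight

def interpret(cmd: VoiceCommand) -> NavigationCommand:
    """Stand-in for CIMON's voice-recognition and intent-extraction step."""
    # A real system would run speech-to-text and natural-language understanding;
    # here we naively assume the utterance has the form "find the <item> in <module>".
    item, module = cmd.utterance.lower().removeprefix("find the ").split(" in ")
    return NavigationCommand(target_module=module.strip().capitalize(),
                             target_item=item.strip(),
                             max_speed_m_s=0.05)

def dispatch(nav: NavigationCommand) -> None:
    """Stand-in for forwarding the structured command to the drone."""
    print(f"Int-Ball2: navigate {nav.target_module}, locate '{nav.target_item}' "
          f"at <= {nav.max_speed_m_s} m/s; stream camera feed back to CIMON")

if __name__ == "__main__":
    dispatch(interpret(VoiceCommand("find the sample container in Kibo")))
```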
This seamless collaboration between two separate robotic platforms, developed by different countries and housed in different modules, is a significant leap forward for autonomous systems in space.
“This breakthrough is more than just a technical demonstration,” a JAXA spokesperson said. “It’s a vision of how astronauts and intelligent systems might work side by side in deep space, reducing workload, increasing safety and expanding our capabilities in orbit.”
The ICHIBAN mission had three primary objectives:
- Establish a common interface enabling robots to communicate directly with one another (illustrated conceptually in the sketch after this list).
- Demonstrate simultaneous communication between ground control and orbiting robots, as well as inter-robot communication.
- Develop and validate operational procedures for managing multiple autonomous systems from Earth.
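In practice, the first two objectives come down to agreeing on a shared message format that ground control, CIMON and Int-Ball2 can all serialize and parse in the same way. The sketch below illustrates that idea under assumed field names; it is not the protocol actually flown on the ISS.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical shared envelope for robot-to-robot and ground-to-robot messages.
# The real ICHIBAN interface definition has not been published; field names are assumptions.
@dataclass
class Envelope:
    sender: str        # "CIMON", "Int-Ball2", or "GROUND"
    recipient: str
    msg_type: str      # e.g. "NAV_COMMAND", "VIDEO_FRAME", "STATUS"
    payload: dict

def encode(env: Envelope) -> bytes:
    """Serialize an envelope so any participant can parse it identically."""
    return json.dumps(asdict(env)).encode("utf-8")

def decode(raw: bytes) -> Envelope:
    """Rebuild an envelope from its serialized form."""
    return Envelope(**json.loads(raw.decode("utf-8")))

# Ground control and CIMON can both address Int-Ball2 through the same format,
# which is what a "common interface" buys: one parser per robot, not one per partner.
nav = Envelope("CIMON", "Int-Ball2", "NAV_COMMAND", {"target_item": "sample container"})
assert decode(encode(nav)) == nav
```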
Int-Ball2 is the second-generation onboard drone developed by JAXA. Designed to assist with photography and observation tasks inside the Kibo module, it reduces the workload on astronauts by autonomously capturing video and still images. Controlled from Earth, it can freely navigate within the ISS.
CIMON is a spherical AI-powered assistant co-developed by DLR (Deutsches Zentrum für Luft- und Raumfahrt), Airbus and IBM. Equipped with natural language processing and voice recognition, CIMON is designed to interact with astronauts, provide support for experiments and offer a degree of emotional engagement, effectively serving as both an assistant and companion.
All objectives were met during the exercise, with both robots responding effectively to real-time data and successfully completing the collaborative search task.
The findings will directly inform future mission planning for autonomous robotic support systems in space, particularly as agencies look towards long-duration missions to the moon and Mars.
As its name suggests – ichiban is Japanese for “number one” – the ICHIBAN mission represents a first step not just in robot-to-robot cooperation in space but in building a broader international framework for how intelligent systems can support human spaceflight.
Both JAXA and DLR have confirmed their intention to continue using the ISS, particularly the Kibo module, as a proving ground for robotic and AI integration in space operations.
Future experiments are likely to push for greater autonomy, cooperation among multiple robotic agents and deeper integration with astronaut crews.
With space agencies worldwide planning increasingly complex missions to the moon, Mars and beyond, missions like ICHIBAN offer a glimpse into how human-robot partnerships could shape the future of exploration – smart, seamless, and above all, collaborative.