Healthcare in space is evolving. On the International Space Station, astronauts already use ultrasound to monitor their health, but until now they have relied on real-time guidance from experts on Earth. This works well in low Earth orbit, but it will not be possible on future missions to the Moon or Mars, where communication delays rule out real-time remote guidance.
In the photo, ESA astronaut Sophie Adenot trains on EchoFinder‑2, an experiment run by the French space agency CNES and supported by ESA. Sophie trained at ESA’s European Astronaut Centre in Cologne, Germany, alongside her NASA Crew-12 crewmates, astronauts Jessica Meir (right) and Jack Hathaway (left).
The system uses augmented reality (AR) and artificial intelligence (AI) to let astronauts perform ultrasound scans without ground assistance—a key step towards healthcare autonomy in space.
Ultrasound is one of the most versatile medical tools: non-invasive, lightweight and radiation-free, making it ideal for space. But using it well requires expertise.
ESA astronaut Thomas Pesquet was the first to use the ECHO system, following instructions to position the probe during the Proxima mission. This breakthrough in remote medical imaging allowed researchers to operate the ultrasound device and receive high-quality images in real time. Since its commissioning in 2017, ECHO has supported studies such as Vascular Echo/Vascular Ageing, Myotones and CIPHER, expanding our understanding of how spaceflight affects the human body.
EchoFinder-2 takes the next step. Before flight, an expert sonographer performs a baseline data collection on each astronaut, recording the exact position and orientation of the ultrasound probe for selected organs. These reference points are stored and uploaded to the Space Station.
The setup is simple: the subject lies in a supine position with a chest marker, while the operator uses the AR interface to guide the probe.
In orbit, the astronaut uses a tablet running EchoFinder software, with a camera and QR markers attached to the probe and chest. The software displays virtual shapes on the screen: blue spheres for the current probe position and orange cubes for the target position. The operator moves the probe until the shapes overlap and turn green, signalling correct placement. Then AI takes over, detecting the organs and saving the ultrasound image automatically.
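The overlap-and-turn-green guidance described above amounts to comparing the tracked probe pose against the stored reference pose. The sketch below is purely illustrative (it is not the actual EchoFinder software), assuming hypothetical tolerances and a simplified pose of position plus a single tilt angle:

```python
import math

# Assumed tolerances for this illustration, not EchoFinder's real values.
POSITION_TOL_MM = 5.0   # how close the probe must be to the reference point
ANGLE_TOL_DEG = 5.0     # how closely the probe tilt must match

def pose_matches(current, target):
    """Return True when the current probe pose (blue sphere) overlaps the
    target pose (orange cube) closely enough to 'turn green'.

    Each pose is (x_mm, y_mm, z_mm, tilt_deg), as might be derived from
    the QR markers on the probe and the subject's chest.
    """
    cx, cy, cz, c_tilt = current
    tx, ty, tz, t_tilt = target
    distance = math.sqrt((cx - tx) ** 2 + (cy - ty) ** 2 + (cz - tz) ** 2)
    return distance <= POSITION_TOL_MM and abs(c_tilt - t_tilt) <= ANGLE_TOL_DEG

# Probe 3 mm from the target and 2 degrees off in tilt: within tolerance.
print(pose_matches((10.0, 0.0, 0.0, 12.0), (12.0, 2.0, 1.0, 10.0)))
```

In the real system the comparison would involve full 3D orientation and continuous visual feedback; this sketch only shows the threshold check at the heart of the guidance loop.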
Crew-12 will be the first astronauts to test EchoFinder‑2 aboard ESA’s Columbus module on the Space Station. ESA astronaut Sophie Adenot will use the system during her εpsilon mission, serving as both subject and operator.
EchoFinder‑2 opens the door to autonomous ultrasound using minimal training and low-tech hardware for space missions. Beyond space, this technology could also benefit remote regions on Earth, reducing the need for specialised expertise to perform ultrasound scans.