Earlier this summer, NASA astronaut Jonny Kim guided a team of robots across a simulated Martian landscape while orbiting 400 km above Earth on the International Space Station. The demonstration marked the final session of Surface Avatar, a joint effort between ESA and the German Aerospace Center (DLR) to investigate how astronauts can coordinate robotic partners for future missions to the Moon and Mars.
As Jonny shared on X after the experiment, “One of the most fun technology demonstrations I took part in during this mission was with a global team exploring how remote robotic operations could support future missions on other worlds. Growing up, I played a lot of video games and while I still enjoy gaming with my kids, time is scarcer these days. This demo brought me right back, blending elements of real-time strategy, RPGs [role-playing games], and first-person play into something very real.”
At the heart of the experiment is a custom-built software interface developed by ESA and DLR. Running on a laptop inside ESA’s Columbus module, the interface allows operators to see where the robots are, select pre-defined actions such as picking up an object, and smoothly switch between different camera views. It combines a first-person perspective for teleoperating a single robot directly with a strategic top-down map for managing multiple robots at once.
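To picture how such an interface can be organised, here is a minimal, hypothetical sketch in Python: a roster of robots, a set of pre-defined actions, and a toggle between the two views. The names and command model are illustrative assumptions, not the actual ESA/DLR software.

```python
from dataclasses import dataclass
from enum import Enum


class ViewMode(Enum):
    FIRST_PERSON = "first-person"  # teleoperate one robot directly
    TOP_DOWN = "top-down"          # strategic map of the whole team


@dataclass
class Robot:
    name: str
    position: tuple[float, float]
    # Pre-defined actions the operator can trigger without flying the arm manually.
    actions: tuple[str, ...] = ("pick_up_object", "drive_to", "stow_sample")


@dataclass
class OperatorInterface:
    robots: dict[str, Robot]
    view: ViewMode = ViewMode.TOP_DOWN
    selected: str | None = None

    def select(self, name: str) -> None:
        """Choose one robot to command; the rest stay visible on the map."""
        self.selected = name

    def switch_view(self, mode: ViewMode) -> None:
        """Toggle between direct teleoperation and the strategic overview."""
        self.view = mode

    def dispatch(self, action: str) -> str:
        """Send a pre-defined action to the currently selected robot."""
        robot = self.robots[self.selected]
        if action not in robot.actions:
            raise ValueError(f"{robot.name} does not support {action!r}")
        return f"{robot.name} <- {action}"


ui = OperatorInterface(robots={
    "rover": Robot("rover", (12.0, 4.5)),
    "humanoid": Robot("humanoid", (10.2, 5.1)),
})
ui.select("humanoid")
ui.switch_view(ViewMode.FIRST_PERSON)
print(ui.dispatch("pick_up_object"))  # -> "humanoid <- pick_up_object"
```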
To interact with the robots, Jonny used two controllers: a joystick and a haptic device with seven degrees of freedom – motion and rotation in all directions, plus a gripping motion – whose force feedback lets astronauts feel when a robot’s arm meets resistance on the ground.
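One common way to render that resistance (a simplified illustration, not DLR’s actual control law) is as a virtual spring: the further the remote arm presses into an obstacle, the stronger the force played back on the hand controller, capped for safety.

```python
def feedback_force(penetration_m: float,
                   stiffness_n_per_m: float = 400.0,
                   max_force_n: float = 15.0) -> float:
    """Virtual-spring haptic rendering: resistance grows with how far
    the remote arm has pressed into a surface, capped for safety."""
    force = stiffness_n_per_m * max(penetration_m, 0.0)
    return min(force, max_force_n)


# Free motion -> no resistance; 2 cm into a rock -> 8 N pushed back to the hand.
print(feedback_force(-0.01))  # 0.0
print(feedback_force(0.02))   # 8.0
```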
Jonny continues: "The setup was awesome. A joystick and advanced robotic arm controller let me mimic finger and wrist movements with precision. A heads-up display kept me informed with battery levels, location data, and quick access to either an AI assistant or ground teams. I could enlarge a mini-map to see each robot's perspective, like a ‘fog of war’ in strategy games, and send parallel commands to different units. With the fine arm controls, I could enter into the perspective of a humanoid robot to manipulate the environment, whether moving science samples or shifting a rock that blocked the way. My favourite was a rover with a deployable mini-robot designed to crawl into tight spaces like caves, a feature that felt straight out of a game but with real scientific potential.”
Each session begins with a symbolic “haptics handshake”, in which engineers on the ground shake the robot’s hand while the astronaut senses the gesture in orbit. This not only demonstrates the technology but also helps operators adapt to the 850-millisecond delay between the Station and the ground.
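A toy example makes the effect of that delay concrete (the numbers and command names are illustrative, not the real Surface Avatar link): every command sent from orbit reaches the ground roughly 0.85 seconds later, so operators learn to anticipate rather than react.

```python
LINK_DELAY_S = 0.85  # approximate Station-to-ground signal delay (an assumption)

def arrival_times(commands: list[tuple[float, str]]) -> list[tuple[float, str]]:
    """Return (arrival_time, command) pairs after the link delay is added."""
    return sorted((t_sent + LINK_DELAY_S, cmd) for t_sent, cmd in commands)

for t, cmd in arrival_times([(0.0, "open gripper"), (1.2, "close gripper")]):
    print(f"t={t:.2f} s  ground receives: {cmd}")
# t=0.85 s  ground receives: open gripper
# t=2.05 s  ground receives: close gripper
```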
Over four sessions, the Surface Avatar project has advanced how astronauts and robots can work together, balancing direct control with increasing autonomy. Through European expertise and international collaboration, ESA and DLR are laying the foundation for human-robot teamwork on future exploration missions.
Jonny ended with: “Hats off to the ESA and Surface Avatar teams for bringing this vision to life. It was not just a technology demonstration, but a glimpse into how play, imagination, and innovation intersect to shape the future of exploration.”
Watch our video and read our blog to find out more about this experiment.