A test campaign took place this autumn at the Madrid headquarters of Spain's GMV Innovating Solutions to assess an ESA-developed vision-based navigation camera called NPAL (Navigation for Planetary Approach and Landing), together with a set of navigation algorithms. GMV supplied the six-degrees-of-freedom robotic arm setup, known as 'platform-art' (short for Advanced Robotic Testbed for Orbital and Planetary Systems and Operations Testing), which carried the camera on its tip to navigate relative to a model asteroid.
In 2020, ESA's proposed Asteroid Impact Mission would find its way across deep space using star trackers and radio ranging. Then, once it rendezvoused with the two Didymos asteroids, would come the real challenge: navigating around this unprecedented environment to close in on the smaller of the two bodies, perform close-range observations and put down a lander.
By including an actual navigation camera in the loop, the testing team was able to maximise the realism and fidelity of their simulation. The camera itself is fully equipped with a detector to acquire images, a 'frame store' for their intermediate storage and an image processing chip to perform the feature tracking, plus a connection interface to AIM's guidance and navigation computer.
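As a rough illustration of that internal pipeline, the toy sketch below mimics the detector-to-frame-store-to-processing flow in plain Python. The class and method names are hypothetical stand-ins for this article, not NPAL's real interfaces, and the processing stages are stubbed out.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float   # acquisition time
    pixels: bytes      # raw detector readout

class NavigationCamera:
    """Toy model of the camera pipeline: detector -> frame store -> processing."""

    def __init__(self, store_depth=4):
        # 'Frame store': small buffer for intermediate storage of acquired
        # frames; when full, the oldest frame is silently dropped
        self.frame_store = deque(maxlen=store_depth)

    def acquire(self, frame):
        # Detector stage: a new readout lands in the frame store
        self.frame_store.append(frame)

    def process_next(self):
        # Image-processing stage: pull the oldest stored frame and run the
        # feature tracker on it (stubbed here; see the sketch further down)
        frame = self.frame_store.popleft()
        features = self._track_features(frame)
        return {"t": frame.timestamp, "features": features}

    def _track_features(self, frame):
        ...  # feature selection and frame-to-frame tracking would go here

    def send_to_gnc(self, measurement):
        ...  # connection interface to AIM's guidance and navigation computer
```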
The NPAL navigation camera acquires the images that allow the image processing algorithms first to select landmark 'feature points' within its field of view and then to follow them from frame to frame. The changing tracks of the various feature points over time (shown in purple in the video) are checked against the spacecraft's translational and rotational motion to determine its position and orientation relative to the asteroid.
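The sketch below illustrates this general select-and-track scheme using off-the-shelf computer-vision tools (Python with OpenCV) rather than NPAL's actual flight algorithms: Shi-Tomasi corners stand in for the landmark feature points, pyramidal Lucas-Kanade optical flow follows them between frames, and an essential-matrix fit recovers the camera's relative rotation and translation direction from the resulting tracks. All parameter values and the example intrinsics are illustrative assumptions.

```python
import cv2
import numpy as np

def track_and_estimate_motion(prev_img, next_img, K):
    """Estimate relative camera motion between two consecutive grayscale frames.

    prev_img, next_img: 8-bit grayscale images; K: 3x3 camera intrinsics matrix.
    """
    # 1. Select landmark 'feature points' in the first frame (Shi-Tomasi corners)
    p0 = cv2.goodFeaturesToTrack(prev_img, maxCorners=200,
                                 qualityLevel=0.01, minDistance=10)

    # 2. Follow them from frame to frame (pyramidal Lucas-Kanade optical flow)
    p1, status, _err = cv2.calcOpticalFlowPyrLK(prev_img, next_img, p0, None)
    ok = status.flatten() == 1
    good0, good1 = p0[ok], p1[ok]  # keep only successfully tracked points

    # 3. Check the feature tracks against the camera's motion: fit an essential
    #    matrix, then decompose it into rotation R and translation direction t.
    #    (Image tracks alone fix the translation direction but not its scale.)
    E, inliers = cv2.findEssentialMat(good0, good1, K,
                                      method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good0, good1, K, mask=inliers)
    return R, t, good1

# Hypothetical intrinsics for a 1024 x 1024 detector, ~1000 px focal length:
K_example = np.float64([[1000, 0, 512], [0, 1000, 512], [0, 0, 1]])
```

In an actual navigation system such single-pair motion estimates would be fused over many frames in a navigation filter, alongside other sensor data, to pin down the unobservable translation scale.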