Video Navigation and GNC System Layout for a Rendezvous with a Noncooperative Tumbling Target
The threat posed in space by debris and incapacitated spacecraft, particularly in near-polar low Earth orbits, became evident at the latest with the collision between an Iridium satellite and a decommissioned Kosmos satellite in February 2009. Today all space agencies are working, with varying intensity, on concepts for space debris removal. A key technology for this is a system that allows an approach to an uncooperative, passive target in low Earth orbit down to a relative distance at which capture is possible. The final distance depends on the capture system and varies between 1 m for a manipulator arm and several tens of meters for tether-based systems such as a net. Today's operational RVD systems, like that of the ATV, require a cooperative target, i.e. a target pattern and an inter-satellite RF link for data exchange (RGPS) are needed. For old or incapacitated spacecraft these are not available, so the navigation must rely on active sensors (radar, laser scanner, flash light) or exploit environmental illumination or temperature (video or infrared camera). In close vicinity to the target it is not sufficient to measure distance and line of sight; the target attitude needs to be known as well. This requires onboard real-time image processing, where the images may be generated by a camera (video, IR, PMD), a laser scanner or even an imaging radar. This paper presents results achieved within Inveritas ('Innovative Technologien zur Relativnavigation und Capture mobiler autonomer Systeme', innovative technologies for relative navigation and capture of mobile autonomous systems), an Astrium internal project co-funded by the German space agency DLR. It describes a conceptual GNC system layout and presents preliminary performance results for a rendezvous with a disused but known space vehicle, i.e. the knowledge of the spacecraft geometry is exploited by the navigation-dedicated onboard image processing. First, the typical mission segments and corresponding GNC requirements are summarized.
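As an illustration of the line-of-sight measurement mentioned above, the following minimal sketch derives a bearing (unit) vector in the camera frame from a detected target pixel, assuming a standard pinhole camera model. The function name and the intrinsic parameters (focal lengths `fx`, `fy` and principal point `cx`, `cy`, all in pixels) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def los_unit_vector(u, v, fx, fy, cx, cy):
    """Line-of-sight unit vector in the camera frame for pixel (u, v).

    Assumes an ideal (distortion-free) pinhole model with focal
    lengths (fx, fy) and principal point (cx, cy) in pixels; the
    camera boresight is the +z axis.
    """
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

# A target detected at the principal point lies exactly on the boresight:
los = los_unit_vector(320.0, 240.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
```

Note that a monocular camera alone only yields this direction; range must come from an active sensor or from exploiting the known target geometry in the image.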
Thereafter, a preliminary GNC system layout for the chaser spacecraft in accordance with the mission needs is presented. The GNC description focuses on image-based navigation over the complete approach distance. In the proposed concept, a video camera has been selected as the primary navigation sensor at far range, a laser scanning system (Lidar) provides the required measurements at mid range, and the same sensor, operating in 3D mode, serves as the primary sensor at close range. In 3D mode the Lidar provides a point cloud, which is used for pose estimation. The preliminary performance of the image processing algorithms has been tested both in a simulation environment and with real sensors in a test facility. Finally, the laboratory environment for navigation design and analysis, including the use of a test facility for sensor testing, is briefly described.
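The core step of such model-based pose estimation from a Lidar point cloud is a least-squares rigid alignment between the known spacecraft model and the measured points. The sketch below shows this single alignment step via the Kabsch/SVD method, assuming point correspondences are already established (in practice an ICP-style loop would iterate correspondence search and alignment); it is a generic illustration, not the paper's specific algorithm, and all names are assumptions.

```python
import numpy as np

def rigid_pose(model_pts, scan_pts):
    """Least-squares rigid transform (R, t) mapping model_pts onto
    scan_pts, i.e. scan ≈ R @ model + t (Kabsch/SVD method).

    Assumes both arrays are (N, 3) with row i of model_pts
    corresponding to row i of scan_pts.
    """
    mc = model_pts.mean(axis=0)          # model centroid
    sc = scan_pts.mean(axis=0)           # scan centroid
    H = (model_pts - mc).T @ (scan_pts - sc)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution:
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = sc - R @ mc
    return R, t
```

The rotation R gives the target attitude relative to the sensor frame and t the relative position, which is exactly the pose information the close-range navigation filter needs.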