The Challenge | Design by Nature

Design an autonomous free-flyer to inspect a spacecraft for damage from Micro-Meteoroid and Orbital Debris (MMOD).


Act like a human. Think like a spider



Space debris and micrometeoroid fluxes are a major concern for every spacecraft (SC) operator. The damage produced by these high-velocity particles depends on their diameter and velocity. While impacts with particles larger than 1 cm can be catastrophic, shielding can protect the SC against smaller ones. However, the shielding itself can be damaged and must be inspected periodically.

Current inspection techniques rely on fixed cameras and impose a high cost in time and resources on both the astronauts and the ground segment. Extra-Vehicular Activities (EVAs) aimed at inspecting the damage also increase the risk for the astronauts. A solution that automates and optimizes this process would therefore be highly desirable.

The ARACNE Project draws inspiration from how spiders detect prey by simply sensing the vibrations it produces in their webs. If damage detection and inspection are treated as separate problems, easier, faster and safer solutions become possible. The detection system consists of a network of piezoelectric sensors located over/under the external skin of the spacecraft. The inspection system is essentially a propelled free flyer (FF) that, when commanded, analyzes the damaged area. The FF is normally stored in a charging station on the surface of the SC.

Because the detection system estimates the impact position, the pointing directions are known in advance and the FF can stay far away from the SC. This makes the operation safe, thanks to the combination of a high-resolution camera and a narrow field of view (FOV). Since the impact damage can be estimated, the FF is used only when required (that is, when a predetermined cumulative damage has been reached), minimizing propellant consumption. Depending on the SC characteristics, additional propellant-saving techniques, such as SC spinning-based inspection, can be employed. Finally, the system can be easily automated: the detection system is highly independent, and the computation of the FF path, pointing targets and post-processing can be handled by the Central Processing Unit (CPU).


Space is no longer what it was in the old days. Dense clouds of bad-tempered bullets, which we call space debris, occupy what was once vacuum. It seems wiser to adapt to this new scenario: nobody wants to see our spacecraft suffer damage, or an astronaut hit by the remains of Apollo 13.

Current damage inspection techniques are based on visualizing the external skin of our spacecraft using fixed cameras. In the most critical situations, astronauts are forced to perform Extra-Vehicular Activities (space walks) to inspect and eventually repair the damage. Is there a way to avoid these increasingly risky activities? Can Nature give us the answer? There is certainly more than one answer to this question, but we have found a good one: our friend, the spider.

Spiders are some of the most amazing animals on Earth. Since they cannot see very well, they have developed a sophisticated system for having lunch. Thanks to the extraordinary sensitivity of their feet, spiders can quickly locate the reckless insects that fall into their webs. When trying to escape, the insects transmit a vibration to one of the spider's eight legs, revealing their position. Given that scanning the skin of a spacecraft with high accuracy is far from easy, why don't we borrow this approach? Why not create our own custom spider web? With this idea, the proposed system is divided into:

  1. Detection System: Composed of a network of wireless piezoelectric sensors (1) located over/under the external skin of the spacecraft (analogous to the spider web). When a sufficiently strong impact occurs, the whole network is activated and records the vibration of the structure. The vibration start time, amplitude peak and damping coefficient are computed for each node of the sensing network, and the data are delivered to the Central Processing Unit (CPU).
  2. Inspector System: Once a critical cumulative damage has been reached (or a very energetic impact has been measured), the CPU creates a 3D map of the impact locations and optimizes the FF flight path to inspect them with minimum propellant consumption. The FF then moves autonomously to a series of preallocated observation positions, from which it is possible to observe the whole SC. The images taken there are compared with reference images obtained on the ground at the very same positions. Post-processing finally reveals the exact position and size of each impact.
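As a concrete sketch, the per-node measurement record delivered to the CPU might look like the following. The field names and types are hypothetical; the text only fixes the three measured quantities (plus a node identifier):

```python
from dataclasses import dataclass

@dataclass
class NodeReading:
    """One record per sensing node, sent to the CPU after an impact.

    Field names are illustrative; only the three measured quantities
    (start time, amplitude peak, damping coefficient) come from the design.
    """
    node_id: int
    start_time_s: float    # vibration start time at this node
    peak_amplitude: float  # amplitude peak (sensor units)
    damping_coeff: float   # damping coefficient of the vibration decay

# Example record for one node of the network
reading = NodeReading(node_id=3, start_time_s=0.0021,
                      peak_amplitude=1.7, damping_coeff=0.05)
```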

Detection System

Our "spider web" consists of a network of sensors distributed on/under the SC's surface. The aim of this system is to perform the first detection of the impact. In standard navigation conditions, the sensors are in sleep mode. As shown in the video, when the SC intercepts a cloud of space debris and a sufficiently strong collision occurs, the sensor that first detects it wakes up. This sensor activates the entire network, whose measurements (condensed into a set of 4 parameters per sensor) are sent to a CPU, which triangulates the impact position and stores the data. Each sensor's consumption was estimated from previous technologies (1) and a space debris flux calculated with ESA MASTER (~9 impacts/yr·m²) for the ISS, resulting in a battery lifetime of more than 2 years (0.3 Ah).
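The lifetime figure can be reproduced with a back-of-the-envelope budget. Only the flux (ESA MASTER, ISS) and the 0.3 Ah capacity come from the text; the sleep and recording currents and the recording window below are assumed values for illustration:

```python
# Back-of-the-envelope battery budget for one sensor node.
# ASSUMED: sleep/active currents, recording window, monitored area.
BATTERY_AH = 0.3           # battery capacity (from the text)
SLEEP_CURRENT_A = 10e-6    # assumed sleep-mode draw
ACTIVE_CURRENT_A = 20e-3   # assumed draw while recording an event
RECORD_S_PER_EVENT = 5.0   # assumed recording window per impact
FLUX_PER_YR_M2 = 9.0       # impacts/yr·m^2 (ESA MASTER, ISS)
AREA_M2 = 1.0              # surface monitored per node (assumed)

HOURS_PER_YEAR = 8760.0
sleep_ah = SLEEP_CURRENT_A * HOURS_PER_YEAR
active_ah = ACTIVE_CURRENT_A * (RECORD_S_PER_EVENT / 3600.0) \
            * FLUX_PER_YR_M2 * AREA_M2
lifetime_years = BATTERY_AH / (sleep_ah + active_ah)
print(f"Estimated lifetime: {lifetime_years:.1f} years")  # about 3.4 years
```

Under these assumptions the sleep-mode draw dominates, and the result is consistent with the >2-year claim.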

The deterministic triangulation problem for homogeneous surfaces is well understood. We performed a simple experiment, letting a nut (space debris) fall into a water container with four markers (sensors) (2). The fluid wave motion is governed by equations analogous to those of a structural wave. Using data extracted from the experiment's video frames (3), we developed an algorithm showing that the triangulation can be performed with high accuracy (4). However, deterministic triangulation is far from accurate when dealing with complex, non-homogeneous structures, and the problem becomes even harder if we also want to optimize the sensor topology.

There is an alternative path, which we have also explored, based on deep neural networks (5). If a FEM model of the structure is built, a Monte Carlo simulation can be used to simulate impacts at different locations. By measuring the output of a predetermined sensor network and feeding it to a neural network in a suitable way, it should be possible to estimate the impact location and energy. We performed a simple simulation with a flat-plate model and four sensors. The expected sensor outputs were used to feed a 3-layer network, which was able to predict the impact location with high accuracy. This very simple example demonstrates that a neural network can recover the positions from our reduced set of parameters, and that the approach should be adaptable to more complex geometries and materials.
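As a minimal illustration of the deterministic approach on a homogeneous surface, the sketch below locates an impact on a 1 m × 1 m plate from arrival-time differences at four corner sensors, using a brute-force grid search. The sensor layout, wave speed and impact point are all assumptions chosen for the example, not values from our experiment:

```python
import math

# Assumed layout: four sensors at the corners of a 1 m x 1 m plate
SENSORS = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
WAVE_SPEED = 500.0  # assumed structural wave speed, m/s

def arrival_times(impact, t0=0.0):
    """Time at which the wave from `impact` reaches each sensor."""
    return [t0 + math.dist(impact, s) / WAVE_SPEED for s in SENSORS]

def triangulate(times, step=0.005):
    """Grid search minimizing arrival-time-difference residuals.

    Differences relative to the first sensor cancel the unknown
    impact start time t0.
    """
    dt_meas = [t - times[0] for t in times]
    best, best_err = None, float("inf")
    x = 0.0
    while x <= 1.0:
        y = 0.0
        while y <= 1.0:
            d = [math.dist((x, y), s) / WAVE_SPEED for s in SENSORS]
            dt = [di - d[0] for di in d]
            err = sum((a - b) ** 2 for a, b in zip(dt, dt_meas))
            if err < best_err:
                best, best_err = (x, y), err
            y += step
        x += step
    return best

impact = (0.30, 0.70)                      # simulated impact point
est = triangulate(arrival_times(impact))   # recovered location
print(f"Estimated impact: ({est[0]:.2f}, {est[1]:.2f})")
```

With noiseless synthetic times the estimate lands on the grid point nearest the true impact; real measurements would require a least-squares fit and a calibrated wave speed.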

The data from the sensors, which in a real case feed the embedded neural network, can also be used to retrain the net. This accounts for structural changes in orbit. The retraining can be carried out on the ground, as the downlinked data consist of the reduced set of impact parameters and the uplinked output is the set of neural network coefficients.
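The claim that retraining traffic is small can be sized up roughly. The sensor and parameter counts match the 4-sensor, 4-parameter setup above; the layer widths and float32 encoding are assumptions for illustration:

```python
# Rough link-budget sketch for on-ground retraining (sizes assumed).
N_SENSORS = 4
PARAMS_PER_SENSOR = 4   # the reduced parameter set per sensor
BYTES_PER_VALUE = 4     # float32 encoding (assumed)

# Downlink: one reduced parameter set per impact
downlink_per_impact = N_SENSORS * PARAMS_PER_SENSOR * BYTES_PER_VALUE  # 64 B

# Uplink: coefficients of a small assumed 3-layer net, 4 inputs -> 2 outputs
weights = (4 * 32 + 32) + (32 * 32 + 32) + (32 * 2 + 2)
uplink_total_bytes = weights * BYTES_PER_VALUE  # a few kB
print(downlink_per_impact, uplink_total_bytes)
```

Under these assumptions the downlink is tens of bytes per impact and the uplink a few kilobytes per retraining cycle, both negligible for a typical telemetry budget.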

Inspection System

If the detected impact energy exceeds a fixed threshold, a FF is sent to inspect the impact location. The FF is stored at a point chosen to avoid effects on the SC's dynamics (i.e. above the center of mass). From this point, the FF moves to the pre-fixed point around the SC that is closest to the impact. These pre-fixed points are chosen to provide a complete view of the spacecraft with minimum fuel consumption and a minimum number of positions. As a possible starting point for defining an optimized point-to-point transfer path, we suggest the paper by Chen et al. (6). The FF is oriented to focus on the impact point by its attitude determination and control system. It then performs an optical analysis with a 4° FOV high-resolution camera, which gives a focus area of about one square meter from a safe distance of about 15 m. We also considered the possibility of an impact on the shadow side of the spacecraft. In addition to rotating the spacecraft or waiting for better lighting conditions, it is possible to place luminescent markers on the surface of the sensors and evaluate the damage with a sufficiently sensitive optical technology, such as a CCD camera (7).
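The quoted numbers are self-consistent: a 4° FOV camera at a 15 m stand-off sees a footprint of roughly one meter, as a quick geometric check shows:

```python
import math

FOV_DEG = 4.0      # camera full field of view (from the text)
STANDOFF_M = 15.0  # distance from the SC surface (from the text)

# Width of the observed area for a narrow, symmetric field of view
footprint_m = 2.0 * STANDOFF_M * math.tan(math.radians(FOV_DEG / 2.0))
print(f"Footprint: {footprint_m:.2f} m")  # about 1.05 m
```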


From the above, it is clear that:

  1. The FF can be operated outside the safety sphere of the SC, hence resulting in a safe operation.
  2. Propellant consumption is minimized thanks to the CPU trajectory optimization.
  3. The system can be easily automated.
  4. The risk for astronauts is reduced, as inspection EVAs are no longer necessary.
  5. The overall system has a low weight and low energy consumption.
  6. The detection system adapts to structural changes thanks to neural network retraining.
  7. Impact data can be used for scientific purposes in order to estimate the flux of micrometeoroids and space debris.




(1) W. H. Prosser (NESC), E. I. Madaras, "Distributed Impact Detector System (DIDS) Health Monitoring System Evaluation."

(2) Experiment video.

(3) Algorithm to extract data from the video.

(4) Algorithm for triangulation.

(5) M. Viscardi, P. Napolitano, "An Artificial Neural Network based approach for impact detection on composite panel for aerospace application."

(6) Y. Chen, Z. He, D. Zhou, Z. Yu, S. Li, "Integrated guidance and control for microsatellite real-time automated proximity operations," School of Astronautics, Harbin Institute of Technology, Harbin, PRC.

(7) CCD Cameras.


SpaceApps is a NASA incubator innovation program.