The Challenge | Design by Nature

Design an autonomous free-flyer to inspect a spacecraft for damage from Micro-Meteoroid and Orbital Debris (MMOD).

UAVigilante: Autonomous MMOD damage recognition vehicle

Our recognition vehicles will be deployed modularly on orbit to investigate and recognize possible MMOD damage to spacecraft, using a multitude of technologies such as visual recognition, while following a helical trajectory around the spacecraft.

1) Problem/Opportunity

There are about 23,000 satellites, rocket bodies, and other human-made objects larger than a softball in orbit. There may also be some 650,000 softball-to-fingernail-size objects, and more than 70 million bits of debris smaller than the tip of a pen, such as flecks of paint and fragments of explosive bolts. Each piece of junk orbits our planet at roughly 17,500 mph, about 10 times faster than a bullet. Led by the US military, the Space Surveillance Network (SSN) uses a global network of partners to identify, track, and share information about objects in space larger than about 10 cm, especially any potential close calls. A crash would create even more debris, meaning a much greater chance of further collisions in the future.

2) Nature’s inspiration

For this challenge we took inspiration from birds of prey, whose ability to spot small animals from kilometers away is truly remarkable and is the key concept of our implementation. One of the details that inspired us is that birds of prey have what is called binocular vision. Like many other animals that hunt for food, including humans, a raptor’s eyes face forward, which creates some overlap between what the right eye and the left eye see. This overlap helps them judge how far or close an object is. If you watch a bird of prey long enough, you will probably see it bob its head from side to side, move its head around in circles, or even turn its head almost completely upside down. This is not some kind of strange dance: raptors do this to triangulate an object and better determine how far away it is. To be more specific, the common kestrel is one of the birds of prey we chose for inspiration, because of its ability to hover while searching for prey. The other raptor is the peregrine falcon, which follows a spiraling path while diving, in order to decrease drag while keeping an eye on its prey. Although the falcon's path is a logarithmic spiral, we chose to follow a simple helical one, for more accuracy and simplicity.
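The binocular-vision principle described above reduces to the standard stereo-triangulation relation: depth = focal length × baseline / disparity. A minimal sketch, with purely illustrative camera numbers (not values from our design):

```python
def stereo_depth(focal_length_px, baseline_m, disparity_px):
    """Estimate distance to a point seen by two forward-facing cameras.

    Overlapping views shift a feature by `disparity_px` pixels; the
    closer the object, the larger the shift -- the same cue a raptor
    strengthens by bobbing its head to triangulate prey distance.
    """
    if disparity_px <= 0:
        raise ValueError("feature must appear in both views with positive disparity")
    return focal_length_px * baseline_m / disparity_px

# Illustrative numbers: 800 px focal length, 20 cm camera baseline,
# a feature shifted 4 px between views -> 40 m away.
distance = stereo_depth(800, 0.2, 4)  # 40.0
```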


3) Value Proposition

Due to untracked small debris and micrometeoroids, spacecraft are at constant risk of being damaged and, in the worst case, of losing functionality, falling out of orbit, and becoming a source of new debris, causing significant financial losses. Inspections for potential damage are therefore becoming crucial for prevention purposes.

A high-definition camera coupled with self-learning visual recognition software, trained to identify potential damage caused by collisions with orbital debris, offers a faster way of finding these points of interest than the usual methods. Running the visual recognition software on a ground platform reduces the computational requirements of the drone's CPU, thereby lowering power consumption.

Another key aspect we add is the autonomous vehicle's overflight around the spacecraft, creating a detailed imaged model that can be easily inspected.

Also, instead of using the autonomous vehicle's own radio communication system to downlink the images for processing, the images are first relayed via the inspected spacecraft. As the telecommunication system of the spacecraft is more powerful than that of our autonomous vehicle, this avoids the need for a heavy compression algorithm for the images, lowering potential losses and also lowering power consumption.

Such a solution would benefit space agencies and privately owned spacecraft, as it can provide an overview of the current state of a spacecraft. Furthermore, immediate action can be taken in case of severe damage, for example deorbiting a spacecraft before it produces more space debris. The initial investment will be high, but the cost will be cushioned in the following years as the technology spreads and becomes easier to implement.



4) Tech solution

As technology expands rapidly, so do the methods for solving current engineering problems. The project therefore proposes an autonomous vehicle that can orbit around a spacecraft and identify the damage done by MMOD. To do this, artificial intelligence is used, by integrating computer vision processes: a camera on the launched vehicle transmits the pictures taken to a computer that can interpret the results.

The problem of perturbations must be considered throughout the process: mechanical vibration, for which a negative-feedback system is to be developed; optical perturbation, corrected by changing the relative position of the focal plane and lens and by rotating the optical wedge and reflection mirror; and post-image processing.
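The negative-feedback idea for damping mechanical vibration can be sketched as a simple proportional-derivative loop. The gains, time step, and unit-mass dynamics below are illustrative assumptions, not flight values:

```python
def damp_vibration(x0, v0, kp=4.0, kd=2.0, dt=0.01, steps=1000):
    """Drive a vibrating displacement x toward zero with PD feedback.

    The actuator command u = -kp*x - kd*v opposes both the offset and
    its rate of change (negative feedback); a unit mass is assumed and
    the dynamics are integrated with semi-implicit Euler steps.
    """
    x, v = x0, v0
    for _ in range(steps):
        u = -kp * x - kd * v      # feedback opposes the disturbance
        v += u * dt               # acceleration equals u for unit mass
        x += v * dt
    return x

# A 1-unit initial displacement is driven essentially to zero in 10 s.
residual = damp_vibration(x0=1.0, v0=0.0)
```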

For the image processing system, machine learning algorithms are run on a capable platform such as IBM Watson, using ecosystems such as Node.js or Python, both of which have rich image recognition resources. The training dataset is hardly accessible to the public, so a collaboration with a space agency would be required.

The images of the spacecraft are taken by the vehicle as it slowly orbits around it, following a helical motion from one end of the spacecraft to the other. This motion uses a high rotation frequency in order to capture the spacecraft in its entirety. Each shot is tagged with the coordinates at which it was taken, in order to pinpoint the exact location of every image. After image processing, should a possible impact location be found by a visual recognition tool (such as IBM Watson), the vehicle returns to the exact location of the photograph and applies structured-light 3D scanning (measuring the deformation of a projected light pattern on that surface).
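As a toy illustration of this pipeline (not the actual Watson model), candidate impact sites can be flagged by differencing a new image against a reference, with each detection paired with the coordinates at which the shot was taken so the vehicle can return for 3D scanning:

```python
import numpy as np

def flag_damage(reference, current, coords, threshold=50):
    """Return pixel locations where `current` deviates strongly from
    `reference`, each paired with the capture coordinates of the shot.

    `coords` is the position at which the image was taken (e.g. from
    the inertial navigation system), so a flagged site can be revisited.
    """
    diff = np.abs(current.astype(int) - reference.astype(int))
    ys, xs = np.nonzero(diff > threshold)
    return [(coords, (int(y), int(x))) for y, x in zip(ys, xs)]

ref = np.zeros((8, 8), dtype=np.uint8)
img = ref.copy()
img[3, 5] = 200                      # simulated MMOD impact mark
hits = flag_damage(ref, img, coords=(12.0, 4.5, 0.8))
# hits -> [((12.0, 4.5, 0.8), (3, 5))]
```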

For the coordinate system, an inertial navigation system is used to determine the exact position of the vehicle with respect to the spacecraft, for better accuracy.
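In one dimension, the inertial-navigation idea reduces to dead reckoning: integrating accelerometer samples twice to track position relative to the spacecraft. The sketch below assumes a noise-free sensor and a start from rest, which real hardware would not provide:

```python
def dead_reckon(accels, dt):
    """Integrate acceleration samples (m/s^2) into a position
    estimate (m), starting from rest at the origin."""
    v = 0.0
    x = 0.0
    for a in accels:
        v += a * dt   # velocity from acceleration
        x += v * dt   # position from velocity
    return x

# Ten samples of 1 m/s^2 at dt = 0.1 s.
pos = dead_reckon([1.0] * 10, dt=0.1)
```

In practice the double integration accumulates sensor bias quickly, which is why the vehicle's estimate is referenced to the inspected spacecraft rather than used in isolation.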

Finally, launching only one of the above-mentioned autonomous vehicles would be cost-ineffective. We propose a multi-stage deployment method, launching several vehicles to different orbital altitudes in order to check and inspect several spacecraft. In this way we reduce the cost per vehicle launch and optimize the maintenance process.
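A back-of-the-envelope model shows why multi-vehicle deployment lowers the cost per vehicle. All figures here are hypothetical placeholders, not quotes:

```python
def cost_per_vehicle(launch_cost, vehicle_cost, n_vehicles):
    """Amortize a single launch's cost over several inspection vehicles."""
    return launch_cost / n_vehicles + vehicle_cost

# Hypothetical: a 60 M launch carrying 2 M vehicles.
single = cost_per_vehicle(60e6, 2e6, 1)   # 62 million per vehicle
shared = cost_per_vehicle(60e6, 2e6, 6)   # 12 million per vehicle
```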

5) Use case

Around 95% of what’s floating above us is in fact space debris.

Below 2,000 km, debris is denser than meteoroids; most of it is dust from solid rocket motors, surface-erosion debris, spent out-of-control parts, even lost astronaut tools. Travelling at 10 times the speed of a bullet, these pieces interfere with and destroy critical infrastructure. Such a situation was first encountered on 24 July 1996, when a fragment from the upper stage of a European Ariane rocket collided with Cerise, a micro-satellite. The collision tore off a 2.8 m portion of Cerise's gravity-gradient stabilization boom, leaving the satellite severely damaged yet still functioning. Another worst-case scenario occurred in 2013, when debris from Fengyun FY-1C struck Russia's BLITS nano-satellite, bringing its mission to a tragic end. Such cases demonstrate the need for a product like ours, therefore recommending it for usage.

6) Project development

We started by pinpointing the exact needs of the current problem and turned it into a challenge. We identified that the number of spacecraft hit is continuously increasing with the rapid growth of space debris, which led us to the development of an orbital damage recognition system.

To do that, we first thought of a vehicle capable of flying around a spacecraft and performing a visual inspection, carried out mainly by machine learning and computer vision algorithms on a platform such as IBM Watson, with environments like Node.js and Python.

Secondly, we developed a trajectory for our vehicle that enhances the probability of finding damage, by following a helical motion. This was simulated in MATLAB.
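The helical inspection trajectory we simulated in MATLAB can equally be sketched in Python. The spacecraft length, stand-off radius, and pitch below are placeholders, not mission parameters:

```python
import numpy as np

def helix_path(length_m, radius_m, pitch_m, samples=500):
    """Waypoints for a helix around the spacecraft's long axis (z).

    A small pitch relative to the camera footprint gives the high
    rotation density needed to image the hull in its entirety.
    """
    turns = length_m / pitch_m
    t = np.linspace(0.0, 2.0 * np.pi * turns, samples)
    x = radius_m * np.cos(t)
    y = radius_m * np.sin(t)
    z = pitch_m * t / (2.0 * np.pi)   # advance one pitch per full turn
    return np.column_stack((x, y, z))

# 10 m spacecraft, 5 m stand-off radius, 1 m advance per revolution.
path = helix_path(length_m=10.0, radius_m=5.0, pitch_m=1.0)
```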

Next, we identified that a single-vehicle launch would be cost-ineffective for our idea. That is why, as part of our project, we proposed a multi-stage deployment launch which can bring several vehicles into space at once.

Another important aspect of this project is the literature we reviewed. In doing so, we encountered several sensors that can help in damage detection, such as ultrasonic sensors and Fiber Bragg Gratings. Both provide accurate measurements of damage, but require fairly close inspection, which poses a challenge given the high velocities of moving objects in space.

7) Further development

To expand this project and improve the detection of on-board damage, new types of sensors must be used. One solution is to integrate a mesh network of vibration sensors. This makes identification of the damaged zone very precise, since multiple sensors can bracket the spot where the damage has occurred.
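The localization idea behind the vibration-sensor mesh can be illustrated in one dimension: two sensors on a panel record the arrival time of the impact's vibration wave, and the time difference fixes the impact point between them. The wave speed and geometry below are illustrative:

```python
def locate_impact(sensor_gap_m, t1_s, t2_s, wave_speed_mps):
    """Position of an impact between two sensors on a line, measured
    from sensor 1.

    With d1 + d2 = gap and d2 - d1 = (t2 - t1) * wave_speed,
    the impact sits at d1 = (gap - delta) / 2.
    """
    delta = (t2_s - t1_s) * wave_speed_mps  # extra distance to sensor 2
    return (sensor_gap_m - delta) / 2.0

# Impact 0.3 m from sensor 1 on a 1 m gap, wave speed 1000 m/s:
# arrival times t1 = 0.0003 s, t2 = 0.0007 s.
x = locate_impact(1.0, 0.0003, 0.0007, 1000.0)
```

With a full 2D mesh, the same time-difference principle extends to pairs of sensors triangulating the impact point on the hull.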

For more precise prediction of the damaged areas, a terahertz-radiation-based technology might be developed and deployed to measure the imaged surfaces for discrepancies in the thickness of the spacecraft's outer layer. Coupling this data into the visual recognition algorithm would increase the predictive capabilities of the autonomous vehicle.


SpaceApps is a NASA incubator innovation program.