Awards & Nominations

Asgardians has received the following awards and nominations. Way to go!

Global Nominee

The Challenge | Design by Nature

Design an autonomous free-flyer to inspect a spacecraft for damage from Micro-Meteoroid and Orbital Debris (MMOD).

Galacticus

This project uses state-of-the-art technology for scanning and analyzing micro-meteoroid and space debris impacts.

Asgardians

Problem Description

The “Design by Nature” contest by NASA challenged us to solve the problem of micro-meteoroid and orbital debris (MMOD) damage detection using nature’s solutions as an inspiration source. The goal was to create a free-flyer style robot capable of following an inspection routine with precision while in space.

Missions such as Apollo 13 and STS-107 suffered from the failure to identify external damage to the spacecraft. Other experiences, such as the collision aboard Mir, also show how important it is to have a reliable way of inspecting the spacecraft's external structure. As we aim to travel farther and farther into space, eventually reaching Mars, this problem becomes increasingly important.


The Hackathon

“A hackathon is an event that brings together programmers, designers and other professionals connected to software development for a programming marathon, whose goal is to develop software that meets a specific purpose, or free projects that are innovative and usable.”

NASA, together with FATEC OSASCO, organized a hackathon in São Paulo, Brazil, to develop solutions to real problems that NASA faces today. We, the Asgardians, decided to participate to test our knowledge, not only in software development but mainly in the development of the robotic systems that the “Design by Nature” challenge calls for, while also meeting new people from all over our country.


The Project

The project consists of developing a free-flyer style, autonomously operated robot with a camera system. Our main purpose is to scan the entire spacecraft and create a 3D model of its structure for comparison with an existing CAD model. Overlaying the two data sets allows us, in most cases, to identify where micro-meteoroid impacts occurred.
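As a rough illustration of this comparison step, the Python sketch below flags scanned points that sit too far from the CAD reference; the function name, point format and 2 mm deviation threshold are our assumptions, not part of the project.

    import numpy as np
    from scipy.spatial import cKDTree

    def flag_deviations(scanned_points, cad_points, threshold_mm=2.0):
        """Flag scanned points farther than `threshold_mm` from the CAD reference.

        scanned_points, cad_points: (N, 3) arrays in millimetres.
        Returns the indices of scanned points that deviate from the model,
        which are candidate MMOD impact sites.
        """
        tree = cKDTree(cad_points)                 # spatial index over the CAD sample
        distances, _ = tree.query(scanned_points)  # nearest CAD point for each scan point
        return np.where(distances > threshold_mm)[0]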

Taking inspiration from the marine environment, we analyzed the relationship that a cleaning fish (the clownfish) has with the reef. According to an article from Unicamp, the existence of a reef depends directly on the cleaning that the clownfish performs. This is a form of symbiosis known as mutualism, in which two different species need each other in order to survive. Another form of symbiosis is commensalism, in which one species benefits and the other neither benefits nor loses. Our solution is a system inspired by commensalism: the main spacecraft fully benefits from our robot's sweep, while the robot itself neither gains nor loses.

To scan the structure, we use optical metrology systems, inspired by the way eagles hunt. Their outstanding vision, combined with a defined search pattern, gives them a great capacity to hunt both small and large prey.

For our estimates, we assumed that the inspected ship would be NASA's upcoming Orion capsule, fitted with additional modules for long-range trips: a ship 5.0 m in diameter and 41.0 m in length.


Propulsion and Attitude Control

Given the need to follow the spacecraft under analysis for long periods, a positioning and locomotion system with high accuracy, repeatability and low supply needs (fuel) was required. To achieve these goals, our robot uses two technologies: micro-nozzles driven by compressed air and a set of reaction wheels.

The four reaction wheels used in our design ensure agile turns and provide redundancy against malfunctions. They work by storing angular momentum in a steel disk spinning at high speed and, as needed, decreasing or increasing its speed. Each disk is kept in motion by a brushless DC motor, ensuring high energy efficiency and a long service life.
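A minimal sketch of the momentum exchange involved, with illustrative (assumed) inertia and speed figures rather than design values:

    # Minimal sketch of momentum exchange between one reaction wheel and the robot body.
    # All numbers below are illustrative assumptions, not design values.
    import math

    wheel_inertia = 0.5 * 0.2 * 0.04**2    # solid steel disk: 0.5*m*r^2, m = 0.2 kg, r = 4 cm
    body_inertia  = 0.05                   # robot moment of inertia about the same axis [kg*m^2]

    delta_wheel_rpm = 2000.0               # commanded change in wheel speed
    delta_wheel_rad = delta_wheel_rpm * 2 * math.pi / 60

    # Conservation of angular momentum: I_body * w_body = -I_wheel * dw_wheel
    body_rate = wheel_inertia * delta_wheel_rad / body_inertia   # resulting body rate [rad/s]
    print(f"body turns at {math.degrees(body_rate):.1f} deg/s")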

The micro-nozzles use a gaseous mixture kept under high pressure (approximately 4 to 6 atm was stipulated), which can be breathable air from the spacecraft or methane, a currently unused by-product of the air treatment process. They are controlled by solenoid valves and can generate up to 0.5 N of thrust each. Our design has eight micro-nozzles that, together with the reaction wheels, can translate and rotate the robot in any direction.
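For a rough sense of scale, the back-of-the-envelope sketch below assumes a 15 kg robot mass (our assumption; the design mass is not specified) and a single 0.5 N nozzle firing continuously:

    # Back-of-the-envelope translation performance of the cold-gas nozzles.
    # The 15 kg robot mass is an assumption for illustration; thrust is from the design (0.5 N).
    thrust_per_nozzle = 0.5      # N
    robot_mass        = 15.0     # kg (assumed)
    max_speed         = 0.5      # m/s safety limit

    acceleration = thrust_per_nozzle / robot_mass   # ~0.033 m/s^2
    time_to_max  = max_speed / acceleration         # ~15 s of continuous firing
    print(f"a = {acceleration:.3f} m/s^2, reaches {max_speed} m/s in {time_to_max:.0f} s")

Under these assumptions, the robot would take roughly 15 seconds of continuous firing to reach the 0.5 m/s safety limit discussed below.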

The use of such technologies is only possible due to the microgravity and high vacuum environment where the robot must operate, where there are no friction losses and where there is no need to fight the force of gravity.

The robot's pressurized gas tank can easily be refilled by the crew once the robot is inside the main spacecraft, and its simple design (no hypergolic fuels or multiple moving parts) ensures a long service life and the confidence needed for operation near crewed spacecraft.

The robot's onboard computer is calibrated at launch and afterwards maintains its position estimate using accelerometers and gyroscopes. Occasionally, during longer operations, its position can be updated via a radio link with the spacecraft.
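A minimal dead-reckoning sketch under these assumptions (simple Euler integration of bias-corrected accelerometer samples; the function and argument names are ours):

    import numpy as np

    def dead_reckon(accel_samples, dt, v0, p0):
        """Integrate accelerometer samples (after bias correction and conversion
        to the spacecraft frame) into velocity and position.

        accel_samples: (N, 3) array of accelerations [m/s^2]
        dt:            sample period [s]
        v0, p0:        initial velocity and position from the launch calibration
        """
        velocity = np.asarray(v0, dtype=float)
        position = np.asarray(p0, dtype=float)
        for a in accel_samples:
            velocity = velocity + a * dt          # simple Euler integration
            position = position + velocity * dt   # drift grows with time, hence the
        return velocity, position                 # periodic radio-link position updates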

The expected maximum speed shall be equal to or less than 0.5 m/s for safety reasons: in the event of a collision with the main spacecraft, the robot must be moving slowly enough not to damage it.


Optical Measurement System based on Stereo Photogrammetry

By using a dual-camera system, we can simulate the depth perception we are familiar with from our own eyes. This technology, known as stereo photogrammetry, allows us to capture a variety of perspectives from which a 3D point cloud can be created. It is already in use in many factories around the world, measuring parts on production lines for statistical process control. The point cloud can later be converted into a 3D model of the analyzed ship.
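The underlying relation is the standard pinhole-stereo formula depth = focal length × baseline / disparity. The sketch below uses an assumed 20 cm camera baseline and an assumed 4.1 µm pixel pitch, neither of which is specified in the project:

    # Basic stereo-photogrammetry relation used to turn pixel disparity into depth.
    # Baseline and pixel pitch are illustrative assumptions.
    focal_length = 0.028       # m (28 mm lens, as assumed later in the text)
    baseline     = 0.20        # m between the two cameras (assumed)
    pixel_pitch  = 4.1e-6      # m per pixel on the sensor (assumed)

    def depth_from_disparity(disparity_px):
        """Depth of a matched feature, in metres, for a given disparity in pixels."""
        return focal_length * baseline / (disparity_px * pixel_pitch)

    # Example: a 1400-pixel disparity corresponds to roughly 1 m of depth.
    print(depth_from_disparity(1400))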

Our project also integrates a high-resolution “ordinary” photographic camera sensor that captures frames of the spacecraft to support the analysis and search for anomalies. Additionally, for the recessed parts of the structure, a laser line measurement device takes measurements, creating a lower-resolution 3D model of the areas not covered by the higher-resolution stereo cameras.

For estimation purposes, we assumed a 28 mm focal length (35 mm equivalent) lens attached to a 35 mm full-frame, 50 MP CMOS sensor, giving a horizontal field of view of 65.5°. For safety reasons, we assumed that the robot shall remain at least 1 m from the spacecraft. This yields a theoretical resolution of roughly 0.15 mm/pixel, suitable for deep-learning post-processing. At this distance, 6 pictures should be taken per complete revolution around the Orion. For the 41 m length, a total of 522 photos must be taken (with a safety overlap of 10% on each edge of each photo). Assuming high-quality RAW files of 65 MB each, a total of about 34 GB of onboard storage must be installed.
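These figures can be reproduced with a few lines of arithmetic; the 8660-pixel horizontal count below is our assumption for a 3:2, 50 MP sensor:

    import math

    # Reproducing the imaging estimates from the text.
    sensor_width   = 36.0      # mm, full-frame sensor
    focal_length   = 28.0      # mm
    hfov = 2 * math.degrees(math.atan(sensor_width / 2 / focal_length))
    print(f"horizontal field of view: {hfov:.1f} deg")       # ~65.5 deg, as stated

    pixels_across  = 8660      # assumed horizontal pixel count for a 3:2, 50 MP sensor
    coverage_at_1m = 2 * 1.0 * math.tan(math.radians(hfov / 2))      # ~1.29 m across the frame
    print(f"resolution: {coverage_at_1m / pixels_across * 1000:.2f} mm/pixel")   # ~0.15 mm/pixel

    photos      = 522          # total photos for the 41 m hull
    raw_size_mb = 65           # MB per RAW file
    print(f"storage: {photos * raw_size_mb / 1000:.0f} GB")   # ~34 GB of onboard storage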

Due to the high volume of data, we chose to retain the photos on board the robot and only process them after the scanning mission ends, using the more powerful computers aboard Orion. Although inconvenient, this small delay in processing does not interfere with the urgency of the operation. The total time for a complete scan should be around 4 hours, given the parameters mentioned above.

Because sunlight in space is very intense (and spans many wavelengths), while the opposite side of the spacecraft sits in darkness, we created a theoretical analysis logic so that the robot can inspect the spacecraft's structure without restrictions. On the dark side of the spacecraft, detection relies on the excess reflection produced when a micro-meteoroid hole is exposed to a powerful light beam from our robot; when the image is processed, the damage location stands out. On the sunlit side, detection relies on the lack of reflection, since the hole casts a shadow when direct sunlight strikes the surface.
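A simplified sketch of this two-sided detection logic, using OpenCV thresholding; the threshold values and minimum spot size are illustrative assumptions, not calibrated figures:

    import cv2

    def find_candidate_holes(gray_image, sunlit_side, min_area_px=4):
        """Flag candidate MMOD holes in a grayscale frame.

        On the dark side the robot's light makes holes show up as bright spots;
        on the sunlit side they show up as small shadows.
        """
        if sunlit_side:
            # look for abnormally dark pixels (shadows cast inside holes)
            _, mask = cv2.threshold(gray_image, 40, 255, cv2.THRESH_BINARY_INV)
        else:
            # look for abnormally bright pixels (reflections from the spotlight)
            _, mask = cv2.threshold(gray_image, 220, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return [c for c in contours if cv2.contourArea(c) >= min_area_px]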

However, space-rated hardware generally uses foil blankets for thermal management, making it harder to compare sensor data with a theoretical, ideal model. For these areas, the high-resolution CMOS sensor captures images and, through deep-learning techniques, analyzes and processes them, deciding whether there is damage to the foil/blanket.
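As an illustration only, a small convolutional patch classifier of the kind that could make this decision is sketched below; the architecture and 64×64 patch size are our assumptions, as the project does not specify a network:

    import torch
    import torch.nn as nn

    # Minimal sketch of a binary patch classifier for foil/blanket damage (assumed design).
    class FoilDamageNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 16 * 16, 2)   # damaged / undamaged

        def forward(self, x):                              # x: (N, 1, 64, 64) grayscale patches
            x = self.features(x)
            return self.classifier(x.flatten(1))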
