Solaris Lab | Design by Nature

Awards & Nominations

Solaris Lab has received the following awards and nominations.

Global Nominee

The Challenge | Design by Nature

Design an autonomous free-flyer to inspect a spacecraft for damage from Micro-Meteoroid and Orbital Debris (MMOD).

SPACE BUDDY | A SATELLITE-BASED AUTONOMOUS ROBOTIC SYSTEM

Space Buddy is a CubeSat equipped with neural network vision software. Our sat-drone is maneuverable, capable of flying over the main spacecraft to investigate and evaluate damaged parts, and its threat alerts save astronauts from risky EVAs.

Solaris Lab

The Team

The team is passionate about providing practical engineering solutions to real-world problems using drones and robotics. We are a group of interdisciplinary engineers and computer scientists, which makes our solutions reliable and robust.


The Problem

Currently, Earth is surrounded by a layer of debris made up of space junk and micrometeoroids, collectively termed micrometeoroid and orbital debris (MMOD). MMOD threatens the safety of ongoing and future space missions. Pieces of this debris travel at high velocities (10 km/s to 20 km/s), and even a fragment the size of a drop of paint can impact a spacecraft and cause damage, some of it life-threatening. When significant damage in a blind-spot region requires urgent inspection, astronauts must risk their lives performing EVAs to investigate the extent of the damage.


The Inspiration

Our solution to the problem was inspired by mutualism in nature. Mutualism is a type of symbiotic relationship in which both partners benefit. Leafcutter ants cut various types of foliage into pieces, carry the cut leaves back to their colony, grind up the plant matter, and inoculate it with a fungus; the harvested fungus is then used as a food source for the ant colony. In the same way, Space Buddy and its host spacecraft benefit from each other: the host carries and docks the small drone, and the drone in turn inspects the host and warns it of damage.


Design

Our design is based on a 2-unit CubeSat with a propulsion system and a 2-axis reaction-wheel attitude control system, making it possible to autonomously maneuver over the surface of the host spacecraft while taking real-time video of the surface. These real-time video frames pass through an onboard trained neural network model that actively identifies the parts of the spacecraft, determines which regions may be damaged, and evaluates the extent of the damage. It is also possible to place Space Buddy on a fixed orbit around the main spacecraft, giving it a larger field of view and letting it scan larger surface regions.
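
To make the attitude-control idea concrete, here is a minimal sketch of a proportional-derivative (PD) controller for the two reaction-wheel axes; the gains, torque limit, and axis names are illustrative assumptions rather than Space Buddy flight values.

    # Minimal PD attitude-control sketch for the two reaction-wheel axes.
    # Gains and the wheel torque limit are placeholder assumptions.

    KP = 0.8            # proportional gain [N*m per rad] (assumed)
    KD = 0.4            # derivative gain   [N*m per rad/s] (assumed)
    MAX_TORQUE = 0.002  # reaction-wheel torque limit [N*m] (assumed)

    def wheel_torque(angle_error, rate):
        """PD law: torque that drives the attitude error toward zero."""
        torque = KP * angle_error - KD * rate
        return max(-MAX_TORQUE, min(MAX_TORQUE, torque))

    def control_step(target, estimate):
        """One control step for the pitch and yaw axes.
        target / estimate: dicts of {'angle': rad, 'rate': rad/s} per axis."""
        return {
            axis: wheel_torque(target[axis]["angle"] - estimate[axis]["angle"],
                               estimate[axis]["rate"])
            for axis in ("pitch", "yaw")
        }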


Specifications

An onboard 32-bit computer for high-level processing and a 16-bit computer for low-level processing together form the flight computer that manages and controls the subsystems; a sketch of how the two might exchange commands follows the specification list below. The neural network vision identification software runs on board the flight computer.

  • Two modes of operation: 1) fully autonomous control, 2) semi-autonomous.
  • Two reaction wheels for the 2-axis attitude control system.
  • CubeSat camera.
  • Three solar panels, a lithium-ion battery, and power distribution for the electrical power subsystem.
  • A standard 1-unit CubeSat aluminum 7075 frame with some modifications.
  • A standard UHF communication system for communication with the main spacecraft.
  • A cold-gas propulsion system.
  • Small size, enabling Space Buddy to be carried on any space mission.
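
As referenced above, the sketch below shows one way the 32-bit high-level computer could frame commands for the 16-bit low-level computer over a serial link; the sync byte, command id, and field layout are assumptions made for illustration, not a defined Space Buddy interface.

    # Sketch of a framed command the high-level computer could send to the
    # low-level computer over UART: sync byte, command id, two signed 16-bit
    # wheel-torque setpoints, and an 8-bit additive checksum (all assumed).

    import struct

    SYNC = 0xA5
    CMD_SET_WHEEL_TORQUE = 0x01

    def checksum(payload: bytes) -> int:
        """Simple 8-bit additive checksum over the frame body."""
        return sum(payload) & 0xFF

    def pack_wheel_command(torque_x_mnm: int, torque_y_mnm: int) -> bytes:
        """Build a command frame with torques in milli-newton-metres."""
        body = struct.pack("<BBhh", SYNC, CMD_SET_WHEEL_TORQUE,
                           torque_x_mnm, torque_y_mnm)
        return body + bytes([checksum(body)])

    # Example: +2 mN*m on the pitch wheel, -1 mN*m on the yaw wheel.
    frame = pack_wheel_command(2, -1)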

Neural Network Visual Identification

With an onboard camera system and our specially trained deep learning model in Darknet YOLO, we can visually inspect and evaluate the surface of the spacecraft. Darknet YOLO is used together with open computer vision (OpenCV) for image handling; unlike region-proposal detectors such as Fast R-CNN, YOLO lets Space Buddy look only once at the objects in its surroundings and give feedback according to the trained models.
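
As a rough sketch of how this could run on board, the snippet below loads a Darknet YOLO model through OpenCV's DNN module and returns detections above a confidence threshold; the cfg/weights file names and the class labels are placeholders for whatever model the team trains, not released artifacts.

    # Sketch: one YOLO forward pass over a camera frame using OpenCV's DNN module.
    # File names and class labels below are illustrative assumptions.

    import cv2
    import numpy as np

    CLASSES = ["solar_panel", "radiator_panel", "mmod_impact"]  # assumed labels
    net = cv2.dnn.readNetFromDarknet("spacebuddy.cfg", "spacebuddy.weights")

    def detect(frame, conf_threshold=0.5):
        """Return (label, confidence, box) tuples for one frame."""
        blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                     swapRB=True, crop=False)
        net.setInput(blob)
        outputs = net.forward(net.getUnconnectedOutLayersNames())

        h, w = frame.shape[:2]
        detections = []
        for output in outputs:
            for row in output:          # row = [cx, cy, bw, bh, obj, class scores...]
                scores = row[5:]
                class_id = int(np.argmax(scores))
                confidence = float(scores[class_id])
                if confidence >= conf_threshold:
                    cx, cy = row[0] * w, row[1] * h
                    bw, bh = row[2] * w, row[3] * h
                    box = (int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh))
                    detections.append((CLASSES[class_id], confidence, box))
        return detections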


Deployment

Step 1- Detachment from an internal dock

Space Buddy is activated and autonomously leaves the ISS internal docking bay. Using its attitude control and propulsion systems, it positions itself at a suitable distance from the surface of the main spacecraft for inspection. With a pulse from the propulsion engine, the surface scan begins.
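
As a rough illustration of how the "suitable distance" might be chosen, simple camera geometry relates stand-off distance to the surface swath one frame covers; the 60-degree field of view below is an assumed camera parameter, not a Space Buddy specification.

    # Stand-off distance d so that a camera with horizontal field of view hfov
    # covers a swath s of the hull in one frame:  s = 2 * d * tan(hfov / 2).

    import math

    def standoff_distance(swath_m, hfov_deg=60.0):
        """Distance giving the requested surface swath per frame (assumed FOV)."""
        return swath_m / (2.0 * math.tan(math.radians(hfov_deg) / 2.0))

    # Example: covering a 2 m strip of hull per frame needs about 1.73 m stand-off.
    print(round(standoff_distance(2.0), 2))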

Step 2- Visual identification system activated

The camera is turned on and visual scanning begins as Space Buddy maneuvers up and around the main spacecraft. Upon visually detecting and identifying a damaged surface, it evaluates the extent of the damage, identifies its location, and quickly reports back in real time to the main spacecraft via the UHF communication link.
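
The sketch below shows one possible shape for that real-time damage report; the field names and the JSON encoding are illustrative assumptions, since a real link would more likely use a compact binary telemetry format.

    # Sketch of a damage report message sent to the host spacecraft over UHF.
    # Field names and JSON encoding are assumptions for illustration.

    import json
    import time

    def damage_report(detections, position_m):
        """detections: (label, confidence, box) tuples from the vision model.
        position_m: Space Buddy's position relative to the host, in metres."""
        return json.dumps({
            "time_unix": time.time(),
            "position_m": position_m,
            "findings": [
                {"type": label, "confidence": round(conf, 2), "bbox_px": box}
                for label, conf, box in detections
            ],
        })

    # Example: one suspected MMOD impact reported with its image location.
    msg = damage_report([("mmod_impact", 0.87, (112, 240, 34, 30))], (3.2, -1.5, 0.8))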

Step 3- End of inspection

At the end of the inspection cycle, Space Buddy maneuvers back to the start point and positions itself for internal docking.
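
Taken together, the three steps form a simple mission cycle; the sketch below expresses it as a state machine with assumed state names and exit conditions, just to make the control flow explicit.

    # Deployment cycle as a state machine: undock -> scan -> return -> dock.
    # State names and exit conditions are assumptions used to show the flow.

    from enum import Enum, auto

    class Mode(Enum):
        UNDOCK = auto()   # Step 1: leave the internal dock, reach stand-off position
        SCAN = auto()     # Step 2: maneuver over the hull, detect and report damage
        RETURN = auto()   # Step 3: fly back to the start point
        DOCK = auto()     # Step 3: position for internal docking

    def next_mode(mode, at_standoff, scan_complete, at_dock):
        """Advance the mission when the current phase's exit condition is met."""
        if mode is Mode.UNDOCK and at_standoff:
            return Mode.SCAN
        if mode is Mode.SCAN and scan_complete:
            return Mode.RETURN
        if mode is Mode.RETURN and at_dock:
            return Mode.DOCK
        return mode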


Future Prospects

Our idea marries drone technology, computer vision, deep learning neural networks, and satellite technology on a single platform. Advanced versions of Space Buddy will be equipped with manipulators and robotic arms. This has the potential to significantly reduce the need for astronaut extravehicular activities (EVAs, or spacewalks) by performing autonomous or semi-autonomous repairs and quick inspections in a swarm formation.


Resources

1) "Micrometeoroid and Orbital Debris (MMOD) Risk Overview " by Eric Christianse Available: https://www.nasa.gov/sites/default/files/files/E_C...

2) Darknet Yolo available:https://pjreddie.com/darknet/yolo/

3) "Inspection of GEO Spacecraft for Commercial and Military Customers " by Dr. Gordon Roesler Available: https://www.nasa.gov/sites/default/files/files/G_R...

4) "Autonomous Mission Operations Roadmap " by Dr. Jeremy Frank.Available: https://www.nasa.gov/sites/default/files/files/J_F...

5) "Compute model Cubesat" Available: https://www.raspberrypi.org/blog/compute-module-cu...

6) "Leaf Cutter any:Social symbionts" Available: https://projects.ncsu.edu/cals/course/ent525/close....


