Inspired by nature, we developed CSMS, an autonomous damage inspection framework that improves on current methods of inspecting spacecraft damage such as Micro-Meteoroid and Orbital Debris (MMOD) impacts. The system consists of multiple flyer robots, BeeBots, which collaborate with each other in harmony like a socialized bee colony. In this project we present the main algorithms and hardware investigations needed to discuss the feasibility of this method, and we hope the system can eventually benefit humankind in space exploration.
The main defects of current inspection methods are that they are time-consuming, limited by light source and viewing angle, and wasteful of extra-vehicular activity (EVA) resources. Looking to nature for inspiration, we found that bees are an outstanding basis for a damage inspection system. The following are three key features of our robot design:
We mainly adopted the Artificial Bee Colony (ABC) algorithm to mimic bees' collaboration and division of labor. Our BeeBots can be grouped into employed bees, onlookers, and scouts to exchange information and discover potential damage efficiently.
Bees' dancing plays a crucial part in their cooperation and society. Unlike natural bees, we adopted the spirit of this behavior but changed the implementation: our BeeBots communicate with each other through a radio broadcasting system.
Compound eyes are one of the core sensors of bees, able to detect visible, ultraviolet, and polarized light. Some flowers display ultraviolet marks to attract employed bees to land and collect nectar and pollen. Similarly, we implemented structured light scanning as the BeeBot's "compound eyes" to observe features on the spacecraft surface.
Manipulators can obtain highly accurate results without much time and effort, and they also acquire quantified data, avoiding the limitations and errors of manual inspection.
In the future, the system could play an important role in the aerospace industry: BeeBots may be upgraded into astro-droids that provide real-time hardware repairs; furthermore, they may also serve as pioneers or scouts that briefly report the surface conditions of newly discovered planets, helping us explore outer space more thoroughly.
The image below shows the system structure of a BeeBot. Each BeeBot switches among the employed bee, onlooker, and scout roles during the mission. Employed bees either share obtained information or collect damage data: they use structured light to scan damage areas based on location information, run a machine learning model to predict the damage type, and store the data in a database. Onlookers wait for broadcast information and proceed to assist the scanning. Scouts are in charge of exploring new potential areas, either randomly or based on gathered information.
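The role switching described above can be sketched as a small state machine. This is only an illustrative sketch, not the flight software: the class, method names, and the 30% damage-discovery rate in `explore` are all hypothetical.

```python
from enum import Enum, auto
import random

class Role(Enum):
    EMPLOYED = auto()
    ONLOOKER = auto()
    SCOUT = auto()

class BeeBot:
    """Illustrative sketch of the employed/onlooker/scout role switching."""

    def __init__(self):
        self.role = Role.SCOUT
        self.known_sites = []  # (location, severity) pairs gathered so far

    def step(self, broadcasts):
        if self.role == Role.SCOUT:
            # Explore a new random area; become employed if damage is found.
            site = self.explore()
            if site is not None:
                self.known_sites.append(site)
                self.role = Role.EMPLOYED
        elif self.role == Role.EMPLOYED:
            # Scan the damage area and broadcast what was found.
            location, severity = self.known_sites[-1]
            return ("scan_result", location, severity)
        elif self.role == Role.ONLOOKER:
            # Wait for broadcast information and assist at the reported site.
            if broadcasts:
                _, location, _ = broadcasts[0]
                self.known_sites.append((location, 0.0))
                self.role = Role.EMPLOYED
        return None

    def explore(self):
        # Placeholder sensor: reports a damaged site 30% of the time.
        if random.random() < 0.3:
            return ((random.random(), random.random()), random.random())
        return None
```

An onlooker that receives a broadcast adopts the reported location and joins the scan as an employed bee on its next step.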
The methods are introduced in two aspects: hardware and software.
While executing damage inspection, each BeeBot needs to know its position relative to the spacecraft, so we designed special landmark patterns painted on the surface of the spacecraft that encode location information. Each pattern is unique, so when a flyer scans a pattern on the surface, it knows which spacecraft component it is over as well as its relative position. In addition, thrusters around the BeeBot provide propulsion for moving around the spacecraft. Combining all the features above, the BeeBot can scan the surface smoothly and efficiently.
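The lookup from a scanned landmark to an absolute position can be sketched as follows; the pattern IDs, component names, and coordinates here are invented placeholders, not the project's actual landmark scheme.

```python
# Hypothetical landmark registry: each unique pattern ID maps to a
# spacecraft component and that landmark's position in the body frame.
LANDMARKS = {
    0x1A: ("solar_panel_port", (4.2, 0.0, 1.1)),
    0x1B: ("solar_panel_starboard", (-4.2, 0.0, 1.1)),
    0x2C: ("hull_section_3", (0.0, 2.5, 0.0)),
}

def locate(pattern_id, offset):
    """Resolve a scanned pattern to a component name and the BeeBot's
    position: the landmark's position plus the measured offset from it."""
    component, (x, y, z) = LANDMARKS[pattern_id]
    dx, dy, dz = offset
    return component, (x + dx, y + dy, z + dz)
```

Because every pattern is unique, a single scan is enough to resolve both the component identity and the flyer's relative position.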
The inspection system is composed of a camera system and a structured light generator, which projects a specific pattern onto the surface. Since the light pattern is projected by the system itself, damage inspection can run without a sufficient external light source. If there is any damage on the surface of the spacecraft, the projected light pattern is distorted by the shape of the damaged segment, and this information can be used for surface reconstruction. As a result, we can gather detailed depth information about the spacecraft surface.
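In the simplest projector-camera geometry, the lateral shift (disparity) of a projected stripe maps directly to depth. The sketch below assumes a hypothetical calibration (focal length in pixels, baseline in metres), not the team's actual setup.

```python
def depth_from_disparity(disparity_px, focal_px=800.0, baseline_m=0.05):
    """Recover depth from the lateral shift of a projected stripe.
    In a simplified projector-camera model, depth = focal * baseline / disparity.
    The focal length and baseline here are assumed example values."""
    if disparity_px <= 0:
        raise ValueError("stripe not displaced; cannot triangulate")
    return focal_px * baseline_m / disparity_px

def scan_line(disparities):
    """Convert a row of per-pixel stripe displacements into a depth profile.
    A dent or pit in the surface shows up as a local change in depth."""
    return [depth_from_disparity(d) for d in disparities]
```

Sweeping the pattern across the surface and stacking such depth profiles yields the reconstruction described above.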
In this algorithm, bees are divided into three groups: employed bees, onlookers, and scouts, and the objective is to find the best nectar source. In short, employed bees share nectar source information with the onlookers, who then exploit high-quality nectar sources based on that information. The scouts, on the other hand, randomly search for new nectar sources.
Adopting the ABC algorithm, our swarm of BeeBots can team up to search for the most seriously damaged area through social cooperation. With the power of dynamic programming, we can reduce the dimensionality of the tasks to make damage inspection more efficient. Moreover, we replaced the bees' communication method, dancing, with radio broadcasting, which transmits information more directly. Astronauts can either let the BeeBots scan the whole spacecraft or designate a specific area if any prior knowledge is available.
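The employed/onlooker/scout loop can be sketched as a simplified ABC search over candidate damage sites. For clarity this sketch searches a 1-D interval with a caller-supplied `severity` scoring function; the real system would search over the hull surface, and all parameter values here are illustrative assumptions.

```python
import random

def abc_search(severity, n_sites=10, n_bees=6, limit=3, iters=50, seed=1):
    """Simplified Artificial Bee Colony search for the most severe damage site.
    `severity(x)` scores a candidate location x in [0, 1]."""
    rng = random.Random(seed)
    sites = [rng.random() for _ in range(n_sites)]
    scores = [severity(x) for x in sites]
    trials = [0] * n_sites
    best_x, best_s = max(zip(sites, scores), key=lambda p: p[1])

    def try_improve(i):
        nonlocal best_x, best_s
        # Perturb site i toward/away from a random partner site.
        k = rng.randrange(n_sites)
        candidate = sites[i] + rng.uniform(-1, 1) * (sites[i] - sites[k])
        candidate = min(max(candidate, 0.0), 1.0)
        score = severity(candidate)
        if score > scores[i]:
            sites[i], scores[i], trials[i] = candidate, score, 0
            if score > best_s:
                best_x, best_s = candidate, score
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_sites):          # employed bees refine known sites
            try_improve(i)
        total = sum(scores)
        for _ in range(n_bees):           # onlookers favour high-severity sites
            r, acc = rng.uniform(0, total), 0.0
            for i, s in enumerate(scores):
                acc += s
                if acc >= r:
                    try_improve(i)
                    break
        for i in range(n_sites):          # scouts abandon exhausted sites
            if trials[i] > limit:
                sites[i] = rng.random()
                scores[i], trials[i] = severity(sites[i]), 0

    return best_x, best_s
```

The `limit` counter is what turns a stalled employed bee back into a scout; in the BeeBot setting, the roulette-wheel selection for onlookers corresponds to responding to the strongest radio broadcasts.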
Apart from the structured light, we want to leverage machine learning to further enhance the capability of our system. Structured light scanning yields depth information for an area, and on each BeeBot a Convolutional Neural Network (CNN) model gives a preliminary prediction to the spacecraft crew; from the recognized damage type, the system can suggest further repairs to the relevant staff. The network is trained on bitmaps of depth values. In the future, active learning can be added to the system: the model presents images of potentially damaged areas to scientists and obtains the correct labels from them. Through this method, the model can gradually raise its prediction accuracy and lower the manpower required.
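To illustrate how depth bitmaps feed such a network, here is the basic operation of a CNN layer, a 2-D convolution, applied with a hand-picked edge-detecting kernel and a threshold. This toy detector is not the team's trained model; a real CNN would learn many such filters from labelled examples.

```python
def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation over a depth bitmap (list of rows),
    the core operation a CNN layer applies to the scanned depth values."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

def looks_damaged(depth_bitmap, threshold=1.0):
    """Toy classifier: a Laplacian-like kernel responds strongly to abrupt
    depth changes such as an MMOD pit; the threshold is an assumed value."""
    laplacian = [[0, 1, 0],
                 [1, -4, 1],
                 [0, 1, 0]]
    response = conv2d(depth_bitmap, laplacian)
    return max(abs(v) for row in response for v in row) > threshold
```

A flat depth map produces zero response everywhere, while a pit produces a sharp spike under the kernel, which is exactly the kind of feature a trained CNN would pick up.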
Here is the progress of our BeeBot model design:
We used Microsoft Kinect to demonstrate the depth information scanning:
We used LEGO Mindstorms EV3 to build a control moment gyroscope model to demonstrate the effect:
We used auto-generated depth data to simulate surface damage conditions, and used our CNN model to classify unseen damage. The following charts show the prediction results:
Every time an astronaut soars out of the atmosphere, what awaits is a hostile environment beyond our experience on Earth. A subtle decision may have an overwhelming impact on one's life, and on the possibility of coming home safe and sound. What horrifies us is the uncertainty of outer space, a silent and most torturous suffering.
James Ke
National Taiwan University of Science and Technology
Department of Industrial Design
Lin Yu-Cheng
National Taiwan University of Science and Technology
Department of Electronic & Computer Engineering
Joey Yang
National Taiwan University
Graduate Institute of Networking & Multimedia
Ming Cue
National Taiwan University
Graduate Institute of Electrical Engineering
SpaceApps is a NASA incubator innovation program.