Dragonfly Mini – a 2,000 km mission.

We at Stratiteq love to explore new technologies and see how they can be applied in real life. Here is a story of one of our recent projects.

Inspiration for this project came from NASA’s Dragonfly mission, announced last year and scheduled to launch in 2026 with the goal of exploring Titan, the largest moon of Saturn. Dragonfly will be a drone-like spacecraft, comparable in size to a Mars rover, and it will be the first vehicle ever capable of transporting its entire scientific payload from one place to another on Titan’s surface for material analysis. Dragonfly is not the only mission that will use drone technology: the Perseverance rover, which launches tomorrow (July 30, 2020), will carry the Mars Helicopter with it to Mars to test this technology for the first time on another world.

Our project, Dragonfly Mini, is an autonomous vehicle consisting of a rover and a drone. The rover is powered by an NVIDIA Jetson Nano, a small AI computer capable of running different AI algorithms. The rover can move autonomously, but it can also be controlled remotely. It is equipped with a camera and a mobile phone that serves as an access point to the Internet. Besides using the phone as a communication device, we also made full use of its cameras, sensors and processing power. The rover can control a camera-equipped drone, which gives it a better overview of the surrounding area. Collected data is processed on the Jetson Nano and data points are uploaded to Microsoft’s Azure cloud.

In order to test our rover, we decided to take a trip to Iceland. Besides letting us record in a beautiful environment, Iceland is over 2,000 kilometres from our home office in Malmö, which we considered an impressive enough distance for a test of this project. Tests were done in the southern part of Iceland, and we were careful to test only in permitted areas since, as you probably know, moss in Iceland is protected and takes years to grow back if damaged. If you are planning a similar experiment, please keep that in mind. Feel free to check the video below to hear what the people who worked on the project said about it.

Executing this project posed several challenges. The first thing we had to figure out was how to move the rover. We equipped the Jetson Nano with a camera and trained it for obstacle detection through deep learning.

Collision avoidance with Jetson Nano
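The post does not detail the model, but a common setup on the Jetson Nano is a small image classifier with two classes, "blocked" and "free", whose output drives the motors. A minimal sketch of that decision step, assuming the network emits raw logits in that class order (the names and threshold here are illustrative, not the project's actual code):

```python
import numpy as np

def softmax(logits):
    """Convert raw logits into a probability distribution."""
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

def avoid_step(logits, threshold=0.5):
    """Map classifier output for ['blocked', 'free'] to a motor command:
    turn away if the path ahead looks blocked, otherwise drive forward."""
    prob_blocked = softmax(np.asarray(logits, dtype=float))[0]
    return "turn_left" if prob_blocked > threshold else "forward"
```

Running this for every camera frame gives a simple reactive loop: the rover keeps turning until the classifier reports a free path.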

Collision avoidance worked well in the office, but the real world is full of unknown obstacles, so we decided to run the FCRN (Fully Convolutional Residual Network) model on the iOS phone which, as mentioned earlier, we added to the rover for Internet connectivity. With Apple’s Core ML we were able to make depth predictions easily.

Depth estimation with Core ML on iOS
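FCRN produces a dense depth map (a 2D array of predicted distances per pixel). However the map is obtained, a typical next step is to check how close the nearest object directly ahead is. A sketch of that post-processing, assuming the depth map is available as a NumPy array in metres (the function and window size are illustrative):

```python
import numpy as np

def min_center_depth(depth_map, frac=0.3):
    """Return the smallest predicted depth in a central window of the
    depth map -- i.e. the distance to the closest thing straight ahead.
    `frac` is the fraction of the image covered by the window."""
    d = np.asarray(depth_map, dtype=float)
    h, w = d.shape
    dh, dw = int(h * frac / 2), int(w * frac / 2)
    center = d[h // 2 - dh : h // 2 + dh + 1,
               w // 2 - dw : w // 2 + dw + 1]
    return float(center.min())
```

A value below some safety margin (say, one metre) would then override the forward command, complementing the classifier-based collision avoidance.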

The mobile phone was also used to provide data from its sensors, so we built an API in the Azure cloud to which we sent GPS location, altitude, compass readings and other data points. Data was organised into sessions and visualised in Power BI reports.

Data visualisation in Power BI
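A data point sent to such an API might look like the sketch below. The field names and structure are purely illustrative (the actual API contract is not described in the post); the point is that each reading is tagged with a session id and timestamp so the Power BI reports can group points per test run:

```python
import json
import time

def telemetry_point(session_id, lat, lon, altitude_m, heading_deg):
    """Serialise one sensor reading for a (hypothetical) telemetry API.
    Tagging with sessionId lets the backend group points per test run."""
    return json.dumps({
        "sessionId": session_id,
        "timestamp": time.time(),
        "gps": {"lat": lat, "lon": lon},
        "altitude": altitude_m,   # metres above sea level
        "compass": heading_deg,   # degrees, 0 = north
    })
```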

The Ryze Tello seemed like an ideal drone for our use case because it is lightweight and programmable. It has a Python SDK which can easily be used from the Jetson Nano.
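Under the hood, the Tello SDK is driven by short text commands sent over UDP to the drone (port 8889 by default), e.g. "takeoff", "land" and the four-channel "rc" stick command. A minimal sketch of building and sending such commands (a sketch only; actually flying requires a connected Tello):

```python
import socket

TELLO_ADDR = ("192.168.10.1", 8889)  # Tello's default SDK address/port

def make_rc_command(lr, fb, ud, yaw):
    """Format a Tello 'rc' stick command. Channels are left/right,
    forward/back, up/down and yaw, each in the range -100..100."""
    for v in (lr, fb, ud, yaw):
        if not -100 <= v <= 100:
            raise ValueError("rc channel out of range -100..100")
    return f"rc {lr} {fb} {ud} {yaw}"

def send_command(sock, cmd):
    """Send a text command to the drone (requires a real Tello on Wi-Fi)."""
    sock.sendto(cmd.encode("ascii"), TELLO_ADDR)
```

In practice a library such as the official SDK wraps this protocol, so rover code only deals with high-level calls.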

To be able to control the rover manually, we created a dashboard and added manual controls to it. For messaging we chose MQTT, a lightweight publish/subscribe protocol designed for small IoT devices. Besides the controls, the dashboard also shows the latest data points together with a camera snapshot and the depth-estimation image created with Core ML. All communication was monitored live with Azure Application Insights.

Dragonfly Mini dashboard
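On the rover side, each MQTT control message has to be translated into motor commands. A sketch of that handler, assuming a JSON payload with "speed" and "turn" values in -1..1 (topic name and payload shape are illustrative, not the project's actual schema):

```python
import json

CONTROL_TOPIC = "dragonfly/rover/control"  # illustrative topic name

def parse_control(payload):
    """Turn an MQTT control payload into (left, right) motor powers
    using simple differential steering, each clamped to -1..1."""
    msg = json.loads(payload)
    speed = msg.get("speed", 0.0)
    turn = msg.get("turn", 0.0)
    left = max(-1.0, min(1.0, speed + turn))
    right = max(-1.0, min(1.0, speed - turn))
    return left, right
```

An MQTT client library would subscribe to the control topic and call this for every incoming message.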

Iceland is home to Atlantic puffins, small birds with brightly coloured beaks. They live only in the North Atlantic and are protected in Iceland. We have seen several examples of how AI can help track endangered species, and that inspired our first use case: we trained a custom model capable of detecting puffins and tested it on the Jetson Nano with the videos we recorded. We also decided to explore possible search and rescue scenarios, since drones can cover vast areas.
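Whatever detector is used, its raw output is typically a list of labelled boxes with confidence scores that needs filtering before counting sightings. A sketch of that step (the label name, score field and threshold are illustrative assumptions):

```python
def puffin_detections(detections, min_score=0.6):
    """Keep only confident puffin hits from raw detector output.
    Each detection is a dict with 'label', 'score' and 'box' keys."""
    return [d for d in detections
            if d["label"] == "puffin" and d["score"] >= min_score]
```

Counting the filtered detections per frame gives a rough sighting count for a recorded video.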

There are many other use cases where such technology can be applied. Our summer intern, Cecilia Rundberg, wrote a blog post about the project she did at Stratiteq, demonstrating how it can be used for measuring social distancing. You can read all about it here.

We’ll continue working on this project and will publish a series of technical blog posts on how we solved specific challenges while building it; feel free to follow us on Dev.to if you’d like to know more.

Special thanks go out to Charlotte, Eliass, Erik, Frank, Jörgen, Josefine, Kim, Paul, Rasmus, Sinisa, Yurdaer and to all other Stratiteq employees who helped with this project.