
Project goal

The aim of this project is the 3D positioning and control of several mini-drones using embedded systems such as the Raspberry Pi. Our vision is, among other things, a modular station for events with which several mini-drones can fly in formation. There are also other goals we want to achieve with slight extensions to the same system:

  • Gesture-controlled camera stick
  • Flight inside the limited space of our workshop window
  • Other interactions, such as playing Tic-Tac-Toe or drawing geometric shapes

Project details

After initial trials with ArUco markers, which ran into severe problems with low light, long distances and fast movements, we decided to use simple LED markers mounted on the drone.

The markers are recognised on a Raspberry Pi system, as both the PiCamera and the platform itself are very flexible. In particular, it is important for effective recognition that the exposure time can be set very low so as not to overexpose the LEDs. 

Instead of an OpenCV implementation on the CPU, we decided to utilise the hardware of the Raspberry Pi and use our own blob detection algorithm on the VideoCore GPU with OpenGL as the basis. This allows us to achieve higher frame rates and, above all, lower latencies, as the resource-intensive movement of data from GPU to CPU is almost completely avoided. This also allows us to use a small €5 Raspberry Pi Zero computer and still achieve real-time performance.

After recognising the LEDs, the 3D position of the marker must then be computed on the CPU, which will probably be done with solvePnP from OpenCV. This position is then filtered and predicted to ensure smooth and accurate tracking of the drone. Finally, the desired movements of the drone are calculated, converted into control commands and sent to the drone using an external RF module.

Progress

The following steps are still necessary to finalise the basic software: 

  • Hardware-optimised blob detection of the LEDs
  • Marker detection from the identified blobs
  • Deriving the marker's 3D position from the image data
  • Filtering and predicting future positions
  • Porting the driver for controlling the drone with an RF module to the Raspberry Pi
  • Correcting the drone's position towards the desired position
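For the filtering and prediction step, one simple option (our sketch, not the project's settled design) is a constant-velocity alpha-beta filter over the 3D marker position; the gains and 45 fps timestep are illustrative:

```python
import numpy as np

class AlphaBetaFilter:
    """Constant-velocity alpha-beta filter: one smoothing pass per frame."""

    def __init__(self, alpha=0.5, beta=0.2, dt=1 / 45):
        self.alpha, self.beta, self.dt = alpha, beta, dt
        self.pos = np.zeros(3)
        self.vel = np.zeros(3)

    def update(self, measurement):
        # Predict forward one frame, then blend in the new measurement.
        self.pos = self.pos + self.vel * self.dt
        residual = np.asarray(measurement, float) - self.pos
        self.pos = self.pos + self.alpha * residual
        self.vel = self.vel + (self.beta / self.dt) * residual
        return self.pos

    def predict(self, horizon):
        # Extrapolate ahead, e.g. to compensate for control latency.
        return self.pos + self.vel * horizon

# Feed a marker moving at constant velocity; the estimate locks on.
f = AlphaBetaFilter()
true_vel = np.array([0.5, 0.0, -0.2])  # m/s
for k in range(200):
    est = f.update(true_vel * (k / 45))
print(est)
```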

After that, we can turn to our specific goals. For our first target, the camera stick, we are aiming for a specially designed body with an ergonomic grip and elegant design (in other words: no new software). For formations, more complex predictions and more accurate settings (PIDs) are needed to maintain a desired trajectory.
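As a sketch of what such a PID position correction could look like, per axis; the gains, timestep and the toy double-integrator dynamics are illustrative, not tuned project values:

```python
class PID:
    """Minimal per-axis PID controller (illustrative gains, not tuned)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy usage: drive a 1D "drone" toward an altitude of 1.0 m.
dt = 1 / 45  # one control step per camera frame
pid = PID(kp=2.0, ki=0.1, kd=2.0, dt=dt)
z, vz = 0.0, 0.0
for _ in range(400):
    thrust = pid.step(1.0, z)
    vz += thrust * dt  # crude double-integrator dynamics
    z += vz * dt
print(round(z, 2))
```

In the real system one such loop per axis would run on the filtered position estimate, with its output converted into RF control commands.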

Significance

With this project we show that real-time computer vision tasks are possible on a low-cost embedded system. With the Raspberry Pi Zero (€5) and a cheap PiCamera (€2.50), the result is an extremely competitive and quite powerful low-cost optical tracker. Unoptimised, the current system achieves 640×480 at 45 fps and 960×544 at 30 fps.

This can also be easily expanded into a stereo system, as two Zeros can be synchronised with an external microcontroller (€3), which enables significantly greater accuracy. 

Other, more complicated CV algorithms are also possible by programming the GPU directly.

Contact

Do you have questions about the project, or would you like to join us? Send us an email and get in touch.
