Difference between revisions of "Gesture-Controlled Tello Drone- Rapolas Kairys"

== Gesture Controlled Tello Drone Project ==

=== Goal of the Project ===
The main objective of this project is to control a Tello drone using gesture recognition from a laptop camera. Users raise their arms in specific poses to make the drone move left, right, or up, and toggle flight (take off/land) by holding an "UP" pose for four seconds. This project aims to:
* Provide a way to interact with and control a small drone using hand gestures.
* Explore computer vision and machine learning for gesture recognition.
* Demonstrate real-time control using multithreading to keep the video feed smooth.
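The pose-to-command logic described above can be sketched as pure Python. This is an illustrative assumption, not the project's exact code: the function and class names (`classify_pose`, `HoldTimer`) and the shoulder/wrist comparison are hypothetical, but they rely on the fact that MediaPipe pose landmarks use normalized image coordinates where y grows downward, so a raised wrist has a smaller y than its shoulder.

```python
import time

def classify_pose(left_wrist_y, right_wrist_y, left_shoulder_y, right_shoulder_y):
    """Hypothetical pose classifier: wrist above shoulder means the arm is raised."""
    left_up = left_wrist_y < left_shoulder_y
    right_up = right_wrist_y < right_shoulder_y
    if left_up and right_up:
        return "UP"
    if left_up:
        return "LEFT"
    if right_up:
        return "RIGHT"
    return "NONE"

class HoldTimer:
    """Reports True once a target pose has been held continuously for
    `hold_seconds` -- used here to model the four-second "UP" hold that
    toggles take-off/landing."""
    def __init__(self, target="UP", hold_seconds=4.0):
        self.target = target
        self.hold_seconds = hold_seconds
        self._start = None

    def update(self, pose, now=None):
        now = time.monotonic() if now is None else now
        if pose != self.target:
            self._start = None          # pose broken: restart the hold
            return False
        if self._start is None:
            self._start = now           # hold just began
            return False
        if now - self._start >= self.hold_seconds:
            self._start = None          # reset so the next hold toggles again
            return True
        return False
```

Calling `update()` once per video frame with the current pose label is enough; the timer fires exactly once per completed four-second hold.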

=== Tools used ===
* [https://github.com/google-ai-edge/mediapipe MediaPipe] for pose estimation.
* [https://opencv.org/ OpenCV] for video processing.
* [https://github.com/damiafuentes/DJITelloPy DJITelloPy] library for drone communication over Wi-Fi.
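DJITelloPy exposes a <code>send_rc_control(left_right, forward_back, up_down, yaw)</code> method that takes velocities in the range -100..100. A minimal sketch of mapping a recognised pose label to such a velocity tuple might look like this; the pose names and the speed constant are assumptions for illustration, not values from the project:

```python
SPEED = 30  # cm/s, well inside Tello's -100..100 RC velocity range

def pose_to_rc(pose):
    """Translate a pose label into an (lr, fb, ud, yaw) velocity tuple."""
    mapping = {
        "LEFT":  (-SPEED, 0, 0, 0),
        "RIGHT": (SPEED, 0, 0, 0),
        "UP":    (0, 0, SPEED, 0),
    }
    return mapping.get(pose, (0, 0, 0, 0))  # hover on unknown pose
```

In the control loop this would be forwarded to the drone roughly as <code>tello.send_rc_control(*pose_to_rc(pose))</code>, so an unrecognised pose simply commands a hover.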
Key features:
* Hand movement recognition for drone directional control
* First-person view (FPV) video recording
* Command queue system for safe operation
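The command-queue idea above can be sketched with Python's standard <code>queue</code> and <code>threading</code> modules. This is an assumed design, not the author's exact code: gesture detection pushes commands while a single worker thread executes them one at a time, so the drone never receives overlapping instructions and the video loop is never blocked waiting on the drone.

```python
import queue
import threading

class CommandWorker:
    """Serialises drone commands: producers call submit(), one daemon
    thread drains the queue and runs each command via `executor`."""
    def __init__(self, executor):
        self._queue = queue.Queue()
        self._executor = executor          # e.g. a DJITelloPy method call
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def submit(self, command):
        self._queue.put(command)           # non-blocking for the video loop

    def _run(self):
        while True:
            command = self._queue.get()
            if command is None:            # sentinel: shut down cleanly
                break
            self._executor(command)

    def stop(self):
        self._queue.put(None)
        self._thread.join()
```

Because the queue preserves insertion order, commands reach the drone in the order the gestures were recognised, which is what makes the scheme safe.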

Revision as of 15:51, 2 February 2025
