Gesture-Controlled Tello Drone - Rapolas Kairys
From RoboWiki
Revision as of 15:51, 2 February 2025
Gesture-Controlled Tello Drone Project
Goal of the Project
The main objective of this project is to control a Tello drone using gesture recognition from a laptop camera. Users raise their arms in specific poses to make the drone move left, right, or up, or toggle flight (take off/land) by holding an "UP" pose for four seconds. This project aims to:
- Provide a way to interact with and control a small drone using hand gestures.
- Explore computer vision and machine learning for gesture recognition.
- Demonstrate real-time control using multithreading to keep the video feed smooth.
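The arm-pose gestures and the four-second "UP" hold described above can be sketched as a small classifier over pose landmarks. This is a hypothetical sketch, not the project's actual code: the landmark names, thresholds, and class names are assumptions, and coordinates follow MediaPipe's normalized convention where y grows downward from the top of the frame.

```python
import time

def classify_pose(landmarks):
    """Map wrist/shoulder positions to a gesture label.

    `landmarks` is a dict with 'left_wrist', 'right_wrist',
    'left_shoulder', 'right_shoulder', each an (x, y) tuple in
    normalized image coordinates (y grows downward).
    """
    lw, rw = landmarks["left_wrist"], landmarks["right_wrist"]
    ls, rs = landmarks["left_shoulder"], landmarks["right_shoulder"]

    left_up = lw[1] < ls[1]    # wrist above shoulder => arm raised
    right_up = rw[1] < rs[1]

    if left_up and right_up:
        return "UP"            # both arms raised
    if left_up:
        return "LEFT"
    if right_up:
        return "RIGHT"
    return "NONE"


class HoldDetector:
    """Fires once when a gesture has been held for `hold_s` seconds."""

    def __init__(self, gesture="UP", hold_s=4.0):
        self.gesture = gesture
        self.hold_s = hold_s
        self._since = None     # timestamp when the hold started

    def update(self, label, now=None):
        now = time.monotonic() if now is None else now
        if label != self.gesture:
            self._since = None         # hold broken, start over
            return False
        if self._since is None:
            self._since = now          # hold just started
            return False
        if now - self._since >= self.hold_s:
            self._since = None         # reset so the toggle fires only once
            return True
        return False
```

In a real loop, `classify_pose` would be fed landmarks from each MediaPipe frame and `HoldDetector.update` called with the resulting label; when it returns `True`, the takeoff/land toggle is triggered.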
Tools used
- MediaPipe for pose estimation.
- OpenCV for video processing.
- DJITelloPy library for drone communication over Wi-Fi.
Key features:
- Hand movement recognition for drone directional control
- First-person view (FPV) video recording
- Command queue system for safe operation
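The command-queue feature above can be illustrated with a short sketch: the vision thread enqueues commands as gestures are recognized, and a single worker thread sends them to the drone one at a time, so commands never overlap. This is an assumption-laden illustration, not the project's implementation; `FakeDrone` stands in for a djitellopy `Tello` object, and the command strings and method names are made up.

```python
import queue
import threading

class FakeDrone:
    """Stand-in for a djitellopy Tello; records what it was asked to do."""
    def __init__(self):
        self.sent = []

    def send(self, command):
        self.sent.append(command)


def command_worker(drone, commands):
    """Drain the queue until a `None` sentinel arrives."""
    while True:
        cmd = commands.get()
        if cmd is None:
            break
        drone.send(cmd)           # one command at a time, in order
        commands.task_done()


drone = FakeDrone()
commands = queue.Queue()
worker = threading.Thread(target=command_worker, args=(drone, commands))
worker.start()

# The vision loop would push commands as gestures are recognized:
for gesture_cmd in ["takeoff", "left 30", "right 30", "land"]:
    commands.put(gesture_cmd)

commands.put(None)    # stop the worker
worker.join()
print(drone.sent)     # ['takeoff', 'left 30', 'right 30', 'land']
```

Because `queue.Queue` is FIFO and thread-safe, the worker receives commands in the order gestures were detected, which is what makes operation safe: a new gesture can never interrupt a command already in flight.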