Gesture-Controlled Tello Drone - Rapolas Kairys
Gesture-Controlled Tello Drone Project
Goal of the Project
The main objective of this project is to control a Tello drone using gesture recognition from a laptop camera. Users raise their arms in specific poses to make the drone move left, right, or up, and can toggle flight (take off/land) by holding the "UP" pose (both arms raised) for four seconds. This project aims to:
- Provide a way to interact with and control a small drone using arm gestures.
- Explore computer vision and machine learning for gesture recognition.
- Demonstrate real-time control, using multithreading to keep the video feed smooth.
Description
The system uses:
- MediaPipe Pose Estimation to detect body landmarks (shoulders, elbows).
- OpenCV for video processing.
- DJITelloPy library to send commands to the Tello drone over Wi-Fi, handling takeoff, landing, and movement.
- Multithreading to ensure that drone commands (which can block) do not freeze the camera feed or the user interface.
- Custom gesture logic to determine which gesture is active (LEFT arm up, RIGHT arm up, both arms up, or none); a sketch of this step follows this list.
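The detection step can be illustrated with a short Python sketch. This is a minimal example, not the project's actual code: the helper name classify_gesture and the simple "arm is up when the elbow is above the shoulder" rule are assumptions, and the real thresholds may differ. Only standard MediaPipe and OpenCV calls are used.

# A minimal sketch of the gesture-detection step, assuming the helper
# name classify_gesture() and a simple "arm is up when the elbow is
# above the shoulder" rule; the project's real thresholds may differ.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
PL = mp_pose.PoseLandmark

def classify_gesture(landmarks):
    """Return 'BOTH', 'LEFT', 'RIGHT', or 'NONE' from pose landmarks.

    Image y coordinates grow downward, so a smaller y means the
    landmark sits higher in the frame. Note that MediaPipe's LEFT and
    RIGHT are the person's anatomical sides, which appear mirrored in
    a selfie-view camera image.
    """
    left_up = landmarks[PL.LEFT_ELBOW].y < landmarks[PL.LEFT_SHOULDER].y
    right_up = landmarks[PL.RIGHT_ELBOW].y < landmarks[PL.RIGHT_SHOULDER].y
    if left_up and right_up:
        return "BOTH"
    if left_up:
        return "LEFT"
    if right_up:
        return "RIGHT"
    return "NONE"

cap = cv2.VideoCapture(0)
with mp_pose.Pose() as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            gesture = classify_gesture(results.pose_landmarks.landmark)
            cv2.putText(frame, gesture, (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        cv2.imshow("Gesture", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # "q" quits, as in the project
            break
cap.release()
cv2.destroyAllWindows()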
When the user performs a gesture in front of the laptop camera, the system detects it and translates it into a drone command:
- Holding both arms raised for 4 seconds toggles flight (takeoff or landing).
- Once in the air, raising both arms makes the drone go up.
- Raising only the left or right arm makes the drone move left or right.
- Doing neither results in a hover command.
- The drone moves 30 cm per command; this distance can be changed in drone_controller.py.
- A movement gesture has to be held for 1.5 seconds to take effect, which helps prevent accidental commands and improves safety (see the dispatch sketch after this list).
- While the program is running and a drone is connected, pressing "t" makes the drone take off or land, and pressing "q" shuts down the program.
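The command side can be sketched as a queue feeding a worker thread, so the blocking DJITelloPy calls never stall the video loop. This is a minimal illustration under stated assumptions: names such as command_queue, drone_worker, HOLD_SECONDS, and update_gesture are hypothetical rather than the project's identifiers, and the 4-second takeoff toggle is omitted for brevity. Only real DJITelloPy methods (connect, move_left, move_right, move_up) are used.

# A minimal sketch of the command side, assuming a queue feeding a
# worker thread; names like command_queue, drone_worker, HOLD_SECONDS,
# and update_gesture are illustrative, not the project's identifiers,
# and the 4-second takeoff toggle is omitted for brevity.
import queue
import threading
import time

from djitellopy import Tello

HOLD_SECONDS = 1.5   # a gesture must be held this long before it fires
STEP_CM = 30         # distance per move command (cf. drone_controller.py)

command_queue = queue.Queue()

def drone_worker(tello):
    """Run blocking DJITelloPy calls off the main (video) thread."""
    actions = {
        "LEFT": lambda: tello.move_left(STEP_CM),
        "RIGHT": lambda: tello.move_right(STEP_CM),
        "BOTH": lambda: tello.move_up(STEP_CM),
    }
    while True:
        cmd = command_queue.get()
        if cmd == "QUIT":
            break
        action = actions.get(cmd)
        if action:
            action()  # blocks until the Tello acknowledges the move

tello = Tello()
tello.connect()
threading.Thread(target=drone_worker, args=(tello,), daemon=True).start()

# Called once per video frame with the current gesture: a command is
# enqueued only after the same gesture has been held for HOLD_SECONDS,
# so brief pose flickers never move the drone.
held_gesture, held_since = "NONE", time.time()

def update_gesture(gesture):
    global held_gesture, held_since
    if gesture != held_gesture:
        held_gesture, held_since = gesture, time.time()
    elif gesture != "NONE" and time.time() - held_since >= HOLD_SECONDS:
        command_queue.put(gesture)
        held_since = time.time()  # restart the hold timer after firing

Keeping the blocking calls on a daemon worker thread means a slow or lost Tello response delays at most the next queued command rather than freezing the camera feed, which is the multithreading point made above.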