Dog Robot Playing a Dodgeball Game

Overview

This project aims to enhance the Spot robot's capabilities by integrating an Event camera, a Depth camera, and an IMU (Inertial Measurement Unit) for real-time trajectory estimation of fast-moving objects. The robot will autonomously dodge these objects, as in a dodgeball game, and then return to its original position. This capability will allow Spot to navigate and react in highly dynamic environments, improving its potential for real-world applications such as disaster management, industrial operations, and collaborative robotics.

Project Components

The system will consist of the following key components:

• Spot Robot: The agile quadruped robot by Boston Dynamics, known for its mobility and versatility in unstructured environments.

• Event Camera: A high-speed sensor capable of detecting changes in the scene at the pixel level, enabling rapid object motion detection.

• Depth Camera: A sensor that provides 3D spatial information, critical for understanding the environment and estimating distances of moving objects.

• IMU (Inertial Measurement Unit): A sensor used to track the robot's orientation and motion, assisting in trajectory estimation and stabilization during object avoidance.
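
As a point of reference for the integration work described below, the sketch that follows shows one way the three sensor streams could be represented in software, with explicit timestamps to support synchronization. The field names and units are illustrative assumptions and do not correspond to the actual Spot SDK or camera driver interfaces.

```python
# A minimal sketch of the three sensor streams, assuming hypothetical field
# layouts; real drivers will expose their own message types.
from dataclasses import dataclass
import numpy as np


@dataclass
class EventPacket:
    """A batch of pixel-level brightness-change events from the event camera."""
    timestamps_us: np.ndarray  # (N,) event times in microseconds
    xs: np.ndarray             # (N,) pixel column of each event
    ys: np.ndarray             # (N,) pixel row of each event
    polarities: np.ndarray     # (N,) +1 for brightness increase, -1 for decrease


@dataclass
class DepthFrame:
    """One depth image, giving metric distance per pixel."""
    timestamp_us: int
    depth_m: np.ndarray        # (H, W) depth in metres, 0 where invalid


@dataclass
class ImuSample:
    """One inertial measurement from the robot-mounted IMU."""
    timestamp_us: int
    angular_velocity: np.ndarray     # (3,) rad/s in the body frame
    linear_acceleration: np.ndarray  # (3,) m/s^2 in the body frame
```

Keeping all three streams on a common clock is what later allows event, depth, and IMU data to be fused into a single trajectory estimate.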

Methodology

The proposed system will be developed in phases. First, the sensors (Event camera, Depth camera, and IMU) will be integrated with Spot's onboard systems, ensuring seamless communication and synchronization between the components. The sensors will then be calibrated to guarantee accurate data acquisition.

The next phase involves developing a real-time trajectory estimation algorithm that analyzes the data from the Event and Depth cameras, supplemented by IMU data. This algorithm will predict the trajectory of fast-moving objects using either machine learning techniques or a physics-based approach.

Once the trajectory is estimated, a control system will be designed to allow Spot to execute precise dodge maneuvers, adjusting its posture and movement dynamically based on the incoming object's path. Feedback will also be incorporated to ensure Spot returns to its original position after the dodge.

Finally, extensive testing will be conducted in real-world scenarios to evaluate the system's performance in terms of reaction time, accuracy, and adaptability under different environmental conditions, and the system will be iteratively optimized for better response and precision.
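
As a rough illustration of the physics-based option, the sketch below assumes the perception front end already fuses event, depth, and IMU data into time-stamped 3D positions of the object in the robot's frame. It fits a ballistic model to those observations, predicts whether the object will come within an assumed safety radius of the robot, and, if so, proposes a lateral dodge offset. The gravity-only motion model, the safety radius, and the dodge rule are illustrative assumptions, not the project's final design.

```python
# Sketch of a physics-based trajectory estimator and dodge decision,
# operating on hypothetical fused 3D observations in the robot frame.
import numpy as np

G = np.array([0.0, 0.0, -9.81])  # gravity in the robot frame, m/s^2


def fit_ballistic(times, positions):
    """Least-squares fit of p(t) = p0 + v0*t + 0.5*g*t^2 to observations.

    times:     (N,) seconds, relative to the first observation
    positions: (N, 3) object positions in metres
    Returns (p0, v0).
    """
    t = np.asarray(times, dtype=float).reshape(-1, 1)
    # Remove the known gravity term, then solve a linear model for p0 and v0.
    y = np.asarray(positions, dtype=float) - 0.5 * G * t**2
    A = np.hstack([np.ones_like(t), t])          # columns: [1, t]
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs[0], coeffs[1]                  # p0, v0


def predict(p0, v0, t):
    """Predicted object position at time t under the ballistic model."""
    return p0 + v0 * t + 0.5 * G * t**2


def dodge_command(p0, v0, horizon_s=1.0, safety_radius_m=0.6, step_s=0.02):
    """Return a lateral dodge offset (metres) if the predicted path passes
    within the safety radius of the robot (treated as a point at the origin),
    otherwise None."""
    for t in np.arange(0.0, horizon_s, step_s):
        p = predict(p0, v0, t)
        if np.linalg.norm(p) < safety_radius_m:
            # Step sideways, away from the object's lateral approach direction.
            side = -1.0 if p[1] > 0 else 1.0
            return np.array([0.0, side * safety_radius_m, 0.0])
    return None
```

Returning to the original position after the dodge could then amount to commanding the inverse of the executed offset once the object has passed, with the IMU and the robot's odometry providing the feedback mentioned above.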

Benefits

The successful completion of this project will significantly enhance Spot's mobility, allowing it to avoid fast-moving obstacles in real time, thus increasing its applicability in unpredictable environments. One of the major benefits of this system is improved safety, as Spot will be able to dodge objects in hazardous settings, such as disaster zones or fast-paced industrial environments, reducing the risk of collision. Additionally, the integration of the Event camera, which is capable of detecting motion at extremely high speed, will enhance Spot's responsiveness, making it suitable for complex environments where quick reflexes are essential. This solution is versatile and adaptable, opening doors for applications in search and rescue missions, industrial automation, and even human-robot collaboration, where the robot's agility and ability to navigate dynamic obstacles are critical.

Conclusion

This project will push the boundaries of dynamic obstacle avoidance by leveraging advanced sensing technologies, including Event cameras, Depth cameras, and IMUs. By enabling Spot to estimate and dodge fast-moving objects in real time, the robot's autonomy and usefulness in real-world applications will be significantly improved.

Client

University of Agder

Biomechatronics and Collaborative Robotics Research Group

Project proposal

Type: From an organisation
Published: 2024-10-14
Status: Open
Degree: Unspecified

Subject areas

This proposal is open to multiple student projects.