This project is a four-axis parallel robot arm equipped with a suction cup as the end-effector, controlled via an electromagnet valve. The system runs on a Linux-based environment and introduces several advanced features.
The project is based on https://github.com/wissem01chiha/robotic-arm-manipulator; the original code has been ported to C++ and Python.
- Arduino: Controls the suction cup and electromagnet valve.
- Embedded Development Board: Handles the motion control of the arm.
- YOLOv5: Used for detecting packaging damage.
- Machine Vision: For recognizing QR codes on items.
- SQLite Database: For storing data related to detected damages, QR codes, and operational logs.
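The repository does not document the database layout here, so the following is only a minimal sketch of how the SQLite storage could look; the database file name, table name, and column names are assumptions, not taken from the project code.

```python
# Illustrative only: table and column names are assumptions, not from the repo.
import sqlite3

def init_db(path="palletizer.db"):
    """Create (if needed) a table for inspection results and return the connection."""
    conn = sqlite3.connect(path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS inspections (
            id         INTEGER PRIMARY KEY AUTOINCREMENT,
            qr_code    TEXT,                 -- decoded QR payload of the item
            damaged    INTEGER NOT NULL,     -- 1 if YOLOv5 flagged packaging damage
            confidence REAL,                 -- detection confidence, if available
            created_at TEXT DEFAULT CURRENT_TIMESTAMP
        )
    """)
    conn.commit()
    return conn

def log_inspection(conn, qr_code, damaged, confidence=None):
    """Insert one inspection record (QR payload plus damage verdict)."""
    conn.execute(
        "INSERT INTO inspections (qr_code, damaged, confidence) VALUES (?, ?, ?)",
        (qr_code, int(damaged), confidence),
    )
    conn.commit()
```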
- Package Damage Detection (YOLOv5): The system uses YOLOv5, a real-time object detection algorithm, to detect damaged packaging before palletizing (see the detection sketch after this list).
- QR Code Recognition: Machine vision algorithms are integrated to scan and identify QR codes on items.
- Data Storage: Detected damage reports and QR code information are stored in an SQLite database for later retrieval and analysis.
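As an illustration of how the two detection features above might fit together, here is a hedged sketch. The custom weights path `weights/damage.pt`, the 0.5 confidence threshold, and the use of OpenCV's built-in `QRCodeDetector` are assumptions, not details taken from this repository.

```python
# Illustrative sketch: weights path, threshold, and QR detector choice are assumptions.
import cv2
import torch

# Load a YOLOv5 model via torch.hub (a custom damage-detection checkpoint is assumed).
model = torch.hub.load("ultralytics/yolov5", "custom", path="weights/damage.pt")
qr_detector = cv2.QRCodeDetector()

def inspect_frame(frame):
    """Return (qr_payload, damaged) for one camera frame."""
    # QR recognition with OpenCV's built-in detector.
    qr_payload, _, _ = qr_detector.detectAndDecode(frame)

    # Damage detection: any box above the confidence threshold counts as damage.
    results = model(frame)
    detections = results.xyxy[0]          # (x1, y1, x2, y2, conf, cls) per box
    damaged = bool(len(detections)) and float(detections[:, 4].max()) > 0.5
    return qr_payload or None, damaged
```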
- Arduino File: Code for controlling the electromagnet valve, located under the `arduino` tag (a host-side helper for sending valve commands is sketched after this list).
- ARM File: Code for controlling the arm's movement, located under the `Python` tag.
- The system runs on Linux; install the dependencies listed in `requirement.txt` before running.
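The valve protocol itself lives in the Arduino sketch and is not reproduced here; the helper below is a hypothetical host-side counterpart using pyserial, where the port name, baud rate, and the `V1`/`V0` command bytes are all assumptions.

```python
# Hypothetical host-side helper: port name, baud rate, and the 'V1'/'V0' command
# bytes are assumptions; the real protocol is defined by the Arduino sketch.
import serial

def set_suction(on=True, port="/dev/ttyUSB0", baud=9600):
    """Ask the Arduino to energize (on) or release (off) the electromagnet valve."""
    with serial.Serial(port, baud, timeout=1) as link:
        link.write(b"V1\n" if on else b"V0\n")
        return link.readline().decode(errors="ignore").strip()  # optional ACK text
```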
- Connect the suction cup and electromagnet valve to the Arduino.
- Connect the motors for the three-axis parallel mechanism to the embedded development board.
- Set up a camera for capturing images for damage detection and QR code scanning.
- Power on the Arduino and the embedded development board (RDK X3 recommended).
- Run the Python files `main.py` and `usbtest.py`.
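For orientation only, here is a minimal sketch of how the pieces above could be tied together, reusing the hypothetical helpers from the earlier sketches (`inspect_frame`, `log_inspection`, `set_suction`); the camera index and the overall flow are assumptions, not the actual contents of `main.py`.

```python
# Minimal glue sketch (assumed flow): capture a frame, inspect it, log the result,
# and trigger the suction cup only for undamaged packages.
import cv2

def run_once(conn):
    cap = cv2.VideoCapture(0)           # camera index is an assumption
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return
    qr, damaged = inspect_frame(frame)  # from the detection sketch above
    log_inspection(conn, qr, damaged)   # from the database sketch above
    if not damaged:
        set_suction(on=True)            # pick the package up
```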