This is a human-robot interaction (HRI) experiment based on human emotion detection.
Before running catkin_make in the workspace, remember to keep only the src folder (remove any existing build and devel folders).
cd HRI_emotion
source devel/setup.bash
roslaunch ur3_driver ur3_driver.launch
Then, in a new terminal for each node (run source devel/setup.bash in each one first):
cd src/emotion/src
rosrun emotion emotionDetector.py
rosrun hri emotionFeedback.py
Step 1: Click the camera visualization window to give it keyboard focus.
Step 2: Make an emotion and press "p" to send it to the robot.
Step 3: Wait for the robot's feedback.
Repeat: make another emotion and press "p" again.
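The interaction loop above can be sketched as follows. This is only an illustrative sketch, not the actual code in emotionDetector.py or emotionFeedback.py: the `on_key` function, the `detect_emotion` stub, and the emotion-to-gesture mapping are all hypothetical placeholders standing in for the real camera capture and ROS messaging.

```python
# Hypothetical sketch of the press-"p" interaction loop.
# In the real system the trigger comes from the camera window and the
# result travels over ROS topics; here both are stubbed for clarity.

# Hypothetical mapping from a detected emotion to a robot feedback gesture.
FEEDBACK = {
    "happy": "nod",
    "sad": "comfort_wave",
    "surprised": "tilt_head",
}

def on_key(key, detect_emotion):
    """When "p" is pressed, run the detector and choose a feedback gesture."""
    if key != "p":
        return None  # ignore every key except the capture trigger
    emotion = detect_emotion()
    return FEEDBACK.get(emotion, "idle")  # fall back to an idle gesture

# Example with a stub detector that always reports "happy":
print(on_key("p", lambda: "happy"))  # -> nod
print(on_key("q", lambda: "happy"))  # -> None (key ignored)
```

Repeating the loop corresponds to calling `on_key` again with a fresh detection, mirroring the "make another emotion, press p" cycle described above.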