This project has two parts. The first part is multi-emotion classification (happy, sad, surprise, and disgust) based on OpenCV and machine learning. The second part is Baxter drawing control based on a joint trajectory action server: Baxter draws a specific face corresponding to the detected emotion.
The video 'winter_project.mp4' shows the result of this project; it is also available on YouTube.
The Cohn-Kanade AU-Coded Facial Expression Database is for research in automatic facial image analysis and synthesis and for perceptual studies.
I selected and relabeled data for four emotions: happy (302), sad (238), surprise (256), and disgust (218).
1). Use the OpenCV Haar cascade to detect the human face, and convert it to grayscale;
2). Extract a new face region based on offset coefficients, and resize it to a 64*64 image;
3). Extract face features with dense SIFT.
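Dense SIFT samples descriptors on a regular grid over the face image instead of at detected keypoints. A minimal sketch of the grid generation (the 8-pixel step is an illustrative assumption, not necessarily the project's setting; descriptors would then be computed at each center, e.g. with OpenCV's SIFT):

```python
def dense_grid(width, height, step=8):
    """Generate (x, y) centers of a regular keypoint grid over an image.

    A SIFT descriptor would be computed at each center, giving a
    fixed-length feature vector for every 64*64 face image.
    """
    half = step // 2
    return [(x, y)
            for y in range(half, height, step)
            for x in range(half, width, step)]

# For a 64*64 face image with an 8-pixel step this yields an 8*8 grid,
# so every face produces the same number of descriptors.
grid = dense_grid(64, 64, step=8)
```

Because the grid is fixed, concatenating (or pooling) the per-cell descriptors gives feature vectors of identical length, which is what the SVM expects.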
Use the SVM machine learning algorithm for training and prediction. The kernel is "linear", and the multiclass classification method is One-vs.-One.
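With scikit-learn this setup is a linear-kernel SVC with the one-vs-one multiclass strategy. A sketch on toy feature vectors standing in for the dense-SIFT features (the data here is synthetic, not the CK database):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy stand-in for dense-SIFT feature vectors: 4 classes, 40 samples each
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(40, 16)) for c in range(4)])
y = np.repeat([0, 1, 2, 3], 40)  # happy, sad, surprise, disgust

# Linear kernel with one-vs-one multiclass classification
clf = SVC(kernel="linear", decision_function_shape="ovo")
clf.fit(X, y)
```

One-vs-one trains one binary classifier per pair of classes, so four emotions give 4*3/2 = 6 classifiers, and `decision_function` returns 6 pairwise scores per sample.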
In this project, I split the data into training and testing sets at a 7:3 ratio.
Use 5-fold cross-validation to calculate the mean score, then compute the accuracy on the training and testing sets (98%) and show the classification report and confusion matrix.
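The evaluation pipeline above can be sketched with scikit-learn as follows (again on synthetic stand-in features, so the printed numbers will not match the project's 98%):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import classification_report, confusion_matrix

rng = np.random.default_rng(1)
# Toy feature vectors standing in for the dense-SIFT features
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(60, 16)) for c in range(4)])
y = np.repeat([0, 1, 2, 3], 60)

# 7:3 train/test split, as in the project
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = SVC(kernel="linear", decision_function_shape="ovo")

# 5-fold cross-validation mean score on the training set
cv_mean = cross_val_score(clf, X_tr, y_tr, cv=5).mean()

clf.fit(X_tr, y_tr)
y_pred = clf.predict(X_te)
print("5-fold CV mean:", cv_mean)
print("train accuracy:", clf.score(X_tr, y_tr))
print("test accuracy: ", clf.score(X_te, y_te))
print(classification_report(y_te, y_pred))
print(confusion_matrix(y_te, y_pred))  # 4x4: one row/column per emotion
```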
Use OpenCV to capture each frame of the video and apply the same pre-processing to the frame; the program then displays the emotion it detects.
During live detection, 'Disgust' is hard to detect for some people, while the other emotions are classified reliably for everyone.
I designed four simple faces to represent happy, sad, surprise, and disgust. I then use the joint trajectory action server for Baxter drawing control. When Baxter receives a drawing command, it uses inverse kinematics to compute a list of reference angles for each arm joint, based on the trajectory and time settings. A code walkthrough for the joint trajectory client is provided.
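Conceptually, the trajectory client sends timed joint waypoints and the action server tracks intermediate angles between them. A minimal, ROS-free sketch of that interpolation idea (the 7-joint waypoints and timing below are illustrative values, not Baxter's actual configuration):

```python
def interpolate_joints(start, goal, duration, dt):
    """Linearly interpolate each joint angle from start to goal over duration.

    Returns a list of (time, angles) pairs -- the kind of reference
    sequence a joint trajectory action server tracks between waypoints.
    """
    steps = int(duration / dt)
    traj = []
    for i in range(steps + 1):
        t = i * dt
        alpha = t / duration  # fraction of the motion completed
        angles = [s + alpha * (g - s) for s, g in zip(start, goal)]
        traj.append((t, angles))
    return traj

# Illustrative waypoints for a 7-joint arm (Baxter's arm has 7 joints)
start = [0.0] * 7
goal = [0.5, -0.3, 0.2, 1.0, 0.0, -0.6, 0.1]
traj = interpolate_joints(start, goal, duration=2.0, dt=0.5)
```

In the real client, each (time, angles) pair corresponds to a trajectory point with `positions` and `time_from_start`, and the action server drives the joints through them.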
'Control.py' is based on move_to_joint_positions, and 'trajectory.py' is based on the joint trajectory action server.
In this project, Baxter uses a newly designed hand to hold the pen. The hand contains springs, which make the pen more compliant in the vertical direction.
Here are the drawing results from Baxter.
1). Find a new method to improve live detection of the 'Disgust' emotion;
2). Use more data (people with glasses, Asian people, different head orientations and lighting conditions);
3). Detect more emotions, such as fear and anger.
Use a velocity controller for Baxter movement.