Emojis and avatars are ways to convey nonverbal cues. These cues have become an essential part of online chatting, product reviews, brand emotion, and much more. This has also led to a growing body of data science research dedicated to emoji-driven storytelling.
With advancements in computer vision and deep learning, it is now possible to detect human emotions from images. In this deep learning project, I have classified human facial expressions in order to map them to the corresponding emojis or avatars.
The main aim of this project is to accurately turn human facial expressions into an emoji/avatar.
The FER-2013 (Facial Expression Recognition) dataset consists of 48×48 pixel grayscale images. The images are centered and occupy an equal amount of space. The dataset covers facial emotions in the following categories (see the loading sketch after the list):
- 0: Angry
- 1: Disgust
- 2: Fear
- 3: Happy
- 4: Sad
- 5: Surprise
- 6: Neutral
This dataset can also be found on Kaggle.
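For reference, here is a minimal loading sketch. It assumes the Kaggle CSV release of the dataset (a `fer2013.csv` file with an `emotion` label column and a `pixels` column of space-separated grayscale values); the filename and column names are assumptions, so adjust them to match your copy.

```python
import numpy as np
import pandas as pd

# FER-2013 label indices, as listed above.
EMOTIONS = {0: "angry", 1: "disgust", 2: "fear", 3: "happy",
            4: "sad", 5: "surprise", 6: "neutral"}

# Assumed file/column names from the Kaggle CSV release.
df = pd.read_csv("fer2013.csv")

# Each "pixels" entry is a space-separated string of 48*48 = 2304 values.
X = np.stack([np.array(s.split(), dtype="uint8") for s in df["pixels"]])
X = X.reshape(-1, 48, 48, 1)
y = df["emotion"].values

print(X.shape)  # (num_images, 48, 48, 1)
```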
In this deep learning project, I have built a convolutional neural network to recognize facial emotions, trained it on the FER-2013 dataset, and then mapped the recognized emotions to the corresponding emojis/avatars.
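As one illustration, a compact Keras architecture along these lines can be trained on the 48×48 grayscale inputs; the layer sizes and hyperparameters below are illustrative choices, not necessarily the exact ones used in this project.

```python
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense

model = Sequential()
# Convolution blocks extract local facial features from the 48x48 inputs.
model.add(Conv2D(32, kernel_size=(3, 3), activation='relu',
                 input_shape=(48, 48, 1)))
model.add(Conv2D(64, kernel_size=(3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))

model.add(Conv2D(128, kernel_size=(3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Conv2D(128, kernel_size=(3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))

# Classifier head: 7 outputs, one per emotion category.
model.add(Flatten())
model.add(Dense(1024, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(7, activation='softmax'))

model.compile(loss='categorical_crossentropy',
              optimizer='adam', metrics=['accuracy'])
```

With one-hot labels (e.g. via `keras.utils.to_categorical`), calling `model.fit(X, Y, batch_size=64, epochs=...)` then trains the network.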
- TensorFlow 1.5
- Python 3.x
- Keras
- OpenCV 3.4
- h5py
- A good grasp of the above topics, along with neural networks (preferably convolutional neural networks). You can also refer to the internet for help with any further technical doubts.
- A good CPU (preferably with a GPU).
- Patience... A LOT OF IT!
- Jupyter Notebook (optional)
- Dataset
- Emojis/Avatars
- 0: Neutral
- 1: Happy
- 2: Sad
- 3: Disgusted
- 4: Fearful
- 5: Surprised
- 6: Angry
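A simple dictionary is one way to wire a predicted class index to an emoji image. Note that the index order in this list differs from the FER-2013 label order given earlier, so the two orderings need to be reconciled in practice. The file paths below are placeholders for your own avatar assets.

```python
# Hypothetical emoji/avatar assets, keyed by the indices listed above.
# The ./emojis/ paths are placeholders; point them at your own images.
EMOJI_PATHS = {
    0: "./emojis/neutral.png",
    1: "./emojis/happy.png",
    2: "./emojis/sad.png",
    3: "./emojis/disgusted.png",
    4: "./emojis/fearful.png",
    5: "./emojis/surprised.png",
    6: "./emojis/angry.png",
}

def emoji_for(class_index):
    """Look up the avatar image path for a predicted emotion index."""
    return EMOJI_PATHS[class_index]
```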
Knowledge of the following topics is required to understand the technical details of this project:
- Tensorflow
- Keras
- Neural Networks
- Python
- OpenCV
- Various libraries like NumPy, pandas, Tkinter, etc.
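To show how OpenCV ties into the pipeline, the rough sketch below detects a face in a webcam frame, classifies it, and looks up the matching avatar. It assumes a trained model saved as `model.h5` (an assumed filename) and the `EMOJI_PATHS` mapping from the earlier sketch, and it uses OpenCV's bundled frontal-face Haar cascade (the `cv2.data.haarcascades` path may vary by install).

```python
import cv2
import numpy as np
from keras.models import load_model

model = load_model("model.h5")  # assumed filename for the trained CNN
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ret, frame = cap.read()
    if not ret:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3,
                                          minNeighbors=5)
    for (x, y, w, h) in faces:
        # Crop the face and resize to the 48x48 input the CNN expects.
        roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
        roi = roi.astype("float32").reshape(1, 48, 48, 1) / 255.0
        idx = int(np.argmax(model.predict(roi)))
        # Draw the detection and overlay the avatar's file name as a
        # simple stand-in for rendering the emoji image itself.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
        cv2.putText(frame, EMOJI_PATHS.get(idx, "?"), (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 0, 0), 2)
    cv2.imshow("Emojify", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```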