EmoPy is a Python toolkit with deep neural net classes which aims to make accurate predictions of emotions given images of people's faces.

*Figure from [@Chen2014FacialER]*

The goal of this project is to explore the field of [Facial Expression Recognition (FER)](https://en.wikipedia.org/wiki/Emotion_recognition) using existing public datasets, and to make neural network models which are free, open, easy to research, and easy to integrate into different projects. The behavior of the system is highly dependent on the available data, and the developers of EmoPy created and tested the system using only publicly available datasets.

To get a better grounding in the project, you may find these write-ups useful:

* [Recognizing human facial expressions with machine learning](https://www.thoughtworks.com/insights/blog/recognizing-human-facial-expressions-machine-learning)
* [EmoPy: a machine learning toolkit for emotional expression](https://www.thoughtworks.com/insights/blog/emopy-machine-learning-toolkit-emotional-expression)

We aim to expand our development community, and we are open to suggestions and contributions. Algorithms like these are usually deployed commercially, so we want to help open source the best possible version of them in order to improve public access and engagement in this area. Please [contact us](mailto:[email protected]) to discuss.

## Overview

- `directory_data_loader.py`
- `data_generator.py`

The `fermodel.py` module uses pre-trained models for FER prediction, making it the easiest entry point to get a trained model up and running quickly.

Each of the modules contains one class, except for `neuralnets.py`, which has one interface and four subclasses. Each of these subclasses implements a different neural net architecture using the Keras framework with a TensorFlow backend, allowing you to experiment and see which one performs best for your needs.
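
The interface-plus-subclasses layout can be sketched in plain Python. The class and method names below are hypothetical stand-ins, not EmoPy's actual API; the point is only the shared interface that makes architectures swappable:

```python
# Illustrative sketch of the "one interface, several subclasses" layout
# described above. Names here are hypothetical, not EmoPy's actual API.
from abc import ABC, abstractmethod


class NeuralNetBase(ABC):
    """Shared interface: every architecture can describe and build itself."""

    @abstractmethod
    def architecture_name(self) -> str:
        ...


class ConvolutionalNet(NeuralNetBase):
    def architecture_name(self) -> str:
        return "plain convolutional network"


class TimeDelayConvNet(NeuralNetBase):
    def architecture_name(self) -> str:
        return "time-delay convolutional network"


# Because every subclass satisfies the same interface, trying a different
# architecture only means instantiating a different class:
for net_cls in (ConvolutionalNet, TimeDelayConvNet):
    print(net_cls().architecture_name())
```

In EmoPy itself the subclasses build Keras models; this sketch only shows the structural pattern that lets you swap one architecture for another.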

The [EmoPy documentation](https://emopy.readthedocs.io/) contains detailed information on the classes and their interactions. An overview of the different neural nets included in this project also appears below.

## Operating Constraints

Commercial FER projects are regularly trained on millions of labeled images held in massive private datasets. By contrast, in order to remain free and open source, EmoPy was created to work with only public datasets, which presents a major constraint on training for accurate results.

EmoPy was originally created and designed to fulfill the needs of the [RIOT project](https://thoughtworksarts.io/projects/riot/), in which audience members' facial expressions are recorded in a controlled lighting environment.

For these two reasons, EmoPy functions best when the input image:

* is evenly lit, with relatively few shadows, and/or
* matches to some extent the style, framing and cropping of images from the training dataset
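
The "evenly lit" condition above can be checked crudely before prediction. This is a hedged sketch, not part of EmoPy; the grid size and brightness-spread threshold are arbitrary assumptions:

```python
# A quick heuristic (not part of EmoPy) for the "evenly lit" condition:
# split a grayscale image into a grid and compare average brightness per
# region. A large spread between regions suggests harsh shadows.
def is_evenly_lit(gray, max_region_spread=40.0, grid=4):
    """gray: 2-D list of 0-255 brightness values (rows of pixels)."""
    h, w = len(gray), len(gray[0])
    means = []
    for gi in range(grid):
        for gj in range(grid):
            rows = gray[gi * h // grid:(gi + 1) * h // grid]
            cells = [v for row in rows
                     for v in row[gj * w // grid:(gj + 1) * w // grid]]
            means.append(sum(cells) / len(cells))
    return max(means) - min(means) <= max_region_spread


flat = [[128] * 64 for _ in range(64)]            # uniform lighting
shadowed = [[20 if r < 32 and c < 32 else 128     # dark top-left quadrant
             for c in range(64)] for r in range(64)]
print(is_evenly_lit(flat), is_evenly_lit(shadowed))  # True False
```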

As of this writing, the best public dataset we have found is [Microsoft FER+](https://github.com/Microsoft/FERPlus), with around 30,000 images. Training on this dataset should yield the best results when the input image relates to some extent to the style of the images in the set.

For a deeper analysis of the origin and operation of EmoPy, which will help you evaluate its potential for your needs, please read our [full write-up on EmoPy](https://www.thoughtworks.com/insights/blog/emopy-machine-learning-toolkit-emotional-expression).

## Choosing a Dataset

Try out the system using your own dataset or a small dataset we have provided in the [EmoPy/examples/image_data](EmoPy/examples/image_data) subdirectory. The sample datasets we provide will not yield good results due to their small size, but they serve as a great way to get started.
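
If you organise your own dataset as one subdirectory per emotion label (an assumed layout for this sketch, not a requirement stated by EmoPy), a quick summary helps confirm the data was picked up correctly:

```python
# Hedged sketch (not EmoPy code): summarise a dataset laid out as one
# subdirectory per emotion label before training.
from pathlib import Path
import tempfile


def count_images_per_label(root, extensions=(".png", ".jpg", ".jpeg")):
    """Return {label_directory_name: number_of_image_files}."""
    root = Path(root)
    return {d.name: sum(1 for f in d.iterdir()
                        if f.suffix.lower() in extensions)
            for d in sorted(root.iterdir()) if d.is_dir()}


# Demo on a throwaway directory tree:
with tempfile.TemporaryDirectory() as tmp:
    for label, n in [("happiness", 3), ("anger", 2)]:
        d = Path(tmp) / label
        d.mkdir()
        for i in range(n):
            (d / f"{i}.png").touch()
    print(count_images_per_label(tmp))  # {'anger': 2, 'happiness': 3}
```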

Predictions ideally perform well on a diversity of datasets, illumination conditions, and subsets of the standard 7 emotion labels (happiness, anger, fear, surprise, disgust, sadness, calm/neutral) seen in FER research. Some good example public datasets are the [Extended Cohn-Kanade](http://www.consortium.ri.cmu.edu/ckagree/) and [Microsoft FER+](https://github.com/Microsoft/FERPlus).
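
Working with a subset of those seven labels can be as simple as fixing your own label-to-index mapping before training. This snippet is illustrative only; the index encoding is hypothetical, not any dataset's actual one:

```python
# Illustrative only: selecting a subset of the standard seven FER labels.
# The index mapping is hypothetical, not a dataset's actual encoding.
FER_LABELS = ["happiness", "anger", "fear", "surprise",
              "disgust", "sadness", "calm/neutral"]


def label_subset(wanted):
    """Map a chosen subset of emotions to contiguous class indices."""
    missing = set(wanted) - set(FER_LABELS)
    if missing:
        raise ValueError(f"unknown labels: {sorted(missing)}")
    return {name: idx for idx, name in enumerate(wanted)}


print(label_subset(["calm/neutral", "happiness", "surprise"]))
# {'calm/neutral': 0, 'happiness': 1, 'surprise': 2}
```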

## Environment Setup

EmoPy runs using Python 3.6 and up, theoretically on any Python-compatible OS. We tested EmoPy using Python 3.6.6 on OSX. You can install [Python 3.6.6](https://www.python.org/downloads/release/python-366/) from the Python website.

Python is compatible with multiple operating systems. If you would like to use EmoPy on another OS, please convert these instructions to match your target environment. Let us know how you get on, and we will try to support you and share your results.

If you do not have Homebrew installed, run this command to install:

…

```
python3.6 -m venv venv
```
where the second `venv` is the name of your virtual environment. To activate, run from the same directory:
```
source venv/bin/activate
```

## Running the examples

You can find example code to run each of the current neural net classes in [examples](EmoPy/examples). You may either download the example directory to a location of your choice on your machine, or find the example directory included in the installation.

If you choose to use the installed package, you can find the examples directory by starting in the virtual environment directory you created and typing:
```
cd lib/python3.6/site-packages/EmoPy/examples
```

The best place to start is the [FERModel example](EmoPy/examples/fermodel_example.py). Here is a listing of that code:
```python
from EmoPy.src.fermodel import FERModel
# …
```

[@vanGent2016]: http://www.paulvangent.com/2016/04/01/emotion-recognition-with-python-opencv-and-a-face-dataset/ "Emotion Recognition With Python, OpenCV and a Face Dataset. A tech blog about fun things with Python and embedded electronics."

## Contributors
248
+
249
+
Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):

<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->