
Commit a5f249e

Merge pull request #16 from thoughtworksarts/pr-13/sinbycos/ferWebcamOpenCV
Improves on PR #13: Webcam example
2 parents 20b2057 + 519f882 commit a5f249e

File tree

7 files changed: +59 -46 lines changed


.gitignore

+1
@@ -13,3 +13,4 @@ trained_models/
 venv/
 output/
 emopy_venv/
+EmoPy/examples/image_data/image.jpg

EmoPy.egg-info/PKG-INFO

+25 -13

@@ -1,10 +1,10 @@
 Metadata-Version: 2.1
 Name: EmoPy
-Version: 0.0.2
+Version: 0.0.4
 Summary: A deep neural net toolkit for emotion analysis via Facial Expression Recognition (FER)
 Home-page: https://github.com/thoughtworksarts/EmoPy
 Author: ThoughtWorks Arts
-Author-email: andy@thoughtworks.io
+Author-email: info@thoughtworksarts.io
 License: UNKNOWN
 Description: # EmoPy
 EmoPy is a python toolkit with deep neural net classes which accurately predict emotions given images of people's faces.
@@ -85,13 +85,12 @@ Description: # EmoPy
 
 ## Installation
 
-<!--
+
 ### From PyPi
 Once the virtual environment is activated, you may install EmoPy using
 ```
 pip install EmoPy
 ```
--->
 
 ### From the source
 
@@ -113,26 +112,38 @@ Description: # EmoPy
 
 ## Running the examples
 
-You can find example code to run each of the current neural net classes in [examples](examples). The best place to start is the [FERModel example](examples/fermodel_example.py). Here is a listing of that code:
+You can find example code to run each of the current neural net classes in [examples](examples). You may either download the example directory to a location of your choice on your machine, or find the example directory included in the installation.
+
+If you choose to use the installed package, you can find the examples directory by starting in the virtual environment directory you created and typing:
+```
+cd lib/python3.6/site-packages/EmoPy/examples
+```
+
+
+The best place to start is the [FERModel example](examples/fermodel_example.py). Here is a listing of that code:
 
 ```python
-import sys
-sys.path.append('../')
-from fermodel import FERModel
+from EmoPy.src.fermodel import FERModel
+from pkg_resources import resource_filename
 
 target_emotions = ['calm', 'anger', 'happiness']
 model = FERModel(target_emotions, verbose=True)
 
 print('Predicting on happy image...')
-model.predict('image_data/sample_happy_image.png')
+model.predict(resource_filename('EmoPy.examples','image_data/sample_happy_image.png'))
+
+print('Predicting on disgust image...')
+model.predict(resource_filename('EmoPy.examples','image_data/sample_disgust_image.png'))
+
+print('Predicting on anger image...')
+model.predict(resource_filename('EmoPy.examples','image_data/sample_anger_image2.png'))
 ```
 
 The code above loads a pre-trained model and then predicts an emotion on a sample image. As you can see, all you have to supply with this example is a set of target emotions and a sample image.
 
-Once you have completed the installation, you can run this example by moving into the examples folder and running the example script.
+Once you have completed the installation, you can run this example from the examples folder by running the example script.
 
 ```
-cd examples
 python fermodel_example.py
 ```
 
@@ -143,7 +154,6 @@ Description: # EmoPy
 To train your own neural net, use one of our FER neural net classes to get started. You can try the convolutional_model.py example:
 
 ```
-cd examples
 python convolutional_example.py
 ```
 
@@ -224,5 +234,7 @@ Description: # EmoPy
 [@vanGent2016]: http://www.paulvangent.com/2016/04/01/emotion-recognition-with-python-opencv-and-a-face-dataset/ "Emotion Recognition With Python, OpenCV and a Face Dataset. A tech blog about fun things with Python and embedded electronics."
 
 Platform: UNKNOWN
-Requires-Python: >=3.6.3,<3.7
+Classifier: Programming Language :: Python :: 3.6
+Classifier: Operating System :: MacOS :: MacOS X
+Requires-Python: >=3.6.3
 Description-Content-Type: text/markdown
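
The switch to `pkg_resources.resource_filename` above is what lets the packaged examples find their sample images regardless of the working directory. A minimal sketch of that lookup, assuming EmoPy is pip-installed in the active environment:

```python
# Minimal sketch, assuming EmoPy is pip-installed in the active environment.
# resource_filename maps a package-relative path to an absolute path on disk,
# so the sample images resolve no matter where the script is launched from.
from pkg_resources import resource_filename

happy_image = resource_filename('EmoPy.examples', 'image_data/sample_happy_image.png')
print(happy_image)  # e.g. <venv>/lib/python3.6/site-packages/EmoPy/examples/image_data/sample_happy_image.png
```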

EmoPy.egg-info/SOURCES.txt

+1
@@ -15,6 +15,7 @@ EmoPy/examples/__init__.py
 EmoPy/examples/convolutional_lstm_model.py
 EmoPy/examples/convolutional_model.py
 EmoPy/examples/fermodel_example.py
+EmoPy/examples/fermodel_example_webcam.py
 EmoPy/examples/timedelay_conv_model.py
 EmoPy/examples/transferlearning_model.py
 EmoPy/examples/image_data/sample.csv

EmoPy/examples/fermodel_example_webcam.py

+22 -19

@@ -2,40 +2,43 @@
 
 import cv2
 import sys
-sys.path.append('../')
+from EmoPy.src.fermodel import FERModel
+from pkg_resources import resource_filename
 
-#Choose the type of Face Expression Model
-from src.fermodel import FERModel
+fontFace = cv2.FONT_HERSHEY_SIMPLEX;
+fontScale = 1;
+thickness = 2;
 
-#Frame Number
-FRAME_NUM = 0
-
-#Choose the type of face detector cascade you want to use
-cascPath = "~/EmoPy/venv/lib/python3.5/site-packages/cv2/data/haarcascade_frontalface_default.xml"
-faceCascade = cv2.CascadeClassifier(cascPath)
 #Specify the camera which you want to use. The default argument is '0'
 video_capture = cv2.VideoCapture(0)
+#Capturing a smaller image for speed purposes
+video_capture.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
+video_capture.set(cv2.CAP_PROP_FRAME_HEIGHT, 360)
+video_capture.set(cv2.CAP_PROP_FPS, 15)
+
+#Can choose other target emotions from the emotion subset defined in fermodel.py in src directory. The function
+# defined as `def _check_emotion_set_is_supported(self):`
+target_emotions = ['calm', 'anger', 'happiness']
+model = FERModel(target_emotions, verbose=True)
 
 while True:
     #Capture frame-by-frame
     ret, frame = video_capture.read()
     #Save the captured frame on disk
-    file = '~/EmoPy/models/examples/image_data/image.jpg'
+    file = 'image_data/image.jpg'
     cv2.imwrite(file, frame)
-    #Can choose other target emotions from the emotion subset defined in fermodel.py in src directory. The function
-    # defined as `def _check_emotion_set_is_supported(self):`
-    target_emotions = ['calm', 'anger', 'happiness']
-    model = FERModel(target_emotions, verbose=True)
+
     frameString = model.predict(file)
-    #Display frame number and emotion
-    cv2.putText(frame, 'Frame:' + str(FRAME_NUM), (10, 40), cv2.FONT_HERSHEY_COMPLEX_SMALL, 3, (0, 0, 255), 2, cv2.LINE_AA)
-    cv2.putText(frame, frameString, (10,450), cv2.FONT_HERSHEY_COMPLEX_SMALL, 3, (0,255,0), 2, cv2.LINE_AA)
+
+    #Display emotion
+    retval, baseline = cv2.getTextSize(frameString, fontFace, fontScale, thickness)
+    cv2.rectangle(frame, (0, 0), (20 + retval[0], 50), (0,0,0), -1)
+    cv2.putText(frame, frameString, (10, 35), fontFace, fontScale, (255, 255, 255), thickness, cv2.LINE_AA)
     cv2.imshow('Video', frame)
     cv2.waitKey(1)
-    FRAME_NUM += 1
+
     #Press Esc to exit the window
    if cv2.waitKey(1) & 0xFF == 27:
        break
 #Closes all windows
 cv2.destroyAllWindows()
-
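
Two changes stand out in the example above: the `FERModel` is now built once before the capture loop instead of on every frame, and the label backdrop is sized with `cv2.getTextSize` so it fits the predicted emotion string. A standalone sketch of that overlay pattern, with a synthetic frame and a hard-coded label standing in for real capture output:

```python
# Standalone sketch of the overlay above; the frame and label are stand-ins
# for a captured webcam frame and a FERModel prediction.
import cv2
import numpy as np

frame = np.zeros((360, 640, 3), dtype=np.uint8)  # placeholder frame (360x640 BGR)
label = 'happiness'
fontFace, fontScale, thickness = cv2.FONT_HERSHEY_SIMPLEX, 1, 2

# getTextSize returns ((width, height), baseline); size the backdrop to the text
(text_width, text_height), baseline = cv2.getTextSize(label, fontFace, fontScale, thickness)
cv2.rectangle(frame, (0, 0), (20 + text_width, 50), (0, 0, 0), -1)  # filled black box
cv2.putText(frame, label, (10, 35), fontFace, fontScale,
            (255, 255, 255), thickness, cv2.LINE_AA)
```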

EmoPy/src/fermodel.py

+6 -6

@@ -3,6 +3,7 @@
 from scipy import misc
 import numpy as np
 import json
+from pkg_resources import resource_filename
 
 class FERModel:
     """
@@ -56,7 +57,7 @@ def predict(self, image_file):
         resized_image = cv2.resize(gray_image, self.target_dimensions, interpolation=cv2.INTER_LINEAR)
         final_image = np.array([np.array([resized_image]).reshape(list(self.target_dimensions)+[self.channels])])
         prediction = self.model.predict(final_image)
-        ### Return the dominant expression
+        # Return the dominant expression
         dominant_expression = self._print_prediction(prediction[0])
         return dominant_expression
 
@@ -93,10 +94,10 @@ def _choose_model_from_target_emotions(self):
         sorted_indices = [str(idx) for idx in sorted(model_indices)]
         model_suffix = ''.join(sorted_indices)
         #Modify the path to choose the model file and the emotion map that you want to use
-        model_file = '~/EmoPy/models/conv_model_%s.hdf5' % model_suffix
-        emotion_map_file = '~/EmoPy/models/conv_emotion_map_%s.json' % model_suffix
-        emotion_map = json.loads(open(emotion_map_file).read())
-        return load_model(model_file), emotion_map
+        model_file = 'models/conv_model_%s.hdf5' % model_suffix
+        emotion_map_file = 'models/conv_emotion_map_%s.json' % model_suffix
+        emotion_map = json.loads(open(resource_filename('EmoPy', emotion_map_file)).read())
+        return load_model(resource_filename('EmoPy', model_file)), emotion_map
 
     def _print_prediction(self, prediction):
         normalized_prediction = [x/sum(prediction) for x in prediction]
@@ -110,4 +111,3 @@ def _print_prediction(self, prediction):
         # print('Dominant emotion: %s' % dominant_emotion)
         # print()
         return dominant_emotion
-
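
For reference, the model-selection logic above builds its filename suffix from the sorted indices of the target emotions. A small illustration of just that string logic; the indices here are hypothetical, not EmoPy's actual emotion map:

```python
# Hypothetical indices for illustration only; EmoPy derives the real values
# from its supported-emotion map, so they may differ.
model_indices = {5, 0, 2}
sorted_indices = [str(idx) for idx in sorted(model_indices)]
model_suffix = ''.join(sorted_indices)                   # '025'
model_file = 'models/conv_model_%s.hdf5' % model_suffix  # 'models/conv_model_025.hdf5'
print(model_file)
```

Combined with `resource_filename('EmoPy', model_file)`, the file is now located inside the installed package rather than under a hard-coded `~/EmoPy` checkout.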

README.md

+2 -6

@@ -38,14 +38,10 @@ Predictions ideally perform well on a diversity of datasets, illumination condit
 
 ## Environment Setup
 
-EmoPy runs using Python 3.6, theoretically on any Python-compatible OS. We tested EmoPy using Python 3.6.6 on OSX. You can install [Python 3.6.6](https://www.python.org/downloads/release/python-366/) from the Python website.
-
-Please note that this is not the most current version of Python, but the TensorFlow package doesn't work with Python 3.7 yet, so EmoPy cannot run with Python 3.7.
+EmoPy runs using Python 3.6 and up, theoretically on any Python-compatible OS. We tested EmoPy using Python 3.6.6 on OSX. You can install [Python 3.6.6](https://www.python.org/downloads/release/python-366/) from the Python website.
 
 Python is compatible with multiple operating systems. If you would like to use EmoPy on another OS, please convert these instructions to match your target environment. Let us know how you get on, and we will try to support you and share you results.
 
-
-
 If you do not have Homebrew installed run this command to install:
 
 ```
@@ -67,7 +63,7 @@ Create and activate the virtual environment. Run:
 ```
 python3.6 -m venv venv
 ```
-where the second `venv` is the name of your virtual environment. To activate, run from the same directory:
+where the second `venv` is the name of your virtual environment. To activate, run from the same directory:
 ```
 source venv/bin/activate
 ```

setup.py

+2 -2

@@ -5,7 +5,7 @@
 
 setuptools.setup(
     name="EmoPy",
-    version="0.0.4",
+    version="0.0.5",
     author="ThoughtWorks Arts",
     author_email="[email protected]",
     description="A deep neural net toolkit for emotion analysis via Facial Expression Recognition (FER)",
@@ -18,7 +18,7 @@
         "Programming Language :: Python :: 3.6",
         "Operating System :: MacOS :: MacOS X"
     ],
-    python_requires='>=3.6.3,<3.7',
+    python_requires='>=3.6.3',
     install_requires=[
         'keras>=2.2.0',
         'lasagne',
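
The loosened `python_requires` specifier is what permits installation on Python 3.7 and later. A quick sketch of the difference using the `packaging` library, which pip relies on for this check; the probed version string is illustrative:

```python
# Sketch: compare the old and new Requires-Python specifiers; '3.7.0' is illustrative.
from packaging.specifiers import SpecifierSet

old_spec = SpecifierSet('>=3.6.3,<3.7')
new_spec = SpecifierSet('>=3.6.3')
print('3.7.0' in old_spec)  # False: the old metadata blocked Python 3.7
print('3.7.0' in new_spec)  # True: the new metadata allows it
```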
