DeepFace not detecting emotions correctly

Hi,

I’m trying to detect emotions using the DeepFace Python library.

However, I’m facing an issue where “angry” is always detected as the dominant emotion, regardless of whether I use my webcam or a static image.

I suspect the problem lies in how I’m converting the image for analysis. Below is my Python code:

# me - this DAT
# scriptOp - the OP which is cooking

from deepface import DeepFace
import cv2
import numpy as np

# press 'Setup Parameters' in the OP to call this function to re-create the parameters.
def onSetupParameters(scriptOp):
	return

# called whenever custom pulse parameter is pushed
def onPulse(par):
	return


def onCook(scriptOp):
	# grab the current frame from the null1 TOP as a NumPy array
	img = op('null1').numpyArray(delayed=True)

	# available actions: ['age', 'gender', 'race', 'emotion']
	analyze = DeepFace.analyze(img_path=img, enforce_detection=False, actions=['emotion'])
	print(analyze)

	print("###################################################")

	# pass the frame through to the Script TOP's output unchanged
	scriptOp.copyNumpyArray(img)

	return
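
My guess is that the array coming out of null1 isn’t in the format DeepFace expects: as far as I can tell, a TOP’s numpyArray() returns 32-bit float RGBA values in the 0–1 range with the rows stored bottom-up, while DeepFace wants an 8-bit BGR image like OpenCV produces. A conversion along these lines at the top of onCook is what I had in mind, but I haven’t verified it:

	img = op('null1').numpyArray(delayed=True)
	img = (img * 255).astype(np.uint8)           # guess: scale 0-1 floats to 0-255 ints
	img = cv2.cvtColor(img, cv2.COLOR_RGBA2BGR)  # guess: RGBA -> BGR channel order
	img = np.flipud(img)                         # guess: TOP rows are stored bottom-up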

I’ve also attached an image showing the node setup I’m using to analyze my webcam feed.

Does anyone have any ideas on what might be causing this issue? Any suggestions would be greatly appreciated!

Thanks in advance.

Best,
Jordi

Hi @Garreta11,

Do you see the same behavior when running the code in a plain Python file with an external Python interpreter? That would help determine whether the issue lies with TouchDesigner or with the library itself.
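
For a quick test outside of TouchDesigner, something along these lines should do (this assumes your webcam is at index 0 and you have opencv-python installed):

from deepface import DeepFace
import cv2

# grab a single frame from the default webcam
cap = cv2.VideoCapture(0)
ret, frame = cap.read()
cap.release()

if ret:
	# frame is already a uint8 BGR array, which is what DeepFace expects
	result = DeepFace.analyze(img_path=frame, enforce_detection=False, actions=['emotion'])
	print(result)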

cheers
Markus

Hi @snaut,

Thanks for your reply. Yes, when I run the same code in Google Colab it works perfectly, so I guess the issue is with TouchDesigner. Any ideas what it could be?
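
One thing I can still try is printing what the Script TOP actually hands me, to compare it against the frames the code sees in Colab:

	print(img.dtype, img.shape, img.min(), img.max())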

Thanks