In my last blog I said I'd try restricting my smile search to the mouth and eye areas of the face. I went ahead and did that on the same video I used in my last blog. The restriction worked, but I didn't get hits on some of the video models (and they are all supposed to be smiling). This probably just means that I need more training data in my smile cascade file, haarcascade_smile.xml. However, it could also be caused by fake smiles; that is, some of the models probably aren't really smiling, but are faking it.
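For anyone who wants to try this themselves, here's a rough sketch of how restricting the smile search to face sub-regions can look in OpenCV's Python API. The region proportions and detector parameters (scale factor, minNeighbors) are my own illustrative guesses, not tuned values:

```python
# Sketch: restrict the smile search to sub-regions of each detected face.
# The split proportions below are rough anatomical assumptions, not tuned.

def face_subregions(x, y, w, h):
    """Split a face bounding box (x, y, w, h) into eye and mouth regions.

    Rough proportions: eyes sit in a band across the upper face,
    the mouth in the lower third. Returns two (x, y, w, h) tuples.
    """
    eyes = (x, y + h // 5, w, h // 3)          # band across the upper face
    mouth = (x, y + (2 * h) // 3, w, h // 3)   # lower third of the face
    return eyes, mouth


def detect_smiles(frame):
    """Run the smile cascade only inside the mouth region of each face."""
    import cv2  # imported here so the geometry helper stays dependency-free

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    smile_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_smile.xml")

    hits = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        mx, my, mw, mh = face_subregions(x, y, w, h)[1]
        roi = gray[my:my + mh, mx:mx + mw]
        # The smile cascade tends to need a high minNeighbors
        # value to suppress false positives.
        smiles = smile_cascade.detectMultiScale(roi, 1.7, 20)
        hits.append(len(smiles) > 0)
    return hits
```

Searching a smaller region both cuts false positives (no "smiles" found in hair or background) and speeds up each frame, since the cascade scans fewer pixels.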
Checking the mouth region for smiles is an obvious thing to do, but the eyes might not be as obvious. Simon Baron-Cohen, a noted psychologist who specializes in autism research, created a test in which participants determine what emotion somebody is expressing just by inspecting images of their eyes. You can find it by googling; there are plenty of versions of this test on the Internet. Here's a link to the test. Since I know that eyes can express emotions, I'm testing them for smiles in this video. I need to add more smile data to haarcascade_smile.xml and create more cascade xml files for anger, disgust, fear, sadness, and surprise (detecting smiles is the same as testing for joy). No hits would imply a neutral emotion (feeling nothing). So a complete program will take a while, but at this point it's just a matter of doing the work.
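The decision logic I have in mind can be sketched now, even before the extra cascades exist. In this sketch, every cascade file other than haarcascade_smile.xml is hypothetical and would have to be trained first; the function itself is just "most hits wins, neutral if nothing fires":

```python
# Sketch of the planned decision logic: one trained cascade per emotion.
# Every file here except haarcascade_smile.xml is hypothetical and would
# need to be trained (e.g. with opencv_traincascade) before this works.

EMOTION_CASCADES = {
    "joy": "haarcascade_smile.xml",          # the existing smile cascade
    "anger": "haarcascade_anger.xml",        # hypothetical, to be trained
    "disgust": "haarcascade_disgust.xml",    # hypothetical
    "fear": "haarcascade_fear.xml",          # hypothetical
    "sadness": "haarcascade_sadness.xml",    # hypothetical
    "surprise": "haarcascade_surprise.xml",  # hypothetical
}


def classify_emotion(hit_counts):
    """Pick the emotion whose cascade fired most often; 'neutral' if none.

    hit_counts maps an emotion name to the number of detections its
    cascade returned on the face region.
    """
    best = max(hit_counts, key=hit_counts.get, default=None)
    if best is None or hit_counts[best] == 0:
        return "neutral"
    return best
```

So "neutral" isn't detected directly; it's the fallback when none of the six emotion cascades produce a hit, which matches the "no hits implies feeling nothing" idea above.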