by phil on Monday Jun 16, 2003 2:19 PM
artificial intelligence, emotions, facial expressiveness
I was reading that they were able to get a computer to detect facial expressions and identify emotions in humans. Well, maybe this is the missing link in the AI chain. What they could do is hook up the emotion reader to a subject who interacts with a text-bot. Then it could read the emotions as the conversation progresses, learning the patterns between certain conversational directions or sentences and the emotions they evoke. Once the computer is trained to connect conversational bits with emotions, it could then try to create an emotion in the other person, to push them toward positive and happy responses. Bam, the artificial counselor. The program could also learn to associate the emotion it should feel as a counselor with how the conversation is going, and then its goal would be to improve its own emotion (which could be tied, partially, to what the other person is feeling). Heck, I bet there's some psychological map of two-person scenarios that could be programmed in there.
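To make the idea concrete, here's a minimal sketch of that feedback loop in Python. Everything in it is made up for illustration: the conversational "moves", the `EmotionTrainer` class, and the simulated emotion reader standing in for the real facial-expression system. It just treats the problem as a bandit: try moves, score the human's reaction, and drift toward whatever evokes positive emotion.

```python
import random
from collections import defaultdict

class EmotionTrainer:
    """Hypothetical bot that learns which conversational moves
    evoke positive emotion, using simple epsilon-greedy tallies."""

    def __init__(self, moves):
        self.moves = list(moves)
        self.totals = defaultdict(float)   # summed emotion scores per move
        self.counts = defaultdict(int)     # times each move was tried
        self.epsilon = 0.1                 # chance of exploring a random move

    def pick_move(self):
        # Try every move at least once, then mostly pick the move with
        # the best average emotion score, occasionally exploring.
        untried = [m for m in self.moves if self.counts[m] == 0]
        if untried:
            return random.choice(untried)
        if random.random() < self.epsilon:
            return random.choice(self.moves)
        return max(self.moves, key=lambda m: self.totals[m] / self.counts[m])

    def record(self, move, emotion_score):
        # emotion_score: -1.0 (upset) .. +1.0 (happy), as reported
        # by the (here imaginary) facial-expression reader.
        self.totals[move] += emotion_score
        self.counts[move] += 1

def simulated_reader(move):
    # Stand-in for the face-reading hardware: empathy tends to score
    # well, blunt advice poorly, with some noise thrown in.
    base = {"empathize": 0.7, "joke": 0.3, "blunt_advice": -0.4}[move]
    return base + random.uniform(-0.2, 0.2)

if __name__ == "__main__":
    random.seed(0)
    bot = EmotionTrainer(["empathize", "joke", "blunt_advice"])
    for _ in range(200):
        m = bot.pick_move()
        bot.record(m, simulated_reader(m))
    best = max(bot.moves, key=lambda m: bot.totals[m] / bot.counts[m])
    print("learned best move:", best)
```

A real version would of course need actual language generation and an actual emotion classifier instead of three canned moves and a lookup table, but the training loop (act, read the face, update, repeat) is the same shape.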
Eh? Eh? What do you think?