My +100 Aggression Face Looks Like Taking a Shit
November 16, 2008 4:31 PM
Is anyone doing real-time emotion detection in video games?
I have been playing a lot of Fallout 3, and Fable 2 and Mass Effect before that, and I think that a lot of these branching NPC interactions could be enhanced by real-time emotion detection.
I know of a few precursors to this idea, such as the EyeToy, but that's sort of like using the camera consciously as an input device. Is anybody in video games doing anything with emotion recognition, either from facial expression or voice input? If not, why not?
Not that I know of. We'd need reasonably common devices to track it, and I don't think the Eyetoy, etc. are there yet.
Some developers track emotional response during playtesting -- it generally falls under "usability testing." See Nicole Lazzaro's research in particular. Microsoft Game Studios User Research might also have some good stuff (here's an overview of their work from a few years ago).
posted by liet at 4:55 PM on November 16, 2008
Best answer: Oh yeah. Check out Rene Weber's work. But a Google Scholar search for emotion and "video game" will come up with a lot of results.
posted by k8t at 5:20 PM on November 16, 2008
I don't think facial expression or voice would be a valid method of determining emotion. I can tell you that, all through Fallout 3, my face was screwed up tight, teeth a-gritting, growling and all squinty-eyed, and I was happy as a clam.
posted by Cat Pie Hurts at 5:28 PM on November 16, 2008
"I think that a lot of these branching NPC interactions could be enhanced by real-time emotion detection."
I love gadgetry, but, no. You're pretty much outright wrong. Detecting the player's emotional state doesn't give valid information about what the player wants her character's emotional state to be. The protagonist's story is not the same thing as the player's story.
As a player, it's entirely possible I'm going to be either beaming with intense pride or scowling in deep thought while I somberly inform the prince that I'm terribly sad to report the complete loss of his cargo to mysterious bandits and lucky to have escaped with my life. What does my face say about what the character's story should be?
posted by majick at 5:32 PM on November 16, 2008 [3 favorites]
Not so much console video games, but I'd look into combat/conflict simulations. I'm pretty sure I read an article about police training sims using voice detection to branch their AI. If you sound mean you get shot, if you sound like a wuss you get shot, if you're authoritative you get to go on to the next level.
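A toy sketch of that kind of branching, just to make it concrete -- the loudness/variability heuristics, thresholds, and function names here are invented for illustration, not taken from any real training sim:

```python
import numpy as np

# Hypothetical example only: classify the player's vocal delivery with crude
# acoustic proxies and branch the scenario the way the comment describes.

def classify_tone(samples: np.ndarray) -> str:
    """Quiet = timid, loud + erratic = aggressive, loud + steady = authoritative.
    `samples` is a mono audio buffer with values in [-1, 1]."""
    loudness = np.sqrt(np.mean(samples ** 2))   # RMS level of the clip
    variability = np.std(np.abs(samples))       # how erratic the delivery is
    if loudness < 0.05:
        return "timid"
    return "aggressive" if variability > 0.15 else "authoritative"

def next_scene(tone: str) -> str:
    # The branch table from the comment: mean or wussy gets you shot,
    # authoritative moves the scenario forward.
    return {"aggressive": "suspect_opens_fire",
            "timid": "suspect_opens_fire",
            "authoritative": "suspect_complies"}[tone]

# Pretend one second of microphone audio at 16 kHz.
mic_buffer = np.random.uniform(-0.3, 0.3, 16000)
print(next_scene(classify_tone(mic_buffer)))
```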
posted by wavering at 6:23 PM on November 16, 2008
You might look into affect detection for educational technology. I've read about academics using galvanic skin response sensors integrated into the mouse or other peripherals to gauge emotions. There's also the Affective Computing Group at MIT, which does a lot of cool research.
Keep in mind that while heart rate, skin conductance, etc. as indicators of emotional state (or at least arousal) are pretty uniform across people in general, a lot of other nuances are person-specific -- I read one paper on detecting emotional state from full-body mocap (ACM membership required, sorry!), and it notes that a classifier trained for a specific individual is relatively accurate for that individual over time, but very inaccurate for other users.
Anyway, point is, this stuff is still in the lab, and as majick points out, player emotion doesn't necessarily match avatar emotion.
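For a concrete (and entirely made-up) illustration of why the person-specific part matters: you'd calibrate against each player's own resting baseline rather than trusting absolute readings.

```python
import numpy as np

# Illustration only: absolute skin-conductance values differ a lot between
# individuals, so normalize against each player's own resting baseline before
# calling anything "aroused". The numbers and threshold are invented.

class ArousalEstimator:
    def __init__(self, baseline_samples: np.ndarray):
        # Record a resting baseline for THIS player (e.g. during the menu screen).
        self.mean = baseline_samples.mean()
        self.std = baseline_samples.std() + 1e-9

    def is_aroused(self, reading: float, z_threshold: float = 2.0) -> bool:
        # A reading well above the player's own baseline counts as elevated
        # arousal -- for this player only.
        return (reading - self.mean) / self.std > z_threshold

# Player A runs "hot" at rest, player B runs "cool"; the same raw reading
# means different things for each of them.
player_a = ArousalEstimator(np.random.normal(12.0, 0.5, 300))
player_b = ArousalEstimator(np.random.normal(4.0, 0.5, 300))
print(player_a.is_aroused(12.5), player_b.is_aroused(12.5))   # likely False, True
```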
posted by Alterscape at 7:15 PM on November 16, 2008
It doesn't involve facial or vocal recognition, but Brainball (or Mindball, if you want to buy it) relies on brain activity (or lack thereof) to determine who is "winning" the game.
posted by Remy at 7:36 PM on November 16, 2008
Take a look at Emotiv Systems. I was just reading this blurb in SEED last night (second item).
Quick summary: they've developed an EEG device that does two things: (1) maps brain states to operations (think "jump" to make the character jump) and (2) adjusts the game based on the user's level of interest. If you're bored, it gets harder. If you get excited, the music might change to match.
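The second half is basically a difficulty/mood director driven by an engagement score. A rough sketch, assuming a hypothetical headset feed that reports engagement on a 0-1 scale (the thresholds and knob names are invented):

```python
import random

# Sketch of the "adjust the game to the player's state" half of the idea.
# read_engagement() stands in for whatever score a headset would report.

def read_engagement() -> float:
    return random.random()          # pretend this came from the EEG device

class DirectorAI:
    def __init__(self):
        self.difficulty = 1.0
        self.music = "ambient"

    def update(self, engagement: float):
        if engagement < 0.3:        # bored: crank things up
            self.difficulty = min(self.difficulty + 0.1, 3.0)
        elif engagement > 0.8:      # excited: match the mood with the soundtrack
            self.music = "intense"
        else:
            self.music = "ambient"

director = DirectorAI()
for _ in range(10):                 # say, one check every few seconds of play
    director.update(read_engagement())
print(director.difficulty, director.music)
```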
posted by natabat at 9:27 AM on November 17, 2008
You're probably best off using a device that detects Galvanic Skin Response (GSR), à la a lie detector test - two little patches that you put on your skin to measure electrical conductance, which increases when you sweat. While you're at it you could include a heart-rate monitor like at the gym - just two metal pads that can read electrical activity in the body.
It would be kind of cool if these things were integrated into the game controller & this information was used in the game - such as having to keep your heart rate down during an interrogation, or knowing when to pop a zombie out at you.
Existing studies require way too much other equipment, from sticking a wire in the brain to EEG machines to powerful computers that can read facial expressions.
Note that GSR and heart rate are one-dimensional readings, but I'm sure if you're really good at making video games, you could make an educated guess about what emotion the player was feeling at the time.
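Something like this, say - the sensor-reading functions, thresholds, and event names are all hypothetical, just to show how a couple of one-dimensional readings could drive gameplay:

```python
import random

# Hypothetical sketch of the controller-sensor idea: two one-dimensional
# readings (heart rate, skin conductance) gating game events.

def read_heart_rate() -> int:
    return random.randint(55, 140)      # stand-in for the controller's metal pads

def read_gsr() -> float:
    return random.uniform(0.0, 1.0)     # 0 = dry palms, 1 = sweating buckets

def interrogation_tick(resting_hr: int) -> str:
    # The "keep your heart rate down during an interrogation" mechanic.
    return "suspect_talks" if read_heart_rate() < resting_hr + 20 else "cover_blown"

def should_spawn_zombie() -> bool:
    # Pop the scare when the player has settled down, not mid-panic.
    return read_heart_rate() < 75 and read_gsr() < 0.3

print(interrogation_tick(resting_hr=70), should_spawn_zombie())
```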
posted by Muffy at 10:40 PM on November 17, 2008
If instead the game simply had the ability to enhance gameplay for people with the required hardware, you're still going to run into problems. Why spend the money designing all this extra stuff when only a certain percentage of users will take advantage of it? Instead, designers focus on improving things that will allow the greatest amount of fun for the lowest demographic. This is especially obvious when you look at add-on physics cards and the lack of games that support them.
posted by GhostChe at 4:54 PM on November 16, 2008
This thread is closed to new comments.