Garett’s post on the prevalence of sheer malevolence is fascinating, but I’m not convinced. A key fact about experiments is that many participants just want to please the experimenter. Once they sit down in the lab, they start asking, “What are we supposed to do?”
Thus, when an experiment explicitly gives them an option to hurt other players, many subjects take this as a sign that the experimenter wants them to hurt other players. “It’s all part of the game.” If subjects’ only options are to (a) sit still and wait for the experiment to end, or (b) hurt other players, the experimenter is giving them a strong hint that they’re supposed to do (b).
I’d be much more impressed by an experiment showing that subjects spontaneously try to hurt others. Suppose you tell them they can pay some money in order to change others’ endowments. Start with an example where one player pays money to increase others’ endowments. Then see if anyone spontaneously tries to pay money to decrease endowments.
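To make the proposed design concrete, here is a minimal sketch of its choice structure in Python. The payoff numbers, option labels, and the tallying function are hypothetical illustrations of the idea, not details from any actual experiment.

```python
# Minimal sketch of the proposed "spontaneous harm" design.
# All payoff numbers and labels below are hypothetical illustrations.

COST = 1  # hypothetical price a subject pays to change another's endowment

def available_actions(own_endowment):
    """Actions offered without any hint that harming is expected.

    The demonstration shows only the 'increase' example; 'decrease'
    is never suggested, so choosing it must be spontaneous.
    """
    actions = {"do_nothing": (0, 0)}  # (cost to self, change to other)
    if own_endowment >= COST:
        actions["pay_to_increase"] = (COST, +2)
        actions["pay_to_decrease"] = (COST, -2)  # never demonstrated
    return actions

def spontaneous_malevolence_rate(choices):
    """Fraction of subjects who paid to lower another's endowment."""
    harmful = sum(1 for c in choices if c == "pay_to_decrease")
    return harmful / len(choices)

# Example: under the post's prediction, fewer than 5% choose to harm.
choices = ["do_nothing"] * 90 + ["pay_to_increase"] * 7 + ["pay_to_decrease"] * 3
print(spontaneous_malevolence_rate(choices))  # 0.03 < 0.05
```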
I doubt more than 5% of subjects would do so. Many people will deliberately hurt each other if an authority gives them a not-so-subtle hint that he wants them to do so. But few are positively itching to torment a random stranger.
READER COMMENTS
Ken B
Oct 16 2012 at 9:43am
Yes. This is one of the biggest problems with “survey science” and “experimental game theory”. The game people are actually playing is almost never the one the experimenters want them to be playing. They consciously or unconsciously factor in the conditions.
D. F. Linton
Oct 16 2012 at 9:49am
There is also the problem that these surveys/experiments only sample people who are willing to piss away some of their irretrievable lifetime on such tasks.
Drewfus
Oct 16 2012 at 10:37am
I guess experimentees need to be trained to some extent, and not just regarded as “unskilled labor”. Maybe this attitude is just a hangover from less affluent times.
Having said that, it might be the case that scientific standards in general are due for an improvement. What else could you suppose after watching Ben Goldacre’s talk, “What doctors don’t know about the drugs they prescribe”?
Matt Mitchell
Oct 16 2012 at 10:42am
I can think of another reason why this result might not hold outside of the lab: Maybe people just want to test whether the experiment really works the way the experimenter says it does? Most people’s instinct when they sit down with a new object is to take it for a test drive: see if it really works as advertised. It is pretty easy to tell an evolutionary story for why we would want to do this. And it may explain why young kids do mean things like pull wings off of insects. Once they’ve figured out the consequences of their actions, most kids grow out of this behavior. Which suggests a way to test my theory: I’d guess that, when presented with the consequences of their actions and given the chance to do it again, most experimental subjects will grow out of this behavior.
Yancey Ward
Oct 16 2012 at 10:53am
Someone actually gets that the test subjects are nearly as smart as, or smarter than, the experimenters themselves.
Daublin
Oct 16 2012 at 11:18am
I had the same reaction. I think this experiment is a Rorschach test for prior beliefs of the person interpreting the experiment.
Two issues scream out to me:
1. People playing a game will try all the moves just to see how it goes. Everyone involved knows that nobody is really going to get hurt, so why not fool around?
2. People playing a game have an implicit convention of trying to win, and usually it helps to win if the other players do worse.
MikeDC
Oct 16 2012 at 1:05pm
At best, this argument scales the result back from “15% of folks will act out of sheer malevolence” to “5% of folks will act out of sheer malevolence, and the other 10% will act out of malevolence given the slightest perceived approval or consent from authority.”
That doesn’t make me feel any better.
Matt C
Oct 16 2012 at 2:04pm
Don’t disagree that the experiment is biased toward what it’s trying to show (what else is new), but isn’t it obvious that human beings do have an innate taste for cruelty and destruction? Is it really controversial (at Econlog anyway) that “nice” people will often behave nastily and viciously if they know they can get away with it?
We socialize the worst of it away, and punish when socialization fails, but we’re still tribal killer apes under our modern first world clothes.
MikeDC, great comment.
Steve Sailer
Oct 16 2012 at 5:07pm
This is a big part of “Stereotype Threat” experiments. The black students pick up on the fact that the experimenters want them not to work hard on this meaningless test. Why not make the experimenter happy?
Peter Twieg
Oct 16 2012 at 6:21pm
My methodological suspicion of the malevolence literature is that a lot of subjects simply generate noise through random behaviors that aren’t really attributable to these sorts of narratives. 15% seems like a high proportion to explain in this manner, but I’d say that a fair share of that 15% is probably just due to noise-generating participants.
Pandaemoni
Oct 17 2012 at 10:40am
I agree that showing obedience to an authority figure who implicitly or expressly is leading one to harm others is not in and of itself a sign that the individuals inflicting the harm are malevolent. That said, the fact that individuals cede the right to make their own moral judgments to “authority” figures is troubling and makes me strongly question the morality of those who do so.
Combine that with evidence along the lines of the Stanford Prison Experiment, and I would not be confident that only 5% of such people would be spontaneously malevolent. Those people would be malevolent as soon as anything happened to annoy them, unless they feared the consequences of malevolence.
They may not be positively itching to inflict pain at the start, but I wouldn’t trust them to pass on the opportunity in those moments where they felt frustrated. Once they do that, moreover, I suspect that our ability to rationalize our own misconduct (which we are all prone to) would reinforce the behavior (i.e., sure I just hurt that guy, but he deserved it), which would make it more likely they would engage in the misconduct again in future periods.
My guess is that in any game extended over longer periods, even without encouragement, the number of players who would regularly engage in misconduct would be far higher than 5%.
Andy
Oct 23 2012 at 3:59pm
As with surveys asking whether the Sun revolves around the Earth, some percentage of participants in experiments are going to take arbitrary actions just because it seems fun (especially considering most participants are college students).
In addition, the experiment takes away the normal social consequences of certain types of behavior. The same effect explains why rude behavior is much more prevalent online.