Wilson and Brekke’s justly famous article also contains an eye-opening survey of the empirics of “mental correction,” better known at GMU as overcoming bias. While I’m sure the sub-field has advanced since 1994, it’s amazing how much was already known at the time:
A number of studies have attempted to reduce biases in information processing and judgment by forewarning people about or drawing their attention to potentially biasing information and examining the extent to which they are able to avoid the unwanted effects of this information… In general, these studies have revealed a wide range of seemingly contradictory effects. Some studies have shown that increasing people’s awareness eliminates mental contamination; some have found that awareness causes people to adjust insufficiently, leading to undercorrection; some have indicated that awareness causes people to adjust their responses too much, leading to overcorrection; and some have shown that awareness does not cause people to adjust their responses at all.
One case they discuss in detail is efforts to correct for priming effects.
Consider, for example, the classic priming effect, whereby people’s judgments shift in the direction of the primed category (e.g., if the category of “kindness” is accessible, people typically rate another person as more kind than they normally would; Higgins et al., 1977; Srull & Wyer, 1989). Recent studies have shown that making people aware that the category has been primed by an arbitrary event causes them to adjust their responses (Lombardi, Higgins, & Bargh, 1987; Martin, 1986; Martin, Seta, & Crelia, 1990). Interestingly, however, increasing awareness does not make the priming effect disappear; it often reverses, resulting in a contrast effect (Lombardi et al., 1987; Martin, 1986; Martin et al., 1990). For example, if people realize that kind thoughts are accessible for arbitrary reasons, they end up rating the target person as less kind than they normally would.
If Wilson and Brekke were economists, they’d probably be more inclined to treat a mixture of undercorrection and overcorrection as evidence in favor of human rationality. Either way, it’s fascinating to discover that this research not only exists, but has been sitting on the shelf for decades. Why didn’t I hear about this in grad school?
READER COMMENTS
jc
Dec 28 2010 at 1:09am
An excerpt from the comments section of Arnold’s “recognized but marginalized” post…(Bryan, you seem to have a lot of ‘books to write’ already in the queue. For me, this would be an interesting one you might consider adding.)
fundamentalist
Dec 28 2010 at 9:35am
Research in public relations shows that people differ in how they accept new knowledge. Some depend upon an authority, while others think for themselves. Those who depend on authority must get the new information from an authority they respect, or they won’t process it. Those who think for themselves must be able to see how the new information fits with what they already know.
BTW, depending upon an authority to filter one’s information is perfectly rational. It’s the old division of labor thing. Most people don’t have the time or the inclination to become experts in everything and it would be foolish to try. So they become experts at some things and rely on experts for other things.
Christian Galgano
Dec 29 2010 at 12:50am
I recommend the whole video, but check out 11:45 onward for what’s almost the avant-garde of confirmation bias and moral psychology: http://www.edge.org/3rd_culture/morality10/morality.haidt.html#haidt-video
Within the next two weeks, Professor Haidt will be sending you the final paper I wrote synthesizing the state of confirmation bias and moral psychology with TMORTV.
I read your exchange with Professor Haidt from the spring, and he has updated his theory of morality since then (partly with the aid of his bet…go Haidt/Caplan).
The video may also answer why you didn’t hear about Wilson (UVA) and Brekke at Ptown.
–Christian, psych/econ major at UVA