Post by Optimus on Jan 6, 2017 2:26:04 GMT -5
I've long felt there were significant problems in implicit bias research, mostly because many of the claims made by the main researchers in the field (and the science-illiterate humanities professors who lap it up like Kool-Aid) based on the IAT are not strongly supported by the actual evidence. The main reason, in my opinion (and the opinion of a quickly growing number of researchers), is that the Implicit Association Test (IAT; the main tool used to measure "implicit bias") is about as valid and reliable as a Myers-Briggs personality test a teenager might take on Buzzfeed's Facebook page.
That's not at all to say that people don't have implicit biases of some sort. We all have innumerable biases relating to a variety of issues (race, health, employment, social relationships, etc.). Some are strong, some are weak. I'm just not convinced that any of them truly lie below our conscious awareness (some people just don't like to admit their biases and pretend they don't exist). But even if they are truly "unconscious," that doesn't mean they have much of an effect on our overt behavior. Racists gonna be racist. Haters gonna hate. Etc. etc.
There just isn't any convincing evidence showing that implicit biases lead to consistent explicit behavior, despite the claims of the two guys who created the IAT (of course they're going to defend their baby to the death and disparage any criticism of it; they've pretty much attacked anyone who's questioned its validity over the past 20 years). Some of the conclusions/claims made by implicit bias research proponents have been questionable, and the growing perception in the field over the past few years is that the IAT is likely mostly crap (for various reasons), which casts doubt on the entire research area.
An overview of the criticism of the IAT here: en.wikipedia.org/wiki/Implicit-association_test#Criticism_and_controversy
Again, that's not to say we don't have implicit biases. There's a lot of research on the topic, and some of it likely has merit. But a lot of it doesn't, and it's still credulously accepted as "true" even though it rests on theoretically shaky ground. Indeed, some things that are automatically interpreted as "bias" might actually have alternate, better explanations.
Anyway, given that there's been a lot of talk about implicit bias in the political arena lately (Hillary Clinton brought it up several times toward the end of the election... the cynic in me felt it was a ploy for votes), I thought this article and research were pretty timely and interesting:
But what he calls the "very weak overall" connection between implicit bias and discriminatory behavior should, he believes, put researchers on notice. "You would think that if you change the associations, and the associations predict behavior, then the behavior would change too," Nosek says. "But the evidence is really limited on it."
Patrick Forscher, who shares the title of first author of the paper with Calvin Lai, a Harvard postdoc, thinks that there’s been pressure on researchers over the years to make the science of implicit bias sound more definitive and relevant than the evidence justifies. "A lot of people want to know, How do we tackle these disparities?" says Forscher, a postdoc at the University of Wisconsin at Madison. "It makes us feel important to say, Aha, we have these measures that can tell us what the problem is, and, not only that, we can tell them how to fix the problem."
That’s essentially Blanton’s argument as well. Public discussion about implicit bias has been based largely on the results from one particular test, and that test, in his view, has been falsely sold as solid science. "They have engaged the public in a way that has wrapped the feeling of science and weight around a lot of ‘cans’ and ‘maybes,’" Blanton says. "Most of your score on this test is noise, and what signal there is, we don’t know what it is or what it means."
Full article here: www.chronicle.com/article/Can-We-Really-Measure-Implicit/238807