Evan Soltas and Seth Stephens-Davidowitz write: Hours after the massacre in San Bernardino, Calif., on Dec. 2, and minutes after the media first reported that at least one of the shooters had a Muslim-sounding name, a disturbing number of Californians had decided what they wanted to do with Muslims: kill them.
The top Google search in California with the word “Muslims” in it was “kill Muslims.” And the rest of America searched for the phrase “kill Muslims” with about the same frequency that they searched for “martini recipe,” “migraine symptoms” and “Cowboys roster.”
People often have vicious thoughts. Sometimes they share them on Google. Do these thoughts matter?
Yes. Using weekly data from 2004 to 2013, we found a direct correlation between anti-Muslim searches and anti-Muslim hate crimes.
There are about 1,600 searches for “I hate my boss” every month in the United States. In a survey of American workers, half of the respondents said that they had left a job because they hated their boss; there are about 150 million workers in America.
In November, there were about 3,600 searches in the United States for “I hate Muslims” and about 2,400 for “kill Muslims.” We suspect these Islamophobic searches represent a similarly tiny fraction of those who had the same thoughts but didn’t drop them into Google. [Continue reading…]
Oh brother!
Let’s cut some slack for Soltas since he hasn’t graduated yet, but Stephens-Davidowitz dubs himself a data scientist. I guess he’s illustrating the fact that the quality of analysis generally matches the quality of the data.
Don’t get me wrong. I have little doubt that Islamophobia is peaking in the U.S. right now — much to Donald Trump’s advantage and with a lot of his assistance. What is much harder to determine is what qualifies as an Islamophobic search query.
As much as search algorithms have advanced over the last two decades, search engines have yet to perfect the art of mind-reading. The raw material they still work with is words — the thoughts that might lie behind those words remain a mystery.
If someone wants to kill Muslims, they might type “kill Muslims” into Google — although I’m not sure exactly what the query would be meant to solicit.
On the other hand, someone with a basic understanding of how Google works — that it matches queries with documents in which the query terms appear — might do the same search because they want to find out who’s writing about killing Muslims.
In other words, the query “kill Muslims” might be an expression of Islamophobia, or it might be an inquiry into the prevalence of Islamophobia.
This is true even if the query is “I want to kill Muslims,” because Google has no way of distinguishing the intent of the person making the query from that of the authors of the documents it matches to that query.
This is the problem of disambiguation.
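To make the point concrete, here is a minimal sketch of keyword-style retrieval — not Google’s actual algorithm, just a toy matcher with a hypothetical document collection and function name of my own choosing, using the op-ed’s milder “I hate my boss” example:

```python
# A minimal sketch of keyword matching (not Google's algorithm). The
# documents and the keyword_match function are hypothetical, chosen only
# to illustrate the disambiguation problem.

documents = [
    "Survey: half of workers say 'I hate my boss' and quit over it",  # reporting on the sentiment
    "I hate my boss and I am looking for a way out",                  # expressing the sentiment
]

def keyword_match(query, docs):
    """Return every document that contains all of the query's terms."""
    terms = query.lower().split()
    return [doc for doc in docs if all(term in doc.lower() for term in terms)]

# The same query string retrieves both documents. Nothing on the query
# side tells us whether the searcher shares the sentiment or is
# researching it.
print(keyword_match("I hate my boss", documents))
```

Under a scheme like this, whatever disambiguating signal exists lives on the document side, not the query side — which is exactly the gap the op-ed’s inference glosses over.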
Type “kill Muslims” into Google and what you’ll find — apart from references to this op-ed — is much of the hard data on Islamophobia, such as a compilation of reports on hate crimes targeting Muslims occurring across the U.S. just this week.