In President Obama’s January 2017 farewell speech, he addressed the danger of retreating into our own bubbles, be they neighborhoods, campuses, houses of worship or social media feeds “surrounded by people who look like us and share the same political outlook and never challenge our assumptions.”
“We become so secure in our bubbles that we accept only information, whether true or not, that fits our opinions, instead of basing our opinions on the evidence that’s out there,” he said.
Nearly two years later, we’re still grappling with the issue. Google chief executive Sundar Pichai prepares to testify before the House Judiciary Committee about alleged search engine bias against conservatives, and the world awaits the results of special counsel Robert Mueller’s investigation into Russian interference in the 2016 election, tied to a troll farm that harnessed user data on Facebook and Twitter to craft targeted, incendiary native content meant to reinforce existing biases and sway the election. Confirmation bias, it turns out, is real.
It gets worse: A new study from privacy-focused search engine DuckDuckGo says Google’s filter bubble is real and that it is employing user data even when users are logged out and searching in incognito mode.
DuckDuckGo found Google delivered unique results to most users even when they searched for identical terms, such as “gun control,” “immigration” or “vaccinations,” at the same time and in private browsing mode.
The incognito searches for “vaccinations” produced the most variation, with 73 unique result sets among 87 participants, followed by “gun control” with 62 and “immigration” with 57.
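The variant counts above can be illustrated with a small sketch. This is not the study’s actual code; the function and the toy data are hypothetical. Each participant’s result page is treated as an ordered list of domains, and two pages count as different variants if the domains or their order differ.

```python
# Hypothetical sketch of the variant-counting step: each participant's result
# page is represented as an ordered tuple of domains, and the number of
# distinct tuples is the number of "variants." Data below is illustrative.

def count_variants(result_pages):
    """Count distinct ordered result sets across participants."""
    return len({tuple(page) for page in result_pages})

# Toy data: four participants searching the same term at the same time.
pages = [
    ["nra.org", "wikipedia.org", "nytimes.com"],
    ["wikipedia.org", "nra.org", "nytimes.com"],  # same domains, new order
    ["nra.org", "wikipedia.org", "nytimes.com"],  # duplicate of the first
    ["wikipedia.org", "nra.org", "cnn.com"],      # different domains
]

print(count_variants(pages))  # 3 variants among 4 participants
```

By this measure, 73 variants among 87 participants means nearly every person saw a different page for the same query.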
The “gun control” search yielded 19 domains ordered 31 different ways. There were 22 different domains in the “vaccinations” search and 15 for “immigration.” DuckDuckGo said the order of links matters because each link receives roughly twice as many clicks as the link below it.
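The “twice as many clicks” claim implies a geometric decay in click share by rank, which a short sketch makes concrete. The 2x ratio comes from the article; the exact click-through rates it produces are a modeling assumption, not measured data.

```python
# Illustrative model of the claim that each result receives roughly twice as
# many clicks as the result below it: click share decays geometrically.

def click_share(rank, num_results=10, ratio=2.0):
    """Estimated share of clicks for a result at a given 1-based rank."""
    weights = [ratio ** -(r - 1) for r in range(1, num_results + 1)]
    return weights[rank - 1] / sum(weights)

for rank in (1, 2, 3):
    print(f"rank {rank}: {click_share(rank):.1%}")
```

Under this model the top result captures about half of all clicks on a ten-result page, which is why reordering the same domains can still steer what users read.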
Results in Google’s news and video carousels also displayed different sources for different users.
According to DuckDuckGo, the variations are a result of Google tapping into personal information like users’ search, browsing and purchase history to cater results to what it thinks each user is most likely to click on.
A Google spokesperson called the study’s methodology and conclusions “flawed,” saying they’re “based on the assumption that any difference in search results are based on personalization. That is simply not true. In fact, there are a number of factors that can lead to slight differences, including time and location, which this study doesn’t appear to have controlled for effectively.”
If search results really were anonymous in incognito mode, DuckDuckGo said, queries for the same terms at the same time would be similar, and the different results “could not be explained by changes in location, time, by being logged in to Google or by Google testing algorithm changes to a small subset of users.”
As a result, DuckDuckGo concluded, Google tailors search results regardless of browsing mode, which it said proves the filter bubble is real.
“People should not be lulled into a false sense of security that so-called ‘incognito’ mode makes them anonymous,” DuckDuckGo said. “Unfortunately, this is a common misconception as websites use IP addresses and browser fingerprinting to identify people that are logged out or in private browsing mode.”
And, DuckDuckGo said, this is particularly troublesome when it comes to politics.
“That’s because undecided and inquisitive voters turn to search engines to conduct basic research on candidates and issues in the critical time when they are forming their opinions on them,” the study said. “If they’re getting information that is swayed to one side because of their personal filter bubbles, then this can have a significant effect on political outcomes in aggregate.”
Google insists this is not the case.
“If a user is logged out and searching in Incognito mode, we do not do any personalization based on your signed-in search history, which you control and can access in your Google account at myaccount.google.com,” Google said in an email.
Instead of personalization, the company said, a number of factors could lead to different rankings or results: timing (even a matter of seconds), which is “particularly true for rapidly evolving news topics”; data centers in many locations that are refreshed constantly; and localization.
In addition, it said, “ranking contextualization” in results is limited and is most often applied to clarify an ambiguous query, such as determining whether a search for “Barcelona” refers to the city or the football club.