Google’s claim to “organize the world’s information and make it universally accessible and useful” has earned it an aura of objectivity. Its dominance in search, and the disappearance of most competitors, make its lists of links appear still more canonical. An experimental new interface for Google Search aims to remove that mantle of neutrality.
Search Atlas makes it easy to see how Google offers different responses to the same query on versions of its search engine offered in different parts of the world. The research project reveals how Google’s service can reflect or amplify cultural differences or government preferences—such as whether Beijing’s Tiananmen Square should be seen first as a sunny tourist attraction or the site of a lethal military crackdown on protesters.
Divergent results like that show how the idea of search engines as neutral is a myth, says Rodrigo Ochigame, a PhD student in science, technology, and society at MIT and cocreator of Search Atlas. “Any attempt to quantify relevance necessarily encodes moral and political priorities,” Ochigame says.
Ochigame built Search Atlas with Katherine Ye, a computer science PhD student at Carnegie Mellon University and a research fellow at the nonprofit Center for Arts, Design, and Social Research.
Just like Google’s homepage, the main feature of Search Atlas is a blank box. But instead of returning a single column of results, the site displays three lists of links, from different geographic versions of Google Search selected from the more than 100 the company offers. Search Atlas automatically translates a query to the default languages of each localized edition using Google Translate.
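The fan-out the article describes — one query, many localized editions — can be illustrated with a short sketch. This is not Search Atlas's actual code; the edition list is a hypothetical sample, and a real tool would also translate the query into each edition's default language before searching. The `hl` (interface language) and `gl` (country) parameters are standard Google Search URL parameters.

```python
# Illustrative sketch: build one search URL per localized Google edition
# for the same query. Edition list is a hypothetical sample; a production
# tool would first translate the query into each edition's language.
from urllib.parse import urlencode

EDITIONS = {
    "US": {"domain": "google.com", "hl": "en", "gl": "us"},
    "Japan": {"domain": "google.co.jp", "hl": "ja", "gl": "jp"},
    "Germany": {"domain": "google.de", "hl": "de", "gl": "de"},
}

def localized_search_urls(query: str) -> dict:
    """Return a mapping of region name -> localized search URL."""
    urls = {}
    for region, cfg in EDITIONS.items():
        params = urlencode({"q": query, "hl": cfg["hl"], "gl": cfg["gl"]})
        urls[region] = f"https://{cfg['domain']}/search?{params}"
    return urls

for region, url in localized_search_urls("Tiananmen Square").items():
    print(region, url)
```

Comparing the result pages returned by each URL side by side is, in essence, what the three-column Search Atlas interface does.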
Ochigame and Ye say the design reveals “information borders” created by the way Google’s search technology ranks web pages, presenting different slices of reality to people in different locations or using different languages.
When they used their tool to do an image search on “Tiananmen Square,” the UK and Singaporean versions of Google returned images of tanks and soldiers quashing the 1989 student protests. When the same query was sent to a version of Google tuned for searches from China, which can be accessed by circumventing the country’s Great Firewall, the results showed recent, sunny images of the square, smattered with tourists.
Google’s search engine has been blocked in China since 2010, when the company said it would stop censoring topics the government deemed sensitive, such as the Tiananmen massacre. Search Atlas suggests that the China edition of the company’s search engine can reflect the Chinese government’s preferences all the same. That pattern could result in part from how the corpus of web pages from any language or region would reflect cultural priorities and pressures.
A Google spokesperson said the differences in results were not caused by censorship and that content about the Tiananmen Square massacre is available via Google Search in any language or locale setting. Touristy images win prominence in some cases, the spokesperson said, when the search engine detects an intent to travel, which is more likely for searches made closer to Beijing or typed in Chinese. Searching for Tiananmen Square from Thailand or the US using Google’s Chinese language setting also surfaces recent, clean images of the historic site.
“We localize results to your preferred region and language so you can quickly access the most reliable information,” the spokesperson said. Google users can tune their own results by adjusting their location setting and language.
The Search Atlas collaborators also built maps and visualizations showing how search results can differ around the globe. One shows how searching for images of “God” yields bearded Christian imagery in Europe and the Americas, images of Buddha in some Asian countries, and Arabic script for Allah in the Persian Gulf and northeast Africa. The Google spokesperson said the results reflect how its translation service converts the English term “God” into words with more specific meanings for some languages, such as Allah in Arabic.
Other information borders charted by the researchers don’t map straightforwardly onto national or language boundaries. Results for “how to combat climate change” tend to split island nations from continental countries. In European countries such as Germany, the most common words in Google’s results related to policy measures such as energy conservation and international accords; for island nations such as Mauritius and the Philippines, results were more likely to cite the enormity and immediacy of the threat of a changing climate, or harms such as sea level rise.
Search Atlas was presented last month at the academic conference Designing Interactive Systems; its creators are testing a private beta of the service and considering how to widen access to it.
Search Atlas can’t reveal why different versions of Google portray the world differently. The company’s lucrative ranking systems are closely held, and the company says little about how it tunes results based on geography, language, or a person’s activity.
Whatever the exact reason Google shows—or doesn’t show—particular results, they have a power too easily overlooked, says Search Atlas cocreator Ye. “People ask search engines things they would never ask a person, and the things they happen to see in Google’s results can change their lives,” Ye says. “It could be ‘How do I get an abortion?,’ restaurants near you, or how you vote, or get a vaccine.”
WIRED’s own experiments showed how people in neighboring countries could be steered by Google to very different information on a hot topic. When WIRED queried Search Atlas about the ongoing war in Ethiopia’s Tigray region, Google’s Ethiopia edition pointed to Facebook pages and blogs that criticized Western diplomatic pressure to deescalate the conflict, suggesting that the US and others were trying to weaken Ethiopia. Results for neighboring Kenya, and the US version of Google, more prominently featured explanatory news coverage from sources such as the BBC and The New York Times.
Ochigame and Ye are not the first to point out that search engines aren’t neutral actors. Their project was partly inspired by the work of Safiya Noble, cofounder and codirector of UCLA’s Center for Critical Internet Inquiry. Her 2018 book Algorithms of Oppression explored how Google searches using words such as “Black” or “Hispanic” produced results reflecting and reinforcing societal biases against certain marginalized people.
Noble says the project could provide a way to explain the true nature of search engines to a broader audience. “It’s very difficult to make visible the ways search engines are not democratic,” she says.
Web search can feel like vintage technology, but Noble says spotlighting its intricacies is as important as ever, both because of Google’s dominance and because scrutiny of social media’s distortions can make search look benign by comparison.
Google is unlikely to lose its grip on the search market any time soon, but Noble sees reasons for optimism. The growing user base of the privacy-centric search company DuckDuckGo suggests that some netizens are open to alternatives. Noble sees growing interest from policymakers and the public in regulating tech platforms more tightly, and in finding ways to support alternatives that better serve the public interest. “We have a community of scholars calling for that, in dialog with foundations and governments,” she says.