Google has a moral obligation to hide offensive autocomplete search suggestions


If a query becomes popular enough on Google, it will show up as an autocomplete suggestion after you type the first words. For instance, if you write "what's my" one of the things that Google will propose is "what's my IP". That's to help you find what you are looking for more quickly. But there's a dark side to it: if left alone, it can expose you to some pretty offensive searches.

A case in point is "are Jews evil", which my colleague Mark Wilson wrote about earlier. Yes, enough users searched for those exact terms that the phrase showed up as an autocomplete suggestion -- until Google decided to do something about it. Mark strongly believes that removing it was wrong, but his arguments are childish. Why? Well, because if Google does nothing, your young children could also see "how to rape a woman" or "how to murder your mother" as autocomplete suggestions after typing "how to" in Google, just because some people wanted to make those queries popular. Think about it, and I mean really think about it, and let me know if that is something you would like to see happen. Could you live with it if, for instance, your easily influenced six-year-old stabs someone as a result? Scary thought, isn't it?

Yes, if people want to search for specific things, they will still be able to do so and see the corresponding results. But they should not be nudged toward them. If, for instance, you are not interested in how to make a bomb, maybe you should not see it as a suggestion after typing "how to make a" when all you want is to learn to prepare carbonara -- especially if you happen to be very upset about something. Or, if you are feeling really depressed, the last thing you need is to see "how to kill yourself" as a suggestion.

Google may not have a legal obligation to intervene, but there is certainly a moral one. Its search engine is not aimed at a niche audience, like a search engine on a porn site is (where people know what to expect beforehand). It is meant to be used by folks of all ages and preferences. And "normal folks", if they can be called that, shouldn't be forcefully exposed to toxic content or thoughts just because others are, or want them to be.

As I said, the only reason these searches appear among the autocomplete suggestions -- and Mark's arguments certainly make this case -- is that some people made them popular. There is no other reason they pop up there. It is not a matter of free speech, nor of simple curiosity.

Personally, I am all for allowing people to speak their minds and allowing others to hear them, but when some people's decision to look into homemade poisons for research can directly or indirectly influence someone to kill all of their coworkers during their lunch break, that is when I am all for Google stepping in.

The Internet can be a nasty place, but that does not mean it should be one, as Mark suggests. There is a difference. No one in their right mind can expect people to think twice before typing a couple of words into Google, and, really, people shouldn't have to. There are things that are simply not relevant to most others, like the poison example above, and many place their faith in a company to weed that kind of content out. I am fine with that, and so should you be.

Photo credit: Rawpixel.com / Shutterstock

