Google refuses to take down website responsible for dozens of suicides


You can click here for a list of resources from the Suicide Prevention Resource Center.

There is an extremely disturbing site in the internet ether where users push each other toward suicide, and according to Google, there is nothing it can do to remove the site from its search results.

A stomach-churning deep dive by The New York Times explores the site, which we will not name here. The story raises tough questions about both ethics and censorship, particularly for Google, which has chosen to passively condemn the site while allowing it to remain a top hit in its search results.

Part electronic bulletin board and part macabre instruction manual, the site considers itself "pro-choice," meaning pro people having the choice to die by suicide and access to information on how to do so, alongside a community that will help them do it without judging them or trying to keep them alive.

Over the site's two years of activity, according to the Times' count, at least 45 of its users have died by suicide, and probably many more. Many of them learned how on the site, gained "support" from other users when their resolve to end their own lives faltered, and even blogged their deaths live.

Run by two men in their twenties who live thousands of miles apart, in Alabama and Uruguay, the site sprang up after Reddit shut down a forum with the same mission. Both operators were previously known only by pseudonyms, but were unmasked by the Times.

It should be noted that there is a raging debate about assisted suicide, in which terminally ill people can access treatments to end their lives. That conversation is as passionate as it is fraught, from the Kevorkian machines of the 1990s to the states and countries on the verge of legalizing physician-assisted suicide.

The site highlighted by the Times' investigation, however, should not be part of that debate. Its users are, for the most part, otherwise healthy people for whom the decision to end their lives is almost certainly a grave miscalculation, a distinction the Times makes clear by excluding physician-assisted deaths from its charts documenting the sharp increase in suicides over the past two decades.

Regardless of your personal beliefs about euthanasia and suicide, and, hell, regardless of your beliefs about censorship, the concept of arming a group of sick people with specific information and support to end their own lives is troubling.

In many ways, this site represents the latest in a long line of problematic online material that offers people information about all kinds of horrible things, from pro-anorexia blogs and ineffective COVID-19 treatments to forums for white nationalists and the "involuntary celibates" known as incels.

All of these topics have prompted pressure on tech companies like Google and Facebook to censor particularly gruesome online content.

Often they comply. Facebook, for example, has repeatedly tried, and repeatedly failed, to quickly filter out harmful content.

The situation at Google, and its parent company Alphabet, is more complex. While it has removed medical misinformation and white supremacist content hosted on YouTube, it takes a more hands-off approach to content it merely indexes on the open web, as evidenced by this week's controversy.

“This is a deeply painful and difficult issue, and we continue to focus on how our products can help those in vulnerable situations,” a Google spokesperson told Futurism. “If people go to Google to search for information on suicide, they see features promoting prevention hotlines that can provide essential help and support.”

“We have specialized ranking systems designed to prioritize the highest quality results available for self-harm queries, and we also block autocomplete predictions for those searches,” she continued. “We balance these safeguards with our commitment to giving people open access to information. We are guided by local law when it comes to important and complex questions about what information people should be able to find online.”

It’s a pretty absolutist position. The First Amendment may protect free speech for neo-Nazis and suicide advocates, but tech companies are not governments. They can, in principle, remove whatever they want.

Google’s slogan used to be “Don’t be evil.” It removed that phrase from the company’s code of conduct in 2018, and indeed, there now seems to be a little room for evil in the company’s search results.

Updated to clarify that although Google has removed content hosted on YouTube, it does not usually deindex controversial content on its search engine.

READ MORE: Where the desperate connect and learn ways to die [The New York Times]

Learn more about assisted suicide, which is decidedly not what this site offers: Assisted suicide capsule approved by authorities in Switzerland
