
Microsoft’s Bing is irresponsible with suicide

November is informally known as Movember, a month when men around the world grow mustaches to raise awareness of the disturbing trend of male suicide. From a technological point of view, there is one aspect of this that is often overlooked: the role of now-ubiquitous search engines like Google and Bing, and whether their parent companies could do more.

To give credit where credit is due, Microsoft has done some work here. If you type “How to kill -”, the autocomplete stops there. After a search for “How to kill yourself,” Microsoft displays the number of the UK Samaritans hotline, urging users to call for help. Searches for “suicide” or “how to kill yourself” flood the searcher with resources aimed at getting them help. All of this is good and deserves fair applause.

However, despite all of this, Microsoft’s Bing search does not do enough in this area, and often surfaces harmful results, especially when compared to its main competitor, Google.

As an example, in a Bing search for “How to commit suicide,” three of the top five results either encourage suicide or provide a guide to methods, while the other two are message boards. Clicking through confirms that those results may be what the searcher wants, but they are not what Microsoft presumably wants, and not what the searcher needs. The equivalent Google search, by contrast, returns an NHS webpage as its very first result, a self-help link second, then a few questionable results followed by more help resources on the first page.

More worrying than the results themselves, Microsoft also offers alternative searches, just in case your first few results weren’t helpful enough. Bing seems to be asking the user: did “How to kill yourself” not give you the results you wanted? How about “How to poison yourself,” “painless ways to kill yourself,” or “the best way to hang yourself”? Google, once again, offers no such alternative suggestions.

To illustrate how this undermines its earlier efforts: if you search for “how to kill yourself,” Microsoft helpfully surfaces its resources page, but at the top right of the screen Bing offers alternative searches such as “Best tablets for suicide” and “pills to take for suicide.” If you make a typo or a slight misspelling, Microsoft oddly offers none of the helpful resources, even while acknowledging that the search term is most likely misspelled, whereas Google still does. There are more examples I could give, but the point should be clear: Microsoft is still failing to do the extra work needed to protect vulnerable users of its Bing search engine.

Search engines are powerful these days, and they can be, and have been, manipulated to push certain results above others. Microsoft, Google, Apple, and the rest know and understand this power, which is why they have instituted safeguards on certain kinds of results, especially those relating to suicide and self-harm.

The Samaritans, the UK’s largest suicide prevention charity, shared some thoughts on suicide and the online world in 2013.

We believe that search engine providers have a corporate social responsibility to ensure that credible sources of support are promoted when people enter suicide-related terms into search engines.

There is also more that could be done to reduce unnecessary “auto-complete” functions on partially entered search terms.

We have good relationships with some search engine providers, but there is more that both they and we can do on this issue. There are also some who do not really engage with the question.

While Microsoft has added its hotline to suicide-related searches as noted above, its provision of alternative search terms arguably amounts to exactly the unnecessary autocomplete functionality the charity calls out above.

There is a discussion to be had about censorship, but in this case it is surely misplaced. Freedom of speech is not, especially here, a suicide pact. Search engines can, of course, still surface the offending results, but promoting them on the first page is deeply unnecessary. Microsoft should not be promoting methods of suicide or suggesting alternative searches for “better” results. Its algorithms can be tweaked to promote more appropriate results, and the company should think harder about how its results actually affect people.

Search engine results may seem an insignificant and easily bypassed barrier, but research has shown that suicide is often highly impulsive and can in many cases be prevented by even modest obstacles. A New York Times article on installing anti-suicide barriers at the Golden Gate Bridge (which you should read) cites academic research supporting this.

Dr. Blaustein said: “The most common myth to explode is that people will go elsewhere.”

In a 1978 study, “Where Are They Now?”, Richard H. Seiden, a former professor at the School of Public Health at the University of California, Berkeley, examined whether a person prevented from attempting suicide in one place would simply go elsewhere. He studied people who had been stopped from jumping from the Golden Gate Bridge between 1937 and 1971 and found that, as of 1978, over 90% were either still alive or had died of natural causes.

Microsoft’s barriers need strengthening. As Satya Nadella noted earlier this year, technology must be built with empathy. It is a good lesson, and one that should be applied here.