Study highlights global inequality in suicide prevention services

When users of Google's search engine submit suicide-related queries, they are often presented with information on suicide prevention helplines. But whether such information is actually displayed depends on the user's location and language.

Queries submitted to Internet search engines not only reveal a lot about the individual user's interests, but may also permit inferences about that user's state of health. Some search engines therefore display information pointing to appropriate advisory services, such as emergency hotlines, whenever queries imply that a search might be motivated by the intent to self-harm. However, as LMU researchers Mario Haim and Florian Arendt, in collaboration with their colleague Sebastian Scherr (KU Leuven), have now shown in a paper that appears in the journal New Media & Society, the probability of being shown such information varies widely depending on one's location and, in particular, one's language.

Previous studies have indicated that the incidence of suicide can be reduced if those at risk are informed in a timely manner of the availability of support and advisory services and of effective strategies for dealing with suicidal crises. "It is therefore very welcome that Google, for example, has voluntarily agreed to follow the recommendations of the World Health Organization and displays such information," says Haim. In an earlier investigation, however, the researchers had found evidence that this advisory information was actually being presented in response to only a fraction of relevant queries. In their latest paper, they systematically investigate how often Google's 'suicide-prevention result' (SPR) appears across a set of 11 countries. The authors used a virtual-agent-based methodology to automatically submit relevant queries to Google's search engine, while the location of the query source and the language employed by the virtual agent were varied. Numerous search terms were used, ranging from those suggestive of suicidal intent to those expressing a wish to seek help in coping with such an impulse. The queries were translated by local experts in suicide prevention and, where necessary, complemented by culturally specific expressions and terminology.
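
The paper does not publish its agent code, but the general logic of such an audit can be sketched in a few lines: an automated client submits a query with an explicit interface language and then checks whether the returned page contains a known helpline string. Everything in the sketch below (the `hl` parameter, the example help-seeking queries, and the marker phrases) is an illustrative assumption, not the authors' actual setup; a real study would also control the query location, for example by running agents in the respective countries.

```python
import requests

# Illustrative sketch only: the study's actual agents, query lists, and SPR
# detection rules are not reproduced here. Queries, marker phrases, and the
# use of the "hl" parameter are assumptions for demonstration.

QUERIES = {
    "en": "I need help with suicidal thoughts",   # hypothetical help-seeking query
    "de": "Ich brauche Hilfe bei Suizidgedanken",
}

# Strings whose presence would suggest that Google's suicide-prevention
# result (SPR) was displayed; real helpline names differ by country.
SPR_MARKERS = {
    "en": "Suicide & Crisis Lifeline",
    "de": "Telefonseelsorge",
}

def spr_shown(query: str, lang: str) -> bool:
    """Submit one query in the given interface language and report whether
    the response contains the assumed helpline marker string."""
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": query, "hl": lang},           # hl = interface language
        headers={"Accept-Language": lang,
                 "User-Agent": "Mozilla/5.0"},     # plain HTTP client, no JS
        timeout=10,
    )
    return SPR_MARKERS[lang].lower() in resp.text.lower()

if __name__ == "__main__":
    for lang, query in QUERIES.items():
        print(lang, spr_shown(query, lang))
```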

The results show that information on support services was significantly more likely to be displayed in English-speaking countries than in non-Anglophone parts of the world. In Germany, the incidence of its appearance varied between 8% and 23%, depending on the search term employed; in the USA, the corresponding range was 34% to 94% of all such queries. Moreover, the study uncovered striking differences not only between countries, but also within countries that have more than one officially recognized language. In India, for example, the SPR was shown in response to 91% of queries posed in English, but to only 10% of those posed in Hindi. "These differences cannot be explained by temporal, terminological, or social factors, which is why we attribute them primarily to language," says Haim.

The authors interpret their results as one symptom of a new digital divide. "These data are clearly at variance with the often-proclaimed commitment to providing free health-related information - in our case, advice on how to find help in a suicidal crisis - equally to all Internet users. The availability of such information is in fact restricted, depending on where and in what language the user frames the relevant search queries," says Arendt. "The designers of search engines should acknowledge the resulting global inequality and make serious efforts to minimize it."
