Suicide Prevention and Everyday Tech


World Suicide Prevention Day took place on 10 September. Suicide is a difficult issue that all mental health products have to address, but the broader conversation around tech and suicide prevention is still very young. Two questions are central: How do we know when someone is dealing with suicidal thoughts? And what's the most helpful thing to do in that situation? We're not the only tech company to grapple with this. Google and Facebook have both recently implemented tools for exactly this, developed (laudably) with extensive input from users and professional mental health organisations. What both have settled on can be summarised as listen, ask, and signpost.

If you search for depression on Google's US portal, you'll be asked if you want to take a questionnaire called the PHQ-9. This is a set of nine questions that medical and psychological professionals have used for nearly two decades to grade the symptoms of depression. Question 9 asks about self-harm and suicidal thoughts. Each answer contributes to an overall score, and the higher the score, the more severe your symptoms of depression are likely to be. Google's tool helpfully explains what your score means, and signposts you to relevant organisations that can help - including charities and helplines that operate in your area. Interestingly, the tool also nudges people to bring up their experiences with their doctor and loved ones.
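To make the scoring concrete, here's a minimal sketch in Python. The severity bands are the standard published ones for the PHQ-9; the function and variable names are our own illustration, not anything taken from Google's tool or our app.

```python
# A minimal sketch of PHQ-9 scoring, assuming the standard published
# scheme: nine answers, each scored 0-3, summed into a 0-27 total.
# Names here are our own, for illustration only.

# Standard severity bands for the PHQ-9 total score.
SEVERITY_BANDS = [
    (0, 4, "minimal"),
    (5, 9, "mild"),
    (10, 14, "moderate"),
    (15, 19, "moderately severe"),
    (20, 27, "severe"),
]

def score_phq9(answers):
    """Score a completed PHQ-9.

    `answers` holds nine responses, each 0-3 (0 = "not at all",
    3 = "nearly every day"). Returns the total score, the severity
    band, and whether question 9 (self-harm and suicidal thoughts)
    was answered with anything other than "not at all".
    """
    if len(answers) != 9 or any(a not in (0, 1, 2, 3) for a in answers):
        raise ValueError("PHQ-9 expects nine answers, each scored 0-3")
    total = sum(answers)
    severity = next(label for low, high, label in SEVERITY_BANDS
                    if low <= total <= high)
    question_9_positive = answers[8] > 0
    return total, severity, question_9_positive

# Example: a moderate score where question 9 calls for signposting.
total, severity, signpost = score_phq9([2, 1, 2, 1, 1, 2, 1, 1, 1])
print(total, severity, signpost)  # 12 moderate True
```

The sketch flags question 9 separately from the total because, in standard clinical guidance, any positive answer there warrants follow-up regardless of the overall score.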

Facebook takes a slightly different approach, more suited to the way people interact with their product. They apply the 'listen and ask' principle to concerned Facebook friends. If you hit the report button on any post - whether it's a feed update, a livestream, or a message in Messenger - one of the options for why you're reporting it is 'suicide or self-injury'. If you select this, Facebook walks you through options for how to help - including contacting the person directly, contacting someone close to them, or getting in touch with a relevant helpline or authority.

There is also a more high-tech approach: using data analysis to look for patterns. For posts (but not Live or Messenger), Facebook has an algorithm that detects some of the same concerning elements people are reporting. Flagged posts are brought to the attention of the community operations team, who can then suggest resources that might help. There has also been a significant amount of research in clinical settings into using AI to pick up subtle patterns of self-harm and suicide risk that doctors may miss. These approaches are, however, in their infancy: they require huge amounts of data to be helpful, and the discussion around the ethics involved is only just starting.
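We don't know the details of Facebook's system, but the general shape of this kind of pattern detection is familiar: train a text classifier on posts that human reviewers previously flagged, then score new posts so the riskiest can be surfaced for human review. Here's a deliberately simplified sketch using scikit-learn; the training data is invented purely for illustration, and a real system would involve far more signals, data, and oversight.

```python
# A deliberately simplified sketch of pattern detection over posts:
# fit a classifier to posts that reviewers previously flagged, then
# score new posts for a human review queue. The examples below are
# invented purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples: 1 = previously flagged by reviewers.
posts = [
    "had a lovely walk in the park today",
    "i can't see any way out of this anymore",
    "new job starts monday, excited!",
    "i just want the pain to stop for good",
]
labels = [0, 1, 0, 1]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Score a new post. In a real system, high scores would be queued for
# human review, never acted on automatically.
risk = model.predict_proba(["i can't see a way out of this"])[0][1]
print(f"risk score: {risk:.2f}")
```

Even in this toy form, the important design choice is visible: the model only prioritises posts for human attention; the judgement and the response stay with people.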

We’ve decided to focus on what we can do well now: listen, ask, and signpost. Like Google, we offer the PHQ-9 within the app. For Tomo users who are struggling with depression, it’s a convenient way of keeping track of their symptoms over time. It also means that if a user wants to tell us they need help beyond what we can offer, we’re listening. And we can direct them towards people who can help.

For confidential support call the Samaritans on 116 123, or visit a local Samaritans branch. See samaritans.org for details.