What can we learn from the failure of Samaritans Radar?

Can social media be used to support those with mental health issues?


The UK-based charity Samaritans recently released an app designed to address issues related to depression and suicide. Called Samaritans Radar, it seeks to turn your social network into a safety net by using an algorithm to spot when friends are in need of help. Could it play a role in fostering greater social support and community response? Or, worse, could it be used by trolls to find vulnerable potential victims?

Due to concerns about the latter, which culminated in a petition and social media campaign, the app has been shut down.

Aimed primarily at 18-35 year olds, the app scanned tweets that are otherwise public, and was designed to capitalize on the promise of social networks: fostering social support, especially for those in need. Because so many people live their lives online, there is a growing sentiment that if there is an opportunity to intervene, everything should be done to ensure that help can be made available.

Dr. Sue Johnson describes this as an “enormous tsunami of loneliness, it is like a chant, you hear people talking again and again about how lonely they are.”

The app worked by using an algorithm that scanned your friends’ Twitter feeds for key phrases and keywords, such as “depressed”, “help me”, “hate myself”, and “need someone to talk to”. Unfortunately, the algorithm could not distinguish jokes from serious posts, let alone sarcasm or irony. When the app detected that one of your friends had used one of these phrases or words, it sent you an email (and a notification), so that even if you were not on Twitter at the time you would still be alerted. Samaritans did not intervene; the idea was that you, as a friend (or follower), would intervene, using your best judgement in doing so.
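To make that limitation concrete, here is a minimal, purely illustrative sketch of the kind of keyword matching described above. It is not Samaritans’ actual implementation; the phrase list, function name, and sample tweets are all assumptions for the sake of the example.

```python
# Illustrative sketch only: a naive keyword matcher of the kind described
# above, not Samaritans' actual code. Phrase list and names are assumptions.
KEY_PHRASES = ["depressed", "help me", "hate myself", "need someone to talk to"]

def flag_tweet(text: str) -> list[str]:
    """Return the key phrases found in a tweet, ignoring case."""
    lowered = text.lower()
    return [phrase for phrase in KEY_PHRASES if phrase in lowered]

# A simple substring match cannot tell a joke or sarcastic remark from a
# genuine call for help, which is exactly the limitation noted above.
for tweet in ["I'm so depressed the coffee machine is broken lol",
              "I really need someone to talk to right now"]:
    matches = flag_tweet(tweet)
    if matches:
        print(f"Would notify followers: {tweet!r} matched {matches}")
```

Both tweets trigger a notification here, even though only one of them plausibly calls for intervention, which is why the judgement is left to the human recipient.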

The problem, however, is that anyone could have used this app to identify people who are vulnerable. This is why there was such a strong reaction against Samaritans Radar, including a petition calling on Twitter to block the service altogether.

The primary issue is that it could put vulnerable people at even greater risk by making them visible and available to potential harassers and trolls. Since the app is based on surveillance, the concern is that it changes the dynamics of potential support on Twitter, increasing the anxiety that if you were to post something asking for help, others may be monitoring, see it, and react in an inappropriate manner.

Many people posting to the #SamaritansRadar hashtag pledged to leave Twitter in response, or to lock their accounts as private to protect themselves. The response from Samaritans also received considerable criticism, as the charity initially insisted the app should remain available.

The concern is that Samaritans, a large organization that runs a hotline, is not sensitive enough to the diverse and delicate cultures that exist on social networks like Twitter. Encouraging this kind of system-wide surveillance is insensitive given the politicization of surveillance in our era, if not outright unethical.

While there is considerable potential to transform social media into social support, it cannot be done through automation or surveillance; it needs to be cultivated from the bottom up, within each circle of friends, or for each person. On a basic level, then, an app like this has to be opt-in and consensual, not something that monitors everyone by default (even if that is technically possible).

This opt-in should be an active part of the subject’s relationship to the app, but also of their relationship to their network: the opt-in should also involve the people who are willing to offer support. It shouldn’t be a basic notification service, but something that helps everyone involved understand the dynamics of depression, broadens their awareness of mental health issues, and helps them connect with professionals who do this for a living.
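As a rough illustration of what such mutual opt-in could look like, here is a hypothetical sketch; every name and field in it is an assumption for the sake of argument, not a description of any existing system.

```python
# Hypothetical sketch of mutual opt-in, as described above; all names and
# fields are assumptions, not part of any real service.
from dataclasses import dataclass, field

@dataclass
class SupportCircle:
    """A person's explicit consent plus the supporters who agreed to help."""
    subject: str                              # the person who opted in
    supporters: set[str] = field(default_factory=set)

    def add_supporter(self, handle: str, accepted: bool) -> None:
        # Both sides must agree: the subject invites, the supporter accepts.
        if accepted:
            self.supporters.add(handle)

    def who_to_notify(self) -> set[str]:
        # Only consenting supporters are ever notified;
        # nobody is monitored or alerted by default.
        return self.supporters
```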

As a society we are finally starting to appreciate and understand how severe and widespread mental health issues are. We should also recognize that there are no quick fixes, and that while social media can be transformed into social support, doing so has to be delicate, smart, and effective.

As Alice Marwick, director of the McGannon Center at Fordham University, reminds us, “the human desire to connect isn’t different, it’s just that the way that desire manifests itself is completely different.”