Facebook ramps up its suicide prevention tools and Messenger chat support
Facebook is updating the tools it offers to help prevent suicide. Aimed both at those thinking of suicide and at friends and family concerned about loved ones, the revamped support tools use artificial intelligence and real people to offer help when it is needed most.
This is not the first time we have heard about suicide prevention tools from Facebook, and the latest announcement sees the social network taking its "unique position" even more seriously, seeing it as a way to offer help and a means of intervention. It comes after a number of suicides were streamed via Facebook Live.
Tackling the problem of live-streamed suicides, Facebook says that its suicide prevention tools are now integrated into Facebook Live. This means that viewers will not only have the opportunity to reach out to someone they are worried about, but will also be able to report a video to Facebook more easily. If an alert is raised, the person hosting the Facebook Live session will be shown links to resources they might find helpful.
In a blog post about building a safer community, Facebook shares details of the updates it is introducing:
- Integrated suicide prevention tools to help people in real time on Facebook Live
- Live chat support from crisis support organizations through Messenger
- Streamlined reporting for suicide, assisted by artificial intelligence
As well as providing links to resources to those feeling suicidal, Facebook also encourages those in difficulty to reach out to people they know. Talking about suicide is never easy, and with this in mind, Facebook will offer pre-populated text to make it easier for people to start a conversation. Messenger is being promoted as a great way to reach out for help, and Facebook has teamed up with the likes of Crisis Text Line, the National Eating Disorder Association and the National Suicide Prevention Lifeline to provide real-time support for people.
Noting concerns about the previously available reporting process, Facebook is streamlining it and using AI to identify potentially worrying posts. This will start as a limited test in the US, but could roll out worldwide if it proves successful. The company says:
Based on feedback from experts, we are testing a streamlined reporting process using pattern recognition in posts previously reported for suicide. This artificial intelligence approach will make the option to report a post about "suicide or self injury" more prominent for potentially concerning posts like these. We're also testing pattern recognition to identify posts as very likely to include thoughts of suicide. Our Community Operations team will review these posts and, if appropriate, provide resources to the person who posted the content, even if someone on Facebook has not reported it yet.
To raise awareness of suicide prevention, Facebook is also launching a video campaign, which you can see below: