While the digital age has created a hyperconnected society, rising suicide rates across age groups in recent years show that many people still struggle to reach out for help before it’s too late. Facebook is trying to play a role in changing that.
On Wednesday, Facebook announced that it has integrated new tools into the platform to make it easier for people struggling with mental health issues to access the resources they need.
The new features were developed in collaboration with mental health organizations including Save.org, the National Suicide Prevention Lifeline, Forefront and Crisis Text Line.
“It’s important that community members, whether they’re online or offline, don’t feel that they are helpless bystanders when dangerous behavior is occurring,” said Dr. John Draper, director of the National Suicide Prevention Lifeline, in a press release. “Facebook’s approach is unique. Their tools enable their community members to actively care, provide support, and report concerns when necessary.”
While suicide prevention tools have been available on the platform for more than ten years, the update gives users three new capabilities:
- Educational resources during a Facebook Live post.
- Live chat support from trained mental health professionals through Messenger.
- Use of artificial intelligence to make reporting easier.
If someone is hosting a Facebook Live broadcast and a friend watching the video becomes concerned for their mental health or safety, that viewer can now either reach out to the broadcaster directly or report the video to Facebook, which has employees working around the clock to assist people in these circumstances. The feature even offers suggested prompts that users can choose from if they aren’t sure what message would be most appropriate for the situation.
The person hosting the Facebook Live video, in turn, will see educational resources encouraging them to seek help and guiding them through the situation.
Last month, a 14-year-old girl from Florida live-streamed her own suicide using the Facebook Live feature. One of her friends did contact the local police, but features like these could help friends connect in the moment, before it is too late.
“As more and more youth and young adults use social media in their daily lives, tools like this become a natural part of their ability to help others when needed,” said Dr. Dan Reidenberg, executive director of Suicide Awareness Voices of Education (SAVE), in an interview with MediaFile. “If people can become familiar with the suicide reporting and supporting functions that Facebook offers, when they need it to help a friend in college, that support can be available 24/7 and accessed anywhere. This helps save lives.”
People seeking support can also now reach trained crisis counselors through Messenger, thanks to an integration with the crisis support service Crisis Text Line.
“We want to be wherever people are in crisis – text, Facebook Messenger – and we’ll continue to be on the leading edge of technology, supporting people everywhere they are,” said Nancy Lublin, founder and CEO of Crisis Text Line.
Lastly, Facebook is working to simplify reporting by using artificial intelligence to better recognize signs of suicidal thoughts in posts.
“Based on feedback from experts, we are testing a streamlined reporting process using pattern recognition in posts previously reported for suicide,” according to Facebook’s press release. “This artificial intelligence approach will make the option to report a post about ‘suicide or self injury’ more prominent for potentially concerning posts like these.”
While this testing will be limited to the U.S. for now, the new suicide prevention features are available to users worldwide thanks to the support of more than 70 partner organizations.
Facebook’s new suicide prevention features act as a reminder that there is still a lot of stigma around mental health issues in society.
“I think all efforts that help identify mental illness and suicide that can be brought to light help reduce stigma,” said Dr. Reidenberg. “I also think that the more companies like Facebook take on the issues of mental health it helps reduce stigma. It gives the message that these diseases are just as real as any other illness and that there is hope that when caught early prevention is possible.”
Ideally, these features will help lay the groundwork for more open conversations about mental health, both online and in person.