FB tool to help suicide prevention


Before Facebook began rolling out its new suicide-risk reporting tool in February, an outdated flagging system was the only way Facebook users could alert the social network that a friend had posted threats of self-harm or suicide. Whether those flags ever resulted in action is unknown.

Facebook has just announced a multi-level partnership with mental health organizations to launch the new self-harm risk reporting tools.

Statistically, there is a suicide in the United States every 13 minutes, adding up to roughly 40,000 deaths a year.
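Those two figures are consistent with each other; a quick back-of-the-envelope check:

    minutes_per_year = 365 * 24 * 60   # 525,600 minutes in a year
    print(minutes_per_year / 13)       # about 40,430, in line with ~40,000 deaths a year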

Once the rollout is complete, concerned friends or family members will be able to report troubling posts by someone struggling with suicidal thoughts, and Facebook will then connect the person at risk with assistance from organizations like the National Suicide Prevention Lifeline.

These updates are rolling out to Facebook users across the United States through February and March, and Facebook is working to improve tools and resources for users outside the U.S.

Someone may be physically healthy, but mental illness isn't visible on the outside. It shows in expressions made in person or online, such as pleas for help or attention, and some of us might scroll past a post like that without a second thought.

Now, Facebook wants you to take that second thought, scroll back, and report that post.

When any Facebook user publishes that they want to take their own life, the first action their friends or family MUST take is to contact local emergency services immediately.

Expressions of depression, hopelessness, or self-injury are often published on Facebook. It is these troubling posts that Facebook wants to be alerted to, so it can connect the poster with support and resources to help.

The way it works is like this:

Users may post on Facebook that they're in a deep funk over a life situation, and are considering injuring themselves in some way.

These posts should be flagged for review by Facebook's 24-hour team dedicated to helping those in distress.

When that team reviews what was written, it prioritizes what appear to be the most serious reports, such as those indicating self-injury, and then directs resources toward that individual for help (a simple, hypothetical illustration of this prioritization follows the walkthrough below).

The next time the original poster logs onto Facebook, they're presented with a message offering several options. They'll be encouraged to connect with a friend or family member, or with a mental health expert at the National Suicide Prevention Lifeline. They'll also be offered online support with tips on how to work through the emotions they're feeling.
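For readers curious what that kind of triage might look like, here is a minimal, purely hypothetical sketch in Python. Facebook has not published its implementation; the severity scores, Report structure, and queue below are illustrative assumptions showing only the general idea of reviewing the most serious flags first.

    import heapq
    from dataclasses import dataclass, field

    # Hypothetical severity scores; higher means reviewed sooner.
    SEVERITY = {"self-injury": 3, "hopelessness": 2, "depression": 1}

    @dataclass(order=True)
    class Report:
        sort_key: int                       # negative severity (heapq is a min-heap)
        post_id: str = field(compare=False)
        category: str = field(compare=False)

    class ReviewQueue:
        """Toy priority queue: the most serious flagged posts come out first."""

        def __init__(self):
            self._heap = []

        def flag(self, post_id, category):
            # Negate the severity so the highest-severity report sits at the top.
            heapq.heappush(self._heap, Report(-SEVERITY[category], post_id, category))

        def next_report(self):
            report = heapq.heappop(self._heap)
            return report.post_id, report.category

    queue = ReviewQueue()
    queue.flag("post-101", "depression")
    queue.flag("post-102", "self-injury")
    print(queue.next_report())  # ('post-102', 'self-injury') is reviewed first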

The friend or family member who flagged the original post is also given options, including encouragement to reach out to the person in distress by calling or messaging to let them know they care.

Facebook's Rob Boyle and Nicole Staubli said in the Safety announcement published earlier in February that their team has been working with the mental health organizations Forefront, Now Matters Now, Save.org, and the National Suicide Prevention Lifeline on the latest improvements to Facebook's reporting tools.

These same organizations and clinical partners, along with people who have lived experience with self-injury or suicide, were all involved in creating a knowledge base of tips to help at-risk users work through their emotions and get support.

(www.wltx.com)

Last modified on Tuesday, 10 March 2015 10:34