
Facebook Is Increasing Its Efforts To Detect Suicidal Thoughts


Female suicide rates rose by 3.8% in the UK and by 2% in England in 2015, according to a 2017 report from the Samaritans. Although that report did not cite a specific cause for the alarming rise, other studies have pointed to social media and smartphone use, where cyberbullying has become prevalent enough to spawn a new term: cyberbullicide.

At their core, social networking sites were created to connect people with friends and family. That same network means these platforms can also be a powerful tool for counteracting cyberbullying or supporting a person at risk.

Today, Facebook announced a few new efforts to proactively detect suicidal posts and, hopefully, get people help faster.

First, Facebook is expanding its use of artificial intelligence to identify posts and live streams that may include suicidal thoughts. The technology, which the company first tried as a suicide prevention tool in March, can parse videos and text more quickly than a person can report them. It picks up on phrases such as "are you ok?" and "can I help?", which can signal that someone may be at risk, and it is also used to prioritise which flagged posts need attention most urgently.
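Facebook hasn't published how its detection model actually works, but purely as an illustration, the general idea of flagging posts by signal phrases and ranking them for review might look like the much-simplified Python sketch below (the phrase list, function names and scoring are hypothetical, not Facebook's).

# Illustrative sketch only: Facebook's real system is a proprietary machine
# learning model, not a keyword list. This toy version flags posts whose text
# or comments contain concern phrases and ranks them so reviewers see the
# most heavily flagged posts first.

CONCERN_PHRASES = ["are you ok", "can i help"]  # example signals cited in the article

def score_post(text: str) -> int:
    """Count how many concern phrases appear in a post and its comments."""
    lowered = text.lower()
    return sum(lowered.count(phrase) for phrase in CONCERN_PHRASES)

def prioritise(posts: list[str]) -> list[tuple[int, str]]:
    """Return only flagged posts, highest score (most signals) first."""
    scored = [(score_post(post), post) for post in posts]
    return sorted((s for s in scored if s[0] > 0), key=lambda s: s[0], reverse=True)

if __name__ == "__main__":
    sample = [
        "Had a great day at the beach!",
        "Comment from a friend: are you ok? Can I help?",
    ]
    for score, post in prioritise(sample):
        print(score, post)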

After the AI flags something, a member of the global Community Operations team takes a look. This is another area Facebook is improving: the company is increasing the number of trained reviewers and introducing automated tools for contacting first responders, who can get in touch and provide help on the ground.

"Every minute counts when you do this kind of work," Guy Rosen, Facebook's vice president of product management told Refinery29. "This is really about working fast so we can get people help in real-time." According to Rosen, in the last month, proactive detection has resulted in 100 wellness, or on-the-ground, checks.

These resources are in addition to the reporting tools already available: if you're concerned by a friend's post, click the "report" link, select the appropriate issue, and tap "send". From there, a reviewer will take a look. Your report is confidential, and you'll also see information about how to reach out and help the person yourself.

Facebook isn't the only tech company seeking new ways to provide help. Earlier this year, Instagram launched its #HereForYou campaign to create a community for those affected by mental illness. Crisis Text Line, meanwhile, offers immediate assistance via the most accessible means for anyone with a smartphone: a text.

These tools are by no means a fix for the issues that have arisen alongside the expansion of social media, but they are an additional resource on platforms that may well be affecting young people's health.

If you are thinking about suicide, please contact Samaritans on 116 123. All calls are free and will be answered in confidence.


