3 Mental Health Experts Dissect Facebook's Tech-Heavy Plan To Prevent Suicides
Earlier this week, Facebook announced plans to expand its artificial intelligence–based suicide prevention campaign and shared how the program has been working so far. The system uses "proactive detection" technology to scan posts for signs that a user might be suicidal, then flags those posts to Facebook moderators for next steps — which range from offering links to online resources all the way up to contacting first responders on the user's behalf.
The goal is to shorten the amount of time between a concerning post and Facebook being brought into the conversation. Previously, Facebook wouldn't get involved unless a friend manually flagged someone's post as seeming suicidal.
"When someone is expressing thoughts of suicide, it’s important to get them help as quickly as possible," wrote Facebook Vice President of Product Management Guy Rosen, on the company's blog.
Experts in suicide prevention are pretty excited about this plan, though some have minor reservations. Facebook's plans are "important and groundbreaking," Dr. Christine Moutier, chief medical officer at the American Foundation for Suicide Prevention, writes in an e-mail, lauding this type of "creative and innovative solution."
"With the help of large tech companies like Facebook, we can reduce the suicide rate in the United States, furthering AFSP's Project 2025 goal: to reduce the rate 20% in the US by 2025," she writes.
AI has "demonstrated itself as an effective tool for identifying people who may be in crisis," writes National Suicide Prevention Lifeline Director John Draper, noting there are still some challenges when it comes to machine learning technology understanding the difference between someone joking with their friends and someone who may actually be on the verge of self-harm.
"We’ve been advising Facebook on how to provide assistance for persons in crisis through creating a more supportive environment on their social platform so that people who need help can get it faster, which ultimately, can save lives," he writes, adding that "notifying police is an absolute last resort."
Joining the chorus of supporters of Facebook's plan is Dr. Victor Schwartz of The Jed Foundation, who has two caveats related to transparency and training.
"We applaud Facebook's efforts to enhance their ability to identify and respond to users who may be at increased risk for self-harm," Schwartz writes in an e-mail. "We hope that Facebook users would be made aware of this new protocol and would be alerted to the impending intervention, and that first responders would be properly trained to respond to those in possible crisis."
To the latter point, first responders should be properly trained to respond to suicidal individuals and others experiencing a mental health crisis, but are often not. For example, earlier this month, a suicidal woman in Cobb County, Georgia, was shot and killed by police after she grabbed a gun. In August, a similar scene unfolded in Florida.
There is no doubt that deescalating this type of situation, especially when the person involved is armed, is difficult. That's why ensuring proper training is so important.
Skepticism remains high in the online community, with some worrying that this life-saving tech could be repurposed for more nefarious pursuits in the future. Though Facebook has provided a pretty decent high-level overview of this new tool — which users apparently won't be able to opt out of — the company has been extremely light on details of how it actually works.
Responding to critics worried about how Facebook may use this technology in the future, the company's chief security officer, Alex Stamos, stressed that "it's important to set good norms today around weighing data use versus utility and be thoughtful about bias creeping in."
It's perfectly rational to have concerns over the use and misuse of AI, but the truth is that tech like this is going to play a big role in coming years.
Between those privacy worries and the concerns about whether first responders are properly trained, Facebook's new approach to suicide prevention feels like something humanity can be cautiously optimistic about — but only time will truly tell.