Facebook: Social Media and the Rise in Public Suicide

How Facebook is Leveraging Data and Machine Learning to Protect the Lives of its Users

Naika

On January 22nd, 2017, Naika Venant, a 14-year-old from Miami, was broadcasting to her friends on Facebook Live. For nearly two hours, she live-streamed from the bathroom of her foster home, fashioning a homemade noose from her scarf.

In a brief report issued by the Florida Department of Children & Families, the author states that Naika attached her scarf to the “shower-glass door frame” and ended her own life at around 3:03 a.m. The entire episode was streamed live on Facebook.

In the nearly one year since the public rollout of Facebook Live, a feature that allows users to broadcast live video of themselves to their followers, a number of people have used the platform to publicly record themselves taking their own lives. A day after Naika’s death, Frederick Jay Bowdy, a 33-year-old aspiring actor living in Los Angeles County, shot himself in the head as people watched on the social media network.

 

At Risk

In 2014, the suicide rate in the United States reached a 30-year high, according to the Centers for Disease Control and Prevention, and in 2015 more than 44,000 people died by suicide in the United States. Suicide is currently the third-leading cause of death among people ages 10 to 14, and the second-leading cause among those ages 15 to 34.

The CDC does not have data on the number of people who have died by suicide on social media. However, sharing feelings of despair and suicidal thoughts on social media platforms appears to have become increasingly common, and in some cases people have tragically carried out the act publicly.

Historically, suicide prevention measures have focused on reducing access to means that could help someone carry out the act, such as guns and pills, and on educating doctors to recognize red-flag behaviors. Unfortunately, over time, these measures alone have proved insufficient to stop many people from taking their own lives.

Fortunately, artificial intelligence, using pattern-recognition algorithms, offers the potential to identify suicide-prone behavior much more accurately, providing the opportunity to intervene long before someone takes action to hurt themselves. A soon-to-be-published study by researchers from Florida State University shows that they were able to use machine learning to predict with 80-90 percent accuracy whether someone will attempt suicide, as much as two years in advance.
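To make the idea concrete, here is a purely illustrative sketch of how such a risk-prediction model might be trained and evaluated. The FSU study’s actual data, features, and model are not described in this post, so everything below — the synthetic features, the label construction, and the classifier choice — is invented for illustration only.

```python
# Illustrative only: a toy "attempt within two years" risk classifier
# trained on synthetic data. Real studies use de-identified clinical
# records and far more careful validation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for patient features (e.g., prior diagnoses,
# medication history, demographics) -- 2,000 records, 10 features.
n = 2000
X = rng.normal(size=(n, 10))

# Synthetic label correlated with a couple of features, plus noise,
# standing in for "attempted suicide within the follow-up window".
y = ((X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=0.5, size=n)) > 0.9).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print(f"held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```

On real, imbalanced clinical data, plain accuracy can be misleading (predicting “no attempt” for everyone already scores high), which is one reason published work typically reports metrics like AUC alongside accuracy.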


Intervention

For years, Facebook has been investing in artificial intelligence and machine learning to understand its users better and to target them more effectively with advertisements and posts. In response to the recent tragic events that have unfolded on Facebook Live, however, the company has begun applying the same data science and machine learning approach to build predictive models that identify online posts indicating a user may be at risk of suicidal behavior.

This past February, Mark Zuckerberg wrote, “there have been terribly tragic events – like suicides, some live streamed – that perhaps could have been prevented if someone had realized what was happening and reported them sooner…artificial intelligence can help provide a better approach”.

Leveraging the vast amount of data generated by its nearly 1.9 billion monthly active users and billions of daily posts, Facebook has been developing machine learning algorithms to recognize activity that could indicate suicidal behavior and flag it for internal review. If the Facebook team believes that a user could be in trouble, they will reach out with resources, including live chat support. In addition, Facebook is partnering with organizations such as the National Suicide Prevention Lifeline, the National Eating Disorders Association, and the Crisis Text Line so that when a user’s post is flagged and they indicate that they would like to speak to someone, they can connect immediately via Facebook Messenger.
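The flag-and-review flow described above can be sketched in a few lines. Facebook’s actual models, phrase lists, and thresholds are not public, so the keyword scorer below is a hypothetical stand-in for a trained classifier; the point is the routing: the software only queues posts, and a human reviewer decides whether to reach out.

```python
# Minimal, hypothetical sketch of a flag-and-review pipeline.
# The phrase list, scoring rule, and threshold are invented for
# illustration; a production system would use a trained model.

RISK_PHRASES = {"want to die", "end it all", "kill myself", "goodbye forever"}
REVIEW_THRESHOLD = 1  # flag if at least one risk phrase appears

def risk_score(post_text: str) -> int:
    """Count risk phrases in a post (stand-in for a model's score)."""
    text = post_text.lower()
    return sum(phrase in text for phrase in RISK_PHRASES)

def triage(posts: list[str]) -> list[str]:
    """Route posts at or above the threshold to a human review queue."""
    return [p for p in posts if risk_score(p) >= REVIEW_THRESHOLD]

posts = [
    "Had a great day at the beach!",
    "I just want to end it all, goodbye forever.",
]
print(triage(posts))  # only the second post is queued for human review
```

Keeping a human in the loop matters here: a flagged post triggers outreach and resources, not an automated action, so false positives cost reviewer time rather than harming users.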

 

Moving Mountains

Admittedly, these are still early days, and it remains to be seen whether pattern recognition mined from Facebook users’ data will ultimately identify, and more importantly prevent, suicides. However, many researchers are hopeful that these techniques could prove useful and be positively received by younger people, particularly given that they are accustomed to interacting over social media.

While it is unclear whether Facebook users will take advantage of these resources, Willa Casstevens, a professor at North Carolina State University who studies suicide prevention, has said that “in the moment, a caring hand reached out can move mountains and work miracles”.



 


Student comments on Facebook: Social Media and the Rise in Public Suicide

  1. Thanks so much for the post, Ian. I wrote my post on a similar topic: the Crisis Text Line. I didn’t realize that Facebook and the Crisis Text Line had this partnership in place – how awesome! I’m curious about the data collection aspect of Facebook’s effort. Crisis Text Line claims to have amassed the most comprehensive real-time mental health data set to date and has opened its data set to “(1) inform the public and media, (2) shape government and school policies, and (3) drive cutting edge academic research” (according to the Crisis Text Line’s website). Do you know if Facebook is (or is considering) doing a similar thing?

  2. Thanks, Ian.

    A lot of research discusses the negative effects of social media on mental health, including its causal role in depression and increased suicide rates. As the biggest social media platform, Facebook is at the core of these discussions, and it’s refreshing to see the company using its resources to fight the problem.

    I wonder, though, whether addressing the symptoms of social media use after the fact is the most effective way to solve rising mental health problems, or whether Facebook could use its research, technology, and resources to address the root causes by changing how people consume social media, so that it has less impact on mental health.
