Facebook wants to get smarter about suicide prevention


Facebook says it's in a unique position to do more about the suicide epidemic.

Suicide is the second leading cause of death among 15- to 29-year-olds -- and Facebook wants to leverage artificial intelligence to help with prevention.

The company announced Wednesday that it is testing artificial intelligence that identifies potential "suicide or self injury" posts through pattern recognition, using posts that have previously been flagged on the site. Its community operations team will then review the flagged posts and decide whether Facebook should surface crisis resources to the user.
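Facebook hasn't detailed the model behind the test, but the approach it describes -- pattern recognition over posts that reviewers have previously flagged, followed by human review -- resembles a standard supervised text classifier. The sketch below is purely illustrative: the example posts, labels, review threshold and scikit-learn pipeline are assumptions, not Facebook's actual system.

```python
# Illustrative sketch only: train a classifier on posts that were
# previously flagged for "suicide or self injury," then score new posts
# so human reviewers can prioritize them. The data, threshold, and model
# choice are assumptions, not Facebook's system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: post text plus a label from past reviews
# (1 = previously flagged as a self-injury risk, 0 = not flagged).
posts = [
    "I can't see a way out of this anymore",
    "Great game last night, what a finish",
    "Nobody would miss me if I were gone",
    "Trying a new recipe for dinner tonight",
]
labels = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

# Posts scoring above the threshold would be queued for human review,
# mirroring the human-in-the-loop step the announcement describes.
REVIEW_THRESHOLD = 0.5
for post in ["I just want it all to stop"]:
    score = model.predict_proba([post])[0][1]
    if score >= REVIEW_THRESHOLD:
        print(f"queue for review (score={score:.2f}): {post}")
```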

For now, the test is limited to the U.S., but it is just one of the ways the company is trying to get smarter about protecting its users from self-harm. In 2016, Facebook (FB) made suicide prevention tools available to its 1.86 billion monthly active users: People can flag questionable posts for the company to review, and Facebook will suggest that the person in distress reach out to a friend, with optional pre-populated text to break the ice, or point them to help line information and other resources.


Facebook's existing toolkit for crisis support is getting more robust and easier for people in need to access.

Facebook also announced Wednesday that people can now chat directly, via Facebook Messenger, with several support organizations: the National Eating Disorder Association, the National Suicide Prevention Lifeline and Crisis Text Line. Anyone can message the organizations through their Facebook pages.

"We want to be wherever people are in crisis -- text, Facebook Messenger -- and we'll continue to be on the leading edge of technology, supporting people everywhere they are," Crisis Text Line CEO Nancy Lublin said in a prepared statement.

While it hadn't been publicly announced, the integration between Crisis Text Line and Facebook Messenger has quietly been in place for some time and has already produced results, Lublin wrote in a separate Facebook post Wednesday. Active rescues -- the organization's term for cases in which someone appears to be at genuine risk of harming themselves and is escalated to a Crisis Text Line supervisor -- are twice as common on Messenger as on the organization's standard texting service. And 22% of those who reach out to Crisis Text Line via Messenger mention suicide.

New technologies bring new challenges, however. Facebook Live, the company's newer platform, has been at the center of troubling headlines in recent months, with suicides streamed in real time.

So as part of Wednesday's announcement, the company said it is expanding its suicide prevention tools so they're available to people watching Live videos.

A manifesto from CEO Mark Zuckerberg in January acknowledged those stories, and the important role of early detection.

"There have been terribly tragic events -- like suicides, some live streamed -- that perhaps could have been prevented if someone had realized what was happening and reported them sooner," he wrote. "Going forward, there are even more cases where our community should be able to identify risks related to mental health, disease or crime."
