Nearly 900 million people log on to Facebook every day. At least some of them are looking for help.

Now, in a move that attempts to reach into the real lives of those troubled users, Facebook is launching an updated suicide prevention tool that temporarily locks out users who have posted worrisome messages and encourages them to review mental health resources before logging back in.

“Suicide right now is like AIDS in the ’80s,” said Rob Boyle, a Facebook product manager at his company’s Compassion Research Day on Wednesday. “It’s this huge issue, but people are afraid to talk about it.”

The tool works like this: If a user writes about harming himself or herself on the social network, friends can click a “report” icon in the upper right-hand corner of the post. Flagging a troubling post brings up information about suicide prevention, along with options to send the user a message or to ask Facebook to intervene. Users have been able to flag worrisome posts in the past, but this is the first time Facebook has offered the ability to call a friend.

Facebook said it has a team of hundreds of employees worldwide who review an average of 1 million flagged posts a week, including spam, nudity, threats and anything else reported by users. Those who look over posts flagged as suicidal undergo training in suicide prevention, the company said.

If Facebook determines a post contains a suicidal message, the company will lock the user out of Facebook and Facebook’s Messenger app. Before the user can log back in, he or she will see a page with links to suicide prevention materials and ways to contact Facebook friends or a suicide hotline.

“A friend thinks you might be going through something difficult and asked us to look at your recent post,” a message reads.

Gradual rollout

Right now, a limited number of users in the U.S. have access to the updated tool. Facebook plans to roll out the service to the rest of the country in the coming months and eventually to other nations.

About 41,000 suicides occur in the U.S. each year, outpacing the number of homicides and traffic deaths, suicide prevention experts said. Several experts applauded Facebook’s efforts, saying it’s another avenue for people to get help.

“This is a good start,” said Eve Meyer, executive director for San Francisco Suicide Prevention. “This is a wonderful step for Facebook to take action immediately and let people know what resources are closest to them and hopefully deliver a message that they are concerned and that they want them to get pain relief.”

But some privacy advocates are concerned Facebook’s new tool may lead to online bullying, with users wrongly flagging posts in attempts to lock people out of the social network.

Jamie Court, president of Consumer Watchdog, said Facebook is trying to “practice psychiatry” on the social network when the company shouldn’t.

‘So many pitfalls’

“The virtual intervention comes with so many pitfalls that the real answer is to make it a real world intervention if someone believes it to be an actual threat,” Court said.

Facebook presented its updated tool at its annual Compassion Research Day, where the Menlo Park technology giant discussed its research into how people interact with and respond to posts on the social network. An audience member asked whether Facebook was concerned that the suicide prevention intervention could cause a backlash among people who don’t want to be monitored.

Facebook employees on the project said they hoped it wouldn’t come off that way, because they had worked hard on the site’s language so that it sounds like a friend reaching out to a person in need. The company spent eight months developing the updated tool with suicide prevention groups including Forefront, Now Matters Now, National Suicide Prevention Lifeline and Save.org.

Impressed at effort

Jennifer Stuber, Forefront’s faculty director, said she was skeptical at first about working with Facebook because social networks have been grounds for online bullying and have come under criticism for not doing enough. But so far, she has been impressed by the effort the company is putting toward suicide prevention.

“There is tremendous potential and I’m excited Facebook is taking this on,” Stuber said.

Facebook wouldn’t say how many people currently have access to the updated tool. So far, 58 percent of users locked out over flagged posts have chosen to log back in without reading the suicide prevention materials or contacting a friend or suicide hotline. Part of the reason could be that they are viewing the message on a smartphone and aren’t in a position to read it, Boyle said.

According to Facebook, 25 percent chose to contact someone, and of that group 30 percent got in touch with a suicide prevention helpline.

Stephen Miller, 32, believes such tools could prevent more suicides. Five years ago, he saw a college classmate post on Facebook, “I just can’t take it anymore.” It troubled Miller, but he didn’t know what to say, or how seriously to take the post. The next time Miller logged in to the social network, he learned his classmate had committed suicide.

“I can’t go back, but I believe he was at a point of crisis where a positive intervention could have helped,” said Miller, who is now an operations manager at Forefront.

Miller recently had another Facebook friend go through a breakup and post something troubling. Miller used the Facebook reporting tool and reached out to several friends, and the person appreciated the gesture.

“It’s critical for people to know there’s something they can do. They don’t have to be a therapist or doctor,” Miller said. “They have a tool literally at their fingertips.”

Wendy Lee is a San Francisco Chronicle staff writer. E-mail: wlee@sfchronicle.com Twitter: @thewendylee

———

©2015 the San Francisco Chronicle

Visit the San Francisco Chronicle at sfgate.com

Distributed by Tribune Content Agency, LLC