Facebook offers tools for those who fear a friend may be suicidal
MENLO PARK, Calif. >> When Carrie Simmons opened her Facebook app one day in late 2014, she saw an alarming status update from a high school friend she had not seen in years. It read like a suicide note.
“Thank you for everyone who tried to help me,” Simmons’ friend, whom she declined to name, wrote.
Simmons immediately called a mutual friend, a police officer, who reached out to local authorities in their friend’s town in California. There, police found the friend in a parked car with a pistol in his lap. He did not pull the trigger and is alive today.
“If I hadn’t already been educated in suicide prevention or hadn’t seen the post on Facebook, I don’t know that I would have picked up the phone and known to call,” said Simmons, who is a real estate broker in Seattle.
With more than 1.65 billion members worldwide posting regularly about their lives, Facebook is planning to take a more direct role in stopping suicide. On Tuesday, in the biggest step by a major technology company to incorporate suicide prevention tools into its platform, the social network introduced mechanisms and processes that make it easier for people to help friends who post messages about suicide or self-harm. With the new features, people can flag friends’ posts that they deem suicidal; the posts will be reviewed by a team at the social network, which will then provide language for communicating with the person at risk, as well as information on suicide prevention.
The timing coincides with a surge in suicide rates in the United States to a 30-year high. The increase has been particularly steep among women and middle-aged Americans, reflecting widespread desperation. Last September, President Barack Obama issued a proclamation recognizing World Suicide Prevention Day, calling on people to recognize mental health issues early and to reach out to support one another.
Facebook has long thrust itself into major societal debates because of its vast reach and the enormous diversity of human behavior it sees. About 72 percent of online American adults, and 77 percent of online American women, use Facebook, according to a 2015 Pew Research Center study.
Yet Facebook is walking a tightrope, trying to explore its role as an arbiter of social change without upsetting the hundreds of millions of people who regularly use its services. Some of the suicide prevention tools may trouble groups that have concerns about digital privacy. Many of those groups have already become wary of what they see as Facebook’s overreach in people’s personal lives.
In 2014, the company was embroiled in a scandal after it emerged that Facebook researchers had used the news feed to try to manipulate the emotions of users; that October, the fallout prompted an overhaul of the company’s user research methods. Last month, Facebook grappled with accusations of political bias and fears about how much it could influence the views of its members.
“The company really has to walk a fine line here,” said Dr. Jennifer Stuber, an associate professor at the University of Washington and the faculty director of Forefront, a suicide prevention organization. “They don’t want to be perceived as ‘Big Brother-ish,’ because people are not expecting Facebook to be monitoring their posts.”
Facebook said it had a role to play in helping its users help one another. About a third of the posts shared on the site express some form of negative feeling, according to a study released in February by the company’s researchers. Posts with negative associations tended to receive longer, more empathetic comments from Facebook friends, the company said.
“Given that Facebook is the place you’re connected to friends and family, it seemed like a natural fit,” said Dr. Jennifer Guadagno, a researcher at Facebook who is leading the suicide prevention project. Facebook has a team of more than a dozen engineers and researchers dedicated to the project.
Facebook’s new suicide prevention tools start with a drop-down menu that lets people report posts, a feature that was previously available only to some English-speaking users. People across the world can now flag a message as one that could raise concern about suicide or self-harm; those posts will then come to the attention of Facebook’s global community operations team, a group of hundreds of people around the world who monitor flagged posts 24 hours a day, seven days a week.
Posts flagged as potential self-harm notes are expedited for faster review by the team members, who also examine posts that Facebook users have reported as objectionable. Community operations team members who evaluate potentially suicidal content are given special training, Facebook said.
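Facebook has not published how this triage works internally. Purely as an illustration, the Python sketch below models one way a moderation queue could expedite self-harm flags ahead of routine reports; every name, category and priority value in it is hypothetical rather than drawn from Facebook’s systems.

    import heapq
    import itertools
    from dataclasses import dataclass, field

    # Hypothetical report categories: lower number means reviewed sooner.
    PRIORITY = {"self_harm": 0, "harassment": 1, "spam": 2}

    @dataclass(order=True)
    class Report:
        priority: int                      # compared first by the heap
        sequence: int                      # tie-breaker: earlier flags first
        post_id: str = field(compare=False)
        category: str = field(compare=False)

    class ReviewQueue:
        """Toy moderation queue that surfaces self-harm flags first."""

        def __init__(self):
            self._heap = []
            self._counter = itertools.count()

        def flag(self, post_id, category):
            """Record a user's report of a post."""
            report = Report(PRIORITY[category], next(self._counter),
                            post_id, category)
            heapq.heappush(self._heap, report)

        def next_for_review(self):
            """Return the most urgent outstanding report, or None."""
            return heapq.heappop(self._heap) if self._heap else None

    queue = ReviewQueue()
    queue.flag("post-001", "spam")
    queue.flag("post-002", "self_harm")
    assert queue.next_for_review().category == "self_harm"  # jumps the line

In this toy model, a heap orders reports by category priority and then by arrival time, so a self-harm flag filed after a batch of spam reports still surfaces first, mirroring the expedited review described above.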
The person reporting a suicide note is given a menu of options, including the ability to send a Facebook message directly to the friend in distress or to a mutual friend to coordinate help. Facebook will provide a suggested text message to send, or users can fill in their own words.
“People really want to help, but often they just don’t know what to say, what to do or how to help their friends,” said Vanessa Callison-Burch, a Facebook product manager working on the project. Users are also given a list of resources they can click through, including help lines and suicide prevention material.
If Facebook evaluators believe a post is a call for help or a distress signal, the person who wrote it will see a similar list of options, including tips and resources on what to do when feeling suicidal, the next time he or she logs into Facebook and views the news feed. Such users are also prompted to reach out to friends on Facebook who may be able to support them.
The company said its work on suicide prevention started about 10 years ago, after a number of suicides in Palo Alto, Calif., the former location of Facebook’s headquarters and the city in which Mark Zuckerberg, its chief executive, lives. Palo Alto’s two public high schools have suicide rates between four and five times the national average.
Over the past year, Facebook has conducted a small test of the suicide prevention tools in some countries, in conjunction with outside suicide prevention organizations such as Forefront, NowMattersNow.org and Save.org. The company declined to share data on the early results.
“They’ve been very reluctant to release data on the project, even to partners like us, because of the privacy issues,” said Stuber of the University of Washington.
Facebook is also becoming more forthcoming about other internal practices, including how it conducts user research. On Tuesday, in a move separate from the suicide prevention tools, Facebook laid out its research review methods in a paper written by two of its researchers. The company hopes the paper, which is being published in the Washington and Lee Law Review, will help other academics with their own research methodologies.
Some nonprofit organizations and researchers that have pushed Facebook to be more aggressive in its suicide prevention efforts hope the changes will prompt other technology giants to act similarly.
“We’re losing more people to suicide than breast cancer, car accidents, homicides,” said Dr. Dan Reidenberg, executive director of Save.org. “We’d be fortunate for other companies to get on the bandwagon.”
© 2016 The New York Times Company