INTERNATIONAL: Facebook Is Using AI to Try to Prevent Suicide
December 01, 2017
Facebook has announced that it is rolling out a new suicide prevention tool to most of its two billion users. The social media giant has developed an artificial intelligence program to identify posts, videos, and comments that indicate a user might be considering suicide or self-harm. Pattern recognition technology will scan user content for concerning words or phrases and then triage it for review by the company’s community operations team. The algorithm will also be applied to user reports of content suggesting potential suicide or self-harm. To help ensure timely responses to those in distress, users whose content is flagged will be offered support options, such as contacting a crisis hotline or law enforcement. In the most serious cases, community operations reviewers will call first responders. “We’ve found these accelerated reports—that we have signaled require immediate attention—are escalated to local authorities twice as quickly as other reports,” said Facebook Vice President of Product Management Guy Rosen.
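Facebook has not published the details of its classifier, which is proprietary and machine-learned. The sketch below is purely illustrative: it shows, under simplifying assumptions, how a pattern-based scan-and-triage step might route flagged posts to a human review queue. The phrase list, thresholds, and names such as triage_post and TriageResult are hypothetical and are not drawn from Facebook's system.

```python
import re
from dataclasses import dataclass

# Hypothetical phrase list for illustration only; a production system would
# rely on learned models rather than a fixed set of patterns.
CONCERNING_PATTERNS = [
    r"\bwant to die\b",
    r"\bend it all\b",
    r"\bkill myself\b",
    r"\bno reason to live\b",
]

@dataclass
class TriageResult:
    post_id: str
    matches: list        # patterns that matched the content
    priority: str        # "immediate", "review", or "none"

def triage_post(post_id: str, text: str) -> TriageResult:
    """Scan text for concerning phrases and assign a review priority."""
    lowered = text.lower()
    matches = [p for p in CONCERNING_PATTERNS if re.search(p, lowered)]
    if len(matches) >= 2:          # hypothetical escalation threshold
        priority = "immediate"     # routed to human reviewers first
    elif matches:
        priority = "review"        # queued for the review team
    else:
        priority = "none"
    return TriageResult(post_id=post_id, matches=matches, priority=priority)

if __name__ == "__main__":
    result = triage_post("post-123", "I feel like there is no reason to live.")
    print(result.priority, result.matches)
```

In this toy version, posts matching more than one pattern jump the queue, mirroring the article's point that reports signaled as requiring immediate attention are escalated to reviewers, and ultimately to local authorities, faster than other reports.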
Spark Extra! Read the Facebook press release.