Can Facebook’s Machine-Learning Algorithms Accurately Predict Suicide?
March 10, 2017
Facebook has just expanded the array of tools it provides to reach users at risk for suicide and connect them with mental health resources. The menu of options that lets Facebook users report posts whose content suggests thoughts of suicide or self-harm will now be available for Facebook Live streams as well.

The social media company is also piloting a pattern-recognition algorithm that it hopes will automatically identify posts of concern even before they have been reported by users. According to Facebook spokesperson William Nevius, the algorithm will use words or phrases related to suicide or self-harm in a user’s post, and in comments added by friends, to determine whether the person may be at risk. The system will automatically alert Facebook’s Community Operations team about posts of concern so that the team can review them quickly. If the team determines that support is warranted, it will ensure that information about helping resources appears in the user’s news feed.
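Facebook has not published the details of its model, but the description above suggests the core idea: scan the text of a post and its comments for phrases associated with risk. A minimal sketch of that kind of phrase matching might look like the following; the phrase list, the flagging rule, and the function names are illustrative assumptions, not Facebook's actual system.

```python
# Hypothetical sketch of phrase-based risk flagging.
# The RISK_PHRASES list and the "any match" rule are placeholders;
# a production system would use a trained classifier, not a keyword list.
RISK_PHRASES = [
    "want to die",
    "end my life",
    "hurt myself",
    "hurting yourself",
    "kill myself",
]

def flag_for_review(post_text, comments):
    """Return (flagged, matched_phrases) for a post and its comments.

    Combines the post body with friends' comments, lowercases the text,
    and checks it against a list of risk-related phrases.
    """
    combined = " ".join([post_text, *comments]).lower()
    matches = [phrase for phrase in RISK_PHRASES if phrase in combined]
    return bool(matches), matches
```

In this sketch, a match in either the post or a comment would route the item to a human review queue, mirroring the article's description of the Community Operations team reviewing flagged posts before any resources are shown.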
Spark Extra! Check out a community guide for Facebook users.