WEB Notes: Did anyone ask for this? What if a friend or co-worker called “first responders” to your aid because they thought you were depressed or had suicidal thoughts? How would that make you feel? Like your privacy was invaded, maybe? Hey, we sign ourselves up for this type of thing, so I suppose we cannot complain about it. Just know that the more we continue to feed these machines, the more they learn and the more invasive things will become. It is the whole frog-in-the-pot scenario, people: turn up the heat little by little and you will never feel a thing.

This is software to save lives. Facebook’s new “proactive detection” artificial intelligence technology will scan all posts for patterns of suicidal thoughts, and when necessary send mental health resources to the user at risk or their friends, or contact local first-responders. By using AI to flag worrisome posts to human moderators instead of waiting for user reports, Facebook can decrease how long it takes to send help.

Facebook previously tested using AI to detect troubling posts and more prominently surface suicide reporting options to friends in the U.S. Now Facebook will scour all types of content around the world with this AI, except in the European Union, where General Data Protection Regulation privacy laws on profiling users based on sensitive information complicate the use of this tech.

Source: Facebook rolls out AI to detect suicidal posts before they’re reported
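For readers curious how this kind of “flag, then escalate to a human” workflow tends to be wired up: Facebook has not published its model, so the sketch below is purely illustrative. It uses a made-up phrase list and threshold (RISK_PHRASES, REVIEW_THRESHOLD) in place of a trained classifier, and simply routes high-scoring posts to a review queue rather than acting on them automatically.

```python
# Illustrative sketch only: Facebook has not published its detection system.
# This toy example shows the general pattern the article describes: score each
# post, and anything above a threshold goes to a human moderator queue instead
# of triggering an automatic response.

from dataclasses import dataclass, field
from typing import List

# Hypothetical phrase list; a real system would use a trained statistical
# classifier, not keyword matching.
RISK_PHRASES = ["want to die", "end it all", "no reason to live"]
REVIEW_THRESHOLD = 0.5  # assumed cutoff for routing a post to review


@dataclass
class Post:
    author: str
    text: str


@dataclass
class ReviewQueue:
    items: List[Post] = field(default_factory=list)

    def enqueue(self, post: Post) -> None:
        # A human moderator decides whether to send mental health resources
        # or contact first responders; the software only flags.
        self.items.append(post)


def risk_score(text: str) -> float:
    """Crude stand-in for a model: fraction of risk phrases present in the text."""
    lowered = text.lower()
    hits = sum(1 for phrase in RISK_PHRASES if phrase in lowered)
    return hits / len(RISK_PHRASES)


def scan_post(post: Post, queue: ReviewQueue) -> None:
    """Flag the post for human review if its score clears the threshold."""
    if risk_score(post.text) >= REVIEW_THRESHOLD:
        queue.enqueue(post)


# Example usage
queue = ReviewQueue()
scan_post(Post("user1", "Some days I feel like I want to die and end it all."), queue)
print(len(queue.items))  # 1 -> flagged for a human moderator, not auto-actioned
```

The point of the design, as the article frames it, is speed: instead of waiting for a friend to report a post, the scoring step surfaces it to reviewers sooner. Everything beyond that high-level flow in the sketch is an assumption.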


