Over the last month, a rash of people have live streamed suicides, murders, and other crimes using Facebook's live streaming feature. Mark Zuckerberg, CEO of Facebook, has announced that the company will hire an additional 3,000 people to review videos, with the goal of removing offending videos quickly, before they spread.
Facebook already has 4,500 people in place doing such reviews. Streaming these videos violates Facebook's rules, but they are not removed until users report them. Facebook should be able to take them down faster.
Zuckerberg said, in part:
“If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”
“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation.”
“And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else.”
Here is Zuckerberg’s entire Facebook post.
Mark Zuckerberg has announced Facebook will hire 3,000 people to review videos after a spate of violent video controversies
— David Mack (@davidmackau) May 3, 2017
Some of these horrific incidents include:
A man in Thailand hanged himself and his daughter on Facebook Live back in April.
“We have a lot of work … we will keep doing all we can to prevent tragedies like this from happening.”
Several people have live streamed their own suicides on Facebook.
Back in March, Facebook announced that it would be using artificial intelligence to monitor Facebook posts for potentially suicidal users.
I personally applaud Facebook’s efforts to prevent these tragedies.
Here is Zuckerberg’s response to the Cleveland murder: