In the wake of several Facebook videos depicting murder, suicide, rape and other violent acts, the social media giant says it is hiring 3,000 more people to review videos and remove those that violate its terms of service.
The company has been facing increased pressure to stop people from posting and sharing violent videos.
According to Facebook’s terms of service, violent videos are not allowed, but as recent events have shown, it can take the company some time to review and remove them.
The announcement that the new hires will join the 4,500 reviewers already on the job was made Wednesday on Mark Zuckerberg’s Facebook page.
Facebook’s founder and CEO wrote, “Over the last few weeks, we have seen people hurting themselves and others on Facebook – either live or in video posted later. It is heartbreaking, and I have been reflecting on how we can do better for our community.”
“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” Zuckerberg wrote. “And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it – either because they’re about to harm themselves, or because they’re in danger from someone else.”
In addition to more staff, Zuckerberg said the company was going to enhance its software to keep violent videos off the site.
“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help,” he wrote, adding that the company had recently acted on a report of someone considering suicide on Facebook and prevented them from going through with it.
…