The social media network has been plagued by posts depicting acts of extreme violence.
(On top of that, using too much Facebook is just plain bad for you.) In response, Facebook CEO Mark Zuckerberg announced in a Facebook post that the company plans to hire 3,000 more people for its global "community operations team" in an effort to recognize and remove these posts.
In the post, Zuckerberg said, "If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner—whether that's responding quickly when someone needs help or taking a post down."
According to an article published in Gizmodo, however, this could end up making little difference and may even create new problems. The article pointed out, first of all, that Zuckerberg's post left out some glaring details about these new hires, such as their employment status, how they would be trained to handle these situations, and whether they would receive any sort of support or counseling services from Facebook. One can only imagine what it must be like to watch videos of horrific acts as a job.
The article also noted that "many of the largest companies outsource their moderation teams, letting low-paid international workers—who may not have intimate knowledge on US social mores—view the very worst posts, day in and day out."
While it's commendable that Facebook wants to take a stand on keeping its users safe, this sort of task seems best suited to people who have the tools and training to recognize and respond effectively, and who have the resources to cope with the psychological toll such a job would surely take.