Videos of murders, sexual assaults, suicides and torture live-streamed or uploaded on Facebook prompted CEO Mark Zuckerberg to announce Wednesday that the company will add 3,000 people to its 4,500-person community operations team, which reviews and reports such incidents (see 1705030010). Experts said in interviews Thursday the move should help the company better manage such postings so they're taken down as quickly as possible and not widely disseminated. Some said the firm should do more.
Family Online Safety Institute CEO Stephen Balkam said the hires represent an enormous expansion of the social media site's global team providing 24/7 response. "There's never ever been a global platform like Facebook before, and this is unprecedented in what they're having to deal with," he said. His group is a member of Facebook's safety advisory board, he said, and the company is also among FOSI's more than two dozen corporate members.
Facebook also relies on users to report problems and increasingly can use artificial intelligence and machine learning to help reviewers sift through and spot objectionable material, Balkam said. Asked about AI during Wednesday's Q1 earnings call, Zuckerberg said that over time it could "do a better job" of flagging and prioritizing what the community operations team should be reviewing.
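Facebook hasn't said how such prioritization would work. As a rough illustration only, the Python sketch below shows the general pattern: score each user report for likely severity, then feed reviewers from a priority queue so the worst material surfaces first. The keyword lookup is a stand-in for a trained classifier, and the Report structure is invented for the example.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Report:
    priority: float                      # negated severity, so the heap pops worst-first
    post_id: str = field(compare=False)  # not used for ordering

def severity_score(text: str) -> float:
    """Toy stand-in for a trained model: keyword hits raise the score."""
    keywords = {"gun": 0.6, "kill": 0.8, "suicide": 0.9}
    return max((w for k, w in keywords.items() if k in text.lower()), default=0.1)

queue: list[Report] = []
for post_id, text in [("p1", "vacation photos"), ("p2", "threat to kill")]:
    # Negate the score: heapq is a min-heap, and we want highest severity first.
    heapq.heappush(queue, Report(-severity_score(text), post_id))

while queue:
    report = heapq.heappop(queue)
    print(f"review {report.post_id} (severity {-report.priority:.1f})")
```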
The company has faced several high-profile incidents on its platform this year. In March, the sexual assault of a teenage girl was live-streamed on Facebook. In Thailand, a man killed his 11-month-old daughter on Facebook Live and then hanged himself. On April 16, Steve Stephens allegedly killed a man and posted the video. The next day, Global Operations Vice President Justin Osofsky wrote in a blog post that the company would review its processes to ensure people can quickly and easily report content that violates its standards.
Kalev Leetaru, a senior fellow at George Washington University who has written on the topic, said Facebook's hires will help but are mostly public relations, and that the company could use more technology to combat violent content. For instance, Facebook and others use digital fingerprinting to tag terrorist and revenge porn imagery so it can be screened and blocked, he said, and the company could deploy the same technology network-wide, though that may require more computing power.
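Production fingerprinting systems such as PhotoDNA are proprietary, but the underlying idea can be sketched with a simple "average hash": reduce an image to a compact signature that survives resizing and recompression, then compare new uploads against a blocklist of known-bad signatures. The Python below (using the Pillow imaging library) is a minimal illustration; the BLOCKLIST value and the matching threshold are invented for the example.

```python
from PIL import Image  # Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to an 8x8 grayscale grid, then set one bit per pixel
    that is brighter than the mean -- a 64-bit image signature."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count differing bits; small distances mean near-duplicate images."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist of fingerprints of known abusive images.
BLOCKLIST = {0x81C3E7FF00FF3C18}

def is_blocked(path: str, threshold: int = 5) -> bool:
    h = average_hash(path)
    return any(hamming(h, known) <= threshold for known in BLOCKLIST)
```

Because matching tolerates a few flipped bits, a re-encoded or slightly cropped copy of a tagged image can still be caught, which is what makes this approach practical at network scale.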
Leetaru said Facebook could develop AI and machine learning applications to help reviewers inspect live videos. For example, AI could flag imagery such as a gun -- or even the sound of a scream -- and alert a reviewer to pause the video and chat with the user who's streaming it. Leetaru said the footage could turn out to be a terrorist attack or a police shooting, which Facebook's policies may allow, so a reviewer could unpause the video (which would continue recording throughout). "Companies are making an economic decision that violent material is not threatening their bottom line," he said. "Once it does, I think that's when you'll really see action."
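Leetaru described that workflow only in outline. A hypothetical sketch of the loop might look like the Python below, where detect_gun, detect_scream, Stream and alert_reviewer are all invented stand-ins for real vision models, audio models and streaming APIs: a detector hit pauses the stream and routes it to a human, who can let it resume.

```python
import random

def detect_gun(frame: bytes) -> bool:
    return random.random() < 0.01   # stand-in for an image classifier

def detect_scream(audio: bytes) -> bool:
    return random.random() < 0.01   # stand-in for an audio classifier

class Stream:
    def __init__(self, chunks):
        self.chunks = chunks        # (frame, audio) pairs
        self.paused = False
    def pause(self):                # hide from viewers; recording continues
        self.paused = True
    def resume(self):
        self.paused = False

def monitor(stream: Stream, alert_reviewer) -> str:
    for frame, audio in stream.chunks:
        if detect_gun(frame) or detect_scream(audio):
            stream.pause()
            # Human judgment: policy may permit, e.g., footage of a police shooting.
            if alert_reviewer(frame) == "allow":
                stream.resume()     # the paused span was still recorded
            else:
                return "removed"
    return "ok"

# Example run with a reviewer who approves everything flagged.
stream = Stream([(b"frame", b"audio")] * 100)
print(monitor(stream, alert_reviewer=lambda frame: "allow"))
```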
Desmond Patton, an assistant professor of social work at Columbia University, isn't sure the hiring is enough, but said it's a step in the right direction. "What I would like for them to really consider is who should be part of the 3,000-member team," said Patton, who has researched community violence in neighborhoods and on Twitter, Facebook and YouTube. He wondered whether Facebook will hire people with backgrounds in computer science or human-centered design, or whether the company will look for community leaders and young people who grew up in underserved communities.
Another issue for Patton is how Facebook could use the community operations team beyond reviewing and removing videos and other content. He said the company has an opportunity to connect with community-based organizations and follow up with mentoring, emotional and other support services, including jobs. Rather than have reviewers simply take down negative images, he said, it's important to understand why that content was posted.
While Twitter and other social media sites have also responded to objectionable material, Balkam said Facebook's approach "appears to be the most appropriate and the most effective way of dealing with violent material," as well as bullying, harassment and sexualized words, images and videos. He said he's unsure what kind of training Facebook reviewers will get, but the company has close working relationships with suicide prevention organizations and similar groups. In the long run, Balkam said, the problem is a "generational" one: people need to become good digital citizens, meaning "upstanders and not bystanders when they see bullying or online harassment," he said.
Written by Dibya Sarkar