
The Conversation

  • Written by Jennifer Beckett, Lecturer in Media and Communications, University of Melbourne

Facebook has recently come under fire for not doing enough to keep disturbing content out of our newsfeeds. It hopes a hiring spree will fix the problem.

In a Facebook post Wednesday, company founder and CEO Mark Zuckerberg announced plans to hire an additional 3,000 moderators for its community operations team. These new employees will help review posts flagged as troubling by the community and “improve the process for doing it quickly”.

Given the recent spate of news stories about Facebook Live being used to stream everything from rapes to murders in real time, it’s fair to say the social media giant has work to do.

Algorithms fall short

So why are people necessary when algorithms can do the job?

Facebook uses software to filter some types of explicit or illegal content, but there are limits to its capabilities. That’s where humans come in.

At the 2015 SWARM conference for community managers, Mia Garlick, Facebook’s director of policy for Australia and New Zealand, noted the social media company uses Microsoft’s PhotoDNA software to weed out known images of child pornography posted onto the platform. And therein lies the rub – the program must already “know” the images to delete them.

Algorithms are only as good as we teach them to be. In the case of something like PhotoDNA, someone – a human – has to find and add additional images to the software’s database.
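
As a rough illustration of that idea (not PhotoDNA’s actual interface, which uses robust “perceptual” hashes that survive resizing and re-compression rather than exact digests), a known-image filter can be sketched as a lookup against a human-curated database:

```python
import hashlib

# Simplified sketch of a known-image filter. A cryptographic SHA-256 digest
# stands in here for the robust image fingerprints real systems use.

# Fingerprints of images that human reviewers have already identified and
# added - the software can only block what it already "knows".
KNOWN_BAD_HASHES: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint for an uploaded image."""
    return hashlib.sha256(image_bytes).hexdigest()

def add_known_image(image_bytes: bytes) -> None:
    """A human moderator confirms an image and adds it to the database."""
    KNOWN_BAD_HASHES.add(fingerprint(image_bytes))

def should_block(image_bytes: bytes) -> bool:
    """Automatically block only uploads that match a known fingerprint."""
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES
```

A genuinely new image produces a fingerprint the database has never seen, so it passes straight through – and that is precisely the gap human reviewers are hired to fill.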

These automated systems can also be blunt tools. They see the world as black and white, so they’re not always good at handling grey areas – say, deciding whether a nude photo is a piece of art or just a nude photo.

Mark Zuckerberg hopes new content moderators will help prevent violent content getting into the newsfeed. John Adams/flickr, CC BY

This issue also arises when cultural context or local law is at odds with a Facebook post.

In a 2015 blog post explaining Facebook’s community guidelines, Monika Bickert, Facebook’s head of global policy management, and Chris Sonderby, deputy general counsel, touched on the vexed notion of blasphemy to demonstrate how hard it can be to moderate language and imagery in different countries.

The update followed a news story about Facebook blocking a page in Turkey after a court there ruled it was insulting the Prophet Mohammed.

“While blasphemy is not a violation of the Community Standards, we will still evaluate the reported content and restrict it in that country if we conclude it violates local law,” Bickert and Sonderby explained in the post.

In other words, we need humans to catch anything that falls between the cracks. That’s why Facebook relies on its human community to report content, and why flagged posts are still screened by its team of human moderators.

Moderators are people, too

Good moderation is at the heart of building a safe space online, and Zuckerberg’s stated commitment to building a better Facebook community is an important first step.

Still, one thing is often forgotten in this discussion: the people whose labour ensures the rest of us have a pleasant experience online.

Take Microsoft, creator of the PhotoDNA software used by Facebook. In December 2016, two former employees of its online safety team sued the technology company after allegedly developing post-traumatic stress disorder.

According to the suit, the pair were responsible for reviewing imagery of “horribly violent acts against children”, among other content.

In particular, the lawsuit alleges the two employees did not receive sufficient support from Microsoft despite the traumatic nature of their work.

If the lawsuit is successful, it could have serious ramifications for any company employing content moderators. That could be anything from banks to the media and, of course, Facebook and its 3,000 new moderators.

These companies have a moral duty to help employees who may be routinely exposed to rape and torture imagery, though recent news reports suggest that’s not always the case.

In December, Germany’s Süddeutsche Zeitung newspaper found Facebook’s Berlin moderators facing dismaying conditions. The journalists reported employees were viewing violent and explicit material daily with little support while making “just slightly above the legal minimum wage”. Facebook declined to comment to the publication.

Whether Facebook’s new influx of moderators will create the “global community that works for everyone” Zuckerberg outlined in his recent manifesto remains to be seen. With billions of pieces of content being posted to Facebook every day, it’s an almost impossible task.

Authors: Jennifer Beckett, Lecturer in Media and Communications, University of Melbourne

Read more http://theconversation.com/facebook-turns-to-real-people-to-fix-its-violent-video-problem-77156

