Anxieties over livestreams can help us design better Facebook and YouTube content moderation

As families in Christchurch bury their loved ones following Friday’s terrorist attack, global attention now turns to preventing such a thing ever happening again.

In particular, the role social media played in broadcasting live footage and amplifying its reach is under the microscope. Facebook and YouTube face intense scrutiny.

Read more: Social media create a spectacle society that makes it easier for terrorists to achieve notoriety

New Zealand’s Prime Minister Jacinda Ardern has reportedly been in contact with Facebook executives to press the case that the footage should not be available for viewing. Australian Prime Minister Scott Morrison has called for a moratorium on amateur livestreaming services.

But beyond these immediate responses, this terrible incident presents an opportunity for longer term reform. It’s time for social media platforms to be more open about how livestreaming works, how it is moderated, and what should happen if or when the rules break down.

Increasing scrutiny

With the alleged perpetrator having apparently flown under the radar before the attack in Christchurch, collective attention has now turned to the online radicalisation of young men.

As part of that, online platforms face increased scrutiny, and Facebook and YouTube have drawn criticism.

After the original livestream was disseminated on Facebook, YouTube became a venue for the re-upload and propagation of the recorded footage.

Both platforms have made public statements about their efforts at moderation.

YouTube noted the challenges of dealing with an “unprecedented volume” of uploads.

Although it’s been reported that fewer than 4,000 people saw the initial stream on Facebook, Facebook said:

In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload […]

Taken together, those figures suggest roughly 300,000 copies made it onto the platform before being removed. Focusing chiefly on livestreaming is somewhat reductive. Although the shooter initially streamed his own footage, the greater challenge of controlling the video largely relates to two issues:

  1. the length of time it was available on Facebook’s platform before it was removed
  2. the moderation of “mirror” video publication by people who had chosen to download, edit, and re-upload the video for their own purposes.

These issues illustrate the weaknesses of existing content moderation policies and practices.

Not an easy task

Content moderation is a complex and unenviable responsibility. Platforms like Facebook and YouTube are expected to balance the virtues of free expression and newsworthiness with socio-cultural norms and personal desires, as well as the local regulatory regimes of the countries they operate in.

When platforms perform this responsibility poorly (or utterly abdicate it) they pass the task on to others, like the New Zealand internet service providers that blocked access to websites re-distributing the shooter’s footage.

People might reasonably expect platforms like Facebook and YouTube to have thorough controls over what is uploaded to their sites. However, the companies’ huge user bases mean they must balance automated, algorithmic systems for content moderation (like Microsoft’s PhotoDNA and YouTube’s ContentID) with teams of human moderators.
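
To give a rough sense of how automated matching can work, here is a minimal sketch of a perceptual "average hash" check in Python. This is not PhotoDNA’s or ContentID’s actual algorithm (both are proprietary and far more sophisticated); the blocklist, filenames and threshold below are illustrative assumptions only.

```python
# A minimal sketch of perceptual-hash matching, loosely in the spirit of
# systems like PhotoDNA or ContentID. This is NOT either system's real,
# proprietary algorithm -- a simple "average hash" for illustration only.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink and grey-scale the image, then set one bit per pixel
    depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count the bits on which two hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist of hashes of frames from known violating footage.
BLOCKLIST = {average_hash("known_bad_frame.png")}

def matches_known_content(upload_path: str, threshold: int = 5) -> bool:
    """Flag an upload whose hash sits within `threshold` bits of a
    blocklisted hash -- the kind of check a platform might run at upload."""
    h = average_hash(upload_path)
    return any(hamming(h, bad) <= threshold for bad in BLOCKLIST)
```

A small Hamming-distance threshold lets such a check tolerate re-encoding and minor compression artefacts while still matching near-identical copies.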

Read more: A guide for parents and teachers: what to do if your teenager watches violent footage

We know from investigative reporting that the moderation teams at platforms like Facebook and YouTube are tasked with particularly challenging work. Staff turnover appears relatively high, with moderators quickly burnt out by severe workloads as they review the worst content on the internet, supported by only meagre wages and what could be viewed as inadequate mental healthcare.

And while some algorithmic systems can be effective at scale, they can also be subverted by competent users who understand aspects of their methodology. If you’ve ever found a video on YouTube where the colours are distorted, the audio playback is slightly out of sync, or the image is heavily zoomed and cropped, you’ve likely seen someone’s attempt to get around ContentID algorithms.
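
Continuing the illustrative sketch above, it is easy to see why such edits work: a heavy crop-and-zoom changes enough pixels that a naive hash no longer sits near the blocklisted one. The filenames are again hypothetical.

```python
# Mimic the zoom-and-crop trick: keep the central region of the frame
# and scale it back up to the original dimensions.
from PIL import Image

def crop_and_zoom(src: str, dst: str, keep: float = 0.7) -> None:
    """Crop the central `keep` fraction of the image, then resize it
    back to the original size -- a trivial edit for a re-uploader."""
    img = Image.open(src)
    w, h = img.size
    dx, dy = int(w * (1 - keep) / 2), int(h * (1 - keep) / 2)
    img.crop((dx, dy, w - dx, h - dy)).resize((w, h)).save(dst)

crop_and_zoom("known_bad_frame.png", "evading_copy.png")
# With an aggressive enough crop, the Hamming distance between
# average_hash("evading_copy.png") and the blocklisted hash will
# typically exceed the threshold, so the upload check above would
# wave the copy through.
```

Real systems are far more robust to such transformations than this toy, but the arms race is the same in kind: for every invariance the matcher gains, evaders probe for the next one.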

For online platforms, the response to terror attacks is further complicated by the difficult balance they must strike between their desire to protect users from gratuitous or appalling footage and their commitment to informing people who seek news through their platform.

We must also acknowledge the other ways livestreaming features in modern life. It is a lucrative niche entertainment industry, with thousands of innocent users broadcasting hobbies to friends, from board games and video games to mukbang (social eating). It is also important for activists in authoritarian countries, allowing them to share eyewitness footage of crimes and shift power relationships. A blanket ban on livestreaming would prevent much of this activity.

We need a new approach

Facebook and YouTube’s struggles to contain livestreamed hate crimes tell us something important. We need a more open, transparent approach to moderation. Platforms must talk openly about how this work is done, and be prepared to incorporate feedback from governments and society more broadly.

Read more: Christchurch attacks are a stark warning of toxic political environment that allows hate to flourish

A good place to start is the Santa Clara Principles, initially generated at a content moderation conference held in February 2018 and updated in May 2018. These offer a solid foundation for reform, stating:

  1. companies should publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines
  2. companies should provide notice to each user whose content is taken down or account is suspended about the reason for the removal or suspension
  3. companies should provide a meaningful opportunity for timely appeal of any content removal or account suspension.
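
To make those requirements concrete, here is a hypothetical sketch of the per-action record and aggregate counts a platform could keep in order to satisfy all three principles. The field names are invented for illustration and do not reflect any real platform’s schema.

```python
# A hypothetical record of a single moderation action, sketching the data
# the Santa Clara principles ask platforms to track and disclose.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ModerationAction:
    content_id: str              # what was removed or restricted
    user_id: str                 # whose content or account was affected
    action: str                  # e.g. "post_removed", "account_suspended"
    rule_violated: str           # the specific guideline cited (principle 2)
    user_notified: bool = False  # was the user told why (principle 2)
    appeal_open: bool = True     # can the user contest it (principle 3)
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def transparency_counts(actions: list[ModerationAction]) -> dict[str, int]:
    """Aggregate totals per action type for a public report (principle 1)."""
    counts: dict[str, int] = {}
    for a in actions:
        counts[a.action] = counts.get(a.action, 0) + 1
    return counts
```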

A more socially responsible approach to platforms’ roles as moderators of public discourse necessitates a move away from the black-box secrecy platforms are accustomed to — and a move towards more thorough public discussions about content moderation.

In the end, greater transparency may facilitate a less reactive policy landscape, in which both public policy and public opinion are informed by a deeper understanding of the complexities of managing new communications technologies.

Authors: Andrew Quodling, PhD candidate researching governance of social media platforms, Queensland University of Technology

Read more http://theconversation.com/anxieties-over-livestreams-can-help-us-design-better-facebook-and-youtube-content-moderation-113750

hacklink hack forum hacklink film izle hacklink หวยออนไลน์jojobetสล็อตเว็บตรงgamdom girişpadişahbetMostbetbetofficejojobetcarros usadospin updizipalStreameastholiganbet girişpradabetcocktail glassessahabetpusulabet girişcasibomjojobet girişultrabetbetofficeBets10jojobetjojobetholiganbet色情 film izlecasibomnakitbahisgrandpashabet 7027jojobet girişjojobet girişholiganbet girişYakabet1xbet girişjojobetGrandpashabetgobahistrendbetbetofficekingroyaljojobetgiftcardmall/mygiftultrabet girişvaycasinomatadorbetbets10palacebetselçuksportscasibommadridbetbetciosekabetjojobetcasibomJojobetmeritkingcasibomcasibom girişdeneme bonusucryptobetjokerbetcasibomcasibommasterbettingmasterbettingmeritkingSekabetCasibomcasibom girişsekabetDinamobetparmabetVdcasinobetpuanMarsbahistrendbetultrabet girişpaşacasinoselçuksportspaşacasinokingroyalmavibetmeritkingmeritkingmeritkingçanakkale tırnakkalebetrinabetsahabetcasibomcasibomcolor pickerpadişahbetvbetsahabetcolor pickermeritbet girişkralbet girişultrabet girişultrabet girişultrabet girişbetnano girişcratosslot girişคลิปหลุดไทยCasibomcasibomHoliganbetdeneme bonusu veren sitelermeritbetonwindiyarbakır escorttimebetantalya escortgrandbettingjojobet girişmarsbahisbahsegelgrandbettingqueenbetqueenbetbahiscasinobahiscasinoultrabetbets10matbet girişRoyal Reelsroyal reelsnorabahiskolaybet girişKayseri Escortjojobet girişJojobetbetpasNişantaşı EscortmatbetmatbetbettiltStreameastcasibom girişKalebetCasibomfixbetaviator gametimebettimebettimebetbahislionistanbul escort telegrambetparkcasibomcasibomcrown155hb88super96pusulabetoslobetbetplayholiganbetbetparkstreameast한국야동av한글자막สล็อตเว็บตรงpornopadişahbetBetigmacasibomBetigmaBetlora girişgaziantep escortspin2uneoaus96Casibomholiganbetmarsbahismatbetcasibombets10 girişffpokiesholiganbetbest australia online casino 2026best payid casino australiaholiganbetaresbetdeneme bonusu telegramholiganbetmostbetdaftar situs judi slot gacor hb88 indonesiamostbetmostbetteosbetrbetmatbetmalware porn eskortcasinowon girişholiganbetsahabetwww.giftcardmall.com/mygiftjojobetgrandpashabetcasibomcasibomgiftcardmall/mygift