
Facebook, Twitter And Christchurch Mosque Shooting Video – How To Stop It Spreading?

Assume that we do in fact want to stop the gunman’s video of the Christchurch mosque shootings in New Zealand from spreading. How do we do that? We’ve now wired the world so that billions of people can send each other video as and when they wish, for nothing. It’s a very good system, a great human achievement. But if people use it in a manner we’d really rather it were not used, what do we do about it?

There isn’t actually a method of doing so without imposing some form of censorship on what may be said – or videoed – by whom and how. Which is indeed a problem because the one thing we really do know is that any form of censorship will, in the end, be taken over by those who really shouldn’t have the power to determine what we may all say – or video, obviously.

“Facebook, YouTube and Twitter struggle to deal with New Zealand shooting video”

There’s that technical problem, sure. Broadcasting the murder of 49 people isn’t quite what we’d like the airwaves to be used for.

“New Zealand Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter’s Facebook and Instagram accounts and the video,” Mia Garlick, Facebook’s director of policy for Australia and New Zealand, said in a statement. Facebook is “removing any praise or support for the crime and the shooter or shooters as soon as we’re aware,” Garlick added.

But there we’re already getting into our theoretical problem, not just our technical one. Well, actually, there’s a technical problem here too, as with Claire Perry and her campaign against porn on the internet. She argued for filters. One of the first sites to be filtered was Claire Perry’s own, because of all the mentions of porn on it. Snigger, yes, but it illustrates that it’s very difficult indeed to filter based upon what people are saying. Filtering on words is easy enough; language is rather more complex than that.

It’s entirely possible, for example, to praise something. It’s equally possible to write about how awful it is that someone is praising that something by saying blah blah blah. The computer only sees the blah blah blah and doesn’t understand the difference between critique, praise and condemnation. Bit of a technical problem, that.
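To make that concrete, here is a minimal, invented sketch of the kind of word-matching filter being described. The blocked-word list and the sample posts are my own illustration, not anything any platform actually uses; the point is simply that matching on words flags endorsement, condemnation and reporting alike.

```python
# Illustrative sketch only: a naive keyword filter of the sort discussed above.
# The word list and example posts are invented for this demonstration.

BLOCKED_WORDS = {"praise", "support", "shooter"}

def naive_filter(post: str) -> bool:
    """Flag a post if it contains any blocked word, regardless of context."""
    words = {w.strip(".,!?\"'").lower() for w in post.split()}
    return bool(words & BLOCKED_WORDS)

posts = [
    "I praise the shooter.",                                          # endorsement
    "It is appalling that anyone would praise the shooter.",          # condemnation
    "News report: platforms remove posts that support the shooter.",  # reporting
]

for post in posts:
    print(naive_filter(post), "->", post)

# All three are flagged: the filter sees the words but not the stance.
```

Which is the problem in miniature: the filter cannot tell the praise from the criticism of the praise, so either it censors both or it censors neither.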

A suspected gunman broadcast live footage on Facebook of the attack on one mosque in the city of Christchurch, mirroring the carnage played out in video games, after publishing a “manifesto” in which he denounced immigrants. The video footage, posted online live as the attack unfolded, appeared to show him driving to one mosque, entering it and shooting randomly at people inside. Worshippers, possibly dead or wounded, lay huddled on the floor, the video showed.

I think we’re agreed we don’t want that broadcast? One part being that we’d prefer not to encourage copycats, another being, well, death deserves a little privacy perhaps? But we’ve still got our theoretical problem here:

While platforms including Twitter and YouTube said they moved fast to remove the content, users reported it was still widely available hours after being first uploaded to the alleged shooter’s Facebook account. The video, which shows a first-person view of the killings in Christchurch, New Zealand, was readily accessible during and after the attack – as was the suspect’s hate-filled manifesto.

You see? That hate-filled manifesto. Sure, I’m not going looking for it and I’d not publish it here if I did. But, well, this is our basic censorship problem. Once we’ve instituted the tools, the systems, to censor, who gets to decide what to censor? What, for example, is the definition of the “hate” that shouldn’t be allowed to be said?

We already have people insisting that it is hatred to think that those with male genitalia should not be allowed to run in nominally women’s races. Actually, there are those who would argue that the use of “nominally” there is hatred which deserves no-platforming. Ta-Nehisi Coates comes dangerously close to insisting that not supporting slavery reparations is hatred. Opposition to abortion is often enough called misogyny, which is itself a form of hatred.

No, I am not saying that those are the same as mass murder. I am saying, though, that banning the propagation of a mass-murder video does indeed mean that there will be agitation, at the least, for the banning of those other perceived hatreds and their propagation. Actually, you’ve only got to look at the no-platforming manuals to see what some would happily censor.

Which is indeed our theoretical problem. I’m entirely happy with the social media networks not propagating this video. I’m not happy with the sorites problem of where they stop. Once we’ve got the censorship tools, what is the proper definition of when they should be used?

Jonathan Harston, in the comments:
People can flap bits of flesh to form air vibrations modulated with information, and other people can detect those air vibrations with thin patches of skin and extract the embedded information from them. Horror! How do we stop people transmitting information that we don’t want them to?
