Webster's Webwatch

Livestream problems

June 2019

The appalling murders in New Zealand last March were made even more dreadful by the killer apparently wearing a camera and broadcasting live on his Facebook page.

That’s unspeakable enough, but the aftermath was awful, too; even though Facebook removed the video within minutes, some 4,000 people saw it, and within 24 hours Facebook said it had already removed or blocked an astonishing 1.5 million copies; it has no doubt removed millions more since then.

I think we should understand how this sort of mass replication happens, often called ‘going viral’, and how hard it is to stop.   

The gunman clearly knew his way around the online world.   Very shortly before the attack began, a new post appeared on a little-known online political message board, probably from him, announcing the livestream and including links to it.  This would have alerted some people with similar political views, who tend to look at that site.

After the livestream started, the man shouted ‘PewDiePie’, the name of the most popular YouTube channel, with over 90m subscribers; it’s an innocent affair, concerned with computer games and other puerile pursuits, but we can assume that the name was taken in vain simply to provoke a big online reaction.  It did: the owner said on Twitter (17m followers) that he was ‘absolutely sickened’ to have been mentioned, which, of course, added to the publicity.

Whilst the original video was online, it was copied, forwarded and re-posted by many people on different platforms, and those broadcasts were themselves re-posted by many more, and then again, and so on. It’s probably still happening.
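
To get a feel for the speed of it, here is a toy sum (the figures are mine, purely for illustration; the platforms have published nothing of the kind): if every copy of a video is re-posted by just ten viewers, a single copy becomes a million copies after only six rounds of sharing.

# A toy illustration of 'going viral' (invented figures, purely illustrative):
# assume every copy of the video is re-posted by ten viewers at each round.
copies = 1
for round_number in range(1, 7):
    copies *= 10
    print(f"after round {round_number}: {copies:,} copies in circulation")
# after round 6: 1,000,000 copies in circulation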

Facebook, Twitter and Instagram have some automatic techniques for spotting this sort of thing, but they really rely on human users to report sightings.  And, of course, the video was also being posted on many sites that have nothing to do with Facebook and have no public reputation to maintain.

As well as re-broadcasting the original video, many people filmed it using their phones and then re-broadcast those videos; this makes it even harder to spot automatically.
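
To picture why a re-filmed copy is so much harder to catch, here is a simplified sketch in Python (my own illustration, not how Facebook’s systems actually work): an exact re-upload has the same digital ‘fingerprint’ as the banned file and can be blocked automatically, but a video filmed off a screen with a phone is a completely different file, so its fingerprint no longer matches. The platforms do use cleverer ‘fuzzy’ matching than this, but even that struggles with re-filmed copies.

import hashlib

def fingerprint(video_bytes: bytes) -> str:
    # A crude "fingerprint": the SHA-256 hash of the file's raw bytes.
    return hashlib.sha256(video_bytes).hexdigest()

original        = b"...original video data..."   # the banned video
exact_reupload  = b"...original video data..."   # byte-for-byte copy
phone_recording = b"...filmed off a screen..."   # different bytes entirely

print(fingerprint(original) == fingerprint(exact_reupload))   # True  -> easy to block
print(fingerprint(original) == fingerprint(phone_recording))  # False -> slips through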

So, you see how this sort of thing can spread, and how hard it is for the platforms to prevent it.  Most of the time they don’t know what’s on their site any sooner than the rest of us do, and once the cat is out of the bag, it stays out.

Their choice is stark: they either allow people to post whatever they want, or they don’t.  That sounds simple enough, but the moral and financial considerations are directly opposed. If you do allow free publication, the users are both happy and numerous and the site is attractive to advertisers.  The cost of running Facebook this year will be over $60bn, so it will have to sell a lot of advertising to cover that and make a profit on top.

On the other hand, if they don’t allow people to post what they want, they’ll have to check every entry before it is published.  However, Facebook has over two billion users; most of them post regularly and expect instant publication. Imagine how many editors Facebook would need, 24 hours a day, to monitor them all.  Even if that were possible, most users would not welcome even a slight delay and would quickly move elsewhere, taking the advertising revenue with them.
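
A rough back-of-envelope sum shows the scale (every figure here is my own assumption, for illustration only): if two billion users each posted just once a day, and a moderator could check a post every ten seconds, you would still need roughly 700,000 eight-hour moderator shifts every single day.

# Back-of-envelope moderation sum (all figures are illustrative assumptions).
users = 2_000_000_000          # active users
posts_per_user_per_day = 1     # assume one post per user per day
seconds_per_review = 10        # assume ten seconds to check each post
shift_seconds = 8 * 3600       # one moderator working an eight-hour shift

posts_per_day = users * posts_per_user_per_day
review_seconds = posts_per_day * seconds_per_review
shifts_needed = review_seconds / shift_seconds
print(f"{shifts_needed:,.0f} eight-hour moderator shifts needed every day")
# roughly 694,444 shifts a day, before anyone sleeps or takes a holiday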

It’s a problem that strikes at the heart of any social media site’s raison d'être.  Mind you, they don’t always help themselves.  Several days after the event, I found that if I typed just ‘New Zealand’ into Twitter or Google, the suggested search terms would include ‘New Zealand shootings full video’. You really would have thought that they could at least have stopped that happening.

 

A few links...

Here is some more, rather depressing, reading on the subject.

Facebook’s response to the shootings:
https://newsroom.fb.com/news/2019/03/update-on-new-zealand/

Global Internet Forum to Counter Terrorism
https://www.gifct.org/
This industry-funded body exists to try to substantially disrupt terrorists' ability to promote terrorism or glorify real-world acts of violence online.

YouTube’s policies on harmful or dangerous content:
https://support.google.com/youtube/answer/2801964?hl=en
Breach these three times, and your account is closed.

Twitter’s policy on ‘hateful conduct’:
https://help.twitter.com/en/rules-and-policies/hateful-conduct-policy

Facebook’s policy on objectionable content:
https://www.facebook.com/communitystandards/objectionable_content

 
