The video of the Christchurch mosque killings portrays the murder of innocent people from the perspective of their killer, who also used it to disseminate his racist motivations and genocidal worldview. The recording was made with that intention - to spread. This, Facebook said, was among the reasons the company couldn't quickly eliminate the footage from its platform, which the killer chose as the medium for his broadcast.

"In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload," Facebook said publicly on March 16. On March 20, the company elaborated on its efforts, explaining that its existing "content matching" systems and artificial intelligence hadn't been able to stop the video's spread because the content itself had morphed so many times. (The company also acknowledged criticism that it should have done a better job.)

Facebook can explain why such a video isn't welcome on its platform, and how it removed it. It can point, as it did, to "coordination by bad actors" who sought to re-share the video with as many people as possible. But its other explanations suggest the company was also thwarted by a much larger and less organized group: the Facebook users behind the rest of that 1.5 million - the people who, as the company said, might have been "filming the broadcasts on TV, capturing videos from websites, filming computer screens with their phones, or just re-sharing a clip they received." People wanted to share this.

Elsewhere online, other platforms were also scrambling. Reddit banned a community called WatchPeopleDie, which had been active for seven years and attracted more than 400,000 subscribers, after some of its volunteer moderators, already under increased scrutiny, refused to take down copies of the Christchurch attack. LiveLeak, a YouTube-style video site, compared the shooting video to the "glossy promo videos for ISIS" and said that it wouldn't "indulge" the shooter by hosting his recording.

"Nothing lasts forever though and - as we did all those years ago - we felt LiveLeak had achieved all that it could and it was time for us to try something new and exciting," LiveLeak co-founder Hayden Hewitt said in a blog post explaining the change. "The world has changed a lot over these last few years, the Internet alongside it, and we as people. I'm sat here now writing this with a mixture of sorrow because LL has been not just a website or business but a way of life for me and many of the guys but also genuine excitement at what's next."

LiveLeak began in 2006 as an offshoot of the early internet shock site Ogrish. Along with other sites of its kind, Ogrish was a place people went when they wanted to see the worst the web had to offer. LiveLeak contained much of the same footage but presented it in a more respectable way: its creators framed it as a place for citizen journalists to post uncensored videos of world events. If you wanted to see footage of the Saddam Hussein execution, you went to LiveLeak. If a friend wanted to show you footage of a drug cartel beheading via chainsaw, they were showing it to you on LiveLeak. If you wanted to see footage of America firing Hellfire missiles at fighters in Afghanistan, you looked to LiveLeak.

As the world got more complicated and more people surged online, Hewitt and others tried to better moderate LiveLeak. After Islamic State posted the video of its beheading of journalist James Foley in 2014, LiveLeak banned Islamic State from posting beheading videos. As YouTube, Facebook, and Twitter removed video of the 2019 Christchurch mosque shooting, LiveLeak continued to host it and faced mounting pressure from the governments of Australia and New Zealand.