When social media began, nobody would have imagined it would be used to share some of the most shocking content imaginable — videos that make you question the very basis of something as popular as social networking.
We're talking about the recent suicides and murders that have been streamed live on Facebook, the most recent being a father strangling his daughter. It's horrific, it's saddening and it's viral. In such a scenario, does the onus lie on Facebook or on the person committing the crime?
When 24-year-old Arjun Bharadwaj died by suicide this year and streamed it live on Facebook, it was not, as the trend has since shown, going to be the last such incident on social media.
In the United States, Steve Stephens uploaded a video of himself murdering a 74-year-old man to Facebook, and the gruesome footage garnered millions of views. The choice of platform allowed the crime to spread more widely and quickly than ever before. Even though Facebook removed the content and suspended the account, the damage had already been done.
Social networks present us with a perverse paradox. On one hand, they offer us a platform to express ourselves; on the other, they may leave us in a void, seeking attention from those who are already busy sharing their own lives online. Cyberspace can be liberating as well as suffocating. We may lose our inhibitions, or become more conscious of every little thing we do in order to post it online.
The latest in this series of violent crimes streamed using Facebook's LIVE video feature is that of a Thai man killing his 11-month-old daughter before taking his own life. The videos of the child's murder remained accessible on her father's Facebook page for about 24 hours before they were taken down. The first video drew more than 1,12,000 views, while the second drew 2,58,000.
These acts raise the question of whether technology companies bear any responsibility for the acts of their users.
For its part, Facebook has maintained that there is no place for such content on the social network. Chief Executive Mark Zuckerberg said Facebook would do all it could to prevent such content in future, adding, "We work hard to keep a safe environment on Facebook."
The company has also made clear that the type of content posted by such users directly violates Facebook’s community standards that prohibit the celebration or glorification of violence. But is a statement enough?
Facebook certainly has a responsibility to check the content on its platform, and the company says it is working with law enforcement to catch those who abuse its features. However, Facebook's power over its users is, and can only be, limited.
Should Facebook remove the tools these criminals use to draw attention to their acts? Isn't the Facebook community that watches and shares such horrific videos just as responsible? These questions form a growing grey area to which none of the parties involved has an easy or definite answer. The company has said it is reviewing how it monitors violent footage and objectionable material.
But how far is Facebook to be blamed? After the disturbing footage from Thailand, the country's Ministry of Digital Economy said it would not be able to press charges against Facebook, because the site is only the service provider.
It is not just Facebook: Twitter and Google have also been accused of being instrumental in the rise of the Islamic State, and YouTube has been blamed for the terror group's regular use of the platform to broadcast beheadings.
In the US, Section 230 of the Communications Decency Act shields tech companies from civil liability for content posted by their users. Back home in India, Section 79 of the IT Act similarly exempts intermediaries from liability for third-party content.
Facebook has always strongly resisted assuming the role of a traditional media outlet that exercises editorial control over every piece of information. Under its current policy, Facebook lets users flag content they consider inappropriate. The only other option might be to take down the LIVE feature altogether — but that step would take a big toll on the site's popularity and its innovation efforts.
Ray Surette, Professor in the Department of Criminal Justice at the University of Central Florida, says that technology "hasn't changed the psychology of the crime so much; what it has changed is the reach and distribution of it. That broad reach also fuels the 'infotainment' value of the crime."
As social media sharing features evolve, tech companies will have to wrestle with how to promote these features to an ever-growing user base while ensuring their platforms are not used to broadcast gruesome crimes. Cyber laws will have to become clearer and stricter, and users will have to introspect.