“This is a horrific crime and we do not allow this kind of content on Facebook.”
The “content” the Facebook spokesperson was referring to was the apparent killing of 74-year-old grandfather Robert Godwin, shot at close range in Cleveland on Sunday afternoon as he walked home from an Easter meal with his family. Godwin’s suspected attacker, 37-year-old Steve Stephens, filmed a first-person view of the shooting and uploaded it to his Facebook page, where it remained for more than two hours before being taken down – though not before the video had been copied, reposted and viewed millions of times.
The victim’s grandson, Ryan Godwin, begged people on Twitter to stop sharing the footage, saying “that is my grandfather, show some respect”.
The case yet again raises questions about the social networking site’s ability to moderate content, particularly when there is an active crime unfolding.
The incident comes on the eve of Facebook’s F8, an annual event for developers, and at a time when the company is working hard to promote its role as an enabler of civic engagement. Two months ago, CEO Mark Zuckerberg penned a 5,700-word manifesto outlining measures the social network was taking to address several challenges faced by humanity.
Within the letter, Zuckerberg explained that the company is researching systems that use artificial intelligence to look at photos and videos to flag content for review. “This is still very early in development, but we have started to have it look at some content, and it already generates about one-third of all reports to the team that reviews content for our community,” he said.
Facebook did not respond to the Guardian about whether these automated systems played a role in identifying the murder video.
This is far from the first time that Facebook has amplified a crime in real time. Last month a 15-year-old girl was raped by multiple people in Chicago, an attack that was streamed on Facebook Live. In January three men were arrested in relation to a similar incident involving the live-streamed rape of a woman in Sweden. Last year 23-year-old Korryn Gaines used Facebook to broadcast a standoff with police in Baltimore, which ended with Gaines, a mother of one, being shot and killed. Facebook has also hosted videos showing the torture of a young man with disabilities in Chicago, the musings of a spree killer being chased by police, child abuse and now murder.
“There have been beatings, rapes, suicide … other incidents seemed to be building to this,” said Sarah T Roberts, an information studies professor from UCLA who studies large-scale moderation of online platforms.
“The question I have is at what point do we transfer some of the responsibility for these acts to the platform?”
Terrorists, protesters and narcissistic criminals have always used the media to ensure that “performance” crimes make maximum impact. What’s different now is the access people have to tools – via their smartphones – to create, publish and distribute content at the touch of a button. Committing a crime for an audience has never been easier.
“Social media removes the gatekeepers between performance and distribution,” said Raymond Surette, professor of criminal justice at the University of Central Florida, who has studied the phenomenon. “It’s an avenue for certain types of offenders to get their message out totally unedited.”
The attention from online peers, combined with immediate feedback in the form of comments, reactions and shares, can be intoxicating. The fact that the footage is self-incriminating doesn’t matter to some offenders.
“Being famous for being a bad person is more acceptable for some people than being an unknown good person,” said Surette, adding that if the 9/11 terrorists “had the capability to live-stream their hijackings and plane explosions they would have done it”.
Surette doesn’t think there’s much Facebook can do to prevent footage of these crimes from being uploaded (“If you get an obscene phone call you don’t blame the phone company,” he said) but does believe the company has a responsibility to take videos and live streams down as quickly as possible.
“The less time it’s up there, the less likely it’s going to generate a copycat,” he said.
On Monday afternoon Facebook published a blog post outlining a timeline of what happened, highlighting the fact that Stephens posted three videos in total: the first announcing his intent to commit a murder, then a second, two minutes later, of the killing itself, followed by a live stream confessing to the act.
“As a result of this terrible series of events, we are reviewing our reporting flows to be sure people can report videos and other material that violates our standards as easily and quickly as possible,” said Justin Osofsky, vice-president of operations.
Osofsky said that no Facebook users reported the first video and that the company only received a report about the second video more than an hour and 45 minutes after it was posted.
“We disabled the suspect’s account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind. But we know we need to do better.”
In addition to improving reporting flows, Facebook will use artificial intelligence to prevent the same content from being re-shared and pledged to improve its review process.
Beyond Facebook’s responsibility, the fact that footage of the murder has been viewed so widely – with one version of the video seen more than 1.6m times – highlights an ugly side of human nature.
“It’s the same reason people slow down to watch a car crash,” Surette said. “The dark side is an attraction for everybody.”
Of course, not everyone wants to slow down and gape at gore, but the nature of content sharing on Facebook, Twitter and other platforms means the viral murder video – converted in some places into an autoplaying GIF – was foisted into people’s timelines.
“The way this material is often interrupted is because someone like you or me encounters it,” Roberts said. “This means a whole bunch of people saw it and flagged it, contributing their own labour and non-consensual exposure to something horrendous. How are we going to deal with community members who may have seen that and are traumatized today?”
Stephens remains on the run. Cleveland police urged residents of Pennsylvania, New York, Indiana and Michigan to be on alert.
guardian.co.uk © Guardian News & Media Limited 2010