Facebook murder raises questions about social media providing an audience for crime

File photo shows Steve Stephens, the suspect in the murder of a 74-year-old man in Cleveland, Ohio on Sunday, April 16, 2017. Stephens uploaded video of the murder to Facebook. (CNNNewsource)

When Steve Stephens, the suspect in the murder of a 74-year-old man, uploaded the video of his crime to his Facebook account, it was not the first time a crime had been publicized for a viewing audience, nor will it be the last. But the choice of platform certainly allowed it to spread farther and faster than ever before.

Before Facebook removed the content and suspended the suspect's account, the gruesome video had already received millions of views. Social media allows users to share everything, but this latest crime, which follows closely on the heels of other violent crimes broadcast over social media, has raised the question of whether technology companies bear any responsibility for the acts of their users.

Police are currently engaged in a five-state manhunt across Ohio, Pennsylvania, New York, Indiana and Michigan for the murder suspect, 37-year-old Steve Stephens. On Easter Sunday, Stephens uploaded a video showing him shooting an apparently random victim, Robert Godwin Sr.

"Found me somebody I'm about to kill," Stephens said in the minute-long video uploaded to Facebook. "I'm about to kill this guy right here."

Following the shooting, Stephens recorded a Facebook Live video from his car claiming he had already killed 13 people and was looking for more people to kill "until they catch me." Cleveland Police reported that they are not aware of any other victims.

Facebook issued a statement over the weekend denouncing the "horrific crime," stating, "We work hard to keep a safe environment on Facebook, and are in touch with law enforcement in emergencies when there are direct threats to physical safety." The company has also made clear on similar occasions that the type of content posted by Stephens directly violates Facebook's community standards, which prohibit the celebration or glorification of violence.

Aside from its responsibility to remove the content and block the users who violate the platform's terms of use, experts say that Facebook and other social media providers do not bear legal responsibility for crimes committed or broadcast on their platforms.

Sophia Cope is a staff attorney at the Electronic Frontier Foundation (EFF), a non-profit group focused on digital civil liberties. She explained that Facebook is "definitely not responsible for the crime," and that under U.S. law it is unlikely the company will be held liable for allowing the video to be uploaded and shared.

"Like it or not, this kind of violent content, whether it's actual violence or violence in movies ... it's protected by the First Amendment," Cope said. In addition, social media companies are shielded from the potentially limitless liability that could come from user-posted content by the 1996 Communications Decency Act, a law that essentially states that an internet company cannot be held liable for the acts of its users.

Even if there were an attempt to hold Facebook responsible for Stephens' videos, Cope said, the company would likely be able to invoke the Communications Decency Act and avoid any civil liability.

Facebook is one of a handful of social media companies being sued for allegedly providing "material support" to terrorists in connection with the 2016 Pulse nightclub shooting in Orlando. The victims' families have argued that Facebook, Twitter and Google were instrumental to the rise of the Islamic State terrorist group (ISIS) and enabled it to carry out numerous attacks. A similar case against Twitter was dismissed in August 2016 under Section 230 of the Communications Decency Act, with the court finding that the company could not be held responsible for terrorist rhetoric used on the platform.

Daxton R. “Chip” Stewart is an attorney and associate dean at the Texas Christian University school of communication. He also said that Facebook could not be held legally responsible for Stephens' crime under both the Communications Decency Act and the First Amendment.

"I don't blame the technology," Stewart said, adding that Facebook "didn't volunteer" for their platform to be used to broadcast images of murder. "I don't think the problem is the technology, I think the problem is the person," he continued, explaining that if Facebook were to be held liable, then so could the manufacturer of Stephens' camera or smart phone.

Video streaming platforms have certainly made it easier to share important information, and horrific images, with millions of viewers in a matter of seconds. The Islamic State's use of YouTube and other social media platforms for propaganda, including broadcasting beheadings, is perhaps the starkest example.

So far, companies like Twitter, Facebook and Google have stepped up their efforts to enforce community standards and terms of use that bar the use of their platforms to promote violence, terrorism or other criminal behavior. But some people want them to do more.

Facebook has strongly resisted assuming the role of a traditional media company that exercises editorial oversight over every piece of information posted on its site, both for practical reasons and on principle. Under its current policies, Facebook relies on its users to flag content as inappropriate, at which time its content managers review the flag and determine whether the content violates the company's community standards.

Even that practice of removing flagged content landed Facebook at the center of a controversy over violent content. In July 2016, the company temporarily removed a live video posted by a Minnesota woman who was filming her boyfriend, Philando Castile, dying after being shot by police. After reversing the decision and unblocking the content, Facebook CEO Mark Zuckerberg explained the decision to show the images, saying "they shine a light on the fear that millions of members of our community live with every day."

More recently, Facebook Live was the platform of choice for four teenagers in Chicago who broadcast themselves torturing a mentally disabled man earlier this year. The four were later charged with a hate crime.

In just the past five months, the platform has also been used by a handful of teenagers to livestream their suicides.

"I don't know a single major social media company that doesn't have a term of use or term of service that prohibits excessively violent content," Cope said, but EFF carefully watches how companies enforce those terms of service. "Clearly in this case, this is a gruesome situation," she said, and Facebook acted responsibly in removing the footage, but in other cases where the shared media has human rights or journalistic value, EFF encourages the companies to be "judicious" and exercise restraint when blocking content.

As social media sharing features evolve, companies like Facebook and Twitter are likely struggling with how to promote new features like live video streaming while ensuring that the platform is not used to broadcast gruesome acts like the Cleveland murder.

"If you were to hold Facebook liable for the way users use it in a bad way, then they're just not going to make [the features] available. Their only option would be to take down Facebook Live," Stewart said, "because there is no plausible way to screen this 24/7 and put a delay on it the way they would if it were an FCC-regulated broadcast."

At the same time, Stewart acknowledged that there is a connection between the tool and the crime. "Obviously there's something going on where people are seeing the tool and using it as an opportunity to live broadcast their horrible acts."

It is hard enough to get inside the head of a murderer, let alone to understand the motivations of someone who films and shares his crime as Stephens apparently did. According to Raymond Surette, professor of criminal justice at the University of Central Florida, the idea of "performance crime" is not a phenomenon that was born out of the social media age.

"Historically there has always been this idea of performance crime," Surette said. "And clearly these crimes are being committed with an audience in mind."

The viewing audience itself has also existed throughout history, driven to read about crime or watch an act of violence "to see what the dark side is capable of." But in the age of social media, the speed of distribution and the size of the audience are greater than ever before.

"I don't think the technology has changed the psychology of the crime so much, but what it has changed is the reach and distribution of it," Surette noted. That broad reach also fuels the "infotainment" value of the crime, he continued.

Unlike other forms of performance crime that aim to influence social change or provoke a social response, such as terrorism or violent political protest, social media crimes are produced solely for an audience, with the offender seeking affirmation and even expecting positive repercussions, Surette explained.

Forensic psychologist and DeSales University professor Dr. Katherine Ramsland explained that Stephens' motivation is not yet clear, but the choice of broadcast platform indicates he was seeking attention for the act.

"Murder on Facebook seems to be like other forms of public violence," she noted. "He might want to make a big splash and get famous. He uses the medium to make a statement about his anger and degree of threat."

The livestreaming of crimes is likely to run the course of other "fad-like" violent or non-violent phenomena, according to Surette, lasting one to three months before subsiding.

"It will run a cycle, everything bad does," he said. Short of shutting down social media livestreaming and sharing features, the best thing companies can do is get better at flagging and removing content. "The less time it sits out there for consumption, the less social impact it's going to have and the less likely it's going to generate copycats."
