What the Tech? Facebook's live video problem
In the wake of the latest crime posted on Facebook, many people are looking for solutions. Are there any? How does Facebook attempt to keep graphic violence off its live-streaming service, and off Facebook entirely?
Can Facebook do anything to keep graphic images from being posted and shared from Facebook Live? That is the question following Sunday's horrific murder, which was posted to Facebook and then shared around the world.
If you went searching for the video posted by Steve Stephens, you had no problem finding it. The smartphone video shows the Cleveland man saying he had found someone to kill, then shows him crossing the street, raising a handgun at 74-year-old Robert Godwin Sr., and pulling the trigger. Stephens posted the video on his Facebook profile page and also broadcast a Facebook Live video in which he said he was going to kill other people.
Since the incident, with the video being shared across the internet, Facebook has faced a backlash over how videos such as this one can be posted and shared before the social network takes them down. What is Facebook doing now, and what can it do in the future? Anything?
Facebook relies on its users to self-police the network. It has made clear on several occasions that Facebook users should report videos showing violence, bullying, or sexual content to moderators, who will review them and take the necessary action to have them removed. In the Easter Sunday shooting, Facebook says the video of the shooting wasn't reported for nearly two hours after it was posted.
Facebook says the shooting video was uploaded at 2:11 p.m. EDT but not reported until 3:59 p.m. The suspect's account was disabled and all videos removed from public viewing at 4:22 p.m. EDT.
The challenge Facebook faces with graphic videos and posts is three-fold:
#1 Facebook users have to report it. Without those reports, Facebook moderators would not be aware of the videos. Remember, there are nearly 2 billion Facebook accounts to watch.
#2 Facebook users will report just about anything that offends them. That could be posts about politics, religion, or anything else they don't agree with. Those reports are reviewed by moderators like all the rest.
#3 By the time a post is reported and reviewed, it has already been seen by hundreds or thousands of people. The posts are often downloaded, copied, and uploaded again to other websites and social networks.
That was the case in the Easter Sunday shooting: the video has been posted, shared, and re-shared over and over again. I found it on multiple Twitter feeds and on other Facebook accounts.
At Facebook's F8 developer conference, CEO Mark Zuckerberg acknowledged the company has a lot more to do to prevent posts like this one from being spread around the world, giving attention to the criminals posting them.
"We have a lot of work and we will keep doing all we can to prevent tragedies like this from happening," Zuckerberg said.
Justin Osofsky, Facebook's VP of Global Operations, said:
"Artificial intelligence, for example, plays an important part in this work, helping us prevent the videos from being reshared in their entirety. We are also working on improving our review processes," he said.
On Tuesday, police located the suspect, Steve Stephens. After a brief chase, according to police, he stopped his car and killed himself.