Andy Parker, The Independent
On 21 February 2023, the US Supreme Court will hear the landmark case Gonzalez v Google. The family of Nohemi Gonzalez, a California college student who was killed in a terrorist attack in Paris in 2015, sued Google-owned YouTube for promoting ISIS videos via its algorithms. At issue is whether the targeted recommendations YouTube’s algorithm makes to users — in other words, suggesting the next video to watch — are shielded by Section 230 of the Communications Decency Act.
I have firsthand experience of how the algorithmic amplification employed by YouTube and Facebook harms people. On the morning of 26 August 2015, my daughter, journalist Alison Parker, was conducting a live interview in Moneta, Virginia, when a disgruntled former reporter wearing a GoPro camera shot and killed her and her cameraman, Adam Ward. The shooter posted the footage online; it showed him approaching his victims, firing at least eight shots, and then chasing Alison down as she tried to flee.
I have never seen the video and never will. At the time, I had a public YouTube account containing videos from my previous work as a professional actor. In the hours, days, weeks and years following my daughter’s murder, I was inundated with threatening and distressing messages from conspiracy theorists and hoaxers on the site. They then began posting the raw live footage of Alison’s murder on YouTube, Facebook, and Instagram for sadistic entertainment, claiming the shooting was staged and accusing me of being a paid actor pretending to be Alison’s father.
For years, these videos of Alison’s dying moments have proliferated on social media platforms. Not only are they edited to increase their shock value; worse, the platforms feed that sick appetite by recommending additional hoax videos of Alison’s murder. It seemed that the more a user watched, the more these platforms “recommended”. By graphically depicting people being murdered and capitalising on their final moments for shock value and entertainment, these platforms violate their own terms of service, which proclaim that “violent content is not allowed” and promise that their sites are policed for violent and disturbing videos.
Instead, victims and their families are left to do the policing as well, reliving their worst moments over and over in order to curb the proliferation of these videos. Even when victims follow this burdensome reporting process, the platforms fail to adhere to their own requirement that the videos be removed.
Eric Feinberg of the Coalition For a Safer Web, who has flagged hundreds of videos and shared the links with the press and lawmakers, says videos of Alison’s murder uploaded on the day of her death remained on the sites for years despite repeated reports. It is indisputable that these platforms have the capacity to police themselves effectively, but they refuse to do so in pursuit of financial gain. Mr Feinberg told The Independent that Section 230 “protects Big Tech while not allowing consumers to litigate in an industry that can’t be sued”.
That is why videos of Alison’s murder are just a drop in the bucket. Countless others on these platforms depict individuals’ moments of death, advance hoaxes and incite harassment of victims’ families. These videos include raw footage of shootings, dead bodies, people preparing to kill themselves, stabbings, and dangerous conspiracy hoaxes. These victims were loved in life and don’t deserve to be exploited in death by Google and Facebook.
With the help of the Georgetown University Law Center, I filed complaints with the FTC against Google in 2020 and Meta in 2021 for violating their own terms of service. Those complaints were echoed in the congressional testimony of Frances Haugen, the Facebook whistleblower. Sadly, the FTC has failed to respond to either valid complaint.
Beyond holding endless hearings that exposed the alleged misconduct Haugen described, an impotent Congress has failed to act, presumably paralysed by ubiquitous and powerful tech lobbyists. In one of those endless hearings, I testified before the Senate Judiciary Committee in the summer of 2019. At the end of my testimony, I made a plea to the members to give me my day in court to hold Google accountable. Senator Ted Cruz said that if, back in the day, Blockbuster had sent me the video of Alison’s murder, I could have sued them for everything they owned. My response was: “Help me be able to do that, senator.”