His Daughter’s Murder Keeps Circulating on YouTube, So He’s Filing an FTC Complaint

Credit to Author: Samantha Cole | Date: Thu, 20 Feb 2020 17:32:00 +0000

In August 2015, 24-year-old reporter Alison Parker was murdered on live television, along with her cameraman, Adam Ward. Video from the broadcast, as well as footage from a GoPro worn by the gunman, has been shared across YouTube thousands of times by conspiracy theorists and by users who want to draw views with the shocking, gruesome footage.

Today, Alison's father, Andy Parker, filed a complaint with the Federal Trade Commission against Google and YouTube, claiming that the way the platform has handled videos of his daughter's death—and many other videos of brutal shootings and murders—deceives consumers.

Despite clearly violating YouTube's community guidelines, which forbid images of graphic violence and death, many of these videos remain on YouTube for years.

The complaint alleges that YouTube misrepresents how much violent content is on the site, and fails to tell users that the responsibility for getting videos of victims' deaths taken down falls on them—and most often, on the families of the victims themselves. Even when someone goes through the long, painstaking process of finding and reporting one of these videos, YouTube often doesn't remove it. The filing calls these practices "deceptively burdensome" and says the site "utterly fails" to follow through on promises to take down content in clear violation of its own terms of use.

"YouTube claims that it polices its platform for these violent and disturbing videos, when in truth it requires victims and their families to do the policing—reliving their worst moments over and over in order to curb the proliferation of these videos," the filing states. "In Mr. Parker’s case, even videos of his daughter’s murder that were uploaded on the day of her death—nearly five years ago—and have been reported repeatedly since then, remain on the site to this day."

“We specifically prohibit videos that aim to shock with violence, or accuse victims of public violent events of being part of a hoax," Google told Motherboard in a statement. "We rigorously enforce these policies using a combination of machine learning technology and human review and over the last few years, we’ve removed thousands of copies of this video for violating our policies.”

According to the filing, Andy Parker can't bear to watch hundreds or thousands of videos of his daughter's moment of death. YouTube requires flaggers to document specific timestamps noting where in the video the violence happens, along with written descriptions of that violence. A group of volunteers, led by Lenny Pozner, the father of one of the children murdered in the Sandy Hook school shooting, helps scan the platform for these videos and report them to YouTube.

Much like Pornhub's moderation practices, YouTube requires victims of abuse, harassment, and violence to seek out and flag the content depicting the most traumatic moments of their lives in order to try to get it taken down. Also like Pornhub, YouTube removes videos that violate copyright—lest the company get sued for infringement. For grieving parents trying to scrub the site of their children's deaths, the process is far more cumbersome.

This article originally appeared on VICE US.
