Honolulu Star-Advertiser

Fighting conspiracies, Sandy Hook parent thwarted by online policies

NEW YORK TIMES

A memorial outside the funeral service for Noah Pozner, 6, who was killed in the shooting at the Sandy Hook Elementary School in Newtown, Conn., in 2012. Leonard Pozner, the father of Noah, has tried to erase lies about his son from the internet, but Automattic, which runs WordPress, says “untrue content is not banned.”

Leonard Pozner says he spends hours every day trying to erase online conspiracy theories that the death of his 6-year-old son Noah at the Sandy Hook Elementary School was a hoax.

He has taken Alex Jones of Infowars, by far the most visible Sandy Hook denier, to court. He has put pressure on major tech companies to take action against the conspiracy theorists who flourish on their platforms.

But the bulk of his work is more methodical. Sandy Hook conspiracies are strewn across the internet on various platforms, each with its own opaque rules and reporting mechanisms. So Pozner has studiously flagged countless videos and posts for a wide variety of offenses — invasions of privacy, threats and harassment, and copyright infringement — prompting Facebook, Amazon and Google to remove false material about his son.

Twitter has been less receptive to his claims, and some smaller sites have simply not responded at all. But one company, Pozner says, has actively pushed back against his attempts.

WordPress, one of the internet’s biggest blogging platforms, is operated by a company called Automattic, which also runs a wide array of smaller sites and internet services. Sandy Hook conspiracy theorists have been able to remain on WordPress thanks, in part, to policies put in place to resist previous campaigns to get content removed from its service, particularly through the strategic use of copyright claims.

“Posting conspiracy theories or untrue content is not banned from WordPress.com, and unfortunately this is one of those situations,” Automattic said in a statement. “It is a truly awful situation, and we are sympathetic to the Pozner family.”

Last week, Apple, Facebook and Google’s YouTube removed videos and podcasts by Jones and Infowars, the conspiracy site he created, from their platforms. Facebook, after fielding criticism about its decision, wrote a blog post about its commitment to free expression and the difficult questions it faces in allowing “baseless conspiracy theories” and other offensive material on its sites. Twitter, like WordPress, has allowed the content to remain.

These debates have put tech companies into a sort of existential crisis. But for Pozner and others like him, the arguments have long been much more personal, as they struggle with images of family members being repurposed in horrifying new ways and experience harassment themselves because of misinformation online.

“The only items that concern me is when his image is being used in a negative, ugly way — denying the tragedy, calling him a crisis actor and everything else that the typical global village idiot on the net does,” Pozner said.

In the absence of uniform online policies about hoaxes, Pozner’s most effective tool has been filing copyright claims on images of Noah. He has filed such claims with Automattic about photos of Noah appearing on posts that labeled him a “crisis actor” who had been spotted in Pakistan after Sandy Hook and others that claimed he was a “fiction” and that photos of him were created using images of his older half brother.

Automattic has repeatedly responded to Pozner with form letters saying “because we believe this to be fair use of the material, we will not be removing it at this time.” The letters explain that fair use could include “criticism, comment, news reporting, teaching, scholarship, and research.” They also warn that the company could collect damages from people who “knowingly materially misrepresent” copyrights.

“The responses from their support people are very automated, very generic, very cold and there’s just no getting through to them,” Pozner said.

“They have taken this incorrect interpretation of freedom of speech to an extreme,” he added. “The only thing WordPress has taken out — and where I’ve been successful — is if someone posts personal information like my driver’s license or address.”

Automattic said the responses Pozner received were “a predefined statement” that is used in copyright situations. “We regret that it was used in this situation,” the company said. “We offer our apologies to the family for the response we gave to them.”

Pozner’s complaints appear to have been thwarted in part by longtime policies at Automattic intended to prevent the use of copyright claims to censor criticism and journalism on its platform. The responses sent to Pozner included a link to a post from 2013 describing the company’s efforts to deal with spurious but effective copyright claims. The post also highlighted that the company had filed suit against two particularly egregious offenders in an effort to “fight back” on behalf of people who were posting material on the platform.

Online platforms are not held liable for copyright infringement claims against people who use their platforms as long as they remove or block access to content in response to the claims. This is crucial to the function of any website where people can post content, and internet companies have traditionally tended to err on the side of removal, even when claims may be dubious. This has created opportunities for abuse, and Automattic has made fighting that a corporate cause.

The company created a “Hall of Shame” to call out businesses and people filing notices for frivolous reasons or to tamp down negative news coverage.

For years, Automattic’s strident response to copyright abuse earned praise from digital rights advocates. Now, this approach has effectively lumped Pozner in with the abusers. “Strictly from a copyright perspective, WordPress’ response is outside the norm,” said Tom Rubin, a lecturer at Stanford Law School who oversaw Microsoft’s copyright group and takedown process for 15 years.

“They avoid getting involved because fair-use determinations are notoriously complex and fact specific,” Rubin said of online platforms. “Platforms would rather eliminate their own potential liability by taking the content down and leaving it to the parties to battle amongst themselves in court.”

Matt Mullenweg, chief executive of Automattic, suggested in a recent interview with Recode that the company was confronting misinformation. “For things that we host and run and provide our kind of company backing to, implicitly through hosting it, we do avoid hate speech,” he said. He added that “egregiously fake or harmful things — we’re pretty good at getting off the system.”

In the case of Pozner, however, Automattic suggested that its approach was imperfect. “While our policies have many benefits to free expression for those who use our platform, our system, like many others that operate at large scale, is not ideal for getting to the deeper context of a given request,” the company said in a statement.

Although the posts reported by Pozner “are not violating any current user guidelines, or copyright law,” the company said, “the pain that the family has suffered is very real and if tied to the contents of sites we host, we want to have policies to address that.”

Pozner, who has created a nonprofit group called the Honr Network devoted to “stopping the continual and intentional torment of victims” of major tragedies like Sandy Hook, has become an expert on the many compliance procedures and content-governing bureaucracies that exist inside tech companies.

He has removed photos of Noah from Facebook by relying on policies that protect the privacy of children under 13, a process that has required him to send the company his driver’s license and a copy of his son’s birth certificate. Pozner has also successfully filed such reports with Google.

“You can’t even measure the volume of content I’ve taken down at this point,” Pozner said.

At times, he has been able to explain to platforms the abuse he and his family have received, some of it in retaliation for his efforts to purge Sandy Hook conspiracies from the internet, and to seek removals based on a slowly evolving awareness of the issue within the tech community.

(In June last year, a 57-year-old woman in Florida was sentenced to five months in prison for making death threats against Pozner and his family.)

A report to Vimeo led to a response Friday from a representative who said he would assign the case to a specialist, but first told Pozner that he was sorry to hear about his situation.

“Everyone has gotten better this year, especially with all the work that I’ve done to shame a lot of these platforms for continuing to abuse us and the memory of our children and just all of the ugliness that goes on,” Pozner said. “If you type in Noah Pozner now into an image search on Google, you’ll see it’s mostly normal results but it used to be 99 percent hateful angry memes, so the cleanup is huge.”

Pozner said he was tired of hearing technology companies say they do not want to be “arbiters of truth,” an oft-repeated refrain, particularly as concerns around misinformation on social media grow.

“Technology platforms have had this misguided, futuristic vision of freedom of speech and everything was built around that, but it doesn’t really fit into the day-to-day use of it,” Pozner said. “By not taking action, they have made a choice. They are the arbiters of truth by doing nothing.”

© 2018 The New York Times Company
