On September 28, 2021, Laura Edelson, co-leader of the Cybersecurity for Democracy project at NYU’s Center for Cybersecurity, testified before a subcommittee of the U.S. House Science, Space, and Technology Committee. Edelson, who is also a Ph.D. candidate in computer science at NYU Tandon, was part of a panel providing input on “The Disinformation Black Box: Researching Social Media Data.”
Here are a few excerpts from Edelson’s opening statement:
“As cybersecurity and privacy researchers, my colleagues and I at NYU study systemic vulnerabilities in online platforms that expose people to misleading and outright false claims — from fake Covid-19 cures, to voting disinformation, to investment scams. I believe we will look back and see this moment in history as a turning point, when we stepped up as a society and took action to address what is now a pervasive problem.”
“Revelations from investigations by the Wall Street Journal and The New York Times over the past week show that Facebook has internal research demonstrating specific harms and differential treatment on its platforms, and has considered certain mitigation strategies, only to discard them. These findings, which have now been made public, echo the conclusions of both my work and the work of other independent researchers: Facebook amplifies misinformation and extreme content. Facebook’s most influential and powerful users are often the ones spreading the most far-reaching misinformation because Facebook doesn’t apply its own policies and rules evenly. Facebook can be harmful to the mental health of its users.”
How do we address the problem?
In her remarks, Edelson called on Congress to take three actions to address this issue:
“First, Congress should pass a law requiring Universal Digital Ad Transparency now. The biggest digital ad platforms should be required to make all the ads they run publicly available in a machine-readable format. Along with nearly a dozen researchers, I called for universal digital ad transparency last year. We will soon be publishing a draft proposal that spells out the technical specifications needed in detail.
Second, I believe that a researcher safe harbor law would help protect the many researchers who engage in direct collection of data from platforms. The passage of this law would not directly give researchers access to data, but it would clarify the legality of a great deal of work that currently exists in limbo.
Third, platforms should be required to make public data available to the public: that is, public content with meaningful reach or content from public figures with meaningful audiences should be made available to researchers via tools or searchable interfaces that are accessible to researchers and journalists for analysis. Posting on public pages is analogous to slapping up a notice on a town bulletin board, or writing a letter to the editor. The intended audience is: everybody. Researchers should be able to collect this information for analysis.”