When policymakers want to understand how political ad targeting affects elections, they look to academic researchers, and those researchers look to one of the most important platforms that sells these ads: Facebook.
For Princeton researchers including Orestis Papakyriakopoulos, a Ph.D. at the University’s Center for Information Technology Policy, the key sticking point was a contract Facebook requires research institutions to sign prior to accessing its data. In particular, he and others on his digital tech policy research team were concerned that agreeing to the contract would give Facebook the right to remove information from their research findings had they actually gone through with the project.
“It doesn’t make sense for us to do research for six months and then not be able to publish it,” Papakyriakopoulos told Digiday.
The Princeton researchers and the school’s lawyers were concerned that, if the research findings revealed how Facebook’s ad targeting technology and tools worked or how the company’s system determined ad prices, the contract would give the company the right to remove those findings from the research prior to publication. “We sought to clarify whether Facebook would assert that information about how the Facebook advertising platform was used to target political ads in the 2020 elections is ‘Confidential Information’ that the agreement would allow them to ‘remove’ from our publication,” the research team wrote in an August 5 post published on the center’s blog.
The contract Facebook requires researchers to sign to access data via its Facebook Open Research and Transparency platform, or FORT, states that research findings resulting from analysis “may not reveal any Confidential Information or any Personal Data” and gives Facebook the opportunity to review publication drafts “to identify any Confidential Information or any Personal Data that may be included or revealed in these materials and which need to be removed prior to publication or disclosure.” According to the contract, Confidential Information includes information relating to Facebook’s products and technology, its data processing systems, policies and platforms, in addition to personal information pertaining to its users or business partners.
“The questions these researchers ask and conclusions they draw are not restricted by Facebook,” a Facebook spokesperson told Digiday regarding the Princeton researchers. “We simply ask academics to sign a research data agreement to make sure that no personal data or confidential information is shared. Today, hundreds of researchers at more than 100 universities have signed the agreement.”
The company said it does not approve or reject research papers. “As of now, we have not rejected any research papers as part of our standard review process to make sure that no personal data or confidential information is included,” said the Facebook spokesperson.
Facebook’s FORT data platform is an example of the company’s increasingly restrictive approach to engaging academic researchers in an environment that’s drastically changed since the 2016 Cambridge Analytica political ad targeting scandal, which involved the use of Facebook data, originally scraped for academic research, for psychographic ad targeting. Critics often refer to tech companies’ justifications for academic research data limits as “privacy-washing.”
Poking holes in Facebook’s FTC defense
It appears that Facebook’s justification for why it would not negotiate the contract with the Princeton researchers employed an argument that has since been debunked by the FTC. Facebook told the researchers its contract was non-negotiable because the stipulations therein were mandated by Facebook’s 2019 settlement with the agency involving user privacy violations. “We pushed back on this ‘take-it-or-leave-it’ approach,” wrote the researchers, who added, “Facebook later conceded in a subsequent email that they were under no legal mandate and that their approach was merely based on their internal business justification.”
Facebook’s contention that its agreement with the FTC prohibits negotiations for data access by academic researchers came to the fore on Aug. 3, when the company’s product management director Mike Clark wrote that the FTC Order was justification for Facebook’s decision to disable accounts and apps associated with NYU’s Ad Observatory Project, a political ad targeting research effort that had already been under threat of shutdown by Facebook since October 2020.
“We took these actions to stop unauthorized scraping and protect people’s privacy in line with our privacy program under the FTC Order,” he wrote, noting that the NYU project’s “ongoing and continued violations of protections against scraping cannot be ignored and should be remediated.” Clark said the NYU researchers should have tapped its sanctioned FORT data instead.
In response to the Facebook post, the acting director of the FTC’s Bureau of Consumer Protection, Samuel Levine, wrote, in a letter to Facebook CEO Mark Zuckerberg published on the agency’s site, that its agreement with Facebook “does not bar Facebook from creating exceptions for good-faith research in the public interest.” Notably, he added, “the FTC supports efforts to shed light on opaque business practices, especially around surveillance-based advertising.”
But even back in May, Laura Edelson, an NYU Ph.D. candidate working on that now-shuttered NYU project, told Digiday the FORT data wasn’t of interest because there were restrictions on the level of ad targeting information Facebook made available in the data set. Facebook said these limits were “one of several steps we have taken to protect users’ privacy.”