Russia’s use of social media platforms to influence the 2016 election and the fallout from Facebook’s Cambridge Analytica scandal have triggered calls for regulation of the tech sector, but few concrete proposals have emerged.
Earlier this month, former acting Commerce Secretary Cameron Kerry called for reviving the Obama-era Consumer Privacy Bill of Rights.
Enter Senate Intelligence Committee Vice Chairman Mark Warner (D-VA). While Warner has been a critic of the tech giants’ response to the investigation into Russian activities in 2016, he comes from a tech background, having co-founded the company that ultimately became Nextel Communications. He has been circulating a draft white paper, “Potential Policy Proposals for Regulation of Social Media and Technology Firms,” containing 20 options for regulating web platforms.
The proposals fall into three categories:
Understanding the capacity for communications technologies to promote disinformation that undermines trust in our institutions, democracy, free press, and markets;
Consumer protection in the digital age;
Promoting competition across multiple markets, including digital advertising markets (which support much of the Internet economy), future markets driven by machine-learning and artificial intelligence, and communications technology markets.
Duty to clearly and conspicuously label bots
Warner stresses that bots “play a significant role in the amplification and dissemination of disinformation,” as well as being utilized for promoting scams and financial frauds.
Duty to determine the origin of posts and/or accounts
“Anonymity and pseudo-anonymity on social media platforms have enabled bad actors to assume false identities (and associated locations) allowing them to participate and influence political debate on social media platforms.” Verification, however, presents challenges given the use of VPNs and could also raise privacy concerns.
Duty to identify inauthentic accounts
“A major enabler of disinformation is the ease of creating and maintaining inauthentic accounts (not just bots, but in general, accounts that are based on false identities). Inauthentic accounts not only pose threats to our democratic process (with inauthentic accounts disseminating disinformation or harassing other users), but also undermine the integrity of digital markets (such as digital advertising).”
Warner notes that a law could be crafted imposing an affirmative, ongoing duty on platforms to identify and curtail inauthentic accounts, with an SEC reporting duty and FTC investigatory power, but that could also infringe on anonymous free speech and protected parody accounts.
Make platforms liable for state-law torts for failure to take down deep fake or other manipulated audio/video content
Where a party had demonstrated that a deep fake constituted defamation, invasion of privacy, false light, etc., a platform could be required to ensure that such content was not reuploaded.
Require Public Interest Data Access
Regulators, users, and relevant NGOs lack the ability to identify potential problems (public health/addiction effects, anticompetitive behavior, radicalization) and misuses (scams, targeted disinformation, user-propagated misinformation, harassment) on the platforms because access to data is zealously guarded by the platforms. Warner suggests legislation that would require larger platforms to provide independent, public interest researchers with access to anonymized activity data, at scale, via a secure API. The goal would be to allow researchers to measure and audit social trends on platforms.
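The white paper does not specify how such a secure research API would anonymize activity data. As a minimal sketch (all names and the salting scheme are hypothetical, not from the paper), a platform might replace user identifiers with salted hashes before exposing records to vetted researchers, so activity patterns remain studiable without revealing identities:

```python
import hashlib

SALT = "per-release-secret"  # hypothetical: rotated per data release, never shared

def anonymize(records):
    """Replace real user IDs with salted pseudonyms, keeping activity
    fields intact so researchers can measure trends at scale."""
    out = []
    for r in records:
        pseudonym = hashlib.sha256((SALT + r["user_id"]).encode()).hexdigest()[:16]
        out.append({"user": pseudonym, "action": r["action"], "ts": r["ts"]})
    return out

records = [
    {"user_id": "alice", "action": "share", "ts": 1},
    {"user_id": "alice", "action": "like",  "ts": 2},
    {"user_id": "bob",   "action": "share", "ts": 3},
]
anon = anonymize(records)  # same user maps to same pseudonym; raw IDs removed
```

Consistent pseudonyms preserve per-account behavior (useful for spotting coordinated disinformation), though simple hashing alone would not defeat re-identification at scale, which is the data-security tension Warner flags.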
Require Interagency Task Force for Countering Asymmetric Threats to Democratic Institutions
“Setting up a congressionally-required task force would help bring about a whole-of-government approach to counter asymmetric attacks against our election infrastructure and would reduce gaps that currently exist in tracking and addressing the threat. This typically could be done by the President without legislation; however, President Trump seems unwilling to touch the issue, and as such, Congress could force the issue”.
Disclosure Requirements for Online Political Advertisements
“Because outdated election laws have failed to keep up with evolving technology, online political ads have had very little accountability or transparency, as compared to ads sold on TV, radio, and satellite. Improving disclosure requirements for online political advertisements and requiring online platforms to make all reasonable efforts to ensure that foreign individuals and entities are not purchasing political ads seem like a good first step in bringing more transparency online. The Honest Ads Act (S.1989) is one potential path, but there are other reasonable ways to increase disclosure requirements in this space.”
Public Initiative for Media Literacy
“Addressing the challenge of misinformation and disinformation in the long-term will ultimately need to be tackled by an informed and discerning population of citizens who are both alert to the threat but also armed with the critical thinking skills necessary to protect against malicious influence.” The challenge here, however, is that disinformation is able to succeed because of partisan distrust of media sources.
Increasing Deterrence Against Foreign Manipulation
“The U.S. government needs to do more to strengthen our security against these types of asymmetric threats. We have to admit that our strategies and our resources have not shifted to aggressively address these new threats in cyberspace and on social media that target our democratic institutions.”
Privacy and Data Security
Proposals have been made to impose fiduciary duties on certain online service providers, such as search engines, social networks, ISPs, and cloud computing providers. “Concretely defining what responsibilities a fiduciary relationship entails presents a more difficult challenge. . . . Applying a one-size-fits-all set of fiduciary duties may inhibit the range of services consumers can access, while driving online business models towards more uniform offerings.”
Privacy rulemaking authority at FTC
“Many attribute the FTC’s failure to adequately police data protection and unfair competition in digital markets to its lack of genuine rulemaking authority (which it has lacked since 1980).” Warner warns, however, that the FTC would require significantly more funding to “develop tools necessary to evaluate complex algorithmic systems for unfairness, deception, or competition concerns.”
Warner is skeptical of GDPR-like legislation, due to U.S. companies’ concerns over the scope of GDPR and the scale of its fines, its overbreadth as applied to ICANN’s WHOIS regime, and the need to create a privacy authority to enforce any GDPR-style law.
First Party Consent for Data Collection
While rejecting a GDPR-like approach, Warner suggests considering first party consent for any data collection and use, thereby preventing third parties from collecting or processing a user’s data without the user’s explicit and informed consent.
Warner also ponders establishing standards requiring that algorithms be auditable for efficacy and fairness, suggesting that any algorithmic decision-making product the government buys must satisfy algorithmic auditability standards, which NIST would be delegated to develop.
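Neither the white paper nor NIST has defined what such an audit would measure. As an illustrative sketch (the metric choice is an assumption, not Warner’s proposal), one quantity an auditability standard might require reporting is the gap in favorable-outcome rates between demographic groups:

```python
from collections import defaultdict

def parity_gap(decisions):
    """decisions: list of (group, approved) pairs from an algorithmic system.
    Returns the largest difference in approval rate between any two groups --
    a simple demographic-parity metric an audit might require disclosing."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    rates = {g: approved[g] / totals[g] for g in totals}
    return max(rates.values()) - min(rates.values())

# Group A approved 2 of 3 times, group B approved 1 of 3 times
decisions = [("A", True), ("A", True), ("A", False),
             ("B", True), ("B", False), ("B", False)]
gap = parity_gap(decisions)  # 2/3 - 1/3 = 1/3
```

A procurement standard could then cap the permitted gap or require justification when it exceeds a threshold; real audits would also weigh other, sometimes conflicting, fairness definitions.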
“The opacity of the platforms’ collection and use of personal data serves as a major obstacle to agencies like the FTC addressing competitive (or consumer) harms. This lack of transparency is also an impediment to consumers ‘voting with their wallets’ and moving to competing services that either protect their privacy better or better compensate them for uses of their data.”
Legislation could require companies to more granularly (and continuously) alert consumers to the ways in which their data was being used, counterparties it was being shared with, and (perhaps most importantly) what each user’s data was worth to the platform.
The goal of data portability is to reduce consumer switching costs between digital services (whose efficiency and customization depends on user data). “A data portability requirement would be predicated on a legal recognition that data supplied by (or generated from) users (or user activity) is the users’.” Warner highlights, however, that data portability could trigger data security concerns.
“Imposing an interoperability requirement on dominant platforms to blunt their ability to leverage their dominance over one market or feature into complementary or adjacent markets or products could be a powerful catalyst of competition in digital markets.” Warner notes, however, that this raises cybersecurity concerns as well.
Essential Facilities Determinations
“Certain technologies serve as critical, enabling inputs to wider technology ecosystems, such that control over them can be leveraged by a dominant provider to extract unfair terms from, or otherwise disadvantage third parties.” Warner cites Google Maps as an example and suggests that legislation could define thresholds beyond which certain core functions/platforms/apps would constitute ‘essential facilities’, requiring a platform to provide third-party access on fair, reasonable and non-discriminatory terms.
The full white paper is below.