Privacy law is currently being shaped and implemented by a new industry of third-party tech vendors. These companies code their own interpretations of privacy laws into the designs of their products, but many are doing so without legal expertise.
Over the last few years, following repeated privacy scandals, a flurry of new privacy and data protection laws has offered consumers hope that their data will be better protected. The European Union’s new General Data Protection Regulation (GDPR) is seen as the most robust privacy law on the books today. California recently enacted a new privacy law based, in part, on the GDPR model. And proposals from Senators Brian Schatz (D-Hawaii), Ron Wyden (D-Ore.), and others raise the prospect of comprehensive privacy legislation at the federal level.
These new and complex privacy laws have opened the door to a new industry: privacy technology vendors, or firms offering software and artificial intelligence (AI) to help companies comply with their privacy law obligations. In 2017, the International Association of Privacy Professionals (IAPP) profiled 51 of these vendors. Its latest report, published last year, includes 192 vendors, nearly a fourfold increase in a single year. Investment in third-party technology vendors was the second most popular strategy for privacy compliance in 2017, during the run-up to GDPR’s effective date. If companies have the budget, they’re in the market for tech vendors.
This phenomenon raises several pressing questions. Who works for these vendors? What roles are they playing in the privacy space? Are lawyers and privacy professionals involved, or are technological tools being designed by engineers in a vacuum? If vendors are helping companies comply with legal rules, how do they ensure that their products honestly and accurately achieve compliance?
These questions matter because third-party vendors are playing increasingly important roles in the implementation of privacy law. Because the privacy technology vendor industry is so new and underexplored, I engaged in primary source research to answer these questions. I attended privacy conferences; conducted semi-structured interviews with vendor representatives and executives; sat for product demonstrations and watched vendor-hosted webinars; surveyed privacy professionals and the vendors themselves; and conducted desk research, reviewing vendor literature to determine how these companies position themselves in the marketplace.
Here’s what I found: Privacy law is being defined, negotiated, and practiced by an army of third-party vendors. Many of them are coding their own versions of privacy law into the designs of tools they claim will help companies comply with the law’s many mandates. But many are doing so without the input of privacy professionals and privacy lawyers. Some of these firms sell themselves as experts and market their tools as easy-to-use silver bullets for complying with specific legal mandates. And many of those tools are under-inclusive and unverified: they focus on the codeable parts of privacy law, often based on an engineer’s interpretation of what the law requires.
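
To make that concrete, here is a minimal sketch of what “coding an interpretation of the law” can look like. Everything in it is hypothetical: the function, the two-item list of lawful bases, and the 30-day retention cutoff are invented for illustration, not drawn from any actual vendor’s product. The point is that each hard-coded constant is an engineer’s answer to a contested legal question, and anything the engineer could not reduce to a rule simply goes unchecked.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical "lawful basis" whitelist. GDPR Article 6 recognizes six
# lawful bases for processing; this imaginary tool codes in only two,
# silently dropping legitimate interests and the rest.
RECOGNIZED_BASES = {"consent", "contract"}

# Hypothetical retention cutoff. GDPR's storage-limitation principle
# (Article 5(1)(e)) sets no fixed period; 30 days is an engineer's
# guess standing in for a contextual legal judgment.
RETENTION_LIMIT = timedelta(days=30)

def is_processing_compliant(lawful_basis: str, collected_at: datetime) -> bool:
    """Return True if this tool's *interpretation* of GDPR is satisfied.

    Note what never gets checked: purpose limitation, data minimization,
    data subject rights, cross-border transfers -- the parts of the law
    that resist being reduced to code.
    """
    has_recognized_basis = lawful_basis.lower() in RECOGNIZED_BASES
    within_retention = datetime.now(timezone.utc) - collected_at <= RETENTION_LIMIT
    return has_recognized_basis and within_retention

# A customer relying on this check is told "compliant" or not based
# solely on two hard-coded judgment calls.
record_collected_at = datetime.now(timezone.utc) - timedelta(days=10)
print(is_processing_compliant("legitimate interests", record_collected_at))  # False
```

A tool like this could truthfully claim to “check GDPR compliance” while examining only the two requirements its designers found codeable, which is exactly the under-inclusiveness problem described above.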
JLINC Labs, for example, is an Oakland-based company with 13 employees, exactly none of whom are lawyers. Yet the company claims its consent management technology “makes it easy to comply with any data-related legislation.” Nymity, a privacy research company, states that its privacy compliance software is “GDPR Ready” and helps organizations “attain, maintain and demonstrate ongoing compliance.” FairWarning, which markets privacy and security solutions to health care providers, claims its program “fully addresses 5 of the Phase 2 HIPAA Audit protocol elements and partially addresses 26 more.” ZL Tech also offers “GDPR-Ready Solutions” and explicitly claims that its tools to identify, minimize, and govern personal data uses will make clients compliant with multiple parts of GDPR. In entering this market for legal compliance technologies, these companies are integrating into their designs particular visions of what the law requires.
And that is concerning for multiple reasons. Privacy compliance technologies are often marketed to privacy professionals through persuasive, though not necessarily honest, advertising narratives. Almost 72 percent of vendors will, at some point, position their products and services as generally achieving GDPR compliance, and yet most are designed to meet only two or three of GDPR’s many requirements, if that. Given uncertainty about the meaning of some GDPR requirements and the associated anxiety about the financial catastrophe of noncompliance, privacy professionals (or the executives they support) may be uniquely susceptible to promises that vendors can make their troubles disappear. That makes leveraging the vendor market particularly risky for privacy professionals, and it makes the market fertile ground for consumer protection law.
Outsourcing privacy compliance to third-party technology vendors will also have an asymmetrical effect on industry. Outsourcing is often cheaper than building something internally: the latter requires in-house technical expertise, large salaries and benefits for new hires, and institutional time and capacity. Larger companies can leverage internal expertise to conduct extensive due diligence, beta testing, and background research on potential vendors. They can leverage superior bargaining power to adapt vendor products to their interests. They can even buy the best products, leaving the rest of the market with inferior choices (or just more expensive ones). And given that these technologies embody legal interpretations, the advantages of size and scale will allow large companies to build structures that frame the law in ways that benefit them, not their competitors or consumers.
Outsourcing legal decisions to engineers also poses a threat to the role of expertise in society. Making legal conclusions without legal expertise, and burying those conclusions in code, not only risks making bad products. It also constitutes a threat to the legal and privacy professions by implicitly characterizing the skills of legal interpretation and implementation as routinizable, irrational, imperfect, or just too human.
Moreover, by shifting the locus at which privacy law is negotiated from those trained in the law to those trained in technology, privacy technology vendors change the discourse of power. They shift the discussion from language we can understand—expectations, access, trust, and creepiness—to the language of computer code, something few understand and few will ever be able to see.
Translating law through code, therefore, undermines due process. Privacy technologies embody particular visions of what privacy laws require, but the process of designing those visions is almost entirely hidden from us. Law is normally characterized by procedural and substantive due process that safeguards its legitimacy. The more we ask “black box” algorithms to implement the law, the more we undermine the project of public governance.
Outsourcing work to technology vendors is not always a bad thing. Third parties with expertise can help do things that companies, especially startups, cannot do in-house. And they can design platforms that give us the information we need to comply with the law. But as soon as that work jumps from facilitating compliance to designing tools that represent particular interpretations of what the law requires, the promises of privacy law may be at risk.
Ari Ezra Waldman is Professor of Law and the Founding Director of the Innovation Center for Law and Technology at New York Law School. In the 2019-2020 academic year, he will be the Microsoft Visiting Professor at Princeton University. He is also an Affiliate Fellow at Yale Law School’s Information Society Project. His book, Privacy As Trust, argues that privacy law should be reoriented around protecting information flow in relationships of trust. This post is based on his article, Privacy Law’s False Promise, forthcoming in the Washington University Law Review. Waldman earned a Ph.D. in sociology from Columbia University, a J.D. from Harvard Law School, and an A.B. from Harvard College.
Disclaimer: The ProMarket blog is dedicated to discussing how competition tends to be subverted by special interests. The posts represent the opinions of their writers, not necessarily those of the University of Chicago, the Booth School of Business, or its faculty. For more information, please visit the ProMarket Blog Policy.