Privacy and Compliance by Design

In the face of archaic regulatory systems for assuring privacy and secure digital identity, Dr. John H. Clippinger recently released a working draft paper that outlines a dramatically new and more effective regulatory approach. His paper, published below in full, is entitled, “Privacy and Compliance by Design: Proposal For New 21st Century Regulatory Framework to Protect Personal Identity and Data While Assuring Regulatory Compliance with KYC and AML Rules.” (KYC = “Know Your Customer,” and AML = “Anti-Money Laundering.”)

Today’s privacy and compliance practices have their origins in the 1970s, before the Internet, mobile phones, the Internet of Things, machine learning, Big Data, and not least, Bitcoin. It is a very, very different world now. Yet we are still trying to shoehorn today’s rambunctious and evolving digital technologies into yesterday’s analog regulatory containers. It is time for a fundamental reconceptualization and reimplementation of identity, privacy, and banking regulatory processes. In the opinion of this author, there are tremendous gains to be had for both privacy protections and banking compliance, if it is done in an open, incremental and experimental manner.

For the moment, however, privacy and banking regulations seem to be going in opposite directions. Perhaps out of a mistaken notion of irreconcilable differences, they are seen as contradictory policy goals. On the one hand, there is a strong policy consensus in the United States and Europe that all individuals must have a basic “data bill of rights” (individual control, transparency, respect for context, security, access and accuracy, focused collection, accountability) that protects them against the unauthorized retention and use of personal data. Its architects are nothing less than the White House Consumer Data Bill of Rights, NSTIC (the National Strategy for Trusted Identities in Cyberspace), the Department of Commerce Green Paper, and the FTC report, “Protecting Consumer Privacy in an Era of Rapid Change.” In the EU it is the General Data Protection Regulation (notice, purpose, consent, security, disclosure, access). On the other hand, in virtually opposite “corners” are the NSA and the FISA courts’ “Third Party Doctrine,” the Patriot Act, the Bank Secrecy Act, FinCEN regulations, and law enforcement practices and policies for KYC (“Know Your Customer”) and AML (“Anti-Money Laundering”), where there is no tolerance for anonymity and an expectation of an unbridled right to collect as much personal and identifying data as possible.

Nowhere is this contradiction more apparent than the seeming head-on collision of the aspirations of the Bitcoin and Digital Asset community with the apparent needs of the “law enforcement” community. Yet this really need not be the case, for there is a real opportunity to have both greater transparency, accountability and privacy while also having more effective and targeted compliance. However, to do so, both sides would have to give up some cherished notions and look at what is possible with fresh eyes.

The very family of encryption technologies that makes Bitcoin possible also makes it possible to protect privacy. Through encryption, “zero knowledge proofs” and “proof of work/self-signing,” it is possible to collect diverse, verified and fresh data sets that answer law enforcement and banking questions without sacrificing confidentiality or privacy. The challenge for law enforcement, however, is that there has to be “focused collection,” “respect for context,” “data minimization,” “transparency” and “accountability,” and a willingness to accept anonymous but authenticated certificates of identity — and not to insist upon personally identifying information from the outset. What law enforcement would get in return is access to more and better, current and verified data about people and transactions. Some counterparties from emerging economies, for example, would have little traditional documentation, but they could be effectively authenticated and monitored through digital behaviors and signatures. The new approach would also be much more explicit and transparent in how data are collected, encrypted, analyzed and shared. So would the processes for establishing probable cause and issuing warrants for identifying and apprehending suspected persons.
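To make the “commit now, disclose minimally and verifiably later” pattern concrete, here is a minimal sketch using a salted hash commitment. This is deliberately simpler than a true zero-knowledge proof (the attribute value is revealed at disclosure time, not merely proven), and all function names and the example attribute are illustrative, not drawn from any particular system:

```python
import hashlib
import secrets

def commit(attribute: str) -> tuple[str, str]:
    """Create a salted commitment to an attribute value.

    The commitment digest can be published (e.g., attached to an
    anonymous persona) without revealing the attribute itself.
    """
    salt = secrets.token_hex(16)
    digest = hashlib.sha256((salt + attribute).encode()).hexdigest()
    return digest, salt

def verify(commitment: str, salt: str, claimed: str) -> bool:
    """Check a later selective disclosure against the commitment."""
    return hashlib.sha256((salt + claimed).encode()).hexdigest() == commitment

# The user commits to a compliance-relevant attribute up front...
c, s = commit("country=US")
# ...and later discloses it only to an appropriately credentialed party,
# who can verify it matches what was originally committed.
assert verify(c, s, "country=US")
assert not verify(c, s, "country=KP")
```

The design point is that the verifier learns only the single attribute disclosed, and only when disclosure is warranted; production systems would use genuine zero-knowledge or anonymous-credential schemes to avoid revealing even that value.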

The challenge here is to develop a set of principles and best practices for protecting personal privacy, while at the same time enabling regulatory agencies to perform their duties with respect to their KYC and AML obligations. In light of the many exceptions to strict privacy protection, we need to identify and come to a consensus about best practices that are practical, realistic and economically feasible for all parties – individuals, businesses, and regulatory agencies. The point is to have a policy framework and “trust architecture” that assures “privacy and compliance by design.” This means that individuals must be able to know that their personal information will be kept private and confidential and that any breaches of that principle for the purposes of law enforcement or national security will be narrowly tailored and amenable to transparent audits and judicial accountability.

This approach would not have been tenable just five years ago because of technology limitations. Fortunately, encryption and identity technologies as well as computational resources have evolved to the point where it is now feasible to authenticate identities without revealing them. Technologies can now enable computations to verify personal information without necessarily disclosing the person’s identity. Regulators can thereby acquire the minimum requisite data to perform their oversight obligations without compromising individual privacy. But in instances where an inquiry or computation indicates “probable cause” that a crime has been committed, a search warrant can be issued and the underlying identity of a party can be revealed to the agency.

Thus, instead of overly broad collections of data by law enforcement or searches based solely on subpoenas and prior notice, not search warrants, the technological framework can provide the means for narrow, targeted data collection under appropriate judicial procedures that are transparent and auditable. This is possible only because there are now secure, protected processes and computations that enable individuals to protect their personal information and identities while also allowing selective interventions that assure effective KYC and AML compliance.
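One simple mechanism for making such procedures “transparent and auditable” is a tamper-evident, hash-chained log: each oversight event commits to the hash of the entry before it, so any after-the-fact alteration or deletion is detectable by an independent auditor. The following sketch is illustrative only; field names and event types are hypothetical:

```python
import hashlib
import json

def append_entry(log: list, event: dict) -> None:
    """Append an oversight event, chaining it to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    body = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    log.append({"event": event, "prev": prev_hash,
                "hash": hashlib.sha256(body.encode()).hexdigest()})

def audit(log: list) -> bool:
    """Independently recompute the chain; any tampering breaks it."""
    prev_hash = "0" * 64
    for entry in log:
        body = json.dumps({"event": entry["event"], "prev": entry["prev"]},
                          sort_keys=True)
        if entry["prev"] != prev_hash:
            return False
        if entry["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, {"action": "warrant_issued", "case": "example-001"})
append_entry(log, {"action": "identity_revealed", "case": "example-001"})
assert audit(log)
log[0]["event"]["case"] = "tampered"   # any alteration breaks the chain
assert not audit(log)
```

In a real deployment the chain head would be published or anchored externally (much as Bitcoin anchors its ledger), so that the overseeing agency cannot silently rewrite its own history.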

In order to do this, certain principles and practices need to be adhered to. Most of these are consistent with the broad principles of the consumer data bill of rights. Yet these principles are more aspirational than operational at present. Hence, the challenge is in the implementation and actual practices of Trust Frameworks and identity regimes as a form of DAA, or “Distributed Autonomous Authority.” Below are some proposed principles that are also embodied in the architecture of Open Mustard Seed (OMS), which is an example of an open platform to support DACs (Distributed Autonomous Companies), DAOs (Distributed Autonomous Organizations) and DAAs (Distributed Autonomous Authorities) for personal data and digital assets.

1. “Self-Sovereign” Identity and Enrollment. Through their own behaviors and biometrics, individuals should be able to generate a unique core identity “signature” through an open source algorithm that is stored in an encrypted cloud under the control of the individual or a designated third party.

2. Total Mobility of Core Identity Credentials and Personal Data. Every individual has the right to remove his or her core identity and personal data from one storage venue and transfer it to another seamlessly and costlessly. (This is consistent with the principles of the Jericho Forum founded by Chief Security Officers of major global enterprises.)

3. Right of Individual to Have Control and Copy of Their Personal Data. Any individual should be able to discover any third party holding their personal data, access a copy of that data and store it in their own secure personal data store or account.

4. Every Individual Has the Right to Have Multiple Contextual Identities (Personas). All personas are anonymous, but can be linked to their core identities and authenticated, such that the user and only the user knows that all such identities resolve to a single core identity. Contextual identities are defined by policies for enrollment, privacy and enforcement.

5. Data Minimization and Anonymization for Sharing Data. The amount of data shared should be limited to the minimum required to answer a compliance inquiry. To the extent possible, the data should reveal neither the underlying identity of the individual nor the actual value of the data attribute. (Boolean yes/no answers are preferred.)

6. Right of Law Enforcement to Identify Individuals. Appropriately credentialed and verified parties with probable cause and court-issued search warrants should be permitted to penetrate the cover of an authenticated persona to reveal the underlying identity of a suspect.

7. Individual Control Over Data Held by Third Parties. Under the terms of an open contract, individuals should be able to provably retrieve, retract, or eradicate any data held by a third party.

8. Full Transparency of All Oversight, Compliance and Enforcement Activities. The behavior of all regulatory bodies with KYC and AML oversight responsibilities can be reviewed through certified, fully independent audits and analytics of logs, activities and findings.

9. Respect for Context of Personal Data. Identity authentication and data protection and sharing policy rules shall be contingent upon persona attributes and specific, explicit and transparent policies regarding time, location, roles, purpose, and data attribute type.

10. Trust Frameworks as DAAs Must Be Self-Governing, Self-Healing and Self-Deploying. Trust Frameworks used for protecting personal information shall have opt-in, consensual, transparent, accountable and enforceable policies for the sharing, analysis, verification and protection of data. There must also be a transparent and editable failure-correction mode and the ability to automatically recover and redeploy.

11. Trusted Authorities for Digital Assets and Currencies. Trusted authorities can verify and self-sign the issuance, provenance, and risks of digital currencies, securities, and contracts, with access to independent modes of verification. Trusted Authorities have authenticated identities and rights of data issuance, assignment, and management that are also auditable and verifiable.
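The contextual personas of principle 4 can be sketched with keyed derivation: each persona identifier is an HMAC of a context name under the user’s core secret, so personas are stable within a context, unlinkable to outside observers, and provably resolve to one core identity only for the holder of the secret. Key handling here is a deliberate simplification, and the context names are hypothetical:

```python
import hashlib
import hmac
import secrets

# The core secret is held only by the individual (principle 1: the core
# identity stays under the individual's control).
core_secret = secrets.token_bytes(32)

def persona_id(core_secret: bytes, context: str) -> str:
    """Derive a stable, context-specific pseudonym from the core identity."""
    return hmac.new(core_secret, context.encode(), hashlib.sha256).hexdigest()

bank_persona = persona_id(core_secret, "bank")
forum_persona = persona_id(core_secret, "forum")

# Stable within a context, distinct across contexts:
assert bank_persona == persona_id(core_secret, "bank")
assert bank_persona != forum_persona
# Only the holder of core_secret can demonstrate that both personas
# resolve to the same core identity (e.g., by re-deriving them on demand
# for a warrant-bearing authority, per principle 6).
```

Without the core secret, an observer who sees both pseudonyms cannot link them, which is exactly the property principle 4 asks for; disclosure of the linkage remains a deliberate, auditable act by the individual or a court-authorized process.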

Challenge For Financial Regulators

Financial regulators want personal identifying data, and they want all they can get to do their job. Can the technology now available be used to develop new approaches to financial banking and digital exchange compliance — where a new kind of personal or corporate digital asset account contains the requisite data in a secure, verified and encrypted manner to prevent breaches and abuses, while at the same time providing more current and comprehensive ways of verifying names, dates, addresses, activities and indices of suspicious activity without violating rights? We believe so, and that there should be “safe harbor” trusted experiments to explore feasibility, cost and effectiveness, in much the same way that NIST, through NSTIC, proposed safe-harbor trust frameworks for personal data. In this case Treasury and others would specify their performance requirements (performance-based regulations), and these would be balanced against the new consumer data bill of rights requirements to see how both goals can be achieved.