Towards a Post-Industrial Networked Democracy: A Decentralized Data Commons for the Exchange of Tokens of Trust and Value

Preamble:

This paper was initiated prior to the United States general election of 2016 and did not anticipate the election of Donald Trump as 45th President of the United States. Hence, its critique of the Nation State, industrial democracy, and free market capitalism did not seem as immediate and urgent then as it does now. Trump's election, his advocacy of a radical nativism and authoritarian policies, his outright rejection of climate change, and his control over all bodies of government (executive, legislative, and judicial) make finding alternative democratic structures and modes of governance all the more tangible and immediate.

The proposals discussed in this white paper are all the more relevant to the States that did not vote for Trump, which will be highly resistant to efforts by the Federal government to impose Trump's regressive agenda on them. In a paradoxical reversal of traditional positions, the decentralization, simplification, and localization of governmental functions advocated here could be highly pertinent to the reinvention of the Democratic Party as an advocate of States' and local rights through a new kind of networked digital democracy. State and city governments will increasingly become the locus of innovations in governance and act as a counter to the threatened concentration and exercise of Federal powers to diminish the freedoms, health, and welfare of their citizens and further degrade the environment.

Overview

We are at the juncture of two accelerating social megatrends. The first is the systemic failure of industrial democracies worldwide as trusted institutions. The second is the progressive virtualization of social processes and institutional functions. Taken together, these megatrends are shaping a highly decentralized world order in which semi-autonomous digital processes augment traditional democratic and economic processes. These megatrends are part of a larger pattern of machine-human co-evolution that is inherent, inevitable, and irreversible.

These technological and societal changes need not be dire nor "existentially threatening," for there is the real prospect of resolving many seemingly intractable problems in the effective, fair, and accountable governance of human affairs. Yet if such socio-technological innovations are to succeed, they will not be the product of happenstance, but rather the consequence of open, scientifically directed, self-reflective, and self-corrective processes. Were such efforts to succeed, they would represent an unprecedented, if unexpected, positive advance in human social and economic attainment.

Democratic Design Within The Digital Dimension:

The virtualization of democratic governance and authoritative functions offers a unique opportunity to move beyond the rhetorical and aspirational assertions of "basic human rights" and "good government" to living socio-economic artifacts that actually embody and perform to measurable standards of fairness and effectiveness. The coevolution of human-machine organizations need not be a zero-sum game of man against machine, but a benign compact that is robust, resilient, and generative of both biological and digital life.

In this sense, democracy, to take Thomas Jefferson at his word, becomes a continuous, living social experiment whose reinvention need not require guns nor revolution, but an open adherence to testable experiments in different forms of self-governance. There can be no originalism nor literalism here, as the intent ab initio is to continuously test, learn, and improve. This "democratic governance protocol" is not fixed in time nor space, but is to be relentlessly tested and improved against a constantly evolving set of criteria that reflect the will and circumstances of those who are self-governing. If there must be an inalterable dictum of meta-principles, it is always to be an open system with the capacity for feedback, self-correction, and self-invention.

A New Open Sector:

Incumbents are naturally resistant to fundamental change. When basic innovations challenge their core interests, modes of operation, revenue, or power, they naturally try to insulate themselves. What ensues is a kind of "Innovation Theater," whereby incumbents (banks, governments, media companies, and universities) launch their own accelerators, incubators, and investment funds, and anoint "Chief Innovation Officers" to perform visible "acts of innovation." Yet such innovation theater has produced few "hits" in recent decades, and these endeavors appear more as acts of inoculation than innovation. The sad history of newspapers' attempts to shape social media and search portals in their own image is a case in point, and it is being repeated today in the financial and banking sectors. Likewise, most governments over time prove incapable of reforming or cleaning themselves up from within. Short of "sweeping" elections or overt revolution, incumbent interests tend over time to weaken or weed out those checks and balances that diminish their influence. And though elections are intended to uproot such entrenched interests, they too can be captured over time and rendered empty rituals of accountability.

Since it is impossible for any public or private organization to innovate itself into a wholly new business or organizational model, transformative innovation requires the freedom to pursue the full value creation of an innovation without regard to its effect upon incumbent interests. When those incumbent interests oversee the process, it is not surprising that the sharp edge of the new gets blunted to accommodate the safe comfort of the familiar.

To move beyond the confines of legacy institutions and the often polarizing interests of the private and public sectors, it is not enough to form public-private partnerships, as they too can become no more than a common denominator of the two opposing interests. Rather, transformative innovation requires getting out from under the thumbs of legacy interests and having sufficient freedom to undertake genuine, fundamental, and evidence-based experimentation. Such efforts have to be ideologically and politically agnostic as to what the "proper" democratic processes or institutions are.

An Open Sector for Distributed Data Commons And Emergent Democratic Processes

The Open Sector is neither subordinate to the private sector nor the public sector, but is free to work with both to independently invent, discover, test, and deploy new forms of governance and authority. It is much more than a platform for "open data" experimentation and analysis, for it is an open and free co-invention space where multiple stakeholders can jointly create and test new ways of generating and allocating value to challenge legacy economic, political, and cultural interests. Like open source software projects such as Linux and Apache, it would be an opt-in, self-governing network of contributors in which different stakeholders agree to assert, test, and revise a set of protocols for achieving more effective, equitable, accountable, and adaptive democratic institutions. The goal is to develop, test, and deploy new forms of trust and governance technologies predicated upon large-scale field experimentation and upon established and evolving science from neuroscience, synthetic biology, evolutionary biology, the computational social sciences, and the complexity sciences.

It would also be both human-centric and nature-centric, which is to say, it would use an interdisciplinary understanding of the natural and biological sciences to invent with Nature rather than against it or over it. Rather than a top-down, "omniscient," social-industrial-engineering approach, the goal is "emerg-engineering," whereby "autonomous processes" are allowed to emerge and evolve to augment human-hosted processes. For example, instead of designing or enabling markets, the goal is to enable niches to evolve in which social and biological organisms naturally complement and support one another in complete, sustainable, and autonomous processes. Hence, the design process should be biological-ecological rather than mechanical.

The Open Sector will also need to provide a core definition of what "democracy" is in terms of desired end states rather than a collection of specific means or mechanisms such as a constitution, a bill of rights, voting, a bicameral legislature, checks and balances, regulation, or a judiciary. Such end states could include wellness, level of shelter, mobility, social agency, learning, self-expression, dignity, and the like. These are measurable extrapolations of Thomas Jefferson's "life, liberty and the pursuit of happiness." Unlike other approaches, a democratic approach would "recursively" include "we the people" as part of the process and the solution. Hence, citizens would not simply be passive recipient beneficiaries of some benign leader or algorithm, but a part of the solution. Seen as a dynamic system, laws could be evolving and dynamic rather than fixed rules.

Nonetheless, when designing new forms of democratic organization in a digital form, it is still necessary to posit different kinds of architectures, methods, and mechanisms and not be wedded to single frameworks. Industrial democratic institutions are but an embodiment of 18th century technologies and science and should be seen as an early phase in a constantly evolving base of technologies that not only have different "releases" and "versions" but will eventually be supplanted by a wholly new "family" of technologies. Through this lens it is possible to re-imagine the legacy mechanisms of 18th century industrial democracy, and thereby consider elections, legislatures, judiciaries, treasuries, constitutions, bureaucracies, and executive functions not as sacrosanct, but as social technologies of their times, limited by the available technologies, resources, and understandings of their period.

Therefore, in designing alternative digital and algorithmic democratic institutions, it is worth considering how current digital, cryptographic, and algorithmic technologies might more effectively achieve the functions of traditional democratic mechanisms such as voting, a legislature, a treasury, a central bank, and a judiciary. One of the major impetuses for this white paper is the presumption that a combination of new digital, learning, and crypto technologies and methods may provide an effective, viable, and scalable alternative to the current crisis of failing industrial democracy.

Prospective Architectures and Processes for Decentralized Democracy and Data Commons Exchange

What follows is a series of proposals on how to design robust and evolvable digital democracies whose incentive mechanisms reward pro-social and pro-ecological behaviors in testable forms. Think of them as proposals for proof-of-concept pilots whereby ideas can be quickly tested, accepted, rejected, and improved.

The approach being advocated here is based upon a combination of new technological deployments (cryptography, digital currencies, blockchain, peer-to-peer networks, machine learning, algorithmic contracts, zero-knowledge proofs, homomorphic and multiparty encryption, and biometrics and behavioral metrics) and new scientific research and findings from neuroscience, the complexity sciences, the computational social sciences, evolutionary biology, behavioral economics, computational governance and control theory, and the governance of the commons.

Proposed Principles for An Evolving Architecture of Decentralized Democracy:

1. Peer-to-Peer Network of Equal Standing: Equality in this approach is designed into the organizational fabric of the society as an initial condition where every "person" is a peer of every other, regardless of whether that person is an individual, a group, or a corporation. All peers are treated equally as a function of how they can act in the network. In this sense, all individuals are indeed "created equal." This is assured both by how the P2P network functions and through a Terms of Service Agreement whereby all participants opt in to participate on terms of equal standing for data sharing. There can be no contracts or agreements of adhesion whereby one party can coerce another. The P2P network is governed as a commons where all parties have equal standing and no one party can subvert or capture the value of the commons to their own end. In this regard, data and their "tokens" are treated as a common pool resource, much like water, which is to be shared, protected, and equally distributed among all the peers.
2. Self-Sovereignty of Data and Root Identity: The notion of self-sovereign identity and data was initially proposed through the Windhover Principles in 2014 in order to advance the notion that only the individual has the authority to assert their own identity and their data. This means that they are not dependent upon a State, a bank, or some other authority to establish their identity and to have control over their personal digital data. In this sense, the individual simply applies the Open Identity Protocol of the Open Sector to derive an identity credential that uniquely (biometrically and behavior-metrically) establishes who they are with a unique "hash" or eigenvector (see the identity sketch following this list). As a "data subject," the individual is defined by the sum of primary and derived digital data about them. By giving the individual control of their data and identities independent of any central authority, the individual achieves definitional standing independent of any particular government or authority.
3. Personal Data as Pseudo-anonymous Verified Asset Tokens: By verifying different classes of personal data (such as age, residence, and movement) as asset classes of data, verifying the provenance and validity of such data, and then representing them as signed tokens, it becomes possible for individuals to share and exchange their tokens for reciprocal value without having to compromise their privacy or expose themselves to the risk of hidden exploitation (see the token-signing sketch following this list).
4. Individual Control of Personal Data Learning Algorithms: The effectiveness of machine learning depends upon access to verified data, and the future viability of smart AI chat bots, recommendation systems, and avatar learning programs will depend upon their access to personal data. If the individual is to have freedom and autonomy as a "digital subject," they must also have control over those AI programs that learn from their personal data. Individuals as "data subjects" will need to be able to develop and control their own virtual representations, or agents. This avatar or digital projection of one's digital self is something that only the individual should control. (Currently this is not the case for Google Now, Facebook, Amazon, and other data mining services that control and mine personal data such as GPS, search, social interaction, and purchase data.)
5. Multiple Personas and Proof of Standing Metrics: Individuals have multiple analog and digital identities (personal, professional, recreational, financial), each of which has its own credential to verify its specific attributes. In order for the digital subject to have control over their digital identities, the digital subject needs to control all these multiple "personas" and have them linked to their root identity. Personas can also be official identities (medical, passport), each verified by trusted tokens of an official authority that signs and attests to the veracity of certain attributes or assertions, such as college education and performance, or job and competency attainments. Such trust tokens may be required as proof of standing for some oversight or qualification for participation or reward. For example, an individual has standing in a particular event or action if they are a direct participant (as either the initiator or recipient of an action) or the witness to an action. Standing can be a highly flexible tool that allows for bottom-up democracy in determining how to form and govern groups and allocate resources and rewards. The computation of a proof of standing should be done in a provable and independent fashion; in some respects, it is like an independent reputation metric that is contextual, dynamic, and time and place dependent (see the standing sketch following this list). For example, in order to design effective democratic oversight processes, proof of standing would be used to qualify who could vote on a particular issue or who has sufficient standing to render oversight judgments. Proof of standing combines the best of both direct and representative democracy as it allocates decision rights specifically to those who are affected by an issue or decision and to those who may have the expertise to render informed decisions.
6. Self-Healing and Semi-Autonomous Dispute Resolution: The platform is designed to be a self-learning system that uses feedback to continuously monitor and improve its performance. When there is a failure, that is, when a governance mechanism fails to achieve a desired state or outcome, it will attempt to determine why and then propose and implement changes to improve its behaviors. In the case of contending parties, the governance mechanism might use genetic algorithms and fitness functions to find outcomes suitable to both parties. Failing that, the default could be prior agreed-to settlements, and failing that, some human-hosted arbitration process. The point is to avoid expensive litigation and protracted arbitration processes.
7. Value Token Exchanges to Replace Price in Markets: Economic pricing models reflect fluctuations in supply and demand where the goods or services are in relatively finite supply. In the case of information or virtual goods, there is no "physical" cost or finite limitation to the supply of the goods. Information goods are not rivalrous, zero-sum goods where one party benefits at the expense of the other. Rather, as the complexity economist Brian Arthur (2013) has shown, information goods increase in value the more they are shared by others and are subject to what he calls the Law of Increasing Returns. In this case, the supply and type of increasing-return tokens is controlled by a new kind of "token treasury" for the social construction of value as to what is to be created and rewarded. Through token design and policy, the supply and use of "value tokens" can be limited in their amount and in what they can be used for (see the treasury sketch following this list). Since there is no time value of tokens, and they can be minted as value is created and recognized, there is no need for interest or debt. There may be genuine debate as to whether "real" value was created or suitably rewarded, but such issues can be monitored and resolved by independent processes based upon empirical evidence and data. It is noteworthy that value here is an explicit social construction and lies in the eye of the beholder or the community, which in this case could be a social function decided by the vote of members of appropriate standing. Such members would be positively and negatively affected by the issuance or printing of value tokens.

8. Multi-Signature Escrow Governance of Token Exchanges: Whether a reward, token, or asset is "released" depends upon whether certain conditions have been met. In the design of governance mechanisms, the goal is to have different parties of suitable standing (trust) accurately and fairly make the determination. This could be for work done, a product delivered, a promise kept, or the meeting of a suitable condition, such as a task performed to a standard or the presentation of an authorized credential (see the escrow sketch following this list).
9. Governance Oversight Through Open, Testable Algorithms: All governance algorithms should be open and testable so that any biases, backdoors, or aberrant conditions can be openly tested and exposed (see the bias-probe sketch following this list). The goal is to have an open library of governance algorithms that members of the commons can use with confidence that they perform according to specification.
10. Decentralized Central Banks: Entropy and Learning-Algorithmic Regulation: Among the primary functions of an industrial democracy is the treasury, which entails the ability to collect taxes, print money, and regulate interest and the money supply. Since there is no central bank, and neither taxes nor interest, the tools of the "token treasury" are vastly altered. Since the cost of running governance operations is minimal, there is no need to collect taxes, but rather operational "fees." Nor is there a need to charge interest, as value tokens can be created to encourage and recognize value creation. The issuance of value tokens is a fundamental social-democratic process whereby all members have a say in what kinds of value tokens are to be created, for what purpose, and with what limitations. Along with the issuance of trust tokens, it is the principal process of social regulation and democratic expression and participation. From a complex systems and cybernetic control theory perspective, the regulation of the type and quantity of tokens issued is subject to the law of Requisite Variety (Ashby, 1968) and entropy regulation; that is, there are well-established mathematical control theory principles for calculating the adaptivity conferred by having different amounts and types of tokens (see the entropy sketch following this list). Bayesian learning theory (Munther et al., 2014), along with network analysis, can help shape policies around best practices for propagating value and trust tokens.
11. Complementary Currencies: Convertibility of Fixed Physical Assets and Informational Assets: Unlike information assets, physical assets are finite goods that can be rivalrous goods. Therefore, there need to be policies for the convertibility of digital asset tokens into fiat currencies or exchanges. There are major policy decisions and social ramifications about the extent to which a token or currency is a closed, open, or partially open loop exchange system. This is where the design of "complementary currencies" (Bernard Lietaer, 2001) comes in. How does one determine how a finite currency like Bitcoin, as a kind of "digital gold," or a fiat currency can be used in conjunction with an informational currency such as value tokens? In other words, what might the appropriate conversion rates and tables between the physical and the digital worlds be? For it is through these "conversion tables," or policies, that many of the values and much of the will of a society are expressed and enforced. Such policies might encourage specific environmental and social outcomes by limiting the consumption and use of certain kinds of resources. Currency convertibility tables, therefore, might become a powerful policy lever for the "Office of the Token Treasury" to determine what the mix of fixed and informational currencies should be, and for what sectors of the society (see the conversion-table sketch following this list).
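
The sketches that follow, all in Python and all explicitly hypothetical, illustrate how several of the principles above might be instantiated. First, for principle 2, a minimal sketch of deriving a root identity credential. The "Open Identity Protocol" is not specified above, so the salted-digest derivation, the feature byte strings, and the function names here are assumptions; real biometric templates are noisy and would require a fuzzy extractor rather than a raw hash.

```python
import hashlib
import secrets
from typing import Optional, Tuple

def derive_root_identity(biometric_features: bytes,
                         behavioral_features: bytes,
                         salt: Optional[bytes] = None) -> Tuple[str, bytes]:
    """Derive a root identity credential as a salted digest (sketch only).

    Real biometric templates are noisy, so a production system would use
    a fuzzy extractor to re-derive a stable credential; a raw hash is
    shown here for brevity.
    """
    if salt is None:
        salt = secrets.token_bytes(16)
    digest = hashlib.sha256(salt + biometric_features + behavioral_features)
    return digest.hexdigest(), salt

root_id, salt = derive_root_identity(b"iris-template-bytes", b"gait-profile-bytes")
print(root_id)  # the unique "hash" described above, re-derivable only by its holder
```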
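For principle 3, a sketch of issuing and verifying a pseudo-anonymous attribute token. The HMAC-based signature, the JSON payload layout, and the issuer key are assumptions standing in for whatever signing scheme the commons would actually adopt (public-key signatures with issuer certificates being the more likely choice).

```python
import hashlib
import hmac
import json
import secrets
import time

def issue_attribute_token(holder_id: str, claim: str, issuer_key: bytes) -> dict:
    """Sign a pseudo-anonymous claim (e.g. "age >= 18") without the raw data."""
    payload = {"holder": holder_id, "claim": claim, "issued": int(time.time())}
    body = json.dumps(payload, sort_keys=True).encode()
    payload["sig"] = hmac.new(issuer_key, body, hashlib.sha256).hexdigest()
    return payload

def verify_attribute_token(token: dict, issuer_key: bytes) -> bool:
    """Recompute the signature over everything except the signature itself."""
    body = json.dumps({k: v for k, v in token.items() if k != "sig"},
                      sort_keys=True).encode()
    expected = hmac.new(issuer_key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(token["sig"], expected)

issuer_key = secrets.token_bytes(32)  # hypothetical issuer's signing key
token = issue_attribute_token("root-id-hash-demo", "age >= 18", issuer_key)
assert verify_attribute_token(token, issuer_key)
```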
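For principle 5, one plausible proof-of-standing metric. The role weights, the half-life decay, and the voting threshold are invented parameters for illustration; the text only requires that standing be contextual, dynamic, and time and place dependent.

```python
import time
from dataclasses import dataclass
from typing import List

@dataclass
class Participation:
    role: str         # "initiator", "recipient", or "witness" (roles from the text)
    context: str      # the issue at hand, e.g. "water-policy"
    timestamp: float  # when the action occurred (seconds since the epoch)

ROLE_WEIGHTS = {"initiator": 1.0, "recipient": 1.0, "witness": 0.5}  # assumed
HALF_LIFE_DAYS = 180.0  # assumed decay, making standing time-dependent

def proof_of_standing(records: List[Participation], context: str, now: float) -> float:
    """One plausible contextual, time-decaying standing score."""
    score = 0.0
    for r in records:
        if r.context != context:
            continue  # standing is specific to the issue at hand
        age_days = (now - r.timestamp) / 86400.0
        score += ROLE_WEIGHTS.get(r.role, 0.0) * 0.5 ** (age_days / HALF_LIFE_DAYS)
    return score

def may_vote(records: List[Participation], context: str, threshold: float = 1.0) -> bool:
    return proof_of_standing(records, context, time.time()) >= threshold

recent = [Participation("initiator", "water-policy", time.time() - 3600),
          Participation("witness", "water-policy", time.time() - 7200)]
print(may_vote(recent, "water-policy"))  # True: fresh, direct participation
```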
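For principle 7, a sketch of a token treasury that mints value tokens against community-set policies. The policy fields (supply cap, permitted uses) are assumed; an actual treasury would derive them from the member vote described above.

```python
from dataclasses import dataclass, field
from typing import Dict, Set

@dataclass
class TokenPolicy:
    name: str
    supply_cap: int           # maximum tokens that may ever be minted
    permitted_uses: Set[str]  # what the token may be exchanged for

@dataclass
class TokenTreasury:
    """Mints value tokens against community-set policies; no interest, no debt."""
    policies: Dict[str, TokenPolicy] = field(default_factory=dict)
    minted: Dict[str, int] = field(default_factory=dict)

    def register(self, policy: TokenPolicy) -> None:
        self.policies[policy.name] = policy
        self.minted.setdefault(policy.name, 0)

    def mint(self, name: str, amount: int) -> bool:
        """Mint only when the policy's supply cap permits it."""
        policy = self.policies[name]
        if self.minted[name] + amount > policy.supply_cap:
            return False  # the community-set cap limits supply, per the text
        self.minted[name] += amount
        return True

treasury = TokenTreasury()
treasury.register(TokenPolicy("care-token", supply_cap=10_000,
                              permitted_uses={"eldercare", "childcare"}))
assert treasury.mint("care-token", 250)
assert not treasury.mint("care-token", 10_000)  # would exceed the cap
```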
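For principle 8, a sketch of multi-signature escrow in which release requires a quorum of approvals from parties of sufficient standing. The quorum size, the standing threshold, and the standing lookup are all assumptions.

```python
from typing import Callable, Iterable, Set

class MultiSigEscrow:
    """Releases a token only when enough parties of sufficient standing approve."""

    def __init__(self, signers: Iterable[str], quorum: int,
                 min_standing: float, standing_of: Callable[[str], float]):
        self.signers: Set[str] = set(signers)
        self.quorum = quorum
        self.min_standing = min_standing
        self.standing_of = standing_of  # e.g. the proof-of-standing metric above
        self.approvals: Set[str] = set()

    def approve(self, signer: str) -> None:
        """Count an approval only from a designated signer with enough standing."""
        if signer in self.signers and self.standing_of(signer) >= self.min_standing:
            self.approvals.add(signer)

    def released(self) -> bool:
        return len(self.approvals) >= self.quorum

standing = {"alice": 2.0, "bob": 1.5, "carol": 0.2}
escrow = MultiSigEscrow({"alice", "bob", "carol"}, quorum=2, min_standing=1.0,
                        standing_of=lambda s: standing.get(s, 0.0))
escrow.approve("alice")
escrow.approve("carol")   # ignored: carol lacks standing
print(escrow.released())  # False: only one qualified approval so far
escrow.approve("bob")
print(escrow.released())  # True: quorum of two reached
```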
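For principle 9, a sketch of an open, repeatable bias probe run against a toy governance algorithm. Both the allocator and the probe are invented for illustration; the point is only that any member can re-run such a test and inspect the result.

```python
import random
from typing import Dict, List

def allocate(applicants: List[Dict]) -> List[Dict]:
    """A toy governance algorithm under audit: select the top 3 by score."""
    return sorted(applicants, key=lambda a: a["score"], reverse=True)[:3]

def bias_probe(algorithm, trials: int = 2000) -> float:
    """With scores held equal, selection rates across groups should match."""
    selected = {"A": 0, "B": 0}
    for _ in range(trials):
        pool = [{"group": g, "score": 1.0} for g in ("A", "A", "A", "B", "B", "B")]
        random.shuffle(pool)
        for chosen in algorithm(pool):
            selected[chosen["group"]] += 1
    return selected["A"] / (selected["A"] + selected["B"])

print(bias_probe(allocate))  # should hover near 0.5 if the allocator is unbiased
```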
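For principle 10, a sketch that measures the Shannon entropy of the circulating token mix as a crude proxy for the variety available to the token treasury. Using entropy of the token mix as the regulatory signal is one plausible reading of the requisite-variety argument, not an established procedure; the token names and counts are hypothetical.

```python
import math
from typing import Dict

def token_mix_entropy(circulating: Dict[str, int]) -> float:
    """Shannon entropy (in bits) of the circulating token mix.

    Per Ashby's law of requisite variety, a regulator needs at least as
    much variety as the disturbances it must absorb; the entropy of the
    token mix is one crude, testable proxy for that variety.
    """
    total = sum(circulating.values())
    entropy = 0.0
    for count in circulating.values():
        if count > 0:
            p = count / total
            entropy -= p * math.log2(p)
    return entropy

# A treasury could compare current variety against a policy target:
mix = {"care-token": 4_000, "transit-token": 3_000, "learn-token": 3_000}
print(round(token_mix_entropy(mix), 3))  # ~1.571 bits for this three-token mix
```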
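Finally, for principle 11, a sketch of a conversion table as a policy lever. The token names, rates, sector restrictions, and period limits are all hypothetical.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class ConversionRule:
    rate: float        # units of the target currency per token
    sector: str        # the only sector in which conversion is permitted
    period_limit: int  # maximum tokens convertible per period (assumed control)

# Hypothetical conversion table: the policy lever described above.
CONVERSION_TABLE: Dict[Tuple[str, str], ConversionRule] = {
    ("care-token", "fiat-USD"): ConversionRule(rate=0.50, sector="health",
                                               period_limit=100),
}

def convert(token: str, target: str, amount: int, sector: str) -> float:
    """Convert tokens to a target currency only as current policy permits."""
    rule = CONVERSION_TABLE.get((token, target))
    if rule is None or sector != rule.sector or amount > rule.period_limit:
        raise ValueError("conversion not permitted under current policy")
    return amount * rule.rate

print(convert("care-token", "fiat-USD", 40, "health"))  # 20.0
```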

Elinor Ostrom’s 8 Principles for Governing the Commons

In 2009 Elinor Ostrom became the first and only woman to receive the Nobel Prize in economics. She received the prize for her lifetime of work on how communities succeed or fail at managing common pool (finite) resources such as grazing land, forests, and irrigation waters. From this research she identified 8 principles for the successful management of common pool resources. We have attempted to adapt and translate these principles into the digital realm of self-organizing networks.

1. Define clear group boundaries. This is a profound and foundational issue in the field of crypto-currencies, where the distinction between "permissioned and permissionless," or bounded and boundary-less access, is a hard divide between the libertarian Bitcoin and Ethereum communities and all other approaches to "trustless" computing and authorization. Perhaps the dominant rationale for Bitcoin as a currency for techno-libertarians is that it is without boundaries, is mostly anonymous, and makes no presumptions of identity, membership, or reputation about the parties involved in the use and authentication of transactions. In this sense, it is totally "free." The cost of achieving such anonymity or "freedom" is the proof-of-work consensus protocol for verifying transactions and paying the "miners" who process them. This requires the costly consumption of computational and energy resources to solve a complex cryptographic puzzle. The approach is based upon the mechanics of a zero-sum game between competing miners, who race to solve a cryptographic puzzle to prove the validity of a payment, with the fastest and most efficient computing resources receiving the mining fee in Bitcoin. In reality, however, what has been presented as an open and decentralized process is in fact a highly centralized process dominated by relatively few mining foundries in China. Bitcoin zealots argue that this is only a temporary aberration and that through improvements in the protocol such as "Segregated Witness" and the "Lightning Network" these issues will be resolved and transaction processing will decentralize and scale to the point of being competitive with current payment transaction networks. This view is met with considerable skepticism in the payments and distributed database communities. Given that the Visa network can process on the order of tens of thousands of transactions per second while the Bitcoin network is currently limited to fewer than ten transactions per second, there seems to be a long way to go. These problems aside, Bitcoin and Ethereum (another crypto-currency and platform) seem to make Ostrom's case about governance of the commons, as both have highly dysfunctional, concentrated, and unaccountable governance mechanisms. Moreover, most established banks and financial services companies are working with their own variants of permissioned blockchains and services.

In order to have boundaries, the identities and credentials of the participants of a commons have to be known to one another. It is not a zero-sum or one-time game, but a "repeat game" where the long-term interests of the players are aligned and the players are known and mutually accountable to one another. The proof-of-work protocol of Bitcoin is based upon Hobbesian notions of a "State of Nature" where relentless tooth-and-claw competition yields the fittest outcomes. This is a caricature of even Darwin and is not supported by evolutionary biology, in which competition and cooperation are both essential to the evolution of complex and highly fit biological and social organization.

2. Match rules governing use of common goods to local needs and conditions. This is also an argument for multiple identities and for tokens of value and trust. Rather than one uniform rule or currency, as in the case of Bitcoin alone, it is important to have complementary currencies or tokens designed for specific local needs and conditions. This argument was made previously regarding complementary currency designs for the "Office of the Token Treasury."

3. Ensure that those affected by the rules can participate in modifying the rules. Here the notion of proof of standing is essential to having not only democratically legitimate decision-making processes, but also sound ones. It is important to identify stakeholders not only as affected parties but also as knowledgeable and trusted parties.

4. Make sure the rule-making rights of community members are respected by outside authorities. This is one of the "existential challenges" to the Bitcoin proof-of-work protocol and its libertarian core values of stateless, permissionless, and anonymous operation. Through the construction of a data commons, personas, and related credentials, our design allows for credentials and policies that can be "respected" and accepted by outside authorities. Moreover, the intent is to build implementations of EU and US privacy regulations that are actually self-enforcing. Similarly, this approach can be used to implement international Know Your Customer (KYC) and Anti-Money Laundering (AML) regulations.

5. Develop a system, carried out by community members, for monitoring members' behavior. This requirement is relatively easy to achieve with the eventual "digitalization and virtualization of everything," whereby digital oversight can be achieved by machine learning and governance algorithms. Here the mechanisms of pseudo-anonymity, blockchain, proof of standing, and trust tokenization become key in providing rigorous and independent reputational metrics, transparency, and accountability.

6. Use graduated sanctions for rule violators. The approach taken here to the granting of rewards and sanctions is totally flexible and based upon the specific goals and circumstances. Unlike in the physical world, where it is difficult to measure, oversee, and discipline rule violators, in the digital world, where everything can be seen and measured, it is entirely possible to test which level and kind of sanctions are most appropriate (a sanction-ladder sketch follows this list).

7. Provide accessible, low-cost means for dispute resolution. Unlike the judicial and legal systems of industrial democracies, especially the United States, where dispute resolution is so expensive as to be unaffordable by anyone but the wealthy or privileged, the use of standard algorithmic contracts and agreements reduces costs, and the use of blockchains creates transparency and accountability.

8. Build responsibility for governing the common resource in nested tiers from the lowest level up to the entire interconnected system. The oversight of the access and use of different digital assets on the data commons platform is organized in a heterarchical fashion where all participants are held accountable for their specific access/trust and value tokens. Through a heterarchy of criteria of standing and reputation, different members can have different levels of privileges and oversight. These would be time and condition contingent.
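
As a minimal sketch of principle 6 above, consider a graduated sanction ladder in which each repeat violation escalates the response. The rungs and forfeiture fractions are invented for illustration; an actual community would set and revise them through the governance processes described earlier.

```python
from typing import Tuple

# Assumed sanction ladder: warnings first, then escalating token forfeits.
SANCTION_LADDER = [
    ("warning", 0.00),
    ("minor_penalty", 0.05),   # forfeit 5% of trust-token balance
    ("major_penalty", 0.25),
    ("suspension", 1.00),
]

def apply_sanction(prior_violations: int, trust_balance: float) -> Tuple[str, float]:
    """Graduated response: each repeat violation moves one rung up the ladder."""
    rung = min(prior_violations, len(SANCTION_LADDER) - 1)
    name, forfeit_fraction = SANCTION_LADDER[rung]
    return name, trust_balance * forfeit_fraction

print(apply_sanction(0, 100.0))  # ('warning', 0.0)
print(apply_sanction(2, 100.0))  # ('major_penalty', 25.0)
```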

Conclusions:

This white paper outlines, in the broadest terms, a prospective framework and open platform for designing and evolving new forms of democratic institutions that offer some scientific and technological promise of rectifying some of the systemic failures of industrial democracy. The goal is to move beyond aspirational democratic assertions of "basic human rights," "equality," social justice, and ecological sustainability to the design of socio-technical artifacts that perform and evolve to actually express, embody, and enforce scalable democratic and ecological principles.

If there were ever any doubt that we are transitioning to a new global reordering of social and economic institutions, the recent and unexpected crisis of American democratic institutions should erase it. Given the magnitude and imminence of the ecological, societal, financial, and nuclear challenges before us, focused and thoughtful action is needed now.