
The ethics of cryptography

How do we design systems in crypto that are compliant with the law while upholding the values we care about?

Note: each image in this post was generated with DALL-E and is accompanied by a question I think is worth asking ourselves as technologists who care about this topic.

Second note: I wrote this piece several months ago and wasn't sure whether to publish it (partly because of some people's reactions, partly because of polish, partly because my beliefs around this topic are complicated), but in the end I figured I might as well, because it might elicit some interesting discussions!

futuristic city

What is the relationship between cryptography and the state in our future?

Introduction

The rise in popularity of crypto in the past decade has brought a flurry of attention, resources, and capital into the space, rejuvenating applied cryptography research and onboarding millions of users across the world. This has brought a lot of innovation, which has led to the creation of many products in crypto that users now rely on: decentralized financial products such as decentralized exchanges like Uniswap, decentralized stablecoins like MakerDAO, games like Dark Forest, mixers like Tornado Cash, and many others. Although these products offer a wide variety of use cases, most if not all have attracted users with the strong guarantees they stand for: privacy, verifiability, ownership, interoperability, censorship-resistance, and their immutable and permissionless nature. All of these products rely heavily on cryptographic primitives to provide these guarantees; signature schemes, public-key cryptography, and zero-knowledge proofs have all become essential building blocks that products in crypto take advantage of in their business logic.

For example, a virtual cryptocurrency mixer like Tornado Cash is a tool used by users of the Ethereum blockchain for privacy and anonymity. For some background, every transaction on the Ethereum blockchain is public; your balance, historical transactions, transfers, etc. are all public by default. So, if someone knows your public address, they can see every transaction you've made, how much money is in your wallet, and so on. Tornado Cash allows users to break the link between the money they deposit and their address: it lets users move some of their money around privately by making it difficult to trace back to the original funder's address. For example, if I wanted to donate some money to a charity without anyone knowing that I donated, I could deposit some money into Tornado Cash, withdraw it to a fresh address that nobody knows is mine and that cannot be traced back to my original address, and then make the donation from this new address. In this specific case, using cryptography, the application was able to provide strong guarantees around privacy and anonymity without any central authority. Consequently, these same guarantees ended up attracting a lot of illegal usage (like money laundering), which led to the U.S. sanctioning it just a couple of months ago.
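
To make these mechanics a little more concrete, here is a heavily simplified Python sketch of the commitment-and-nullifier bookkeeping behind a mixer like Tornado Cash. It is purely illustrative: the names (`ToyMixer`, etc.) are made up, and the real protocol hides which deposit a withdrawal corresponds to with a zero-knowledge proof over a Merkle tree of commitments, whereas this toy version only captures the deposit/withdraw accounting.

```python
# Toy sketch of the deposit/withdraw flow behind a mixer like Tornado Cash.
# Heavily simplified and hypothetical: the real protocol hides WHICH deposit a
# withdrawal corresponds to using a zero-knowledge proof over a Merkle tree of
# commitments, so the note is never revealed on-chain the way it is here.

import hashlib
import secrets

DENOMINATION = 1  # fixed deposit size (e.g. 1 ETH) so amounts don't leak identity


def hash_note(secret: bytes, nullifier: bytes) -> str:
    return hashlib.sha256(secret + nullifier).hexdigest()


class ToyMixer:
    def __init__(self) -> None:
        self.commitments = set()       # published at deposit time
        self.spent_nullifiers = set()  # prevents double-withdrawals
        self.balances = {}             # toy stand-in for on-chain balances

    def deposit(self, depositor: str) -> tuple[bytes, bytes]:
        """Depositor locks DENOMINATION; the (secret, nullifier) note stays private.
        The deposit transaction itself is public and visibly comes from `depositor`."""
        secret, nullifier = secrets.token_bytes(32), secrets.token_bytes(32)
        self.commitments.add(hash_note(secret, nullifier))
        return secret, nullifier

    def withdraw(self, secret: bytes, nullifier: bytes, recipient: str) -> None:
        """Anyone holding a valid, unspent note can withdraw to a fresh address."""
        if hash_note(secret, nullifier) not in self.commitments:
            raise ValueError("unknown note")
        if nullifier in self.spent_nullifiers:
            raise ValueError("note already spent")
        self.spent_nullifiers.add(nullifier)
        self.balances[recipient] = self.balances.get(recipient, 0) + DENOMINATION


mixer = ToyMixer()
note = mixer.deposit(depositor="0xMyPublicAddress")
mixer.withdraw(*note, recipient="0xFreshAddressNobodyKnows")
```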

This brief example highlights an important property of cryptography: "cryptography rearranges power: it configures who can do what, from what," which makes it "an inherently political tool" (Rogaway, 2015). In other words, because cryptography has the property we can term hardness, that it is secured by math and no one can easily break or reverse it, it threatens the power dynamics of those who seek to control information. That is, it is an effective way of resisting mass surveillance, and it threatens governments' and corporations' ability to interfere with, tamper with, spy on, or extract information.

In the Tornado Cash example, users' privacy was secured by math and cryptography, which meant that even if everyone could see that a particular address had laundered hacked funds through the app, there was no one to ask to reveal the data, no operator to coerce into giving up information, and no company or person to subpoena in order to compromise a user's identity or privacy. Thus, the only mechanism the government had was to regulate its legality, which will always end up hurting retail users who may have been using the application in a perfectly legal way to access the strong guarantees it provides. Hence, it's important to think about how we can design applications in crypto, built with cryptography, that can be compliant with the law while maintaining these strong properties, so that they can be used safely, remain accessible (and legal) to the public, and uphold these ideals without compromising compliance with the law.

painting of cypherpunks fighting for their ideals with cryptography

What is the dystopian/utopian version of a world where the state is secured by cryptography alone, with no external intervention?

Background on the Cypherpunk Movement

The roots of cryptography being used as a tool to guarantee strong ideals like privacy trace back to the cypherpunk movement, which is widely acknowledged as the intellectual and technological forerunner of the cryptocurrency movement (Doctorow, 2022). The cypherpunk movement, which formally started with its manifesto in 1993 (Hughes, 1993), was a group of activists who advocated for the use of cryptography as a tool for protecting privacy and promoting political freedom. The movement originated in the 1980s and 1990s, at a time when the internet was becoming more widely used and the potential for online surveillance was growing. The term "cypherpunk" refers to the use of cryptography to protect privacy and resist government surveillance and censorship. The movement emerged as a response to concerns about the increasing power of governments and other large organizations to collect and analyze personal data, and to use this data to control and manipulate individuals.

The cypherpunks believed that cryptography could be used to protect individuals from this kind of surveillance and control, and to enable people to communicate and exchange information freely and securely. They believed that, while not every nation state would necessarily compromise citizens' privacy or conduct mass surveillance, cryptography was the only way to remove the opportunity to do so. Although not all cryptographers and researchers were cypherpunks, many of the prominent members who drove the field forward early on were motivated by how cryptography could unlock people's freedom and guarantee their privacy in a world where they believed governments increasingly failed to do so (which Zimmermann discusses in Why I Wrote PGP).

Doctorow (2022) explains that, in the early 1990s, "the cypherpunk ideology was a counter to the ideology of the US spy apparatus, primarily the NSA, who claimed that cryptography would enable a quartet of socially corrosive evils: child pornographers, mafiosi, terrorists and copyright scofflaws (these were invoked so often that they came to be known as 'the four horsemen of the infocalypse')." American intelligence was very vocal about the dangers of cryptography and would consistently warn the public about its potential to facilitate criminal activity. This fear of cryptography and its potential for misuse eventually led to the creation of the "Clipper Chip," an encryption technology developed by the NSA in 1993. The Clipper Chip was designed to allow the government to monitor encrypted communications, ostensibly to prevent the use of cryptography for criminal activity; it had a built-in backdoor that would "allow Federal, State, and local law enforcement officials the ability to decode intercepted voice and data transmissions" (McLoughlin, 1995). However, the cypherpunk movement rejected the Clipper Chip and saw it as a tool for government surveillance, leading to its eventual failure.

Another landmark case, which established code as speech and paved the way for cryptography to remain open and accessible to everyone (something the cypherpunks fought hard for), was Bernstein v. Department of Justice in 1995. At the time, the US government designated encryption software as a "munition" that needed to be regulated for national security purposes, which required Bernstein, a PhD student who wanted to publish the source code for an encryption algorithm, to "submit his ideas, register as an arms dealer, and apply for an export license merely to publish his work […]. The State Department also warned him they would deny him a license if he applied because his work was too secure" (Dame-Boyle, 2015). The EFF assembled a legal team to sue the US government, and the court ruled that export restrictions on encryption software violated Bernstein's First Amendment rights, which "made it easier to publish encryption software without approval from the US government."

Many similar exchanges between the state and the public in the 1990s strengthened the discourse promoted by the cypherpunks: that nation states could not be trusted, that mass surveillance was inevitable, and that without the use of cryptography, governments and intelligence agencies would compromise the public's privacy. The government's aggressive attempts at regulating cryptography, and at publicly marketing it as a source of criminal activity, pornography, and chaos that needed to be controlled, only further encouraged the cypherpunks to fight for their ideals. The cypherpunks knew that cryptography was inherently a political tool because of this redistribution of power, and they embraced it because of the hardness it provided, fighting to keep it open, free, and secure for all.

Some think the narrative the cypherpunks perpetuated about the state, whether justified or not, may have missed other ways internet freedom could safely come about. Hellegren (2017) argues that "cypherpunks and journalists crystalized a particular understanding of Internet freedom in which only encryption software can protect online rights. This understanding removes any responsibility from the state to ensure the protection of online rights." This raises an important point: systems that rely solely on cryptography, while they resist mass surveillance and attempts at violating privacy, can also withhold the opportunity for the government to protect certain rights, for example the right to a fair trial for misconduct against you, or the right to freedom of speech.

Some cypherpunks argued that "only the laws of physics (like encryption) could protect such rights"; however, Hellegren raises an important point, which is that this discourse of leaving the government out of the loop "removes accountability […] which can justify further state transgressions." The broader point here is that the narrative the cypherpunks espoused was not necessarily the correct interpretation of Internet freedom (not that there is a single correct one), but that there are many possible interpretations, and some include a world where cryptography and the government do not have to work independently or in opposition to each other.

In fact, the right relationship between using cryptography in applications that guarantee the properties we care about and complying with the law may vary by application. For example, it may make sense for higher-stakes applications, where criminal activity would be very detrimental (say, hundreds of millions or billions of dollars on the line), to have more mechanisms for government intervention.

picture of numbers depicting cryptography floating around

What does it mean to trust the math that secures cryptography?

Cryptography and its use cases in crypto

Narayanan (2012) distinguishes between applications that use crypto-for-security and those that use crypto-for-privacy: although they use similar tools, techniques, and primitives at the technical level, they function differently at the social level. One is for use cases like protecting financial transactions, such as "protecting your credit card number when you buy stuff online," while the other is used to provide strong privacy guarantees.

I will refrain from doing a deep dive on all of the different domains in which cryptography is used in crypto, since there are more complete examples in the wild, but the important point to stress is that there are different use cases: DeFi, governance, and privacy-preserving apps, to name a few. Although many of the cryptographic tools used may be similar, the purpose and function of each app varies widely, and how an app should be designed to work in harmony with governments and remain compliant with the law depends on the use case. Each type of application in these domains carries different ethical considerations and responsibilities for the way cryptography is used, which need to be carefully considered and may necessitate different levels of intervention.

picture of two animals representing cryptography and the state wrestling with each other

What does a world that embraces the affordances of both cryptography and the state look like?

Ethics of cryptography and responsibility of the state in governing applications using cryptography

We explore the ethics of cryptography here as the moral considerations that arise from the use of cryptography in different applications in relation to the state and the law. As cryptography is increasingly used to protect sensitive information and enable secure interactions, there is growing concern about the potential ethical implications of this technology. For example, the use of cryptography in certain applications, such as privacy-focused communication tools, can challenge the ability of the state to govern and regulate online activities. From the government’s perspective, what course of action makes sense to protect the rights of its citizens? Does the state’s responsibility to do so necessitate adopting weaker standards that allow the state to intervene when necessary?

The answer here will vary widely depending on who you ask. There is a strong narrative in crypto that cryptography is a means of enforcing contracts and rules without relying on a trusted external or central authority. The important insight is that this works well in some cases and fails in many others. For example, a DeFi protocol, using cryptography, can enforce an invariant in a smart contract that ensures any loan taken out is well collateralized, preventing situations like FTX's collapse. However, cryptography alone cannot guarantee, for example, the right to a fair trial or the right to physical protection from violence.
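
As a small illustration of that first point, here is a minimal sketch (in Python rather than an actual smart-contract language) of the kind of over-collateralization invariant a lending protocol can enforce purely in code. The 150% ratio and the names are hypothetical, not taken from any particular protocol.

```python
# Illustrative sketch of an over-collateralization invariant, the kind of rule a
# DeFi lending contract can enforce automatically with no trusted operator.
# The 150% ratio and the names are hypothetical, not any specific protocol's.

MIN_COLLATERAL_RATIO = 1.5  # every loan must be backed by at least 150% collateral


class ToyLendingPool:
    def __init__(self) -> None:
        self.collateral = {}  # address -> collateral value (in a common unit)
        self.debt = {}        # address -> outstanding borrowed value

    def deposit_collateral(self, user: str, amount: float) -> None:
        self.collateral[user] = self.collateral.get(user, 0.0) + amount

    def borrow(self, user: str, amount: float) -> None:
        new_debt = self.debt.get(user, 0.0) + amount
        # The invariant is checked before any state changes: an under-collateralized
        # loan simply cannot be created.
        if self.collateral.get(user, 0.0) < new_debt * MIN_COLLATERAL_RATIO:
            raise ValueError("loan would be under-collateralized")
        self.debt[user] = new_debt


pool = ToyLendingPool()
pool.deposit_collateral("0xalice", 150.0)
pool.borrow("0xalice", 100.0)   # allowed: 150 >= 100 * 1.5
# pool.borrow("0xalice", 1.0)   # would raise: the invariant forbids it
```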

These may seem like silly, irrelevant examples, but they get at an important point: governments exist for a reason. Cryptography is not a substitute for governments or states or laws. It can create (permissionless) systems that help run society, it can provide stronger guarantees, and it is an essential component of maintaining Internet freedom, but it is not a replacement for governments. Therefore, insofar as governments are necessary to organize and run society, the question becomes: how do we use cryptography in a way that lets us benefit from its affordances without compromising the government's ability to protect its citizens' rights?

Note that we’re assuming here that a government is well-intentioned, not corrupt, and indeed cares about protecting the rights of its citizens etc. When we can confidently rule a government out as incapable of these baselines, cryptography remains an essential tool for building applications with strong guarantees to maintain anonymity, protect privacy, provide cesnorship-resistant money like Bitcoin and Ethereum, an open financial system etc. when the government fails to do so for its citizens. However, we consider the more common case of an honest government that is concerned about governing its society and protecting the rights of its citiznes. What are some of the practical considerations to consider with how cryptography could help or infringe on the government’s ability to do so?

Practical considerations of cryptography and compliance with the law

One of the key ethical considerations surrounding the use of cryptography is the possibility of creating unbreakable encryption. In some cases, the use of strong cryptography can make it impossible for even the state to access certain types of information, even when there is a legitimate need to do so, such as in a criminal investigation. This raises questions about the role of the state in regulating and overseeing the use of cryptography, and whether it is appropriate for the state to require companies or individuals to adopt weaker standards. Additionally, the use of unbreakable encryption can have implications for issues such as national security and public safety, as it may prevent law enforcement agencies from accessing information that could be critical to preventing or solving crimes. In fact, Bay (2017) analyzed the ethics of unbreakable cryptography from the perspective of John Rawls and concluded that completely unbreakable cryptography would be unethical because it could be "considered a violation of social cooperation and thus indefensible for Rawls." This touches on the earlier point about narratives around Internet freedom: unbreakable encryption means governments can never violate any citizen's rights, but it also means they are often unable to protect citizens' rights.

Some have explored the notion of backdoor cryptography as a way around this: providing the strong guarantees we care about from cryptography, but designing the system so that there is a backdoor that can "open up" encryptions, proofs, etc. when necessary (and where relevant), for example in a criminal investigation. In theory, this sounds like a viable path forward that achieves some level of compromise between the affordances of cryptography and the government's ability to intervene when necessary. The challenge becomes how we guarantee that the government or intelligence agencies do not abuse this power.
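
To make the idea concrete, below is a minimal, purely illustrative Python sketch of what a key-escrow style backdoor looks like, roughly the concept the Clipper Chip attempted in hardware: each message key is wrapped once for the recipient and once for an escrow authority, so the authority can read the message without the recipient's cooperation. The names and the use of the `cryptography` package's Fernet recipe are my own simplifications for the sketch, not a real or recommended design.

```python
# Toy illustration of key-escrow ("backdoor") encryption using the Python
# `cryptography` package. Each message key is wrapped twice: once for the
# recipient and once for an escrow authority. A sketch of the concept only.

from cryptography.fernet import Fernet

# Long-term keys; in reality these would be asymmetric keys held by each party.
recipient_key = Fernet(Fernet.generate_key())
escrow_key = Fernet(Fernet.generate_key())  # held by the "authority"


def escrowed_encrypt(plaintext: bytes) -> dict:
    data_key = Fernet.generate_key()  # fresh key for this message
    return {
        "ciphertext": Fernet(data_key).encrypt(plaintext),
        "key_for_recipient": recipient_key.encrypt(data_key),
        "key_for_escrow": escrow_key.encrypt(data_key),  # the backdoor
    }


def recipient_decrypt(msg: dict) -> bytes:
    data_key = recipient_key.decrypt(msg["key_for_recipient"])
    return Fernet(data_key).decrypt(msg["ciphertext"])


def escrow_decrypt(msg: dict) -> bytes:
    # Works without the recipient's cooperation: the crux of the ethical debate.
    data_key = escrow_key.decrypt(msg["key_for_escrow"])
    return Fernet(data_key).decrypt(msg["ciphertext"])


msg = escrowed_encrypt(b"donation details")
assert recipient_decrypt(msg) == escrow_decrypt(msg) == b"donation details"
```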

Martin (2017) gives one example of exactly this kind of abuse: in 2012, documents came to light showing the NSA may have intentionally weakened the design of a popular cryptography standard. Knowing that its choice of elliptic curve contained a backdoor that could potentially be exploited, the agency bribed the implementers (the security company RSA) to use this weaker standard. Unfortunately, there are other historical examples of situations like this, where governments or intelligence agencies have sought out corporate partnerships, intervened in cryptography standards, and built tools that violate many of the rights cryptography seeks to enforce, not with the intention of being "necessarily evil" but to further what they thought was the correct way to govern their society.

In the absence of backdoors at the cryptographic level, the other lever governments have over applications in crypto that use cryptography is regulation. For applications whose cryptography is enforced at the lowest level and built with no backdoor, the government can choose to regulate them if they break, or allow users to break, the law. Tornado Cash, which has already been discussed briefly, was an example of this. Even if Tornado Cash was only built to enable retail privacy, it makes sense why the government chose to sanction it given the billions of dollars of laundering it enabled; the government's aim was not necessarily to remove strong privacy guarantees for users so much as to prevent further criminal activity.

However, this is precisely why it is so important to think about how to design applications in crypto that are compliant with the law, because a) the end goal is rarely to incentivize criminal activity; this usually comes about as a second-order effect of the strong guarantees cryptography provides, and b) the people most hurt when the government makes these applications illegal are ordinary retail users who only wanted to use them to solve a problem, for example someone who wanted to use Tornado Cash to donate to a cause without others seeing that they had donated, or to hide their financial history from strangers on the internet or their employer. Both are perfectly valid and legal privacy-preserving reasons to want to use such a service. As a simple exercise, one can think of slightly adapted versions of Tornado Cash that could provide users with similar guarantees while preventing the type of criminal activity it attracted. One simple example would be a version of Tornado Cash that capped the maximum size of a deposit at a small amount, so that hacks involving large sums of money could not conveniently use the service. Another would be a more technical scheme that could, when necessary, open a backdoor revealing which new address a depositor withdrew to, making it easier to track funds and prevent money laundering.
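
As a very rough sketch of what those two adaptations could look like, assuming the toy mixer sketched earlier, here is a hypothetical deposit cap plus an escrow-encrypted audit record published at withdrawal time. The cap value, the names, and the use of a symmetric Fernet key are illustrative simplifications; a real design would encrypt the record to the authority's public key.

```python
# Hypothetical sketch of the two adaptations above: a per-deposit cap, plus an
# escrow-encrypted record published at withdrawal time that links the spent note
# to its recipient, readable only by whoever holds the escrow key (e.g. under a
# court order). Fernet is symmetric, so this is a simplification; a real design
# would encrypt the record to the authority's public key. All values are made up.

from cryptography.fernet import Fernet

MAX_DEPOSIT_ETH = 1.0
escrow_key = Fernet(Fernet.generate_key())  # held by the authority, not the mixer


def check_deposit(amount_eth: float) -> None:
    """Reject deposits above the cap so large hacked sums can't use the mixer."""
    if amount_eth > MAX_DEPOSIT_ETH:
        raise ValueError(f"deposits are capped at {MAX_DEPOSIT_ETH} ETH per note")


def withdrawal_audit_record(nullifier: bytes, recipient: str) -> bytes:
    """Published alongside each withdrawal; unreadable without the escrow key."""
    return escrow_key.encrypt(nullifier + b" -> " + recipient.encode())


# Only the holder of the escrow key can later recover the deposit/withdrawal link.
record = withdrawal_audit_record(b"\x01" * 32, "0xFreshAddress")
assert escrow_key.decrypt(record) == b"\x01" * 32 + b" -> 0xFreshAddress"
```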

Ultimately, this will require a level of trust and cooperation from both sides: governments and intelligence agencies must uphold their end of the agreement (to only open these types of backdoors when necessary and not abuse them), and application developers and cryptographers must be willing to include governments in the design of their applications.

Any new scheme will come with compromises; there will always be a tradeoff between privacy and security. From a practical perspective, it is better to have strong privacy guarantees with backdoors than to have no privacy at all. Even though Tornado Cash is self-sovereign software that no one controls, the US still took a measure it had never taken before, sanctioning something that was neither a person nor an entity for the first time in history. This has profound consequences for the notion that code is speech, which Hodges (2022) elaborates on: "but what of the product of that code? Indeed, whether or not the outputs of code—websites, video games, algorithms, and so forth—are considered protected speech is far murkier." The fact that the government went through with this is testimony to the lengths it is willing to go to in order to fight illegal activity enabled by these applications, which shows just how important it is for cryptographers and developers to work with the government to design their apps so that users can benefit from the affordances of cryptography while the apps remain compliant with the law.

Trust

This idea of trust and cooperation between cryptography and institutions is somewhat counterintuitive, because the cypherpunks architected a very distinct notion of trust in the narratives they created around cryptography. Bruun et al. (2020) distinguish between the trust institutions demand from their citizens and trust in cryptography: "the first is found in political speeches and public debates about data privacy and security […] governments emphasize the need for people to trust in data systems […] The second is found in research and development environments of emerging cryptographic technologies […]. In contrast to political speeches and public debates, these communities promise to provide 'trustless trust' and abandon the need for trusted intermediaries, authorities, and institutions." In cryptography (with no dependencies on intermediaries), the burden of trust in authorities is replaced with "new concepts of trust […] like 'trust in numbers', and 'trust in math'"; however, Rogaway (2015) reminds us that cryptography is inherently political, and it is a big mistake to assume cryptography is just applied math with no social or political consequences. Some may argue that this notion of "trustless trust" is superior to trust in institutions, and while this can be true in situations where transparency and legality are enforced in code, it comes with its own costs that should be carefully considered. For example, one could argue that it simply redistributes the burden of trust: to a person who doesn't understand cryptography, trusting that the system works amounts to trusting the smart people who say it does, which is not that different from trusting a system of authorities, like trusting that the government will enforce and abide by the law.

The crucial point I argue here is that there will always be some notion of trust regardless of the systems we choose to use; this notion of trust just manifests itself in different ways. Just because an application uses cryptography does not mean it completely removes the assumption of trust, and thinking this is the case would be shortsighted and naive. Any time we use a cryptographic primitive, a hash function, a signature, an encryption algorithm, we are trusting that our present-day model of the field is correct, we are trusting that future technologies and discoveries will not upend these models, and we are trusting that if they do, we will still be able to safely transition to new models with minimal damage. As a simple example, many signature schemes we use today (like ECDSA) are not post-quantum secure, which means that when we use them, we are assuming we will be able to quickly and safely transition to schemes that are, by the time quantum computers can adversarially break them. Of course, this model of trust is different from, say, trusting the operators of some central authority, but the point is that there are always trust assumptions; they just manifest themselves in different ways.

Note that this is not necessarily a "bad thing"; trust is actually a crucial aspect of organizing, communicating, and governing society. The "bad thing" is not anticipating, or not being aware of, the consequences that your (choice of) trust assumption carries.

Moreover, each type of trust brings with it different consequences, which may not be immediately clear. I can see why the cypherpunks argued for the use of cryptography to protect individuals from the kind of surveillance and control they worried about, when, at the time, governments and intelligence agencies consistently let them down by violating people's privacy and pushing an agenda in the 90s that cryptography was bad. However, I can also see a worrying path forward where it doesn't matter how "trustless" or "secure" our cryptography-enabled applications are if the government chooses to ban, regulate, or crack down on them when they elicit criminal activity (which is often not the intention).

Closing Thoughts

There is a tendency for these two forces of privacy and security to oppose each other; however, this doesn't have to be the case. We can still design applications in crypto that benefit from the affordances of cryptography while remaining compliant with the law and giving the government space to protect the rights of its citizens. Naturally, this requires trust and cooperation from both the cryptographers and developers and the government. If we do this, how do we trust that the government and intelligence agencies will not abuse their power? How do we trust the crypto community to factor government considerations into the applications they design and to make sure there is room to stay compliant with the law? These are hard questions, but they are questions we need to discuss together.

Cryptography alone does not get us where we want to go, and neither will blind trust in institutions, but creating incentives to work together, trusting each other, and taking advantage of the affordances each provides just might.