Elizabeth M. Renieris Interview (Law and policy engineering consultant focused on the areas of digital identity, blockchain and data protection)
“The reality is that we are being tracked, targeted, stalked, and harassed by commercial actors via digital means, whether or not we have direct contractual relationships with them” (1). That’s a quote from Elizabeth M. Renieris, a lawyer focused on policy engineering around digital and self-sovereign identity, blockchain, privacy, and data protection, who is fighting against the “commodification” of our personal data.
The GDPR offers improved privacy and data protection by giving individuals enhanced rights and clarifying the obligations of organizations to give effect to those rights, but it’s not enough. The advent of technologies such as blockchain-based digital identity brings new challenges to the table.
First of all, a blockchain is immutable. Anything that is put there will be there forever. This raises major privacy concerns. Although blockchain technology and cryptography may be secure now, they may not be in the future, risking the exposure of personal details to malicious and non-malicious actors. And what about GDPR’s right to be forgotten? Who on a blockchain network are the data controllers to be held accountable? What about cross-border transfers of data?
Elizabeth believes that blockchain, as currently conceived, is largely incompatible with the GDPR. Here are three of the seven reasons, grounded in the GDPR’s core principles, why she thinks so:
1. Principle of Lawfulness: What is the lawful basis for putting this data on the ledger? Even if “legitimate interest” is argued, Elizabeth believes this legitimate interest has to be evaluated on a “case-by-case basis weighing the interests of the controller against the rights and interests of the individual,” which is at odds with the automated, code-is-law approach of many networks (2).
2. Principle of Purpose Limitation: Under GDPR, “personal data must be collected for specified, explicit, and legitimate purposes and not further processed in a manner incompatible with those purposes.” But on a blockchain, data is immediately replicated across all the nodes in the network, making that data “further processed” and “broadcast to an indeterminate number of nodes across an unspecified geographic scope and stored indefinitely” (3). Elizabeth believes that the “automatic replication of data across all nodes in a ledger is also an automatic violation of the data minimization principle” (4).
3. Principle of Accountability. Elizabeth points out that many blockchain or ledger-based projects argue that they are too “decentralized” to identify data controller(s) or take responsibility for giving effect to data subject rights, inadvertently shooting themselves in the foot from a compliance perspective. To the extent that a ledger-based project insists that no one is accountable, she argues that it cannot satisfy the accountability principle and therefore cannot comply with the GDPR.
For the remaining points, read the excellent in-depth blog post that Elizabeth wrote.
As a fellow at the Berkman Klein Center for Internet & Society at Harvard University, Elizabeth works on “designing new and improved data governance models that are human-centric and privacy-preserving with a distinct emphasis on enhancing individual and collective rights and well-being” (5).
We had the chance to ask her a few questions.
What does a law and policy engineering consultant focused on the areas of digital identity, blockchain, and data protection do? What are your responsibilities and goals?
My three areas of expertise are data protection, blockchain, and digital identity. They are now converging as digital identity solutions proliferate, with many involving some form of blockchain or distributed ledger technology (DLT), and very few actually accounting for data protection and privacy-related concerns. I am trying to bring these knowledge areas together.
I have been a government attorney, a corporate lawyer in large law firms, and in-house counsel at a number of startups. As a result, I have a very well-rounded perspective on the kind of advice and counsel that’s needed to allow these communities to work together more effectively. I sometimes describe my work as acting as a translator between “technical” and “non-technical” communities (although I dislike those categorizations because legal and other skills are also technical).
Most of my clients, at the moment, are larger organizations who are looking to integrate emerging technologies into their products and services. They look to me to perform a kind of due diligence on these technologies and to shape governance frameworks and to understand their policy implications. That said, I also work with smaller organizations and startups to design and build products accounting for law, regulation, and policy considerations. My work is also very international in scope because, while the law is still bound by geography, these technologies are not, so a cross-border perspective is crucial.
My goal is to provide an informed and critical perspective to organizations looking to integrate and implement technology solutions with potentially complex legal, regulatory, and policy implications. By understanding the technology, knowing the right (and often difficult) questions to ask, and providing a real-time, up-to-date understanding of the laws and regulations as they relate to that technology, I can integrate law and policy considerations into the design of technology products and solutions, as well as their ultimate implementations and deployments.
What does it mean and what is the importance of “designing new and improved data governance models that are human-centric and privacy-preserving”?
Data governance has traditionally been designed from the viewpoint of larger stakeholders, notably governments and large corporations. In some ways, this made sense throughout Web 1.0 and Web 2.0, where the setup was more “push” than “pull,” or more unidirectional. Given the more interactive nature of later-stage Web 2.0 and the increasingly participatory nature of our transition to Web 3.0, that approach no longer makes sense.
Human-centric data governance is about putting the human point of view at the center. This is not the same as hyper-individualism. Our humanity is shared and collective. In this way, human-centricity gives us more bargaining power against corporations and governments.
Ultimately, human-centricity is about retaining a sense of our humanity in our increasingly digital lives. This is closely linked to what’s needed for a more privacy-preserving approach. To me, a privacy-preserving approach means that we should have norms and expectations that govern the digitized aspects of our lives, just as we have always had in the real world. It is grounded in Helen Nissenbaum’s work around contextual integrity, in a sense of our shared humanity, and in the rejection of techno-utopian reductionism.
In the field of Digital Identity, what is the question that people should be asking more but aren’t?
One question that people haven’t been asking enough, but are starting to ask more, is: “why ID?”. For example, Access Now has recently introduced an initiative around this question. For me, “why ID?” is about taking a step back to ask why we need to implement a digital ID solution or to identify people in the first place, in a given context.
There are many contexts where we don’t actually need to identify specific and unique individuals, e.g. that I am Elizabeth Renieris and you are John Smith. Rather, we just need to know that someone has only shown up once. For example, in the context of certain public benefits, such as food aid, we might need to prevent double-dipping into public resources, but we don’t need to know that I am me and you are you.

Another important question that we should be asking is: whose imagination are we living in? Whose vision of the world are we accepting? Are we accepting the constraints imposed by others before us? Do those who built Web 1.0 and Web 2.0 (with all the flaws of both) have a monopoly on the future of the Web? As the discussion grows more inclusive, what if that’s not our vision? We have to resist the inevitability and path dependency that can set in.
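Elizabeth’s food-aid example, where a service must prevent double-dipping without learning who anyone is, can be sketched with a simple one-per-program token scheme: each person derives an opaque token from a secret only they hold plus the program’s identifier, and the service remembers only which tokens it has seen. The names and construction below are illustrative assumptions for this sketch, not any specific deployed protocol (real systems would use anonymous credentials or zero-knowledge proofs rather than a bare hash).

```python
import hashlib


def derive_token(user_secret: bytes, program_id: str) -> str:
    """Deterministic per-program pseudonym (illustrative sketch):
    the same person always gets the same token within one program,
    but tokens derived for different programs are not linkable by name."""
    return hashlib.sha256(user_secret + program_id.encode()).hexdigest()


class AidProgram:
    """Tracks one-time participation without storing identities.

    The program stores only opaque tokens, never names or ID numbers.
    """

    def __init__(self, program_id: str):
        self.program_id = program_id
        self.seen_tokens: set[str] = set()

    def claim(self, token: str) -> bool:
        """Accept a claim only if this token has not been used before."""
        if token in self.seen_tokens:
            return False  # double-dip attempt rejected
        self.seen_tokens.add(token)
        return True


# Usage: one claim is accepted, a repeat is rejected,
# and the program never learns who the claimant is.
program = AidProgram("food-aid-2024")
alice_token = derive_token(b"alice-private-secret", "food-aid-2024")
assert program.claim(alice_token) is True   # first claim accepted
assert program.claim(alice_token) is False  # second claim rejected
```

The point of the sketch is only that “has this person shown up once?” is answerable without ever answering “who is this person?”.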
Specific roadblocks other people in this space should look out for?
Structural racism, misogyny, and other forms of discrimination and exclusion are rampant in the field of identity, often stemming from the history and culture of those who built the Web. My advice is to cut off the air supply. If you are an identity professional working in a toxic environment, don’t waste your time trying to change people who may not be ready or willing. Find a team or a tribe that can support you, or even go it alone if you can. There is too much important work to be done to waste it on those who cannot see that the future is inclusive.
Another obstacle is having an overly microscopic view of things. We can easily get bogged down by the tech, by standards, by specific implementations or use cases, but we often lose sight of the big picture and what we were trying to do in the first place. Often this happens because we spend all of our time around people like us, working in the same industry often on the same problems. The solution is to get out of these bubbles. Go to a non-identity specific conference (the learnings will be relevant to identity), spend time with people from other backgrounds and professions, read things that seemingly have nothing to do with identity or tech (there will still be many lessons). Diversity and interdisciplinarity are central to a sustainable future for the identity community.
What are your hopes for the future of Digital Identity?
That we wake up and resist our own commodification. And that we don’t lend our technology and our efforts to those ends.
That we realize that we are stronger together, as communities, and as humanity at large than we are as atomized individuals.
That, while it may be part of the solution, technology alone is never the solution.
What are the books you have recommended most to others?
Dr. Ruha Benjamin’s Race After Technology (on how the design of technology can be discriminatory).
The Costs of Connection by Nick Couldry and Ulises A. Mejias (on data colonialism and how tech is taking from the flow of our lives).
Margaret Atwood’s The Handmaid’s Tale (about the forces that enable a future totalitarian state to emerge).
We, at Tykn, would like to thank Elizabeth M. Renieris for her time and for sharing her ideas and knowledge with us. Thank you, Elizabeth! Be sure to follow her on Twitter.
Tykn is a digital identity company. We are about to launch Ana, a digital identity management platform that allows organisations to issue tamper-proof digital credentials which are verifiable anywhere, at any time. If you’re keen on reading more, we suggest you check out our Blog. There are interviews with Daniel Hardman, Kim Hamilton Duffy and many more. There’s also our Definitive Guide to Identity Management with Blockchain.