Cake
    • Please join me in welcoming Dele Atanda for a Cake Panel!

      About Dele: Dele Atanda is a serial entrepreneur and acclaimed digital visionary. He is CEO of metaMe, the world’s first Self-Sovereign AI and Universal Smart Data Marketplace, and founder of The Internet.Foundation, an NGO dedicated to advancing the ethical use of data in commerce while establishing digital rights as an extension of human rights. Atanda is a celebrated innovator having led digital innovations for FTSE10 and Fortune 100 companies that have become the gold standard for digital engagement within their sectors. Prior to metaMe he led IBM’s Automotive, Aerospace and Defense sector as Chief Digital Innovation Officer.

      Atanda has been a pioneering voice on the emergence of web 3.0 technologies, notably with his critically acclaimed book The Digitterian Tsunami: Web 3.0 and the Rise of the N.E.O Citizen published in 2013. He is an avid advocate of the potential of decentralized technologies to advance humanity while positively and dramatically transforming society.

      Welcome Dele!

    • A few things. The first is that I've had a background in digital, data and security for a while. In fact, I was recently reflecting that my first job in the digital space was with an organization in the UK called the Central Office of Information. It has since been disbanded, but it was a government agency responsible for government communication. And what's interesting about the COI is that George Orwell worked there. It's believed to have inspired 1984 in many regards, and to be what the "Ministry of Truth" was modeled on. So that's really synchronistic, in and of itself.

      My first client at the COI was the Royal Air Force, whom I worked with on using digital media to recruit young pilots. What’s interesting about fighter pilots is that they peak young, at age 21 in fact. So air forces recruit them very young, 17, usually being the youngest age they can be recruited legally. So I worked with the RAF to create campaigns to attract young people to join the RAF. It hadn’t dawned on me until recently the significance of using digital media in that way to influence people. 

      I then went on to various digital roles, but I think the next major milestone for me was joining Diageo, the world's biggest beverage company. I worked across a broad portfolio of Diageo's global brands, helping them use the internet to engage with consumers more effectively. I tell this story because when my boss recruited me into Diageo, he said "We need someone to help us crack Web 2.0." Diageo was quite mature as a digital marketer at that time. They'd originally had a pretty decentralized digital marketing framework and were getting into trouble in some markets around the world, so they deployed a highly centralized platform to bring order and consistency to their digital operations around the world. But this became an obstacle for their agencies trying to do rich, innovative work, especially as social media started to take off.

      So they brought me on board to crack that problem. I led the design of their new platform, which was designed as the first truly federated enterprise digital marketing platform. It just dawned on me the relevance of this; what we realized back then was that truly centralized platforms stifle innovation, and become very restrictive, but we also realized that completely decentralized models (and this was pre-Blockchain or any kind of immutable governance capability) became like a Wild West, were very inefficient at enterprise scale, difficult to govern and did not enable reuse of anything developed at the edges. So we developed a platform that centralized the aspects that needed to be centralized, which incidentally was mostly data, and decentralized development at the edge for speed-to-market and innovation purposes in end markets.

      This was really novel, and became recognized as a best-in-class digital engagement platform by Gartner, Forrester and the like. The interesting thing as well is that when Facebook did its IPO, Diageo was their reference use case for enterprise partners - it was actually the only use case in their IPO roadshow video. It certainly achieved what I went there to do! It’s funny to think of my connection to Orwell on one hand, and Facebook on the other. But in many ways, Diageo is where my interest in data as a resource was born.

      Looking at how we used data to influence people was fascinating - on one hand - but I become most interested in how we could use data to create better experiences over the internet through, with better personalization, relevance and customization. However, the more I looked at it, the more I realized we could only do this by sharing large amounts of personal data with 3rd party recommendation engines like Google and Facebook. I started to question the intentions of these companies who’d have our data and realized that I didn’t trust their intentions, and that was a problem.

      And as I looked at it more, I began to see that there were bigger issues around data. Around this time, the Arab Spring had taken off in Tunisia and Egypt, and we were experiencing a world where, on one hand, digital platforms were causing disruption and bringing big social changes, but we were also confronted with the fact that people's lives were now in danger and at risk from their digital footprints. We started to see activists and their journalist counterparts being exposed to serious risk by way of their digital footprints. So we started to see big privacy and human rights issues involved with data. At that stage, we started to ask, "How could we enable more privacy, more security, for people around the use of the internet and their data?"

      We identified three types of people who would be interested in privacy at that point - this was very early: captains of industry like CEOs and senior managers, celebrities, and activists and journalists. So we started prototyping secure messaging solutions to enable activists and journalists to communicate without exposing themselves, using things like the Tor network to anonymize their communications. Our strategy back then was to deploy a secure messaging app in places like Syria where activists were in danger, using that to shape a narrative that our data is valuable and that we needed digital rights to control it.

      Then the Snowden revelations happened, and the world changed overnight. With "Snowden-gate" we realized that what was happening in Syria was also happening in London, New York and all over the world. We realized our digital footprints massively exposed us all, everywhere. And that was interesting in many ways. One of the first things we did as a result was to establish our NGO, the Internet Foundation. We realized at that point that we not only needed to look at digital rights as an extension of human rights, but that we also had to look at them from the perspective of international law. It was pointless looking at digital rights solely within the context of a nation state's legal structure; we had to look at them on a macro scale, like human rights, and protect them globally.

      As such we launched an initiative to call on the UN to establish a Universal Declaration of Digital Rights, as an extension of the Universal Declaration of Human Rights - looking at human rights and how we define them in the digital era while providing a mandate and legitimate framework for using technology to protect them globally, across borders. 

      This remains our North Star, in terms of how we operate and define policy today. So that was the first thing we did post Snowden-gate. The next thing we tasked our NGO with creating was a framework for how companies should use consumer data in a responsible and sustainable way. We call this our Clean Data Charter. Our position on this is that if data is the oil of the twenty-first century (as often touted), then we are creating a Clean Data framework analogous to Clean Energy. In that sense, the Universal Declaration of Digital Rights is about the state and the citizen, and the Clean Data Charter is about companies and consumers.

      Generally speaking, we see three big issues, considerations and epochs in how people think about data. The first is privacy and security – restricting who can access our information. The second is data management: if you have security and privacy for your data, you’re more open to being able to use your data in ways you’re comfortable with. That includes choosing who you share your data with and using your data to access services that bring value to you. The third consideration is then data trading: how you can monetize your data, and use it to bring you economic value as well as other forms of value.

      So essentially, when we set up the NGO The Internet Foundation, we set it up to look at policies from a citizen's rights perspective, from a consumer's rights perspective, and from a sustainable development policy perspective, to enable citizen-centric, ethical and sustainable innovation. But I think the most important thing about this period was that we were able to evolve from solely having a security and privacy focus towards integrating a more comprehensive data management and data trading framework. Privacy became table stakes; once you have privacy, the question of what you can do with it became our focus.

      So we started to imagine the kinds of things you could use your data to do in a world where you have more privacy and agency. Having a separate vehicle to address advocacy, rights and policy set us up to think about innovation and commercialization of data in a much healthier and more focused way. Our Clean Data Charter has now evolved into an Ethical Data Standard that we're developing with the British Standards Institute as an ISO standard we can use to audit companies. And we're now engaged directly with the UN on how to move the Universal Declaration of Digital Rights forward as part of the UN's 75th Anniversary next year.

      The last point, to bring us to how we arrived at metaMe: as we looked at the space and developed policy and standards, there were two problems we had to crack to enable ethical data marketplaces to emerge: (A) how could we enforce policies, and (B) how could we track whether companies were being compliant with them? That remained a tough nut to crack technically until the Blockchain movement came around. Blockchains were very exciting for us, because we finally found a technology that enabled us to audit and enforce the rules and policies for how data should be used in a technical manner that was dependable and reliable. So for us, metaMe evolved from the ability to align blockchain technology with our policy, advocacy and governance work.

      It is often said that Blockchains enable money to be exchanged as easily as information.

      metaMe’s model is to use Blockchain to enable information to be exchanged as securely as money. 

    • The vision we have for metaMe is twofold. The first aspect is what we call “trustless data sharing.”  The principle here is that we should be able to exchange our data with third parties in a manner that doesn’t require us to trust third parties or intermediaries to use our data as consented, but instead we can trust our data and the technology we use to share it to ensure that the rules we put in place for using that data are observed and enforced. So that’s the key differentiation. 

      I think that's really important: we're moving into an era now where privacy has become a currency. If you'd asked me at the beginning of the year whether I thought it was possible that by May we'd have Apple, Google, and Facebook competing with each other on privacy, I would have said no. Apple recently released their Sign in with Apple product. Now, while Apple definitely has better privacy credentials to date than other companies, the problem with Sign in with Apple is that it requires us to trust Apple with how our identity information and personal data is used, since they will still have access to it. Apple's position is that their business model is based on selling us products, does not require them to sell our data, and that they're not reliant on advertising in the way that Google or Facebook are.

      But we know that Apple's model of selling products is going to have to evolve; they're going to have to become more of a service-oriented company, because the world is moving from an industrial products economy to a digital services one. The value in the digital economy is in services - not products. That's why Apple's share price was challenged last year, with their significant miss on iPhone sales targets.

      So as they move towards becoming more of a services company, how long will our privacy be safe in their hands? I'm not saying Apple is not a trustworthy company. In fact, even if an intermediary like Apple is trustworthy, the other issue is that they may not be sufficiently secure to protect our data. And the bottom line is that decentralized data protection is significantly more secure than centralized protection.

      Our proposition is that the Blockchain enables us to not need to trust a third party with our data. It enables us to trust technology, algorithms, cryptography and mathematics to protect us, not the goodwill, good nature or flawed security of a centralized intermediary. And that’s really important.

      We see this as the future of the internet: where we can have trustless data sharing that does not require us to trust a third party to protect our interests, where we can trust the internet and Web 3.0 technologies to protect them instead.

      The second aspect of our vision is universal access to value and agency. For me, self-sovereignty, where we have ownership and agency over our information, is what underpins universal access to value and agency.


      What I mean by access to value is this: our data is valuable, everybody knows that now, and we're learning by the day how much more valuable it is than we realise. Yet the extent of this is hidden from most people, and most people can't access this value. Look at people in Middle America, for example, who have been disenfranchised by this digital revolution, who had mining or manufacturing jobs, people who've been left behind as we move to this automated digital era.

      The question is how do these people access value? How are they able to participate in the economy? The problem at the moment is that money in all of its forms, even crypto money, still has a massive problem of being accessible to the masses. But when you understand that value in a digital economy is all about information and the integrity of that information, then you understand that everybody has value, because everybody has information. 

      I believe the nature of labor in the digital era will be about creating high-quality information, with high levels of signal integrity. This is a really fundamental proposition; that our data is of value, and the role we can play in enriching our data can enable us to increase and capture that value, and contribute value back into society. It’s a real pivotal concept in terms of inclusion: if we can give people agency and sovereignty over their data, and an equitable means of extracting value from it then de facto we create a much more inclusive and progressive economic paradigm at a grassroots level.

      Also, by giving people agency and control over their data, and creating a framework that enables us to qualify the integrity of information, we give people the ability to make truly free and informed choices. The other way of looking at this is to say, "Well, if we don't have a way to assess the integrity of information, or the ability to control how our data is used in information systems, then we cannot have free choice and we don't actually have free will." I think this is a really important context from which to look at this movement.

      What’s actually at stake here is our ability to make free, informed decisions.

      Our free will, essentially, which is the essence of our humanity. 

    • That’s a really great question.

      We talk about data, and people get lost - it is abstract, like algebra or maths. But actually, our data, more simply put, is like our memories. Memories of what we've done, experiences we've had - like photographs of moments in time. Our digital footprint is a culmination of experiences we've had in the world, captured electronically. But even more than that - our data also looks forward; it captures our hopes, fears and dreams, our aspirations, our desires. This is essentially what we're talking about when we talk about our data. So even though it's abstract, even dry-sounding, when you really pierce what it truly captures, it's the essence of our feelings, our memories, the things that are most precious to us outside of material things. The things that make us, us.


    • So there's another principle around how we've designed our system, that gets to the essence of sovereignty: we all own our data, and we give everyone a vault where they store it. We call it a metaVault. From that vault, we extract the pieces of data needed to complete tasks into metaPods, which can be shared with third parties. Those metaPods are stored in a wallet, and the important thing is that when we give a third party permission to access that data, we're effectively giving them keys for their wallet to see the data in our wallet. It's essentially a wallet-to-wallet framework. Now, the important thing is that 99.99% of the time we're not giving people our raw data - there may be some circumstances where the law requires some data to be transferred, but that's very rare. What we're doing is giving a third party a key that grants the ability to see our data, and a secure environment in which to process it. The data is not actually moving. All that is moving is the permission and consent. So that's how we retain control.

      If you no longer wish for a third party to have access to your data, you revoke or cancel their keys, and they can no longer see what's in your metaPod. It's almost like a bank where you have a vault connected to many private safe deposit boxes. A third party can come into your bank, into a room, to analyze what's in a safe deposit box you have allowed them to access, but they can't take the contents outside of the bank. They can run processes on that data in the bank, so long as they have your permission to do so; but if you don't want them to have access any further, you simply remove the key and they can no longer access it. The metaPod is like a safe deposit box with data extracted from your vault, which no third party can access without your key.
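      The grant-and-revoke pattern described above can be sketched in a few lines of code. This is purely a hypothetical illustration of a revocable-key access model, not metaMe's actual implementation; every name here (MetaVault, create_pod_grant, and so on) is invented for the example.

```python
# Illustrative sketch of a revocable, key-based data-access model.
# Hypothetical names throughout -- not metaMe's actual API.
import secrets

class MetaVault:
    """Owner-held store; raw data never leaves it."""
    def __init__(self, data):
        self._data = dict(data)
        self._grants = {}  # access key -> set of permitted fields

    def create_pod_grant(self, fields):
        """Define a 'pod' (a subset of fields) and issue a revocable key."""
        key = secrets.token_hex(16)
        self._grants[key] = set(fields)
        return key

    def revoke(self, key):
        """Withdraw consent: the key stops working immediately."""
        self._grants.pop(key, None)

    def read(self, key, field):
        """Third parties read through the vault; no copy is handed over."""
        if field not in self._grants.get(key, set()):
            raise PermissionError("access revoked or never granted")
        return self._data[field]

vault = MetaVault({"age": 34, "postcode": "SW1A", "income": 52000})
key = vault.create_pod_grant(["age", "postcode"])  # share only two fields
assert vault.read(key, "age") == 34
vault.revoke(key)  # owner withdraws consent
try:
    vault.read(key, "age")
except PermissionError:
    pass  # access is gone; the data itself never moved
```

      The design point the sketch tries to capture is that every read passes through the vault itself, so withdrawing consent is a local operation: the owner deletes the key, and no copy of the data has ever left their control.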

    • There are many people I take inspiration from. They come in different categories. I've been inspired by the great humanists, people like Gandhi, Martin Luther King Jr., and Nelson Mandela, who spoke about a bigger human agenda. I've been inspired by great innovators, people like Einstein, who challenged the fundamentals of scientific thinking, and people like Jung, who challenged the whole concept of psychology. I'm also inspired by the techno-revolutionaries, people like Steve Jobs and Elon Musk, people like the cypherpunks, and particularly people like Satoshi Nakamoto.

      But I'm also inspired by great storytellers - I've read a lot of science fiction over the years: people like Frank Herbert, Octavia Butler, Isaac Asimov, and Iain M. Banks, all people who've inspired me to think of the world and the possibility of a different kind of future. Even artists - people like Van Gogh, for example. I never really appreciated the magnificence of his work until I saw it alongside other Impressionists, and then I realized, "Wow, it's such a unique way to look at the world." So I take a lot of inspiration from disparate sources.

    • When it comes to privacy and security, there are a few organizations that have been leading the charge for a while now, like the last 10-15 years. The Mozilla Foundation have always been very progressive, very pro-privacy, providing developer tools that can enable privacy in a meaningful way. The EFF has been a consistently steady voice; before privacy reached the crescendo of concern it has today, they were championing the cause. New organizations like the Decentralized AI Alliance are great, as is the Open Data Institute. Tor has been on the frontline of privacy-enablement for many years as well. And people like Tim Berners-Lee have been great supporters of the cause, and his new venture Solid is interesting: I think it hasn't quite embraced the decentralized movement, but they're making good contributions to the space as a whole. In the large enterprise space, Apple has made a lot of great strides with privacy, but as I said before, we have to see if they'll continue to be pro-privacy as they pivot to services. And one of the big questions for Apple is whether they will embrace the decentralization movement, which is more private by default, or remain a centralized behemoth. I think it's fair to acknowledge they've done a lot in the space, but on whether they'll remain champions of privacy in a meaningful way, the jury is still out.

      And within the blockchain space, I think there are a few organizations who are head and shoulders above most: organizations like Sovrin are solid in their approach to decentralized identification and Enigma are impressive in their use of zero-knowledge proofs and secure hardware to enable private, anonymous data processing.

    • In a nutshell, I would say it’s staying on the right side of history.

      We’re moving into an increasingly technology-driven world and technology is becoming an increasing part of our lives. I cannot see that trend changing. But I think we’re at a crossroads between a potential Cambrian burst of innovation, creativity, progress, vibrance, and productivity on one hand… but it’s not a given that this is where we will end up. On the other hand I think technology could also lead us to a very dystopian future. Technology companies really do need to ask themselves what kind of world they want to create. 

      And they have to be really aware of unintended consequences, sometimes even intended consequences, and whether they are going to be on the right side of history. I worked for a tobacco company in the past, and I think that right now Facebook is as much of a pariah as big tobacco or big pharma – when you look at the larger picture, the destruction, I think there are many correlations between these types of companies at the moment. So I think staying on the right side of history is one of the biggest challenges for tech companies today. 

      If you think back to agency, inclusion, diversity of thinking - and that’s another thing, if you don’t have a diverse team or diverse inputs, you’re going to have constrained outputs, outputs that are biased, not inclusive or representative of the needs or aspirations or desires of society writ large, and that’s a massive problem. And right now, technology companies have really poor credentials in that space. Really poor credentials of inclusion, diversity, on being ethical and being ethically driven. The challenge is less about what the technology can do now. It’s more about why you’re doing what you’re doing with it.

      Is what you’re doing bringing a net benefit to society, or is it a net detractor or destroyer of that which is important to us as human beings?

    • So I think the book du jour on this topic is The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power by Shoshana Zuboff. It really is a seminal piece on some of the dangers. I don't necessarily agree 100% with her conclusions, but I think it's a great, comprehensive place to start. Also, we are creating a comic series called "Metaiye Knights" as a more fictional story to teach people about these issues. There's a lot of technical documentation in this space, and certainly The Age of Surveillance Capitalism is a very authoritative and well-researched book on the space. That said, I do believe we also need to articulate these issues in storytelling, narratives that people can connect with emotionally. A great fictional writer in this space is Cory Doctorow. Doc Searls has been an early pioneer in the space and now runs a big "Vendor Relationship Management" group of privacy professionals. There's another group out of England called "Control-Shift" who run conferences around the Personal Information Economy and have blog posts and a newsletter; they are also a good source of information on privacy.

    • Yes, I know David Siegel well, and we have been in discussions with the Pillar team for a while on collaborating. I am sure we will do so very soon. I am also very familiar with Sir TBL's Solid and Inrupt projects and am looking to collaborate with them in the near future. Our protocol is designed to add value to a lot of these services, and we believe it will raise the tide of the entire privacy industry.