The Digital Deal Podcast

There's Hope at the Edges of Power

Meredith Whittaker & Calin Segal - Season 1, Episode 5

In today's episode, we talk to Meredith Whittaker of Signal and artist Calin Segal about what surveillance and the concentration of power in the hands of a few tech companies mean for society. We discuss the context that made it possible for these companies to capture data without regard for privacy and use it to produce new social, cultural, and political dynamics. At the edges of what looks like an inescapable panopticon society, we find hope in the role of art, research, critical thinking, and organisations that level the playing field and prove “none of this is natural or inevitable”.

Resources:
Selling the American People by Lee McGuigan
Profit over Privacy: How Surveillance Advertising Conquered the Internet by Matthew Crain
The Exhausted of the Earth: Politics in a Burning World by Ajay Singh Chaudhary
When the Clock Broke by John Ganz
Postjournalism and the Death of Newspapers. The Media after Trump: Manufacturing Anger and Polarization by Andrey Mir

Host & Producer: Ana-Maria Carabelea
Editing: Ana-Maria Carabelea
Music: Karl Julian Schmidinger
________________________

The Digital Deal Podcast is part of European Digital Deal, a project co-funded by Creative Europe and the Austrian Federal Ministry for Arts, Culture, the Civil Service and Sport. Views and opinions expressed in this podcast are those of the host and guests only and do not necessarily reflect those of the European Union or the European Education and Culture Executive Agency (EACEA). Neither the European Union nor the European Education and Culture Executive Agency (EACEA) can be held responsible for them.


Ana-Maria Carabelea: Welcome to The Digital Deal Podcast, the series where we talk about how new technologies reshape our democracies and how artists and critical thinkers can help us make sense of these changes. My name is Ana Carabelea, and today I'm joined by Meredith Whittaker and Calin Segal.
 
Meredith is the President of Signal. She is the current Chief Advisor and the former Faculty Director and Co-Founder of the AI Now Institute. Her research and advocacy focus on the social implications of artificial intelligence and the tech industry responsible for it, with a particular emphasis on power and the political economy driving the commercialization of computational technology. Prior to founding AI Now, she worked at Google for over a decade, where she led product and engineering teams, founded Google’s Open Research Group, and co-founded M-Lab, a globally distributed network measurement platform that now provides the world’s largest source of open data on internet performance. She has advised the White House, the FCC, FTC, the City of New York, the European Parliament, and many other governments and civil society organizations on artificial intelligence, internet policy, measurement, privacy, and security. 

Calin is a computational artist whose work focuses on generative systems. He co-founded the In-Dialog research studio to explore the intersection of art and technology. His skill set spans design, craftsmanship, coding, and 3D modeling, and his creations are defined by their balance between calculated control and organic unpredictability, leveraging intricate mathematical models and algorithms. Calin’s work has been exhibited at Nuit Blanche Paris, the Enric Miralles Foundation, LEV – Matadero Festival, and the Geneva Mapping Festival. He has also been part of residencies at V2_ Lab for the Unstable Media, S+T+ARTS GRIN, S+T+ARTS VOJEXT, and CYENS – Center of Excellence in Cyprus. Calin is also currently an artist in residence with the Zaragoza City of Knowledge Foundation, one of the partners in European Digital Deal.

Welcome to both of you. Thank you so much for joining us today. 

Meredith Whittaker: Wonderful to be here. Thank you. 

Calin Segal: Yeah, thank you for having us. 

Ana-Maria Carabelea: It's great to have you. What I'm hoping to unpack is how the data captured from our online behavior is used to shape our behavior, in what reminds us of a panopticon defined by asymmetrical surveillance that then becomes a form of correction and discipline and, as I said before, a way of shaping behavior. I think since Shoshana Zuboff's The Age of Surveillance Capitalism, we know that capturing data serves not only to describe or understand realities but rather to prescribe behaviors in a way that starts to produce new social, cultural, and political realities. I think it's helpful to start by looking at what drives this instrumentalization of data, both the economic and political mechanisms behind it, so that we can begin to understand how these online spaces come into being, and ultimately what ways of resisting them we might have. Meredith, I'd like to ask you to go first and give us a bit of context; talk us through what brought about, and what maintains, this predicament that we find ourselves in. 

Meredith Whittaker: That is a very, very big question, and - like all big questions - there are so many places we could start, so many genealogies we could trace in the process of answering it. We could go back to state statistics and the requirement of information asymmetry as a mechanism for governing, for controlling, for administrating across large populations of people. You can read Ian Hacking's work and other historical work that looks at the rise of state statistics. We can look at this not as a digital phenomenon necessarily, but as part of the way that governance, centralized power, constitutes itself via knowledge of its subjects, via information asymmetry. We can trace that through the origins of computation, which I have done research around - I’ve spent a lot of time in the archive. You can see very clearly that, in the figure of Charles Babbage, computation was entwined with questions of how you discipline workers, how you control large swaths of the population in service of producing for the British Empire, in service of keeping the British Empire’s standing in the world order, in the emerging capitalist system that was being created. We can then trace that through to the Cold War era, post-World War II in the US, and the emerging rivalry with the U.S.S.R. that animated fantasies of omniscient technology that could use data collected by sensors and devices and computational infrastructures to see everything, process everything, and do that in service of nuclear dominance and an ability to stymie a potential nuclear attack from the Soviet Union. This is around the time that you start to see techniques developed in operations research in service of these Cold War fantasies, combined with the imperatives of marketing and advertising. The scholar Lee McGuigan has done some pretty incredible work tracing this genealogy in a book that he published, called Selling the American People. I would recommend that book if we want a history of the longstanding entanglement between advertising, operations research, and the idea that via this type of data about demographics, about people, we could both sell them, manipulate them into buying, create markets, and accurately target weaponry, accurately create these omniscient Cold War systems. I emphasize that point because it shows that what Zuboff and so many in recent history are observing is not new. It's a continuum that tracks alongside the growth of computational power and infrastructure. It was supercharged by the commercialization of the Internet in the 1990s, when the Clinton administration did two important things that set the groundwork for where we are now. The first was to endorse advertising as the business model of the Internet - so, keeping it in line with more traditional media, television, print media, so the advertising industry would continue to have their piece – and [the second thing was] to put no guardrails on privacy – so, effectively permit unchecked surveillance by these companies. Of course, endorsing advertising creates the incentive for surveillance. Know your customer. Know more and more and more about them. Use your algorithms to create demographic types to micro-target all the pathologies we're living through today.  

There are a lot of answers to that question but, I think, ultimately, there has been a very long history that is related to the growth of certain forms of capitalism, related to the administrative imperative of states and other centralized corporations or entities, and has a lot to do with Cold War fantasies that have animated and narrated the capabilities of technologies we have today, often to our detriment - because a lot of these stories look more like fairy tales than clear descriptions of these technologies' capabilities, and [only] when we dig into their material reality do we see what they can actually do. So that's a long answer, but it's a big question. 

Ana-Maria Carabelea: That's great. That's exactly what we needed to get started. It's a very comprehensive history, albeit short, and it brings us into the present. And I'm going to turn to Calin now because your project looks very much at where we're at right now and tries to critique it, or approach it from a certain angle. Can you tell us a bit about your project, and what prompted you to choose its focus? 

Calin Segal: A bit of background on my project: I started with the idea of looking at media through the lens of propaganda. So, to simplify the terminologies of marketing and advertising, I combined them all under the classical term of propaganda, because they all have the same kind of roots. What I was interested in understanding was how - in this landscape, where digital media reigns supreme - specialized elites, such as politicians and corporate leaders, wield an unprecedented degree of power over public opinion. These specialized trendsetters skillfully utilize emotional resonance to legitimize their message and construct compelling narratives that shape collective perspectives. In a way, there's nothing new about this. The only difference now is that algorithms have become more aware of personalized preferences and biases. This, coupled with AI and deepfake technology, has the potential to spread this form of propaganda or misinformation at a scale never seen before. So, I wanted to particularly understand the relationship between propaganda and the arts. And that's where the emotional engagement comes into [play], because there's historical precedent in how this has been utilized. Today's landscape resembles much more the theater of the absurd than a platform for reasonable public debate, and this is no coincidence. Since art and propaganda have always gone hand in hand, the ability of artists to do world-building has always been used by socio-political elites to embed their ideology into the public discourse. We can look at Joseph Goebbels, Edward Bernays, or even Putin's former PR consultant, [who] came from the avant-garde movement. There's a precedent of how artists have been misused, or their talents [have been used] in a negative way, to manipulate public opinion. I wanted to look at how these emotional engagements happen by studying the stereotypical influencers of today. By looking at the extremely religious evangelists, the radical activists, the dominant CEOs - all of these archetypes of public figures - I wanted to understand how they frame the issue, leverage social proof, and exploit the us-versus-them mentality in order to manipulate the public's emotions and perceptions. Tying it all back to the relationship between the arts and this model is what I find very interesting. 

Ana-Maria Carabelea: Meredith, in one of your articles, you draw attention to the technology research field as being closely tied to industry. Going back to what Calin was saying [about] the role of art and art's presence in selling technologies or selling political ideas, I was wondering - not just [for] academia, not just technological research, but other types of research, arts, and culture - do you see any healthy, productive ways for these fields to work with industry? What would allow for that? 

Meredith Whittaker: Well, I think these fields are vital for the survival of human life and culture. Art is not frivolous. It's not something we put on top of the serious business in order to have an aesthetic experience. It's how we relate to the world and describe it to each other. It's how we make meaning. It's a very deep and fundamental practice. When I think about artistic collaboration or working with industry, what I often see is industry setting the terms and, in effect, contracting with artists to paint smiley faces on the side of the bomb, to be metaphorical about it. Right? Google might have a Google for Artists program, but what they're actually doing is giving grants to artists willing to use their APIs in ways that bring attention to their products and services that can be considered positive. This is not full artistic license. Oftentimes, the terms are set so narrowly, the requirements are set so narrowly and, to be frank, the artistic field, in a sense, has been hollowed out by financialization, hollowed out by the fact that the job market is now not really conducive to people who don't have financial stability entering into an artistic career. I think this helps explain how boring art has become in the last decade or so. I went to art school. I came up in that world. It's something I care deeply about. [I think this helps explain] why so many of these kinds of corporate art collaborations end up feeling like hollow marketing, because, in fact, they are. It doesn't matter how noble the artists might be or how interesting their approach is. The terms have been set by the companies - the terms of what even counts as art, the curation, all of that has been done by forces that have no interest in critical interrogation, which we certainly need more of. I think critical interrogation of these systems - their limitations, how weird, odd, centralized [they are] - all of that deserves meaningful artistic focus. But we can't confuse throwing a few marketing coins at some artists to make your party a little bit more fun with actual artistic practice that is serious about such engagements. 

Ana-Maria Carabelea: Calin, as an artist - I don't know, you tell us how engaged you are with the corporate world or not - what do you think would enable you to have a practice that allows for more critical thinking?  

Calin Segal: I guess there are two layers to this. One is that the opportunities that exist are limited. We're lucky in the European Union to have these kinds of initiatives, but outside of them, real artistic research is very limited. Most of it is consumer-oriented art, and that's product-based, [and] it has a very short time span compared to a research program that's one or two years. But there's also the problem [that] everything has to be shared on social media, everything has to have a specific format. Even the ratio of the images, the way you narrate your stories, has become so standardized that it's really becoming harder and harder to compete. If you want to challenge your audience, all of a sudden you find yourself marginalized by the algorithms, [which means] you find yourself having less impact just because you're touching on topics that might be a bit sensitive, because you don't have the right visual storytelling to go with it. It's hard now to create a unique identity when everybody's looking for that standardized way of exhibition or presentation. But it's a game to play, right? It's [about] how you trick the system. It's not a defeatist position where [you think] that's it, the world's going to end tomorrow. It's more about being aware of this context in which we're producing art and having ways of manipulating it to serve our own benefit. 

Ana-Maria Carabelea: The other thing I thought [about] is the way that surveillance is often presented as the price we have to pay for security - whether it's safer online spaces or safer cities - or for personalization, where we get exactly the content we want, we see what interests us. As Meredith mentioned: know your customers, and that means you can give your customers exactly what they want. Despite the fact that we continue to see a lot of anxiety, or perhaps more and more anxiety, about the presence of surveillance technologies in our everyday life, quite often the argument is brought up that people are okay with the trade: we are more and more aware of it these days, and yet we continue to sort of sign up to it. Is that really the case? Are we ready to give up privacy for security and comfort/personalization? And if that's the case, what's the problem with it? 

Meredith Whittaker: I mean, God no! There's no consensus on any of that. There never has been, and even the framing is false and, I believe, disingenuous. Privacy is not oppositional to security. Privacy is not oppositional to comfort. We're in an ecosystem where, because we have seen an endless string of privacy issues, of breaches, of malfeasance by tech companies, of things like the CrowdStrike outage - which shows us just how powerful and dangerous the centralization of core resources in the hands of Microsoft is for the world at large - we are seeing a demand for more privacy, for more autonomy. How we meet that demand in a political economy where, for the last 30 years, tech has been growing in a monopoly form under the umbrella of a handful of large corporations largely jurisdictioned in the U.S. - where what we think of as startups or innovation is really just floating on the surface of these entrenched monopolies - is a question that is hard to answer.  

But we cannot mistake the fact that we have foreclosed meaningful consumer and government choices around privacy, around computational infrastructure, for consent, for people being actually okay with these systems. I will point to the fact that Signal’s user base continues to grow hugely. Meta is now pouring billions of dollars into marketing WhatsApp as private - private being this morally redeeming value they can claim because they're using the Signal protocol to protect part of the WhatsApp information. Apple is pouring huge amounts of money into marketing their vertically integrated monopoly as private – and in some sense they are providing better services than others. There's a huge understanding that people actually, very much do value privacy, particularly in a very volatile geopolitical context, where things like reproductive health care are being criminalized, where we have the conflict between Russia and Ukraine, where there is growing concern in some camps around China. The recognition that privacy isn't an old-fashioned, nice-to-have value but is actually constitutive of the ability to have independent government, constitutive of the ability for human rights, workers, journalists, dissidents to function within these contours, means that I don't think we can take that framing seriously. I honestly don't even think the people who used to profess it with such confidence take it seriously. I couldn't picture Eric Schmidt saying it again in this climate, because it's just so simply not true. I think we now need to grab this opportunity to really demand a supercharging of privacy. How do we undo some of the sins that were laid down in the nineties with this surveillance business model and build infrastructures that are privacy- and dignity-preserving, that preserve the agency and the sovereignty of independent states that can be democratically governed, that can function more like Signal than like Meta, and actually move in the direction of that model? Because at this point, I don't think anyone serious is arguing that there's no problem here. 

Calin Segal: Yeah, I can even give an example from my own field. When generative adversarial models came into play, everybody was excited: oh, we could create images. Everybody started playing with them. But when those images started resembling my artworks or my friends’ artworks, all of a sudden people started caring about where these databases come from, who is scraping them, what are they scraping them for? I think it takes a while for the public to understand the implications. And also, I think we're moving to a new form of privacy. It's no longer the classical idea of privacy - I'm in my house, and this is my private space. It's more about privacy of identity, right? How do I protect my identity, which is on the Internet, which is in those databases? And that's something specific: I'm talking about the art world, because that's something that I know, [where] now the biggest debate topic is: how do you protect everything that you're willing to share with the world? How do you share your artwork without [it] being stolen by those kinds of corporations? It's really a matter of time for there to be social momentum behind those movements, moving outside of academic discourse into their real-life implications. 

Ana-Maria Carabelea: What's interesting is also this asymmetry - you do want to share, but that doesn't necessarily mean that you're giving it away, or you do want to be online, but that doesn't mean that you're okay with your data being used for whatever purposes it's being used. And I'd like to hear what you think the direct implications for democracies are. I think this is a particularly good moment: we've had elections all over Europe, we're looking at elections in the U.S. coming up, and there's a lot of concern about that. 

Meredith Whittaker: I can start by saying, I don't think this is a technical question. If you look at Signal, Signal proves we can build highly performant systems that collect close to zero data, that go out of their way to not remember you, to not define your identity for you, that don't scrape copyrighted material, that don't claim the riches of the open web for one proprietary entity. All of that is possible. In fact, the opposite is what we should be interrogating, because there are many, many ways to build technologies that look nothing like the political-economic structure we have.  

I think what is dangerous to democracy is the significant concentration of power that defines the tech industry globally. Today we have two poles - the U.S. first and China second – that, for various historical reasons, developed platform monopolies on the backs of this surveillance infrastructure. There's a very interesting history of this throughout the nineties. I would recommend Matthew Crain's work Profit over Privacy. It’s a book that is a great diagnostic of this phenomenon [that] ultimately allowed U.S. companies to dominate the primitive accumulation phase of the commercial tech industry, to build out infrastructures globally, to build out platforms with incredibly entrenched network effects that make it almost impossible for new entrants to compete. And so, at this point, we have 70% - that is, 70 - of the global cloud market in the hands of three U.S.-based companies: Amazon, Google, and Microsoft. This means every startup - and, beyond [that], every government - is effectively licensing infrastructure from these companies in order to exist, in order to scale. We have four of the five major social media platforms also in the hands of U.S.-based companies - these same platform companies, the YouTubes, the Instagrams, the Facebooks, et cetera. And we know that people don't go to Der Spiegel's website generally. They don't go to the New York Times; they go to Twitter or Facebook. Most news is encountered through these platforms. So, what you have here is an incredibly concentrated control - in one jurisdiction - over our core computational resources and our media ecosystem, with the ability for these companies - as we know - to put their fingers on the scale at any point they want. [...] Whatever side you're on, the volatility in the U.S. [and] the current election cycle are waking everyone up to the fact that these are single points of failure that could be easily weaponized by whoever it is, and that the consequences of that weaponization would spill well beyond the US, shaping opinions, public discourse, our sense of shared reality across the globe, and potentially providing an unprecedented degree of information asymmetry benefiting the U.S. regime and potentially harming other states, corporations, and those who are dependent on these infrastructures. This is not a prediction. I don't know what's going to happen in the election. I have no crystal ball. I do think this confluence of events and the negligence over the past three decades in terms of managing this market and ensuring meaningful competition has led to a moment where everyone is uneasy, and the people who really understand these systems are deeply uneasy. Pushing for an alternative is where my focus is: how do we grow the Signal model, and how do we grow much, much more alongside that, so we create a foundation for technology that is not toxic or concentrated? 

Calin Segal: I can even make a parallel with the tools we're using in the digital arts, right? [There are] five companies that own most of the tool sets that we use. So if, tomorrow, let's say, Adobe decides to close down, all of a sudden we're locked out of the system, because it's a cloud-based system; it's not running locally on my computer. This idea of everything being on the cloud, this push for everything to be on [it], is actually what has taken away agency from the end user. Whereas before I could open things up, hack in all directions, and make the tools do what I needed [them to do], I now find myself blocked by all these corporate gates that have been put in place once they've moved into the cloud model. What's really perverse is that this cloud model amplifies these companies' power of control. It gives them more and more access to how we do things, what we do with their software, and it removes some of that agency from the end user. 

Meredith Whittaker: Yeah, I would just co-sign that. We have seen power that used to be at the edges, so to speak, almost completely consolidated at the center, the center being these five companies. This is across the board, and it's something, of course, we at Signal are fighting. We are rewriting a lot of the stack to enable privacy. We are working to claw some of that power back. But what we need is not just one effort or another. We need a concerted push to really invest in shifting this paradigm toward a more democratic, more accessible, more actually innovative ecosystem. 

Ana-Maria Carabelea: And not just that, but once power is concentrated in the hands of a few opinion makers, companies dictate the public discourse. And it's not even in the way that Calin was mentioning before - the classical way propaganda did it, where we were all aware of the public discourse, it was all the same for everyone. [Instead], it's now very fragmented. We all have very different bubbles in which we [are made to] believe that's the public discourse for everyone, but it's not. So, at the same time as the power is centralized, there are many public discourses that kind of separate us into different bubbles, which inevitably makes democracy a lot more difficult to navigate and a lot more difficult to understand. 

Meredith Whittaker: Well, I think we could analogize it, perhaps, to the Catholic Church's control over the written word for many centuries. I don't want to index too hard on filter bubbles. I think it's a very disempowering frame in some way: that somehow, because we don't share, because our information ecosystem is so polluted, none of us are allowed to have any purchase on what is real, that we somehow just exist in a bubble, and we can't turn to democratic processes or a deliberative practice because we've somehow lost the thread. And, of course, there's an implication there that somewhere, some system, some company would know the whole picture, and we need to turn to them. I want to be careful about indexing too hard on that framing because it's not always in good faith. Of course, human beings encounter each other. There are disciplines that have rich literatures that are studying many of these elements. There are people producing brilliant work. There is a lot that is happening artistically, intellectually, and otherwise, to explain and map what we're dealing with. At the same time, ad-supported, algorithmically-driven content platforms are not a good way to support a shared information ecosystem that is going to help us with the kind of complexity we're dealing with. [...] It's not good because they rely on mass surveillance. It's not good because - in the name of time spent on the site, ads clicked on the site - it is showing us more bullshit and less meaningful content. And it's not good because the type of control that these massive networks exert via a handful of companies means that we really could see social control, information control at a level we've never seen before. 

Calin Segal: And in a way, it's a self-fulfilling prophecy, right? Because the more these technologies are used, the more fragmented [the discourse] becomes, the more it's accentuated. It's a never-ending loop. And that's the main problem with where the Internet is today: as you said earlier, everything is centralized. Before, it used to be much more [about] blogs, posts, random websites, so you actually had to make the effort of scraping and moving around to get your information, whereas now, with all the LLM agents like ChatGPT, everything gets funneled into one single access point. [This] is fueling this kind of individual bubble, because we get everything from one platform that has been curated for us. It creates our own vision, [or] it amplifies our own vision, and, therefore, we believe that that's the common consensus. It's something that I see a lot in my installations. When I create provocative moments, people really tribalize. You have groups uniting, groups in opposition, people starting to fight each other. It takes a little spark to ignite this kind of social manifestation. I’ve been seeing it also in terms of generations. [...] Especially my generation and the younger generation, we're up for fighting for our beliefs. We're very ready to fight for that, but we don't know what to fight for, we don't know which side to fight for. So, we end up creating a cacophony of opinions and ideas around each other. Whereas the generation after is much more willing to debate and discuss. This is happening in a 20-person group, and I've seen it happening throughout exhibitions too. I'm guessing this can also be applied at the societal level, where there's this trend [of thinking]: because I'm entitled to an opinion, I have to say it, and therefore it's my own beliefs that count over the collective beliefs. 

Ana-Maria Carabelea: Great. Thank you so much. Before we end the conversation, I usually ask this for those listening who want to dive deeper into these topics. Do you have any recommendations for books, articles, research pieces, or artworks? Meredith, you mentioned a couple already. Is there anything else that you'd like to add to that already great list? 

Meredith Whittaker: I'm going to just add a couple of things I'm looking at on my table right now, which I'm enjoying. I just started a book called The Exhausted of the Earth that is really interesting. It’s thinking very broadly about the theme of extraction, whether it be our time or fossil fuels or just the extractivist mentality that led to the hollowing out of open-source core infrastructure in the name of computational commercialization. It's a very powerful framework for thinking about where we are and how we might get away from there. Although I've just started it, I'm already very much enjoying it. Then there's another book called When the Clock Broke by the author John Ganz, which looks at the political context of the early 1990s as a precursor to what we're seeing today. That has been very helpful as I work on a longer project that theorizes, in part, the 1990s as a core history of our tech ecosystem today, putting that in conversation with Matthew Crain, Sarah Myers West, Karina Rider, and others who've done work around that critical prefigurative period that led us to our context. When you look at Ganz's book, you see that in a lot of the political movements there was a turn toward the reactionary, toward the racist. There was a celebration of David Duke as a viable political candidate. There was an attack on people who received social services. There was a thinly veiled racist attack against "welfare queens” and others. So, you begin to get a sense that the environment that created the foundational tenets of Internet policy looks a bit like what we are dealing with today. For me, it gives a much richer map of how we might get back to something more socially beneficial, incorporating an understanding of where we came from. 

Calin Segal: The book that really helped me was Andrey Mir's Postjournalism and the Death of Newspapers, which is a nice, in-depth look at the evolution of media from the financial perspective, but also in terms of technology. It also touches on Manufacturing Consent by [Edward S. Herman and] Noam Chomsky and how Mir readjusts that model for the current situation, [which] gives a really nice historical perspective while also touching on the contemporary landscape. 

Ana-Maria Carabelea: Great. All great recommendations. Thank you so much. Thank you so much for joining me today. It was a pleasure to have you on. 

Meredith Whittaker: Thank you so much. It's been a really lovely conversation, and nice to meet you, Calin. 

Calin Segal: Likewise, Meredith. 

Ana-Maria Carabelea: That's it for today. Thank you so much for tuning in, and I hope you enjoyed it. The Digital Deal Podcast is part of the European Digital Deal, a three-year project co-funded by Creative Europe. If you want to find out more about the project, check out our website, www.ars.electronica.art/eudigitaldeal. 
