The Digital Deal Podcast

Love and the City

Ars Electronica Season 1 Episode 7

What happens when our public behaviour is constantly monitored? Do we still hug, kiss, or allow ourselves to be vulnerable in public? Does surveillance stop us from being at our worst? In this episode, we talk to artist Noemi Iglesias Barrios and Guggenheim curator Noam Segal about surveillance systems in public space and why we might want to 'measure' cities in terms of emotionality by training algorithms to search for signs of love on the streets.

Resources:

The Radicality of Love by Srećko Horvat
The Shadow (1981) by Sophie Calle
Noise: A Flaw in Human Judgment by C. R. Sunstein, D. Kahneman, and O. Sibony
Operational Images by Jussi Parikka
The Atlas of AI by Kate Crawford
Race After Technology by Ruha Benjamin
Nexus by Yuval Noah Harari 

Host & Producer: Ana-Maria Carabelea
Editing: Ana-Maria Carabelea
Music: Karl Julian Schmidinger

The Digital Deal Podcast is part of European Digital Deal, a project co-funded by Creative Europe and the Austrian Federal Ministry for Arts, Culture, the Civil Service and Sport. Views and opinions expressed in this podcast are those of the host and guests only and do not necessarily reflect those of the European Union or the European Education and Culture Executive Agency (EACEA). Neither the European Union nor the European Education and Culture Executive Agency (EACEA) can be held responsible for them.


Ana-Maria Carabelea: Welcome to The Digital Deal Podcast, the series where we talk about how new technologies reshape our democracies and how artists and critical thinkers can help us make sense of these changes. My name is Ana Carabelea, and today, I am joined by artist Noemi Iglesias Barrios and curator Noam Segal.  

Noemi Iglesias Barrios is a multidisciplinary artist and researcher working with porcelain and long durational performative formats. Her practice focuses on how emotional experiences, such as falling in love, are influenced by consumerist strategies. Currently, she is researching cobalt extraction and urban recycling processes at the University of Fine Arts of Lisbon. In 2024, she had her first solo exhibition, Love Me Fast, at the Museo Nacional Thyssen-Bornemisza in Madrid. Her upcoming exhibition, Landscapes of Affection, will open in December at the Museum of Fine Arts of Asturias. Her work has been shown at DG Connect of the European Commission, Electron Festival in Switzerland, the Gimhae Clayarch Museum in Korea, the Aichi Prefectural Museum of Art in Japan, the Yingge Ceramics Museum, and the MOCA in Taiwan, among others.  
Noemi is also currently an artist in residence with GLUON in the European Digital Deal project.  

Dr. Noam Segal joined the Guggenheim's curatorial department in 2023. Her work centers curatorial practice at the intersection of digital, political, and social representations in contemporary art. Prior to joining the Guggenheim, she was the director of curatorial research in the MA program for curatorial practice at the School of Visual Arts, New York, where she organized The Algorithmic State, an online series of critical discussions featuring artists and technology scholars such as James Bridle, Agnieszka Kurant, Lev Manovich, Hito Steyerl, and others. Noam has curated or co-curated exhibitions and presented public lectures and programs at several museums and arts organizations across the United States, Europe, and the Middle East, including Still Present!, the twelfth Berlin Biennale for Contemporary Art, and Afterwards was already before, the AURORA Biennial in Dallas. Noam is also a frequent contributor to books, academic journals, art magazines, and catalogues.   
Welcome to both of you. Thank you so much for joining me today. 

Noemi Iglesias Barrios: Hi Ana. Hi Noam. 

Noam Segal: Hi Noemi and hi Ana. Thank you so much. I'm glad to join you here today. 

Ana-Maria Carabelea: So am I. Thank you so much for joining me in what is going to be a discussion about love and public displays of affection, but with a little caveat: we're looking at that in the age of surveillance and, more specifically, how emotions are lived in the public sphere with this awareness of an increasing possibility of being seen or surveilled. 

I wanted to look at a few things that make and remake the ways in which we love and express emotions in an increasingly surveilled public space. And Noemi, the project you're currently working on provides a good introduction. So, I'll invite you first to tell us about the project, place it in context, and explain how it connects to your practice in general and your previous body of work. 

Noemi Iglesias Barrios: Thank you, Ana, for the great introduction. I'm very happy to join the Digital Deal Podcast, and I will speak about my latest project, The Falling City. It's a project that measures the levels of emotional display in the city. I think the streets of a city reveal and reflect the values of the society inhabiting them. While there seems to be a need for technologies that measure levels of safety - recognizing violent areas, say, or identifying pavements in poor condition where the most vulnerable people tend to fall - I think something is missing: there is also a fall in falling in love, and emotions are fundamental elements in the construction of public space. 

While there is a wide range of advanced technologies on the market focused on registering the negative side of people's actions and reactions in cities to guarantee privacy or safety, there is still no surveillance system based on artificial intelligence that has been trained to recognize people's emotional behavior by measuring the levels of empathy and affective display in public space. 

I normally work around the theme of love and emotions, and many times, I've been asked why. I'm constantly doubting, you know, why or why not. I don't know if love is something that I've chosen or if it's something that has chosen me. 

Ana-Maria Carabelea: Obviously, I was going to ask why, and I'm still curious. My “why” is not why you specifically chose the theme, but perhaps - and it's a question for both of you - why love, or how love might be a good pathway into creating this understanding of how the presence of surveillance technology in public space affects us. 

Noemi Iglesias Barrios: Well, I'm not starting a debate on the desirability of these kinds of technologies in public space. I'm not focusing on whether we should have cameras or not. I'm more focused on the fact that if these technologies are there, why are we not using them from a positive side? Why can't we also train these kinds of technologies on something more related to us as humans? And well, there's nothing more human than emotions. I was also quite surprised that this technology hasn't been used to read our emotions, while we always use it to map violent areas or poor street conditions. 

Noam Segal: Thanks, Noemi. It's so wonderful and fascinating to hear about The Falling City and how you think about love in public space. 

I found it particularly fascinating because of the question: how do you even train machine learning about love? How do you manifest this idea into a code based on mathematical probabilities? 

What kinds of concepts, terminology, or datafiable things mark what we call love? Beyond the set of representations you've so aptly created, we also have tools today that track our eye movements and irises in public spaces, at least around America. Basically, they see how our eyes react to these kinds of gestures of love and care, and so on. 

There has been a big change in the way that love, let's say, is represented in digital environments. If we think about previous evolutions of the web - the networked societies of Web 2.0 - the Internet was a place where we could create spaces for care, for love, for building communities. It served as a big space for LGBTQIA+ communities and so on.  

What we are working with now is the very opposite environment. There, we had transparency and could come together; today, it's rather the opposite. We don't have any access to the code. The code is proprietary, and we don't know exactly how it organizes and choreographs us socially. 

I want to mention a few other artists who tackle this in different ways. For example, Julie Chang, who created spaces to enhance love in digital spaces - love that was not always accepted in public spaces under various oppressive laws - the entire hacking manifesto of how we can embrace and amplify gestures of love. And on the other end of this, of course, we can today see smaller datasets and AIs that were initially trained on values of care, intimacy, and love - for example, in the work of Stephanie Dinkins. 

We can think of many, many other artists who designed code calibrated to react as a more loving, vulnerable, and caring figure. There is a great range of artworks we can think about here. We can also talk about how surveillance has been tackled in different geographies because, of course, what happens in China is different from Korea and Singapore, and what happens in the US is very different from the Middle East or Latin America. We have different infrastructures for that. 

Ana-Maria Carabelea: I like the idea that we have different infrastructures, but I also think we have different ways of displaying emotions. And I want to quickly go back to Noemi's project to understand better how you define love. What does the algorithm read as love? What sort of categories is it operating with? 

Noemi Iglesias Barrios: Well, the technology we are working on is an algorithm that we are training to recognize three actions: hugging, kissing, and holding hands. It's not that I believe that whether you hug or kiss somebody determines whether you love or don't love. But there are certain social agreements that we have come up with in society, and I thought these three actions might more or less define a display of emotions in the public space. Of course, it's not the same filming in a European city, where people are allowed to hold hands, as in some other countries, where it might not be allowed - but that doesn't mean people there don't love each other. Somehow, I had to work with something that could be a reflection of intimacy, of an emotional display, and I think these three actions could be the basis for that. I have to start somewhere. So, I thought, let's start with these three, and then we can refine the algorithm.  

I'm not using facial recognition. That's something I had very clear from the beginning, that I didn't want to use such technology. So, we are recognizing the actions through the skeleton. It's called skeleton recognition. We are only tracking the movement of humans on camera, and we count in seconds. With this, we measure love or emotions, if you want to put it this way. I'm not stating that you can measure love in seconds, but with this kind of technology, you have to keep some data to track the information. So that's the way we are starting and setting the basis of the project.  

Noam Segal: Noemi, it is just fascinating - the way you highlight the gestures of love. Of course, you have to quantify them, right? There is no other way to represent something for an algorithm without quantification. And one of the things that struck me about your work is that, on the one hand, it highlights these gestures to gallery audiences and shows - with a flickering light, almost like fireflies - how we can find those magical moments in public spaces that are allegedly free yet surveilled, while still keeping the opacity and privacy of the people. 

At the same time, I thought that the even crazier potential is what would happen if we used that tool in reverse - in a reverse-engineering way - so that the algorithm actually sees more and more gestures of love, and we train it to amplify and deal with this in the way it creates its outputs. If we create more representations of love, then we have more probabilities of love, and then, to some degree, we can repair - let's say we can use the language of repair - or realign the code to consider that in its output. Instead of the original mechanisms of surveillance, or whatever it was trained upon, we can actually intervene in those codes to amplify and change the outputs. There is great potential in the way the work turns to the audiences, but at the same time, it can also turn its gaze to the algorithms and the code themselves and create that process of reverse engineering. 

Noemi Iglesias Barrios: That's why I started this project. If this technology could feed us positive information, why is it not doing it? Why are all these surveillance systems, codes, and algorithms always feeding a discourse of fear? They could feed a discourse of love. It would probably be healthier for us as humans. 

Noam Segal: How do we create that operationality? How can we make these systems operational through gestures of care and love? 

Ana-Maria Carabelea: Yeah, I love the aspect of subversion, of hacking - training a code so that it does something other than what it was originally intended to do. The other very interesting thing is this idea of approximating what love is in the eyes of the algorithm, which comes with the question of what we measure and quantify as a way of assigning value to things. I am very curious how you think this - beyond just training the algorithm to recognize these other aspects of human life - might help us go even further, perhaps by coming up with different ways of defining value, different definitions of value and of what we see as important. 

Noam Segal: Thanks for this, Ana. It's a fascinating question, and I grapple a lot with it. We talked about how love can be quantified and about Noemi's choice to use duration - because how else would you? You have to quantify these things. But then there is the question that stands at the basis of all of this: computer scientists had to translate everything into quantified, objective data, and that already carries a set of values, right? 

The understanding is that everything can be reflected and represented in a way that is datafied and turned into mathematical values. This is something to carry in mind as we delve into this question. Throughout art history, artists have usually enhanced subjective ideas in datafication, whether in conceptual art or post-minimalism. Artists gather data, but that data always points to the subjective experience of life, not the objective one. We can think of Tehching Hsieh's one-year performances punching a time card or living outdoors. We can think of the conceptual work of On Kawara marking a date while using other forms of representation to give room to the thing that is not exactly in language - the mystery of his inner life. Artists always enhance the facets that cannot be captured by objective datafication. Yet the data we're dealing with now seems to be doing the opposite.

Here, I want to open a parenthesis and mention that today we have datafied courtrooms, for example - the understanding that we can datafy justice. At the same time, it's very hard to create this alignment - we call it alignment. Take stealing: what do we do if we have an immigrant parent who steals to feed their kids, versus an armed robbery? It's very hard to create an objective basis for these kinds of things. The same goes for suffering: we can maybe measure pain, but what is human suffering - the suffering of a broken heart, of falling in love, and so on? 

There are very, very big questions that come with the understanding that we have to create those objective values. I also want to add that machine learning is based on language, so the code can represent everything that can be datafied through linguistic concepts. But one of the main advantages of art is that it brings us forms of representation that do not exist in language. And this is exactly what we might lose - what cannot be represented. This is why the work of artists like Noemi, and many others, matters: they try to create room for the things that cannot be grasped through full dataism or represented by probabilities, by creating space for that thing which is not fully transferable into language. 

Ana-Maria Carabelea: That is what I was hinting at before with the mention of approximation because, for me, Noemi's work clearly admits that there's always a remainder that the data cannot represent - and it hints at that perhaps most visibly in the installation. Maybe now is a good time, Noemi, to give us an insight into how the installation was designed and conceptualized and what kind of experience visitors will have. 

Noemi Iglesias Barrios: I think it's important for us as artists to find a way to decode information that is not readily understandable to us cognitively. I normally create attractive installations - or at least I believe they are attractive - because, for me, it's a way to draw in the public, to make people feel more attracted to what they might be seeing or experiencing. For example, in The Falling City, the camera always points at the public space, but images are not recorded, and we are not storing any of them. We are just using the data from the camera and counting the levels of emotional display through the three actions we mentioned: kissing, hugging, and holding hands. The light installation in the exhibition space is connected to the camera in the public space, so the data collected on people hugging, kissing, and holding hands triggers the installation to light up in different colors: blue for hugging, pink for kissing, and fuchsia for holding hands. The longer the action is performed in the public space, the more intense the light in the installation. 
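
As a rough illustration of the mapping Noemi describes - one color per action, brightness growing with duration - here is a small Python sketch. The RGB values and the intensity curve are assumptions made for the example; the installation's actual mapping is not specified in the episode.

```python
import math

# One color per recognized action, as described above (RGB values are guesses).
COLORS = {
    "hugging": (0, 0, 255),          # blue
    "kissing": (255, 105, 180),      # pink
    "holding_hands": (255, 0, 255),  # fuchsia
}

def light_state(action, seconds, saturation_s=60.0):
    """Return an (r, g, b) triple whose brightness grows with action duration.

    Intensity eases toward 1.0 the longer the action persists, so a long
    embrace glows brighter than a brief one without ever clipping. The easing
    curve and the 60-second scale are invented for this sketch.
    """
    intensity = 1.0 - math.exp(-seconds / saturation_s)
    return tuple(round(c * intensity) for c in COLORS[action])

print(light_state("hugging", 5))    # a dim blue: (0, 0, 20)
print(light_state("hugging", 120))  # a near-full blue: (0, 0, 220)
```

In a real installation, a triple like this would presumably be pushed to the lighting hardware (for example, over DMX or an LED controller) on every update.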

It's super difficult or even inaccessible for the people in the public space to have access to what is being recorded if they are being filmed, and [understand] to what extent this affects their behavior or emotions. To bring up that kind of conversation, it was necessary to translate all this data into an artistic installation with fairy lights. To say it somehow, you know. 

Ana-Maria Carabelea: I think what it does brilliantly is bring to our attention the dissolution of the boundaries between the public and private spheres within this context of increased surveillance - whether we think of smart cities with technologies embedded in their design or simply the streets of London, packed with CCTV. 

Noemi Iglesias Barrios: Yeah, but wouldn't it be great to use those CCTV cameras in the streets to ask: do you know where the street of hugs is, or the street of kisses? Or the most romantic area in the whole city - London, Brussels, New York, anywhere? We have been reading cities from many perspectives - the levels of pollution in the metropolitan area are not healthy for the inhabitants, and so on. We always have these kinds of measurements of cities, but we never hear that in this city people love each other a lot, give a lot of hugs, and kiss each other so much you cannot believe it. I don't know, but maybe that could motivate some people to be a little more emotional with each other. It would be another way of reading cities. 

Noam Segal: It's interesting to connect this to what you said earlier, Noemi, about the code being inaccessible - just a bunch of numbers [...] no one can read. To some degree, we do live in a post-privacy era. Forget privacy - everything is surveilled; it depends where you are, of course, as some countries are much more surveilled than others - but this resonates in public space and affects the way people behave. For example, some places don't suffer from theft anymore: you can leave everything in public space, come back after three days, and it will still be there because it's all surveilled. So, there are different directions this can take us. But with that, it's important to remember that most of that information right now - it's not that our secrets are being shared with other people in the post-privacy era - is being shared between machines. And [...] we don't have access to that inner communication between one machine and another. Part of the great potential of your work is how we can bring this back into the algorithm, into the liminal space, into the vector space of the code that we cannot access. 

I'll give one example to make it more coherent. For example, we can use smart watches or other things that measure our biodata. That biodata will probably not leak, but it will only be connected to other algorithms. Let's say that an individual has some sort of medical situation that they may not even know about. At the same time, they are being interviewed for a job. Today, the HR system is also dependent on AI, and that information is communicated from one algorithm to another. And no human will ever even have to access that, to think, oh, that person, I will consider their health situation, or not. Sometimes even before the person knows that, it will be inner algorithm communication that will make those decisions for us. 

So, there's great value in showing this to audiences and making us all aware. At the same time, the potential of taking part in that inner algorithm communication is equally important for our future and well-being.  

Noemi Iglesias Barrios: This connects to what Noam is saying about profiling, right? When one machine communicates with another with no human interference, that's a little dangerous for our society. With no human in the loop, the machine just assumes that what it reads is the truth. But it's not always as it seems; these assumptions are not always based on true data, even though the machine thinks they are. The data the machine reads is not always intentional on the human side. Sometimes you might leave your phone open on a website and forget to close it, and the algorithm thinks: oh, they really like this site, they've been there for an hour. That's what I mean about the interpretation of data and how it relates to us and to who we really are. 

Noam Segal: Yeah, it's a great comment. The sad thing is that it's not in our hands because the code is inaccessible, so we don't know how data is managed over time. 

Ana-Maria Carabelea: As we unfortunately approach the end, I'll ask what I usually ask for whoever is listening and might want to dive deeper into the topic: do you have any recommendations - books, articles, other artistic practices, whatever you can think of - that might provide an entry point into the topic?  

Noemi Iglesias Barrios: I really like the book by the Croatian philosopher Srećko Horvat, The Radicality of Love. It weaves together a story of love and revolution, focusing on the potential and dangers of love through desire, sex, and destruction. But it's also very much based on the use of public space and how that can affect whether we are able to show emotions or not. 

Then there is also an artist I admire, Sophie Calle, a French artist. She has these early works from the '80s, and one of them is called The Detective. She hired a private detective to follow her, and as she was being followed and recorded, she reversed the setup and started following her detective. So, [...] the detective was following her, but then she was following the detective, and it became a bit absurd. I thought it was very inspirational. 

Noam Segal: I'm always happy to share a few books in this vein. One, published a few years ago, is Noise: A Flaw in Human Judgment by C. R. Sunstein, D. Kahneman, and O. Sibony. It deals with the question of automated courtrooms, offers a lot of information about how the legal and judicial systems align with these questions, and surfaces many ethical issues in this regard. Jussi Parikka - who I know was also a guest on this podcast - wrote Operational Images, a wonderful book. Of course, Kate Crawford's Atlas of AI and Ruha Benjamin's Race After Technology are building blocks of thinking about machine learning today. I'll also mention a book I'm currently reading: Nexus: A Brief History of Information Networks from the Stone Age to AI by Yuval Noah Harari. 

Ana-Maria Carabelea: Great, thank you so much. Those are all such great recommendations. I can't wait to put that list together. Thank you so much for joining me today. It was great having you.  

Noam Segal: Thank you both so much. And Noemi, it's so great to hear you and learn about the work. Thank you for the invitation, Ana.

Ana-Maria Carabelea: Thank you for joining. 

Noemi Iglesias Barrios: Thank you so much, Ana. It's been a pleasure talking to both of you today. Thanks, it's been really inspiring. 

Ana-Maria Carabelea: That's it for today. Thank you so much for tuning in, and I hope you enjoyed it. The Digital Deal Podcast is part of the European Digital Deal, a three-year project co-funded by Creative Europe. If you want to find out more about the project, check out our website, www.ars.electronica.art/eudigitaldeal
