Digital Technology and Human Rights

Podcast with Joana Varon

In their early days, the Internet and digital technologies were regarded as revolutionary tools that would give all people equal access to knowledge. Today, the digital landscape is monopolized by big tech corporations and individual actors. Where the early focus was on Creative Commons and changing power structures, the digital space is now marked by hate speech, disinformation and power imbalances.

The Brazilian researcher Joana Varon works on technology and human rights at the Harvard Kennedy School. In this episode, we talk with her about digital rights, opportunities for empowerment and discriminatory artificial intelligence. To protect democracies and minorities online, Varon argues, we must turn away from the idea of universal software solutions and instead make technologies more local and more diverse.

A graphic is shown. A white circle is surrounded by black, pink and blue stripes radiating outward from the circle. Inside the circle is an illustration of a woman with thick curly hair. Around her are the words Joana Varon, Die Kulturmittler, #44; in the right corner of the illustration is the logo of ifa – Institut für Auslandsbeziehungen. It is the cover of the 44th episode of the ifa podcast "Die Kulturmittler" with Joana Varon from Brazil on the topic of digital rights and human rights. Illustration by Lea Dohle.
© Lea Dohle

Making foreign cultural policy audible.

In its podcast format, ifa provides background knowledge and answers to the questions of our time.

This episode of the ifa podcast is available on all major podcast platforms. To make sure you never miss an episode, subscribe to "Die Kulturmittler" on the streaming service of your choice.

Transcript of the episode

Episode #44: Digital Technology and Human Rights. With Joana Varon

I think one thing in tech that helped that kind of craziness that's happening now on the far right and disinformation is the logic of the algorithms that allow people to live in their filter bubble. You start to receive a lot of content about the same misinformation, and then all of a sudden that bubble has constructed a parallel reality with a lot of untrue facts. The algorithms on social media that we have now, they promote misinformation, and they promote hate, because those kinds of content have much more engagement than the regular, sometimes boring truth. So that's something for us to think about: the relation of civil society. (Joana Varon)

Dan Wesker: Welcome back to ifa's podcast "Die Kulturmittler", the title of which can roughly be translated as "The Cultural Conciliators". I am Dan Wesker. The voice you heard at the beginning belongs to Joana Varon. She is the Executive Directress and Creative Chaos Catalyst at Coding Rights, Brazil. Coding Rights is a think tank run by women that exposes and redresses the power imbalances built into technology and its application, particularly those that reinforce gender discrimination and inequalities between the Global North and South. Joana Varon is our guest for this episode. We will talk about her visions for creating a more equal and inclusive digital world. First, I wanted to get her opinion on recent developments in the tech world. The social network Twitter is one of the most relevant media outlets for civil society worldwide and for political movements such as the Arab Spring. Back in October, Elon Musk bought the platform. I asked Joana Varon what changes she has already seen since Musk's takeover and what she is expecting in the future.

Joana Varon: It has changed already. You know, he dismantled the team that was responsible for thinking about human rights at the company. The Safety and Security Council is also compromised. He has a very old view of tech: that tech is just about the developers, who are not even literate in human rights issues. And we already know that technology is not neutral. It has implications for rights. And he has been dismantling all that and imposing horrible working conditions. So of course the software that will come out of all that will progressively be affected. That's why we have a problem with monopolies, like what happened with Twitter. It got bought by one single person, a billionaire with a crazy vision of the world and of politics and all that. So this is threatening particularly to the most vulnerabilised communities. You don't have a Safety and Security Council in place, no people thinking about human rights, and he's actually re-establishing several accounts from the far right. So those people are back on and attacking people. So, it's already very chaotic. That's why monopolies are so dangerous.

Dan Wesker: And do you believe that a better and fairer version is possible?

Joana Varon: It's a fight on many layers. There is the layer of regulation, and regulation in several fields, from consumer law, data protection and competition law to human rights law. So there is the layer of regulation. There is the layer of investing in science and technology development, local science and technology development. What we have now are tools that are universalizing one technology, but every community, every context has particular needs for a technology, in terms of language, in terms of the level of connectivity. So if we can create an environment that's rich for the development of tech, we can have alternative tech as well. And I think that alternative tech shouldn't have this goal to be completely global and universalized. We just need technologies that are interoperable. It doesn't need to be that big. We are creating big monsters, and then it's hard to deal with them.

Dan Wesker: At the moment, Joana Varon is a Fellow at the Carr Center for Human Rights Policy at the Harvard Kennedy School and affiliated with the Berkman Klein Center for Internet and Society at Harvard University, among many other international civil society activities. I wanted to know how she came to be so involved with human and digital rights in the first place.

Joana Varon: Since I was a little girl, I always liked video games and I used to engage in video game activities. And back then when I looked around, I was surrounded by boys and I was like, 'Why? Where are the girls? Why are only boys here?' And that was, I think, the first feeling that I had that there was something wrong. And then when I created the Coding Rights organization, I had this idea of, okay, let's gather women and non-binary people to think about technology. Before that, I was already engaging in the tech field in research centres in Brazil or as a consultant. I was again surrounded by men. So, I wanted back then to discuss, okay, what if we debate gender and technology? What if I get surrounded by women and non-binary people? What kind of projects would emerge? What kind of debates would we have if we add this layer of gender and add feminism to the debate around technology?

Dan Wesker: Bringing feminism and more inclusive ideas into the digital space – these are just some of the ways in which Joana Varon aims to improve not just the online world, but also the lives of people in general. She does not separate human and digital rights, as the latter should be seen as an important part of any human rights debate, especially now, when fake news, far-right filter bubbles and big tech firms are threatening democracies all over the globe. I wanted to know to what extent human rights debates have changed because of digitalization.

Joana Varon: I think new challenges arise and also opportunities, you know, because now we can communicate much more widely with similar communities. We can exchange ideas about strategies. So, there is the opportunity that communication brings. And that was the main spirit of the Internet before it got monopolized by big tech companies. And then some challenges arise in terms of surveillance, protection of privacy, of freedom of expression, particularly in this context of monopolization of some of the main tools that we use to communicate. So, there are lots of opportunities and lots of challenges as well. But for me, what I don't like about the term digital rights is that it kind of gets separated a little bit from the traditional challenges of human rights. And that's what is important, that we connect back all the knowledge that has been produced in the field of human rights, in the field of feminist anti-racist fights. All that applies to technologies as well. We are not departing from something completely new, but there is this extra layer of tech.

Dan Wesker: I understand that if one was concerned about human rights before, then obviously one sees those concerns again with the expansion of the web and online media. How did you actually come to it?

Joana Varon: I came to this decades ago when we thought the Internet was a tool for revolution, a tool for access to knowledge. We were debating copyleft, Creative Commons, exceptions and limitations to patents, access to medicines, access to content and technology as a tool for communicating. We had this hope of changing power structures, an opportunity to exchange content that was never seen before. So back then, that was the fight, access to knowledge. And then many things changed. There was this progressive monopolization of the tools that we use on the Internet, and social media started. We had the fight over freedom of expression and content moderation. How do we balance those two things, particularly when the business model of those mainstream social media is surveillance capitalism? So how do we protect our privacy, our data, in this scenario of this dominant business model? How do we protect freedom of expression but also protect vulnerable communities, like LGBT communities? I'm part of the LGBTQI+ community as well. How can we have freedom of expression that is not permeated by hate speech and misinformation? How do we assess that balance? How do we deal with the fact that those companies are acting in this business model of surveillance capitalism, and with their relation to governments exposing our data? So how do we maintain some level of anonymity, which is important for discussion in a democracy, without turning all this into toxic spaces? There are many debates that underline the classic human rights challenges, and those debates became progressively more complicated with the digitalization, with the monopolization. And the differences of North and South, and how those technologies are being developed and implemented. I also debate a lot the practices of digital colonialism. There is a logic of Silicon Valley that is colonizing most of the globe, and there is now the emergence of tools from China. So where do we position ourselves in this? I live in Brazil, in Latin America. Can we recall our culture and think about other technologies that emerge from other logics, more decolonial logics, feminist logics, and not the surveillance capitalist logic and the digital colonization logic? So those are the emerging themes that I have tried to deal with in the field.

Dan Wesker: You already mentioned the North/South inequalities and the influence of Silicon Valley and of China on the world. How would you imagine that these power imbalances could be challenged? How could they change?
 
Joana Varon: It is a great question and a big problem. We have two big political and economic superpowers. At Coding Rights we are doing a map of the Internet territory and playing with tech cartographies. And the idea is exactly to materialize technology. We are focusing particularly on the Internet, because there was this idea of the cloud, and the cloud doesn't exist. What we have are servers, cables, factories producing the chips and the components and the devices. And then we have people working on the software layer, and then we have a lot of e-waste. So, in this geopolitical structure of the Internet, we have China as the main place manufacturing all this, even the devices that are sold by US brands. So China alone is the main producer of all our devices today. It produces a lot of e-waste because that's where the factories are, but it also imports e-waste from across the globe to recycle and to use in the factories. Even in ecological debates, in climate justice debates, do we want to remain like that? Do we want to think about other ways to produce those devices, in other regions and also with other logics, not only the Chinese logic? Because then we have the North as the major consumer of those devices, and these devices get discarded very quickly. So a lot of the production of e-waste is also in the North, because they have more consumer power. But on the other hand, we see again a colonial logic because of the extractivism of minerals. The mining mostly happens in the South, in Africa, in Latin America. So for instance, Elon Musk is selling Tesla cars and saying, oh, this is green. Actually, green for who? Because for those cars to function, they need batteries. And for the batteries to exist, they need lithium. And most of the lithium is now in Latin America, in Chile, in Argentina, in Bolivia. What is the social and environmental impact of mining, of extracting all this lithium? How could that be green if it's destroying many people's lives and many ways of being and of living? We need to rethink. So the exercise of the map is exactly to show those processes, at least for the Internet, showing the colonial logic behind it, so we can rethink what alternative we want, from the manufacturing level of the devices to the soul of the devices, to the software level of the devices. And what I say is that we need to diversify. We need to break those giant big tech monopolies on the one hand, and we need to redistribute and be more local. If things are produced more locally and recycled more locally, we are more connected to the impacts of things, because then we see them in our own backyards.

Dan Wesker: At Coding Rights, Joana Varon raises awareness about the artificial intelligence systems currently used by the public sector, as they can reproduce racist and misogynist ideas. I wanted to know how exactly these technologies discriminate and whether we can do anything about it.

Joana Varon: At Coding Rights, in partnership with my colleague Paz Peña from Chile, we started this project that's called "Not My A.I.", in which we studied and mapped A.I. projects being implemented by the public sector in Latin America. And as Cathy O'Neil says, A.I. and algorithms are mathematics with values embedded in them. And we currently live in a very unequal, patriarchal, sexist, LGBT-phobic, racist society. So if those who develop or who commission these systems have those values, those values just get hidden in the math. It looks like something neutral, but it's just going to automate inequalities.

Dan Wesker: From my perspective, all these sorts of things that were embedded in society in the first place have sort of just been able to grow without regulation for most of the development of the web. Is regulation really the way that one can rein this in a little bit and control it, with tighter regulation as there is with television, radio, newspapers, all the old forms of media? Can a similar thing be done with online digital media?
 
Joana Varon: Yes, but I think it's not the only path. The organization that I founded is called Coding Rights. And by coding, I mean coding legally, the law, legal codes, but also coding as in programming code. So I think it goes both ways: the way we envision and develop and code those technologies, and the way we regulate them and the way we use them. Sometimes, in the way we use tech, we hack the purpose for which that tech was envisioned or created. It has many layers of resistance, but the first layer is to create awareness about the technical and political aspects of the technologies that we use today. That's why we have this project of mapping the Internet territories. It's about creating awareness. "Not My A.I." is also about creating awareness and building a feminist framework to question A.I. systems, because we also need to have the tools to criticize and to have critical views of those systems. And I think feminism is a good tool. Human rights debates, human rights knowledge and production, is also another tool. But from feminism, we have the debate on power relations. Sometimes with human rights we have this view of, what are the rights that should be universalized? Freedom of expression and access to the Internet, data protection and privacy rights, social justice, equality, social environmental justice. Those are our goals. But we don't have all those rights guaranteed universally. We have power imbalances, and we have differences. And that's where feminist theories play a role, in having that lens that points out the power imbalances in implementing rights. So I think those are good analytical tools for us to have awareness of the problems, but also to create the alternatives. We also have another project that's called "The Oracle for Trans Feminist Futures", which Coding Rights develops in partnership with Sasha Costanza-Chock. She wrote a book about design justice. "The Oracle" is like a tarot deck. We play on envisioning trans feminist technologies; it has a set of values like non-binary, diversity, equality, interoperability. So we play with that to envision alternative tech. And of course, this is more of a game that also helps people to get out of the box of what technology is, of the narratives of technology, which are also shaped by Hollywood. So we need to decolonize our imaginaries around tech as well, to be able to reinvent other techs.

Dan Wesker: And are there specific events or situations that make you want to push your topics even more, or that have made you understand how important it is to raise awareness on this topic?
 
Joana Varon: I think in the last three years, we saw cyber feminists emerging in Latin America. A lot of feminists building alternative feminist infrastructure and feminist data projects, like data about femicide. Feminist infrastructure projects that are community wireless networks, but also victories in the legal arena, like the initiatives that managed to ban facial recognition, which is sold as security but is actually automating racism. So there are many victories. Brazil has a data protection law now in force. So it's gradual. The legal victories come slower than the community victories, because the legal path takes longer to approve and all that. But there are many, and I think more people are critical about technology now, after the Snowden revelations, after the Cambridge Analytica scandal. Even after Trump and the misinformation war, people got more aware of the possible damages of technology. So that's also something positive that emerged from something bad.

Dan Wesker: Fake news distributed via social media is damaging to democracies. It is not just Donald Trump's rage against the truth that has torn societies apart; similar strategies can be seen in other parts of the world. I asked Joana Varon about her experiences in Brazil and whether she thinks we have reached the peak of the age of alternative facts.

Joana Varon: Hard to answer that. We just had the elections here in Brazil, with Bolsonaro using pretty much the same strategies as Trump, because it's global. We have Steve Bannon and his crew teaching lots of people and spreading those tactics across the globe. And just this week, as Lula received his diploma as president, on the same day very radicalized far-right people went to the streets in Brasília, the capital, to burn buses and cars. They were all fed misinformation that led to those acts as well, falsely questioning election integrity. So here in Brazil, on the 1st of January, we will have Lula. He just received the diploma for president, and on the 1st of January he starts his term, and there is a ceremony for that. So we are keeping a close watch on the networks and being cautious about the security of all that. We are on red alert. All this mass of misinformation from the far right is fostering a vision of society that should have been long gone, because it's not a vision of care or patience, of love. It's a vision of hate, of disseminating false information to foster hate against each other. Now we have neo-Nazi people emerging. All those things should have been left far in the past. We have seen what the consequences of all that hate are. I think we need more research to understand how misinformation is affecting people's minds and consciences to foster these kinds of ideas. How do we battle that? Through regulations, of course, but also by reassessing: what is wrong with society that leaves space for this kind of mentality that's full of violence and hate? And how can we overcome that? Through social education projects, sports. I think there is something there to deal with that's very problematic and dangerous. Our election here was not an election about one party or another. It was an election about maintaining a democracy and a democratic space for debate, or going completely authoritarian. Luckily, we went for democracy. Sometimes for me, it's insane to think that it was very close. I think we need to understand further what the psychological and social aspects in society are that are opening spaces for that kind of view that's so violent.

Dan Wesker: I always find that under this white flag of free speech, suddenly, as you say, ideas that should have been left by the wayside many, many decades ago have exploded onto the scene again. How do you see the relationship of the civil society sector with this technology?
 
Joana Varon: I think one thing in tech that helped that kind of craziness that's happening now on the far right and disinformation is the logic of the algorithms that allow people to live in their filter bubble. You start to receive a lot of content about the same misinformation, and then all of a sudden that bubble has constructed a parallel reality with a lot of untrue facts. The algorithms on social media that we have now, they promote misinformation, and they promote hate, because those kinds of content have much more engagement than the regular, sometimes boring truth. So that's something also for us to think about: the relation of civil society. It changes, you know. For instance, when Brazil was under the Bolsonaro government, he closed all the channels for civil society, the progressive civil society, to engage with public policies. And now, with Musk on Twitter for instance, the scenario also changed, because before, civil society operating in the field of human rights and technology would have some level of dialogue with people at Twitter, the human rights people, the safety and security people, and now these people are gone. So you don't have interaction with the platform anymore. So I think those relations change according to the political scenario in the countries and according to how those companies are operating and taking into consideration what we have to say. Again, even when Twitter had the human rights people, and with all the big tech platforms that have human rights people, it's very different to engage with them being from Brazil or another country from the South, or being from the US. We feel that things here needed to escalate from the Brazilian office to the US office to really be taken seriously.

Dan Wesker: How could software be based more on human rights? What do you think could be implemented that would address human rights much more?
 
Joana Varon: Encryption should be the baseline. There isn't a recipe. I think if a software is produced by and with the community it's made to be used by, it will be more connected with the needs, but also with the rights, of that community. The problem is that the big tech and the mainstream software that we use now were built with this view of being global, universal, for profit and for surveillance capitalism. If I sit here and craft a software with my community to map feminist sites in Rio de Janeiro, it's very likely that it's going to be a software that has human rights considerations embedded in it. So I think the issue of scale is something. Things shouldn't have such a huge scale. That's something that capitalism teaches you, that things need to scale. But I think we need to degrow in that sense and build things that are interoperable, so they don't scale, but they talk to each other. That was there at the beginning of the Internet. The Internet has interoperability as a core value. It has links as a core value. But then Instagram came and killed the links. Why is it that you can't click on things? You need to go to the bio for the link. So some core principles that were laid down at the beginning of the Internet were engineering principles that actually helped human rights. We can reclaim those, expose what those monopolies have destroyed, and reclaim those principles, and, I think, be more local. Being local but interoperable is one step toward solutions.

Dan Wesker: Artificial Intelligence that is programmed to be discriminating and fake news are growing concerns. This is partly caused by big tech companies that are creating software for a universal and global market instead of searching to work on a more local and community-based level, according to Joana Varon who is worried about the future of the digital world. But with her projects she is drawing attention to human rights-based technology and seeks to reinforce the idea of the digital world being an important tool for the civil society. And with that we have reached the end of this episode. This was the last episode of the year. Thank you so much for listening. As always if you have any suggestions, critique, or wishes feel welcome to email us at podcast(at)ifa.de. My name is Dan Wesker – take care and goodbye!

Contact

Do you have topic requests, praise or criticism?

Podcast Team

Charlottenplatz 17
D-70173 Stuttgart

Email: podcast@ifa.de