Welcome to MTM’s Podcast!
Welcome to the Migration and Technology Monitor Podcast!
The Migration and Technology Monitor presents a podcast that centers on the deployment of technology and how it impacts the lives of people affected by it. Technology is presented as a solution to all our problems, an inevitable part of life. Yet more and more, we see how automated decision-making, social media profiling, massive data collection, and deepfakes are harming individuals and disrupting societies. Join us to meet the journalists, researchers, and tech developers from all around the world discussing problems and possible solutions for the most pressing issues of our time - always centering the human experience.
Episode 1: Digital Othering
AI-powered technologies need huge quantities of data to work. While many people still think that AI is neutral, study after study shows that technologies are biased. Not just because of how they are programmed, but because the data used to feed them contains biases, racism, and discriminatory patterns. AI also dramatically deepens societal rifts and infringes fundamental rights. What we are witnessing is Digital Othering.
In this podcast we explore how this looks in the real world and how people are already affected by Digital Othering. Four specialists share their observations and insights. Their greatest concern: If the human experience is taken out of the equation, technologies will remain tools of oppression rather than empowerment.
In this episode: Judith Cabrera, Grace Gichanga, Antonella Napolitano, Ankid Sarin, with host Florian Schmitz
Transcript (also available in Arabic, French, and Spanish through our sub-sites):
[00:00:01.120]
Hello and a very warm welcome to the first MTM, the first Migration and Technology Monitor in-house podcast. My name is Florian Schmitz. I am a journalist and I'm also overseeing the fellowship at the MTM. We recorded this in November 2025 in Nairobi, Kenya, on the sidelines of a very special gathering. With a lot of hard work and help from the German Heinrich-Böll-Foundation and the Robert-Koch-Foundation, we managed to bring together MTM fellows from three years of the fellowship. MTM fellows are professionals from Global Majority countries with lived experiences of migration and occupation. All of them work on issues at the intersection of technology and human rights, and we at the MTM support their work with a $30,000 stipend. But of course, we want more than that. We flew them to Kenya to bring them together with researchers, journalists, activists and human rights advocates, so that together they can discuss the use of technology and how this use can collide with human rights. They were to share expertise and to bring the very important factor of lived experience into the debate. It's something that in the Global North, we don't do very often.
[00:01:19.820]
Unfortunately, we would much rather make decisions about people without involving them. And we develop technologies with the promise of solving all our problems; a widespread phenomenon often referred to as techno-solutionism. And we say that technology is neutral. But is it really? Studies have shown time and time again that artificial intelligence copies and amplifies the biases that we have as societies. The result: the gap between people of color and white people widens. Refugees and migrants are criminalized. People are deprived of fundamental rights. We call this process digital othering. And this is what our first podcast is all about. I had the pleasure to talk to four experts on this. Some are MTM fellows, others are not. You will be hearing from Antonella Napolitano, an Italian researcher who is one of the pioneers in the field of technology and human rights in Europe. And you will also hear from Grace Gichanga, a human rights lawyer from Johannesburg, bringing in her perspective on South Africa, a country which, for American and European ears, is totally under the radar when it comes to migration issues. Indian researcher and journalist Ankid will explain how people in Kashmir, one of the most surveilled places in the world, suffer from how technology invades their lives and privacy.
[00:02:43.350]
And the first guest you'll be hearing from is Judith Cabrera. Judith is from Tijuana, where she runs the Borderline Crisis Center. The border between Mexico and the US has been making headlines continuously over the past couple of years because of the harsh realities for asylum seekers, who face hardship and cruelty from the side of the United States. But people like Judith, who are left alone with the chaos, suffering and dangers that a violent border regime causes, are also deeply affected by the situation. Ever-changing policies make the work of people like her unpredictable. The Mexican and US administrations left them alone with the huge burden of putting order into this chaos. At some point they were even supposed to administer lists, with the obligation to decide who would get one of the few interviews at the border and who wouldn't. A terrible task that put them between desperate people who had been waiting in Tijuana for months or years, criminals who tried to make a bargain out of all of this, and the security services themselves. Technology - you might have guessed it - was supposed to solve all of this, but listen to Judith giving a first-hand account.
[00:03:56.340]
The United States introduced an app called CBP One. To what extent did this change the situation at the border? Did it release pressure? Did it create further pressure? What did you see on the ground?
[00:04:10.530]
CBP One is an app that Customs and Border Protection uses to manage the flow of migration. They still use it. If you're going a certain number of miles away from the border, you have to request a special permit, and you can do this now through CBP One. And then they decided to integrate the asylum process into this app. So instead of presenting yourself at the border physically and being in their territory, now you request an appointment through this app, and whenever you get your turn, you present yourself and then the whole thing starts.
[00:04:55.490]
I imagine that registering in this app is not as easy as opening an Amazon account. Is it complicated?
[00:05:04.780]
It was very complicated. And that's when we start talking about access. First off, you need to have a smartphone. And not every asylum seeker has a smartphone, because of poverty. So, you have to have a smartphone, you have to have a connection to the Internet, which not everybody has. And you have to know how to use this smartphone, and that's a digital literacy that not everybody has access to. We saw how the younger kids suddenly took a leading role in their families because the older ones didn't know how to operate this app. Especially when you talk about older people - some can't even read or write. So how do you register in an app if you can't read? You don't have a phone, you don't have the Internet. And when you finally get all of this, after a lot of effort, you don't even know how to use it. So that was the issue and it...
[00:06:04.400]
Glitches though, right? Like, did the app work well or did it crash?
[00:06:09.440]
It had all kinds of glitches. And the first thing is it was only in Spanish and English. We have a huge population of people from Haiti, and there's a huge population in the world that doesn't speak either English or Spanish. So, the Haitians translated it themselves. There's this organization called Haitian Bridge Alliance. They translated the whole thing and sent it in, because the authorities were not being fast enough. And we had to push a lot for it to be accessible in different languages. And then, beyond these glitches - and here's where it gets very interesting: it takes your biometrics, it takes a picture of you. But you know that this facial recognition software has a hard time recognizing darker skin tones. So, the whole Haitian population was even more limited in their access to this.
[00:07:08.440]
How often did this happen?
[00:07:09.800]
Every day. Every day. And then you could only request the appointment at 6 a.m., so everybody was up at 6 a.m. doing it at the same time, from six to seven. And that crashed the app, because too many people were connecting at the same time. They didn't change it until months later.
[00:07:29.330]
And another piece of information that I want to share: people might ask, if they came from Guatemala or wherever, why didn't they just use the app there? You need to be in proximity to the border to be able to actually register, right?
[00:07:43.330]
Geolocation. If you weren't in proximity to the border or in Mexico City, you couldn't apply. And then what happened is that organized crime, they have hackers, they have people who know what a VPN is, which is something very, very advanced for most people. And so, they would charge you for installing a VPN and getting you the appointment. And then, when it was only possible to use it from 6 to 7, well, the sun wasn't up yet in a lot of places, so the lighting was not good. And you had people there for weeks trying to just register, not yet even request the appointment. So, it was very frustrating. And the government, you know what they did? They gave all the shelters Internet connections. They're like, oh, you're complaining about access? I'll pay for your Internet. Oh, you can pay for my Internet now, you know - and that's when you see the compliance with United States interests that is ever present. So, there were a lot of complaints. But also, I can tell you from personal experience, and this is not a popular opinion, that CBP One, despite all its problems, was the least problematic way of requesting asylum since 2016, because finally people could do it by themselves, more or less.
[00:09:16.400]
And there was nobody to pay, because getting access no longer depended on anybody. So, there's nobody profiting from this, and it's not on us, the organizations on the Mexican side, to do it. So, it took away a lot of the risks, you know, and it finally gave us a little peace of mind. Now we have to work on getting you access to the app, but there's no way, even if you point a gun at my head, that I can do it faster for you. So, we were safer. Yeah. And then another problem was that they weren't clear. They didn't have a privacy policy. We didn't know what was going to happen with this information. And we kept asking, what are you going to do with this data? No answer. Who can have access to this data? We don't know. Is this going to help you surveil people once they are in the United States? No, that's not what we want it for. But then here comes the second administration of Donald Trump, and they were able to locate every asylum seeker in the United States and potentially deport them.
[00:10:29.860]
You know, and they're actually doing that.
[00:10:31.820]
And they're actually doing that. Yes.
[00:10:33.460]
I read an article today that they raided a kindergarten somewhere in the United States. Some people with masks just ripped a person away - I don't even know what country she was from - but, in front of all the children and parents, they just grabbed her and took her out of this kindergarten. I mean, what is that?
[00:10:55.950]
Well, there used to be sanctuary spaces: schools, hospitals and courts. And they took all of that away. Like, if you have to go to court because of your asylum process, they can catch you at the door, because you don't have regular status yet, and deport you from there. So, what do you do? Do you go to court, or do you not go to court? And if you don't, you're going to lose the whole process. So, it's just hunting people, you know, it's really awful. And then what they did on January 20th - I'm sorry, January 20th was really traumatic for all of us. The first day this person takes over the White House again, they shut down asylum in a heartbeat. And having it all in an app was very useful for this. You know, just end that feature in the app. And a lot of people were stranded on their way to the border. A lot of people already had their appointment, and they lost it. A lot of people were at the border already that day, because there were three shifts in which people would come in, and they only took the first shift in the morning, which I believe was at 5 a.m., and then the people who were supposed to enter at 9 a.m. didn't have access anymore.
[00:12:19.400]
And you're talking about people from all over the world again. We were helping people from Iran who were stranded in Chiapas, our southern border. And now they can't move forward, they can't go back. What are they going to do in a place where nobody even speaks their language? We had people from Colombia. They had their whole lives packed with them. They thought that was the day they were going to enter the United States and never come back. And they were just left in Tijuana. So, it was devastating. It was awful.
[00:12:57.450]
I can see that it makes you very emotional also to talk about these things. I'm sure you see a lot of hardship and despair that, you know, that you can't help with.
[00:13:08.700]
You know, it was very hard. They kept working on the glitches of the app and all of this. And then it became very evident that the algorithm was programmed to delay the applications of Mexicans. All other nationalities could go faster, you know. If you were, I don't know, from Persia, you get to Tijuana and two weeks later you have your appointment. Well, Mexicans could be there for six, seven, eight months. Do you have any idea what it does to a person to be living in a shelter for eight months? And then you finally get your appointment and they just tell you it's canceled. Is it ever going to come back? Are you going to respect the fact that I already had an appointment? No. Can I hope for it to happen in a couple of months? No, it's just not going to happen. And we had a few families at the shelter that, after a very long wait, finally got their appointment. And then you have to explain to them that we don't even know what's happening and we don't even know what to expect. And this guy is signing executive orders left and right.
[00:14:21.470]
And yes, it was very hard. It was very hard because suddenly there is no hope. I mean, everything that we've been talking about since 2016 - at least there was a way. I mean, it was faulty and it was, you know, slow, but there was a way. And of course, we complained about that, too.
[00:14:40.300]
Rightfully so.
[00:14:43.820]
Thank you. But then suddenly there is no way. And right now, there are very few people at the shelters, because once there was no way to request asylum, they started integrating into the city faster, looking for jobs, looking for a place to rent - things that they wouldn't do before, because they had the hope of reaching the United States. And a lot of people don't even come to the border anymore. And a lot of them went back. There is a population in Puebla of people from Venezuela. So, they find other places that are less hostile than Tijuana, and they start settling there. And in a way, it releases pressure, and it stops creating this bottleneck of people stranded at the border. But it's a really sad situation.
[00:15:38.190]
It's not a solution. And Antonella, I saw you nodding along when Judith was talking about how they kept asking questions: what do you do with the data, where do you store the data? These are questions that you've been dealing with for a long time as well. You're a researcher, and you're focusing on the intersection of technology and human rights with a strong angle on migration. What Judith is describing - to what extent is that a global trend? What do you observe?
[00:16:10.880]
I feel what Judith described captures, in a nutshell, a lot of the elements of the global trends that we're seeing in the United States and Mexico, but also in Europe and in other parts of the world. Borders are in a way expanded by technology; they are outsourced. So, in the case of the European Union or the UK, there are border externalization policies, and surveillance technology is one newish - not very new by now, but newish - element in a process that has been going on for decades. Some of the work that I've done also looks at the way the European Union has used, for instance, the EU Trust Fund for Africa or other funds, including development money, to outsource surveillance capabilities to African countries as well as to countries in the Balkans - providing surveillance tools, equipment, training, with the objective, either implicit or explicit, of curbing migration. Sometimes it's done under the guise of - air quotes all over - "managing migration more efficiently" or "addressing root causes." But what happens is that this affects people, and not only on their journey. As Judith was mentioning earlier, this can push people onto riskier, deadlier routes.
[00:17:47.070]
But also, these technologies are given to governments that are authoritarian or very fragile democracies. And what happens is they also use them against dissidents, against protesters, human rights defenders, journalists - making those countries more unstable as a result. So there is the externalized dimension of borders, and there is the technology deployed at the border. And this can take different shapes. It can take the shape of a database where you have to apply for a visa and your data - your biometric data - are collected and shared. Increasingly these databases collect enormous amounts of data. So, we're talking about migration, we're talking about asylum, but this applies to migrants, not only to asylum seekers. This is the way the European Union and other countries in the world choose to collect data to make choices about who is, again quote unquote, deserving of a visa, of asylum status. This can take the shape of more visible technology at borders, and then refugee camps that can be highly controlled, that have systems to surveil and control who goes in, who goes out, whether you can or you can't. But this extends even beyond borders.
[00:19:17.170]
So we know of immigration detention, but we also know that in several countries - for instance, outside of the European Union, the UK has deployed a system of GPS-tagged ankle bracelets for people released from immigration detention. These have been struck down by courts time and again, but this does not stop them from trying. And despite the fact that it hasn't been used in practice in European Union countries, there are now several laws in different member states that provide legal cover for this to be used. And even after people arrive, apply for asylum or have status, these systems - algorithmic scoring and different forms of data assessment - are used to decide on forms of welfare or to monitor social media. Again, these are all systems that compound different techniques and tools. They live on the extended collection of data and on increased data sharing, where people lose track of their data, which means they lose agency, because they don't know what choices are made about them with their data or how long it's going to affect them. So, technology, as I said at the beginning, expands borders, makes them ubiquitous, but it's also a veneer for the violence and control that happens.
[00:20:53.920]
And just to also make a note, because there is also this idea that technology is a clean-cut solution, or something that reduces the violence. First, violence happens at borders and at different stages, with or without technology. But what technology can do is provide more ways for governments and for different authorities to exert this kind of violence on people.
[00:21:23.570]
And not only governments, but we have the private sector very involved in this. We see this at the external borders of Europe going insane, with massive amounts of data being illegally extracted from people, right? I mean, we did an investigation at the beginning of the year, and we spoke to people in the closed and controlled access camp on Samos, which is the state-of-the-art refugee internment camp - there's no other way for me to describe it. And they're extracting data from the cell phones, they're collecting biometric data from the people without them ever signing a consent form. Why do you think they gather all this data? Why are the companies so interested in that?
[00:22:13.790]
Well, first of all, the private sector is heavily involved in what we can call the border industrial complex. They provide not only the technology, they provide the narratives as well. And we have to think of this not only as what we see in, for instance, the asylum and refugee space, but as something that has become normalized in the experience that every citizen has at the border. The idea of a "smart border", again quote unquote - a smart, a clean, a seamless experience. These are all concepts that are used to convey the idea that technology is going to solve a problem. You mentioned techno-solutionism in the beginning. These companies also, in some cases, co-design this technology and the use of this technology. So, a lot of this happens increasingly in the context of public-private partnerships. Just to give you an example from the immigration detention and alternatives-to-detention space in Australia: a risk assessment tool to decide who is going to be in immigration detention was co-designed by the Australian government and a company called Serco, which is in charge of immigration detention in many countries in the world.
[00:23:34.970]
And this is a public function - a state government co-designing with the private sector, whose objective, at the very core, is to make profit.
[00:23:45.290]
And I mean, governments kind of have to answer to us when we pose questions to them. Although, from my experience - and I'm sure you share that - you can ask as many questions as you want, but a clear answer you won't get out of them. The European Union will answer, they will send you documents which are all redacted, and all the important stuff is black. So, you get to read all the boring stuff but never the interesting stuff, which is extremely frustrating. But what are the issues with private companies and also semi-private institutions like the ICMPD or the Center for Security Studies in Greece? What is the problem when the private sector takes over government responsibilities at this level?
[00:24:27.930]
That there are no systems of accountability. As imperfect and difficult as they are, as you have just described, governments should at least be accountable. There are mechanisms that can be improved, that can be changed, that can be created. You mentioned access to information. But also, more broadly, politicians and policymakers should be accountable to citizens. So, a citizen can decide with a vote, with protest, with a movement, that they don't agree with what the government is doing. But if a private company is doing something, they respond to, I don't know, the owner, the shareholders - that depends on the type of company - and they are outsourced. And this is something that is happening increasingly in so many sectors. We're talking about migration, but this is happening increasingly in healthcare and welfare. Who answers for the results, the effects, the impact that it has on citizens, on people? Private companies? There aren't really a lot of mechanisms for them to answer, to be accountable for decisions that are made. And these are life-changing decisions. You mentioned consent earlier, but we're not even at the stage of consent, because if it is real consent, you should be aware of it and able to withdraw it.
[00:25:57.650]
And your situation shouldn't change because of it. But these are impossible choices, because even if the person could understand and could give informed consent, if they have no other choice but to consent, it's not meaningful consent. So these are...
[00:26:11.380]
You mean like in a refugee camp?
[00:26:12.820]
Yeah, in a refugee camp.
[00:26:13.900]
For example, am I supposed to say no if I know that this will most definitely affect my asylum process?
[00:26:19.180]
Or at the border, if you are asked. So, you know, now there are procedures where you are basically forced to give your biometric data - at the EU border, including that of children as young as six years old. And so, it's like, okay, but what is the alternative? Like, oh no, you have to give me your data. And then what's happening to my data? I don't have control over my data. And this, again, to go back to the private sector: these are data that can train systems. These are data that can be used to create new products. Basically, they can also be used for other purposes - what we call mission creep. I collect data for a certain purpose, maybe even within a legal framework, but then I can share it with another agency, I can use it for something else. I can share data that are collected in a certain context, at the border, with law enforcement. So, just to talk about agencies first, and then we'll go to the non-public agencies: there have been investigations, and the European watchdogs have pronounced themselves, on the collaboration between Frontex, which is the border agency and is not law enforcement, and Europol, which is a European law enforcement agency.
[00:27:38.200]
And the ways these two agencies can collect data are different, are differently regulated. But they were sharing this data in a very, let's say, creative way - but definitely not a very legal one, to the point that the European Data Protection Supervisor said that this was illegal and they had to delete all this data. And then there are the non-public agencies, organizations that are funded by the European Union to manage these projects - EU-funded projects that can be about integration or, again, about border management. And the language here is also quite telling. It's the language of bureaucracy. So, this is another veneer of respectability over processes that are inherently violent and discriminatory. You have border management, migration management, efficient management. But what happens is that, again, there have been recent investigations showing that this money and these kinds of projects were actually used to basically kidnap migrants and dump them in the desert in Tunisia, Mauritania and Morocco. This was done with EU money. Another investigation showed that money that was supposed to train border forces was used by internal police forces in Senegal to quash protests and repress dissidents.
[00:29:13.970]
Some of these projects were managed by national member states' law enforcement, and some by these agencies. But if, for instance, as an Italian, I know my government is involved in the training, I can protest, I can build solutions, I can try to hold my government accountable. But if it's a private agency that is heavily funded by the EU or by member states, who should I go to to hold them accountable? What is the process? Who do they answer to? They answer to their donor, I guess, not to the public.
[00:29:50.250]
I mean, something else that I think is really interesting - I'm not a tech person. Well, I don't know how to build tech. I don't really understand how algorithms work; I'm beginning to understand, maybe. But what I understood is that the quality of the data is very important for the functionality of whatever system we have. And what I'm seeing is that they collect whatever kind of data, and there's also no standardization of how to collect data. If a racist police officer interviews you at the border and that goes straight into Eurodac, that is not reliable data. It's data coming from a person who already has a bias. So that also explains why the systems are so biased, right?
[00:30:35.480]
I mean, these systems - again, this is the misconception about technology, that it is going to make the process better because it's neutral. But this is created by humans. Algorithms are trained on data that are collected and assembled by humans. The issue, though, is that when the algorithm starts learning from those data, it becomes, you know, sort of a loop. So, it will be much more difficult to go back to the original data and say, okay, actually this data was flawed.
[00:31:08.130]
But also, the other thing is that people very often don't know that decisions are made based on this data. So, this is another issue. This is something that we've also seen with investigations into welfare surveillance, when there are algorithms at, you know, city level or state level that decide whether you are entitled to forms of welfare, of social protection, and you don't know that you've been denied because an algorithm has assessed that being a migrant or being a single mother makes you more of a risk - that these are risk factors. You cannot even challenge that. I find it a bit of a paradox - and I've contributed to some of these investigations - that the elements that make you vulnerable, and so in theory more entitled to welfare, make you more of a risk and so less entitled to welfare. But who's going to challenge this mentality if you don't even know? So, in the case of the investigation I mentioned, the fact that the journalists were able to obtain parts of the algorithm meant that they could go back to the city administration and say, look, these are the elements.
[00:32:17.910]
And this is why that doesn't work. I mean, to their credit, in the end the city administration, which was the city of Rotterdam in that case, decided to stop using the algorithm. But again, this requires so much effort, so many resources and so much understanding of the tech - and that shouldn't fall on the individual. This is also the issue. Sometimes people say, oh, but you should protect yourself and use this technique against facial recognition or against data collection, or, why do people share so much data on Facebook? But this shouldn't fall on the individual users. If I use Facebook to connect with my relatives in another country, why should it fall on me to keep the company from collecting my data, sharing it, selling it, profiling me - what is called today surveillance capitalism? It's all part of systems where data become something that can be monetized. But the effect sort of follows you; it's a ripple effect, and you don't even know when it's going to hit you. And I'm not just talking about migration. I'm talking about citizens in different parts of their lives.
[00:33:32.030]
It's a huge money machine too. And it's used under the pretense that Europe can't take in more refugees - because in Europe we like to think that every refugee wants to go to Europe and that most refugees already are in Europe. Which is especially interesting because we are here in Kenya, right? The grand majority of refugees are not in the Global North, like Europe, but in Global Majority countries like Kenya - where my guest Grace is from, actually. And when you were 12, you migrated down to South Africa with your family, where you now work as a human rights lawyer. My question to you is: how fair or unfair is the asylum system in South Africa? And what have you observed on the level of technology? How is it being used in South Africa?
[00:34:27.090]
If we're going to take it all the way back, I think it's important to speak to the roots of how the South African asylum and migration system came into place and how it's deeply rooted in human rights and democracy, largely because of the very dark past the country came from. And also being mindful of the fact that South Africa is a very young democracy: in putting together legislation post-'94, the thinking was "never again." That was a lot of what was going on. Never again. No more exclusion. So even in drafting the constitution, which has actually been lauded as one of the best constitutions in the world, a lot of the principles are very inclusive, and inclusive of migrants. So, for a long, long time, South Africa was very welcoming with respect to migrants and refugees, because they really did seem to understand the context in which people have to flee. But there was a shift in 2008, when, due to shifting geopolitical circumstances, with economies down, South Africa experienced a large influx of migrants for the first time. And it was at this point that I think people started seeing a shift in narrative, a shift in policy, in how migrants were being perceived, particularly migrants of African origin.
[00:35:56.580]
Unfortunately, I just feel like it's something that we don't learn from, and we seemingly keep finding ourselves back in those spaces. But to speak to the asylum system and how it works: in my day-to-day work with asylum seekers and refugees, what I have observed, especially in the last 24 months, is an increased focus on limiting the rights of asylum seekers and refugees, to the point of wanting to withdraw refugee status from people who have held it for over 20 years. The system is very slow, very paper-based. I was listening to my colleagues, and I guess the challenge on the other side of the world is the influence of technology and data collection and what that means for people on the move. Whereas I would think down south - or in the Global South, rather - people want their data to be taken, because that gives them visibility; then you are somebody in the system. Which means that once you get through that and they've taken your fingerprints, your biometrics, whatever, if the police stop you, you are somebody. If you are nobody, you're in the back of the van, you're in detention for three months, and you're deported right back to the border.
[00:37:10.120]
So, I find that the experience on the other side is that people want their data to be taken. And Antonella was speaking about consent - is it manufactured, is it real consent? I think in those circumstances, they think about the tradeoff. Am I trading off my privacy, access to my data, and what is the tradeoff in that for myself and for my children - our ability to sustain ourselves, our ability to access services, our ability to integrate into society? And for a lot of people, that is a tradeoff that they take. Again, is that real consent? I don't know how we can answer that question, but at this point, that is what it is. And speaking to the commercialization of some of these spaces, just as an example: from the 10th of October, I think it was - or a little bit earlier anyway - the asylum management system was down for over a month. This is the system that allows asylum seekers and refugees to renew their status by email. You literally send an email, and they respond to tell you whether you've been renewed or not.
[00:38:26.190]
So that system was down for a month, but there was very little communication that the system was down. You can imagine how many thousands of people were sending in their applications for renewal, but they just kept bouncing back. And there was no communication; nothing was being forwarded as to why there was system downtime. The ripple effect of that is that we then had an increase in detentions, because now the police are like, well, your permit has expired, or your asylum seeker permit has expired, you are illegal in the country. And oftentimes migrants don't understand that you do have rights, especially in South Africa, to argue that the reason you're not able to renew your permit is the system. That usually then opens the door to having a conversation about, okay, then let's have a conversation on the side and see what you can give me. Because the police officer knows. The police officer knows that it's not their fault. But he also knows - how can I say this? - the culture of the streets, where people understand: okay, do I want to go home to my family tonight, or do I want to sit in a jail cell out of principle?
[00:39:44.920]
So, again, are these decisions that people make because they can, or are these decisions that people make out of necessity and survival? You gain a lot of empathy, especially for asylum seekers and refugees who come from countries such as Zimbabwe or Malawi, who don't fall into the category of refugees and asylum seekers because they're considered to be economic migrants. I think we need to do a lot of work on redefining people who need safety and security. But they are often part of the most vulnerable group, because even from a legislative point of view, they have minimal protections. So, again, speaking about asylum in South Africa: the state understands its responsibilities. The state understands the rights that all migrants have in the country. But does that stop opportunistic politicians from playing on this? No. Does it stop the rise of vigilante anti-migrant groups who stand in front of hospitals and clinics and stop migrants, legal or illegal, from gaining access? No. Does it stop the call in the new year to stand in front of state schools and stop migrant children from accessing schools? No. So I think when we have the conversations about tech and migration and legislative policy, what is missing in that conversation is the sentiment on the ground, positive or negative, and how we proactively engage with it.
[00:41:28.260]
Because if we keep burying our heads in the sand and saying, well, the law says this and the law says that, you still have growing animosity towards migrant communities, and there's very little work being done towards integration or social cohesion. I know we're moving away from tech, and I digress, but these are real bread-and-butter issues. These are situations where people are caught between a rock and a hard place, and the decisions they have to make are rooted in life or death and survival. And can tech be an enabler? I believe it can. I think there is a lot of room to use technology to educate, to empower, especially in Africa, because I don't think the future in which our governments use the technology that Antonella and Judith have spoken about is far off. But at this point we have a small, magical window to educate people about what could potentially be coming before it actually does. Will the courts stop them? I don't think it'll be an easy ride for them, especially in South Africa. The courts have proven to be almost like the last line of defense for human rights and against the opportunistic abuse of some of these freedoms.
[00:42:49.010]
But is it coming? I think it is, because again, they will frame it as protecting our borders, national interest, lack of resources. It's also a framing of - and this is what we hear a lot on the streets - we have our own problems, and we need you guys to leave so we can clean house. We cannot ignore the fact that this is the sentiment. How do we advocate to educate both host communities and migrant communities?
[00:43:21.180]
Yeah, I absolutely agree. And to add to that, I think it's also an opportunity to show that what happens elsewhere is not inevitable. We're seeing, in many of these issues, Global North countries - the US towards Central and South America, the EU towards African countries and countries in other regions - push for similar solutions. So, the solution is more detention; we're going to fund detention. Like this news from, I think, yesterday that Mauritania has built two immigration detention centers. But this is not enough. It doesn't have to be this way. And I absolutely agree with you that maybe there isn't a lot of time, but a conversation can be built, and technology can be used to educate people. There can be a different way to talk, a different conversation, a different way to decide how these things should work - with people at the center of it. And I think one thing about tech - which, as you said, can very much be used to empower people to self-organize, and we're seeing a lot of examples of that - is that it's not inevitable.
[00:44:46.080]
You don't need tech to do certain things. You don't need to have tech working in a certain way to do certain things.
[00:44:52.240]
But you can develop tech, for example, to inform people, which is what you've been trying to do. Unfortunately, the private sector seems to have set its mind on doing the opposite: keeping people in need of protection on the other side of the border, never mind what's going to happen to them. And so again, as you mentioned before, the burden of developing these technologies lies on the shoulders of, well, lawyers like you. What kind of problems do you encounter while trying to develop technology that assists people in need?
[00:45:33.620]
Developing tech is expensive, yes, but that's the easy part. Building the platform is the easy part. Creating the content is the easy part. Taking it to communities, that's where we get stuck, because we don't acknowledge that we've eroded a lot of trust with communities, especially migrant communities, because civil society has been seen to be very extractive in its approach. A lot of communities will say: we know you're in this for the check; we know that once the funding dries up, you'll be gone. It's not that you care about us, and you coming to tell us about this chatbot that's going to educate and empower us - it's almost like, what's in it for you and what are we getting out of it? And I know it sounds terrible, but I always say trust is a currency that must be earned. And even in building tech for good, you need to understand and appreciate where they're coming from, where they've been let down by the system, where they've been let down by civil society. So even with us coming together to build something that is supposed to counter a lot of this negative and harmful tech, we also need to check our privilege and understand that we are dealing with a community that also potentially views us with suspicion.
[00:46:59.000]
So that's where the difficulty is. Building the tech, finding the platform - a lot of that is the easy part. And those conversations are easy. But will the community use it? It ends up being a question of who you are building for.
[00:47:13.330]
But of course, building a huge database. Building a huge database - not so easy…
[00:47:15.330]
…not so easy. Building the data sets, building the safeguards. But then again, even with that, if you have the right experts in the room, who are on a mission that's aligned with human rights and doing good, we can do that. For me, it always comes back to the platform you're building: will it be of use to its intended audience, irrespective of who's in the room and how? Because you can build a beautiful platform, a perfect platform from a systems point of view. Yes, it'll cost a lot of money, which a lot of us don't have, but once that's built, will it just be another white elephant project? And we'll pat ourselves on the back and say, we built a chatbot that people on the move can use. You were giving examples about how facial recognition struggles with darker skin tones. A lot of that can be fixed if you put the right people in the room to have these conversations about who's building the tech. But once that's done, will it be of benefit to that single mom who has packed her whole life into her backpack, with her child at her side?
[00:48:31.130]
Will she be able to gain actual value from whatever it is that we believe we're now putting in her hands? Or will she rather go to the state and say, take all my data, I'll give you all my fingerprints, scan everything you want to scan? Because for her, the tradeoff is that then she and her children can be safe.
[00:48:52.970]
We've been talking a lot about technology that is supposed to keep people out. But the same technology can essentially also be used to keep people in. We have seen a lot of horrible images from Gaza over the past couple of months, and we know that Israel is a highly technologized country. They're at the forefront of surveillance technology and predictive policing. There's Hebron, which is pretty much a laboratory for tech companies and the state of Israel to see to what extent they can profile people, intimidate people and invade their privacy. Not so many people know that Kashmir is also a very harsh example of how private companies, in cooperation with states, turn whole regions into prisons. And I have Ankid here with me; he's an Indian researcher focusing on this issue. Ankid, could you please explain to me what the issues in Kashmir actually are and how this affects the people?
[00:50:08.370]
Thank you very much, Florian. I think when it comes to technology and digital othering, I feel Kashmir is Laboratory 2.0. If Palestine is 1.0, Kashmir is 2.0, because they're using the same facial recognition technology, called Red Wolf, to track the movement of people who live in this conflict zone. So, I will talk about the borders first. When you go to the border, it's just a hundred meters between India and Pakistan, and people have been living on those borders for over 75 years now, since the country was divided into two parts. In 1947 came the Partition, when Pakistan was born out of India, and then in 1971 Bangladesh was born out of Pakistan. These three borders are the most vulnerable borders. I go there often as a researcher, and I see people who are a hundred meters apart. The wife lives on the other side of the river and the husband lives here; the brother lives here and the sister lives on the other side of the river. But they have not been able to talk to each other for the last 36 years.
[00:51:38.470]
This is how tech divides in Kashmir, in a conflict zone. And then if you go to the other side of the border, to a place called Uri, it's a heavily, heavily militarized place where every movement is tracked. You are not even allowed to take your phone to that place, so that you could see or at least record what's happening there. But the people who are literally living among those garrisons are supposed to share their data: how many children they have had so far, how much money they make, why and where they are going to stay. Even if they have to go out in the morning, say to see the doctor, they cannot go on their own. It's a gated community. They have to get permission from the army before they can go. People say that tech is a digital equalizer, but I feel it's a digital divider. This has been happening on the ground in Kashmir, where even the right to food depends on it: if you are an 80-year-old, the government gives you subsidized rice, 5 kg for one month.
[00:52:58.540]
And if your fingerprints do not match in the machine, you are denied the food. And then you have to figure out for yourself what you are going to do with your life. And then when it comes to workers, there's a scheme named after the father of India, Mahatma Gandhi: the Mahatma Gandhi National Rural Employment Guarantee Scheme. The government gives you a hundred days of work. Migrant workers register for it, and for the registration you have to register on a particular portal. And Kashmir, you know, is the world's Internet shutdown capital - the highest number of shutdowns. Any moment a protest happens, there is a shutdown. For the government it's like a switch: you turn the switch on, the light is on; you turn it off, the light goes off. And when that happens, you are denied the right to work. When you cannot register, you cannot work, and when you cannot work, you cannot get paid even the meager sum the government pays for it. And of course, when these migrant workers register for the work the government provides, they have to register with a card called the Aadhaar card.
[00:54:31.510]
It's a mandatory ID card, like a passport. Without it, nothing happens. Even the Supreme Court, the apex court of the country, says it's not mandatory, but on the ground nothing happens without it. If you do not have an Aadhaar card, you cannot register, and when you cannot register, you cannot get work. And how do you get that Aadhaar card? You get an Aadhaar card if you are a permanent resident of a particular place, and if you are not able to prove that you have been living in this place for the last 15 years, you are denied an Aadhaar card, or you are denied the domicile certificate.
[00:55:11.550]
We've heard this also from Judith and from everybody here, basically: that technology has not been living up to the hopes and the false narratives of techno-solutionism that we've been hearing. But to what extent - and I think this is where Kashmir is also very, very special - to what extent has technology further penetrated the lives of people away from the borders?
[00:55:36.480]
So, I go to Kashmir, to the capital city, Srinagar, quite often. It's the place where you land when you fly from Delhi. Once you move out of the airport, you see that it is the most heavily militarized space: for 8 million people, you have 1 million soldiers. And if human spies monitoring your movements, or just suspecting you, a simple Kashmiri walking on the streets, were not enough, they have gotten the technology now. So, under the guise of having to monitor the traffic in a very small place like Kashmir, they have got something called an integrated traffic management system. These are PTZ cameras that monitor 180 degrees around wherever the device is installed.
[00:56:42.460]
It has become mandatory now. Every shop, every school, every establishment - say, any government building or private building - they are supposed to install these cameras now. And nobody dares to ask the question: why? They say, we just have to stop the thieves; thieves are on the prowl here, they loot people. But that's not the truth. There is no data suggesting it. They're monitoring the people living in this highly surveilled and highly militarized place. And what do they do with that data? As RTI activists - I mean, there is something called the Right to Information Act, which was passed in 2009 by the Indian Parliament. As per the act, you can ask questions of the government. There are some exceptions: you cannot ask about defense, you cannot ask about communications, you cannot ask about human rights. National security is used as a reason. So, under the guise of this, you cannot be somebody who questions the government's national interests. And if you ask a question now - the act has been reduced to an almost useless law.
[00:58:11.490]
Because if you are an RTI activist and you submit your RTI application - a right to information application - you are asked a lot of uncomfortable questions, not by the commission but by the police, by the law enforcement agencies. I met one of the RTI activists. He said, I wanted this information. And then the local officer - the lowest level of governance starts with the villages - that person threatened the RTI activist and asked him, why do you need this information? And if that was not enough, then the police came and they threatened him: if you don't withdraw your application, we are not responsible for your life.
[00:58:57.200]
Trying to exercise your rights actually leads to threats.
[00:59:00.960]
And then that person had to move from the village, and now he lives in the city, and he does not want to go back, because that place feels very, very intimidating. As a journalist, you feel helpless. As an activist, you feel helpless, and then you are left with no option but to just submit. And if you question, you are a dissenter, you are a troublemaker. And for troublemakers, the only option is jail. You are put behind bars and then you rot there.
[00:59:36.310]
You were talking about activists, and we've seen this happening a lot in Europe too, and in the United States, and I'm sure everywhere: activists and journalists and researchers are being targeted when they deal with these very uncomfortable questions. We also see that in environmental movements at the moment. But what about people who don't provoke, who don't ask questions? Are they unaffected by technology?
[01:00:02.240]
Yeah - when these people feel that they are being watched, that they are being monitored even on social media, then of course they feel it's not normal, it's abnormal. And when they see a doctor, a counselor, a psychiatrist, the doctors notice the behavioral changes; they notice that this person is feeling anxious. I mean, nothing happened, but just the fact that the state is watching you makes you feel bad. You see a lot of mood swings in people. You see people shouting on the streets, honking unnecessarily. I go and talk to people in Kashmir; they feel helpless and they are silenced. They're silenced because they feel they don't have a voice. Big tech has failed them, and they feel that. As per a Médecins Sans Frontières report, every second Kashmiri walking on the streets is suffering from post-traumatic stress disorder. This is not normal. This is something very abnormal - abnormal in an abnormal place, with a lot of surveillance from both humans and machines. And since Kashmiris are silent, the feeling I get from the streets is that for them, silence is the loudest protest.
[01:01:48.720]
I think what's really important in all of this, when we have discussions around technology, is to incorporate the human experience of it, because only that really guides us to designing better technologies, more responsive technologies, and also technologies that actually work - because we know that a lot of the technologies being deployed already don't work. Judith already said that facial recognition oftentimes doesn't work for people with a darker complexion. We also know that an algorithm or an AI doesn't make decisions; it makes an approximation based on statistics, basically. And if we look at court decisions, also in the European Union - in Greece, for example, where migrants and refugees are being caught at the beach and basically charged with human trafficking right away - AI will learn all of this. It will be a harsh judge of our own failures to implement laws that are supposed to protect us all. And I think people underestimate to what extent technology will also deprive them of their rights at some point. But in order to wrap this up: I'm very aware that none of you will be able to provide a holistic solution for all of this, unfortunately.
[01:03:09.940]
But what I would like to do is just ask each of you for a quick impulse. Not long, just a quick impulse. From your point of view, from the experience you have made: what are your demands, what are your hopes for how this situation can be controlled, how we can regain control over this? Judith, maybe you?
[01:03:32.580]
I guess I think more in terms of the human factor in all of this, you know. I really cannot understand how, as humanity, we have this level of technology and we still can't have a more just distribution of wealth. Think about kings a hundred years ago: they didn't have the access to information that we now have in a smartphone. This is richness in a way, but it has been weaponized to further widen the gaps in access and in dignity. And I can't understand that for the life of me. I can't. There are enough resources for everybody to have a dignified life.
[01:04:27.020]
So what is your demand from your position?
[01:04:29.620]
I don't think I have demands. I want to come back to the point that Grace was making when you were talking about the sentiment, and you said, "But I'm digressing." I don't think you are digressing. I think this is a core subject in the technology conversation, one that should gain prominence, because tech doesn't happen in a void; tech happens in a context. And the way it's implemented has everything to do with the sentiment in the streets, with the sentiment in the government, in the institutions. So I think we need to pay a little more attention to that. Maybe it comes down to changing the narrative, which already sounds a bit cliché and overused, but maybe there's something else we can do regarding the sentiment. Maybe strengthen our connection to each other, maybe create more spaces like this where we can sit together with different backgrounds, different opinions, and different proposals for solutions. And what we're doing here, meeting in person, is feeding that connection, that human connection.
[01:05:53.190]
I don't think it's cliché at all to speak about this, because we've seen how powerful narratives are in the negative sense. I'm from Germany, and I've seen how quickly people who in 2015 were applauding refugees arriving at the train stations turned against migrants, and how much that destabilizes the country. I think this is something we see very strongly in Europe at the moment, how the far right is taking over. And in the United States, we have people in power who sympathize with ideas like CEO monarchies, who actively fight against the Constitution. And this is something new and very much exacerbated by tech. Antonella, as a researcher, what are your demands? If you could change certain agendas and certain practices right now, let's say, what would your first attempt be? What would you like to change immediately?
[01:07:01.680]
Well, I'd like to work on something else for a change! No, I mean, I think again there is a level of the conversation like the one Judy just explained. I feel a lot of this conversation has been artificially imposed by very short-term thinking. Even in a lot of European countries, migration is not actually considered one of the major issues, but it's been pushed by politicians as the issue, as the issue that is supposedly preventing you from getting better access to healthcare or welfare. We've been on this for a decade; this is something I've been hearing for the past ten years, over and over. So, I think there needs to be a shift in the conversation. There needs to be an understanding that tech is not about tech; tech is about what people are trying to use it for. And from a researcher's point of view, just to try to answer your question a little bit, I have a demand, not to governments, but to researchers and journalists and people who are investigating these things.
[01:08:21.870]
I think we should get better at not treating tech as an actor. These days I hear a lot of talk about AI accountability, and I think that framing obscures who has to be accountable. It's not AI that has to be accountable; it's the actors with power. Who are the people who should be accountable, and to whom? I cannot go to an AI and ask it to be accountable for what it does. So, I think we also need a shift in the language we use, in the frame we use. Sometimes we're prone to repeat the same thing because it works, because it's what we're used to, because the funding we're getting comes with that kind of language, that kind of label. It makes it easier to get a grant, to get support, to get an audience. So, I think we need to think more critically. And then of course, as far as governments go, I think we already have laws that are not respected. So even just having them actually applied would be one thing. But also, again, there is the non-inevitability of what is happening.
[01:09:31.780]
When we hear "oh, tech is here to stay," that's such a lazy trope. Yes, I'm not saying let's destroy the machine, but we can use tech differently. I've seen in my most recent research a lot of work that is community-led, people-led. It shouldn't be that so much effort goes into providing information that the government is not giving you; the way tech is designed should take this into account, should grow out of this. What are the needs? How are we addressing those needs? This is not happening yet, but I think it's one of the things we can demand, and not only from politicians; it can happen at a lot of different levels. There are more successful and more meaningful examples, maybe at the city level or at a more local level, that need to be explored more. Not just big government and big tech exploiting people, but elevating examples of meaningful design and collaboration that are people-led and community-led.
[01:10:47.340]
Grace, from your South African, Kenyan angle, also looking at what's happening in Europe and how tech is being deployed, what are your two cents here?
[01:10:58.620]
Mine is to say that it comes down to people. I think as civil society, we need to understand that we're not helpless in this. We spoke a lot about shifting narratives. Judith, you were saying how tech happens within a context. So, if we shift narratives on the ground and we work towards that, then maybe politicians would not be able to manipulate anti-migrant sentiment and use it to rally their polls and distract us from the fact that they are not actually dealing with the real socioeconomic issues. So maybe it is that fundamental, and maybe that's the point where we need to stop. I mean, look, I get it, everybody has their mandate and they're doing what they need to do. But maybe if we didn't work in silos, maybe if we presented a united front, both to the state and to communities, then maybe that shift could actually start happening. And tech within that context might then also shift toward something more progressive, participatory, and human-rights-based. But at this point we're all just making noise because everyone's trying to speak at the same time.
[01:12:09.510]
Ankid. Finish the podcast.
[01:12:13.830]
Yeah, thank you very much for the honor. I have been doing this at radio for six, seven years now, I think. Yes, you are right. Drones can wait, but welfare schemes cannot. You are the world's largest democracy, but at the same time you top the list in the World Hunger Index. So, the 9.6 billion rupees you are investing in facial recognition cameras, or the 600,000 cameras you have in a city that does not actually need them: you can easily divert that money. You can spend that money to pay pensions to people with disabilities, to migrants, to senior citizens who need it the most. I think business can wait; empathy cannot. And I feel we have got to support Amnesty International's 'Ban the Scan' campaign and tell the government that just 2% of your facial recognition results are reported to be accurate, and 98% are inaccurate. And what you do is you target. You spend billions of rupees on these facial recognition cameras just to target minorities, just to target Muslims, and label them as terrorists, label them as troublemakers and anti-nationals.
[01:13:59.140]
Yeah, anti-nationals, antisocials, whatever they call it. But the truth is, you have got to at least try. If you cannot put yourself in their shoes, at least have some empathy. And if you are designing these policies in air-conditioned rooms, at least picture yourself there at the border or in the city, and be considerate and mindful of the fact that this world works on karma. What goes around comes around. Have some empathy, please.
[01:14:44.710]
Yes. Be human. Be kind. Become aware. Inform yourself. Get up and take agency. I think this is what I'm taking away from this podcast and from this whole gathering, actually. And I really appreciate that you took time out of your very busy schedule here, because we do keep you quite busy, and our next appointment was actually half an hour ago. But thank you so much for sharing your insights and your experiences. Well, enjoy the rest of the gathering.
[01:15:17.680]
Thank you.
[01:15:19.200]
Thank you.