Data Borders: Tracking, Surveillance, and the (Anti)Immigration Economy

On October 28, 2025, Georgetown University’s McCourt School of Public Policy hosted a book talk with author Melissa Villa-Nicholas, associate professor at the UCLA Department of Information Studies, discussing her research into entrenched and emerging borderland technology. The article below is a retelling of the conversation she had with Katelyn Ringrose, Georgetown Tech & Public Policy Fall 2025 Visiting Fellow.

Ringrose: Dr. Villa-Nicholas, in your book, you differentiate between a story and a parable. Can you start today’s conversation with a story of you and why you decided to write about the monitoring of Latinx immigrants? And if this story were a parable, what lesson would you want the audience to take away from it?

Villa-Nicholas: I grew up in Southern California, an hour from the San Ysidro border, so it was common to see border patrol. We have a border detention center and a border checkpoint in my hometown region, and border patrol is frequently parked on the freeway passing through our cities in Riverside County. For myself, as a Mexican American, I saw my mom questioned by border patrol about her citizenship, and I remember feeling the fear around those interactions with border patrol. 

Fast forward to 2018, and I started reading reports about the many companies that held contracts with the Department of Homeland Security (DHS) to develop information technology and data for agencies such as Immigration and Customs Enforcement (ICE). When I would watch demo videos of technologies such as drones by companies like Anduril Industries, the landscapes looked like the Southern California borderlands and could easily have been many of our hometowns in the Inland Empire (Riverside and San Bernardino Counties) or San Diego County. It alarmed me that information technologies were being built around our communities for the purposes of surveillance.

After the research and writing of Data Borders, and with the benefit of hindsight, if this story were a parable, the lesson that I took from this book, and what I want the audience to take away, is that we have to listen to undocumented people, DACA recipients, naturalized citizens and immigrants on visas about how we should think about data privacy and data policy. The lesson that I learned is that Silicon Valley companies and government agencies need to relinquish control over others’ data and let those communities have determination over the collection and movement of their data; the communities most targeted by ICE surveillance, detention and deportation can determine the priorities of data privacy policy and artificial intelligence policy.

Ringrose: Before we get into the lessons from your book, you have coined several phrases — phrases that have been used in similar contexts, but are unique to the ideas you put forward. One of those is the data body milieu. What does this phrase mean to you?

Villa-Nicholas: Data body milieu is this emerging state of borderland surveillance that brings all residents into an intimate place where our data lives together, defines us in digital borderlands and places Latinx immigrant data at the center of technological innovation and development. The data body milieu is social media data, library databases and the DMV database, to name a few, working together to build a new borderland for surveillance and deportations. It is that sense that our information is somehow linked to the massive surveillance project happening with ICE.

Ringrose: One of my areas of research is genetic privacy, and something you discuss in your book is something I’ve written about quite a bit, too, which is the blurred line between law enforcement and consumer genetic databases. Right around when your book came out, Homeland Security made a push to include the genetic information of individuals at borders in law enforcement databases, databases traditionally used to hold the genetic material of individuals at arrest. This push included collecting DNA samples from children under the guise of reunification. In addition to genetic privacy, what other dystopian data collection points keep you up at night?

Villa-Nicholas: One recent event that speaks directly to my concern about the collection of biometric data for “proving” citizenship is the mass deportations we saw in early 2025, when hundreds of Venezuelan and Salvadoran men were deported to El Salvador under the Alien Enemies Act of 1798, a wartime law. One of the driving motivations for these deportations was that ICE questioned and detained men with tattoos, claiming that those tattoos were related to the Tren de Aragua gang. Tattoos as biometric data are used subjectively, with no evidence that they are gang-related, to profile, detain and deport people without due process.

What stood out to me about this incident is that biometric data in the form of tattoos could be incorrectly entered into systems and incorrectly interpreted for the purpose of deporting undocumented people. I think the red flag here is that biometric data (tattoos, phenotype, genetics, etc.) can be intentionally interpreted with prejudice to decide who should be detained and deported.

Ringrose: As a Tech & Public Policy Fellow, I have the benefit of leading discussion groups here on campus, and during one of them, the students brainstormed the true risks of surveillance. In those conversations and others, I often hear things like “Well, I don’t have anything to hide” when it comes to protecting one’s privacy interests. This comes up in your book too, where you note that to be networked into surveillance is seen as “part of good citizenship: to be surveilled is marked as part of one’s civic duty.” In what ways do researchers, librarians, the students here, all of us participate, willingly, knowingly or otherwise in surveillance? 

Villa-Nicholas: In my book, I discuss how all residents of different citizenships in the United States have a vested interest in the data body milieu, because for big data, text mining or AI to work, there needs to be a lot of data to draw on. Data body milieu is communal in that our data is valued in large amounts. Therefore, we all need to be concerned about how our data moves to benefit these technologies. Whether or not we have personal relationships with undocumented people, permanent residents or people on visas, all of whom have recently been targeted by ICE, there are many entrances into this data border state.

For example, as a Library and Information Science professor, I teach databases to my students who will go on to have relationships with vendors when purchasing those databases and will also use those databases to help patrons conduct research in academic, public or private libraries. When we use databases such as LexisNexis, which holds contracts with ICE, we are asking our students, patrons and customers to provide information that ICE can search. This makes it pretty much impossible for me to uphold my information ethics regarding students’ and patrons’ data privacy rights. That is just one example of the many ways surveillance is embedded in these ICE information technology contracts. 

Ringrose: We’ve spoken a bit about the gap in time between when you wrote and published your book in 2023 and what’s been happening since, from the executive shift and the increasingly virulent political rhetoric to the “disappearance” of immigrants, including a number of transgender and gender-diverse individuals. What do you wish you could have included in your book?

Villa-Nicholas: I try to emphasize that what we are seeing today with ICE and the current administration’s approach to detention and deportation scaffolds on infrastructure, policy and xenophobic sentiments throughout U.S. history. President Obama, for example, had the highest number of deportations in U.S. presidential history, and many of the predictive policing systems and tech companies now receiving federal investment have been tested on communities of color for decades. For example, Palantir’s new ImmigrationOS AI system comes after the company spent years building out predictive policing systems, such as its Gotham database, on Black communities. I know that what we’re seeing today is alarming given how public ICE and border patrol have become and the funding they received from the One Big Beautiful Bill Act. Still, these systems were in place before the 2025 presidency began.

With that said, I do wish I had gone further into the interconnectedness of different groups of immigrants and the use of surveillance technologies against them. For example, in Data Borders, I focus on Latinx immigrants at the U.S.-Mexico border. Yet since publishing my book, we’ve seen surveillance technologies used to target Palestinian student activists, such as Mahmoud Khalil of Columbia University. That is to say, surveillance technologies developed against one group can be turned on another, depending on who is cast as the “enemy” at a given point in time.

Ringrose: And finally, if you could go back in time to when you were writing the book, what warnings would you include for your readers? No stock predictions please, thank you! 

Villa-Nicholas: In my book, I interview undocumented people, DACA recipients, permanent residents and naturalized citizens who have experienced detention, deportation and high-intensity surveillance technologies in Riverside County. From their experiences, those folks warned about the exploitative nature of border patrol and the inhumane conditions of detention centers. We also focus on how they would change the border and the development of information technology. I think I would further emphasize the importance of listening to people who have already experienced, and who currently experience, these inhumane conditions of detention and surveillance.

It’s easy to focus on the big names that partner with ICE, such as Palantir, Anduril, Clearview AI and Amazon, but what matters more is changing the systems of technological design, policy, borderlands and border agencies in line with immigrant data rights. A phrase frequently used in community organizing, attributed to Representative Ayanna Pressley, is, “those closest to the pain should be closest to the power.” We can keep this phrase front and center as a guide for data privacy, AI policy and social change as new dystopian investments in ICE and Silicon Valley are built.

Author Bios

Katelyn Ringrose is a privacy and cybersecurity attorney with expertise in protecting sensitive data and navigating complex federal regulations. She currently serves as a visiting fellow at the Tech & Public Policy Program at Georgetown University’s McCourt School of Public Policy.

Melissa Villa-Nicholas, Ph.D., is an associate professor at the University of California Los Angeles Department of Information Studies with affiliations in the Chicano Studies Research Center, the Latino Policy & Politics Institute and DataX. Her first book, Latinas on the Line: Invisible Information Workers in Telecommunications (Rutgers Press), received an honorable mention from the Labor Tech Network book award for 2022. Her second book, Data Borders: How Silicon Valley is Building an Industry Around Immigrants, was released with UC Press and received the McGannon Center Book Award from Fordham University (2023) and the Association of Borderlands Studies Past President Book Award (2025).

Sarah Mathey is a master’s student in the Data Science for Public Policy program at Georgetown’s McCourt School of Public Policy and a leader in the Georgetown Technology Policy Initiative.