
By Akil Gore MPP'27

On November 5, 2025, Sanford welcomed professor and author Mary F. E. Ebeling to discuss her book, The Afterlives of Data: Life and Debt Under Capitalist Surveillance. Professor Ken Rogerson’s Public Interest Technology Book Club read the book in the spring 2025 semester, and students were eager to interact with her in person and dive into the details of her research and her book.

Ebeling is an associate professor of sociology and affiliate faculty at the Center for Science, Technology, and Society at Drexel University. Her research examines the intersections of gender, race, digital technologies, and data privacy, with a focus on the healthcare and medical sector. Her first book, Healthcare and Big Data: Digital Specters and Phantom Objects, was published in 2016. Her most recent book, The Afterlives of Data, dives into how personal health, medical, and financial data live multiple lives once they escape individual control, including how they are digitized and repackaged into new data commodities that have “afterlives” even after people have died.

I sat down with Ebeling during her visit to talk about her work, the evolving landscape of data and artificial intelligence (AI), and what it means to take responsibility for the “afterlives” of our data. Below is our conversation, lightly edited for clarity. 

Interview with Mary F. E. Ebeling

Gore: Your book was published in 2022, and in the world of data, things move fast. What has changed since then? What has surprised you, for better or for worse? And how has the rise in AI shifted your views?

Ebeling: Oftentimes books feel outdated before I even turn the manuscript in. When I wrote this one between 2020 and 2021, AI wasn’t where it is today and was just beginning to surface as a major concern. Outside of AI, the biggest change I will mention is public awareness. Five years ago, few people recognized the realities of data surveillance and how companies were exploiting it. Today, the public is much more aware of their data and what is out there.

What worries me is that even with this awareness, individuals are still expected to manage and protect their own data. This shouldn’t be the case! We have been conditioned in the U.S. to believe it’s on us to confront these huge corporations, but that’s not realistic. In the medical field, even doctors don’t know what data is being taken from their own patients. However, what gives me hope is solidarity. More people are realizing that this is a systemic problem and are trying to find ways to work together.

 

Gore: One of your themes is how storing and digitizing data offers evident conveniences (health monitoring, analytics, predictive care, etc.), yet also risks of commodification, surveillance, and inequality. If you were to challenge yourself, do you believe any of the benefits outweigh the harms? How might we think about convenience vs. protection? What would a fair trade-off look like to you?

Ebeling: In the ethnographic work I did with a university lab, I saw data scientists building tools that directly benefited both patients and practitioners. Sepsis tracking in hospitals is an example of a win-win scenario. When hospitals were required to report sepsis cases and readmission rates, they had to use their internal patient data to meet these guidelines. They wouldn’t get paid or reimbursed if they couldn’t track cases. Hence, there was a financial incentive to reduce sepsis rates, but it also directly helped patients.

But then you look at something like a third-party dating app collecting and selling data related to HIV status. There’s no benefit here: not to users, not to the company, mainly because it erodes trust.

 

Gore: In Chapter 4, you discuss the rise of alternative data and ways it can be used to evaluate people in non-traditional ways. How is this evolving in the age of AI and machine learning, especially with the ability to draw trends and patterns that couldn’t be drawn before? Are there new forms of alternative data that concern you? Or that feel promising?

Ebeling: Alternative data is being used widely by industries, corporations, and governments alike. A few years ago, the Georgetown Law Center published a report, American Dragnet, on how U.S. Immigration and Customs Enforcement (ICE) is using alternative data, such as utility bills, for surveillance. This sets a dangerous precedent for how data is used, and especially with AI, these forms of surveillance can scale dramatically.

At one of the biggest data-based marketing conventions, a data marketer once told me, “All data are health data.” You can take any information about someone and use it to make inferences about their health. In that sense, all data have become alternative data.

 

Gore: Your book highlights that existing policies (health privacy, data sharing, etc.) are not simply bugs but functions of broader power regimes. What kind of regulatory or institutional changes do you see as most urgent now? How optimistic are you that these changes can actually happen?

Ebeling: The most urgent need is to stop treating our data as the assets of private corporations. I don’t have a precise solution for how to do that, but it has to start with building awareness, solidarity networks, and mobilization toward stopping these practices.

I actually think local government is where a lot of meaningful change can happen. I contributed to the Reproductive Health Platform policy in Philadelphia, which was passed by the city council and signed into law in 2022. This was a package of three bills, one of which focused on restricting the disclosure of reproductive health information and data, using the city's sanctuary status to protect health data. It is an example of where true change can happen at the local level.

 

Gore: Your final chapter discusses the “afterlife of data” and the ethical responsibilities we owe not only to the living but also the dead. What would data ethics for the dead look like and what principles would it need? Who should bear responsibility?

Ebeling: We need to see data as our descendants. If we think about data as our descendants and our kin, the goal wouldn’t be to repair the current system but to reimagine it completely.

Indigenous communities are practicing forms of data stewardship grounded in responsibility to their ancestors and future generations. In 2023, the University of Waterloo produced a report on the CARE (Collective benefit, Authority to control, Responsibility, and Ethics) Statement for Indigenous Data Sovereignty, which protects Indigenous Peoples' data and its governance. This model is a good example of how we should prioritize thinking about our data and how we can best protect it.

 

Gore: What does the metaphor of “afterlife” mean to you? The use of this word for your title is a powerful, even possibly a haunting choice. Can you unpack your thinking behind this framing? Are you suggesting that data live almost a social life on their own after we die?

Ebeling: While I was writing the book, I was working closely with medievalists and poets, so the concept of “afterlives” was always on my mind. Data are constantly resurrected; they are almost like the living dead, a kind of golem or zombie summoned to perform tasks for its master. Data become alienated from the origins of their birth, completely extracted from our bodies and then reanimated in a ghoulish way.

It goes out into the world like an uncanny phantom, a golem to do the bidding for late capital and technofascism.

 

Gore: Finally, if you could leave your readers or our students with one key insight or takeaway from your book, current research, or future outlook, what would it be?

Ebeling: I have a lot of hope looking toward the future. Today, there are so many people working in data and digital equity and organizing in solidarity, many more than there were even a few years ago. People are aware, mobilized, and committed to working together to collectively “bury their dead.”

 


Akil Gore is a first-year Master of Public Policy student at Duke University. He is concentrating in technology policy and is the co-first-year representative of the Technology Policy Club at the Sanford School of Public Policy.
