
By Taylor Reasin, PPS'27
I didn’t expect “playing with toys” to be on my college agenda, yet there I was at a roundtable with Dr. Kate Darling, petting a robo-dinosaur named Mr. Spaghetti.

I was surprised by the fondness I felt for a bundle of circuits, but that feeling of empathy for an inanimate object was exactly why I had come to meet Dr. Darling. As Co-President of Tech for Change, Duke's undergraduate public interest technology club, I was curious but skeptical about her animal analogy for robots. Our club had discussed too many case studies of people romanticizing AI and robots as emotional stand-ins for relationships. So when I heard she compared robots to animals as a model for how we should treat them, my first instinct was concern: wouldn't framing robots like animals invite parasocial bonds, with people treating machines as pets and drifting away from real human connection?
What Clicked Instead
However, my misconceptions were quickly dispelled by Darling herself. During an intimate student meeting, she clarified that the analogy was not about using AI or robots solely as pets or friends, but rather as a different kind of companion that could enhance our lives, our work, and even our own morals. She mentioned how human-centered designs, like Alexa's "please" function for bossy kids, build better habits in us, since how we treat non-humans largely reflects our ethics. Mr. Spaghetti, Darling's robotic dinosaur, further demonstrated that we as humans connect with anything that moves and responds, but that's a cue for strategic cooperation, not parasocial attachment. We've teamed with horses for power and dogs for scent without demanding they mimic people. Robots and AI fit the same mold: tackling repetitive or dangerous jobs, teaching us ethical habits, and even freeing humans for creativity and connection instead of replacing it.
Policy Horizons Expanded
As an aspiring leader in the tech policy space, I was inspired by how Darling carved such a unique and impactful career path, blending robotics, ethics, and real-world storytelling to shift our perspective on technology. Events like her session, paired with my classes at the Sanford School of Public Policy, constantly motivate me to create new ways to be the change.
My PubPol 302 Ethics for Policy Makers class, taught by Professor Antepli, echoed the same themes as this lecture. Professor Antepli was relentless about helping us understand that policy is not just rules on paper; it is about bridging our deepening polarization and designing solutions that pull people together rather than letting tech widen those gaps. I still remember scribbling a note in my journal during Darling's lecture to "keep humans at the core."
Takeaway
Leaving Dr. Darling's roundtable, with Mr. Spaghetti still fresh in my memory, I saw my role as a student leader in tech policy through a clearer lens. I want to encourage students to build on our existing solution-focused discussions, advancing policies that position robots and AI as capable non-human partners ideal for repetitive or risky work, while subtly shaping our ethical reflexes through designs like politeness prompts. As an aspiring tech policy leader, my education has expanded my mindset to imagine innovative frameworks that embed human priorities, ensuring technology amplifies our strengths and connections rather than competing with them.
Taylor Reasin is a junior studying public policy and digital intelligence. She is passionate about the implications of technology for societal norms, politics, and global innovation. Reasin co-founded Duke's premier undergraduate public interest tech club, Tech For Change, and is active in various tech research projects as well as philosophy-oriented clubs. She has interned with leading ethical tech think tanks and Congress, and this upcoming summer she will explore tech in the private sector. In her free time, she loves to explore the great outdoors and dabbles in the saxophone.