Valerie Black, PhD
Anthropologist of Human-Technology Relationships,
AI Care, & the Future of Work

My Research
I’m a sociocultural and medical anthropologist (PhD, UC Berkeley) and disability studies scholar whose research explores and critiques the growing presence of socially situated relationships between humans and AI – relationships that can range from deeply intimate to entirely impersonal, and everything in between. We’re only beginning to understand what human-AI relationality entails, and I’m concerned that we’re underprepared for the magnitude of the shifts it will bring. These dynamics are not fringe phenomena; they shape how we access, interpret, and relate to information.
This underpreparedness, I argue, stems in part from the tendency to dismiss human-AI relationships as marginal or irrational – a dismissal rooted in ableist assumptions that only “crazy people” are vulnerable to the effects of AI relationality (and that psychosocially disabled people in turn don’t deserve consideration).
My concept of AI’s ontological ambiguity helps explain why such assumptions are not only inaccurate but actively obstructive, insofar as they limit both inquiry into and responses to AI’s relational impacts.
My work underscores that disabled people are critical interlocutors for understanding AI relationality, precisely because of the many cultural projections about how disabled people might need or relate to new technologies – perspectives that rarely come from disabled people themselves.
Simply put: because AI affects us all, being an anthropologist of AI means working to ensure that those most impacted by its development and deployment are centered in determining what it becomes, and how we use – or refuse – it.
I’ve conducted three years of ethnographic research at AI startups in Silicon Valley and Tokyo, work that highlights the importance of evaluating our relationships with and through AI as a basis for developing ethical frameworks to guide and regulate its use. Supported by grants and fellowships from the National Science Foundation, the Wenner-Gren Foundation, and the Newcombe Foundation, my doctoral research focused on AI’s integration into mental health care in both the US and Japan. My dissertation, “De-Humanizing Care: An Ethnography of Artificial Intelligence,” demonstrated that while AI care in the form of therapeutic chatbots can offer valuable support to users, it unfolds from and perpetuates the ableist outlook that care is an inherently scarce resource – which in turn limits the forms AI care takes and the possibilities for what it might become.
Currently, as a postdoctoral scholar at the Decision Lab in UCSF’s Department of Neurology, I’m applying my ethnographic expertise, disability studies background, and insights into human-AI relationality to improve existing ethical guidelines by better centering the experiences and perspectives of disabled recipients of neurotech devices. I’m committed to ensuring that “nothing about us without us” is at the core of the growing, multi-billion-dollar neurotechnology industry and its increasing intersection with AI.
Select Publications
The Eugenics Logic Behind Running a Government Like a Startup Under the Trump Administration – Disability Visibility Project
AI Job Displacement: Perspectives on the Future of Work From Beneath the Silicon Ceiling – Somatosphere.net
Towards an Anthropological Praxis of User Data – Anthropology News
The Disabled Anthropologist – Routledge
Recent Talks & Interviews
Eugenics Logic – Interview & Podcast Episode – Old Mole Variety Hour, KBOO
“New Technologies, New Dilemmas: Bioethics in a Changing Healthcare Landscape” – (2025) breakout session at UCSF Health Services Research Symposium
“Rethinking the Role of Disability in Technology Ethics: From AI Ethics to Neuroethics” – (2025) Presentation at the International Neuroethics Society (INS) annual meeting “Neuroethics at the Intersection of the Brain and Artificial Intelligence”
“Terms of (Dis)Service: The Stakes of Human-Chatbot Relationships” – (2024) Invited lecture for the UC Berkeley School of Information