The Machine Cannot Hold
The old YARID offices, before the move to the current premises, were on a second floor above a hardware stall. By eight in the morning the room smelled of the first rain, the cheap Nescafé someone had brewed on the hot plate, and the chapati oil from the stalls below. On the morning I am going to tell you about, I had come to that room with an implementation plan, a funded timeline, a consent form, and the professional confidence of a person who had walked a version of this meeting many times.
I thought I was the one doing the assessing.
I was wrong. I did not know it yet.
Reading the Room
Phiona walked in at nine. She was twenty-four, a representative of the refugee-serving organization she worked for, sent to evaluate whether this larger NGO, about to host a new peer navigation study, would be safe for the young women her organization was about to place inside it. She took under ninety seconds to read the room before she sat down. Not scanning for threats in any dramatic way. Just looking. The quality of light through the window. The language printed on the intake forms on the folding table. The way the staff moved around each other. Whether the researcher at the table was checking his phone. She was conducting an assessment for which the peer-reviewed literature does not yet have a name. She was doing it in under two minutes. The assessment would decide whether the young women she worked with, sex workers and LGBTQ youth in a city whose penal code policed both their occupation and their intimacy, walked into a study or did not.
She stayed. She became one of twelve peer navigators in Tushirikiane, a program whose name came not from the grant but from the community. Before the grant existed, we ran months of listening sessions across Kampala, and inside that listening the word surfaced in the mouths of people who were already doing the thing it named. Tushirikiane: let us hold each other up. We wrote it into the funded document because it was already being spoken. The document carried it forward. It did not invent it.
Phiona worked across four trials, a pandemic, and the passage of Uganda’s Anti-Homosexuality Act of 2023. The Tushirikiane program produced, across six years, forty-five peer-reviewed papers in journals that print the names of researchers. Her name does not appear in any of them, because peer review does not have an authorship category for the person who made the research possible. She is still working, in 2026, in Kampala. She is still reading rooms.
UNHCR does not know Phiona exists.
I do not mean this as a personal criticism of the agency. In September 2025, UNHCR published its AI Approach document and confirmed active deployment of algorithmic systems to assist with Refugee Status Determination (RSD), early-warning displacement forecasting, and detection of online threats to displaced persons. The RSD piece works by analyzing the procedural record of each case, identifying delays from scheduling conflicts, interviewer shortages, or transportation barriers. The document is carefully written. It frames AI as augmenting protection officers, not replacing them. It discusses equity, data governance, and digital rights. It is twenty-three pages long. Three months later, UNHCR Innovation published on community engagement in practice: AI analysis of refugees’ Telegram channels and Facebook posts, mapped by protection concern. The words “peer navigator” do not appear.
That absence is not random. The work cannot be digitized, so the system does not see it.
The Invisible Nervous System
Refugee Status Determination is the process that decides whether someone is a refugee at all, with all the legal protections and resettlement pathways that designation unlocks. Speed matters in RSD; backlogs are real; waiting years in legal limbo causes documented harm. I understand the case for throughput.
But protection is not a throughput problem.
In a 2025 study of urban displaced youth in Kampala’s informal settlements, we found that even participants with the highest levels of digital health literacy had suboptimal awareness of and access to sexual health services, with significant gaps by gender. High digital access did not translate into high service uptake. The navigators in the program were the variable that made the translation. Not the apps. Not the SMS messages. Not the validated instruments. The person who knew the neighborhood, who knew which clinic had a nurse who would not ask questions, who knew which participant needed to be walked to the door and which needed to be left to decide alone.
A 2025 study in Cambridge Prisms: Global Mental Health found that unpaid peer refugee helpers follow significantly higher depression trajectories than their paid counterparts. Not slightly higher. Significantly. The people doing the holding work that makes any digital humanitarian system usable at the community level are absorbing a cost that appears in no budget line and that no AI architecture accounts for.
Faster decisions will produce the appearance of improved protection. The peer navigators doing the interpretive work that makes a refugee’s account of their own life legible to any processing system will not be in the room when the efficiency gains are counted. They will still be in the actual room, doing translation that goes far beyond language, reading what the algorithm cannot read, holding what the algorithm cannot hold. And at some point a program manager facing a budget cut will look at the headcount line and note that the AI is already doing the processing. So what, exactly, are we paying these community workers for?
I have watched this logic move through global health for two decades. It is not dramatic and it is not malicious. It is just what optimization does when the thing being optimized is care.
What the Document Needs
What UNHCR’s AI Approach document needs, and does not have, is a theory of the labor sitting between its algorithms and the people those algorithms are supposed to protect. That labor is not informal or volunteered. It was never an add-on to the formal humanitarian system; it is the nervous system of that system, and it belongs overwhelmingly to people who are themselves refugees, carrying the same risks as the people they navigate, reading rooms since long before any of us thought to write a grant about it.
Whether AI belongs in RSD is a debate worth having. The question we keep not having is simpler: will we name what is already doing the work before scaling makes it invisible on paper and defundable in practice?
Phiona walked into a room in 2019 and read it before she sat down. Outside, the chapati oil cooled on the jikos. The red dust settled on the windowsill. The instrument the field had not funded kept running. It has been running for six years now, in one form or another, across four trials and forty-five papers and a city that tried, with increasing legal force, to pretend that some of the people it was holding did not exist.
The machine cannot hold. Phiona can. Someone needs to write that into the document before the document writes her out of the budget.
The post The Machine Cannot Hold appeared first on African Arguments.