The Politics of Humanitarian Data Infrastructure: Who Owns the System When Everyone Walks Away?
By Alex Nwoko
I wrote the email at 11am. It went to over 115 organisations — UN clusters, NGOs, working groups, coordination bodies — all of which relied on the nationwide humanitarian reporting platform I helped manage as programme coordinator. The message was simple and devastating: the platform's sole donor had frozen all funding. Operations were being suspended immediately. There was no phased transition. No bridge funding. No contingency plan. No advance notice. The system that an entire country's humanitarian coordination depended on was going dark.
I knew, as I pressed send, what would happen next. I had spent that entire week receiving similar emails from other partners — their own USAID programme suspension notices arriving one after another. I had built enough data systems across several countries to understand that what was about to unfold was not a technical failure. It was a political one — a structural collapse that had been designed into the system from the beginning, waiting for the moment when a single point of failure would be tested.
Afghanistan in 2025 was that moment.
What Happened When the Platform Went Dark
The sequence was predictable in hindsight and catastrophic in practice.
The United States had been funding 43% of all humanitarian aid to Afghanistan — approximately $562 million. When the funding freeze hit, it did not arrive with a transition plan. It arrived as a stop order. The implementing organisation I worked for — the organisation that built, maintained, and hosted the platform — had no independent revenue stream for this programme. The platform ran on a single donor's money. When that money stopped, the platform stopped.
The consequences rippled outward in concentric circles of institutional failure. The lead UN coordination agency cancelled planned meetings with the implementing organisation and excluded it from critical information management discussions — institutional preservation in real time, distancing itself from a partner that could no longer deliver. Partners who had built their reporting workflows around the platform were left without access to essential humanitarian data mid-response. Cluster leads lost their evidence base. Working groups lost their analytical inputs. The shared picture of who was doing what, where, for whom simply vanished.
The reputational risk landed squarely on the implementing partner — even though the structural failure was never theirs alone to prevent. The donor decided to freeze funding. The coordination body decided to cut ties. The partners had no alternative system. Every actor retreated into self-preservation. Nobody fought for the shared infrastructure — because nobody owned it enough to fight for it.
The Power Map Nobody Draws
What the Afghanistan experience exposed is a power structure in humanitarian data infrastructure that everyone navigates but nobody maps.
The donor controls funding. A single government funded nearly half of all humanitarian operations in Afghanistan. One political decision in Washington collapsed humanitarian data infrastructure in over 50 countries in real time — because the funding model never required diversification or contingency. What happened in Afghanistan and several other countries was a perfect storm, arriving just as major donor governments were competing to cut humanitarian budgets: Germany, the UK, France, Japan, and Saudi Arabia all reduced aid spending simultaneously. Total global humanitarian funding fell from $37 billion in 2024 to $20.5 billion in 2025 — its lowest level in a decade. The Council on Foreign Relations called it "the great aid recession". The Carnegie Endowment described it as a "painful, seismic shift" — not a temporary dip but a structural contraction in the global development partnership.
The UN coordination body controls legitimacy and access. The lead coordination agency determines whose data is authoritative and which platforms are endorsed. When funding was cut, its decision to distance itself from the implementing partner was a withdrawal of legitimacy — the platform's technical capabilities had not changed, only its funding.
The implementing partner controls the platform. But operational control without financial independence is an illusion. The implementing partner could not keep the platform running without the donor's money, could not transfer it without the coordination body's endorsement, and could not preserve partner access without both.
The government controls sovereignty — in theory. In principle, the government of Afghanistan — like any sovereign state — has the right and responsibility to own its humanitarian data infrastructure. But Afghanistan presented a familiar dilemma: a globally unrecognised Taliban leadership, barred under multi-donor funding agreements from accessing data on Afghan populations for understandable protection reasons — a topic explored further below. Even setting aside this legitimacy constraint, a broader reality applies across most developing-country contexts: the capacity to absorb a nationwide reporting platform overnight is nonexistent. Sovereignty without capacity is a constitutional right without operational meaning.
But Afghanistan exposes an even deeper dilemma — one that the humanitarian data community has barely begun to articulate.
The Data Ownership Dilemma Under Contested Legitimacy
What happens to data sovereignty when the international community does not recognise the government that claims it?
Afghanistan under Taliban rule is not a failed state. It is a de facto authority — an entity that exercises effective territorial control, provides basic governance functions, and administers the population, but lacks international recognition. The Taliban have not been recognised by most UN Member States, and most donor countries as of 2025 were not maintaining a formal embassy in Kabul. Moreover, the donor conditions attached to humanitarian funding — particularly from the United States — explicitly prohibit sharing proprietary data, programme information, and institutional resources with the Taliban administration.
This creates an extraordinary paradox for data infrastructure. The humanitarian sector's best-practice principle is sovereign government ownership of data systems — build for the government, anchor in national institutions, transfer administrative control. But when the governing authority is sanctioned, unrecognised, or classified as a designated entity under counter-terrorism legislation, that principle collides with the legal and political conditions attached to the funding that built the system in the first place.
Afghanistan is not alone in this predicament. Nearly 200 million people live in areas where non-state armed actors or de facto authorities exercise some degree of territorial control. In Yemen, the Houthis have seized equipment — laptops, routers, communication devices — from UN agencies and NGOs, crippling their ability to manage data and deliver aid. The Houthi resistance to WFP's biometric registration system was driven not by data protection concerns but by geopolitical sovereignty claims over population data. In Sudan, both the Sudanese Armed Forces and the Rapid Support Forces have used bureaucratic control — visa restrictions, customs seizures, travel permits — to restrict humanitarian data flows and operational access. In Libya, competing administrations in Tripoli and the east have each claimed authority over humanitarian coordination, creating parallel data governance structures with no unified national owner.
In each of these contexts, the data infrastructure question is not simply "who hosts the server?" It is: to whom can you legally, ethically, and operationally transfer data sovereignty when the entity that controls the territory is the entity your donor prohibits you from engaging with?
This is the data ownership dilemma in contested legitimacy — and it has no clean resolution. The IASC Operational Guidance on Data Responsibility establishes principles for data protection in humanitarian action, but it was not designed for contexts where the sovereign authority itself is the data protection risk. The USAID Inspector General's assessments of Afghanistan programming documented the tension between operational necessity and anti-terrorism compliance — a tension that extends directly to data infrastructure ownership. And the academic literature on digitisation and sovereignty in humanitarian space has identified the fundamental problem: humanitarian organisations depend on grants of sovereign authority to operate, but the digital infrastructure they build generates data assets whose ownership is contested by the very authorities that granted access.
The practical consequence is paralysis. Data systems in these contexts cannot be transferred to the de facto government (donor conditions prohibit it), cannot remain with the implementing partner indefinitely (funding is temporary), and cannot be handed to the UN coordination body (which lacks the technical infrastructure and mandate to host them). The data sits in an institutional no-man's-land — owned by everyone in principle, controlled by no one in practice, and vulnerable to exactly the kind of overnight collapse that Afghanistan demonstrated.
Nobody controls continuity. This is the structural flaw. Continuity — the thing that matters most to the 115+ organisations whose daily coordination depends on the platform — is a shared responsibility that no single actor is mandated, funded, or structured to deliver. Every actor has a legitimate mandate. None of those mandates include ensuring that the shared data infrastructure survives when any one of them walks away.
This Is Not an Afghanistan Problem
It would be comforting to treat this as a unique failure — a perfect storm of political disruption, donor concentration, and institutional dysfunction specific to one country. It was not. Afghanistan was a stress test that revealed a system-wide architectural flaw.
The evidence is now overwhelming. The State of Open Humanitarian Data 2026, published by OCHA's Centre for Humanitarian Data, documented that crisis data availability fell from 74% to 68% across 22 humanitarian operations. OCHA's own information management capacity was cut by approximately 25%. UNHCR and IOM — two of the largest operational data producers in the system — saw data staff reductions of approximately 40%. The Centre for Humanitarian Data warned that "2024 may be the high-water mark of data availability for years to come".
The Center for Global Development framed it as "the coming humanitarian data drought". UN News reported budget cuts "devastating data gathering". Devex documented the broader collapse: humanitarian funding fell to $20.5 billion — its lowest level in a decade. And OCHA's Afghanistan assessment found that 78% of coordination positions at national and sub-national level were expected to be impacted. These are the information managers, GIS officers, and cluster coordinators who produce the analytical outputs that decision-making depends on.
The pattern is structural, not incidental. Humanitarian data infrastructure globally is built on the same fragile foundations: single-donor dependency, implementing-partner-hosted platforms, coordination mechanisms that assume continuous funding, and an absence of contingency protocols for when those assumptions fail.
The Architecture of Resilience
What would a resilient humanitarian data infrastructure look like? Not a different platform — a different governance architecture.
Sovereign government hosting. Data infrastructure that serves a country's humanitarian coordination should be hosted on infrastructure that the country's government controls. When the implementing organisation leaves — or is forced to leave — the data stays. The UNDRR Strategic Framework 2026-2030 identifies this principle as a critical gap requiring systematic attention.
Diversified, multi-donor funding. No data platform that serves an entire country's coordination should depend on a single donor. This requires pooled funding mechanisms, cost-sharing agreements, and minimum reserve requirements that guarantee operational continuity during transition periods.
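These rules can be made auditable rather than aspirational. A minimal sketch in Python, using entirely hypothetical figures — the 40% concentration cap and six-month reserve floor below are illustrative assumptions, not sector standards:

```python
from dataclasses import dataclass

@dataclass
class FundingProfile:
    donors: dict[str, float]   # donor name -> annual commitment (USD)
    reserves: float            # unrestricted reserves (USD)
    monthly_burn: float        # platform operating cost per month (USD)

def concentration(profile: FundingProfile) -> float:
    """Share of total funding held by the largest single donor."""
    total = sum(profile.donors.values())
    return max(profile.donors.values()) / total

def runway_months(profile: FundingProfile) -> float:
    """Months the platform survives on reserves alone."""
    return profile.reserves / profile.monthly_burn

def audit(profile: FundingProfile,
          max_share: float = 0.4,     # illustrative concentration cap
          min_runway: float = 6.0) -> list[str]:
    """Flag violations of the diversification and reserve rules."""
    issues = []
    if concentration(profile) > max_share:
        issues.append("single-donor concentration exceeds threshold")
    if runway_months(profile) < min_runway:
        issues.append("reserves cover less than the minimum transition period")
    return issues

# Hypothetical platform funded almost entirely by one donor, with
# thin reserves — roughly the Afghanistan pattern described above.
profile = FundingProfile(
    donors={"Donor A": 4_500_000, "Donor B": 500_000},
    reserves=300_000,
    monthly_burn=150_000,
)
print(audit(profile))  # flags both rules: 90% concentration, two months of runway
```

A governance body that ran this kind of check annually would have seen the Afghanistan failure mode on paper years before a stop order made it real.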
Mandatory contingency protocols. The Afghanistan platform had no contingency plan for donor withdrawal — no bridge funding, no phased transition, no data escrow. Every humanitarian data platform should have a documented protocol specifying what happens when the primary donor withdraws, how long operations can continue on reserves, and how partner data is preserved during any transition.
Data continuity agreements. Partner data submitted to a coordination platform must remain accessible regardless of the platform's operational status. Data escrow — standard in commercial software — is virtually nonexistent in humanitarian data systems. The Grand Bargain 2.0 provides a policy framework, but the operational mechanisms have not been built.
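What a minimal escrow mechanism involves can be sketched in a few lines. Assuming a scheduled job writing to a store outside the implementing partner's control — the function name and record fields below are hypothetical — each export carries a checksum so any party can later verify the escrowed copy without trusting the platform that produced it:

```python
import hashlib
import json
import tempfile
import time
from pathlib import Path

def escrow_snapshot(records: list, escrow_dir: Path) -> Path:
    """Write a timestamped, checksummed export of partner submissions
    to a directory representing the independent escrow host."""
    escrow_dir.mkdir(parents=True, exist_ok=True)
    payload = json.dumps(records, sort_keys=True).encode("utf-8")
    digest = hashlib.sha256(payload).hexdigest()
    stamp = time.strftime("%Y%m%dT%H%M%S")
    out = escrow_dir / f"snapshot-{stamp}.json"
    out.write_bytes(payload)                       # the data itself
    out.with_suffix(".sha256").write_text(digest)  # integrity proof
    return out

# Demo: hypothetical partner submissions, with a temporary directory
# standing in for the independently hosted escrow store.
records = [
    {"partner": "NGO-A", "activity": "food distribution", "district": "D1"},
    {"partner": "NGO-B", "activity": "health outreach", "district": "D2"},
]
snapshot = escrow_snapshot(records, Path(tempfile.mkdtemp()) / "escrow")
```

The point is not the code but the asymmetry it removes: partners can verify that their data survives even if the platform does not.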
Intersectoral governance that assigns continuity. Someone must own continuity — not the platform, not the data, but the ongoing availability of the shared coordination infrastructure. This means a continuity mandate assigned to a specific body, ideally the coordination mechanism itself, with the authority and resources to ensure the system survives the withdrawal of any single actor.
The Conversation Nobody Wants to Have
The reason this architecture does not exist is not technical. It is political: building resilient data infrastructure requires every actor to cede some control. Donors must accept that funding does not buy unilateral control over continuity. Coordination bodies must accept responsibility for the infrastructure they endorse. Implementing partners must accept that the platforms they build belong to the coordination mechanism. Governments must invest in the capacity to host and govern these systems.
The humanitarian data drought is not a future risk. It is a present reality. The communities that depend on these systems — the 23.7 million people in need of humanitarian assistance in Afghanistan alone — are losing the data infrastructure that enables their response to be coordinated, targeted, and accountable. The question is not whether we can afford to build resilient data governance. The question is whether we can afford not to — knowing what happens when a single email at 11am can take an entire country's coordination infrastructure offline.