Behind the Paper

From Algorithmic Control to Stewardship: A New Framework for Urban Climatic Justice

In an era of climate collapse and rapid urbanization, cities are increasingly turning to artificial intelligence (AI) to manage complex challenges. From optimizing traffic flow to rationing water, algorithms are becoming central to urban governance. But what happens when these "smart" systems, often praised for their efficiency, end up deepening existing social and spatial inequalities?

In my recent article, "Algorithmic Stewardship for Urban Climatic Justice," published in City and Built Environment, I argue that the current technocratic approach to "smart cities" is inadequate. Instead, we need a transformative shift towards algorithmic stewardship—a framework that retools urban AI to serve climatic justice rather than the fetishization of efficiency.

The Problem: When "Smart" Cities Make Inequality Smarter

Using Tehran as a pivotal case study, the paper reveals how algorithms, despite their technical elegance, can weaponize optimization:

  • Heat-Vulnerability Maps that equate adaptive capacity with property values, effectively erasing the sophisticated thermal knowledge of informal settlements.

  • Water Rationing Algorithms that prioritize supply to high-revenue industrial zones, systematically dehydrating peripheral communities and encoding "hydrological apartheid."

  • Disaster-Response Systems that triage human lives, valuing high-rise commercial districts over informal settlements in flood scenarios—a process I term computational necropolitics.

These aren't mere technical failures; they are design features of systems that optimize within unjust frameworks rather than challenging them.
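To see why this is a matter of design rather than malfunction, consider a deliberately simplified, entirely hypothetical sketch of the triage logic described above. The zone names, numbers, and objective are invented for this post and drawn from no deployed system; the point is only that once "benefit" is defined as monetized loss averted, the exclusion follows from the objective function itself.

```python
# Purely hypothetical illustration: a triage objective that maximizes protected
# asset value. Every name and number below is invented for this blog post.

from dataclasses import dataclass

@dataclass
class Zone:
    name: str
    flood_risk: float   # probability of severe flooding, 0..1
    asset_value: float  # recorded economic value; informal housing is badly under-counted
    population: int

zones = [
    Zone("high-rise commercial district", flood_risk=0.4, asset_value=900.0, population=5_000),
    Zone("informal settlement",           flood_risk=0.4, asset_value=30.0,  population=50_000),
]

def expected_loss_averted(zone: Zone) -> float:
    # The objective under critique: "benefit" equals monetized loss averted,
    # so people without recorded property barely register.
    return zone.flood_risk * zone.asset_value

budget = 1  # only one protection team available this hour
protected_first = sorted(zones, key=expected_loss_averted, reverse=True)[:budget]
print("protected first:", [z.name for z in protected_first])
# -> the commercial district, every time, despite ten times more residents at equal risk
```

No parameter here is mis-set and no line is buggy; the exclusion is produced entirely by what the objective chooses to count.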

The Solution: The Three Pillars of Algorithmic Stewardship

Algorithmic stewardship proposes a radical reconfiguration of urban AI, built on three core principles:

  1. Generative Transparency: Moving beyond "explainable AI" to make the value judgments within algorithms legible to the public. For example, an air-quality algorithm should not just report an index but reveal which industrial sectors are being sacrificed in its cost-benefit calculations, turning a black box into a site for public deliberation (a brief code sketch of this idea follows this list).

  2. Systemic Accountability: Hardcoding ethics into the system's logic. This means algorithms must be designed with built-in checks for intergenerational equity, ensuring that solutions to today's water crisis do not sacrifice the aquifer health that residents of 2070 will depend on. Accountability means an algorithm must, in effect, ask: "Will this decision displace low-income families? Does it account for the thermal wisdom of street vendors?"

  3. Deliberative Polycentrism: Distributing power away from centralized tech hubs. Instead of the city's innovation center having sole control, stewardship advocates for community data cooperatives where local knowledge—like traditional water-sharing patterns—can shape climate models, creating friction against one-size-fits-all techno-solutionism.
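To make the first two pillars more concrete, here is a minimal sketch, assuming a hypothetical air-quality controller; the sector names, thresholds, and field names are my own illustration rather than anything specified in the paper. The idea is simply that the system publishes a full decision record, value judgments included, instead of a bare index, and runs an equity check inside the decision rather than in a later audit.

```python
# A minimal illustration of pillars 1 and 2: the decision object carries its own
# value judgments and a built-in equity flag. All names, sectors, and numbers
# here are invented for this blog post.

from dataclasses import dataclass, field

@dataclass
class AirQualityDecision:
    air_quality_index: int
    sectors_curtailed: list[str]     # who is asked to cut emissions today
    sectors_exempted: list[str]      # who is spared, and at whose expense downwind
    assumed_cost_of_illness: float   # the monetized health assumption usually buried in the model
    equity_flags: list[str] = field(default_factory=list)

def decide(aqi: int) -> AirQualityDecision:
    decision = AirQualityDecision(
        air_quality_index=aqi,
        sectors_curtailed=["brick kilns"],
        sectors_exempted=["petrochemical complex"],
        assumed_cost_of_illness=120.0,  # an explicit, contestable parameter
    )
    # Pillar 2 in miniature: the check runs inside the decision, not in an after-the-fact audit.
    if "petrochemical complex" in decision.sectors_exempted:
        decision.equity_flags.append("exempted emitter sits upwind of low-income districts")
    return decision

print(decide(aqi=180))  # the whole record is published, not just the index
```

Publishing an object like this, rather than a single number, is what turns the algorithm's cost-benefit step into something a public forum can contest and demand be redesigned.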

Operationalizing Justice: From Principles to Practice

How do we bring this to life? The paper proposes concrete "institutional counter-architectures":

  • Algorithmic Courts of Equity: Public forums where citizens can formally challenge biased climate algorithms and demand their redesign.

  • Ecological APIs: Open data infrastructures that integrate community-generated knowledge, like the toxicity diaries of asthma patients or the dust-storm calendars of local herders, into official city models (a sketch of what such an interface could look like follows this list).

  • Democratic Immunity Zones: Protecting areas of urban life from computational incursion, such as cultural practices around traditional wind-catchers (badgirs) or local assembly control during crises.
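As a rough illustration of the Ecological API idea, the sketch below imagines a common, open schema through which community records (an asthma patient's toxicity diary, a herder's dust-storm calendar) could enter official models alongside sensor feeds. The schema, field names, and consent logic are my own assumptions for the sake of the example, not a specification from the paper.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CommunityObservation:
    source: str          # e.g. "asthma toxicity diary", "herders' dust-storm calendar"
    observed_on: date
    location: str        # the neighborhood or rangeland name the community itself uses
    phenomenon: str      # "breathing difficulty", "dust storm", "dry qanat", ...
    severity: int        # 1-5 scale, defined together with the contributing community
    steward: str         # the data cooperative that holds consent and can withdraw it

def ingest(records: list[CommunityObservation]) -> list[CommunityObservation]:
    """Accept records into the city model only while a stewarding cooperative consents."""
    return [r for r in records if r.steward]  # placeholder consent check for this sketch

accepted = ingest([
    CommunityObservation(
        source="herders' dust-storm calendar",
        observed_on=date(2024, 4, 12),
        location="eastern rangelands",
        phenomenon="dust storm",
        severity=4,
        steward="neighborhood data cooperative",
    )
])
print(len(accepted), "community record(s) merged into the official model")
```

Keeping the steward field, rather than an anonymous upload endpoint, is what ties the data infrastructure back to deliberative polycentrism: the cooperative, not the city's innovation center, decides whether the knowledge stays in the model.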

A Path Forward

The transition to algorithmic stewardship demands a new research and action agenda focused on:

  • Epistemic Resistance: Valuing subaltern climate knowledges and creating "epistemic asylum zones" in AI training data.

  • Planetary Algorithmic Literacy: Equipping citizens to question computational decisions that affect their survival.

  • Post-Anthropocentric Custodianship: Granting legal representation and rights to non-human entities, like watersheds and tree canopies, in governance frameworks.

Conclusion

The choice is stark. We can continue down the path of computational authoritarianism, where climate adaptation becomes a vehicle for enforcing privilege. Or we can embrace algorithmic stewardship to ensure that urban AI is a platform for repair and justice. For Tehran, and for all cities facing the climate crucible, this is not an intellectual abstraction—it is a survivability imperative. The future of our cities will be defined by algorithms; stewardship is our chance to ensure that future remains human.


Read the full research article for free:
Dorostkar, E. Algorithmic stewardship for urban climatic justice. City Built Environ 3, 21 (2025). https://doi.org/10.1007/s44213-025-00065-4
