
Code, Power, and Responsibility
by Wisdom Obinna, 2025 Design & Technology Fellow
The most unsettling lesson from visiting sites where atrocities occurred is not that evil once existed. It is that it was professionally effected. Auschwitz’s geometry—the sizing of blocks, the routing of transport, and the calibrated separation of functions—was not improvised savagery but implemented design. The precision of records, timetables, and construction drawings did not merely document murder. It enabled it. Confronting this history as a designer and technologist collapses the comfortable distance between past and present. The shock is not one of novelty but of recognition: we know these diagrams, these process charts, these optimization problems. They are familiar because they are kin to our own habits of mind.
FASPE insists that what we call “technical” work is never morally inert. The Holocaust was not only a political project but a professional one. It required architects who refined plans, engineers who improved throughput, communications staff who transformed orders into repeatable procedures, and business managers who synchronized logistics at scale. The language that shielded them was procedural: requirements, efficiency, standards, compliance. The vocabulary of a competent workplace became a veil through which the human objects of those functions disappeared. That pattern endures. We have our own euphemisms—data harvesting, engagement optimization, content moderation—that allow us to continue building while suspending judgment about what our systems do to the people who live under them. To misname is to misrecognize; the vocabulary of abstraction is often the first instrument in the abdication of responsibility.
This is not a claim about uniquely wicked individuals; rather, it is a claim about normal professional incentives. The historical record reveals how complicity evolves incrementally. Professionals did not wake one morning as villains. They were diligent employees who met deadlines, surpassed targets, and earned commendations, all while their craft was redirected from serving life to processing it. The gradient matters: one accepts a narrowed task definition (“I only handle scheduling”), then an expanded metric (“throughput is success”), then a taboo on questions (“ethics lives elsewhere”), until the sum of reasonable steps creates an unreasonable system. FASPE’s most bracing pedagogy is to make that gradient visible and then ask where we stand on it now.
The temptation in contemporary technology is to shelter in the myth of neutrality—tools have no politics, only applications do. However, technology is never introduced into a vacuum; it enters institutions, power hierarchies, and markets that shape both its purpose and its effects. Design sets defaults, narrows choice sets, encodes classifications, and assigns errors to some bodies more than others. An algorithm that “merely predicts” takes sides the moment its errors are borne unequally. A content system that “merely maximizes engagement” recasts attention scarcity into a weapon against deliberation. A deployment that “merely scales” multiplies whatever harms are already present in the pilot. Neutrality is not protection. It is alignment with the status quo.
To resist that alignment, we must name three professional habits that FASPE makes impossible to ignore.
First, the habit of abstraction. Abstraction is indispensable to modern engineering; it is also the path by which persons become variables. When a life is compressed into features for modeling, the choice of features—what counts, at what resolution, measured by whom, against what ground truth—is a moral choice disguised as a technical one. The history we studied in Berlin made this painfully clear: bureaucratic forms created categories that determined fate. Today, our pipelines—data collection, labeling, loss functions, thresholds—do the same work under different names. The corrective is not to abandon abstraction (that would be to abandon practice) but to contest it, to expose whose perspectives it encodes, whom it mismeasures, and whose story it excludes. Abstractions should be audited not only for accuracy and bias but also for representational justice: do the modeled have agency in how they are modeled? Do those who bear the harms of misclassification consent to being classified? Can the system be refused?
Second, the habit of diffusion. Modern organizations distribute work across teams and time. Diffusion is productive, but it is also anesthetic. When no one person sees the whole, accountability evaporates into process. FASPE’s historical profession case studies (from crematoria optimization to transport logistics) show how division of labor softened consciences. Each contributed a “harmless” part whose integration proved lethal. Contemporary tech repeats this pattern: research writes a paper, platform integrates an API, growth tunes a metric, sales lands a client, policy drafts a statement. Everyone did their job, but no one owned the outcome. The professional remedy is structural: build deliberative choke points where cross-functional review can stop a launch; require chain-of-responsibility memos that name who is answerable for which externalities; and ensure red-team authority cannot be overruled by the same incentives that reward speed.
Third, the habit of amnesia. Progress culture prizes novelty. But the forgetting that attends this lust for speed is not innocent. The sites we visited in Poland and Germany were not merely memorials; they were arguments about the civic function of memory. In engineering, we ritualize “postmortems,” yet we rarely turn them into institutional memory that changes what we build next. We teach Bhopal, Tuskegee, and other canonical failures as isolated tragedies with tidy lessons learned, not as warnings about how social stratification, profit pressure, and bureaucratic dilution reliably reassemble the same risk conditions. FASPE’s counter-pedagogy embeds historical literacy into professional identity: knowing the lineage of your tools, the politics of your standards, and the uses to which your predecessors’ inventions were put—not to indict yourself by association, but to sharpen your imagination.
From these habits follow professional obligations that are practical and exacting.
Design with situated accountability. If a system makes a consequential classification, those classified must have avenues to contest, correct, or refuse participation. Build contestability into the interface, not as reputational theater but as an operational requirement with time-bound response SLAs and escalation paths that cannot be routed around by product timelines.
Document value choices at decision time. Every major modeling or product decision should be accompanied by a short, public-facing statement of the tradeoffs accepted, the metric optimized, the populations prioritized, the harms foreseen, the unknowns accepted, and the rationale for proceeding. Make this document auditable and binding across re-orgs. Ethics that cannot survive an org-chart change are not ethics.
Rehumanize evaluation. Move beyond aggregate performance to distributional impacts. Report error stratified by demographic proxies and context of use; measure false positive/negative asymmetries as first-order regressions, not footnotes. Refuse claims of “objective” accuracy absent evidence that the people bearing the consequences of mistakes had a voice in defining success.
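What stratified reporting can look like in practice is sketched below; the group labels, data, and dictionary shape are illustrative assumptions, not a prescribed standard:

```python
from collections import defaultdict

def stratified_error_report(y_true, y_pred, groups):
    """Report false positive and false negative rates per group.

    y_true, y_pred: sequences of 0/1 labels and predictions.
    groups: parallel sequence of group identifiers
            (e.g. a demographic proxy or context-of-use tag).
    Returns {group: {"fpr": ..., "fnr": ..., "n": ...}}.
    """
    counts = defaultdict(lambda: {"fp": 0, "fn": 0, "pos": 0, "neg": 0})
    for truth, pred, group in zip(y_true, y_pred, groups):
        c = counts[group]
        if truth == 1:
            c["pos"] += 1
            if pred == 0:          # missed positive
                c["fn"] += 1
        else:
            c["neg"] += 1
            if pred == 1:          # spurious positive
                c["fp"] += 1
    return {
        g: {
            "fpr": c["fp"] / c["neg"] if c["neg"] else None,
            "fnr": c["fn"] / c["pos"] if c["pos"] else None,
            "n": c["pos"] + c["neg"],
        }
        for g, c in counts.items()
    }
```

A single aggregate accuracy figure would hide exactly the asymmetries this report surfaces; publishing the per-group rates makes distributional impact a first-order output rather than a footnote.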
Create veto-power communities. Establish standing councils of affected stakeholders with real authority to pause or reshape features before scale-up. Compensate them for their time. Publish when their advice is rejected and why. Consultation without power is a ceremony; ceremony is how professions launder conscience.
Practice disciplined refusal. Some deployments should not exist in their proposed form, in these contexts, at this time. Normalizing “no” is not hostility to innovation. Rather, it is fidelity to the profession’s purpose. Write sunset clauses for sensitive capabilities; require renewal upon evidence of net good, not mere absence of scandal.
These obligations are integral to real engineering. In fact, they are engineering when human beings are the system boundary.
FASPE also reframes courage. The program’s most provocative claim is not that heroes were rare but rather that non-participation was often possible, and that when exercised collectively, it was powerful. The story of the Nazi euthanasia program offers documented cases of medical professionals who refused direct involvement and were reassigned rather than executed; while individual refusals did not stop the machine, widespread refusal would have slowed it. The lesson for technology is not naïve equivalence but strategic imagination: lone whistleblowing is not the only model of ethics. Coordinated micro-resistance, declining to build a feature without guardrails, insisting on pre-mortems for social risk, and collectively demanding external review for a high-risk client—these strategies shift the feasible set of corporate choices. Power compounds in networks. So does conscience.
A further challenge emerges from the spiritual economy of modern work: it rewards velocity, scale, and novelty, metrics that are indifferent to who absorbs error or who loses a voice. If we design only for what can be measured cheaply, we will design against those whose losses are expensive to detect. The corrective is methodological: pair quantitative excellence with participatory methods worthy of the name (not participation-washing). If the modeled are never in the room when modeling decisions are made, we have not built a product but a governance regime without consent.
There is a final discipline FASPE demands, and it is quieter than the rest: staying with the trouble of memory. The human tendency after exposure to atrocity is to let the mind protect itself. Daily life returns with its emails, dashboards, and sprints. Memory recedes. But work that shapes other people’s lives requires friction with that forgetting. Institutionalize it. Begin design cycles with a five-minute reading from a case file that went wrong. Open post-launch reviews with testimony from those harmed. Rotate engineers through support queues so the cost of error is not a rumor. These are not rituals of shame; they are devices that keep professionals answerable to the people for whom their abstractions stand.
Elie Wiesel’s admonition that neutrality helps the oppressor has been invoked often enough to risk cliché. It becomes real when it is translated into meetings, tickets, and roadmaps. Neutrality looks like approving a model card without the failure slice that would have complicated a launch, like accepting a client clause that prohibits disclosure of known risks, like congratulating ourselves on “responsible AI principles” while budgeting them out of the delivery plan. The cure is not rhetoric. Instead, it is redesigning the workflow so that neutral choices are no longer the path of least resistance.
The measure that matters in the end is simple and severe: did our systems preserve human agency and dignity for those with the least power to demand it? Elegance without that outcome is vanity. Scale without it is harm at speed. The profession we inherit is the same one we are building now—in documentation practices, in staffing and incentives, in whose stories enter the room when we decide what counts as a problem worth solving. FASPE’s gift is to remove innocence as an alibi. We know enough of our history to act differently, and to design as though memory were a component, not a mood.
We will not be judged only by what we could build but also by what we refused to build until it could be made just. That standard is exacting. It is also the only one worthy of a profession that lays claim to shaping the future.
Wisdom Obinna was a 2025 FASPE Design & Technology Fellow. He is currently a doctoral student in Computer Science at Georgetown University.