
The Tech Examen
by Lindsay Sanneman, 2024 Design & Technology Fellow
Introduction
What I propose, therefore, is very simple: it is nothing more than to think what we are doing. - Hannah Arendt, The Human Condition
The hand of the Lord came upon me, and he brought me out by the spirit of the Lord and set me down in the middle of a valley; it was full of bones. He led me all around them; there were very many lying in the valley, and they were very dry. He said to me, “Mortal, can these bones live?” - Ezekiel 37:1-3
Our FASPE trip took us to many sites that speak to and memorialize professional involvement in the Holocaust. We visited the locations of early wartime atrocities, such as a forced labor camp, a euthanasia center near Berlin, and critical logistical and operational hubs like the House of the Wannsee Conference, where Nazi officials devised the Final Solution. Our journey culminated in a visit to Auschwitz, the ultimate symbol of the Third Reich’s genocidal regime.
Throughout this process, we reflected on key questions facing professional lawyers, business owners, and engineers who contributed to the design and operation of these sites. For example, we discussed the engineers of Topf and Sons, which was founded in Erfurt, Germany in 1878 as a developer of heating systems and crematoria.14 Under the Nazi Regime, Topf pivoted to meet the needs of the regime, and rather than designing crematoria that enabled dignified interment preparations for grieving families, Topf engineers began to optimize ovens to maximize throughput and minimize conspicuous odors. Some engineers, such as Kurt Prüfer, even put in extra hours, visiting Auschwitz throughout the design process and going beyond the required design specifications.6 How did so many regular professionals, many of whom seemed motivated by mundane career ambitions, become convinced to perpetuate such an atrocity?
The shift from “business as usual” in the post-WWI economy to the “death economy” of WWII did not take place overnight. An economic downturn and national shame in the wake of Germany’s defeat in WWI slowly eroded German people’s trust in the Weimar Republic.3 Once the Nazis came to power in 1933, Joseph Goebbels, the Nazi minister of propaganda, helmed a powerful apparatus that provided convenient scapegoats. People who were already on the margins of society were increasingly portrayed as dangerous “others” who posed risks to the dominant Aryan race.3,8 This situation allowed for the slow development of more extreme circumstances: first, forced labor camps filled with criminals and prisoners of war became broadly socially acceptable. Next, mass euthanasia of disabled or terminally ill persons, while unsettling to many, became an unspoken norm during the war. Finally, most Germans looked the other way as the Nazis undertook the systematic extermination of an entire race of people through death camps.
While top leaders of the Nazi regime like Goebbels served as thought leaders in German society and were central to shifting public opinion towards willful ignorance or even positive acceptance of the Holocaust, tens of thousands of regular citizens needed to passively or actively collaborate to make these atrocities possible. Over the course of the Nazi period, it became easy for career-motivated non-ideologues within the professions to either slip unquestioningly into support of the state or to convince themselves that their small, indirect, or ethically neutral role within the mass murder machine exonerated them from any culpability. One extreme example is Albert Speer, a leading architect within the regime who was ultimately responsible for the entire German industrial system. After the war, Speer claimed that he was a “pure technocrat unconcerned with ethical and political tasks” despite serving as armaments minister throughout the Third Reich.11
Learning about professionals like Speer and the Topf engineers left us with more questions than answers: Could things have been different, and if so, how? Who had the power to recognize the potential for change and actually effect it within the Nazi context? How could this power have been wielded effectively? Finally, and most importantly, how can we prevent ourselves in our own modern professional contexts from falling into similar traps?
From Nazi Germany to Modern Tech: The Power of Reflection
As a technologist, I was particularly struck by the similarities between the stories of Nazi professionals and my own experiences within modern tech. Engineers at that time, like engineers and technologists today, had the power to design and shape the world around them, either towards human flourishing or towards the hyper-optimized exploitation of marginalized people and even death. Engineering and design choices in the modern day, as in WWII, can either help or hinder systems of oppression. As technology designers, we must be mindful of the power we wield in the work we do if we hope to prevent harmful applications of new technologies from proliferating.
It has become all too easy to reassure ourselves that the technology we build is, at worst, morally neutral and that any given technological innovation cannot be good or bad in itself. In this view, only its applications have moral value. At the other extreme, the techno-optimist narrative of progress promises that technology can solve all problems and that the rising tide of innovation ultimately lifts all boats.10 In other words, as long as we invest in technological innovation, economic prosperity will increase, and its benefits will be shared broadly. However, the reality is that every technology is designed and deployed with specific ends in mind and within complex socio-technical contexts. As a result, all technological development is rooted in fundamentally political questions.22 History has demonstrated time and again that the degree of shared economic prosperity resulting from technological innovation depends on choices about what is developed and how. Broadly shared prosperity is by no means guaranteed.10
As modern technologists, it is not enough for us to accept narratives of progress or tech neutrality at face value. Uncritical approaches to technology development can prevent the broad sharing of societal benefits made possible by new technologies. Without critical assessment of our work, we can be blinded to the ways that new technologies reinforce existing systems of exploitation or marginalization or even produce novel ones. Our best approach, then, is to critically and continually reflect on what we are creating, how it might be applied, and the broader socio-technical consequences of our design and implementation choices.
The Jesuits and the Daily Examen
While this critical reflection is of the utmost importance, identifying the ways that modern tech is wrapped up in perpetuating or innovating systems of marginalization can be overwhelming. Given the complexity and global nature of modern socio-technical systems, it can be difficult to know where to start. Luckily, as technologists, complex systems are our wheelhouse. If we are capable of designing planes that fly, operating rovers on Mars, and developing generative artificial intelligence (AI) models with billions of parameters, perhaps we are also capable of thinking broadly and critically about the complex socio-technical impact of our work. In tech, many professional engineers like myself think of ourselves as problem solvers, and indeed, historically, our community has solved many amazingly complicated technical problems. If we hope to be ethical designers, we must apply these same problem-solving skills to thorny, professional moral issues.
In order to tackle complex technical problems, engineers break them into smaller, more manageable pieces. Individual engineers work on these smaller-scoped challenges and then coordinate to integrate their components into a functioning whole. We can take a similar approach in reflecting on the broader socio-technical impact of our work, with individual technologists reflecting on their roles and decisions and how these relate to broader problems. Engineers can then come together and coordinate to address socio-technical challenges as a group.
The Jesuits, a religious order within the Catholic tradition, offer a wealth of resources and practices that both those within and outside of the order leverage for individual and communal discernment, that is, the practice of thoughtfully and intentionally reflecting on decisions, including moral or vocational choices.16 One such practice, called the daily examen, invites practitioners to mentally review and evaluate the decisions they make each day.17 Practitioners begin by expressing gratitude for the gifts they have in their lives. They then move on to reviewing their actions from the day and the emotions that different moments throughout the day evoke in them. Next, they focus on one or a few moments and how their actions either did or did not align with their values. Finally, practitioners complete the daily examen by shifting their focus towards the next day, contemplating how they can more fully act in line with their values.
The examen can be practiced anywhere at any time. It is a flexible practice that leaves space for practitioners to spend as much or little time reflecting on their day as they would like. The shift in focus towards the practitioner’s agency enables them to focus on concrete ways to make small changes in their own lives to live their values more fully. On the one hand, this focus enables a person to avoid the potential paralysis that can emerge from the sheer complexity inherent in making sweeping changes. On the other, it can help a person to recognize their own potential to make impactful changes and to avoid becoming disillusioned and fatalistic about the possibility for change.
What changes might be possible if modern engineers and technologists adopted a practice like the examen? Drawing on three key themes touched on throughout the FASPE experience—abstraction, accountability, and collective power—I suggest an adaptation of the Jesuit daily examen called the “Tech Examen” that can be practiced by designers and technologists in their professional lives. In the following sections, I dive more deeply into each of these three theme areas and how they relate to the FASPE trip. I also propose questions corresponding to each category. At the end of this reflection, these questions are synthesized into the “Tech Examen,” and I provide an overview of how to engage with this practice.
Dry Bones in Auschwitz
Near the end of the FASPE trip, when we visited the grounds of Auschwitz, I was reminded of the story of the dry bones in Ezekiel 37, which is also cited in an inscription at the entrance to the Yad Vashem memorial. In the story, the prophet Ezekiel is dropped into a valley of dry bones, a metaphor for the house of Israel, which was in exile in Babylon at the time. God then asks Ezekiel to prophesy to the bones to bring them back to life. God, working alongside Ezekiel’s prophecy, reinvigorates the bones with the breath of life, a symbol of the promise of restoration for the house of Israel. At the beginning of this story, God asks Ezekiel: “Mortal, can these bones live?”
While I was standing in Auschwitz, the buildings, the prison cells, the barbed wire, the crematoria, and all their wretched history seemed to cry out and ask the same. I wondered, can modern marginalized and oppressed bones around the globe live? Perhaps the first part of bringing dry bones back to life is taking a step back to reflect and “think what we are doing.”2

Figure 1: Inscription at the Entrance of Yad Vashem: “I will put my spirit within you, and you shall live, and I will place you on your own soil.” - Ezekiel 37:14
Rethinking Abstraction: Rehumanizing Tech
Thus says the Lord God to these bones: I will cause breath to enter you, and you shall live. I will lay sinews on you, and will cause flesh to come upon you, and cover you with skin, and put breath in you, and you shall live. - Ezekiel 37:5-6
The first theme that emerged from the FASPE site visits and discussions was the consequences of technologists’ choice to center abstraction in their decisions about design and policy. During the Holocaust, the Nazis reduced Jews, Romani people, the unhoused, gay men, individuals with disabilities, and those labeled as criminals to a set of caricatured features, making it easier for the people of Germany to see these groups as less than human. Instead of seeing a Jewish neighbor as a doctor, father, community member, or friend, many came to view them with suspicion. While some of these caricatures were informally promulgated in common conversation among German people or by the Third Reich’s propaganda machine, others were formally systematized in German institutions through census-tracking technologies, medical forms, and ordinances like the Nuremberg laws.3,4
One striking example of this involved the Brandenburg Euthanasia Center, which we visited as a group during our trip. Brandenburg was one of the earliest trial gassing locations, opened as part of Aktion T4 in 1939.13 At the start of the program, physically and mentally disabled Germans, including many children, were killed through experimental gassing. In choosing which patients were to live and which were to die, doctors relied on patient demographic data, which was reported on mandatory medical forms used in hospitals around the country. Categories used to classify patients included whether the patients received regular visits, how long they had been hospitalized, which illness they suffered from, whether they had committed a crime, their occupation (and whether they performed useful work), and their nationality. Patients who suffered from certain diseases, those committed for more than five years, those deemed criminally insane, or those not of German blood or nationality were to be immediately reported. Many of these patients were subsequently killed by gas. In euthanasia centers like Brandenburg, as in the concentration camps after them, abstracted representations of these patients determined whether they would live or die.

Figure 2: Medical Form used to determine patient outcomes9
In the modern tech landscape, with AI technologies as a prime example, engineers rely heavily on developing and leveraging abstract representations of the world for modeling and decision-making. This abstraction is necessary, because it is never possible to capture the full complexity of the world in any one model. Thus, technological progress depends on the development of effective abstractions that represent the most important aspects of a particular decision-making task within a given context. With modern AI systems, these abstract representations are defined and iterated on through the choices that engineers make with respect to multi-step data pipelines. These include the selection of data to put in a given dataset, features to represent the data for the given decision task, AI model structures and training objectives, and output spaces for these models (i.e., the set of possible decision outcomes), among many other aspects.
Blind trust in the “objective” nature of these AI systems on the part of the general public and technologists themselves can lead people to believe that these abstractions are morally neutral. Since their perspectives are colored by their particular social contexts within the tech world, technologists especially might begin to believe that they have no agency with respect to which abstract representations are selected. We as technologists tend to employ techno-optimist lenses, trusting that new technologies will always improve society; this tendency leads many of us to decide to work in tech in the first place. However, this viewpoint can make us complacent about the impact of our work. We risk departing from the “space of moral reasons,” where we reflect continually and intentionally on our moral responsibilities and choices.21
As a concrete modern example of misguided techno-optimism and uncritical data abstraction within an AI-based decision support system, the Design & Technology cohort studied the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) tool for predicting recidivism.1 Based on demographic data about a person accused of committing a crime, including their “current charges, pending charges, prior arrest history, previous pretrial failure, residential stability, employment status, community ties, and substance abuse,” the machine learning-based COMPAS tool assigns the defendant a risk score, predicting the likelihood that the defendant will reoffend before their trial if they are released on bail.15 The tool is intended to help judges more efficiently make decisions about pre-trial detention. It is supposed to be more impartial and fairer than human judges who are subject to cognitive biases.15
While the “objectivity” of this system seems like it could produce social benefit, there are several unchecked assumptions about bias and fairness in the current criminal justice system baked into its design. Since the COMPAS system was trained on human-generated data from the existing system, and machine learning-based systems tend to act as “mirrors” of the data they are trained on,21 COMPAS simply learned to automate many of the biases present within existing decisions. For example, disproportionate policing of certain populations in particular neighborhoods means that these populations, which are often populations of color, are overrepresented in the dataset. Beyond this fact, the features chosen to represent individuals in the COMPAS dataset (in other words, the abstractions chosen for the data and modeling of this application) include factors such as prior arrest history. Such datapoints likely have bias built into their measurement due to over-policing of certain neighborhoods. Another similar example is community stability, which is likely to correlate with race due in part to redlining. It is difficult to accurately measure these features, given existing biases in the policing and criminal justice systems. Worse, using these features to make future decisions reinforces the biases already present in the dataset, because the machine learning system assumes that the distribution of recidivism outcomes given this set of features is constant over time.
Concretely, consider the findings from ProPublica’s investigative report on the COMPAS system. In their investigation, ProPublica discovered that although race was not used as a predictive feature in the COMPAS system, the system’s false positive rate for Black individuals (44.9%) far exceeded the rate for white individuals (23.5%).1 Black individuals, in other words, will disproportionately bear the brunt of the mistakes made by the system when it is deployed as a trusted decision-support tool for judges. Abstraction of individuals into historically biased or inaccurately measured features that are then used to predict an individual’s risk of recidivism serves to mirror and reinforce our broken criminal justice and policing systems.21 The COMPAS tool can amplify our existing brokenness, but it cannot possibly conceive of creative solutions that will meaningfully contribute to a more just society.
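To make the statistic concrete, the false positive rate here is the fraction of people who did not reoffend but were nonetheless labeled high risk. A minimal sketch, using hypothetical confusion-matrix counts (made up for illustration, not ProPublica’s underlying data), shows how such rates are computed and compared across groups:

```python
# Illustrative only: hypothetical counts chosen to mirror the kind of
# disparity described in the text. These are NOT ProPublica's data.

def false_positive_rate(false_positives: int, true_negatives: int) -> float:
    """FPR = FP / (FP + TN): the share of people who did NOT reoffend
    but were nonetheless labeled high risk."""
    return false_positives / (false_positives + true_negatives)

# Among 100 hypothetical non-reoffenders in each group, how many were
# wrongly flagged as high risk?
group_a = false_positive_rate(false_positives=45, true_negatives=55)
group_b = false_positive_rate(false_positives=24, true_negatives=76)

print(f"Group A FPR: {group_a:.1%}")  # prints "Group A FPR: 45.0%"
print(f"Group B FPR: {group_b:.1%}")  # prints "Group B FPR: 24.0%"
```

Even when a sensitive attribute like race is excluded as a feature, correlated features can produce this kind of gap, which is exactly the disparity ProPublica reported.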
What does this mean for us technologists who work to develop systems like COMPAS? First, it is critical for us to be aware that we are constantly choosing abstract representations of the world when we create models for the tech applications that we work on.
We have the power to decide which abstractions we use, and there are consequences for the choices we make. We must be mindful of whose perspectives are represented in our choices of abstract representations and whose might be neglected.
If we hope to design systems informed by diverse perspectives, we cannot do it alone. As designers and technologists, many of us have access to social and financial resources that can narrow our perspectives and blind us to neglected points of view. It is our responsibility to meaningfully seek out and engage a broad range of stakeholders in our technical decisions. It is especially critical to engage stakeholders who will be most impacted by new technologies. Without such perspectives informing both what should be designed for a given application and how, we are better off not designing new technology for this application at all. In engaging key stakeholders in the design process, we can also ask ourselves whether it might be possible to re-imagine who has the power to abstract. We can strive to place impacted communities in control of the abstractions used to represent them.
In designing new technologies, we must ask what is needed rather than merely what is possible. One way to achieve this goal is to work with community representatives and organizations familiar with the needs of those affected by technological change. We must draw on practices like participatory design and value sensitive design7,12 while being careful to avoid “participation washing,”20 in which community members are briefly consulted in the design process to “check the participation box.” Instead, we must strive to foster meaningful, sustained relationships of mutuality, in which impacted communities are deeply involved in design processes and reap the benefits once new technologies are deployed. We can combine these qualitative, rehumanizing approaches with our quantitative modeling and design processes to produce better technical abstractions and models overall.
Breathing new life into the dry bones of those marginalized by tech requires proximity and mutuality between the modelers and the modeled and continuous reflection on the part of technologists about our model choices, including choices of abstraction. Questions we can reflect on related to abstraction in our work include:
- Who could be impacted by the work I did today?
- Which abstractions or representations did I use to characterize them or factors related to them?
- How might my design choices dehumanize these people, and what are the possible impacts of my work?
- Where my choices of abstract representations might serve to dehumanize people, how can I learn more about the people I model and their stories?
Reflecting on Accountability: Humanizing the Perpetrators
Suddenly there was a noise, a rattling, and the bones came together, bone to its bone. I looked, and there were sinews on them, and flesh had come upon them, and skin had covered them; but there was no breath in them. -Ezekiel 37:7-8
The second theme the FASPE group reflected on throughout our visits was the humanity of the perpetrators of the Holocaust and how seeing their stories in this way might bear on our professional roles today. Each site challenged us to imagine ourselves in the shoes of the perpetrators and to ask questions about who these people were, what their motivations were, and how seemingly ordinary professionals could have contributed to the largest genocide in history. I often found that I could relate to the backgrounds of the perpetrators in ways that surprised me—many of the perpetrators were educated, middle-class professionals who valued their families and strove for success in their careers.3
I was particularly struck by our visit to the location of the Wannsee Conference. Of the participants in this genocidal meeting, most were professionals or academics of some kind. Eight had PhDs. As we walked onto the grounds of the conference on a beautiful late May morning, I reacted viscerally to seeing the building and the serene, well-kept grounds. The building immediately reminded me of a German research center for computer scientists called Dagstuhl, where I attended a seminar this fall. Dagstuhl occupies a remote and idyllic former German castle. Lodging is provided on site, meals are catered, and the grounds include an extensive library, artwork, running paths, and a lake.
As I entered the House of the Wannsee Conference and saw the ornate dining room, the sunlit breakfast room, the marble floors, and the intricately decorated walls, I reflected on what the conference participants expected when they arrived on site for their meeting and what I have come to expect when I meet with my colleagues and friends to discuss technical solutions to hard problems.

Figure 3: The site of the Wannsee Conference and Dagstuhl Castle
We meet in top-tier cities with amazing conference venues, expect outstanding meals, stay in fancy hotels, and are waited on by attentive staff, such that nothing can impede our focus on the problems at hand.
While my colleagues and I do not gather to plan mass genocides, the experience of coming to a beautiful location to meet with colleagues and friends to discuss “hard problems” hit close to home. In my own research, I receive grants from the US Department of Defense to develop technologies that can enhance human-autonomy teaming, an area of research which has the potential for far-reaching military applications. While the participants at Wannsee were certainly well aware of the implications of what they were planning, many were convinced that they were working towards a public good. I wondered what it would take for me to convince myself that I am solving important problems in the name of advancing science or contributing to the common good, while I am actually enhancing our ability to execute killing operations that target marginalized groups more efficiently and effectively. Perhaps I already am. This issue vexes me now, and I will continue to grapple with it throughout my career.

Figure 4: Invitation Letter to the Wannsee Conference and to a Dagstuhl Seminar
Inside the House of the Wannsee Conference, we viewed an exhibition displaying important documents from the meeting, which had been preserved in the years after the war. One document that captured my attention was an invitation to attend the conference. This document reminded me of the invitation I received to attend the seminar at Dagstuhl. I thought about the emotions I felt after receiving the invitation. I felt like my work, which I often feel uncertain about, had been validated. I could rest assured that my professional community had recognized my work as important and meaningful. My insecurities were temporarily assuaged, and I could feel gratified at the fact that I was making progress towards my professional and career ambitions.
While the participants at Wannsee were already top Nazi officials, I wondered how they responded to receiving their invitations to participate in the conference. Did they feel reassured that they were doing noble work? Did they feel validated in the positions they held? Did they feel honored to participate in something so important? I wondered to what extent the participants’ professional ambitions and insecurities hardened them to the consequences of their actions. To what extent do my own ambitions and insecurities dictate the direction of my work? Could they blind me to my own accountability with respect to societal harms? I came away from our visit to Wannsee with a renewed sense of the importance of reflecting on such questions, including what drives my own decision-making from day to day. I began to ask whether my motivations and decisions are aligned with my values and how I can approach decision-making with more intentionality. Questions that technologists can reflect on in considering our accountability within our professional roles include:
- Which decisions did I make today?
- Which motivations guided my decision-making process?
- Did these decisions align with my values?
- What will I do the same or differently in the future?
Identifying Collective Power
Then he said to me, “Prophesy to the breath, prophesy, mortal, and say to the breath: Thus says the Lord God: Come from the four winds, O breath, and breathe upon these slain, that they may live.” I prophesied as he commanded me, and the breath came into them, and they lived, and stood on their feet, a vast multitude. -Ezekiel 37:9-10
The third theme that emerged during the site visits and discussions throughout the FASPE trip was related to the power of collective action, both for good and ill. With each site we visited, it was clear that no one person alone could have executed the Holocaust. It took the participation and cooperation of tens of thousands, with some estimates suggesting up to 200,000 perpetrators in total.19 One way that the Nazis managed to foster such widespread participation was through the division of tasks and the minimization of information sharing between those performing different jobs.3 Each person performed a relatively narrow role, had minimal knowledge about what others were doing, and knew very little about the nature of the overall system. Although the German people most likely had some awareness of what was happening to Jews and other marginalized people throughout the Third Reich, information was only officially passed on a need-to-know basis, such that no individual held the entire picture of Nazi crimes.
This structure of work made it easier for each person involved within the system to feel a reduced sense of moral culpability. For example, at Auschwitz the train operators could say that they were only transporting people from one location to the next. The doctors could say that they were only selecting those who were fit for work. The guards could say that they were simply keeping order within the camps. The operators of the gas chambers could say that they were only mechanistically performing tasks they were ordered to perform. This distribution of responsibility not only diminished each individual’s sense of responsibility and accountability, but it also disempowered any one person from having enough agency to effectively fight the system on their own. Even if one person had the courage to stand up, there were plenty of additional people available to take their place. Beyond this fact, professional ambitions or fear of the repressive government likely kept many from speaking out.
However, during various FASPE site visits, we learned about many examples in which professional non-participation did not result in violent retribution or death but rather in reassignment to new roles or new tasks. For example, at the Brandenburg Euthanasia Center, we learned that doctors who refused to participate in the euthanasia program were most often simply assigned to practice elsewhere. While individual doctors who chose to refuse to cooperate with the T4 program could not have had sweeping effects on the efficacy of this Nazi program on their own, if many or all doctors refused to participate, this shift could have amounted to a substantial slowing of the process. The challenge for doctors in that context would have been identifying potential allies in resistance and determining meaningful concrete steps that they could have taken to be most effective.
Modern tech, while less obviously malevolent in its objectives, exhibits similar dynamics. For example, in companies with many software engineers, each individual engineer has limited power to dictate overall company directions within the scope of their work. On the flip side, each engineer also shoulders limited accountability for any negative outcomes associated with their work, since they contribute only small pieces of the overall puzzle. It can be easy to feel powerless to make changes in the face of the enormously complex socio-technical systems we interface with. In this way, it becomes easy to fall prey to fatalism about our ability to work for good within these systems. Is it better to stay in our roles and try to make changes from within, or are we better off leaving altogether? While there are moral advantages to each choice, having a groundswell of people who stay in their roles and continue to wrestle with these questions can be powerful. Holding onto hope for change amid seemingly immutable and unyielding systems is possibly one of the most subversive actions we can take.
What gives me hope is that while the Holocaust and similar atrocities are almost always perpetrated collectively, they can also be resisted collectively. In the modern tech context, if we can identify how tech-induced marginalization or exploitation is perpetrated, perhaps we also have what we need to shift collective action in a different direction. This task will require us to identify what “social capital” we have access to; in other words, the inherent value of our social networks for effecting change in our professional settings.18 Through strategies like those used in community organizing, we can begin to identify areas of mutual concern and to leverage this social capital toward positive change.5 If we stop and reflect on where we might find allies, how we can best foster relationships with them, and how we can work together to reimagine new directions for our work, perhaps we can start to turn the tide wherever tech is not applied toward positive ends. Some questions we can reflect on when considering how best to leverage our collective power include:
- Which specific changes do I hope to make with respect to the work that I am doing or my work environment?
- What information or additional perspectives do I need access to in order to have the greatest impact?
- What social capital and other resources do I need to make these changes?
- What social capital do I have access to now?
- Where am I lacking social capital that I need, and how can I start to build it?
- What power do my allies and I have to make meaningful changes, and what are the most effective concrete actions we can take to make those changes?
The Tech Examen
To practice the Tech Examen, begin by reflecting on the first two grounding questions to focus your attention on what is most important to you and where you find meaning. Then, with these reflections in mind, select one or more of the questions from the rehumanizing tech, assessing accountability, or identifying collective power categories to contemplate further. Choose as many or as few questions to ponder as you would like and spend as much or as little time reflecting as you wish.
Grounding Questions:
- What is most important to me? What are my values?
- Where did I find meaning and purpose today?
Rehumanizing Tech:
- Who could be impacted by the work I did today?
- Which abstractions or representations did I use to characterize them or factors related to them?
- How might my design choices dehumanize these people, and what are the possible impacts of such dehumanization?
- Where my choices of abstract representation might dehumanize people, how can I learn more about the people I modeled and their stories?
Assessing Accountability:
- Which decisions did I make today?
- Which motivations guided my decision-making?
- Did these decisions align with my values?
- What will I do the same or differently in the future?
Identifying Collective Power:
- What power do my allies and I have to make meaningful changes, and what are the most effective concrete actions we can take to make those changes?
- Which specific changes do I hope to make with respect to the work that I am doing or my work environment?
- What information or additional perspectives do I need access to in order to be most effective?
- What social capital and other resources do I need to make these changes?
- What social capital do I have access to now?
- Where am I lacking social capital that I need, and how can I start to build it?
Lindsay Sanneman was a 2024 FASPE Design & Technology Fellow. She is an assistant professor of Computer Science at Arizona State University and a candidate for ordination in the Evangelical Lutheran Church in America.
Notes
- Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner. Machine bias. In Ethics of data and analytics, pages 254–264. Auerbach Publications, 2022.
- Hannah Arendt. The human condition. University of Chicago Press, 2013.
- Doris L Bergen. War and genocide: A concise history of the Holocaust. Rowman & Littlefield Publishers, 2009.
- Edwin Black. IBM and the Holocaust: The Strategic Alliance Between Nazi Germany and America’s Most Powerful Corporation, Expanded Edition. Dialog Press, 2012.
- Brian D Christens and Paul W Speer. Community organizing: Practice, research, and policy implications. Social Issues and Policy Review, 9(1):193–222, 2015.
- Liberation Route Europe. Biography: Kurt Prüfer.
- Batya Friedman. Value-sensitive design. Interactions, 3(6):16–23, 1996.
- Stuart Hall et al. The spectacle of the other. Representation: Cultural representations and signifying practices, 7, 1997.
- Kristen Iannuzzi. Nazi Euthanasia and Action T4: Effects on the Ethical Treatment of Individuals with Disabilities. 2014.
- Simon Johnson and Daron Acemoglu. Power and progress: Our thousand-year struggle over technology and prosperity. Hachette UK, 2023.
- Eric Katz. The Nazi engineers: Reflections on technological ethics in hell. Science and Engineering Ethics, 17:571–582, 2011.
- Michael J Muller and Sarah Kuhn. Participatory design. Communications of the ACM, 36(6):24–28, 1993.
- United States Holocaust Memorial Museum. Brandenburg T4 Facility.
- United States Holocaust Memorial Museum. Topf and Sons: An “Ordinary Company”.
- Northpointe. Practitioner’s Guide to COMPAS Core, 2015.
- Office of Ignatian Spirituality. Ignatian Discernment.
- Office of Ignatian Spirituality. The Examen.
- Robert D Putnam. Bowling alone: The collapse and revival of American community. Simon & Schuster, 2000.
- Atika Shubert and Nadine Schmidt. Most Nazis escaped justice. Now Germany is racing to convict those who got away. CNN.
- Mona Sloane, Emanuel Moss, Olaitan Awomolo, and Laura Forlano. Participation is not a design fix for machine learning. In Proceedings of the 2nd ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization, pages 1–6, 2022.
- Shannon Vallor. The AI Mirror: How to Reclaim Our Humanity in an Age of Machine Thinking. Oxford University Press, 2024.
- Langdon Winner. Do artifacts have politics? In Computer ethics, pages 177–192. Routledge, 2017.
- Wikipedia. Wannsee Conference.
- Schloss Dagstuhl – Leibniz Center for Informatics. https://www.dagstuhl.de/. Accessed 5 Dec. 2024.