Designing for our Better Natures and Human Malleability

We are concerned that interaction with systems that use coercive mechanisms to drive superficial engagement encourages people to become more submissive, more compliant, and less reflective.

HUMAN MALLEABILITY

The theory of machine intransigence and human malleability holds that, despite the many enabling affordances of computers and computing, at this historical moment the combination of the properties of computing technologies and self-interested, narrowly conceived designs means that most people’s encounters with computers amount to something very much like being dominated or bossed around by the machine. Theories such as Interpersonal Psychology suggest that interaction with a dominating person calls for a submissive reply. Many current computing systems, increasingly built into the fabric of our lives, accustom us to the idea that our job is to comply with their demands. They offer us little or no meaningful recourse against behaviors that we object to. Theories such as the Looking Glass Self (Cooley, 1902) point out that how we are seen affects our image of ourselves. That is, we comply, and then we see ourselves as more submissive, less capable, and more dependent.

This is an overarching theory related to large societal trends rather than to specific experimental findings or self-report. It relates epidemiological findings, such as the prevalence of anxiety and depression, to a variety of factors, of which the nature of computers is only one.

The status of the theory is undetermined. It is not proven. However, (1) there are indications that computers may change us in important ways that individuals are inclined to overlook, and (2) the argument for exploring these rests on how serious the implications would be if the theory were correct. After all, when the theory of germs arose, one source of ridicule was the supposed impossibility of such little entities having power over such large creatures as ourselves.

Furthermore, the design response does not have to await verification of the theory. One may reasonably argue on other grounds that designers should routinely explore and cultivate alternatives to coercive computing practices. The special force of setting such work inside a theory comes from the need to locate those moments in which it is particularly important that the machine emphasize its own limitations.

DESIGNING FOR OUR BETTER NATURES

Akin to junk food, modern technologies built into the fabric of our lives are often highly charismatic [1] but offer only shallow promises of “better,” shaped by consumption-driven design sensibilities. They accustom us to the idea that our job is to comply with the system’s demands. They offer us little or no meaningful recourse to push back on behaviors, demands, or characterizations that we object to [3].

Our design approach begins by identifying this undue deference to a computing system as problematic. We argue that designers should routinely explore and cultivate alternatives to coercive computing practices. This is not critical design but a response to critical design. We look for small moves within larger design spaces that may be used to promote enduring human values. Here we focus on two projects that emphasize: (1) understanding other people’s points of view (ThoughtSwap) and (2) maintaining a clearer picture of our own purposes (CritiSearch). Additionally, (3) we include one end of a third, remote-communication project (FamilySong) [4], providing an alternative motivation for its construction and design.

We see computing systems as operating within a larger socio-cultural context that is often taken for granted or invisible. Our designs reconfigure interactions [2], but in a way that encourages human control and reflection. Two design principles inform this:

Zensign
Zensign is the concept that what we exclude from a design is as important as what we include [3]. Extra features are a distraction, and distraction is a form of the exertion of power. Zensign helps us reflect on the positionality and ideologies that, when embedded in technology, may rob users of their agency.

The design of the technology is not the design of the system. Technology should either expose the larger system or at least leave mental space for integrated use-practices that may not be apparent.

Technology Takes a Back Seat
In our designs, technology takes a back seat to the negotiation of human purposes. The power of the technology is made visible in the way that the systems are embedded in the social fabric as mediators and enablers. It is not treated as omniscient.

Minimal, transparent designs support user focus on important purposes, skills, and abilities. By demonstrating these technologies together, we hope to draw attention to the technology’s impositions and present ways in which our designs can support users in reconfiguring power differences. We want to draw attention to limitations on design.

References

[1] Morgan G Ames. 2015. Charismatic technology. In Proceedings of The Fifth Decennial Aarhus Conference on Critical Alternatives. Aarhus University Press, 109–120.

[2] Lucy Suchman. 2007. Human-machine reconfigurations: Plans and situated actions. Cambridge University Press.

[3] Deborah G Tatar. 2014. Reflecting our better nature. interactions 21, 3 (2014), 46–49.

[4] Javier Tibau, Michael Stewart, Steve Harrison, and Deborah Tatar. 2019. FamilySong: Designing to Enable Music for Connection and Culture in Internationally Distributed Families. In Proceedings of the 2019 Designing Interactive Systems Conference.