In “Building Perfectionist Ethics into Action-Theoretic Accounts of Function: A Beginner’s Guide”, Wittingslow (2024) offers a promising way to infuse normativity into action theory, using my account of ‘human capacities’ perfectionism. I argue in “Human Flourishing and Technology Affordances” (Ferdman, 2024) that the human good is constituted by the excellent exercise of our innate human capacities, and that digital technology affords (and constrains) action possibilities to develop and exercise these capacities: to know, create, be moral, be sociable, use our bodies and our willpower. Integrating this account into action theory, says Wittingslow, could “build normativity into areas of philosophy where axiological questions (moral, political, or aesthetic) otherwise lie unexamined” (Wittingslow, 2024, p. 3).

Action-theoretic accounts try to make sense of artefact function by embedding it within human intention and action: they account for how a user should use an artefact in order to achieve a desired outcome (p. 3). As such, they provide a framework for assessing ‘attributive’ goodness: whether an artefact fulfils the requirements of its class, e.g. whether a Tomahawk cruise missile is good for the function of bombing a target (p. 4). Yet, as Wittingslow shows, these theories do not do a very good job of accounting for ‘predicative’ goodness (i.e. whether an artefact is plain good; Kraut, 2012), leaving them ill-equipped to assess whether an artefact (e.g. the Tomahawk) is morally or ethically good. Wittingslow reveals how my account can fill this theoretical gap: we can use it to assess the extent to which an artefact’s function and affordances contribute to the cultivation and application of the various innate capacities that we take to be good in the normative sense.

At the same time, Wittingslow argues that my framework for analyzing digital technology and flourishing draws an unwarranted analytical distinction between digital and non-digital technology: both furnish us with affordances and constraints that affect how we develop and exercise our capacities. By treating digital and non-digital technologies alike, he argues, my account could provide “the foundations of a method by which perfectionist ethics can be built into action-theoretic accounts of technical function” (Wittingslow, 2024, p. 4), making it relevant not only for philosophers and design theorists but also for practicing engineers and designers (p. 5).

I am largely sympathetic to Wittingslow’s view that there is no justification for an analytic distinction between digital and non-digital technologies in relation to affordances, capacities and flourishing. I nevertheless want to explore the notion that we have reason to focus moral attention on digital technologies, thereby potentially rescuing the distinction between the digital and the non-digital. Digital technology is so powerful because it can scale incredibly quickly, creating systemic effects (Véliz, 2023). The ethical challenges it poses are therefore urgent. One important challenge with regard to flourishing is that digital technologies might have profound systemic effects not only on how our human capacities are shaped, but on how we come to value them. My worry is that digital technologies, as they are currently designed and deployed, narrow the ‘field of affordances’ (Wilkinson & Chemero, 2024) more than non-digital technologies do, and could lead to a deskilling of the human capacities and, subsequently, to a societal devaluation of these capacities. Let me offer a sketch of an argument which, when developed, might substantiate the intuition that digital technologies are distinct from non-digital technologies, at least in their systemic effects on capacity development and exercise.

First, according to Danaher (2022), we live in an age of algocracy: big data, predictive analytics, machine learning, AI, and robotics are increasingly involved in governing human behavior (p. 256). One manifestation of this is that algocratic systems might end up replacing many types of human activity with non-human activity. Some of this replacement might be beneficial to flourishing (e.g. cases where AI takes over repetitive and mundane tasks). Other types of replacement, however, might undermine flourishing, if they displace human activity that requires the exercise of valuable human capacities. The more algocratic a system is, the more it is likely to encourage humans to act like ‘simple stimulus–response machines’ (Danaher, 2022; Frischmann & Selinger, 2018). In the language of human capacities: the more algocratic the system, the more it might afford action possibilities that forgo the need to use our capacities, thereby narrowing the field of affordances in which these capacities can be developed and exercised, and undermining them in the long run.

In a similar vein, digital tools that replace human action with non-human action might make the relevant capacities redundant, perhaps even obsolete. Think of navigation apps: the app tells the user what to do, rather than affording them the action possibility of actively gathering information from the environment and working that information into a hierarchical structure of knowledge. Navigation apps afford us ease, comfort, efficiency and stress reduction, but at the same time they demand less from our capacities, possibly rendering redundant the capacities involved in navigation.

Moreover, digital gamified environments like social media or fitness apps push our heuristics away from the subtle, the dynamic, and the sensitive, towards what can easily be measured at scale (Nguyen, 2021). An agent’s rich and subtle values are impoverished by a techno-social environment that simplifies and quantifies them (Nguyen, forthcoming). Importantly, while technology (digital or non-digital) can trigger value change (Danaher, 2021; van de Poel, 2021), the proliferation of digital gamified environments could lead to the devaluation of the human capacities, in turn fueling capacity deskilling on a societal scale.

Lastly, if human capacities are like skills, then like skills they need to be practiced continuously in order to be exercised competently (Sherman, 1991). Yet to exercise our capacities competently, we need to exercise the capacity to will, which is difficult and requires effort (Bradford, 2015). When digital technologies are designed to be easy, simple, and, in many cases, addictive, they likely undermine the capacity to use willpower, thereby reducing the likelihood that we engage the other capacities that depend on its exercise.

So, we have reason to distinguish digital technologies from non-digital technologies not on analytic grounds (as Wittingslow correctly points out), but on moral grounds: insofar as digital technologies play a greater part than non-digital ones in replacing, impoverishing, or devaluing our human capacities.