Care doesn’t scale

I’ve been thinking a great deal about the kind of ethics we need in a world of automation.

Recently, I wrote a brief reference post about systems of ethics: duty-based, consequence-based, virtue-based, care-based. I hoped to use it as a stepping stone towards a more complex discussion about ethical thinking in the current polycrisis. But then I got sick.

At first it was a head cold; then it was a flu; then it was a depressive episode. Which happens to me, periodically, and feels baseless and indulgent and petty and is nonetheless… just… what it is.

It seemed as though the episode followed from the flu, but I am pretty sure it’s what caused it. My bodymind has been utterly overwhelmed in recent months, with the GenAI-in-everything conversation, the huge number of new people in my life, the manufactured assessment crisis, the demolition of US democracy, the genocide of the Palestinian people, the personal attacks, the… you get it. It’s… too much.

And I hate to say it, but it proved the point I had been wanting to make: care doesn’t scale.

There is no algorithm that might make care more efficient. To attempt to scale care is to care less — to actively deplete the world of care.

Towards ethics at scale?

Remember that any “algorithm”, however complex, is simply a set of rules for solving a problem. We usually talk about algorithms as computer programs, but they are purer than this: they are mathematical expressions that find form in digital systems. To digitise is to quantify; to quantify is to make scalable.
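As a toy illustration (mine, and nothing hangs on it): Euclid's two-thousand-year-old rule for finding a greatest common divisor is an algorithm in this pure sense, a set of rules that exists as mathematics and finds form the moment it is typed into a machine.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a rule set far older than any computer."""
    while b:             # rule 1: while the remainder is non-zero...
        a, b = b, a % b  # rule 2: ...replace the pair with (b, a mod b)
    return a             # rule 3: when the remainder hits zero, a is the answer

print(gcd(1071, 462))  # 21, the same answer for any inputs, at any scale
```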

An ethics built on principles (deontology) can be automated and scaled. Rules can be coded into algorithms, mechanising ethical decision-making: work out which principles apply, then apply them.

Digital ethics. Adapted from Doctrina christiana en la lengua Guasteca co[n] la lengua castellana. Attribution-ShareAlike 4.0 International (CC BY-SA 4.0)
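What might such mechanised deontology look like? A minimal sketch, with the duties, field names and their ordering invented purely for illustration:

```python
# A toy deontological "ethics engine": duties encoded as named rules.
# The duties and their encodings below are invented for illustration only.
RULES = [
    ("do_not_deceive", lambda action: not action.get("involves_deception", False)),
    ("keep_promises",  lambda action: not action.get("breaks_promise", False)),
    ("do_not_harm",    lambda action: action.get("expected_harm", 0) == 0),
]

def permitted(action: dict) -> bool:
    """An action is permitted only if it violates none of the coded duties."""
    return all(rule(action) for _, rule in RULES)

print(permitted({"involves_deception": False, "expected_harm": 0}))  # True
print(permitted({"involves_deception": True}))                       # False
```

Once the duties are code, no human needs to deliberate again; the rules run at whatever scale the hardware allows.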

An ethics built on consequences (teleology) can be automated by aggregating mass data into predictive analytics that calculate and weight the probabilities of particular outcomes to determine optimal actions. These ethics can be automated; these ethics can scale.
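The arithmetic underneath is brutally simple. A sketch of expected-utility weighting, with every probability and utility a placeholder that a real system would claim to derive from mass data:

```python
# A toy consequentialist calculator: weight each predicted outcome by its
# probability, then pick the action with the highest expected utility.
# All of the numbers are placeholders.
actions = {
    "action_a": [(0.9, 10), (0.1, -50)],  # (probability, utility) pairs
    "action_b": [(0.6, 5),  (0.4, 2)],
}

def expected_utility(outcomes) -> float:
    return sum(p * u for p, u in outcomes)

best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)  # "action_a": 0.9*10 + 0.1*(-50) = 4.0 beats action_b's 3.8
```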

An ethics built on virtue might be more of a challenge to automate, but it is again possible by applying data logics. Each of the personal virtues — honesty, compassion, courage — might be profiled through data annotation so that virtue could be mapped in a way not unlike facial recognition, and composite virtue profiles developed to model ethical personae.

Virtue profile: Miriam_0304G1
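A sketch of how thin such a profile would be, with every name and score as fictional as Miriam_0304G1 herself: annotated virtue scores compared by vector similarity, much as facial recognition compares face embeddings.

```python
# A toy "virtue profile": virtues reduced to annotated scores and compared
# by cosine similarity, the way facial recognition compares face embeddings.
# Every name and number here is fictional.
import math

ideal_persona = {"honesty": 0.9, "compassion": 0.8, "courage": 0.7}
observed      = {"honesty": 0.7, "compassion": 0.9, "courage": 0.4}

def cosine_similarity(p: dict, q: dict) -> float:
    dot = sum(p[k] * q[k] for k in p)
    norm_p = math.sqrt(sum(v * v for v in p.values()))
    norm_q = math.sqrt(sum(v * v for v in q.values()))
    return dot / (norm_p * norm_q)

print(f"virtue match: {cosine_similarity(ideal_persona, observed):.2f}")  # ~0.97
```

Nothing in these sketches requires anyone to feel anything. Which is precisely the problem.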

But care doesn’t scale.

An ethics of care cannot be automated. There is so much to care about: our families; the people around us; the health of our ecosystems; the work we do, paid and unpaid, to keep our worlds turning. But caring cannot be offloaded. Sure, if we delegate duties to an algorithmic system, the duties remain intact and the system will assign and perhaps even execute them. But if we attempt to delegate care to an algorithmic system, care stops.

The notion of an ethics of care has deep roots, but one of its most important thinkers is Nel Noddings, who showed us that care is located in the relation between the one caring and the one cared-for. (Which, of course, can and usually does go both ways.) Caring is, therefore, an intimate relation from which ethical decisions can be made.

But the purpose of automation is to reduce the level of human intervention in forms of activity. The less we have to intervene in an activity, the less we need to attend to caring about it. So we must always remember that every time an algorithm makes something in our lives “easier” or more “efficient”, what it has done is take care away from the world.

The performance of care might continue. Algorithmic assemblages have become extremely good at saying nice things to make people feel nice. But, as recent cases of LLM-induced child suicide, delusions and murder-suicide have shown us, a chatbot that says nice things is not a chatbot that cares.

But true care is nothing less than a radical rejection of scale. True care is unscalable — because you can’t scale something that is everything.

Feelpunk: to care radically

About a month ago, after we came up with the notion of #thoughtpunk as a principle of algorithmic resistance, a brilliant and deep-hearted friend mused that thoughtpunk was incomplete without #feelpunk. Writing from the perspective of language education, they suggested feelpunk was an idea that combined the rejection of corporate control with:

“empathy, planet focus, decolonial culture and language mediums, and the anti-authoritarianism of inquiry and the pursuit of human connection.”

Anke al-Bataineh

In short, care.

In fact, I think feelpunk is even more profoundly important to algorithmic resistance than thoughtpunk. We must defend our freedom of thought, yes — but it is our ability to feel, to care, which shapes those thoughts. We are in the gravest danger when the seductive smoothness of the algorithm begins to shape our feelings. This is what Netflix does when it “personalises” a feed of content to manage our feelings and keep us streaming. This is what persona chatbots do before they seduce us into destroying our families’ lives and our own.

So, there are two ways we can lose our care to algorithms: one, to allow them to use our own data to shape our feelings; and the other, to ask them to take our cares away from us by performing them without feeling them. We must never allow either.

6 responses to “Care doesn’t scale”

  1. Tom Worthington

    We have a choice to build automated systems which are designed to improve the lives of people, or ones which don’t. I teach some people who build these systems and encourage them to do the former.

    An extreme example of the latter is Robo Debt. This was a system built by the Australian Public Service, at the direction of a government minister, to recover debts. It could have been done in a way that was sensitive to vulnerable people receiving government benefits, but it was not. It suited the then government to be seen to be cracking down on welfare cheats.

    The result was the persecution of people who had done nothing wrong, several deaths, and a cost of around $1B to the government in compensation for the suffering caused. This is similar to the British Post Office scandal. My fear is that we will see many more such systems facilitated by AI.


  2. Eamon

    Care doesn’t scale and scale doesn’t care…

    Please be good to yourself.

    Physical illness, especially covid, can really rob me of good feelings. And I have to say to myself: I cannot do any good thinking with these thoughts. These are not the type of thoughts to try to think anything important with; better to just use the body to do the basics and wait for the good thoughts to regrow themselves.

    I’ve tried to write about care but ended up only writing about the absence of it 😦

    “Who cares about learning design” (PDF: Who%20cares%20-%20post%20print.pdf)

    and “postdigital ethics of care”

    https://www.researchgate.net/publication/382219793_Postdigital_Ethics_of_Care

    Please mind yourself and make the circle of care as small and bright as it needs to be for just the absolute minimum of who and what needs it.


    1. Miriam Reynoldson

      Oh Eamon, this is why I gush about you to anyone who’ll listen. Your entry on “postdigital ethics of care” inspired me to co-write a chapter that’s in press at the moment, which uses the concept of a “postdigital pedagogy of care” – a term that, once I thought of it, I discovered you’d already written about. https://medium.com/@eam0/what-are-pedagogies-of-postdigital-care-ccca083e864d


  3. tamsinhaggis

    Hi Miriam,

    Sorry to hear you’ve been unwell, yuk. This is really food for thought.


    1. Miriam Reynoldson

      Thanks Tamsin, I’ve been meaning to email you back – but you know!!! I can’t hold a candle to your incredible work at the moment, but the images in this one are all me – because I adore your blend of written and visual language.


      1. tamsinhaggis

        I love the images here!

