Rob Ricci reviewed The Unaccountability Machine by Dan Davies
Nicely encapsulates my way of thinking about human systems!
5 stars
This book starts with the idea of accountability sinks: parts of a system that no one is responsible for, so there is no accountability for their actions. These are processes, algorithms, etc. that by design are not the responsibility of any one person, and which cannot be overridden, so that no one can be held accountable for the outcomes they produce. Davies makes the case that to some degree these are necessary in large systems, because we cannot really cope with systems in which there is personal responsibility for every decision. However, large systems can build so many accountability sinks that eventually no one is accountable for anything, and the system constantly produces outcomes that everyone involved claims not to want, and eventually breaks down.
From there, he goes into cybernetics, and the ways in which cybernetic theory describes the functioning and non-functioning of systems. One of the ways that accountability sinks make systems dysfunctional and unstable is that they break the flow of information: when you're unhappy with an outcome, there is, by design, no one you can express that unhappiness to who can do anything about it. At a small scale, this is tolerable and maybe even helpful to the health of the system. But when the accountability sinks are too numerous or in the wrong spots, the system becomes incapable of absorbing new information, eventually ceasing to respond to changing conditions, new needs, signs that it is failing, or growing dissatisfaction.
Eventually, all that is left for those who are dissatisfied with the system's functioning is to pull the fire alarm (or the emergency brake, in Davies' examples). Davies' core thesis is that one of the problems with our modern large systems is that they are so full of accountability sinks that there are no nuanced ways of communicating what's going wrong, how it could be done better, or what we actually want. All we are left with are single-bit messages: people signal 'something is going wrong' by using the emergency stop mechanisms, because that's all they can do.
And yes, this connects to AI, and so much more that's going wrong, but at this point you should probably just stop reading my review and read the book!