Another automation is possible

Submitted by AWL on 23 June, 2015 - 5:45

Automation is everywhere. From robots on production lines to the cockpits of planes; from automated market trading to highly skilled medical diagnosis; through a whole range of blue- and white-collar occupations, few jobs seem immune to the replacement of human, living labour by computerised systems.

One report has recently predicted that as much as 47% of US employment is at risk. This is not just futuristic hype: the US has just gone through a “jobless recovery” from the 2008 crisis.

Automation also affects our everyday life outside work. The GPS maps in our phone or car; the algorithms (procedures that underlie computer programs) that learn about our activity on Facebook and suggest friends or things to do; the driverless cars that Google is developing — all take previously human tasks such as finding our way or our friends and turn them over to machines.

There are two inadequate responses found on the left. The first is just to fight defensive battles, seeking to preserve jobs or to oppose the introduction of new technology. The second is just to marvel at how capitalism develops technology that could enable a better life, with less work and abundance, under "fully automated luxury communism". We do need both immediate struggles and visions of the future, but both responses fail to examine the processes and technologies that underlie automation, and so fail to develop a critical approach that would enable us to decide what should be kept and what rejected in current technological developments.

Nicholas Carr’s The Glass Cage seeks to do that. Carr is not anti-technology but has written several books critical of how technology is remodelling the way we live and what it means to be human. His aim is to “humanise technology”. The Glass Cage gathers material from a wide range of research across many disciplines.

Carr starts by pointing out that automation has gone beyond the point where it is vulnerable to the critique that there are forms of knowledge, learnt through experience and often subconscious, that cannot be translated into computer programs. Today computing power is fast and cheap enough to solve these problems by other means — essentially the brute force of calculation, as seen in Deep Blue’s defeat of chess champion Garry Kasparov. “The strategies are different, the outcomes for practical purposes are the same.”

Why not then simply embrace automation, particularly when computers are more consistent than humans? Much of the rest of the book is dedicated to explaining why we shouldn’t, how the outcomes do differ and why automation as practised today can have bad effects, often unexpected.

Carr discusses effects that result from the way automation “alters the character of the entire task, including the roles, attitudes and skills of the people who take part in it.” Jobs, such as that of airline pilot, are reduced to watching over an automated system — the glass cockpit — with little sense of the real world beyond the screen. The skills needed to deal with unexpected events deteriorate; users are lulled into a false sense of security because the system is there, and they can come to believe wrong or misleading information from the system even when their senses or experience tell them otherwise.

In other professional jobs such as architecture and medicine, the easy availability and enforced use of support systems can lead to a closing down of possibilities and a resort to pre-defined, stereotypical solutions. This, as Carr points out, is not a consequence of the use of computers per se but rather of the assumptions and range of possible actions built into the software by its designer. “The character and the goals of the work... are shaped by the machine’s capabilities.” Use of the system thus comes to shape the way work is seen and carried out.

The deskilling that can result from the mediation of work by intelligent systems is not restricted to the skills needed to do a particular job but, with their widespread adoption, can undermine broader human capacities.

Carr describes how the hard-won skills of Inuit hunters in navigating the Arctic landscape are giving way to cheap, easy-to-use GPS systems, with dangerous results.

Closer to home, London cabbies armed with “The Knowledge” of London drummed into their heads are doing battle with Uber’s cheaper minicabs, controlled by a computer app, with drivers relying on GPS. Carr argues that GPS use does not merely do away with map-reading skills but provides a one-dimensional, impoverished view of our environment, leading to a loss of appreciation of the world around us as well as of our ability to solve spatial problems for ourselves.

Further, what artificial intelligence pioneer turned critic Joseph Weizenbaum described as the shift from judgement to calculation leaves us dependent on systems that cannot take ethical decisions. While humans can be wrong, they can weigh the possible consequences of their actions. Automated systems, even with powerful learning abilities, cannot deal with the full context and wide-ranging possibilities of everyday decisions. Carr gives the example of how a driverless car might react to an animal or a child crossing its path and the different considerations a human might implicitly use to decide in an instant.

What then are the alternatives? In a chapter entitled “Automation for the People”, Carr explores approaches to systems design that reject the assumption that technology should simply aim to replace humans. Human-centred design builds systems around the user, seeking a division of labour between human and machine that draws on human skills and provides interfaces that do not reduce the user to a mere monitor of the system with an unchallenging job.

This approach has been around since at least the 1980s but has not been widely adopted. Why? “Concerns about the effects of computers on people’s minds and bodies have been trumped by the desire to achieve maximum efficiency, speed and precision — or turn as big a profit as possible.” Another automation is possible, but not without challenging the priorities and goals of capital.

Carr does not draw this conclusion. In the end, despite occasional mentions of the real drivers of automation, such as the reduction of labour costs, he seems to have no perspective for doing anything about it beyond hoping that those in positions of power take note. At the same time he encourages resistance, “to bring progress down to earth... our highest obligation is to resist any force that would enfeeble or enervate the soul.”

“We have an obligation to be more involved in decisions about [technologies’] design and use — before technological momentum forecloses our options”, he rightly comments, but he provides no clue as to how this might happen; no mention of how trade unions, left political parties and the broad anti-capitalist movements might — and should — engage with these questions. Nothing about the need for technology development processes to be democratised and opened up, or for alliances between technologists and those affected by their work to enable this. All of these perspectives flow from his analysis of what is wrong.

In the end, then, Carr provides no perspective for action. Instead he closes the book with a misplaced attack on those who see work-replacing technologies, with all the necessary qualifications and desirable changes to their nature, as an integral part of human liberation. “To cede choices about the texture of our daily lives to a grand abstraction called progress is folly.” This ignores the fact that a desirable daily life must depend on a level of material wealth and free time that in turn depends on technology. Humanising technology cannot mean ignoring its economic benefits and how they might be realised in a different society.

Carr raises many of the critical questions that should be asked about one of the dominant technological trends today. He does so in a wide-ranging and non-technical way that makes the book easy to read. In the end his radical humanism does not supply a way out but it flags up to the left why a critical analysis of technology is necessary.

In the 80s unions and the left were seriously concerned about the consequences of microprocessor technology and talked about strategies to deal with it. We need to rekindle that discussion today.
