Will technology take the jobs of future generations?

A new wave of technological innovation is likely to transform the global economy over the coming years. Is there a danger that it will destroy more jobs than it creates? David Kingman investigates

The 2014 Consumer Electronics Show, the latest edition of the convention held each January in Las Vegas, sounds like it must have been a wondrous spectacle to attend.

As is the case every year, the major technology companies flocked to the event to give the media and the general public a taste of what technology will be capable of doing in the near future. Among the sights on display at this year’s event were home-made musical instruments created using the latest 3D printers; gigantic, room-sized TVs capable of showing images in “ultra high definition”; glasses with tiny computer processors inside them which enable their wearers to surf the internet or take photographs just by blinking; and cars that can drive themselves, whose on-board computers can relay messages to their owners while they are still in bed to warn them about heavy traffic on their normal morning commute.

Many of these new devices sound incredible, and they will have the potential to transform our everyday lives if they ever reach the mass market. Yet should we be worried that too much technological innovation might take jobs away from future generations?

Would you like a robot as your doctor?

A recent issue of The Economist contained a special report examining this important conundrum. In many ways, this debate is just the latest instalment in a long-running saga which goes right back to the Luddite movement of the early 19th century, when textile artisans who feared having their jobs replaced by new industrial machinery protested by vandalising factories. Ever since, commentators who issue warnings about the dangers of new technology replacing jobs have often been dismissed with the pejorative label “Luddites”.

Obviously, the issues at stake in this debate are extremely complicated. On the one hand, supporters of the view that technology will be good for future generations can point to previous waves of technological innovation which have enabled huge rises in productivity. There are some striking statistics to support this view: at the beginning of the 20th century, for example, agriculture employed a third of all Americans, whereas now it employs just 2% of them, yet those workers produce far larger amounts of food because of better machinery and advances in agricultural science.

Economists generally see anything which increases productivity as a good thing. An increase in productivity occurs when you get more output from a given unit of input: for example, if one man working with a tractor can suddenly produce as much food as five men working by hand could have done previously. But what of the four men who have now lost their jobs? If technology is cheaper than labour, then the increase in productivity should allow food prices to fall, reducing the cost of living and giving everyone else higher disposable incomes. This should boost demand in other sectors of the economy, creating new jobs for the displaced farm workers to perform. As The Economist argues in its report, the gains associated with the Industrial Revolution led to a huge surge in living standards which benefited practically everyone:

“Real incomes in Britain scarcely doubled between the beginning of the common era and 1570. They then tripled from 1570 to 1875. And they more than tripled from 1875 to 1975. Industrialisation did not end up eliminating the need for human workers. On the contrary, it created employment opportunities sufficient to soak up the 20th century’s exploding population.”
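To spell out the arithmetic behind the tractor example above, here is a minimal worked illustration; the round numbers are assumed for the sake of the example rather than taken from The Economist’s report.

```latex
% A minimal, illustrative worked example of the productivity gain described
% above. The round numbers are assumed for illustration; they are not taken
% from the article.
\documentclass{article}
\begin{document}

Productivity is output per unit of labour input:
\[
  \text{productivity} = \frac{\text{output}}{\text{labour input}}
\]

Five men working by hand produce, say, 5 units of food:
\[
  \frac{5 \text{ units}}{5 \text{ workers}} = 1 \text{ unit per worker.}
\]

One man with a tractor produces the same 5 units alone:
\[
  \frac{5 \text{ units}}{1 \text{ worker}} = 5 \text{ units per worker,}
\]
a fivefold rise in productivity, which frees the other four workers
for jobs elsewhere in the economy.

\end{document}
```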

However, those who are pessimistic about technology argue that the lessons of the past may no longer hold true in the future. The last 20 years have witnessed a burst of technological progress which has fundamentally changed the way we live, from computers to the internet, smartphones and tablets, yet in both Britain and America the incomes of ordinary workers have stagnated. Some theorists argue that this is partly due to technology causing a “hollowing-out” of Western economies: jobs are increasingly divided between low-level service tasks which cannot (yet) be automated, whether for technological reasons or because human labour is still too cheap for automation to be economic, and highly paid “knowledge work” performed by those whose high levels of education enable them to work in fields such as finance, computer programming and other technology-driven industries.

A pair of American economists, Andrew McAfee and Erik Brynjolfsson, even published a book in 2011 called Race Against the Machine, in which they argued that a “great decoupling” has occurred in the US economy over the last 30 years: productivity and GDP growth have both risen, while the wages of the average American worker and the level of employment have failed to keep pace with them (a divergence demonstrated by a series of charts on Andrew McAfee’s website). They think that technology is gradually replacing an ever-increasing share of routine jobs, squeezing those on middle incomes (such as bank clerks or car assembly workers) who lack the skills to compete at the higher end of the labour market.

Technological pessimists warn that the biggest danger facing future generations is that yet more jobs are set to be automated gradually over the coming years. Two Oxford economists, Carl Benedikt Frey and Michael Osborne, warned in a 2013 paper that 47% of the categories into which economists traditionally sort occupations are in danger of being automated, including plenty of white-collar fields such as accountancy, legal work and healthcare.

It is not too difficult to see how some of the technological innovations already in the pipeline could disrupt industries which are large employers. Take the transport and logistics industry, for example. This sector has proved remarkably resistant to gains in worker productivity: taxi services with a single driver have been operating in London since 1605, and that is still how they work today. However, self-driving cars were one of the major themes of January’s Consumer Electronics Show, and it is widely considered a matter of when, not if, they will become available to the general public. While this will make life easier for consumers, it could put thousands of people who drive vehicles for a living out of a job. Automation is already making its presence felt on the railways: in London alone, the Docklands Light Railway now operates driverless trains, and plans are reportedly afoot to replace all 750 staffed ticket offices on the London Underground with machines by 2015.

Healthcare is another field where rising costs are often blamed on the difficulty of increasing productivity: if it took one nurse to take a blood sample a hundred years ago, it still takes one nurse in a modern hospital today. However, even in this sector technology is being developed to take some of the burden off human staff, such as remote monitoring systems that can detect changes in a patient’s vital signs while they go about their daily lives at home.

The Economist argued that advances in technology will inevitably change the global economy far more rapidly and disruptively than we are currently prepared for. Technology may have amazing potential to transform our lives, but what can we do to make sure as many people as possible will benefit from it?

Education, education, education…

Unsurprisingly, the main answer that jumps out is that we need to improve our education systems so that as many people as possible are equipped with the skills to take advantage of improvements in technology. This could include some radical breaks with educational tradition, so that practical attributes such as people skills and how to hold a conference over Skype are taught alongside Maths and English. Education will also need to become much more of a continuous, lifelong process, in which people are encouraged to refresh their skills as technology changes, so that fewer workers are left behind when the industry in which they work undergoes disruptive innovation.

Of course, technology is also likely to transform the world of education, with the growth of online learning enabling people to pick up new skills alongside millions of other learners around the world from the comfort of their own homes. In addition to reforming education, governments will have a key role to play in deciding when to allow the creators of new technology to innovate freely and when there are possible costs to society which need to be taken into account.

Overall, future generations seem to have much to look forward to in the way of technological innovations that we can only dream about today; but the rest of society will have to innovate as well if these advances are to deliver the greatest possible benefit. This is bound to be an uncomfortable process, and one that governments need to begin preparing for. If they do not rise to this challenge, then the dystopian vision of the film “Blade Runner”, in which society has fractured into a technological elite and a vast underclass with no jobs left to perform, could be all that future generations have to look forward to instead.