Monday, April 8, 2019
Decide the answer for yourself, as two experts square off on this crucial question
Autonomous cars, cashier-less stores, chatbots. Artificial intelligence already is changing the way we live and work, and machine learning promises to transform the world in ways we can’t even imagine.
Businesses are racing to adopt AI technology, betting it will help them boost productivity and cut costs. A recent global survey of chief information officers by Gartner Inc. found that more than 90% plan to have deployed AI technology in their companies in some way within the next three years.
What this means for the labor market remains to be seen. AI most certainly will destroy some jobs as machines learn to do tasks that previously required human input. But AI also is expected to create new jobs for workers with the right skills and education. Will the new jobs that emerge offset the jobs that are lost over the next decade?
Some experts are skeptical. They say history shows that technological revolutions, while beneficial to workers in the long run, can force many into lower-wage jobs or unemployment in the short run.
Others disagree, saying automation always leads to higher productivity. That, in turn, lowers prices or raises wages, either of which leads to more spending and investment, which creates jobs.
Carl Benedikt Frey, Oxford Martin Citi Fellow at the Oxford Martin School, Oxford University, expects AI-related job losses in the short term. Robert D. Atkinson, president of the Information Technology and Innovation Foundation, a think tank for science and technology policy, says such worries are overblown.
YES: History Tells Us the Answer, and It Isn’t Encouraging
By Carl Benedikt Frey
As in past technological revolutions, the coming revolution in artificial intelligence is likely to destroy more jobs than it creates in the short run.
Throughout history, the long-term benefits of new technologies to average people have been immense and indisputable. But new technologies tend to put people out of work in the short run, and what economists regard as the short run can be many years.
More than two centuries have passed since the Luddites destroyed machinery threatening their jobs, and parallels are often drawn between the Industrial Revolution and our age of automation to suggest English textile workers were wrong in trying to halt the adoption of machines. While it is true that labor-force participation rates have trended upward since the dawn of the Industrial Revolution, the Luddites didn’t benefit. For them, mechanization led to what historian Duncan Bythell has called “the largest case of redundancy or technological unemployment in our recent economic history.”
The Luddites weren’t the only ones whose incomes vanished. During the first seven decades of the Industrial Revolution (from 1770 to 1840), average real wages in England were stagnant and spending among low-income households declined at a time when per capita GDP grew 46%. The early gains from mechanization went to industrialists, who saw their rate of profit double.
Machinery angst erupted again in the 1920s when U.S. factories reorganized around electric motors. Productivity data published in 1927, showing that manufacturing employment had fallen since 1919, prompted a series of displaced-worker surveys, which found that many Americans had failed to find new work even after 12 months.
More recently, wages for men with no more than a high-school diploma, who would have flocked into factories before the age of robots, have declined since 1980, adjusted for inflation. Labor-force participation rates among men age 25 to 55 have fallen, as well. Economists Daron Acemoglu and Pascual Restrepo, in a study of industrial robot usage between 1990 and 2007, estimate that each multipurpose robot replaced about 3.3 jobs in the U.S. economy.
Many labor-replacing technologies loom on the horizon. Google is building AI technology to replace people in call centers, and Amazon is opening cashier-less Go stores.
My own research with Michael Osborne of the Oxford Martin School suggests that 47% of U.S. jobs could be automated due to advances in AI. Few observers at the time believed fashion models would become exposed to automation as we predicted, yet generative adversarial networks capable of creating fake fashion models from images now exist. True, a study published by the OECD suggests far fewer jobs are at risk, but it mistakenly considered worker characteristics such as income and education, while we looked at the automatability of tasks performed on the job. Their statistical model also performs less well against the data than ours.
Concerns over widespread technological unemployment are surely exaggerated, but there are good reasons to be concerned about the short run.
First, the fact that robots have reduced employment suggests productivity growth alone may not be enough to offset automation-related job losses: Workers will also have to depend on new jobs directly created by AI.
Second, early AI technologies might not produce big productivity gains right away. During the Industrial Revolution, early textile machines replaced many craftsmen without boosting productivity growth by much; productivity growth accelerated only after 1830.
Third, AI will give rise to new occupations we can't imagine today. But those jobs are likely to be highly skilled, and many displaced workers might not be able to do them, or move to where they are. Job creation and destruction has occurred unevenly across cities at a time when workers have become less geographically mobile.
For these reasons, we can expect AI to destroy more jobs than it creates at first. That would be the norm, not the exception.
Dr. Frey is Oxford Martin Citi Fellow at the Oxford Martin School, Oxford University. He is the author of “The Technology Trap: Capital, Labor and Power in the Age of Automation.” Email him at firstname.lastname@example.org.
NO: It Will Lead to More Spending and Investing, and Jobs
By Robert D. Atkinson
It’s time to take a deep breath and stop panicking about artificial intelligence and what it portends for jobs. No, AI won’t destroy more jobs than it creates. No, the pace of technological change isn’t accelerating. And no, we certainly don’t need to tax AI to slow it down.
Automation, whether from AI algorithms or computer-aided machine tools, hasn’t led to net job losses yet, and it never will. It always leads to higher productivity, which in turn lowers prices or raises wages, either of which leads to more spending and investment, which creates jobs. This has been true since Adam Smith wrote about more-efficient pin factories in “The Wealth of Nations,” and it will continue to be the dynamic going forward as things like ride-sharing apps lead to savings that people spend on other things.
Consider that from 1997 to 2015, boom times for information technology, productivity growth in EU15 nations was positively correlated with growth in labor hours, suggesting that stronger productivity growth goes hand-in-hand with more jobs. To believe otherwise is to succumb to what economists call the "lump of labor fallacy," the idea that once a particular job is gone, the overall economy has one less job. It's this idea that leads to such loopy proposals as taxing AI-based robots to provide a universal basic income to sustain what the doomsayers assume will be a permanently out-of-work lumpenproletariat.
Just as the first Industrial Revolution benefited most workers in terms of higher living standards, AI will benefit today’s workers who, if they lose a job to automation, can more easily find another as AI fuels economic growth. What’s more, just as the IT revolution of the 2000s led to IT jobs growing 95 times faster than employment as a whole, AI will create millions of jobs building and applying AI algorithms and give rise to entirely new occupations.
The second reason not to panic is that AI's impact on employment over the next decade is likely to be much less than what many have projected. This entire debate got off on the wrong foot in 2013 when Oxford University researchers Carl Benedikt Frey and Michael Osborne released a much-ballyhooed study that trumpeted the jarring conclusion that 47% of U.S. jobs were at risk of being wiped out by technology. It should have been noted at the time that the study wasn't an empirical analysis; it was an off-the-cuff forecast that made little sense in many areas. For example, does anyone really believe that fashion models, barbers and manicurists are at risk, as Mr. Frey and Mr. Osborne claim, considering how incredibly difficult such occupations would be to automate?
The OECD recently estimated that just 8% of U.S. jobs are at risk from automation. If we are going to panic, it should be because 10,000 baby boomers are retiring every day and U.S. productivity growth is at historic lows. We should embrace AI and other tech-based automation as a gift to be nurtured with smart policies, not look at it as a curse.
Finally, many uses for AI won’t replace jobs, but complement workers. For example, AI algorithms aren’t going to put your primary-care doctor in the unemployment line. No one in their right mind would ask, “Hey Google, do I have cancer?” As Eric Topol writes in “Deep Medicine,” AI will give patients tools for things like measuring blood potassium levels via a smartwatch and it will assist clinicians in improving diagnoses. We will see similar benefits in an array of areas.
To be sure, AI is much less likely to complement, and more likely to replace, less-skilled workers. But this should be seen as good news, since the U.S. economy will have relatively fewer low-wage jobs and relatively more middle- and higher-wage jobs.
None of this is to say the U.S. shouldn't do more to prepare workers, especially lower-skilled ones, for transitions into new jobs and occupations. But those are manageable problems that policy makers can solve; there is no need to succumb to panic about a threat that isn't real.
Dr. Atkinson (@RobAtkinsonITIF) is president of the Information Technology and Innovation Foundation, a think tank for science and technology policy.