We ain’t seen nothin’ yet—of our tech future
Our scientific knowledge is surging, leading to innumerable new applications that will bring growth
With the global economy yet to recover from the 2008 economic crisis, concern about the future, especially of advanced economies, is intensifying. My Northwestern University colleague Robert J. Gordon captures the sentiment, arguing in his recent book, The Rise And Fall Of American Growth, that the productivity-enhancing innovations of the last century and a half cannot be equaled. If true, advanced economies should expect slow growth and stagnation in the coming years. But will the future really be so bleak?
Probably not. Pessimism has reigned over economists’ outlooks for centuries. In 1830, the British Whig historian Thomas Macaulay observed that “[i]n every age, everybody knows that up to his own time, progressive improvement has been taking place; nobody seems to reckon on any improvement in the next generation.” Why, he asked, do people expect “nothing but deterioration”? Macaulay’s perspective was vindicated by the dawn of the railway age. Transformative advances in steel, chemicals, electricity, and engineering followed.
When it comes to our technological future, I expect a similar outcome. Indeed, I’d go so far as to say, “We ain’t seen nothin’ yet.” My optimism is based not on some generalized faith in the future, but on the way science (or “propositional knowledge”) and technology (“prescriptive knowledge”) support each other. Just as scientific breakthroughs can facilitate technological innovation, technological advances enable scientific discovery, which drives more technological change. In other words, there is a positive feedback loop between scientific and technological progress.
The history of technology is full of examples of this feedback loop. The 17th-century scientific revolution was made possible partly by new, technologically advanced tools, such as telescopes, barometers and vacuum pumps. One cannot discuss the emergence of germ theory in the 1870s without mentioning prior improvements in the microscope. The techniques of X-ray crystallography used by Rosalind Franklin were critical to the discovery of the structure of DNA, as well as to discoveries that led to over 20 Nobel Prizes.
The instruments available to science today include modern versions of old tools that would have been unimaginable even a quarter-century ago. Telescopes have been shot into space and connected to high-powered adaptive-optics computers to reveal a universe quite different from the one humans once imagined. In 2014, the builders of the Betzig-Hell microscope were awarded a Nobel Prize for overcoming an obstacle that had been considered insurmountable, bringing optical microscopy into the nanodimension.
Consider the revolutionary instruments and tools that have emerged in recent years. Start with the computer. Economists have made valiant efforts to assess computers’ impact on the production of goods and services, and to measure their contribution to productivity. But none can adequately account for the opportunities computers have created for scientific research.
There is no lab in the world that does not rely on them. The term “in silico” has taken its place next to “in vivo” and “in vitro” in experimental work. And entire new fields such as “computational physics” and “computational biology” have sprung up ex nihilo. In line with Moore’s Law, advances in scientific computation will continue to accelerate for many years to come, not least owing to the advent of quantum computing.
Another new tool is the laser. When the first lasers appeared, they were almost an invention in search of an application. Nowadays, they are almost as ubiquitous as computers, applied to seemingly mundane daily tasks from document scanning to ophthalmology. The range of research areas that now rely on lasers is no less broad, running the gamut from biology and chemistry to genetics and astronomy. Recently, lasers enabled the confirmation of gravitational waves—one of the holy grails of physics.
Yet another technological innovation that is transforming science is the gene-editing tool CRISPR-Cas9. Already, sequencing genomes is a fast and relatively cheap process, its cost having dropped from $10 million per genome in 2007 to under $1,000 today. CRISPR-Cas9 takes this technology to a new level, as it enables scientists to edit and manipulate the human genome. While that idea may give some people pause, the technology’s potential beneficial applications—such as enabling essential crops to withstand climate change and rising water salinity—cannot be overestimated.
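To get a sense of how fast that cost collapse was, one can back out the implied compound annual rate of decline. The snippet below is a rough sketch, not a figure from the article: it assumes the drop from $10 million to $1,000 per genome took roughly a decade, which is my assumption for illustration.

```python
# Implied average annual decline in genome-sequencing cost.
# Assumption (for illustration only): the drop from $10 million (2007)
# to about $1,000 per genome took roughly 10 years.
start_cost = 10_000_000  # dollars per genome, 2007
end_cost = 1_000         # dollars per genome, roughly a decade later
years = 10

# Compound annual rate of decline: the fraction by which the cost
# shrinks each year if the decline were smooth.
annual_decline = 1 - (end_cost / start_cost) ** (1 / years)
print(f"Implied annual cost decline: {annual_decline:.1%}")
```

On these assumptions the cost fell by about 60% per year, far faster than the roughly 30% annual improvement implied by Moore’s Law-style doubling every two years.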
Furthermore, digitization has substantially lowered access costs for researchers. All research relies on access to existing knowledge; we all stand on the shoulders of the giants (and even average-size figures) who came before us. But, until recently, learning what one needed to know to come up with scientific and technological innovations took a lot more work, with countless hours spent scouring libraries and encyclopedia volumes. Nowadays, researchers can access vast databases, where they can find patterns and empirical regularities.
Our scientific knowledge is surging forward, leading to innumerable new applications. Technology will forge ahead as well, in scores of expected and unexpected areas. It will bring economic growth, albeit perhaps not the kind that will register fully if we continue to rely on our outdated standards for national income accounting.
Joel Mokyr is professor of economics and history at Northwestern University.