In this blog post, Kaila Colbin writes about the future of lab-grown meat.
The first name that typically comes up in any conversation about exponential technologies is Gordon Moore. In 1965, Moore, one of the co-founders of Intel, observed that the number of transistors we could fit on an integrated circuit had been doubling roughly every 18 months¹. He also predicted that the number would keep doubling as far forward as we could see.
That prediction — you may have heard of it — is now famously known as “Moore’s Law,” and it’s still going strong. But there’s a fun fact you may not know about Moore’s Law. Gordon Moore based his prediction on a dataset of, wait for it, four.
He had just four data points, and from them generated his entire hypothesis. This “law,” this self-fulfilling prophecy that has guided the entire computer industry for more than 50 years, was, as Moore called it, a “wild extrapolation.”
But then Ray Kurzweil came along. An inventor and futurist, Kurzweil asked an important question: What if, instead of looking at the number of transistors we can fit on an integrated circuit, we looked at what we might call the price-performance of computing?
The specific question Kurzweil asked was this: How many instructions per second can you buy for $1,000?
It may seem like an arbitrary distinction, but it’s an important one. Where Moore was asking an engineering question, Kurzweil had gone up a level of abstraction — and, in doing so, allowed us to see the underlying phenomenon.
As it turns out, the answer to Kurzweil’s question is a number that has been doubling for over 120 years.
Most exponential curves are charted on a log scale, meaning the y-axis goes 1, 10, 100, 1,000 instead of 1, 2, 3, 4. On a log scale, exponential growth shows up as a straight line.
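A quick aside on the math, in case the straight-line claim feels like sleight of hand. Here is a minimal sketch, with $y_0$ standing for the starting value and $T$ for the doubling time (symbols introduced purely for illustration):

$$y(t) = y_0 \cdot 2^{t/T} \quad\Longrightarrow\quad \log_{10} y(t) = \log_{10} y_0 + \frac{t}{T}\log_{10} 2$$

The right-hand side is linear in $t$, so anything with a constant doubling time traces a straight line when the y-axis is logarithmic. That is all the straight line means.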
Kurzweil’s “Law of Accelerating Returns” started with electromechanical punch cards before moving on to relays, vacuum tubes, transistors, and only then integrated circuits.
Seen through this lens, Moore’s Law is actually the fifth iteration of the underlying doubling phenomenon, not the first. Every time we’ve maxed out on a given technology, a new one has come along to continue the trend.
Part II: Not Just Computing
Kurzweil’s insight allowed us to understand that this trend is not about transistors; it’s about digitization. It’s about the shift from the physical world to the world of information.
And it doesn’t just apply to computing. Once any technology becomes information-enabled—powered by ones and zeroes—it too will start to follow a similar doubling trend².
Think about photography. Photography used to be substrate-enabled: you were dependent on physical film to do the thing.
But then we created the digital camera.
The first digital camera, in 1975, had a resolution of 0.01 megapixels. Today, the camera on my phone has a resolution of 40 megapixels.
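As a rough back-of-the-envelope check, using only those two figures and treating resolution as a stand-in for overall capability (a crude proxy, but it makes the point):

$$\frac{40\ \text{MP}}{0.01\ \text{MP}} = 4{,}000 \approx 2^{12}$$

That is about twelve doublings of resolution since 1975, roughly one every four years, and that is before counting the collapse in cost and size.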
There are lots of examples where we can see the trend. Let’s take a look at a few, starting with the cost to sequence a human genome.
It cost $2.7 billion to sequence the first genome, including the need to sequence a bunch of different organisms and do a whole heap of ancillary research. And then for a few years, the price came down at a pace almost entirely consistent with Moore’s Law.
And then, in 2007, Next-Generation Gene Sequencing came on the scene, and the price plummeted from $10 million to $1 million in less than a year. As of last August, it cost less than $700 to sequence a genome. Expert biotechnologist Raymond McCauley anticipates a near future in which it will be cheaper to sequence your genome than it is to flush your toilet.
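To put a number on how much faster than Moore’s Law that fall was, here is a rough calculation using only the figures above and the 18-month doubling pace mentioned earlier; treat it as an illustration, not a precise benchmark:

$$\frac{\$2.7\ \text{billion}}{\$700} \approx 3.9 \times 10^{6} \approx 2^{22}$$

That is roughly twenty-two halvings of the price. At one halving every 18 months, those would take about 33 years; in practice they happened in roughly two decades.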
Let’s try one of my favorite charts of all time, from the CIA, the World Bank, Bernstein, and the Energy Information Administration. It’s called Welcome to the Terrordome, and it measures the price-performance of energy by different fuel types:
Read the complete article at www.kailacolbin.medium.com