William Hertling predicted that we'd see human-level AI by 2024 at the earliest and 2050 at the latest, all by plotting out computer hardware development. His other concrete predictions included the rise of Napster, YouTube, and solid-state drives, each in its correct year.
Here's the article in question, from June 2012:
Pretty much everyone would like a sure-fire way to predict the future. Maybe you’re thinking about startups to invest in, or making decisions about where to place resources in your company. Maybe you just care about what things will be like in 10, 20, or 30 years.
There are many techniques to think logically about the future, to inspire idea creation, and to predict when future inventions will occur.
I’d like to share one technique that I’ve used successfully. It’s proven accurate on many occasions. And it’s the same technique that I’ve used, as a writer, to create realistic technothrillers set in the near future. I’m going to start by going back to 1994.
Predicting Streaming Video and the Birth of the Spreadsheet
There seem to be two schools of thought on how to predict the future of information technology: looking at software or looking at hardware. I believe that looking at hardware curves is always simpler and more accurate.
This is the story of a spreadsheet I’ve been keeping for almost twenty years.
In the mid-1990s, a good friend of mine, Gene Kim (founder of Tripwire and author of When IT Fails: A Business Novel) and I were in graduate school together in the Computer Science program at the University of Arizona. A big technical challenge we studied was piping streaming video over networks. It was difficult because we had limited bandwidth to send the bits through, and limited processing power to compress and decompress the video. We needed improvements in video compression and in TCP/IP – the underlying protocol that essentially runs the Internet.
The funny thing was that no matter how many incremental improvements we made (there were dozens of people working on different angles of this), streaming video always seemed to be just around the corner. I heard “Next year will be the year for video” or similar refrains many times over the course of several years. Yet it never happened.
Around this time I started a spreadsheet, seeding it with all of the computers I’d owned over the years. I included their processing power, the size of their hard drives, the amount of RAM they had, and their modem speed. I calculated the average annual increase of each of these attributes, and then plotted these forward in time.
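In code, the core of that spreadsheet is a compound-growth extrapolation: compute the average annual growth factor from known data points, then multiply it forward. Here’s a minimal sketch in Python, with invented example rows standing in for the real machines:

```python
# Invented example rows: (year, hard drive size in MB). These stand in
# for the actual machine list, which isn't itemized here.
drives = [(1984, 10), (1988, 40), (1992, 170), (1996, 1_600)]

# Compound annual growth rate between the first and last machines.
(y0, v0), (y1, v1) = drives[0], drives[-1]
growth = (v1 / v0) ** (1 / (y1 - y0))

def project(year):
    """Plot the trend forward from the last known data point."""
    return v1 * growth ** (year - y1)

print(f"~{growth:.2f}x/year; projected {project(2005) / 1000:.0f} GB in 2005")
```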
I looked at the future predictions for “modem speed” (as I called it back then; today we’d call it internet connection speed, or bandwidth). By this time, I was tired of hearing that streaming video was just around the corner, so I decided to forget about trying to predict advancements in software compression and just look at the hardware trend. The hardware trend showed that internet connection speeds were increasing, and that by 2005 they would be sufficient to stream video in real time without resorting to heroic amounts of video compression or miracles in internet protocols. Gene Kim laughed at my prediction.
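Here’s roughly what that bandwidth prediction looks like as a calculation. The numbers are stand-ins rather than the spreadsheet’s actual figures: a 33.6k modem as the mid-1990s baseline, roughly 50% annual growth in connection speed, and an assumed 1 Mbit/s floor for watchable real-time video.

```python
import math

# Stand-in figures; the actual spreadsheet data isn't published.
base_year, base_speed = 1996, 33_600   # a 33.6k modem, in bits/second
annual_growth = 1.5                    # assume speeds grow ~50% per year
video_threshold = 1_000_000            # assume ~1 Mbit/s makes video watchable

# Solve base_speed * annual_growth**n >= video_threshold for n.
n = math.log(video_threshold / base_speed) / math.log(annual_growth)
print(f"Streaming video becomes viable around {base_year + math.ceil(n)}")
# -> Streaming video becomes viable around 2005
```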
Nine years later, in February 2005, YouTube arrived. Streaming video had finally made it.
The same spreadsheet also predicted we’d see a music downloading service in 1999 or 2000. Napster arrived in June 1999.
The data has held up surprisingly well over the long term. Using just two data points, the modem I had in 1986 and the modem I had in 1998, the spreadsheet predicts that I’d have a 25 megabit/second connection in 2012. As I currently have a 30 megabit/second connection, this is a very accurate 15-year prediction.
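That two-point extrapolation fits in a few lines. The two modem speeds below are guesses rather than quoted figures, but 300 bit/s in 1986 and 56 kbit/s in 1998 happen to reproduce the 25 megabit result:

```python
# Two-point compound-growth extrapolation. The speeds are guessed
# values, not confirmed ones; they happen to reproduce the 25 Mbit/s
# figure for 2012.
s_1986, s_1998 = 300, 56_000                        # bits per second
growth = (s_1998 / s_1986) ** (1 / (1998 - 1986))   # annual growth factor
s_2012 = s_1998 * growth ** (2012 - 1998)
print(f"{growth:.2f}x per year -> {s_2012 / 1e6:.0f} Mbit/s in 2012")
# -> 1.55x per year -> 25 Mbit/s in 2012
```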
Why It Works Part One: Linear vs. Non-Linear
Without really understanding the concept at the time, what I was doing was using linear trends (advancements that proceed smoothly over time) to predict the timing of non-linear events (technology disruptions) by calculating when the underlying hardware would enable a breakthrough. This is what I mean by “forget about trying to predict advancements in software and just look at the hardware trend”.
It’s still necessary to imagine the future development (although the trends can help inspire ideas). What this technique does is let you map an idea to the underlying requirements to figure out when it will happen.
For example, it answers questions like these:
- When will the last magnetic platter hard drive be manufactured? 2016. I plotted the growth in capacity of magnetic platter hard drives and flash drives back in 2006 or so, and saw that flash would overtake magnetic media in 2016 (there’s a sketch of this calculation after the list).
- When will a general purpose computer be small enough to be implanted inside your brain? 2030. Based on the continual shrinking of computers, by 2030 an entire computer will be the size of a pencil eraser, which would be easy to implant.
- When will a general purpose computer be able to simulate human level intelligence? Between 2024 and 2050, depending on which estimate of the complexity of human intelligence is selected, and the number of computers used to simulate it.
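The hard-drive answer above amounts to plotting two exponential curves and finding the year they cross. A rough sketch, using placeholder 2006-era capacities and growth rates rather than the spreadsheet’s actual numbers:

```python
# Crossover of two exponential trends. All figures are rough
# placeholders for 2006-era drives, not the actual spreadsheet data.
hdd_capacity, hdd_growth = 750, 1.45    # GB in 2006, ~45% growth per year
ssd_capacity, ssd_growth = 32, 2.0      # GB in 2006, doubling yearly

year = 2006
while ssd_capacity < hdd_capacity:
    hdd_capacity *= hdd_growth
    ssd_capacity *= ssd_growth
    year += 1
print(f"Flash overtakes magnetic around {year}")
# -> Flash overtakes magnetic around 2016
```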
Wait a second: Human-level artificial intelligence by 2024? Gene Kim would laugh at this. Isn’t AI a really challenging field? Haven’t people been predicting artificial intelligence would be just around the corner for forty years?
What shocked me was that SSD prediction, that flash drives would overtake magnetic storage by 2016. That one was so on the money that I actually had to roll back my chair and take a moment. It's been one of the year's biggest comp-sci stories: "SSDs Catch Up To And Overtake HDDs". In price, they're on par. In storage capacity, flash has far surpassed magnetic media, with the largest SSD currently coming in at around 60 TB, six times the size of the largest available HDD.