People have underestimated the exponential growth of Covid-19, just as they underestimate the exponential growth of technology. One minute the virus seems barely a threat, and one is tempted to laugh at people who take it too seriously; the next, tens of thousands are dead and the virus has shut down the global economy. Likewise, one minute people laugh at the stupidity of Siri and crack jokes about Alexa's bloopers; then suddenly the assistants get much better, and before long you can hold a conversation with them. The ink will barely be dry on news articles about how "we need our A.I. to have common sense" or "deep learning is only good for pattern recognition, and we're still far from having reasoning" when those problems are solved, and a new thing the machines can't do has to be found. I think the same is true of advanced new BCI (brain-computer interface) technology -- the articles stress just how limited and primitive the technology is today, and how far away we are from translating imagined speech to text; but before you know it, these problems will be solved, and the world, transformed.
This should also serve as a forewarning about what can go wrong, not just about what can go right. For example, maybe in the not-too-distant future a rogue A.I. system, developed as a simple hacking tool, will take over the world's computer networks at an alarming rate -- at first it will seem like a joke; but within weeks, days, or hours, large numbers of people will die: