When the political classes discuss how democracy can go awry, they tend to come back to the same recurring trends in history: political apathy, extremist sentiment, radical populism, support for the military over the civilian government, anti-intellectualism, civic decline, a sense of degeneration and a desire to reverse it through a strongman or ideological purity, and widespread willful ignorance.
Democracy is deeply flawed. Its biggest strength is also its Achilles' heel: the entire citizenry is supposed to be sovereign. This maximizes civic participation and minimizes the chance of the government abusing its population. At least in theory; in practice, the government usually still finds a way to marginalize some group. But the logic generally holds. Democrats are what give a government its authority and power, so it's not in the government's interest to marginalize democrats. An aristocracy wouldn't marginalize aristocrats; a plutocracy wouldn't marginalize plutocrats; an ergatocracy wouldn't marginalize ergatocrats. So that makes sense in theory, even though in practice it leads to instability, because a government works best when its power base is educated.
The thing about aristocracy and technocracy is that you can expect wealthy aristocrats to be well educated and small in number, while technocrats are literally defined by their technical expertise.
Democracies need 100% of their population to be educated in order to function. Not just literate but also well-rounded.
Information bubbles are extremely dangerous in a democracy. If group A believes President X is responsible for a mass murder while group B believes he isn't and that no such murder ever happened, and both sides bring "evidence" proving their case, you reach an impasse. Now imagine that playing out across many smaller issues, even ones irrelevant to politics. If groups start distrusting basic information and facts, they'll vote in irrational ways.
This is only natural. We humans can't know all things at all times and have to make deductions from what we can glean. There are only so many hours in the day, our attention is limited, and we tend to reject whatever contradicts our views anyway.
Once synthetic media arrives, this will reach critical levels as people settle into entirely personalized bubbles, seeing only what they want to see. In 2032, Chelsea Clinton might be president, but many groups could use media synthesis to alter all news and images to show that Donald Trump Jr. is president instead. If you live in a household that runs with this, you might go the entire time without any clue that Clinton is president, and you'd think everyone saying otherwise was deluded.
Democracy can't function in such a world.
But that's beside the point.
This just further shows how fragile democracy is. The main point to glean is that humans can only process a limited amount of information and can only act on that information at a certain speed. Democracy can die that way, but that's really how it died in Ancient Rome too, so what's so revelatory?
Simple: it also means that democracy can fail if there is a power base that's TOO smart, TOO capable, and TOO informed for the common citizenry to possibly keep up with.
And everyone here can guess what that means.
But you see, the funny thing about artificial intelligence and narratives of its takeover is that we have to write fiction, and fiction generally has to be exciting, especially if it's sci-fi. That's why I'd love to kickstart a "slice of tomorrow" literary genre that really digs into the nitty-gritty of future life, developments, politics, and so on, with no need to tell an action or romance story or to have grave stakes or dystopian consequences.
If you think about the future in such a way, you see that AI overlords are inevitable, but in the most boring possible way.
Right now, we already live in an era of automated stock markets and enterprise expert systems. But in the future, as more and more capable AI is integrated into business and government for different purposes (such as optimization and cognitive assistance), we'll start to see the realms of politics and economics really speed up. More management decisions will be automated, and soon even executive decisions will follow.
At first, this will greatly assist humans. We'll have these agents parse all the relevant data and optimize for the most efficient solution to any given problem.
But then more businesses automate. More leave executive decisions to these superbots. And the data coming in requires faster and faster responses to more and more complex information.
Soon, you need to be the equivalent of a nuclear scientist just to understand the basics, and that's if you have several days to sit with it. But you need to make a decision within an hour. And then within a minute. Failure to do so means your business goes under within just a few weeks.
And it goes doubly for government. The bots are sending you 100 years' worth of information in a day and need you to act immediately. But you can't, Mr. President. You're only human. The fastest nerve signal travels at about 270 miles per hour, whereas these bots have light-speed computing capabilities: 670,616,629 miles per hour.
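Taking those two figures at face value (they're rhetorical illustrations, not a claim about how real hardware works), the gap works out to a factor of a few million:

```python
# Illustrative arithmetic using the figures quoted above.
nerve_signal_mph = 270            # fastest human nerve conduction, ~120 m/s
light_speed_mph = 670_616_629     # speed of light, in miles per hour

ratio = light_speed_mph / nerve_signal_mph
print(f"Signal-speed gap: ~{ratio:,.0f}x")  # roughly 2.5 million times faster
```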
At some point, it makes no sense to let humans govern. It's literally impossible to keep up. Humans now get in the way.
Say a citizens' group organizes a vote on how to limit the powers of AI. The vote takes 8 hours from start to finish.
In those 8 hours, the AIs have accomplished 5,000 years' worth of thought and governance. They've modeled every possible outcome of that vote hundreds of times over and already taken the appropriate actions. By the time the vote is done, the whole point of it is moot. Even when we try to be self-sufficient and independent, the machines are already a thousand steps ahead of us anyway.
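For what it's worth, "5,000 years of thought in 8 hours" implies a speedup in the same ballpark as the nerve-signal-versus-light-speed comparison earlier, so the essay's two illustrations are at least mutually consistent:

```python
# Sanity-checking the "5,000 years of thought in 8 hours" figure.
years_of_thought = 5_000
hours_elapsed = 8

hours_of_thought = years_of_thought * 365.25 * 24  # convert years to hours
speedup = hours_of_thought / hours_elapsed
print(f"Implied speedup: ~{speedup:,.0f}x")  # about 5.5 million times faster
```

That lands within roughly a factor of two of the ~2.5-million-fold signal-speed gap, which is as close as back-of-the-envelope rhetoric gets.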
Human enhancement doesn't change anything, for the same reasons. Not all humans will upgrade, and the fact that we'd have to toss away our biology just to keep up proves the point: something has changed, and democracy is obsolete.
This is all very outlandish, so it makes sense that it's not discussed often. That, and "AI took over the world over the course of decades through iterative improvements in efficiency" isn't quite as exciting as "killer robots slaughtered and enslaved their arrogant human masters." But the former is the most likely outcome. If we're ruled over by AI, it's because we let it happen.