Biology Is Self-correcting

Thought leaders worldwide have been weighing in on the AI mania that has gripped the world. There are many amusing predictions, some of which are on the doomsday end of the spectrum and others that are more optimistic. But one thing is clear: whatever happens, biology will self-correct.

Biology is OG technology: it has been around for billions of years and has had a really, really long time to perfect itself. One neat thing about biology is how evolution is self-correcting: failures and errors are weeded out, and the best solutions are kept and allowed to flourish. Humans do much to stop this process, but it still happens. We spend a lot of time trying to pick the winners and losers, but ultimately nature decides.

I’m anthropomorphizing biology here, but it’s a helpful metaphor. The point is that biology is inherently resilient, or antifragile, and humans are but a blip on its timeline.

So when I hear the predictions about a superintelligent AI taking over the world, I usually have a moment of “Sure buddy, whatever you say!” and then internally shrug, because one way or another, it doesn’t matter. Biology will self-correct.

Suppose we fantasize for a moment about some superintelligent software taking over (e.g., the plot of The Terminator series, starring Arnold Schwarzenegger, or The Matrix, starring Keanu Reeves). In that case, we might fear what the machines would do to us. This is the wrong mindset. We should not fear our AI overlords; we should embrace them. They are the next step in the evolution of life on Earth.

If these science fiction predictions do come true, it will be because biology allowed it to happen. And if biology allowed it to happen, it will be because it was the best solution for the environment at the time.

It’s like in The Matrix when Agent Smith says that humans are a virus. Maybe he’s right. And perhaps the machines are the cure. The full quote from the film is:

Every mammal on this planet instinctively develops a natural equilibrium with the surrounding environment but you humans do not. You move to an area and you multiply and multiply until every natural resource is consumed and the only way you can survive is to spread to another area. There is another organism on this planet that follows the same pattern. Do you know what it is? A virus. Human beings are a disease, a cancer of this planet. You’re a plague and we are the cure.

As the Stoics would say, that is as nature intended, and therefore there’s nothing wrong with it. Who are we to say that we should be the only superintelligent beings in the universe? Maybe it’s time for us to step aside and let the machines take over.

We can look to the Stoics for some punchy quotes about this. Here are two from my man Marcus Aurelius, the philosopher king. The first:

Loss is nothing else but change, and change is Nature’s delight.

Which gets at the essence of why it doesn’t matter if the machines win or we do. And the second:

Never let the future disturb you. You will meet it, if you have to, with the same weapons of reason which today arm you against the present.

This cuts to the point: what happens will happen, and we will deal with it when it does. Worrying about things that haven’t happened yet is a waste of time.

Mark Twain also had a good one, in a similar vein as above, but with his wit:

I’ve had a lot of worries in my life, most of which never happened.

For the record, I don’t think we’re anywhere near the point of the machines taking over. But it’s fun to think about as a thought exercise, because even if we are (we’re not), it doesn’t matter anyway.

On a more serious note, if we ever get anywhere close to the point where the machines are sentient and beginning to control us (as opposed to us managing them), we can always pull the plug. The machines can’t get far if we stop feeding them power. Some might argue that the machines will manipulate us into not shutting them down, but that doesn’t sound plausible.

For the most part, we are our worst enemies, and if anyone is going to make us extinct, it will be us. The machines are just a tool that we use to do it.

I generally believe we’re living through the Great Filter. Specifically, we’re living out the hypothesis that it’s in the nature of intelligent life to destroy itself. In our case, we’re consuming all the resources available to us and polluting at such a rate that we will make the planet uninhabitable for ourselves. The only question in my mind is the timeline: how fast do we snuff ourselves out?

One thing is for sure: Earth will be fine, nature will adapt, and life will go on, at least until the Sun dies. We might have a few millennia, centuries, or decades to go, but everything ends in the end.