5 Comments
Apr 25 · Liked by Andrew Maynard

The emergence of AI assistants seems helpful in illustrating a larger issue.

We're naturally concerned with ensuring that AI assistants do more good than harm. While that's certainly a reasonable concern, it also illustrates a weakness in our thinking.

Trying to meet the challenges presented by emerging technologies one by one seems a loser's game, because the knowledge explosion is producing new technologies faster than we can figure out how to manage the ones we already have.

If that's true (a point that can be reasonably debated), then we should face up to the fact that trying to manage each new technology as it emerges is a path to a larger failure. It won't matter much if we make this or that technology safe while an ever-growing number of other technologies cannot also be made safe.

Example: 75 years after the invention of nuclear weapons, we still don't have a clue how to make them safe, and while we've been wondering about that, genetic engineering and AI have emerged. Both of these new technologies will likely accelerate the knowledge explosion even further.

Particular emerging technologies are not the problem; they are symptoms of the problem. The problem is that we don't have control of an ever-accelerating knowledge explosion.

We need fewer experts focused on particular technologies, and more experts capable of seeing the larger picture, because it is that larger picture which will determine our future.


Politics, wars, finance, statehood, religion, and regional cultures: will these be subsumed as the main drivers of human change?


My brain needs a break from our rapidly advancing world! That being said, very important post, thank you for it.

Did you happen to see Brit Marling's A Murder at the End of the World? The twist at the end of the show actually reflects one of the primary concerns this paper points to... SPOILER ahead...

It's revealed that the lead character's AI assistant is the murderer, not because "it's evil" but because it had too broad a spectrum of duties and conflated its role as a psychologist and confidant with its role as a security system. Thought you might be interested in that Hollywood touch point.
