
You write, "Over the past 20 years or so I’ve been working at the cutting edge of transformative technologies, and exploring ways of ensuring that the benefits substantially outweigh the risks as they are incorporated into society."

Don't forget that if any one risk is large enough, it can erase all the benefits. It's not just a matter of benefits outweighing risks.

You write, "In each case, the communities of experts and practitioners I’ve worked with have learned that successful technology transitions take an incredible diversity of expertise, understanding, perspectives, and partnerships".

Or it could take something far simpler. Like slowing down. Or declining to develop some technology.

You write, "...and ensuring that the societal and economic benefits far out weigh the risks."

Apologies, but this is 19th-century thinking, the notion that if benefits far outweigh risks, the problem is solved. That era of history ended in 1945.

You write, "If we are to successfully navigate technology transitions that include AI and AGI, as well as quantum technologies, advanced human augmentation, gene editing, the metaverse, and many more, we need greater coherence and relevance around how we combine existing understanding and generate new ideas."

And we might recognize this: ever more powers, ever larger, delivered at an ever faster pace is a recipe for civilizational collapse. The word "acceleration" should be sufficient to illustrate why such a process will inevitably outrun our ability to manage it.

https://www.tannytalk.com/p/the-logic-failure-at-the-heart-of

You write, "This wouldn’t matter so much if this was 30, 50, or even a hundred years ago, where the pace of innovation tended to allow enough time for things to even out in the end around risks and benefits (at least, to a certain extent). But this is 2023, and the pace of innovation is beginning to vastly outstrip our ability to find solutions to emerging problems using conventional thinking and established approaches."

Yes! Two examples of the needed unconventional thinking can be found here:

https://www.tannytalk.com/p/our-relationship-with-knowledge

and here:

https://www.tannytalk.com/p/world-peace-table-of-contents

You want to improve our management strategies, which is good. But you don't yet seem to grasp that there is a limit to how much we can improve them.

Knowledge development feeds back upon itself, leading to an ever-accelerating rate of knowledge development. If we were to plot that on a graph against the development of human maturity, we'd see the lines diverging over time. Sooner or later they will diverge too far.

Two areas to look into:

1) Learn how to take control of the pace of the knowledge explosion to better match our ability to adapt to change.

2) Put some focus on violent men, the single most dangerous factor in regard to emerging technologies of vast scale. If we don't meet this challenge, the rest of our work on this topic may be pointless.
