As Sam Altman and others push toward a future where AI changes everything, universities need to decide whether they're going to be leaders or bystanders in helping society navigate advanced AI transitions.
What you are describing is hell.
It is based upon both arrogance and ignorance.
Universities have already wreaked enough harm and havoc on both culture and society.
Enough is enough.
Universities should burn to the ground before being allowed the privilege of raising anything akin to intelligence. It is enough that they have destroyed the intelligence of our culture, let alone that they should reach higher, to all of humanity.
A fascinating call to action. I especially admired this phrasing:
"there’s an urgent need for bolder and more creative forward-looking initiatives that support research, teaching, thought leadership, and knowledge mobilization, at the intersection of advanced AI and all aspects of what it means to thrive and grow as a species."
How do you think this might be organized? Academia has a seriously hard time with inter-institutional collaboration. Would it require a third-party facilitator?
Fantastic article! So many institutions (in and out of higher education) are looking at AI integration as a technology project, but it's a massive change management project. It's not just about how to integrate these tools into our workflows to make us more efficient, but also about how to think differently about what the future holds for us (as faculty and staff) and, most importantly, for our students.
AI is going to have a massive impact on the economy, the environment, and the way we live our lives. We need people, especially educated and curious people who spend their time solving problems, to start dedicating time to figuring out the best way to use this technology to actually help humanity and not just to produce more or create faster.
Thanks Brian -- and really like the "massive change management project" perspective!
You write, "...we urgently need new understanding and thinking on how humanity will successfully navigate the coming advanced AI transition."
As usual, I will argue that we urgently need academics and others to shift their focus from particular technological threats to the knowledge explosion machinery which is generating all the threats.
When we're talking about technologies on the scale of AI and genetic engineering, it doesn't matter if we successfully manage one such technology if we fail to successfully manage them all. AI, genetic engineering, nuclear weapons: a failure with any one renders successes in the other arenas irrelevant.
The expert specialist paradigm, where each expert is rewarded for a narrow focus on a particular technology, will prove a failed paradigm, because the knowledge explosion is producing new threats faster than we can figure out how to make them safe. It's only in taking control of the knowledge explosion assembly line, so that the rate of its output is matched to our ability to adapt, that there is hope.
AGREED!!!
So much to unpack here, so many tangents for me. Perhaps a hybrid model combining the agility of startups, the responsibility of government, and the intellectual rigor of universities?
Will those models reflect a multipolar world, or the dominant entities' cultural version?
2025 would be exciting yet "😭"
OMG!!
I could not agree with you more.
Thank you so much for articulating what I have failed thus far to say, and, more importantly, to convey to those universities (and therefore their ecosystems).
Inaction is not an option.
A university, in its current state, MUST embrace all that is Gen-AI and do it NOW otherwise it deserves to fail.
Why?
Because a university exists to educate its members as best it can. Choosing not to embrace Gen-AI (with all its (current) flaws) is, in my mind, a dereliction of its duties at a time when, quite frankly, a university education is all too often too slow, too boring, too out of date, and far, far too expensive (a lifetime to achieve ROI is not value for money).
How can I help you/you help me to turn your article into a rallying call?
This is HUGE!
Thank you!!
University students are indeed choosing to embrace AI: there is now an issue with students having ChatGPT write their essays, read their books, and do all "thinking" for them. The problem with universities is that for far too long, they have mostly been a paper factory, churning out pieces of paper whose sole meaning is to shut doors of opportunity on others, rather than create meaning or benefit for society as a whole. Universities are cesspools of academic incest which regurgitate poorly understood concepts and promote lying sycophants, who then obtain positions to create a positive feedback loop. All of this is seasoned with the arrogance that this type of credentialism somehow entitles its bearers with unquestionable power to control and make decisions for lesser mortals.
No. Universities should be nowhere near touching something as profound as that which has the potential to undo the entire human race.