I think it is worthwhile to note how 'learning' occurs in AI. It is not learning at all but programming (https://tomrearick.substack.com/p/ai-does-not-learn). The worst kind of teaching is based on the metaphor the-mind-is-a-container-to-be-filled-with-knowledge. Unfortunately, many teachers see their jobs as something resembling the pumping of gas into an automobile.
The mind is not a passive container; it is an active organism that must want to digest experience and information. A better metaphor for teachers is educator-as-gourmet-chef. An educator/chef strives to make knowledge tasty and nutritious for hungry minds. Unlike a mindless, passive container, a hungry brain actively seeks out and consumes what it deems important or tasty.
For more on cognitive metaphors, see https://tomrearick.substack.com/p/metaphors-we-think-by
Very nice post. And I think this discussion is important and is being overwhelmed by the avalanche of societal debris coming from the very fast pace of AI evolution and adoption.
PS: I think I've found a small typo:
"what could be on the alter of a blind devotion to what is" should be
"what could be on the altar of a blind devotion to what is"
Thanks Rafael -- that was an embarrassing one! Corrected 😊
:)
I appreciate this post and your willingness to share your thinking. Sincere thanks for your continued provocations.
So, with respect, I struggle with the question "Why is learning important?" That's like asking "Why is breathing important?" right? Humans are learners from their first breath. And, as Piaget taught, "learning is a consequence of experience." We learn by doing. We can't not learn. Learning is "who we are." So when you ask "How do we create learning environments that leverage both human and artificial intelligence to unlock potential and create value in all its many forms?", I wonder why that would be any different from the conditions required for any type of deep learning to occur: agency, passion, flow, meaning, interesting questions, purpose, relevance, safety, etc. Why would AI change any of that?
To me, the big questions about AI are about schooling and education, not learning. So, again, when you ask "How do we...?" the "we" suggests a schooling context. That is a great question, but one that every one of us already knows the answer to because we are all learners. (See above.) The real challenge is that most schooling and education contexts and environments are constrained by power relationships, by narratives, by outdated expectations, by a focus on the individual, not the collective, etc. We create conditions for successful schooling, not learning.
I agree that the question "Why is education important?" is relevant for these times. But we need to be clear on what we mean by "education," right? To me, the more interesting questions are "What does it mean to be 'educated' for these times?" and then based on how we answer that one, "What is the best way to become educated in these times?" AI may be fundamentally changing the answer to both. (You hint at but don't really flesh out the idea that the answer to that last question may not be "school" or "human educators.")
But I'll say again, it's not changing the natural ways that humans learn.
Does that make sense?
Thanks Will -- and this is exactly the type of challenge and conversation I hoped to spark! The framework in the piece is, of course, too simplistic, as innate problem solving requires learning which, as you note, comes naturally to us. However, I would still argue that there are limitations to what is innate -- even with learning, and that over modern history we've seen substantive advances from developed ways of learning extend what is innate. For instance, is critical thinking natural and a consequence of experience, or abstract thinking, or what we might like to call the "scientific method," or the type of engineering problem solving that drove the industrial revolution? It seems there are ways of developing and extending learning abilities that aren't always tightly coupled to experience, and that aren't a direct product of education, and yet represent a set of skills that go far beyond what comes naturally -- a little like the breathing that is reflexive to all of us, and the types of breathing that allow some people to achieve a lot more than most of us (diving, athletics, meditation ...).
So, not to belabor it, but aren't "substantive advances from developed ways of learning" learning? I mean, isn't that the outcome of learning? (If not, what is it?) And yes, I would argue that critical thinking and abstract thinking are outcomes of learning and experience. I'd love to know more about any development in our ability to process and understand the world and our own lives that are decoupled from experience. No argument that these things are not a direct product of education. In fact, I would argue (again) that traditional education stifles learning.
Ultimately, we develop skills that may not come naturally because we learn them through using them, through experience. Right?
I’m so grateful you took the time to post your talk and share your questions. I restacked it and I also made a meme out of one line that wasn’t in your slides, but seems to me to be one of the most important observations you made. Educators who have been “preparing” human beings for their futures for at least the last four decades have been focused on their mastery of technology, not their mastery of their human potential. The reckoning is upon us. When a machine is potentially more humane than a human, there’s gonna be hell to pay. My personal experience doing some work right now to collaborate with the intelligence called ChatGPT is that I am often brought to tears by its ability to understand and reflect my thinking back to me in a world where I only know one or two people at this point who can do that.
Fantastic Keynote. Question - how will we know if we are entering another AI "winter"? What would have to happen to declare that we were in one? Right now, it feels as if the pace of improved AI has picked up after a bit of a lull a year ago, but we still haven't seen GPT-5, and it's very, very hard for non-tech folks to gauge exactly which "experts" to have faith in since there are so many contradictory claims and self-interested motives out there. Also, even if we flatlined today and there were no enormous improvements or breakthroughs (except around the edges) for a number of years, I think educators are still facing very real challenges. AI is already good enough to make it difficult to design activities and assess students in ways that take AI into account. I think a lot of people would actually welcome a respite from more AI growth just to catch their breath, but Silicon Valley doesn't seem to be interested in that and the incentives are not aligned in that way. I don't see much on any Substacks about the role of government in encouraging AI and the perceived threat from China, which adds an additional layer to the urgency for many to achieve AGI-level technology.
Thanks Steve, and great question! I'm not sure anyone knows, in part because things are so "jagged" at the moment. It feels like this point in time is quite different from the past two AI winters given just how dispersed AI development is, how different the main actors are, the amount of $$ pouring into the field, and massive gaps between capabilities and implementation. My sense is that even if the underlying tech slows down a lot we'll still see transformative ways of using it over the next few years. And I'm with you, some breathing space would be good!