One way to make constructive use of generative AI in its current form is to think less about "me as a writer" and more about the needs of an audience. Here's an example...
Imagine I'm writing an article about XYZ. In the article I may reference a number of people, things, events, etc. that are related to XYZ. But I don't have the time or inclination to write separate articles further explaining all these factors related to XYZ. And so...
AI to the rescue. Gen AI would allow me to very efficiently produce articles on these related factors, which can then be linked to from the main article I'm writing myself. Now the reader has access to much more information. Instead of delivering one page to the reader, now I'm delivering twenty.
The articles generated by AI in its current form may not be as artful and eloquent as a human-written article. In very many cases, the appropriate response to that is... So what? Who cares? In very many cases the purpose of an article is to deliver useful information, not win a literary award.
Imho, way too many writers are focused on themselves, how they appear to others, etc. These distracting personal ego factors drive much of the resistance to AI content. The focus for many writers is to be on the stage, in the spotlight, holding the microphone, the center of attention. Anything which interferes with this self-centered agenda may be resisted.
If our focus is on delivering useful information to readers, instead of on ourselves, there will be many cases when mass-produced, AI-generated articles can serve that mission.
Were I an instructor, I would teach: dial back your ego, and focus on serving the reader.
I tried ChatGPT and several other tools to see what they would provide me as a writer. They were useful in conducting broad Google searches, but fell down on the generative side of things. The writing they produced was definitely meh. As an editor, I would have thrown most of it back for a rewrite.
What people don’t realize is that, though the new front ends are attractive, and the interaction is interesting, the underlying AI is more or less the same as it has been for a while. It will get a lot better, maybe soon, but it’s not there yet.
And I liked the response of the student who said he was there to learn, not have the AI do the thinking for him. I think you are right to say students may have a better view of current AI than many who have been sucked in by all the hype would have us believe.
I asked my students similar questions via a survey (the questions all came from ChatGPT and Claude) and, like you, I was a bit surprised. I think it's because I (like lots of educators) have immersed myself in this AI world, and I certainly see it as revolutionary and therefore feel I need to provide my students with skills I see them as needing. But my students don't see it this way. Only 50% of my students (grad students, n = 24) had used a generative AI tool like ChatGPT. Only one student said they used GenAI very often; most said rarely or never. Two-thirds of the students were only slightly familiar, or not familiar at all, with the concept of generative AI and its potential applications. When asked if professors should allow the use of generative AI for assignments and papers, almost 80% were unsure, or responded probably not or definitely not.
Really interesting -- and underlines even more that we should be engaging with our students!
It's been discussed, but I just need to find the time to design the study and get the IRB approved -- much needed, though.