It's popular among AI folks to think in terms of phases of AI, of which the current and most reachable target is likely “oracular AI”. Tools like ChatGPT are one manifestation of this: a form of question-and-answer system that can return answers that will soon seem superhuman in breadth of content and flexibility of style. I suspect most educators don't think much about this framework of AI as oracle, but we should, because it both explains a lot about the current hype cycle around large language models and can help us gain critical footing on where to go next.
The generative “AI” hype cycle has been at its peak for the past month or so, and it follows completely predictable tech patterns. Hypers tout all the amazing, miraculous things that will be possible; doubters wonder aloud whether these things will fail to deliver on their utopian promises (because these things always fall short of their utopian promises); and most of the obvious consequences and outcomes get overlooked.
Any new technology or tool, no matter how shiny its newness, can help students experiment with how technology mediates thought. I suspect that's the least problematic use of generative “AI” and large language models in the short term. One reason I think of this kind of activity as play or experimentation is that if you go much further with it, make it a habit, or take it for granted, then the whole enterprise becomes much more suspect. Most consumer-facing applications showing off large language models right now are variations of a human-in-the-loop system. (ChatGPT exposes a particularly frictionless experience for interacting with the underlying language model.)
A key question for any human-in-the-loop system is that of agency. Who's the architect and who is the cog? For education in particular, it might seem that treating a tool like ChatGPT as a catalyst for critical inquiry puts humans back in control. But I'm not sure that's the case. And I'm not sure it's always easy to tell the difference.
Inspired by and forked from kettle11's world builder prompt for ChatGPT, this is a bare-bones adaptation to show how low the lift can be for creating “personalized AI”. It relies on two fundamental teacher hacks for expanding conversation: 1. playing devil's advocate and 2. asking for more specifics.
Try it, adapt it, and see what you think. (Full prompt below the break. Just paste it into ChatGPT and go from there.)
This BBC piece about the origins of the de-cluttered household caught my eye: https://www.bbc.com/culture/article/20230103-the-historical-origins-of-the-de-cluttered-home
It's a swift and effective overview of architectural minimalism and the cyclical waxing and waning of fashion for de-cluttered interiors. The pendulum has swung toward maximalism and eclecticism for a while now, and perhaps there are hints it is starting to swing back. I suspect the article presents too linear a summary, as there always seem to be holdouts that linger until suddenly becoming “in” again as the pendulum swings back. But this piece got me thinking about how cyclical minimalism is in areas outside its home base of architecture and design.
A recent opinion piece in WaPo by journalist Markham Heid tackles the ChatGPT teacher freakout by proposing handwritten essays as a way to blunt the inauthenticity threat posed by our emerging AI super-lords. I've seen the requisite pushback on this piece around accessibility, but I think the bulk of the criticism (at least what I've seen) still misses the most important point. If we treat writing assignments as transactional, then tools like ChatGPT (or the emerging assisted-writing players, whether SudoWrite or Lex, etc.) may seem like an existential threat. Generative AI may well kill off most transactional writing, and not just in education; I suspect boilerplate longform writing will increasingly be a matter of text completion. I have no problem with that. But writing as part of pedagogy doesn't have to be, and probably shouldn't be, solely transactional. It should be dialogic, and as such, should always involve deep engagement with the medium along with the message. ChatGPT just makes urgent what might otherwise have been too easy to ignore.
My new year's resolution: more writing. Because otherwise the bots win. Or, rather, otherwise the bots won't have enough fodder to generate ways for students to cheat? Not sure, but I think I need to practice writing like a human.
Apparently there's been a lot happening on the AI[^1] front that kind of got people talking these past few months. In predictable fashion, some teachers are stoked, others are freaked out, and most aren't quite sure what to do about OpenAI's big reveal that a massive language model can be coaxed to write a passably decent essay with little effort or significant know-how.
Recently I was leading a meeting with a group of very young designers presenting a low-fi version of an idea for part of our product. It was gamified. It had delightful animations and heavy-lift technological fixes for the problem at hand. It was a version of an app and interactions that one sees over and over. Make it competitive, make students award each other likes or fires or hot streaks (or whatever you want to call it), and that will overcome the (perceived) problem of no one actually wanting to do that whole learning thing.
So much edtech marketing tries to sell the idea of “engagement”; I've written before about why I find that term so pernicious. While I'm still bothered by the way that selling “engagement” through technology makes it seem like what teachers do is inherently not engaging (e.g. the “boring” lecture, plain old non-technologized classrooms), the more damaging part of buying into the marketer's story, that technology's goal is “engagement”, comes from the way such framing distracts from the more valuable — and undervalued — part of teaching and learning: reflection. I would put it starkly: knowledge and the act of knowing come not from engagement but from reflection, percolating and punctuated over time.
This is only the second time in nearly 20 years that I have not been teaching classes at this point in the year. The last time was for a full-year sabbatical, something temporary but wholly part of the business of academia; this fall absence is likely to be more permanent. Others have asked whether it is strange to come to this point in the year, when I am typically polishing up course sites or digging into assignment scheduling for the term. But it's not. The rhythm of the academic year is out of mind, and I feel free (free!) as I have not felt in years.