Nikki Ashley
2023-09-06
With so many roles apparently at risk of being replaced by ChatGPT, including teachers, should those of us who design learning be quaking in our boots?

Could ChatGPT replace learning designers?

When the latest version of ChatGPT was unleashed, the mainstream media seemed to see generative AI (with only ChatGPT really making the headlines) as something that would either doom or save humankind. Many reports warned of professions that would soon be wholly obsolete.

The dust seems to be settling. The articles now are more often about where generative AI will fit into professional toolkits, including at universities. That seems like common sense; digital tools replacing certain human activities is hardly new. I sit here typing on a keyboard, using ubiquitous software, without a secretary or typist in sight! Many technologies, both physical devices and digital tools, are now commonplace in most of our lives.

A personal example: I love to read. Around 150 books a year kind of love to read. I also love books, the physical look and feel of them. 11 bookcases kind of love books. And I also have a Kindle with a subscription that I make good use of. I bought it before a holiday some years ago, as the Kindle was a simple, lightweight alternative to carting a pile of books with me. This is one of many examples where technology and tradition complement each other, and both enhance my life.

In a professional context, another complementary pairing is voiceovers. Here at SiyonaTech we use AI voice software to create first drafts for video and animation; once we and the client are happy with the story, we use professional voiceover artists to record the final versions. AI voice is a useful drafting tool, but it cannot replace the calibre and nuance of a human professional.

Digital learning is our space, and we like technologies we can add to our toolkits. Always curious about technologies that intersect with learning, we were inevitably going to check out ChatGPT and other large language models (LLMs).

So how does learning design fare? Will we be among the 300 million scrabbling for a new way to pay the bills?

We don't think so.

But it's not just arrogance or blind optimism! Here are some of the ways AI can't do what we do.

Content

If it were enough to put relevant content in front of learners:

  • All students would get A’s for everything!
  • There would be no need for repeated compliance learning!
  • We could replace all interactive learning with books!

No matter how important the content is, if you're not engaged in some way, it just won't stick. If you don't understand or care about it, you won't learn it. If it's worthy but the delivery format is boring, you tune out.

Chances are that behind your favourite and best subjects at school there was an inspiring and gifted teacher. As an adult, you're bound to have sat through a presentation where the presenter clearly knew their subject, but goodness were they boring! Most likely that's the main thing you remember: how boring they were.

So learning designers are always looking for ways to engage, to make learning memorable, and to help it fit with what you already know and have experienced. We use adult learning principles, learning models and theories, neuroscience research, storytelling principles, gamification… and we apply the elements we believe fit best in each learning project. It's what we do.

SMEs usually pull together content from their area of expertise; designing the best learning experience to get that content across is ours.

Generative AI can gather content. Learning designers can make it 'learnable'.

Evidence

We like evidence. We look for experts, research, data, empirical evidence. There's lots of evidence on what makes good learning, so we like to use it.

Because much of what generative AI produces is scraped from the internet, most of the content carries an inherent issue: it was originally created by humans. Why is that an issue? Humans have different perspectives and different levels of understanding, and can perpetuate misinformation, whether deliberately or unintentionally. Human-generated content is not necessarily fact. ChatGPT even warns us: "The model has been trained on a diverse range of internet text, allowing it to generate human-like text in response to prompts given to it."

Remember TayTweets? Microsoft's Tay was designed to mimic human language patterns (those of a 19-year-old American girl) and to learn from interacting with human users of Twitter. It had to be shut down within 16 hours because it had learned to post inflammatory and offensive tweets. Whether it was because Tay came from Microsoft, because 19-year-old girls rejected the idea that code could mimic them, or for other reasons, at least some of the responses must have come from humans wanting to prove they could 'beat the system'. Humans like to challenge predictable responses. Boaty McBoatface, anyone?

So how good is generative AI at separating myth from fact? That's not guaranteed yet: "When citing sources of information, as is standard in academic work, it simply makes them up." (We plan to explore an example of myth vs truth in another article.)

Another human trait is to rely on information learned previously, which may now be obsolete. The 'half-life of facts' refers to how what we know as 'fact' can change. Think you know the colours of a rainbow? Think again! Heard that adage that you only use 10% of your brain? Brain scanning technology proved it to be completely false. There's an interesting TED talk on The Half-Life of Facts if you want to learn more.

Generative AI can gather content. Learning designers can determine whether it's myth or truth – at least as things stand right now!

Questions

Now this could prove to be interesting! Question writing in learning is a commonly underestimated skill. Writing a good, effective question is hard. These are some of the things learning designers take into account when writing questions (see the sketch after the list):

  • Whether the aim of the question is to measure understanding of what's been covered, to bring to mind things learned outside the specific learning experience, or simply to prompt thinking about something
  • That the options should not be obvious, but should generate thinking (which helps the learning stick)
  • That learners usually want to know why an answer is right or wrong, and the feedback should reflect that
  • That the question stem should be designed to generate consideration of the answer, not to make learners decrypt mangled grammar in the question itself
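
Purely as an illustration (a rough sketch in Python of our own devising, not the schema of any real authoring tool), here's how those elements might hang together, with per-option feedback that explains why each choice is right or wrong:

```python
from dataclasses import dataclass

@dataclass
class Option:
    text: str
    correct: bool
    feedback: str   # explains *why* this option is right or wrong

@dataclass
class Question:
    aim: str        # e.g. "measure understanding" or "prompt reflection"
    stem: str       # plainly worded: effort goes into the answer, not the grammar
    options: list[Option]

question = Question(
    aim="measure understanding",
    stem="A colleague emails you a spreadsheet of customer records. "
         "What should you do first?",
    options=[
        Option("Forward it to your team for review", False,
               "Not yet: sharing personal data more widely increases risk."),
        Option("Check that you're authorised to receive and handle it", True,
               "Right: confirming you may process the data comes first."),
        Option("Save a copy to your personal drive", False,
               "No: copying personal data to unmanaged storage is a policy breach."),
    ],
)
```

None of this is hard to represent; the craft is in writing stems, options and feedback that actually generate thinking.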

The nature of the prompts used to generate AI responses may improve question skills all round. It's rather helpful to see an immediate response that varies with how the question (the prompt) is phrased. If it's not the response you wanted, you need to write a better prompt! The quality of the question generally directs the quality of the response (there's a small sketch of this below), and a strong question is part of the holistic flow of the learning.

Generative AI doesn't ask questions, but it could improve your question skills! And learning designers can consider all the elements that make questions an integral part of the learning.
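
As a minimal sketch of that point, here's what a vague prompt versus a specific one might look like in code. This assumes the OpenAI Python SDK; the model name and prompts are our own illustrative choices, not a recommended recipe:

```python
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

def ask(prompt: str) -> str:
    """Send a single prompt and return the model's reply."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative; any chat model would do
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# A vague prompt tends to produce a vague, generic question...
print(ask("Write a quiz question about data protection."))

# ...while a prompt that states audience, aim, format and feedback rules
# gives the model much more to work with.
print(ask(
    "Write one multiple-choice question for office staff on handling "
    "customer data. Use a realistic workplace scenario as the stem, "
    "three plausible options, and feedback for each option explaining "
    "why it is right or wrong."
))
```

The same principle applies whether you're typing into a chat window or calling an API: the specificity of the question directs the quality of the response.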

Final thoughts

Generative AI will evolve. Yet the 'big idea' (LLMs capable of generating human-like text) is out there, and its evolution will probably only refine and improve on that idea. So learning designers can probably already see the shape of where it will fit into our assorted toolkits: first drafts, some editing, and idea threads to investigate. Just as PowerPoint didn't turn boring presenters into inspiring ones, LLMs won't turn collated content into great learning experiences.

Until the next big thing, we reckon we don't need to quake in our boots, and I doubt we're giving Geoffrey Hinton any sleepless nights.

A post by Nikki Ashley, Senior Learning Designer at SiyonaTech.