Could ChatGPT put this children’s science writer out of a job? Let’s put it to the test.
One hundred million users signed up to ChatGPT within two months of its launch, and I’m one of them. As a science writer, I was keen to try out the OpenAI chatbot’s so-called game-changing ability to provide written answers to any question. Answers reportedly so creative and coherent that machine learning is predicted to disrupt the children’s book industry.
ChatGPT handled my straightforward science questions with impressive ease. At first, it feels like a GPS for navigating the Internet sprawl – avoiding the pile-up of search engine results on your way to the information you need. It’s easy to see why it has already been incorporated into science and maths tutoring systems, and can even pass some of the world’s toughest exams.
But can it tackle the hardest questions I’ve ever faced? Every author who visits schools delights in the improbable questions asked by children. Are feelings made from atoms? What colour is nothing? Who taught the first teacher? What’s the speed of dark? I’ve collected 70 of these curious conundrums in The Bedtime Book of Impossible Questions, a Guardian children’s book of 2022.
So can ChatGPT be bamboozled by a six-year-old’s curiosity and creativity? I put it to the test.
If you want to know why we have eyebrows, or if watermelons have a self-defence system, the chatbot quickly delivers. But when a question doesn’t lend itself to a neat answer, AI doesn’t yet make the grade.
“No, dogs cannot swim in ice cream,” ChatGPT tells me. “Ice cream is a frozen dessert made from milk, cream and sugar, and it has a thick and sticky consistency. If a dog tried to swim in ice cream they would likely get stuck and find it difficult to move.”
This might be true, but it would be a disappointing answer to offer a child. The most valuable response to a wildly creative question is to run with it. If that means chucking a hypothetical dog into some hypothetical ice cream, then let’s go! With the help of an academic library, I found that a team of scientists once challenged humans to swim in gloopy ice cream mix. Extrapolating the results to canines lets me reward a creative question with enthusiasm, while making connections to all kinds of serious science.
In my book, I hop from genetics to childhood experiences via pick ‘n’ mix, concluding that it’s impossible to say who you’d be if you weren’t you. ChatGPT takes the opposite stance, deciding: “Yes, you would still be you, no matter what year you were born in.”
Different human writers would come up with different answers, too, but I’m struck by the chatbot’s unshakeable confidence. It reflects the nature of machine learning itself, the process behind AI, which actively avoids negative results like ‘I don’t know’. However, its neat answers obscure the true nature of science: there is plenty we don’t know yet, and children can join this continuous journey of discovery.
At first glance, ChatGPT’s answer for a seven-year-old is thoughtful. “Just like toys and machines,” it begins, “our bodies are made up of lots of tiny parts called cells. As we get older, these cells stop working as well as they used to, and they don’t fix themselves as easily. This can make our bodies feel tired and not work as well.”
But in reaching for a relatable analogy, the AI makes an error (of punctuation or science), telling us that machines and toys, too, are made up of cells. Human authors and editors make mistakes as well, but this is not an analogy a science writer would choose. Our bodies don’t work like toys or machines, and for children the comparison might cause confusion.
Writing non-fiction for young audiences is a process of careful curation. What stays in, what gets left out? Which concepts need extra scaffolding? Which analogies will help children make an emotional connection? Which metaphors might cause misconceptions? My answers don’t come from comparing as many texts as possible, but from knowing and understanding my audience.
AI is only as good as its training data. It could learn to personalise answers for a certain group of children, or even for an individual child, but getting it to do this would mean feeding it buckets of personal data. Ethical and legal concerns mean this is fraught with difficulty.
ChatGPT is impressive, and it’s just getting started. It’s essential that writers and publishers understand how readers and learners are embracing AI software, and how this might develop.
It can already do certain things faster than books can – such as helping a child identify an insect they have spotted. But carefully curated books about science and nature have the power to connect with children in a way that AI can’t; to encourage them to notice the insect, appreciate its beauty and mystery and yearn to discover its name.
If ChatGPT is our GPS for navigating the data superhighway, a book is a chance for children to step out of the traffic and linger where they can enjoy the view. To discover bridges to their own knowledge and experience. To explore unexpected avenues… and to inspire the next impossible question.