Generative AI content has become all the rage in the past year. But generative AI faces issues around quality, including content that is verbose, lacks creativity and originality, and is far too predictable.
Quality is the most important element of content. All too often, in the contemporary age of media saturation, in a world where the loudest voices are most often heard, quality is demoted in favour of quantity. That is a phenomenon that generative AI seems destined to accelerate.
In a matter of seconds, people can produce copy with no spelling mistakes, copy that is grammatically perfect. But people fail to note other issues with that copy. It is remarkable, in fact, that so little of the discourse around ChatGPT and generative AI has focussed on the absence of quality.
To put it simply: generative AI copy is not very good. It is verbose and dull. It is confident, but often incorrect. It is authoritative, self-assured, but lacking originality of substance and style. And, perhaps most damaging, generative AI copy is predictable, something which great writing seeks to avoid.
Below we explore three core quality issues around generative AI – and show you how to avoid them.
The problem: ChatGPT is the most used generative AI system. It is profoundly boring. We asked it whether it was boring and let the tech speak for itself: ‘The outputs generated by this language model aim to be engaging and informative, tailored to meet your specific requirements and preferences.’
That was boring. So we wrote: ‘That was boring.’ The tech responded: ‘I apologize if my previous response did not meet your expectations. If there’s a specific topic or style you’d prefer, please provide more details or let me know how I can better assist you and I’ll adjust my responses accordingly.’
Again, boring. Generative AI follows grammar and punctuation rules to perfection, along with various professional semantic conventions, all of which produce outputs that are verbose, generic, unoriginal, and stale. Sentences are often the same length and repetitive in structure, something good copywriting discourages.
The tech has no sense of humour. Attempts to give the tech a sense of humour, as Elon Musk is currently doing, have been, much like Elon Musk, unfunny, cringeworthy, and completely lacking in self-awareness.
Generative AI does not do sarcasm. Its idiom is usually serious, its delivery confident, sometimes verging on arrogant. In short, it makes for a perfectly acceptable reading experience, but not an exceptional one, not one that is likely to make your audience return.
How to avoid the problem: Re-write, at the very least. Generative AI possesses many potential uses, but drafting engaging articles is not one of them. The tech can help you draft an article, provide inspiration, summarise complex information, but it should not be used to produce complete drafts. It’s worth noting that more attentive readers quickly recognise AI-generated content and such readers are unlikely to return to your site. If they want AI-generated content, they can just go straight to the AI.
It is beyond the scope of this article to show people how to write. But the emergence of generative AI does encourage writers to write more creatively, with more daring. So writers need to break rules, make sentences unique and interesting, and avoid the verbosity on which AI systems rely.
Originality will quickly distinguish you from generative AI copy. Tone of voice will also help you differentiate. Writers typically champion authoritative writing, but that may shift with the rise of AI, and writing that feels explorational, writing that does not pretend to know all the answers, may well prove more successful in the age of generative AI. That seems a welcome change.
You can improve the outputs of generative AI systems. The outputs will still require redrafting, but you can make them more applicable and more engaging. Start by fine-tuning your prompts. Consider all of the following when making the initial request of the generative AI platform.
On top of that, you can make outputs more engaging by adding playful prompts. These, at present, are perhaps not as useful as they might seem. They are entertaining and at times impressive, but help very little in terms of publishing content. Nonetheless, you can trial some of the following:
There are plenty of other ways to generate more engaging AI content. The above will help boost initial engagement, perhaps provide greater insights, but you should still apply human oversight. Remember that, if you want to produce high-quality content, you'll need to re-write outputs.
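If you work with a generative AI system programmatically rather than through the chat interface, the same prompt-tuning advice applies. The snippet below is a minimal sketch, not a recommendation from this article: it assumes the openai Python package (v1+), an API key in your environment, and an illustrative model name, and simply contrasts a generic prompt with one that specifies audience, tone, and length.

```python
# Minimal sketch: fine-tuning a prompt when calling a generative AI API.
# Assumes the `openai` Python package (v1+) and an OPENAI_API_KEY environment
# variable; the model name below is an assumption for illustration only.
from openai import OpenAI

client = OpenAI()

# A generic prompt tends to produce generic, verbose copy.
generic_prompt = "Write an article about fundraising trends in 2024."

# A fine-tuned prompt specifies audience, tone, structure, and length,
# which gives you a more usable starting point to re-write.
tuned_prompt = (
    "Write a 300-word draft on fundraising trends in 2024 for small UK "
    "charities. Use a conversational, exploratory tone, vary sentence "
    "length, avoid jargon, and end with one practical next step."
)

for prompt in (generic_prompt, tuned_prompt):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[{"role": "user", "content": prompt}],
    )
    # Print the opening of each draft to compare the two outputs.
    print(response.choices[0].message.content[:200], "...\n")
```

Whichever prompt you use, treat the output as raw material: the draft still needs the human re-writing described above.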
The problem: At present, most AI platforms notice trends in data and regurgitate those trends in response to prompts. They might spot patterns that humans have not found, or even provide insights that humans have missed. But they do not present anything new.
AI lacks human creativity and the capacity for serious independent thought. The systems are typically programmed to perfect certain tasks and seldom allow for novel solutions. Generative AI content suffers precisely because it parrots available information, especially in 'how to' articles.
Asking ChatGPT to write an article on fundraising trends in 2024 will produce an article that largely rehashes pre-existing content on fundraising. The tech uses a huge amount of content to produce outputs, but much of that content is out-of-date and not particularly good.
How to avoid the problem: Generative AI, at present, should be used with caution. It's great for certain things: summarising complex information, brainstorming ideas, providing definitions, suggesting headings and subheadings, and so much more.
But, as mentioned above, the tech is not ready to write an entire article, especially one that is meant to provide new and interesting ideas to your audience. It is striking, in fact, how many ‘thought leaders’ produce generative AI content, essentially regurgitating ideas that already exist.
Content writers typically use AI in the act of drafting. They may turn to the system to overcome writer's block, aiming for inspiration, but they seldom use the tech to draft entire sentences or paragraphs. The tech, at present, is just not good enough, not creative enough, to draft full articles.
The problem: Writing does have rules. The rules are in place to maximise comprehension, so that everyone who has learnt the rules can understand the writing. But the best writing breaks the rules, openly and drastically, while remaining comprehensible.
Some of the best prose depends on rule-shattering linguistic devices to create a certain feeling, to convey a driving impulse, to shock or stun the reader, to convey urgency, to reflect the mental state of the narrator, and so on. So, yes, prose should be concise and sharp and you should omit needless words, but you should also abandon everything, at the right time.
Great writing is unpredictable. We enjoy reading because we don't know what's going to happen next. That might happen at the level of words, or the way words interact: alliteration, assonance, rhyme, repetition. It could happen at the sentence level, or in the way sentences interact. One long sentence, for example, might be followed by a short sentence, which creates a rhythm, prevents monotony, and better engages the reader. And it works. Or the unpredictable can exist at the level of plot, with twists and turns and so on. Great writers use plot and prose to keep us on our toes. Generative AI, by its very nature, produces profoundly predictable content. It is trained to do so.
How to avoid the problem: Use generative AI for certain tasks, not others. It’s fine to ask generative AI systems to generate prose for internal policies, outgoing emails, title inspiration, or brainstorming sessions. But if you’re aiming to engage an audience, if you want readers to return to your content, if you want to create thought leadership content, generative AI is not the best route to success.
Use generative AI to produce initial ideas. But follow it up with writing and re-writing. Use AI to define architectural designs, describe a wedding dress, or provide service delivery ideas. But do not rely on its every sentence and certainly do not publish every sentence.
As we’ve seen with all of the above, meaningful human oversight is a necessity. Ensure you participate actively in the creation of content, applying and embracing the unpredictable, rather than relying on the predictable generative AI to create all copy. The risks of relying completely on AI are many: it raises problems with ethics, accuracy, and quality. But applying human oversight to the tech can generate incredible results and allow you to reap many of its benefits.