(Author’s note: One of the paragraphs in this article was generated by ChatGPT. See if you can guess which one.)
At the end of 2022, I had the opportunity to familiarise myself with ChatGPT, a language model chatbot developed by the American AI research laboratory OpenAI, the same folks behind the DALL-E AI art generator. Trained on terabytes of text sourced from the internet, ChatGPT produces human-like responses and interacts with users based on given prompts. The chatbot has caused quite the stir since it debuted in November last year, reaching over a million users in just five days and producing some truly entertaining social media content. You can spend a fun afternoon asking it to write haikus about kitchen appliances (which is a sentence I never thought I’d write).
On the serious side of things, the technology on display here, and what it’s ultimately capable of, is now under more scrutiny than ever. Case in point: the International Conference on Machine Learning (ICML) announced this month a new policy banning the use of AI tools to write scientific papers. Education departments in the US are barring student access to the chatbot out of “safety and accuracy” concerns. The bot has also allegedly set off alarm bells at Google, where the technology is seen as a threat to its search business. Meanwhile, Microsoft is reportedly looking to incorporate it into the Microsoft Office suite and is even in talks to invest $10 billion in OpenAI.
Putting arguments about ethical usage and application aside, technologies like ChatGPT present many questions for marketers, one of which is: Can a robot write a think piece?
Right now, that answer is no.
What the AI can do
On a fundamental level, it could be argued that an AI like ChatGPT is not capable of having an original thought because it simply cribs everyone else’s and presents them in different ways (such is the nature of its training). But that doesn’t mean this technology can’t be incredibly useful.
ChatGPT offers a very simple, conversation-driven user interface. Anyone can use it by giving it a prompt, and if you’re not happy with its response, you can adjust your prompt or give it feedback to generate a new response. When I tested it by typing in a few prompts, the bot made a good case for itself.
A prompt to write a release announcing that the United States government was banning citizens from using TikTok came back with cohesive results, as did an article announcing the launch of a new streaming feature on Twitter. In the Twitter article, the bot even made up details, like how users could find and activate the feature on the Twitter app’s homepage. Taking it a step further, the bot produced a decent email template announcing an upcoming Black Friday sale and even a feature article highlighting the various attractions tourists could visit in the Middle East.
What stands out with this AI is its grasp of structure and archetypal forms of marketing copy. The examples above exhibited the kind of writing an agency may put out and, in the case of the TikTok press release, an understanding of how to articulate information and deliver it in order of priority. A marketing student or intern could look at the content to learn more about length, format, and writing style. A junior or senior writer could produce work based on its guidance on how best to approach a topic and what elements to include.
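As a side note for the more technically inclined: the same prompt-and-response loop I used in the chat window can also be driven programmatically. The snippet below is only a rough sketch, assuming OpenAI’s Python library (v1.x) and an API key set in your environment; the model name and prompt are illustrative rather than anything I tested for this piece.

```python
# Rough sketch only: assumes the openai Python library (v1.x) is installed
# and OPENAI_API_KEY is set in the environment. The model name is illustrative.
from openai import OpenAI

client = OpenAI()

# Send a single prompt and print the model's reply.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Write a haiku about kitchen appliances."},
    ],
)

print(response.choices[0].message.content)
```

If the first draft misses the mark, you adjust the prompt string and send it again, exactly as you would in the chat window.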
What the AI can’t do (yet)
Where ChatGPT shows its limits is when you ask it to come up with original ideas. Sure, the bot will make the argument for you (say, to prioritise Africa’s digital transformation) as long as the prompt makes the desired outcome clear. But directly asking it to come up with sophisticated ideas for content that you would then produce yourself results in output that just mirrors the stipulated prompt. Asking it to come up with an idea for an essay about digital transformation in Africa resulted in the bot telling me to write an essay about digital transformation in Africa.
Perhaps more complex prompts could deliver more complex work, but there are other factors to consider.
For one, ChatGPT was built to mimic human speech, not to be an information repository. While it had access to a large amount of online text during its training, it does not have real-time internet access (thank goodness), and its knowledge cuts off at the end of 2021. This limits its ability to return current or accurate results.
For example, when I asked it to write a bio for Clockwork, it incorrectly declared the agency was founded in 2006. When asked what the fastest marine mammal is, the bot first answered the peregrine falcon – a bird – and then gave an entirely different answer once the mistake was pointed out. Fact-finding and checking remain a core part of any content-production process – especially when it comes to producing content that’s timely and explores the trends of the day. And a bot that changes its mind over the facts for the sake of generating content isn’t exactly ideal.
Another issue with ChatGPT as a search engine is that it is not designed to be objective. Language models are trained on a large dataset of text, which means that they can often reflect the biases and prejudices present in that dataset. This can lead to ChatGPT providing biased or skewed results, which can be harmful and misleading for users.
Embrace the future (and our robot overlords)
The combination of subjectivity, a lack of fact-checking capabilities, and limited original thought means ChatGPT can’t replace traditional journalistic intuition or real insights. Luckily, we still need humans for that. But as a tool in the arsenal of smart content producers and marketers, it can be very useful. Bouncing a few ideas off it for a think piece may yield some direction or input on what the piece could include; it’s adept at shortening text or providing a few copy options, and it’s great at rewriting if you give it the right direction. Ultimately, marketers and writers can use it to complete certain tasks more efficiently, but because of how confidently it produces misleading or incorrect information, you still need someone skilled operating the controls.
Given the speed at which this technology is developing, that may not be the case in a few years. But until then, we can have fun with the haikus.
(Author’s note: The paragraph that discusses the use of ChatGPT as a search engine – third from the bottom – was generated by the AI. During the editing process, it was the only paragraph that my editor asked to be cut. Take that, robot!)