Commercial Content Creation with GPT-3
‘Auto-generated code’ is a deservedly hyped use case for GPT-3 but, as a marketer, I think about content creation more broadly.
We have been experimenting with the GPT-3 beta at Crowdbotics and, like others, we have already been able to produce some compelling results in natural-language-to-code applications. Tools incorporating this technology have the potential to be industry-changing. More of our GPT-3 results at Crowdbotics are available here.
That being said, GPT-3 still has a ways to go before it replaces software developers. However, it is already at parity with some forms of low- to mid-end commercial content production. While not the flashiest application of this technology, relatively reliable, auto-generated, shallow content creation has far-reaching implications.
Below is a quick example of how GPT-3 can effectively write a compelling article in a few steps, right out of the box.
‘Effective’ and ‘compelling’ are the key words.
The focus is not on whether the article is accurate, but whether it fulfills the purpose of the author/editor. In many contexts, ‘compelling’ content creation means ‘does it generate views and clicks?’ Is it ‘fit’ in the memetic sense? Quality, for better or worse (almost always for worse), is often secondary.
In terms of ‘effectiveness’: did GPT-3 save writing time? Were fewer resources used to produce the content than otherwise would have been? Could it be used to create more content with the same resources than was previously possible?
I will not wade into how GPT-3 should be used. The question is rather, ‘is GPT-3 ready for commercial marketing and content production use cases that mirror the demands of the market today?’
The results from GPT-3 are often better than, but not categorically different from, what you would get by training a neural network on your own, using a reasonably sized corpus, with a tool like TensorFlow. A primary difference is that laborious and resource-intensive training is not needed with GPT-3. It gets up to speed right away using a few simple prompts, and will eventually be accessible via an open API.
Accessibility and ease of use are key. Because its parameter count is roughly two orders of magnitude larger than GPT-2’s, GPT-3 is relatively reliable in narrow use cases where GPT-2 was unpredictable.
To write a share-worthy article, a writer/editor can simply prompt the model with examples of what has already been widely shared.
Instead of training on large amounts of data, GPT-3 only needs basic prompts.
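As a rough illustration only, here is what a single prompt-to-output call might look like against the beta’s Completion endpoint via the legacy openai Python client. The prompt text and parameters are placeholders rather than the exact ones used for the examples below, and there is no training loop or corpus involved.

```python
# Illustrative sketch only: a single prompt-based completion with the
# GPT-3 beta, using the legacy openai Python client. No model training
# is involved; the prompt alone steers the output.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

prompt = (
    "Write a short, share-worthy introduction for a tech article titled:\n"
    "\"Example headline goes here\"\n\n"
    "Introduction:"
)

response = openai.Completion.create(
    engine="davinci",   # base GPT-3 engine exposed in the beta
    prompt=prompt,
    max_tokens=120,
    temperature=0.5,
)

print(response.choices[0].text.strip())
```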
For the purposes of this example, we’ll use TechCrunch, as it happens to be generally less political than other publications. Here are the ten most shared posts from TechCrunch over the past six months.
The top articles from TechCrunch can be used to produce another ten potentially share-worthy articles. (All of the following results use a temperature of 0.5.)
An introductory paragraph can then be generated from a single article title:
“Banksy’s New York City ‘bemusement park’ opens”
The title and intro can then be used to expand the content into a larger article.
And so on.
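Sketched in code, the chain of prompts might look something like the following. This again assumes the legacy Completion endpoint, and the prompt wording, placeholder headlines, and token limits are illustrative rather than the exact ones that produced the article below.

```python
# Illustrative sketch of the compounding-prompt chain: most-shared
# headlines -> new headline -> intro paragraph -> full article.
# Prompts, placeholder headlines, and token limits are assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

def complete(prompt, max_tokens):
    """One completion call at the 0.5 temperature used throughout."""
    response = openai.Completion.create(
        engine="davinci",
        prompt=prompt,
        max_tokens=max_tokens,
        temperature=0.5,
    )
    return response.choices[0].text.strip()

# Step 0: the ten most-shared TechCrunch headlines would be listed here.
top_titles = "\n".join([
    "Placeholder headline 1",
    "Placeholder headline 2",
])

# Step 1: generate another headline in the same vein.
title = complete(
    "These are the most shared TechCrunch headlines of the past six months:\n"
    f"{top_titles}\n\nAnother headline likely to be widely shared:",
    max_tokens=30,
)

# Step 2: expand the headline into an introductory paragraph.
intro = complete(
    f"{title}\n\nWrite the opening paragraph of this article:",
    max_tokens=120,
)

# Step 3: expand the title and intro into the full article body.
article = complete(
    f"{title}\n\n{intro}\n\nContinue the article:",
    max_tokens=500,
)

print(title, intro, article, sep="\n\n")
```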
This short series of compounding prompts, with little human intervention, produces a fairly plausible 400-word, inverted pyramid, ‘news’-style article similar to what has already been successful on TechCrunch.
Banksy’s New York City ‘bemusement park’ opens
The artist Banksy has opened a new “bemusement park” in the New York City borough of Brooklyn. The “theme park” is called “Dismaland,” a name that is clearly a play on Disneyland, the largest theme park in the world.
The opening of Dismaland comes just weeks after the “Queen’s Beast” exhibit opened at the British Museum in London. The exhibit features the ten statues that stood in front of the London palace of Queen Elizabeth II. The statues, which were commissioned by the British monarchy, were created by four different artists, including Banksy.
The Dismaland exhibit is being held at a derelict seaside resort in Weston-super-Mare, England. The “theme park” features works by over 50 artists, including Damien Hirst, Jenny Holzer, and Jimmy Cauty. The park also features a “bemusement park” and a “garden of evil.”
Banksy’s Dismaland is located in a dilapidated lido that closed in 2000. It was the scene of a fire in 2008.
The artist Banksy, who is known for his graffiti art, is believed to be Robin Gunningham. He has never revealed his true identity.
Dismaland is Banksy’s first major exhibition in the United States. The artist opened a new exhibit in the New York City borough of Queens in October 2015. The exhibit, which was called “Better Out Than In,” featured a series of installations and graffiti art in different locations in the borough.
The Dismaland exhibit is scheduled to run through September 27, 2015.
Banksy has posted a video on YouTube about the Dismaland exhibit. The video is called “Dismaland – An Introduction.”
The video begins with a young girl asking Banksy if he is Banksy. The artist says that he is not Banksy, but that he is a friend of Banksy. The girl then asks Banksy if he is the person who made the “Queen’s Beast” statues. The artist says that he is not the person who made the statues, but that he knows the person who made them.
The video then shows a series of images from the Dismaland exhibit. The images include an animatronic Ronald McDonald, a swimming pool filled with “oil,” a sculpture of a dead Cinderella, and a sculpture of a young girl.
The video ends with Banksy saying that the Dismaland exhibit is “the most disappointing place on the planet.”
Banksy’s Dismaland does in fact exist, but it is in Somerset, not New York City. This article is not accurate, but it rings true and reads as though written by a human. When it comes to article-length content creation, GPT-3 has nearly crested the far side of the uncanny valley and, at minimum, has arrived at the plateau of “truthiness”.
Back to the question: is the current state of GPT-3 useful for marketing applications? The answer is ‘yes.’
GPT-3 can be used out of the box to create new copy iterations, descriptions, high-level articles, summaries, expansions, dynamic chat, content personalization, etc.
The simple example above can be tweaked for application within SEO and marketing agencies, not just content farms. Check any freelancer marketplace and you will see many listings for freelance writing gigs — and they’re not all looking for Cormac McCarthy (and even McCarthy is now available in bot form). There is a large market for middle-grade content creation.
I have used GPT-3 experimentally to create descriptions for React Native and Django modules used in the Crowdbotics App Builder.
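A hedged sketch of what that kind of prompt might look like is below. The module names, prompt wording, and helper function are hypothetical, not the actual prompts or catalog entries used at Crowdbotics.

```python
# Hypothetical sketch: generating short catalog descriptions for app
# modules with the legacy openai client. Module names and prompt text
# are made up for illustration.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

def describe_module(name, framework):
    prompt = (
        f"Write a one-sentence description of the {framework} module "
        f"'{name}' for a low-code app builder catalog.\n\nDescription:"
    )
    response = openai.Completion.create(
        engine="davinci",
        prompt=prompt,
        max_tokens=60,
        temperature=0.5,
    )
    return response.choices[0].text.strip()

print(describe_module("Push Notifications", "React Native"))
print(describe_module("User Authentication", "Django"))
```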
Creative and subjective uses of GPT-3, where validation is not so much a factor or there is a tolerance for fuzziness, have immediate application today. Accordingly, it’s safe to say that the market for GPT-like services (and the market for services countering the effects of GPT-like services) will be a multi-billion-dollar industry within a few years.
Fairly mundane applications of the technology, such as shallow content creation, will inevitably produce sizable ripples. For example, Google regularly has to adjust its search algorithms to account for the proliferation of lower-quality content produced through various means. Widespread and varied use of GPT-3 will make certain ranking signals, such as length, recency, topic authority, and uniqueness, far easier to game. More adjustments will be needed.
(It’s also not at all far-fetched that we may see a GPT-powered service take a Duck-Duck-Go-sized bite out of Google’s search supremacy through a direct question-and-answer model. The euphemism ‘Google Programmer’ — a developer who spends their time googling solutions to engineering problems — could quite easily evolve into ‘GPT Programmer’. Imagine, then, the concept expanded to other industries.)
Looking ahead to the public release of GPT-3, subsequent alternative NLP engines, or an eventual GPT-4, the possibilities are vast and close at hand, but still somewhat speculative. While it may not be as sexy, marketing and content creation use cases for GPT-3 are here now. Good enough is good enough.