I started a new business and AI wrote the business plan

OpenAI's GPT-2 text generator (via Talk to Transformer) wrote my business plan

AI employees are already fairly prevalent around us. Chatbot customer service reps, content recommendation algorithms, spam filters – they're doing jobs that would otherwise take tens of millions of people. We don't get mad at Google for using AI spam filters, for the same reason we don't get mad at Ford for making engines that do the work of millions of horses. But what about AI writers? Will text generators like Talk to Transformer, built on OpenAI's GPT-2, change how we feel about AI employees?

So I decided to test the value of an AI employee in the writer's role.

My Cofounder, Talk to Transformer 

In the past, I've played around with Talk to Transformer (a text generator built on OpenAI's GPT-2) for fun. Mostly my experiments consisted of funny one-liners. But I'd never thought about actually working with the tool to string together a larger piece of writing.

So I briefly hired Talk to Transformer to help me write a business plan.

Honestly, I'm pretty proud of the resulting business plan (which you can view here). Most text generators are not going to be this succinct and coherent. With a little polish and editing from me, it could actually pass for an executive summary.

I'm doubtful that Talk to Transformer will be stealing any copywriting positions anytime soon. More likely, we'll see a greater emergence of text generators like Quill Engage: natural language generation (NLG) tools that excel at very specific, formulaic writing. Quill's specialty, for instance, is website analytics reports.

However, I'm not putting the AI writer entirely out of the realm of possibility, because GPT-2 is hiding some real power.

What is GPT-2 from OpenAI?

Talk to Transformer is a tool built on the back of a generative language model called GPT-2, created by OpenAI (the lab cofounded by Elon Musk, Sam Altman, and others).

Natural language generation is essentially a statistical, probabilistic art. The model uses probability to guess the next word in a sentence based on context and prior knowledge (its training data), then records each prediction one word at a time to build a sentence. It doesn't understand the words' meaning. It just objectively predicts the next word.
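
Here's a minimal sketch of that loop in Python. The probability table is entirely made up for illustration; a real model like GPT-2 computes these probabilities over tens of thousands of possible tokens, conditioned on far more than the single previous word.

```python
import random

# Toy "language model": for each word, made-up probabilities of the next word.
NEXT_WORD_PROBS = {
    "the":    {"cake": 0.4, "recipe": 0.35, "oven": 0.25},
    "cake":   {"needs": 0.6, "is": 0.4},
    "recipe": {"calls": 0.7, "has": 0.3},
    "oven":   {"is": 1.0},
    "needs":  {"two": 1.0},
    "is":     {"ready": 0.5, "hot": 0.5},
    "calls":  {"for": 1.0},
    "has":    {"flour": 1.0},
    "two":    {"cups": 1.0},
    "hot":    {"now": 1.0},
    "ready":  {"now": 1.0},
    "for":    {"flour": 1.0},
    "flour":  {"and": 1.0},
}

def predict_next(word):
    """Sample the next word in proportion to its probability."""
    choices = NEXT_WORD_PROBS[word]
    return random.choices(list(choices), weights=list(choices.values()))[0]

# Build a sentence one predicted word at a time -- no understanding involved.
sentence = ["the"]
for _ in range(4):
    sentence.append(predict_next(sentence[-1]))
print(" ".join(sentence))  # e.g. "the recipe calls for flour"
```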

Natural language generation is kind of like trying to guess a cake recipe after hearing just a couple of the ingredients.

If I tell you that the recipe has 2 cups of flour and 1 cup of cocoa powder, you can figure out the rest of the ingredients for this chocolate cake, because baking generally follows standard, common ratios of baking powder, sugar, salt, and so on. Yes, you'll end up with a very basic cake with very little nuance. But that's all the context you have.

If, however, I were to tell you that the recipe has ½ cup of sour cream and ¾ cup of lime juice, then you can guess it's going to be a key lime pie and take it in that direction.

NLG operates very similarly: whatever you prompt it with sets the direction it takes. Generally, it will lack nuance because it's predicting based on probability, and the most probable result is also the most common and average one. More about this conundrum in creative AI here.
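
To make that concrete: a generator that always picked the single most probable next word (greedy decoding) would produce the same maximally average sentence every time. A toy sketch, again with made-up probabilities:

```python
# Made-up probabilities, purely for illustration.
NEXT_WORD = {
    "the":  {"cake": 0.4, "recipe": 0.35, "oven": 0.25},
    "cake": {"is": 0.6, "needs": 0.4},
    "is":   {"ready": 0.7, "hot": 0.3},
}

def greedy(word):
    """Always take the most probable next word."""
    return max(NEXT_WORD[word], key=NEXT_WORD[word].get)

sentence = ["the"]
for _ in range(3):
    sentence.append(greedy(sentence[-1]))
print(" ".join(sentence))  # always "the cake is ready" -- never anything else
```

Practical generators, Talk to Transformer presumably included, sample from the distribution rather than always taking the maximum, which is what keeps the output from collapsing into one average sentence.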

What sets GPT-2 apart is the design of the model and the sheer quantity of data analyzed. OpenAI trained GPT-2 simply to predict the next word in 40GB of Internet text (roughly 8 million web pages).
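
In code terms, that training objective is remarkably small. Here's my reconstruction of it in PyTorch (an assumption based on OpenAI's description, not their actual code): the model's only feedback is how well it predicts each next token.

```python
import torch
import torch.nn.functional as F

def next_token_loss(logits, token_ids):
    """Cross-entropy between the model's predictions and the tokens that
    actually came next -- 'predict the next word', nothing more."""
    pred = logits[:, :-1, :]    # prediction at every position...
    target = token_ids[:, 1:]   # ...scored against the token that follows it
    return F.cross_entropy(pred.reshape(-1, pred.size(-1)), target.reshape(-1))

# Dummy tensors just to show the function runs. 50257 is GPT-2's real
# vocabulary size; the batch and sequence length here are arbitrary.
vocab, batch, seq = 50257, 2, 8
loss = next_token_loss(torch.randn(batch, seq, vocab),
                       torch.randint(vocab, (batch, seq)))
print(loss.item())
```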

GPT-2 wasn't given a bunch of English rules to follow. It's a simple formula with fantastic results. OpenAI refused to release the full model, deeming the algorithm "too dangerous for the public".

Instead, OpenAI has released small and medium-sized versions of the language model for developers to play with. One of those developers is Adam King, who created Talk to Transformer using the medium-size version of GPT-2.
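
This isn't Adam King's code, but anyone can now poke at the same publicly released medium-size weights, for example through the Hugging Face transformers library:

```python
# One way to prompt the released medium-size GPT-2 yourself -- a sketch
# using the Hugging Face transformers library, not Talk to Transformer's
# actual implementation.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-medium")
model = GPT2LMHeadModel.from_pretrained("gpt2-medium")

prompt = "Our company's business plan is simple:"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sampling (rather than greedy decoding) keeps the continuation varied.
output = model.generate(
    input_ids,
    max_length=60,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```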

Inevitable/Human

Can you imagine the abilities of the large GPT-2 model? Only time will tell.
