Since the emergence of GPT-3, a flood of startups has been battling for users' attention, selling AI copywriting services for various formats.

Emails, blog posts, landing pages, ad copy, sales communication...

Remember when most of the translation and proofreading work was still done by humans?

Then arrived DeepL, Google Translate, and Grammarly.

Democratizing and accelerating literacy and language for everybody.

Guess what?

What happened there is now happening with text creation.

The source layer of the whole value chain.

Let’s discuss what is happening with AI copywriting, why GPT-3 is just the beginning, and how most of these AI copywriting tools are not doing much beyond building some pretty face on top of the same core.

Jarvis, Copy AI, CopySmith, CopyShark, Writesonic, Anyword...

They came, saw GPT-3, and are now trying to win you as a prospective user.

Some AI copywriting tools have grown incredibly fast, building strong revenue runways and raising money from fancy investors.

Often overpromising with hundreds of templates that do not really deliver what they are said to, and, when boiled down, they are all the same: they have no power to influence or improve the text quality at their very core.

They all are dependent on...

GPT-3 from OpenAI.

Capable of a lot, a true generalist powerhouse of artificial intelligence and a lighthouse for many curious minds who have now flocked to the field of natural language generation.

But GPT-3 is only the beginning of an incredibly fast-developing technology branch.

Suddenly there was GPT-3, and the danger became an opportunity.

Bear with me for a short technical introduction. Let me tell you more about the meaning of those seemingly secretive letters.

G P T - 3 stands for Generative Pre-trained Transformer v3.

Let me walk you through it - word by word.

Starting off easy: “generative” means that the AI model is meant to create.

With “transformer”, we are not talking about Autobots the size of a 3-story building as we know them from movies.

Transformers are a relatively new development in the field of deep learning and artificial intelligence.

A technique giving a machine the capability to learn the patterns of human language and continue them, creating sentences word by word.

A transformer model alone is like a newborn: not able to do much until it learns from its parents.

Transformers learn by reading through billions of examples to become creative.

That’s why it is called “pre-trained” on a large body of human language.

We talk about trillions of words.
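To make the word-by-word idea concrete, here is a toy sketch in Python. It is not a transformer (no neural network involved): it only counts which word follows which in a tiny made-up corpus, then continues a sentence word by word, the way a real model does with patterns learned from trillions of words.

```python
import random

# Toy "language model": next-word frequencies counted from a tiny corpus.
# A real transformer learns far richer patterns from trillions of words;
# this only illustrates the word-by-word generation loop.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count which word follows which (a simple bigram table).
next_words = {}
for current, following in zip(corpus, corpus[1:]):
    next_words.setdefault(current, []).append(following)

def generate(start, length=5, seed=42):
    """Continue a sentence word by word, picking a plausible next word."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length):
        candidates = next_words.get(words[-1])
        if not candidates:
            break  # no known continuation
        words.append(rng.choice(candidates))
    return " ".join(words)

print(generate("the"))
```

Every word in the output follows its predecessor somewhere in the training corpus; scale the corpus up by a few billion times and replace the bigram table with a neural network, and you have the intuition behind GPT-3.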

I will also talk about models here and there.

We are not doing high fashion here, so try to think of a model as a snapshot of something.

Including all experiences, learnings, and developments it went through.

Think about how you woke up this morning: this single moment reflects your whole life of experience and learning, which made you the person you are today.

Imagine you take a snapshot of this moment with all the previously influencing events. This is a model pre-trained on your experiences.

Let me know if this made the GPT-3 model a little bit more understandable to you in the comments or in a DM.

We are TextCortex, an AI startup that takes away 80% of your writing work.

The history of OpenAI

Why is OpenAI important in this context?

They were the lighthouse capable of gathering crucial interest in the field.

Drawing people from all backgrounds into the space: from inquisitive academics, serious corporations, and creative entrepreneurs all the way to individuals, all of whom want to be part of it when transformer technology makes history.

Let’s start this storyline with those who have created GPT-3.

End of 2015 - The story starts, as with so many innovative developments we see today, with the technoking himself, Elon Musk.

Together with Sam Altman and a few other investor buddies, they pledged a whopping US$ 1 billion to build a non-profit organization that would freely collaborate with the research community and spearhead the ethical development of AI.

2019 was a truly eventful year for OpenAI.

After Elon Musk left the organization, Microsoft got involved with another billion USD, transforming the company into a commercially driven, for-profit organization.

While OpenAI got ready to go to market, one of their models started to make waves, back then proclaimed to be too dangerous to release to the world.

https://www.theverge.com/2019/11/7/20953040/openai-text-generation-ai-gpt-2-full-model-release-1-5b-parameters

Sidenote: The dangers mentioned above were one of the reasons for me to move into this space while I was still studying machine learning at university.

https://www.linkedin.com/posts/lambersy_gpt-transformer-nlg-activity-6909166441522987008-u_mS?utm_source=linkedin_share&utm_medium=member_desktop_web

In 2020 they released GPT-3 to the world.

An AI model almost 116 times larger than its predecessor with 175 bn. parameters*.

Initially, only a handful of selected institutions had access to experiment with it, until recently, when they opened GPT-3 up to everybody willing to pay the expensive buck for its capabilities.

* Think of parameters as the size of the AI brain complementing and helping you.

The emergence of AI copywriting

GPT-3 has motivated a lot of product builders to create a pleasant-looking user interface on top of GPT-3. I just mentioned a few of them in the beginning.

However, we observe launches of ever more identical products on a weekly basis.

After seeing the 40th tool, I stopped adding them to my knowledge base...

I saw some of them claiming to differentiate themselves by being the better “communicator to the API of GPT-3”.

What does that mean? Is it an actual point of differentiation?

API is short for application programming interface.

Sounds complex, but do you remember how the big oracles in movies always had a gatekeeper talking for them?

The API is such a thing for computers.

All these AI copywriting companies are making a pilgrimage to the gatekeeper of GPT-3 to deliver their users' desires and receive answers to them.
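As a sketch of what that pilgrimage looks like in code: every tool ultimately packages the user's prompt into an HTTP request to OpenAI's API and reads the generated text out of the response. The endpoint, model name, and parameters below reflect the public GPT-3 completions API at the time of writing and may change; the API key is a placeholder.

```python
import json
import urllib.request

# The "gatekeeper" address: OpenAI's GPT-3 completions endpoint
# (as documented at the time of writing).
API_URL = "https://api.openai.com/v1/completions"

def build_request(prompt, api_key, max_tokens=64):
    """Package a user's desire into the message the gatekeeper expects."""
    payload = {
        "model": "text-davinci-002",  # one of the GPT-3 model variants
        "prompt": prompt,
        "max_tokens": max_tokens,
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # your own API key
        },
    )

req = build_request("Write a tagline for a coffee shop.", api_key="sk-...")
# Sending it with urllib.request.urlopen(req) returns JSON whose generated
# text sits under choices[0]["text"].
```

Notice that the tool on top controls only the prompt and a few knobs like `max_tokens`; what comes back is entirely up to the oracle behind the gate.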

Back to the point of being the best communicator to the gatekeeper API of GPT-3.

While communication is crucial, it doesn’t affect the process of the oracle when creating.

Only OpenAI can improve the artificial intelligence creating for you.

However, GPT-3 is not the only oracle on the block anymore.

With the increased interest, there is a variety of different oracles available, and at TextCortex it is our daily bread to train them on our own data so they achieve their best for their respective purposes.

4 things to be careful with when using GPT-3 writing tools

As I said, we have seen these AI copywriting tools pop up like mushrooms on a rainy forest floor.

Most of them with a single motivation: to make quick money.

That becomes a problem when you consider the stability of the company behind the software.

We made some estimations: taking our power users as an example, their operations would cost around 100 USD per month for a single user.

My condolences to the ones who have launched lifetime deals only to realise that GPT-3 doesn’t come for free.

Sad for the customers who bought those, only to now stand in front of a shut door, with software that no longer answers back.

Also, large AI companion companies like Replika AI, with their 7 million users, have moved away from GPT-3 because they could not influence quality while simultaneously paying high operational costs and being locked into dependencies.

1. Don’t jump on the cheapest (lifetime) deal

As with many things in life, buying cheap is expensive.

So are lifetime deals.

I have seen many users reach out to us because at one point the software they used to work with either shut down customer support or simply stopped working because of a fundamental operational flaw.

Be careful... with paying a quick buck to somebody with dollar signs in their eyes.

2. Don’t fall for the # of templates trick

Many advertised templates are mere placeholders, meant to give you an idea of what you might be interested in.

This is a common complaint we observe. A bad “instruction to creation relevancy” or an ever-repeating creation pattern.

We actively ask our users in close conversation within our communities what they want.

When we see enough interest in a format, we dig deep, we gather data, we train our own AI transformer models and offer our communities something sustainable.

3. When everybody uses the same model, it can hurt your rankings

https://www.searchenginejournal.com/google-says-ai-generated-content-is-against-guidelines/444916/

As we speak, all these rule-based software tools, which claim to be AI but ultimately just push and spin your inputs through a cookie-cutter process, are getting hit.

Modern AI technologies like GPT-3 feel incredibly creative and natural, which gives you a false sense of security, assuming that detecting their creations would be a challenge.

However, if too many people are using a single pattern to create it can be reverse engineered.

We see providers running GPT-3 offering to write 10,000 blog articles per month for you.

Those bad actors will just leave ever more traces for detection solutions to pick up on.

Currently, we think that a relevancy through content (RTC) metric might be leveraged to detect the usage of AI-generated content.

Let me give you a comparable situation in the market for mobile phones.

If you were to offer a service, build an app, or attack a system, which one would you go for?

Apple’s iOS with approx. 27.5% or Android’s 71% global mobile operating system share?

So think about what Google’s first target will be when tackling AI-generated content.

It will work to your advantage to use purpose-driven models which are experts in their field.

Next to that, you should look for as much customisability as possible. We, for example, let you leverage different creativity engines.

4. Instability when the infrastructure updates

Those large language models are developing continuously.

Any update to or training of the underlying infrastructure affects the quality of the text output.

As most AI copywriting tools are reliant on the many variations of GPT-3, a change in that infrastructure disrupts their quality.

It takes time until they have found “their communication to the oracle again”. ;)

Why GPT-3 is only the beginning of natural language generation and how we do things differently at TextCortex

I have spoiled it quite a few times already, so you can guess my answer to whether we have reached the Olympus of natural language generation with GPT-3.

It’s a no, we haven’t.

With no hesitation, I say it again: GPT-3 was the bold step that made waves, calling individuals, academics, businesses, and governments to the field.

Nonetheless, it is only the beginning of a new era. Currently, we see an arms race in creating AIs with ever more parameters, ever-bigger brains.


Coming back to my Transformers (the movie) narrative: Microsoft’s Megatron-Turing model.

A model with a brain 3x the size of GPT-3.

Using its 530 bn. parameters to write a description for your product or your blog article is like building a coal power plant for the sole purpose of charging your smartphone.

It doesn't stop there.

Rumours emerged that GPT-4 will be in the trillions of parameters.

Google has announced a model in the trillions already.

The Chinese Wu Dao model is there as well.

Does that mean they are 10x better than GPT-3?

Does greater parameter size come with better creation?

Does great power come with great responsibilities?

One thing is sure: with great parameter sizes certainly comes great power consumption.

Building, training, and operating such large language models are a disaster for the environment.

Remember how I talked about “pre-trained” models earlier? It doesn’t help to create a brain that outsizes the amount it can learn from. The smartest toddler is limited to learning what it can observe from its environments.


We have huge respect for the OGs of the NLG space from OpenAI to DeepMind to EleutherAI to AI21 Labs.

All pouring massive efforts and resources into bringing humanity a step forward.

We are looking to join them in their quest, and we despise those who are thirsting for a quick buck off the merits of their work.

At TextCortex we like to do the hard work.

Because no lasting and competitive value is built by taking shortcuts when it concerns your very core.

The quality of creation.

And while GPT-3 is a generalist, a true jack of all trades, it is already immense in size for the use case of AI copywriting.

Just as humanity has gone from generalisation to specialisation to develop and arrive where we are today.

This will come to natural language generation as well.

That’s why at TextCortex we are not developing a one-size-fits-all model. We build small, purpose-driven models, establishing and orchestrating a network.

Because a “one size fits all” simply doesn’t fit all.

Instead of one massive trillion-parameter AI model, we build a network of 100s and 1000s of models which serve you as an expert AI writing companion.

We are training, building, testing, developing, experimenting and deploying our own AI models on our own carefully selected knowledge and data.  

Teaching them to become specialists in a world of generalists.

Instead of offering 100s of pre-written templates which often miss relevance to your needs, we release templates that we have specifically explored, gathered knowledge for, and trained our AIs to master.

Choose the proper AI writing tool for yourself

It is a simple question: what would you prefer?

A professional with one year of experience or one with 10 years of deep knowledge in the field you need?


Does a lawyer who writes reliable contracts also need to know how to write the most engaging blogs about chocolate chip muffin recipes?

With our methodology of smaller, purpose-driven AI models, we can teach them why, how, and what certain formats and writing styles look like.

For example, our long-form model was trained on more than 10 million highly engaging blogs.


They understand that long-form content consists of an engaging intro to an informative main part with a conclusion bringing everything to a point.

Adapting AI models to our user needs is our daily bread.

Every day, we strive to enlarge this network of AI experts and bring them into every text box you need them in.

You are an avid writer?

Check out our Chrome extension, hyper-charging your creation in every text box there is.

You are a developer looking for an NLG AI solution? Approach us to test our API.

You want to contribute data to our cause? Please be our guest.

Let’s build creative AIs with purpose!