- What is NLG?
- What is GPT-3 and why is it so popular?
- What engines does GPT-3 offer?
- “Training” and self-improvement of GPT-3
- What are the capabilities of GPT-3?
- Can GPT-3 replace humans?
- Will GPT-3 affect SEO and content marketing?
- Text generated using GPT-3 in CONTADU
- A short tutorial on generating text in CONTADU
Although we may not fully realize it, Artificial Intelligence is getting ever closer to us. We have already mentioned it in our previous articles, along with examples of where we encounter AI in everyday life.
In this article, we will focus on one aspect of AI: the ability to generate text using a language model.
We’ll look at how writing text with AI works and what its benefits are. We will also consider whether AI is already developed enough in this area to replace humans, and what the forecasts for its future development are. And of course, we will check how text generation can help with content marketing.
What is NLG?
NLG (Natural Language Generation) is the process of turning computer data into natural language. To understand how NLG works, we can think of it as something like turning thoughts into writing: many factors contribute to our ability to produce a coherent and logical statement. Algorithms work similarly: from a huge database of texts, they select and analyze, on the basis of statistical patterns, the most suitable set of words, which they then arrange in the proper linguistic order (depending on the language of the content we want to obtain).
This is how NLG is built: by training a statistical model using machine learning, usually on a huge amount of text freely available on the web. Solutions that enable NLG are gaining more and more popularity, and text generation is finding applications in various industries related to content creation. We can distinguish several models that deal with text generation: the Megatron model from Microsoft and Nvidia, referred to as the largest artificial neural network in the world; the Chinese Wu Dao model; and the currently most famous NLG model, GPT-3, developed by researchers at OpenAI.
What is GPT-3 and why is it so popular?
GPT-3 (Generative Pre-trained Transformer 3) is a language model created by OpenAI, a company co-founded by Elon Musk. Thirty researchers worked for a year on the development of the latest version. The main task of the model is to generate textual content, or more precisely: the purpose of GPT-3 is to predict the next word in a text while maintaining a logical flow. The largest model, DaVinci, has been “trained” on 570 gigabytes of text (filtered by OpenAI researchers from 45 terabytes of raw data!), which is a truly huge amount of information.
These texts include data from the Common Crawl collection (about 60%), as well as millions of domains selected by OpenAI, such as Reddit, the BBC, and Wikipedia. It is this huge amount of data that allows GPT-3 to understand our language. In terms of parameters, the largest available engine, DaVinci, has as many as 175 billion of them (the previous model, GPT-2, had 1.5 billion), which allows the model to recognize the context of a text and better understand our intention.
What engines does GPT-3 offer?
The GPT-3 engines, or models, can both generate and understand natural language. There are currently four engines available: DaVinci, Curie, Babbage, and Ada. The engines differ in capability and power, which means they suit different tasks. DaVinci is the most advanced and gives the best results when working with text, while Ada is the fastest and cheapest of the four. It all depends on the purpose for which we want to use the models offered by OpenAI.
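In code, picking an engine comes down to a single parameter in the request. Below is a minimal Python sketch in the style of OpenAI’s (legacy) Completion API; it only assembles the request parameters rather than calling the API, and the prompt and default values are illustrative assumptions:

```python
# Illustrative request settings in the style of OpenAI's legacy Completion API.
# Each engine trades capability for speed and cost.
ENGINES = ["davinci", "curie", "babbage", "ada"]  # most capable -> fastest/cheapest


def build_request(prompt, engine="davinci", max_tokens=100, temperature=0.7):
    """Assemble the parameters for a text-generation request."""
    if engine not in ENGINES:
        raise ValueError(f"unknown engine: {engine}")
    return {
        "engine": engine,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }


# Use "ada" for quick, cheap tasks and "davinci" for the best-quality text.
request = build_request("Write a short product description for a coffee grinder.",
                        engine="ada")
print(request["engine"])
```

In a real integration, the same dictionary of settings would be sent to the API; here it simply shows that switching engines is a one-word change.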
“Training” and self-improvement of GPT-3
GPT-3 is an autoregressive model, so it can improve itself with the help of statistical prediction. It is also worth adding that the “training” process, i.e. data analysis based on statistical patterns, was not supervised by OpenAI: the company did not introduce any filters that would limit or steer the training. This decision has its advantages (the model learns to understand the world much as a human does) and disadvantages (much of the content available on the Internet is harmful and does not contribute to the model’s development), but remember that ultimately it is up to us how we use the ability to generate texts.
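To see what “predicting the next word from statistics” means in practice, here is a deliberately tiny Python sketch: a bigram model that, like an autoregressive model, generates text one word at a time by sampling each next word from counts gathered over a training corpus. GPT-3 does this with a neural network of billions of parameters rather than raw counts; the ten-word corpus below is purely illustrative.

```python
import random
from collections import defaultdict

# A toy autoregressive generator: each new word is predicted purely from
# statistics of the training corpus (here, which words follow which).
corpus = "the model predicts the next word and the model improves".split()

# Count, for every word, which words have followed it in the corpus.
followers = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev].append(nxt)


def generate(start, length, seed=0):
    """Grow a sentence word by word, sampling each next word from the counts."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length):
        options = followers.get(words[-1])
        if not options:          # no known continuation: stop early
            break
        words.append(rng.choice(options))
    return " ".join(words)


print(generate("the", 5))
```

The principle scales up: replace the bigram counts with a transformer’s learned probabilities and you have the core loop of GPT-3’s text generation.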
What are the capabilities of GPT-3?
GPT-3 can generate, broadly speaking, everything that has a linguistic structure. Currently, the model works best with English, but it is likely that in the future users of other languages will be able to benefit more freely from OpenAI’s models. GPT-3 can answer questions, summarize longer texts, and write articles on any topic, blog posts and even poems. What’s more, it can even create the computer code needed to build an application or website, or generate the chords and melody of a song. All you need to do is enter a prompt, and the algorithms will do the rest for you.
What is the quality of GPT-3’s output? As OpenAI’s CEO, Sam Altman, himself said: “The hype around GPT-3 is too big. AI will change the world, but GPT-3 is just a teaser.” Even so, generating text with OpenAI’s technology is impressive today: thanks to GPT-3 we can produce an article that does not differ much from a text written by a columnist or journalist (this was the case when The Guardian published an article written entirely with GPT-3, which required no more editing than a piece written by a human). Remember, however, that the possibilities offered by GPT-3 still require human supervision.
Can GPT-3 replace humans?
Well, for now journalists, copywriters, writers and programmers can rest easy: GPT-3 is still not advanced enough to replace everyone who works with text overnight. Of course, we all know how fast progress is in this field, and we have seen situations like Microsoft laying off a large group of employees (50 people in the US and about 30 in the UK) responsible for its news department and entrusting their tasks to artificial intelligence. But even in that case, someone was constantly watching over the results of the AI’s work. Without such control, it is still impossible to use text generation efficiently.
The future will show whether machines will write texts and completely take over the role of journalists or copywriters. One thing is certain: artificial intelligence will keep changing (because it constantly changes) the world we know today. Take language models as an example: GPT-3 can already do a lot, and it has been joined by models such as Megatron, developed by Microsoft and Nvidia, with 530 billion parameters (for comparison, GPT-3 has 175 billion), or the Chinese Wu Dao 2.0, created under the leadership of the Beijing Academy of Artificial Intelligence, with an unimaginable 1.75 trillion parameters! For now, none of these models can be used in place of human work, but all indications are that the future will belong to artificial intelligence.
Will GPT-3 affect SEO and content marketing?
As usual with questions about the future, it is difficult to give an unambiguous answer. However, it is hard to believe that text generation will not affect content marketing in the near future. The next year may show the direction in which the market is going, but there is no doubt that tools such as GPT-3 or DALL-E (another OpenAI model, which creates images from text descriptions) will strongly influence the industry’s development.
It is very possible that in the years to come, generating content that complies with the principles of semantic SEO will be the norm and will come as no surprise to anyone. Today, however, when we are just entering the era of artificial intelligence, it is not yet obvious.
Over time, the use of GPT-3 and other language models in various industries will become more common, and only then will it be possible to estimate how new technologies have changed working with text and the entire content marketing world. That is why it is worth securing access to the latest language models today, so as not to sleep through the start of the revolution. We can already use GPT-3’s capabilities in CONTADU. Below, we will show how to quickly create a text that can then be used in a content marketing strategy.
Text generated using GPT-3 in CONTADU
At the outset, it should be noted that this method of generating texts is still experimental. What does that mean? Simply that it is far from perfect: even if the text looks grammatically correct, it may contain inaccuracies and logical or factual errors. We, the users, have to decide whether the text we receive is satisfactory. For that to be the case, appropriate prompts are needed, which allow GPT-3 to meet our expectations as well as possible.
CONTADU is already working on prompt templates that will make it easier to convey the user’s intentions to the generator. The generated text still needs to be edited, and only at the end should we focus on SEO optimization. Let’s also remember that OpenAI imposes limits on the amount of generated content, to make sure that each text is still created with human involvement. It’s time to generate some text!
A short tutorial on generating text in CONTADU
Let’s start by going to the AI generation assistant, which can be found in the Content Writer module. Then, in the window, enter a text fragment on the basis of which the algorithms will generate the next part of the text.
We mentioned earlier that GPT-3 currently works best in English. However, in CONTADU, thanks to advanced machine translation, we can generate texts in over 120 languages. Let’s try it and see what the result will be.
After clicking, the algorithms prepare text that we can add to our article. In this example, we obtained the following text:
Now that the text has been generated, we can either add it to our article (by clicking the green “Use text” button) or, if this is not what we wanted, try again. Let’s assume that we like the text, so let’s add it to our draft. We can also make changes (for example, replace the word “days” with “months”).
Now let’s try to expand the text fragment. For example, we want to add a few advantages of using artificial intelligence in content marketing:
So we select the text, choose the option “[All] Expand” …
and we generate the rest of the list of benefits.
It is also worth paying attention to the “temperature” of the generated text. We can set it by clicking the thermometer icon while generating or extending the text. The lower the temperature, the more predictable and conservative the text, but it is also much shorter and less original. The higher the temperature, the longer and more original the text, but it may also contain more errors. So it’s up to us what kind of text we want to get.
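Under the hood, temperature rescales the scores the model assigns to candidate next words before one of them is sampled. A minimal Python sketch shows the effect (the scores themselves are made up for illustration):

```python
import math

# Temperature divides the model's scores before they are turned into
# probabilities. Low temperature sharpens the distribution (safe, repetitive
# choices); high temperature flattens it (more varied, riskier choices).


def softmax_with_temperature(scores, temperature):
    scaled = [s / temperature for s in scores]
    m = max(scaled)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]


scores = [2.0, 1.0, 0.5]   # illustrative model scores for three candidate words

cold = softmax_with_temperature(scores, 0.5)  # low temperature: peaked
hot = softmax_with_temperature(scores, 2.0)   # high temperature: flatter

# The top word dominates at low temperature and loses ground at high temperature.
print(round(cold[0], 2), round(hot[0], 2))
```

This is why low-temperature output reads as safe and repetitive while high-temperature output is more surprising: the sampler is literally drawing from a sharper or flatter probability distribution.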
After creating the text with the assistant, we should optimize it according to SEO rules so that it can achieve good results on Google. The Content Writer module makes this easy.
Well, that’s all we need to know about text generation for now. If you want to create industry articles, blog posts, or product descriptions for an online store using GPT-3, we invite you to test CONTADU.