OpenAI has taught the GPT-3 generative language neural network not only to complete text, but also to edit it. For example, it can be asked to rewrite a text, with the necessary changes described in plain language. A description of the new feature is available on the company's blog.
GPT-3 is a general-purpose generative language model developed by OpenAI in 2020. It was pre-trained on 570 gigabytes of unlabeled text from the Internet, which gave it a general understanding of what meaningful text looks like in the language it works with. After such pre-training, the model can be taught a specific task from just a few examples, and the task need not be purely textual in nature: for instance, it turned out to be able to perform simple arithmetic operations.
GPT-3 can write texts, but the original model's ability in fact comes down to predicting the next word in a sentence, notes N+1. The developers have gradually refined the neural network, mainly adapting it to practical tasks, including writing code from text instructions. They have also taught it new capabilities: for example, GPT-3 was recently taught to search the Internet for answers to questions and to back up its claims with references to sources.
Now the OpenAI developers have taught the neural network not only to generate and continue text, but also to edit it. Editing works in two modes. In the editing mode proper, the user gives the model a text or a piece of code and describes in plain language what needs to be done with it; for example, the algorithm can be asked to rewrite a sentence in the first person.
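At the time of the beta announcement, the edit mode was exposed through a dedicated API endpoint that takes an `input` text and a plain-language `instruction`. The sketch below only assembles the request payload, so it runs offline; the endpoint path `/v1/edits` and the model name `text-davinci-edit-001` are assumptions based on the beta release and may change.

```python
# Minimal sketch of a request body for OpenAI's edit mode (beta).
# Assumptions: POST /v1/edits endpoint, "text-davinci-edit-001" model name.
# No network call is made here; we only build the JSON payload.

def build_edit_request(text: str, instruction: str) -> dict:
    """Assemble the JSON body for an edit-mode request."""
    return {
        "model": "text-davinci-edit-001",   # edit-capable model (assumed name)
        "input": text,                       # the text to be edited
        "instruction": instruction,          # plain-language description of the change
    }

payload = build_edit_request(
    "She walked to the store and bought some bread.",
    "Rewrite this sentence in the first person.",
)
print(payload["instruction"])
```

In actual use, this payload would be sent with an API key to the edits endpoint, and the model would return the rewritten text.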
In the second mode, the model does not edit the text but inserts an addition at a given place; for this, it must be given the text that comes before and after that place. For example, GPT-3 can be asked to write a logical transition between two paragraphs of a text.
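The insertion mode described above was exposed in the beta API by passing the text before the gap as the prompt and the text after it as a suffix. The sketch below builds such a payload offline; the parameter name `suffix` and the model name `text-davinci-002` are assumptions based on the beta release.

```python
# Minimal sketch of a request body for OpenAI's insertion mode (beta).
# Assumptions: the completions endpoint accepts a "suffix" parameter,
# and "text-davinci-002" supports insertion. No network call is made.

def build_insert_request(before: str, after: str, max_tokens: int = 64) -> dict:
    """Assemble the JSON body for an insertion request:
    the model fills in text between `before` and `after`."""
    return {
        "model": "text-davinci-002",  # insertion-capable model (assumed name)
        "prompt": before,             # text preceding the insertion point
        "suffix": after,              # text following the insertion point
        "max_tokens": max_tokens,     # cap on the generated insertion length
    }

payload = build_insert_request(
    "The first paragraph ends here.",
    "The second paragraph begins here.",
)
print(sorted(payload))
```

The model is expected to generate a passage that reads coherently between the given prompt and suffix, such as a transition linking two paragraphs.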
NIXSolutions notes that OpenAI has trained the new capabilities for both text and code. They have been tested for some time in the code-writing GitHub Copilot service and are now available in the beta API for GPT-3 and Codex.