I recently heard of GPT-3 and I don't understand how the attention mechanism and the transformer encoders and decoders work. I heard that GPT-3 can build a website from a description and write perfectly factual essays. How can it understand our world using algorithms and then recreate human-like content? How can it learn to understand a description and program in HTML?
2 Answers
GPT-3 (and the like) doesn't really have any understanding of the semantics or pragmatics of language. However, it is good at constructing text similar to the content created by people, as long as the text and the concepts involved are not too "complicated".
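To make the "attention" part of the question concrete: the core operation inside a transformer is scaled dot-product attention, in which each token's query vector scores every key vector and the resulting weights mix the value vectors. Below is a minimal NumPy sketch of that single operation (toy random inputs, no learned parameters, no multi-head or masking machinery):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Each query scores every key; the scores become weights over the values.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (seq, seq) similarity matrix
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # weighted mix of value vectors

# Toy example: 3 tokens, 4-dimensional vectors.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one mixed vector per token
```

In a real model, Q, K and V come from learned linear projections of token embeddings, and many such attention layers are stacked; but the mechanism itself is just this weighted averaging, which is why "understanding" is not required for it to work.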
Two years after the original question was asked, I feel it is time to provide an updated answer:
"How can it understand our world using algorithms and then recreate human-like content?"
GPT-3, GPT-4 (transformers, ChatGPT and the like) do not understand our world. They produce answers based on the information available in their training datasets, plus any additional dataset used to fine-tune the model. Those answers recreate human-like content even though the models do not understand, or hold a representation of, the world.
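Stripped of the neural network, the basic recipe described above is "predict the next token from statistical patterns in the training data". A toy illustration using literal bigram counts over a made-up corpus (real models use learned representations over subword tokens, not counts, but the point stands: the output reflects the training distribution, not understanding):

```python
from collections import Counter, defaultdict

# Hypothetical tiny corpus standing in for a "training dataset".
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count which word follows which word (a bigram table).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    # Return the most frequent continuation seen in training.
    return follows[word].most_common(1)[0][0]

print(predict_next("sat"))  # "on" -- "sat" was always followed by "on"
```

The model has no idea what a cat or a rug is; it simply reproduces the most likely continuation it has observed, which is exactly why the generated text is human-like without any underlying world model.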