How is this untrue? Generative pre-training is literally training the model to predict what might come next in a given text.
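The "predict what might come next" objective can be sketched with a toy bigram count model. This is an illustration only, not how real LLMs are implemented (they use neural networks trained with cross-entropy loss); all names here (`corpus`, `predict_next`) are made up for the example.

```python
from collections import Counter, defaultdict

# Toy stand-in for the generative pre-training objective:
# learn, from text, which token tends to follow which.
corpus = "the cat sat on the mat the cat ran".split()

# Count how often each token follows each other token.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(token):
    """Return the most frequent continuation seen in training."""
    options = follows[token]
    return options.most_common(1)[0][0] if options else None

print(predict_next("the"))  # "cat" follows "the" more often than "mat"
```

A real model does the same thing in spirit: it is scored on how well it predicts the next token, just with a learned distribution over a whole vocabulary instead of raw counts.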
- 0 Posts
- 3 Comments
Joined 3 years ago
Cake day: June 4th, 2023
We have the term AGI because we sometimes want to communicate something more specific, and AI is too broad a term.
They never claimed that it was the whole thing. Only that it was part of it.