
When ChatGPT is generating an answer to my question, it generates it word by word.
So I actually have to wait until I get the final answer.

Is this just for show?
Or is it really generating the answer in real time, word by word, without yet knowing what the next word will be?

Why does it not give the complete answer text all at once?


2 Answers


ChatGPT is a conversational agent based on GPT-3.5, which is a causal language model. Under the hood, GPT works by predicting the next token given an input sequence of words. So yes, at each step a single word is generated, taking into consideration all the previous words.

See for instance this Hugging Face tutorial.
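
A minimal sketch of that next-token loop, using the small public GPT-2 model from Hugging Face transformers as a stand-in for GPT-3.5 (whose weights are not public). The greedy argmax choice is just for illustration; ChatGPT samples from the distribution instead:

```python
# Rough sketch of autoregressive generation, not ChatGPT's actual code:
# GPT-2 (public) stands in for GPT-3.5, and greedy decoding replaces sampling.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

input_ids = tokenizer("Why does ChatGPT answer", return_tensors="pt").input_ids

with torch.no_grad():
    for _ in range(20):                      # produce 20 tokens, one per step
        logits = model(input_ids).logits     # a score for every token in the vocabulary
        next_id = logits[0, -1].argmax()     # pick the most likely next token
        input_ids = torch.cat([input_ids, next_id.view(1, 1)], dim=-1)
        print(tokenizer.decode(next_id.item()), end="", flush=True)  # appears token by token
```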

To further explain: while outputting an entire sequence of words in one step is in principle possible, it would require a huge amount of data, since the probability of any particular sequence in the space of all possible sequences is extremely small. Building a probability distribution over half a million English words, by contrast, is feasible (in practice only a tiny fraction of those words is used frequently).
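
To get a feel for the scale difference, here is a back-of-the-envelope calculation; the vocabulary size used is GPT-2's, with GPT-3.5's assumed to be of the same order of magnitude:

```python
import math

vocab_size = 50_257   # GPT-2's token vocabulary; GPT-3.5's is of similar magnitude
seq_len = 20          # even a short, 20-token answer

print(f"choices for the next single token: {vocab_size:,}")
print(f"possible {seq_len}-token sequences: ~10^{seq_len * math.log10(vocab_size):.0f}")
# A distribution over ~50k tokens is learnable; one over ~10^94 sequences is not.
```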

On top of that, there may be a bit of theatrical effect, to simulate the AI "typing" the answer.

Rexcirus

Why does ChatGPT not give the answer text all at once?

Because ChatGPT is autoregressive (i.e., it generates each new word by looking at the previous words), as Rexcirus mentioned.

Is this just for show?

On https://beta.openai.com/playground, output words/tokens are displayed faster when using smaller models such as text-curie-001 than larger ones such as text-davinci-003. I.e., the inference time does seem to impact the display time.

https://twitter.com/ArtificialAva/status/1624411499375603715 compared the display speed of ChatGPT vs. ChatGPT Plus vs. ChatGPT Turbo Mode and showed that ChatGPT Turbo Mode is more than twice as fast at displaying the output, which further indicates that ChatGPT shows its response word by word because of its backend (computation time + autoregressive generation).
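
A small illustration of why display time tracks inference time: with a streaming interface, each token is shown the moment the model emits it. This sketch assumes a recent transformers release that ships TextStreamer, with GPT-2 again standing in for the non-public ChatGPT models:

```python
# Streamed generation: tokens are printed as soon as the model produces them,
# so a slower model makes the text appear more slowly on screen.
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
streamer = TextStreamer(tokenizer, skip_prompt=True)  # print only the newly generated text

inputs = tokenizer("Why does ChatGPT answer word by word?", return_tensors="pt")
model.generate(**inputs, streamer=streamer, max_new_tokens=40)  # text appears incrementally
```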

Franck Dernoncourt