Yes, because at the beginning there was tons of room for improvement.
I mean, take OpenAI's own word for it: GPT-5 isn't showing the same improvement over 4 that 4 showed over 3, and it's costing a fortune and taking forever to train.
Logarithmic curve, it seems.
Also, if we run out of data to train on, that's it.