In yottaFLOPs (10^24), how much compute will GPT-4 be trained with?
Mini · 14 · Ṁ651 · 2027 · 35 expected

Resolves N/A if the number is not public or calculable from public information, or if GPT-4 is not released by market close.


It seems very unlikely that the release of GPT-4 will resolve this. However, the market does not close until 2027, and I will leave it open until then in case the information is either (credibly) leaked or OpenAI decides to release it after the fact. If neither of those occurs, the market will resolve N/A.

(Epistemic status: I'm doing all this calculation from memory so might be wrong.) GPT-3 was trained on 3.2e23 FLOPs (the general calculation is C = 6ND, where N is the number of parameters and D is the dataset size). I think we should expect around twice the parameter count and 50x the dataset size, in which case 3.2e25 FLOPs should be reasonable for GPT-4. This would imply a resolution of 32 to this question. 150 is way too high imo.
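A quick sketch of that arithmetic, for anyone who wants to check it. The GPT-3 figures (~175B parameters, ~300B training tokens) are the commonly cited values; the 2x and 50x scaling factors are the commenter's assumptions, not anything OpenAI has confirmed:

```python
def training_flops(n_params: float, n_tokens: float) -> float:
    """Approximate training compute via the standard C = 6 * N * D rule."""
    return 6 * n_params * n_tokens

# Commonly cited GPT-3 figures (assumptions, from public reporting).
gpt3_params = 175e9   # ~175B parameters
gpt3_tokens = 300e9   # ~300B training tokens

gpt3_flops = training_flops(gpt3_params, gpt3_tokens)
print(f"GPT-3: {gpt3_flops:.2e} FLOPs")  # ~3.15e23, matching the 3.2e23 figure

# Commenter's guess: GPT-4 at ~2x the parameters and ~50x the data (100x compute).
gpt4_flops = training_flops(2 * gpt3_params, 50 * gpt3_tokens)
print(f"GPT-4 guess: {gpt4_flops:.2e} FLOPs")            # ~3.15e25
print(f"In yottaFLOPs (1e24): {gpt4_flops / 1e24:.0f}")  # ~32
```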
