What will be true of Grok 3's parameter counts?
xAI recently open-weighted Grok 2, revealing that it has 115B active parameters and 270B total parameters (per internet claims that I have not vetted closely). xAI also plans to open-weight Grok 3 in about six months.
Grok 3 was trained in a data center containing 100k+ H100 GPUs; xAI claims that Grok 3 was pre-trained with more than 10x the FLOP of Grok 2 (training FLOP scales roughly in proportion to active parameters times the number of training tokens).
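As a rough sanity check on what that 10x claim could imply, here is a minimal sketch using the standard C ≈ 6·N·D dense-transformer approximation; the data multipliers are hypothetical and none of these numbers come from xAI:

```python
# Back-of-the-envelope: training compute C ~ 6 * N_active * D_tokens,
# so a fixed FLOP multiplier can be split between more active params and more data.

GROK2_ACTIVE = 115e9      # reported Grok 2 active params (unvetted internet claim)
FLOP_MULTIPLIER = 10      # xAI's claimed ">10x" pre-training compute for Grok 3

def implied_active_params(flop_multiplier: float, data_multiplier: float) -> float:
    """Active params Grok 3 would need if its training data grew by data_multiplier."""
    return GROK2_ACTIVE * flop_multiplier / data_multiplier

# Illustrative scenarios only (data multipliers are assumptions):
for data_x in (1, 2, 5, 10):
    n = implied_active_params(FLOP_MULTIPLIER, data_x)
    print(f"data x{data_x:>2}: ~{n / 1e9:.0f}B active params")
```

For example, if the training data also grew ~5x, the 10x FLOP claim would be consistent with roughly ~230B active parameters; if data grew 10x, active parameters could stay near Grok 2's level.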
Resolves N/A if Grok 3 is not open-weighted within one year and we never get confirmation or a credible report of Grok 3's parameter counts by that date.
Confirmation via Elon should be sufficient to resolve.
https://www.reddit.com/r/mlscaling/comments/1oxz078/grok_5_in_q1_of_2026_6_trillion_parameter_model/