Gigacasting
closes Jul 1
Is GPT-4 a mixture of experts?
37% chance

MoE

cos bought Ṁ0 of YES

It's definitely an ensemble model, but I don't think it's a mixture of experts (i.e., I believe it consistently accesses all subnetworks without gating).

firstuserhere

From the GPT-4 technical report: "Given both the competitive landscape and the safety implications of large-scale models like GPT-4, this report contains no further details about the architecture (including model size), hardware, training compute, dataset construction, training method, or similar."

Shai Natapov

Your markets are often very vague, which is why I avoid betting on them.

Isaac King

@ShaiNatapov I think that's intentional; Gigacasting seems to enjoy controversy.

Isaac King

And given how many traders their markets frequently attract, I guess clear resolution criteria aren't something most traders actually care about? I find that weird too.

Gigacasting is predicting NO at 51%
Noa Nabeshima is predicting YES at 57%

Forrest Taylor

What does MoE mean?

Noa Nabeshima is predicting YES at 49%

@ForrestTaylor The model includes something like this: https://arxiv.org/abs/1701.06538

Noa Nabeshima is predicting YES at 49%

@ForrestTaylor Mixture of Experts

Robin Green

What does this mean? Basically a bunch of experts in a trenchcoat, pretending to be an AI? Don't be silly.