
12B-ish; anywhere from 10-15B is fine. Lower is also cool: a 1B parameter model that matches performance will resolve YES.
@dmayhem93 I get that it's the resolution criteria, it just seems very unlikely - OpenAI is super conservative with what they release


@Issc They've released some non-LLMs like Whisper and CLIP. The last LLM was GPT-2. https://github.com/openai
I didn't think it was likely either, but Mira buying ~1,500 makes me reconsider

@sucralose I think they might just be trying to spike the market idk
All good points actually - still sitting on NO because I think they're aiming more toward a safe and closed environment, especially with regulation kicking up


@rockenots Rounded, yes - mostly to account for the awkward reporting of model sizes, which occasionally excludes embedding layers or rounds the parameter count.
@dmayhem93 clearly I should have read the description, but having too many parameters seems like an odd complaint
