Meta presented OPT: Open Pre-trained Transformer Language Models on May 2, 2022. At the time, they publicly released several models ranging from 125M to 66B parameters, and stated that access to the 175B model (OPT-175B) would be granted to researchers upon request.
The paper claims the 175B parameter model was "comparable to GPT-3, while requiring only 1/7th the carbon footprint to develop."
Will the weights to OPT-175B be available online by May 2, 2023?
Resolves YES if there is a public release by Meta, or if a leak or hack makes it possible to obtain the weights.
A similar Metaculus market exists, though with an earlier end date:
https://www.metaculus.com/questions/10874/opt-175b-hacked-by-2023/
Matching the terms of the Metaculus market, this resolves YES even if the leaked/hacked weights are sold rather than made freely available to the public.
Dec 3, 10:39pm: Will the weights to the OPT-175B model be widely available by May 2, 2023 → Will the weights to the OPT-175B model be widely available before May 2nd, 2023
@traders Reopening in hopes that someone will post the answer, since the creator is inactive. This will likely be a mod resolve. Note that this is about things before the date in the title. Please post sources!
@EdwardKmett Can this be resolved now? I haven't heard any news to suggest the weights were released, so I assume it should resolve NO, but I'm not 100% certain.
@BenCottier The README in the OPT repo still only says "request access here" for 175B: https://github.com/facebookresearch/metaseq/blob/08cfa296d9b29494f7ae771c500880a78b908ca4/projects/OPT/README.md