
A pure binary neural net is a neural network represented as pure combinatorial logic. Naively unrolling multi-bit floating-point or integer multiplication to binary does not count; the weights and activations themselves must be binary. I will arbitrarily declare that integer weights of 3 bits or fewer are permitted to be unrolled, but note that the whole model, end to end, must be reduced to logic gates.
For example, [Unrolling Ternary Neural Networks](https://arxiv.org/abs/1909.04509) almost satisfies the definition, but it uses patches and hence does not quite count. (Also, I'm interested in language models, not image models.)
It does not matter how the model was trained, only that it has adequate accuracy when in binarized form.
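To make the definition concrete, here is a minimal illustrative sketch (my own, not a required construction) of a layer whose weights and activations are all binary: an XNOR-popcount layer with a sign activation. Because every operation is XNOR, popcount, and a threshold compare, it unrolls directly to logic gates.

```python
# Illustrative sketch only: one binary layer where weights and activations are
# in {-1, +1}, encoded as bits (1 -> +1, 0 -> -1). The matmul reduces to XNOR
# plus popcount plus a threshold, i.e. pure combinatorial logic once unrolled.
import numpy as np

def binary_layer(x_bits: np.ndarray, w_bits: np.ndarray) -> np.ndarray:
    """x_bits: (n_in,) of {0,1}; w_bits: (n_out, n_in) of {0,1}."""
    n_in = x_bits.shape[0]
    # XNOR counts agreements; a {-1,+1} dot product equals 2*agreements - n_in.
    agreements = np.sum(w_bits == x_bits, axis=1)   # popcount of XNOR
    preact = 2 * agreements - n_in
    # Sign activation keeps the output binary (>= 0 maps to bit 1).
    return (preact >= 0).astype(np.uint8)

# Example: 4 binary inputs, 2 binary outputs.
x = np.array([1, 0, 1, 1], dtype=np.uint8)
W = np.array([[1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=np.uint8)
print(binary_layer(x, W))  # -> [1 1]
```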
Resolves YES if a pure binary language model exists with bits per byte on The Pile at or below GPT-2's (1.225 BPB). It does not need to be publicly accessible as long as it is reported by a credible source (DeepMind, OpenAI, EleutherAI, etc.).
Resolves NO if there is no credible report of such a model.
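For reference, here is a hedged sketch of how the bits-per-byte (BPB) number can be computed from a language model's cross-entropy loss; the exact evaluation protocol used for The Pile is an assumption on my part.

```python
# Sketch: convert summed negative log-likelihood (in nats) over an evaluation
# text into bits per byte. BPB = (total loss / ln 2) / number of UTF-8 bytes.
import math

def bits_per_byte(total_loss_nats: float, n_bytes: int) -> float:
    """total_loss_nats: summed NLL (natural log) over the text;
    n_bytes: UTF-8 byte length of that text."""
    total_bits = total_loss_nats / math.log(2)
    return total_bits / n_bytes

# A model averaging ~0.85 nats per byte lands near GPT-2's 1.225 BPB:
print(bits_per_byte(0.85 * 1_000_000, 1_000_000))  # ~1.226
```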