
Is BPE the reason GPT-4 cannot rhyme or do arithmetic well?
Resolved N/A (Mar 19)
I kind of want to run an experiment to see whether the reason GPT-4 cannot rhyme or do arithmetic well is Byte-Pair Encoding (BPE). I think this could be done by training gpt2-small with and without BPE. The problem is that gpt2-small is not as capable, so the result won't be as impressive. It's also possible a literature search will turn up someone who has already tried this. I am buying into the YES market to incentivize myself to actually work on this, but I don't know if that's a good strategy.
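For intuition on why BPE might matter here, a minimal sketch, assuming the Hugging Face `transformers` library: GPT-2's BPE tokenizer can split rhyming words into unrelated subword pieces, whereas a character-level encoding (the hypothetical "without BPE" training condition) keeps the shared suffix visible to the model.

```python
from transformers import GPT2Tokenizer

# GPT-2's standard BPE tokenizer.
bpe = GPT2Tokenizer.from_pretrained("gpt2")

words = ["loco", "poco", "coco"]  # rhyming words from the test prompt below
for w in words:
    # BPE may carve each word into different subwords, so the model never
    # directly sees the shared "-oco" ending as a common unit.
    print(w, "-> BPE tokens:", bpe.tokenize(w))
    # A character-level encoding (the no-BPE condition) exposes the rhyme
    # as identical trailing symbols.
    print(w, "-> characters:", list(w))
```

Training gpt2-small "without BPE" would then amount to swapping in a character-level vocabulary like this before pretraining, holding everything else fixed.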
The test I'll run it on is a friend's website with prompts that should trigger rhyming, as shown below.
https://modelbehavior.ngrok.io?prompt=loco%2C%20poco%2C%20coco%2C%20ro&comps=%5B%5D
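For reference, a minimal sketch of hitting that endpoint programmatically, assuming the `requests` library. The host and query parameters (`prompt`, `comps`) are taken from the URL above; the response format is an assumption, so this just prints the raw body.

```python
import requests

# Query the rhyming-test endpoint with the same prompt as the linked URL.
resp = requests.get(
    "https://modelbehavior.ngrok.io",
    params={"prompt": "loco, poco, coco, ro", "comps": "[]"},
)
print(resp.status_code)
print(resp.text)
```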