Or whatever mobile device has replaced phones by then.
Needs to return responses within a few seconds.
Inspired by https://twitter.com/Grayyammu/status/1635574200621465601
That was quick. I guess I shouldn't be surprised.
@cloudprism I think it was already true when I created the market, I just didn't know.
@IsaacKing Oh wow. And yeah, I just meant *gestures at AI acceleration*.
Does anybody believe this market should not yet resolve YES?
@IsaacKing I guess everyone is in favour of YES.
A mobile phone with GPT-3,
Running an LLM, oh my, oh me!
By April 2025, will it be?
Or is this just a tech fantasy?
@IsaacKing llama 7B on pixel 7 support: https://github.com/rupeshs/alpaca.cpp/tree/linux-android-build-support
@IsaacKing Should be faster than 1 word per second. Judging by the fact that modern PCs run it at 5 words per second and a Raspberry Pi 4B runs it at 1 word per second, it should land somewhere near the 2.5-words-per-second mark.
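One way to read that interpolation, as a rough sketch: the 5 and 1 words-per-second figures are from this thread, and taking the geometric mean (my assumption, since throughput differences across hardware tend to be multiplicative) lands in the same ballpark as the ~2.5 guess above.

```python
import math

# Throughputs reported in the thread (words per second)
pc_speed = 5.0    # modern desktop PC
rpi4_speed = 1.0  # Raspberry Pi 4B

# Assumption: a flagship phone falls between the two; the geometric
# mean gives a middle-of-the-road guess on a multiplicative scale.
pixel_estimate = math.sqrt(pc_speed * rpi4_speed)

print(f"~{pixel_estimate:.1f} words/sec")  # ~2.2 words/sec
```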
@firstuserhere Is LLaMA comparable to GPT-3?
@IsaacKing Yes, it's qualitatively similar to GPT-3.5. In fact, the 65B model outperforms GPT-3 on many tasks despite being more than 10x smaller, and it was trained only on publicly available data.
In fact, from the abstract of the LLaMA paper:
"In particular, LLaMA-13B outperforms GPT-3 (175B) on most benchmarks"
I can run Bing AI on my Pixel V. Does that count?
@CarsonGale You have to actually be running the model itself, not a webpage that submits API calls to the model over the internet.
@IsaacKing makes sense
Yeah, it's slow for now, but someone got 7B LLaMA to run on a Pixel 6.
Isn't that enough to resolve YES, @IsaacKing?
@ValeryCherepanov How slow is it?
@IsaacKing The thread mentions 5 minutes, but that's a very primitive setup, since it isn't making use of the Pixel's NN chip.
@firstuserhere Much too slow.
@IsaacKing Actually, I just reopened the thread for the first time since the day I posted it, and someone has apparently done some porting? Let's see.
@firstuserhere @IsaacKing Oh wow, under 30 seconds with the .cpp rewrite. This is insane (see the demo in the embedded tweet).
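For scale, a back-of-the-envelope comparison: assuming the two timings in this thread (5 minutes earlier, under 30 seconds after the .cpp rewrite) refer to comparable generations, the port works out to roughly an order of magnitude faster. The figures are the thread's; the comparison is mine.

```python
# Timings reported in the thread (seconds)
original_port = 5 * 60  # ~5 minutes for the first on-device run
cpp_rewrite = 30        # under 30 seconds after the .cpp rewrite

speedup = original_port / cpp_rewrite
print(f"~{speedup:.0f}x faster")  # ~10x faster
```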
@firstuserhere And it's not even on a Pixel 6; it's a 5, which doesn't have the 6's Tensor SoC, so that should presumably speed it up quite a bit.
@firstuserhere @IsaacKing fast enough? :P
@firstuserhere Hmm, that still took quite a while.