Will an LLM of at least GPT-3 caliber be runnable on a mobile phone by April of 2025?
Resolved YES on Mar 27

Or whatever mobile device has replaced phones by then.

Needs to return responses within a few seconds.

Inspired by https://twitter.com/Grayyammu/status/1635574200621465601




predictedYES 2y

That was quick. I guess I shouldn't be surprised.

predictedYES 2y

@cloudprism I think it was already true when I created the market, I just didn't know.

predictedYES 2y

@IsaacKing oh wow, and yeah I just meant gestures at ai acceleration

2y

Does anybody believe this market should not yet resolve YES?

predictedYES 2y

@IsaacKing I guess everyone is in favour of YES.

2y

A mobile phone with GPT-3,
Running an LLM, oh my, oh me!
By April 2025, will it be?
Or is this just a tech fantasy?

predictedYES 2y

@firstuserhere
Should be faster than 1 word per second. Judging by the fact that modern PCs run it at 5 words per second and a Raspberry Pi 4B runs it at 1 word per second, it should run somewhere near the 2.5-words-per-second mark. @IsaacKing
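A rough sanity check of that interpolation: the ~2.5 figure is close to the geometric mean of the PC and Raspberry Pi throughputs. This sketch uses only the numbers quoted in the comment, not any real benchmarks:

```python
import math

# Generation speeds reported in the comment (words per second).
pc_wps = 5.0   # modern desktop PC
pi_wps = 1.0   # Raspberry Pi 4B

# A phone's CPU and memory bandwidth sit roughly between the two devices,
# so a geometric-mean interpolation is one crude way to guess its throughput.
phone_wps = math.sqrt(pc_wps * pi_wps)

print(f"estimated phone speed: {phone_wps:.1f} words/sec")  # ~2.2, near the ~2.5 guess
```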

2y

@firstuserhere Is LLaMA comparable to GPT-3?

predictedYES 2y

@IsaacKing Yes, it's qualitatively similar to GPT-3.5. In fact, the 65B model outperforms GPT-3 on many tasks despite being more than 10x smaller, and it was trained only on publicly available data.

In fact, from the abstract of the LLaMA paper:

"In particular, LLaMA-13B outperforms GPT-3 (175B) on most benchmarks"

2y

It’s so over

2y

I can run Bing AI on my Pixel V. Does that count?

2y

@CarsonGale You have to actually be running the model itself, not a webpage that submits API calls to the model over the internet.

2y

@IsaacKing makes sense

2y

Yeah, it's slow for now, but someone got the 7B LLaMA to run on a Pixel 6.

predictedYES 2y

Isn't that enough to resolve YES? @IsaacKing

2y

@ValeryCherepanov How slow is it?

predictedYES 2y

@IsaacKing The thread mentions 5 minutes, but that is very primitive, as it isn't making use of the Pixel's neural-network chip.

2y

@firstuserhere Much too slow.

predictedYES 2y

@IsaacKing Actually, I just reopened the thread for the first time since the day I posted it, and someone apparently did some sort of port. Let's see.

predictedYES 2y

@firstuserhere @IsaacKing Oh wow, under 30 seconds for the .cpp rewrite. This is insane (see demo in the embedded tweet).

2y

@firstuserhere And it's not even on a Pixel 6; it's a Pixel 5, which doesn't have the 6's Tensor SoC, which presumably would speed it up quite a bit.
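For context, the ".cpp rewrite" discussed here is the kind of CPU-only port that can be built directly on a phone. Below is a hypothetical sketch of doing that under Termux with llama.cpp; the model paths are placeholders, and you must supply the LLaMA weights yourself (they are not distributed with the repo):

```shell
# Hypothetical sketch: building a CPU-only LLaMA port under Termux on Android.
# All model paths are placeholders -- the weights are not included.
pkg install git clang make          # Termux build toolchain
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make                                # CPU-only build, no GPU/NN-chip acceleration

# Quantize the 7B weights to 4-bit so they fit in phone RAM:
./quantize ./models/7B/ggml-model-f16.bin ./models/7B/ggml-model-q4_0.bin 2

# Generate a short completion on-device:
./main -m ./models/7B/ggml-model-q4_0.bin -p "Hello, I am" -n 64
```

Note this path uses only the CPU, which matches the comment above: a build that exploited the Pixel 6's Tensor SoC would presumably be faster still.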

2y

@firstuserhere @IsaacKing fast enough? :P

2y

@firstuserhere Hmm, that still took quite a while.

