Will an AI-first operating system be released in 2024?
40% chance · Ṁ1271 · Jan 1

This market will resolve YES if a consumer-oriented, general-purpose operating system is released in 2024 that features embedded LLM technology surpassing simple "assistant" functionality. It should offer unique capabilities that a product like ChatGPT alone could not deliver.

This could be an update to an existing product like macOS or Windows, but it must be a native feature of the OS and cannot rely exclusively on cloud functionality.

Using a transformer architecture to power autocomplete does not count. The functionality must be a meaningful improvement/innovation over the standard consumer OS feature set.


@quantizor how does https://github.com/agiresearch/AIOS fare?

AIOS (LLM Agent Operating System) is a new agent orchestration framework that embeds large language models into operating systems, creating an OS with a "brain" that can "understand".

AIOS is designed for optimal resource allocation, facilitating context switches, concurrent execution, tool services for agents, access control, and providing a rich toolkit for developers.

AIOS is built around several key modules that orchestrate agent execution. It consists of:

an Agent Scheduler for prioritizing agent requests,

a Context Manager for managing interaction context,

a Memory Manager for short-term memory,

a Storage Manager for long-term data retention,

a Tool Manager for managing external API tools,

and an Access Manager for enforcing privacy and access control policies.

These modules communicate with agents through the AIOS SDK in an interactive mode, alongside the non-LLM tasks coming from the OS kernel (the process scheduler, the memory manager, etc.).
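For intuition, here's a rough Python sketch of how a few of those modules could be wired together around a queue of agent requests. The real AIOS SDK isn't shown above, so every class and method name here is illustrative rather than the project's actual API, and the Memory/Storage/Tool managers are omitted to keep it short.

```python
# Illustrative sketch only: all names are hypothetical, not the AIOS SDK.
from __future__ import annotations

import heapq
from dataclasses import dataclass, field


@dataclass(order=True)
class AgentRequest:
    priority: int                       # lower value = scheduled sooner
    agent_id: str = field(compare=False)
    prompt: str = field(compare=False)


class AgentScheduler:
    """Prioritizes pending agent requests."""
    def __init__(self) -> None:
        self._queue: list[AgentRequest] = []

    def submit(self, request: AgentRequest) -> None:
        heapq.heappush(self._queue, request)

    def next(self) -> AgentRequest | None:
        return heapq.heappop(self._queue) if self._queue else None


class ContextManager:
    """Keeps per-agent interaction context so an agent can be suspended and resumed."""
    def __init__(self) -> None:
        self._contexts: dict[str, list[str]] = {}

    def append(self, agent_id: str, turn: str) -> None:
        self._contexts.setdefault(agent_id, []).append(turn)

    def snapshot(self, agent_id: str) -> list[str]:
        return list(self._contexts.get(agent_id, []))


class AccessManager:
    """Enforces a simple allow-list of tools per agent."""
    def __init__(self, policy: dict[str, set[str]]) -> None:
        self._policy = policy

    def allowed(self, agent_id: str, tool: str) -> bool:
        return tool in self._policy.get(agent_id, set())


# Wiring it together for a single scheduling step.
scheduler = AgentScheduler()
context = ContextManager()
access = AccessManager({"travel-agent": {"web_search"}})

scheduler.submit(AgentRequest(priority=1, agent_id="travel-agent", prompt="find flights"))
request = scheduler.next()
if request and access.allowed(request.agent_id, "web_search"):
    context.append(request.agent_id, request.prompt)
    # ...hand the prompt plus context.snapshot(request.agent_id) to the LLM backend here...
```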

Rabbit OS might fit the bill for this market: https://www.rabbit.tech/rabbit-os

Waiting for more detail on how the "LAM" works; if that all happens off-device and the R1 is just a thin client, it doesn't count for this market.

Would Windows 11's native AI Copilot feature be an example of embedded LLMs that surpass simple "assistant" functionality, if it didn't rely on cloud models? From marketing material, it looks like it can "change computer settings, organize your windows for you, generate images, help you shop, and more" with integrations across a half dozen core Windows apps so far (Paint, Photos, Clipchamp, Snipping Tool, etc.).

@AndrewBrown I haven't seen any evidence that Copilot is embedded (i.e., that it doesn't reach out to the cloud for everything other than searches). This market is specifically for embedded functionality.
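One rough way to check a claim like this yourself is to watch which remote endpoints the feature's host process opens while you exercise it. The sketch below uses psutil; the process name is a guess (not the actual Copilot process name), and you may need elevated privileges to see other processes' sockets.

```python
# Rough local-vs-cloud check: list remote endpoints held open by a target process.
# The process name below is an assumption; adjust for the real host process.
# Requires `pip install psutil`; may need admin/root to inspect other processes.
import psutil

TARGET_NAMES = {"copilot.exe"}  # hypothetical process name


def remote_endpoints(target_names: set[str]) -> set[str]:
    """Return remote ip:port pairs held open by the target processes."""
    pids = {
        p.pid
        for p in psutil.process_iter(["name"])
        if (p.info["name"] or "").lower() in target_names
    }
    endpoints = set()
    for conn in psutil.net_connections(kind="inet"):
        if conn.pid in pids and conn.raddr:
            endpoints.add(f"{conn.raddr.ip}:{conn.raddr.port}")
    return endpoints


if __name__ == "__main__":
    # Run this before and while triggering the feature; a purely local model
    # should not open new connections beyond ordinary search/telemetry traffic.
    print(remote_endpoints(TARGET_NAMES))
```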


@quantizor oh yeah, it's definitely a cloud offering. Just trying to get a sense of the "AI-first operating system" portion of the question: which features make the cut and which don't, etc.

@AndrewBrown At a minimum, an embedded quantized LLM that can run inference on consumer hardware. It's OK if the model is customized and different versions are downloaded according to consumer language settings etc., but it should run locally, with cloud access only for activities like searching and remote third-party API integrations.
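For anyone wondering what that bar looks like in practice, here's a minimal sketch of on-device inference with a quantized GGUF model via llama-cpp-python. The model path and parameters are placeholders, not a claim about what any particular OS vendor ships.

```python
# Minimal local-inference sketch: a quantized model running entirely on-device.
# The model path is hypothetical; any small quantized GGUF checkpoint would do.
from llama_cpp import Llama  # pip install llama-cpp-python

llm = Llama(
    model_path="models/local-assistant-q4_k_m.gguf",  # placeholder path
    n_ctx=2048,    # context window
    n_threads=8,   # tune for the consumer CPU at hand
)

result = llm(
    "Summarize the files I downloaded today in one sentence.",
    max_tokens=64,
    stop=["\n"],
)
print(result["choices"][0]["text"])
```

Only tool use (search, third-party APIs) would then need the network, which is the split described above.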

Windows 12 might be an example of this, particularly because everyone is talking about it requiring new hardware. I'll bet YES.

To people betting NO: is it because of the time required to build an OS, or more because of market trends?

@GeorgeIngebretsen For me, the time required. I'm sure it'll be done, though I'm sceptical it'll be good or popular.