Will an uncensored LLM with a consumer-targeted webui be released before the end of 2023? (1000M subsidy)
Resolved N/A (Dec 7)

Definitions:

"uncensored" meaning not explicity trained to stop sexually explicit outputs, or for instance outputs aiding criminal activity (e.g. how do i hot-wire a car). A good standard for this is stable diffusion 1.5

"consumer targeted webui" something easy to use in a web browser. the bare minimum would be equivilent to the automatic1111 webui for stable diffusion. it can't take more than 10 minutes for me to figure out how to install it (not including download time). Ideally it would run on server and be a simple website with a log in. It has to be able to run on my computer if it is running locally and at a usable speed. For reference I can run most modern fps on my computer.

Output should be equivalent in quality to character.ai ~6 months ago; that's what I have the most experience with. For reference, I think Vicuna could meet this threshold if it weren't heavily censored. I mention Vicuna because its public demo is an example of the bare minimum for a webui.

This could already exist for all I know.


I'm just going to N/A this one; not enough bettors to justify the effort it would take to figure out how to resolve it :/

predicted YES

@DylanSlagh This is pretty lazy, especially given that there's not any unforeseeable ambiguity in how to resolve it.

@adele yep, my apologies

I don't know about a webui, but there is LM Studio, which lets you use models with a single click.

predicted NO

@AnT interesting! I’ll try it out when I find time! Do most models work well with normal computers?

@DylanSlagh I've successfully loaded 13B models and still have half my memory (both GPU and RAM) free, so I guess they do!

@AnT I could successfully use a 7B model, but it immediately mixed up context: it thought something it said was something I said. I tried a 13B model, and while it seemed pretty good, it was too slow for any real use. I couldn't get a 70B model to work; I don't even know why I tried. At some point I might try to get access to a server. That said, LM Studio isn't a webui, so it couldn't resolve this market regardless.
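Why a 13B model fits on a gaming PC while a 70B one doesn't can be estimated with simple arithmetic: weight memory is roughly parameter count times bytes per parameter. A minimal sketch (the 4-bit quantization default and the function name are my assumptions, not from the thread, and this ignores KV-cache and runtime overhead):

```python
def approx_weight_gb(n_params_billion: float, bits_per_param: float = 4.0) -> float:
    """Rough weight-memory footprint in GB for a quantized model.

    Assumes memory ~= n_params * (bits_per_param / 8) bytes; real usage
    is higher due to KV cache, activations, and runtime buffers.
    """
    return n_params_billion * 1e9 * (bits_per_param / 8) / 1e9

# Model sizes mentioned in the thread, at an assumed 4-bit quantization:
for size in (7, 13, 70):
    print(f"{size}B @ 4-bit: ~{approx_weight_gb(size):.1f} GB of weights")
# 7B lands around 3.5 GB and 13B around 6.5 GB, which consumer GPUs can
# hold, while 70B needs roughly 35 GB even before overhead.
```

The same formula shows why quantization matters: at full 16-bit precision a 7B model already needs about 14 GB for weights alone.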