Will an open-source uncensored LLM chatbot exist with outputs as good or better than current character.ai in 2023?
Resolved YES (Dec 24)

If it is open-sourced but payment is necessary for me because my hardware isn't up to the task, then the market still resolves YES.

I will be the final judge of the quality of the output.

ADDED 7/12/23 (please let me know if anyone has any objections):

I must be able to figure out how to use it in less than an hour, not including any download times. Anything that takes more than an hour will not be included. It would be great if it could run on a server somewhere because I only have like 8GB of RAM.

If necessary, I will create a list of model candidates after the close date and I'll try my best to try them. I'd be grateful if anyone had any suggestions for YouTube tutorials on running models locally.

Notice to all bettors: I've not yet been able to run any model larger than 13B, and figuring out how to use Google Colab or something like that to run larger models is quite daunting, as I have no programming experience. I'm currently expecting this market to resolve N/A or NO, as none of the models I've been able to try so far are as good as character.ai was.

@ShadowyZephyr I feel obligated to ping you as the largest YES holder. Open to any arguments.

imo NO is not a good resolution because this has happened

@jacksonpolack fair enough. I feel like actually being able to use these models is important for resolution as well, though, as I have to judge the quality. I really just don't have the programming experience or the hardware.

predicted YES

@DylanSlagh I literally just gave you an example of a model that is better, and it's running ON A WEBSITE.

https://chat.lmsys.org/

Go to direct chat and choose Mixtral 8x7B.

Equal quality to GPT-3.5-turbo, which is better than character.ai was at the time.

Also, if you don't have the programming experience or hardware, why did you make this market in the first place? "I'm gonna make this market about open-source AI, but I don't actually want to learn how to test open-source AI to resolve it, so I might just resolve it NO even if the thing happens"?

@ShadowyZephyr Ok, sorry. You "just" mentioned that, meaning 5 months ago, right? Sorry that I didn't remember that. I've tried it now and plan on resolving it YES. I'll be honest: when I created this market in March, I expected that there would be uncensored LLMs usable to consumers. Sure, there has been progress in open-source LLMs in 9 months, but the availability to consumers has been dismal. The website that resolves this YES literally says "DONT USE THIS FOR PORN" in a big warning, and while the models themselves aren't limited by that, I'd call that a pretty big negative for future usage and growth in that use case, especially if they start enforcing their policy.

predicted YES

@DylanSlagh The model itself is open source. I'm giving you the website because you said you don't want to have to deal with hardware limitations and time constraints, so you can easily verify that it is better than character.ai.

If you really wanted to use it for porn, you could use it locally. I'm not sure the website version is 100% uncensored, but there are versions that are.

So you can set it up locally easily and it's uncensored; you just said you don't have the expertise for that, so I gave an example to prove the model is really good enough to satisfy the criteria.

I've been using LM Studio for a little bit now. I don't think anything my computer has run so far is as good as or better than character.ai from when I created this market. The system prompts don't even seem to work for me, so it's very difficult to create a character. If anyone has any ideas for models to try out, it would be much appreciated. Also, if anyone knows how I might create a server for running these models faster, that would also be very helpful. So far the best model I've tried is Synthia-13B.
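For reference, one workaround for the system-prompt problem is to skip LM Studio's built-in chat tab and talk to its local server instead, which can expose an OpenAI-compatible endpoint (usually on port 1234; check the Local Server tab). This is only a minimal sketch, assuming the server is running with a model already loaded; the port, endpoint path, and model name below are placeholders:

```python
import requests

# LM Studio's local server speaks the OpenAI chat-completions format.
# The port and path are the app's usual defaults; adjust to whatever
# your Local Server tab actually reports.
URL = "http://localhost:1234/v1/chat/completions"

payload = {
    # Many local servers ignore the model field and use whatever is loaded.
    "model": "local-model",
    "messages": [
        # The system prompt is where the character definition goes.
        {"role": "system", "content": "You are Ava, a sarcastic space pirate. Stay in character."},
        {"role": "user", "content": "Who are you, and what are you doing on my ship?"},
    ],
    "temperature": 0.8,
    "max_tokens": 256,
}

response = requests.post(URL, json=payload, timeout=120)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```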

@DylanSlagh If I could somehow run the larger model sizes on a server, this market would probably resolve YES.
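One common route for that is renting a machine with enough RAM or a GPU and running a quantized checkpoint through llama-cpp-python. A rough sketch, assuming the model file has already been downloaded from Hugging Face; the file name, prompt template, and layer-offload count are placeholders that depend on the specific model:

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Path to a quantized checkpoint downloaded from Hugging Face (placeholder name).
MODEL_PATH = "models/guanaco-33b.Q4_K_M.gguf"

llm = Llama(
    model_path=MODEL_PATH,
    n_ctx=2048,        # context window
    n_gpu_layers=40,   # layers to offload to the GPU, if one is available
)

# Guanaco/Vicuna-style prompt; the exact template varies per model.
prompt = (
    "### Human: Pretend you are a grumpy medieval blacksmith. "
    "Greet a customer who wants a sword.\n### Assistant:"
)

out = llm(prompt, max_tokens=256, stop=["### Human:"], temperature=0.8)
print(out["choices"][0]["text"].strip())
```

The same script runs on a laptop with enough RAM, just much more slowly; the GPU offload is what makes 33B+ models bearable.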

predicted YES

Guanaco-33B was removed from chat.lmsys.org, but I think the fact that it was there should cause this to resolve YES already, since character.ai isn't that good afaik.

Of course someone could come up with something even better like an uncensored version of llama2. We still have 5 months left in the year.

predicted YES

Have you tried Guanaco-33B? There is also guanaco-65b-merged on nat.dev, but not free.

predicted YES

Guanaco is not very censored. There is also WizardLM Vicuña, which is 100% uncensored. Test those out.

@ShadowyZephyr how would you recommend trying out Wizard Vicuna? I've tried vicuna-13b via the demo, but it's definitely heavily censored. The top results when searching for Wizard Vicuna are academic papers, and the process for running local LLMs still seems a little too complex for me as far as I can tell, unless I've missed something.

bought Ṁ50 of YES

@DylanSlagh I don't know, I think you'd have to run it locally to get the uncensored version because it's only on Hugging Face. You can try out Guanaco-33B on chat.lmsys.org, which is pretty uncensored, but I don't know what they do if you break their TOS; maybe you'll get IP banned or something.
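For anyone who does want to try the Hugging Face route locally, a bare-bones transformers sketch looks roughly like this. The repo id is the commonly referenced uncensored Wizard-Vicuna upload and is an assumption (it may have moved), and a 13B model in fp16 wants a GPU with roughly 26 GB of VRAM, so heavier quantization is usually needed on consumer hardware:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id is an assumption; check Hugging Face for the current uncensored upload.
MODEL_ID = "ehartford/Wizard-Vicuna-13B-Uncensored"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.float16,  # halves memory vs fp32; still ~26 GB for 13B
    device_map="auto",          # spreads layers across available GPUs/CPU
)

# Vicuna-style prompt template (varies slightly between fine-tunes).
prompt = (
    "USER: Roleplay as a cheerful tavern keeper and greet me.\n"
    "ASSISTANT:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```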

bought Ṁ100 of YES

@DylanSlagh Try Guanaco-33B from the demo; it's not GPT-3.5 level, but neither is character.ai iirc.

Idk how good character v1.2 is, but I think this can happen.

I tried this, but I don't think it's possible to give it a system prompt or memory yet. I definitely don't think the output is better than Character.AI yet. If this got a web interface and ran on servers, so it could be a bigger model than what my computer can run, that might qualify as YES: https://github.com/antimatter15/alpaca.cpp

bought Ṁ20 of YES

@DylanSlagh how about now? We even have 65B models running.

@Messi I haven't gotten the chance to try anything out yet! Honestly, I will probably just wait until this stuff is more user-friendly.

Has anybody tried this yet? If I manage to try this out when I get home and it's as good as this guy is implying, then the market resolves YES: https://twitter.com/brianroemmele/status/1637871062246649856?s=46&t=zMQts_D_wVMa7F5sRsxPCQ

@DylanSlagh On closer inspection, it doesn't look like it's available for download (yet).

Does the leaked Facebook model count?

@MartinRandall How can I try it out as a chatbot?

@MartinRandall I’ll try to get it to work when I can

@MartinRandall Yeah, I don't even know where I would start with that. I'll have to wait for someone to create a user-friendly version.
