Will good quality personalised AI novels be instantly and cheaply available by the end of 2027?

I'm imagining that I ask an LLM or similar AI system "write me a novel of [number] pages, in the [genre] genre, set in [setting] featuring [type of characters], and dealing with [themes]". Possibly even something like "in the style of [author or mashup of authors]".

When I say "good quality" that's just my subjective judgement. It needs to suit my tastes; it doesn't need to be up there with my favourite books, but I need to enjoy it, think it was a good use of my time to read it, and not just be pushing through to the end so I can resolve this market. My tastes aren't especially refined. I normally read books between 300 and 1000 pages, so I'll probably be looking for books in that range.

When I say "instant and cheap" I mean something like within an hour and for approximately the cost of a normal book. If you think the precise details of either of these are important, let me know and I'll set some definite criteria. I expect it'll be obvious one way or the other.

If such a service becomes available before the close date, I'll read a couple of examples and judge if I think they're good quality. If I don't feel the bar has been met by 2027 end, I might have to wait a month or two to give me a chance to read a couple of the best books 2027-end could generate. I'll try to resolve as quickly as I reasonably can, but I'm not the fastest reader and if the quality isn't obviously good then I might need to read more than one to be happy with my judgement.

I'm not wedded to the format of my example prompt above. But it needs to be natural language (or at least very user friendly) and quite short. For example, maybe you could get something with one reasonable length paragraph, or maybe you can get something a bit more personalised in two or three paragraphs. Again, if anybody would feel more comfortable with a more quantitative criterion let me know and I'll think about it.

I won't bet in this market, since it's so subjective.

bought Ṁ40 of YES

I think this is strictly easier than the movie market. So I don't see any reason not to sell some of my shares there to buy here when this one's trading lower.

predicts NO

@robm Great point. Though the movie version has more time than this one (by a year).


predicts YES

@dreev No it doesn't. They only close two days apart. It's part of Scott Alexander's five-year prediction post, whose markets all resolve at the beginning of 2028.

predicts NO

@DylanSlagh Ah, thanks! I jumped to the wrong conclusion from that "in 2028" in the title.

I doubt this will matter for resolution but I wonder if the AI could do much better by taking as many hours to write the thing as it takes you to read it, always staying one page ahead of you. From your perspective it's even more instant but the AI has hours or days or longer to keep making it better.

@dreev Provided it was just one prompt and the user experience from then on was just reading, I'd count this. The important thing for this question is the user experience; the way in which that experience is achieved doesn't matter.

In outcomes where inference is slow/expensive for the highest-quality models, I can imagine it becoming commonplace for people to pay for inference runs that finish overnight for like $50. Compared to book prices nowadays, I don't know if this qualifies as "instant and cheap," but it's certainly fast enough and cheap enough that vast archives of these generations would be available. With respect to a highly personalized large product like a book, this would certainly be faster and cheaper than any similarly sophisticated custom product, and I think it actually meets the spirit of the question.

Overall, I'm worried about controversy in resolving this question in worlds where turbo models make mediocre novels ~instantly, but by waiting just a bit longer with the best models we get very compelling results.

@AdamK I would not resolve YES in the case of a $50 book overnight (in today's prices). But I would feel a bit uncomfortable. There is some amount of "spirit of the market" but I think that's too much. Then again, if we find ourselves in a situation where there are excellent books if you wait longer and pay more, and mediocre books that are cheap and instant, it depends on the quality of the mediocre books. I could see myself using the existence of the high quality ones as a kind of tie break.

Personally, my hope is that everything will be moving so fast that once we get to a $50 book overnight it's only a matter of months before we get a $1 book in ten minutes, so the chance of the end date falling in the awkward zone is small. But maybe that's naive.

Quality is relative. If a system can infinitely produce passable mass-market-paperback-tier writing, the marginal utility of that standard of writing diminishes, and people won't want to sit around and read it all day. It'll be similar to generative content in video games: all of it technically unique, but not intriguing enough to warrant continued interaction, even though relative to games of the past it is of much greater quality.

@brubsby That's a really good point! Maybe if I worry that's happened to me I'll need to reread some human-written books from today to recalibrate myself...

predicts YES

I'm a strong yes. I'd buy yes even if I knew that no huge LLM breakthroughs would happen in the next 4 years, just gradual incremental progress.

predicts YES

@jayharris Current context windows hold only a few percent of a book. It seems realistic that they could grow by a couple of orders of magnitude in four years.

bought Ṁ25 of YES

@mariopasquato Hah 😅 more like 2 times a book in size and easyishly extendable

predicts YES

@TheBayesian It depends on the book, I guess? Moby Dick stands at about 200k words, and ChatGPT has a context window of 8,000 tokens (up to 32,000) https://help.openai.com/en/articles/7127966-what-is-the-difference-between-the-gpt-4-models so we are at 4% (16%). That assumes 1 token per word, but it's typically more than that.
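The arithmetic above can be sketched in a few lines. The word count and window sizes are the ones cited in the thread; the tokens-per-word ratio is an assumption (English prose often tokenizes at a bit more than one token per word):

```python
def fraction_in_context(book_words, context_tokens, tokens_per_word=1.0):
    """Fraction of a book's tokens that fit in a model's context window."""
    book_tokens = book_words * tokens_per_word
    return context_tokens / book_tokens

MOBY_DICK_WORDS = 200_000  # approximate word count cited above

# At 1 token per word (optimistic):
print(f"{fraction_in_context(MOBY_DICK_WORDS, 8_000):.0%}")   # 8k window  -> 4%
print(f"{fraction_in_context(MOBY_DICK_WORDS, 32_000):.0%}")  # 32k window -> 16%

# At an assumed 1.3 tokens per word, the fractions shrink accordingly:
print(f"{fraction_in_context(MOBY_DICK_WORDS, 8_000, tokens_per_word=1.3):.0%}")
```

This is just a back-of-envelope check; real tokenization varies with vocabulary and the specific tokenizer.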

predicts YES

@mariopasquato GPT-4 Turbo has a 128k context window, and Claude 2.1 has a 200k context window.

predicts YES

@TheBayesian I see. Then in 4 years they will definitely be able to fit a full book in the window. Buying more YES

I understand this market to mean that the full response to your prompt is generated by AI. That AI may include several different steps to improve and extend the story autonomously. But there must not be a human involved in the process who redirects the story with additional prompts to construct further development, or who takes the whole story and feeds it back to an AI, asking it to extend it over and over again.

I also find the length of the book quite an important factor, so that requirement should not be relaxed from the initial 300-1000 pages to a lot less.

Very interesting market. I'm excited to see where it's going :)

@Felle yes, your understanding is correct: I'm remaining agnostic about what's going on under the hood, but it needs to be automated from prompt to book, with no human input required along the way.

I agree with you on the length factor as well. 300 pages isn't a hard lower bound, but it definitely needs to be a proper novel. If it can do 100 pages reliably but goes to shit at 300 pages, that's a NO.