closes Jan 1, 2026
Will some U.S. lawyers be negatively affected financially due to AI by end of 2025?
70% chance

Requires at least 3 articles from traditionally reputable news organizations reporting that some lawyers have lost income, job security, or hiring velocity as a result of AI-based automation.

I won't be proactively searching for such articles - I will need to come across them organically or they can be posted in the comments / sent to me via Twitter message or other DM.

firstuserhere bought Ṁ30 of YES

From GPT-4's technical report:

The impact of GPT-4 on the economy and workforce should be a crucial consideration for policymakers and other stakeholders. While existing research primarily focuses on how AI and generative models can augment human workers, GPT-4 or subsequent models may lead to the automation of certain jobs.[81] This could result in workforce displacement.[82] Over time, we expect GPT-4 to impact even jobs that have historically required years of experience and education, such as legal services.[83]

firstuserhere

Found this tool - Spellbook (https://www.spellbook.legal/) - which assists in drafting contracts. People who know more about this can comment on whether they think it's relevant.

firstuserhere

@firstuserhere *not to the resolution of the market, but to the day-to-day job

Anton

Rules lawyering: what happens if a lawyer loses their job due to an AI doing something unrelated to lawyering?

E.g. a US congressman who has passed the bar exam is not reelected, and a reputable news organization claims that it is because they "failed to adequately address the issue of AI-based automation"

E.g. a lawyer is fired from a law firm due to some sort of deepfake scandal

Carson Gale

@citrinitas I would not consider that example within the spirit of the question, so I would not consider it suitable to include within the criteria.

firstuserhere is predicting YES at 33%

@citrinitas Related question: what if a lawyer loses their job due to the model "dreaming" up a little fact that turns out to be wrong, or the lawyer being punished for blindly trusting the AI and not using good quality checks?

Carson Gale

@firstuserhere This is on the line, but as long as the content in question involves legal work, I would think this should resolve positively (assuming it is happening to numerous lawyers / has been reported on by 3+ news organizations).

Mira

There's a mismatch between the question and the description on all of these: the question says "AI" but the description says "LLM" - a much more specific thing.

If a lawyer is replaced by Leibniz's dream of an automated law machine, but it uses GOFAI techniques like a massive Prolog database of facts and consequences, it would count YES according to the title but NO according to the description.

Can you clarify whether it is "anything that any marketing department slaps the term 'AI' on" or specifically LLMs?

Carson Gale

@Mira Makes sense - I think I will change the description in these questions to reference AI rather than LLMs. I can refund any traders who bet according to the stricter LLM definition. LMK if you disagree.

Carson Gale bought Ṁ10 of YES

Please note - I am modifying the question title to "Will some U.S. lawyers be ..." instead of the current wording, to be more in line with the description. If any traders oppose this modification over the course of the next day, please comment below and I will consider refunding your trades if your arguments make sense.

The resolution criteria of the question, as outlined in the more detailed description, do not change, so I do not expect any pushback.

Carson Gale

Stemming from this Tweet: https://twitter.com/ATabarrok/status/1609961833301180416?t=1gdXbVQTsZI7kliZM2enhg&s=19