Will AI be a great alternative career path for me as a coder if AI learns to code?
  • There will be lots of well-paid AI job opportunities for me: 37%
  • AI will not be a giant pain to work with: 51%
  • I don't think working on AI will make alignment races worse: 53%
  • I lose my engineer privilege: 70%

Manifold seems to find it likely that coding will be significantly automated by 2032:

This is a potential issue for me as a programmer.

One backup I see is that I know a lot about AI, so I could pivot to working within AI, which seems quite plausible if AI becomes big. However, it runs into a bunch of potential problems:

  • There have to be well-paid AI jobs available for me: this might not be the case if, e.g., AI development is very centralized so there isn't much need for AI engineers, or if my sketchy AI credentials aren't well-regarded, or if cancel culture gets worse (perhaps due to being AI-powered?) and people find e.g. my racist comments and decide not to hire me.

  • AI has to not be a giant pain to work with: whenever I try to do AI projects on my own, it tends to be a demoralizing pain, e.g. training takes a long time, AI frameworks are annoying/buggy/poorly documented, etc. This is potentially exacerbated by me having ADHD or something, but whatever the cause, it might mean I don't want to work in AI.

  • AI might be destroying the world: for a long time I expected AI misalignment to be deadly. I've recently gotten a very promising idea for how to solve the worst part of the alignment problem, so I'm more optimistic right now, since it means I could probably work within AI without contributing to destroying the world; otherwise that would seem like an ethical issue.

I think of myself as having "engineer privilege": I can sort of just bumble around doing random stuff in my life, and then because there's such a huge need for high-g engineers, I will have plenty of money even if the stuff I do is otherwise stupid. Engineer privilege is extremely convenient and I would like to not lose it.

Resolution criteria

Questions are all conditional on "Will I find AI to negatively impact my job opportunities as a coder by 2032?" resolving YES; otherwise they will resolve N/A.

There will be lots of well-paid AI job opportunities for me: Resolved subjectively.

The main criterion is comparison with my current situation, where e.g. I can just interview at Big Tech and earn above the 95th percentile for my age, I can just log into LinkedIn and have a bunch of recruiters contact me with comfortably paid job opportunities anywhere with super easy requirements, and I can just talk with friends and acquaintances to get job offers at cool startups. It doesn't have to work exactly the same as it does now; qualitatively it should just be at least this good. ("At least", rather than allowing a margin of error, because being relatively new in the job market, this would be expected to be roughly the lowest point of my career.)

I start counting this criterion from the end of the 6-month period mentioned in the "Will I find AI to negatively impact my job opportunities as a coder by 2032?" question, and it has to hold for at least 6 months.

AI will not be a giant pain to work with: Resolved subjectively.

This question only resolves if I regularly spin up new AI projects, similarly to how I currently regularly spin up new programming projects; otherwise it resolves N/A. If I expect the AI projects to be P A I N, the way I do now, then this question resolves NO; if I instead often find it easy to just whip up some AI to do something, then it resolves YES.

(I personally expect this to resolve YES, through a combination of "I get more used to how AI code works" and "Society sets stuff up that makes AI easier to use" and "I paper over the worst pains with my own solutions".)

I don't think working on AI will make alignment races worse: This can resolve YES for many reasons. Maybe the alignment problem gets solved, maybe there is a good job that doesn't improve capabilities, maybe I can get a job working on alignment, etc. It is sufficient for there to be one good job available to me; it doesn't have to be that most AI work isn't making alignment races worse.

I lose my engineer privilege: Resolved subjectively. It is similar to "There will be lots of well-paid AI job opportunities for me", but based more on the practical consequences for me. First of all, if there is some other engineering job that doesn't quite fit into AI or coding, then this question might resolve NO. But second of all, if there is some other weird way I might lose it, other than due to societal changes (e.g. maybe I get brain damage), then this question might resolve YES.

Other questions: I've set the market so anyone is able to add new answer options, because that seems kind of interesting. I will resolve those to the best of my ability, assuming they are relevant to the question; otherwise I reserve the right to resolve them in whichever way is most annoying to the person who added the answer option. If you do add another answer option, please post your preferred resolution criteria, though I may deviate from them or make changes to them.
