On January 4th, 2023, Term Sheet, a well-known financial newsletter by Fortune (typically covering PE/VC), posted a series of predictions for the 2023 calendar year.
One of these predictions was the following:
“I think a lot of boring philosophy is going to become important next year—like what is the meaning of truth and how do we know things… You have two problems [stemming from generative A.I. and large language models] that I think we’re going to face next year. One is that the amount of lies and bullshit that we are subjected to is going to increase exponentially, just because bad actors will use A.I. to generate all sorts of horrible garbage. But the second problem is that most of the use cases that people are thinking about—what A.I.s are going to solve—aren’t actually going to come to pass until we get a pretty different type of technology that is capable of actually reasoning, rather than just auto-completing words… It’s going to force people to think about epistemology and stuff that investors haven’t thought about. College philosophy majors will become employable.” —Phil Libin, co-founder and CEO, All Turtles and mmhmm
I will not attempt to initially define all resolution criteria in this market and will instead attempt to handle any nuances/complications/data feasibility as it arises. If by end of 2023 I think it is not possible to confidently resolve this market in the spirit in which it was intended, I reserve the right to resolve as "n/a".
Any clarifications to the resolution criteria will be listed below, along with the applicable date:
[TBU]
🏅 Top traders
# | Name | Total profit |
---|---|---|
1 | | Ṁ89 |
2 | | Ṁ35 |
3 | | Ṁ21 |
4 | | Ṁ14 |
5 | | Ṁ13 |
Actually, as Matt will tell you, phil majors are already employable.
However, they therefore can't *become* employable. So clearly, resolves no.
@jacksonpolack Yglesias will never get tired of reminding people he majored in philosophy at Harvard.
(He’s right though.)
@NicoDelon what did you want clarified exactly? Or just looking for preemptive clarification?
I'm planning on sitting down and reviewing this series after year-end most likely.
No one asked, so I’m simply asking what the resolution criteria are. Robert hinted at something, but it wasn’t taken up.
Given that most companies' AI Ethics teams are focused on DEI and most companies' AI alignment groups are focused on empirical research, I think this is unlikely to happen in 2023.
While philosophy is certainly valuable for AI alignment (and notKillEveryoneism), I don't see that as a current trend showing up in hiring.
For resolution, could you maybe compare job listings containing content along the lines of "Requirements: Master's or PhD in philosophy" and "Machine Learning" in 2022 versus 2023?
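A minimal sketch of what that year-over-year comparison could look like, assuming the listings have already been scraped into one plain-text file per year (one listing per line). The file names and keyword patterns here are hypothetical placeholders, not a real data source:

```python
import re

# Hypothetical input files: one scraped job listing per line, one file per year.
FILES = {"2022": "listings_2022.txt", "2023": "listings_2023.txt"}

# Rough patterns for "Master's/PhD in philosophy" and "machine learning".
PHIL = re.compile(r"(master'?s|phd)\s+in\s+philosophy", re.IGNORECASE)
ML = re.compile(r"machine\s+learning", re.IGNORECASE)

def count_matches(path: str) -> int:
    """Count listings mentioning both a philosophy degree and machine learning."""
    with open(path, encoding="utf-8") as f:
        return sum(1 for line in f if PHIL.search(line) and ML.search(line))

counts = {year: count_matches(path) for year, path in FILES.items()}
print(counts)  # e.g. {'2022': 3, '2023': 7} -> compare the two years
```

This only counts co-occurrence of the two phrases; any serious resolution would also need a consistent listing source across both years.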
Given that Microsoft removed their AI ethics department, it's hard to know.
It's an area I'm in, though, and I have given talks on the ethics of certain AI implementations... I really don't know. I think it won't necessarily be more valued but will lose much less value compared to other majors.
If it rises in the ranks of employability compared to other, falling majors, is that a positive resolution?
@Fivelidz ethics is one small subfield of philosophy, though — it seems like the quote referenced in the description is talking about epistemology, metaphysics, etc.