
Not in the sense that it will replace all types of job families. If we look at comparable historical shifts, we notice that every new technological advancement humans achieve leads to the creation of new types of jobs, new challenges, new goals, etc.
We stress a lot about job replacement, when what we will actually witness is job transformation. For example, in software development, things that used to take a full engineering team months of effort 10 years ago can be completed in a few days now. In the same way, things that take us months to complete now will be achievable within days in 10 years.
@trixwit Unfortunately, you get to a point with this tech which is unlike any seen previously: the simultaneous obsolescence of a large portion of humans (at least 30%, but imo it would be unsurprising if 70%), with the longest required training time to date to overcome this obsolescence. All past tech either made a relatively small portion obsolete or was simply a tool that could be quickly adapted to, but it doesn't look like either will be the case this time.
(By long adaptation time I mean, e.g., on the scale of a college degree.)
As an example with horses, I think you could think of earlier tech as better saddles, then carriages for the horses to pull, but this as eventually the early car prototype.
I like the horses analogy
Yes totally agree with you. I misread the original question and thought it was about AI instead of AGI.
My initial comment might sound like I am not taking the risks of AI seriously. I actually think AGI will blow away many jobs that we love having, including programming.
If I had to give my opinion, I don't think AGI will happen within the next 10-year span. There will be intermediate generations of AI between the current AI and AGI, which will make the impact more gradual and transformational. So I believe the next phase in the industry will be augmentation between human intelligence and AI (the carriage-for-horses phase?), and it may last longer than we expect. Narrow AIs are bad at dealing with the complexity, unpredictability, and dynamism of the real world.
@Soli AGI = no human or group of humans is capable of beating the computer at any intellectual task.
@Soli Since AGI might be poorly defined or misleading (people equating it to ChatGPT), I would eliminate the term AGI altogether, and perhaps use the broader term AI instead.
@nsokolsky This seems like a very high threshold. Do you have any sources or additional explanations for this definition? For example, there are a variety of anthropocentric tasks where it may be easy for the most elite group of humans to maintain an advantage, even if AI exceeds the average person/professional 95%+ of the time.
For example, a professional comedian is vaguely/arguably engaged in an intellectual task. If the greatest comedian outperforms AI in making people laugh, does that automatically mean AGI hasn't been achieved, even if the AI outperforms people in law, medicine, music, and many other intellectual fields?
@Maniuser yes, if there exists at least 1 comedian or a group of comedians that is objectively better than the best AI, we haven't achieved AGI yet.
@Maniuser Just assume AGI can replace human labor. All human physical labor. Grunt work jobs. Plumbers, construction workers, cleaners.
Ironically enough, the complexity of maneuvering in the real world is less than the complexity of doing highly abstract leadership and planning jobs.
I don’t think it will happen in the next 10 years and possibly won’t happen in my lifetime. If it does happen I foresee just two possible outcomes:
1. We all die
2. We all live happily in the singularity
#1 will happen to every human by default by age 120 at the latest and is already a part of everyone's planning. #2 doesn't require planning, as the machines will do everything for you in that scenario.
The suggested answers are flawed. "I don't believe AGI will impact my life in that timeframe" is not an explanation for "Not At All".
AGI doesn't have to come in the next 10 years for its eventual expected arrival to affect one's plans for the next 10 years.
@WilliamKiely You have a point. I also forgot to add "Other" as an option, and a skip option, but it's not a scientific study, so 🤷🏻♂️
@TimothyBandors Can you elaborate on how it would hit infosec more than all other jobs, or at least knowledge work ones?
AGI is probably one of the most important factors in my decisions.
I was more comfortable laying off software engineers because I don't foresee needing to rehire them even if the cryptocurrency industry recovers; advanced LLMs are already taking over development tasks. Software engineer salaries are simply too high right now for the work the typical engineer puts out.
I'm looking to sell my company, to get capital to invest in nVidia, because I think there's more money to be made by simply buying nVidia stock than there is in continuing to operate the business.
I'm going to initiate a lawsuit against BlockFi CEO Zac Prince pro se, right before the statute of limitations expires. I can't afford to pay an attorney because he stole $3.3 million from me, but the longer I delay, the greater the chance that I can make suing him, Griffin Tiedy, and the other lenders who defrauded me a full-time job by using advanced LLMs as assistants. I just need to delay the case long enough for AI tools to become more capable.
@SteveSokolowski This comment, plus glancing through your Twitter, suggests that you're in an unusual (though not unique) situation of having a lot of human capital, time, and clarity of mission. Critically, you feel like you're "in the hole" by default and are taking big swings to get back. I'm not expressing this eloquently, but it contrasts with many SWEs' situations of "I'm primarily motivated to protect (and carefully grow) my money/family/health."
Am I understanding that right, and what do you think the implications are for your life strategy?
@MatthewRitter That's an interesting take. The reason is that I don't have that family and wealth to protect.
When I was 20, I decided that I would follow the "FIRE" strategy. I worked 60 hours a week and saved half or more of my income. The plan was that instead of working for 40 years, I would do almost nothing but work for half that time, and then do the traditional "dating/family/kids" stuff late if I chose to, while living in a big house or buying yachts or something (I never did have time to plan what I would buy). During those 20 years, I never even bought a car, still drive the same 20-year-old Prius, and never spent a single one of the 200+ bitcoins I had.
I was successful and made $12 million, and was about 6 months from "finishing up." However, I was provided false information in video calls by the CEOs and management of multiple cryptocurrency lending firms, and lost all of the money. That caused my business to nearly fail, led to all but one employee being laid off, and destroyed about 19 lives (others' money was also lost, including multiple kids' college educations and the care of a kid with autism; there was $20 million between all of us at one time).
So now, in this new situation, I realized I won't ever do the "traditional" family thing. A lot of these people like to blame me instead of Caroline Ellison and Zac Prince. My goal is to earn the money again in 10 years instead of 20, because that's how long I think we have left before I need that money to buy intelligence augmentation when AGI takes over. There will be no retirement for any of us anymore.
You're 100% right about money. The best way to describe me is a broke teenager starting out in life, just 20 years older. I now have a stock-trading machine-learning binary classifier with an accuracy of 89.9% and a backtest of 384% CAGR on unseen forward data, and little money to actually run the strategy with. I can seek $8 million from various defendants; even if I win just one of the suits and the person declares bankruptcy after paying 25%, the amount should be good "seed capital" for the bot. And note that even if I didn't have this bot, the expected value of these lawsuits is far higher than anything a "normal" career could earn.
My experience is that making money is trivial. I never had any trouble buying the right stocks, and I never lost a single dollar buying bitcoins. What's hard is keeping money. I calculated that the odds of losing your entire net worth in a scam are so high that the FIRE movement is fatally flawed. I don't think one can get the probability of total loss below 2% per year, which means that most people who subscribe to that movement are going to fail and return to work.
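To see why a 2% annual loss rate dooms a long FIRE horizon, here is a rough back-of-the-envelope sketch. Only the 2% figure comes from the comment above; the time horizons are illustrative, and the losses are assumed to be independent from year to year.

```python
# Cumulative probability of at least one total loss, assuming an
# independent 2% chance of losing everything in any given year.
# (Illustrative horizons; only the 2% figure is from the comment above.)
annual_loss_prob = 0.02

for years in (10, 20, 30, 40):
    p_total_loss = 1 - (1 - annual_loss_prob) ** years
    print(f"{years} years: {p_total_loss:.0%} chance of a total loss")

# Prints roughly: 10 years: 18%, 20 years: 33%, 30 years: 45%, 40 years: 55%
```

Under those assumptions, a multi-decade early-retirement horizon puts the cumulative chance of total loss around a coin flip, which is the sense in which "most people are going to fail and return to work."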
The lesson here for those reading this is that you should not follow the advice of "save lots of money." They don't teach you in school how to keep money; they just say "put aside your paycheck for retirement," without actually teaching people bankruptcy law and how little FDIC insurance covers and so on.
@SteveSokolowski Thank you for those thoughts. I don’t agree with all of the world model (obviously I agree/respect your personal story) but constructive sharing of viewpoints, and the experience/evidence behind them, is the best part of this site.
I’m sorry for your loss, and don’t blame you. Best of luck making it back, and/or any other goals you might have.
Because there is little I can do about it, I cannot let it factor into my day-to-day. If I thought writing my congressperson were effective, I could consider that, but even if it were effective, I'm not sure that Congress would be able to do anything about it (though I am fairly confident that I would prefer a US-based AI to one from China).