I define AGI as an entity that is better than humans at everything skill-related. Hence a robot that can't feel emotions would still count as AGI
I don't think LessWrong people got the beliefs they have because of GPT-4. If you disagree with me, you can bet against me / take my limit orders
@OnurcanYasar if you bet enough against the 2020s, the price reaches the limit order I put up, and you'll be letting me buy YES shares from you at that price
@OnurcanYasar I think research funding, algorithmic improvements, hardware improvements, research interest, and the scaling hypothesis all point to this likely happening during the next decade. I'm a bit of a doomer, yeah; if I make profits from this bet I don't expect them to last me a long time
@TheBayesian Ok, makes sense. I am optimistic about AGI, but don't think we are that close to achieving it. It seems to me that the closer you think we are to AGI, the more of a doomer you tend to be, so I would expect some correlation there
@OnurcanYasar yeah, the correlation makes sense! The closer we are to AGI, the less time we have to get it right, so the more likely it is to go poorly. If we are still around, without AGI, in 2040, I will be much, much more confident in our odds of civilizational survival.