Will Tyler Cowen agree that an 'actual mathematical model' for AI X-Risk has been developed by October 15, 2023?
Resolved NO (Oct 16)

On episode #893 of Russ Roberts's EconTalk podcast, guest Tyler Cowen challenges Eliezer Yudkowsky and the LessWrong/EA alignment communities to develop a mathematical model of AI X-Risk.

https://www.econtalk.org/tyler-cowen-on-the-risks-and-impact-of-artificial-intelligence/

This market resolves to "YES" if Tyler Cowen publicly acknowledges, by October 15, 2023, that an actual mathematical model of AI X-Risk has been developed.

Two clips from the conversation:

https://youtube.com/clip/Ugkxtf8ZD3FSvs8TAM2lhqlWvRh7xo7bISkp

...But, I mean, here would be my initial response to Eliezer. I've been inviting people who share his view simply to join the discourse. So, they have the sense, 'Oh, we've been writing up these concerns for 20 years and no one listens to us.' My view is quite different. I put out a call and asked a lot of people I know, well-informed people, 'Is there any actual mathematical model of this process of how the world is supposed to end?'

So, if you look, say, at COVID or climate change fears, in both cases, there are many models you can look at, including--and then models with data. I'm not saying you have to like those models. But the point is: there's something you look at and then you make up your mind whether or not you like those models; and then they're tested against data...

https://youtube.com/clip/Ugkx4msoNRn5ryBWhrIZS-oQml8NpStT_FEU

...So, when it comes to AGI and existential risk, it turns out as best I can ascertain, in the 20 years or so we've been talking about this seriously, there isn't a single model done. Period. Flat out.

So, I don't think any idea should be dismissed. I've just been inviting those individuals to actually join the discourse of science. 'Show us your models. Let us see their assumptions and let's talk about those.'...



Tyler's most recent summary of his views on AI X-Risk, published Nov 19, 2023:

https://marginalrevolution.com/marginalrevolution/2023/11/my-summary-views-on-ai-existential-risk.html

Third paragraph from the end:

"Note that the pessimistic arguments are not supported by an extensive body of peer-reviewed research — not in the way that, say, climate-change arguments are. So we’re being asked to stop a major technology on the basis of very little confirmed research. In another context, this might be called pseudo-science."

sold Ṁ731 of YES

I admit defeat :(

As a heads up, I did email TC about this and he did seem interested. When I asked him whether he thought it counted as a mathematical model, he said, "What do you think?", to which I wrote a decent-length email explaining why I think it counts. I also disclosed that I had a bias, as I was invested in this market. He then posted a link to this market on MR and didn't respond for a week, so I followed up and asked for his thoughts. He replied roughly that he was "amused people care so much what I think". He has not responded since.

[Insert exasperated emoji here]

Additional discussion on the Russ Roberts EconTalk podcast with Zvi Mowshowitz:
(41:17-44:40) modeling AI X-Risk mathematically, similar to climate models
(1:15:50-1:17:44) developing a gears-level model of AI X-Risk

https://www.youtube.com/watch?v=IpTBiTMGGn0

"At the halfway point of the market to the Oct 15th closing date, I haven’t found any further public comment from you on the topic .

 Have you updated your position on the existence of a mathematical model for AI X-Risk?"


From: Tyler Cowen 

To: Joe Brenton

Sun, Jul 30 at 7:24 AM

Still haven’t seen one…

Tyler

Marc Andreessen also made this point about models on Lex Fridman's podcast recently.

predicted NO

AI risk =

max(BMI, hair length) * (sci-fi novels read)**0.5 / (coding ability) * (regulatory capture probability)

NB:

Global warming risk =

(Female = 3, Male = 1) / age**0.5

* 1 / (0.5 + number of offspring)

* population density of neighborhood

/ (1 + growth rate of country)**2

predicted NO

Veganism =

IQ / (upper body strength)

* (neuroticism * conscientiousness)**0.5
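
For anyone who wants to poke at these joke formulas, here is a minimal Python transcription. A sketch only: the variable names, the boolean sex flag, and the example inputs are my own assumptions, since the comments above give nothing but the bare expressions.

```python
# Tongue-in-cheek transcription of the satirical "models" above. All
# variable names and input scales are assumptions; these are jokes,
# not calibrated risk models.

def ai_risk(bmi, hair_length, scifi_novels_read, coding_ability,
            regulatory_capture_probability):
    # max(BMI, hair length) * (sci-fi novels read)**0.5
    #   / (coding ability) * (regulatory capture probability)
    return (max(bmi, hair_length) * scifi_novels_read ** 0.5
            / coding_ability * regulatory_capture_probability)

def global_warming_risk(is_female, age, num_offspring,
                        neighborhood_density, country_growth_rate):
    # (Female = 3, Male = 1) / age**0.5 * 1/(0.5 + offspring)
    #   * neighborhood density / (1 + growth rate)**2
    sex_factor = 3 if is_female else 1
    return (sex_factor / age ** 0.5
            * 1 / (0.5 + num_offspring)
            * neighborhood_density
            / (1 + country_growth_rate) ** 2)

def veganism(iq, upper_body_strength, neuroticism, conscientiousness):
    # IQ / (upper body strength) * (neuroticism * conscientiousness)**0.5
    return iq / upper_body_strength * (neuroticism * conscientiousness) ** 0.5

# Hypothetical example inputs, purely for illustration.
print(ai_risk(bmi=24, hair_length=30, scifi_novels_read=40,
              coding_ability=2, regulatory_capture_probability=0.3))
```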

@Gigacasting For your global warming risk model, if I'm interpreting it right, the higher a country's growth rate, the lower (quadratically) the subjective global warming risk. This seems counter-intuitive to me. Care to elaborate? Are you using it as a proxy for GDP or something else?

@Gigacasting Where are you? I miss your ass. 🍑

predicted NO

@parhizj He’s implying that if you have kids you don’t care about global warming (kids produce lots of CO2)

@mariopasquato Ah, I think I was thinking of economic growth, not population growth.

predicted NO

@parhizj I see… anyway, if you are growing a lot you are likely a developing country (catch-up growth), so you care more about raising standards of living than about CO2