The contents of this Riseup pad https://pad.riseup.net/p/first-manifold-experiment-keep will be Squiggle code that estimates the number of piano tuners in Chicago, on July 1st, 2022
Resolved YES · Ṁ253 · Jul 4
A Riseup pad is a lightweight, anonymous, and ephemeral alternative to Google Docs.
Squiggle is an estimational programming language in early access: https://www.squiggle-language.com
You can view the history of the pad by clicking the timepiece-looking icon in the top-right corner. The pad will disappear forever if it goes 365 days without an edit.
If a vandal of some kind erases the contents of the pad or turns them into nonsense, and no one fixes it by the time I view it after market close, the market will resolve N/A.
Hint: legal Squiggle is a sequence of assignments separated by newlines or `;`, optionally with an expression on the last line. A non-assignment (i.e. an expression) anywhere other than the last line is illegal (the compiler will tell you it's unhappy). An expression is a number, a probability distribution, or data like a list or record. Read the Squiggle docs. You should return a distribution, not a number.
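For illustration, here is a minimal sketch of what legal Squiggle looks like; the variable names and ranges below are made up for this example, not the intended answer.

```
// Legal: every line but the last is an assignment; the final line is an
// expression that returns a distribution.
pianosInChicago = 20k to 200k     // assignment: a ranged estimate (90% CI)
pianosPerTuner = 100 to 1000      // assignment: pianos one tuner can serve
pianosInChicago / pianosPerTuner  // final expression: a distribution, not a number

// Illegal: an expression before the last line, e.g.
//   normal(5, 2)
//   x = 3
// will make the compiler complain.
```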
I will not bet on this market, and I will not view the pad until market close. My only interaction with the pad will be initializing it with a three-line comment at the top (and then reading it after market close).
At resolution time I will write my own Squiggle file that estimates the number of piano tuners in Chicago, and I will resolve YES if the Kullback-Leibler divergence between my distribution and the Manifold community's distribution is finite.
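For reference, here is a hedged sketch of what that resolution check could look like in Squiggle; the two distributions and the argument order are illustrative assumptions, not the actual resolution code.

```
// Toy version of the resolution check: compare two distributions.
mine = lognormal(4, 1)           // stand-in for my estimate (log-space parameters)
community = lognormal(4.5, 1.2)  // stand-in for the pad's distribution
// klDivergence returns a finite number when the two distributions overlap
// reasonably, and can blow up when they barely overlap.
klDivergence(mine, community)
```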
This question is managed and resolved by Manifold.
Well done, manifriends! Here's my final verdict: https://develop--squiggle-documentation.netlify.app/playground#code=eNqNVNFu2kAQ%2FJWVyQOJUAIteUHNQ5WkaqoAakkeKlk6HXiNTznvubdnEI3y792zaQQNSWNZstjbmdkdj3lMuHDru02FnIyCr7HXFC4dBe%2Bs1HJteVuc1WWp%2FWavdp2Z4PxfqCETjLazX7VZLi3Ogje0TEZJpwN3X69h%2FHly82V6ewWX0%2FH4fnJz93MEKXXAEYQC4dtsOoGg5yNgDKCZ67IKRg5X2tbIUgFDVR04Yh7TpHKVyxeFWeilS5PRx35z9eTAaHKsKvQq16WxGzntn%2FbPn48KTZnFLHaEmtDLc4PaS9tAGJ5SigpxiBYObH5jlF8XOuAKPWSGZbd53YzHiCWDR82O9NxiBEuzhtx4DoAcTCm4HpSo6WIIIg4csgxXF4OD4JrFNhBLaSG4W8wDBAd65UwG5IiRWJa2guLaBobcu3Jv1E8wSKktqKZwscfWJedLbbvD3uBY7pQaV1RjBStDamuqwLrdo12b4QSOXrh7DGewIxZ%2FHr3ts%2Bz4imRjfQe%2B399MJjEcIl5bHW1WLt%2BZazCOlpyPUypczVg4mx3cdNC09SBu%2BdzJO0SHBc5gn1YGQb9ACmqHZG1CodpFhUgC1m%2FUYtK2lv5j5n84TuDlhNGPmAaLascxtXb%2BQR7S7%2FUSm3cRjRWNYV94zvsRJ7FDZYIK%2BgFZhRbb0hy06cPpsDWqsUpeCyFmoh2hc1RcxdmFI5Zkr63gy0VP4C3lTvtlx4Q7Qmj5ojXQRiMl%2BfcgUq9n8t3DncH7rWuD14HZ5fTHdUweL5yPcXqwV0Y%2B%2BSXSArtvjtZ7LdTHW7bk6Q%2BiNf7o
I resolved YES even though, due to Monte Carlo details, the `klDivergence` function sometimes returns an infinite or complex result; about a third of the time it returns around 2.4. The reason I'm not being a stickler or a hardass is that `klDivergence` is a finicky function and I'm pleased y'all got the spirit of it.
My sketch and your sketch differ by 3 OOMs, so we're definitely thinking a little differently about it.
@Quinn Thanks. For reference, I have no background in statistics beyond high school level, just a little self-education in Python. While testing my code I found that the Monte Carlo simulation tended to come back with more wildly varying results the more distribution arithmetic was stacked together (initially I had a distribution for every single variable), so I figured a more meaningful result would come from minimizing the number of distribution arithmetic operations, even though that's not strictly in the spirit of a Fermi estimate.
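To make that trade-off concrete, here is a hedged Squiggle sketch with made-up numbers: a full Fermi chain stacks several sampled ranges and picks up Monte Carlo noise, while a single collapsed range keeps the output steadier at the cost of hiding the reasoning in its endpoints.

```
// Fermi chain: each `to` range is sampled, so noise compounds across the chain.
population = 2.5M to 3M
pianosPerPerson = 0.002 to 0.02
tuningsPerPianoPerYear = 0.5 to 2
tuningsPerTunerPerYear = 400 to 1500
stacked = population * pianosPerPerson * tuningsPerPianoPerYear / tuningsPerTunerPerYear

// Collapsed version: one range, steadier samples, but the factors are implicit.
collapsed = 30 to 300
collapsed
```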