Episode 27 - Big Algorithm, Fat Tails, and Converging Priors
Military looking for algorithms that require less data
Where’s Waldo Finding Robot
Twitter is the only place that hasn’t removed Alex Jones
Oops! This one went out of date in about a day. We’ll follow up next week!
Tweet on Bayesian Priors that don’t have convergent posteriors
The idea behind “Bayesian” approaches is that if two people start with different priors, their posteriors will eventually converge to the same estimate as they update on the same data. A “wrong” prior is therefore OK.
Under fat tails, if you start with the wrong prior, the posteriors may never converge. (Taleb & Pilpel, 2004)
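The convergence idea above can be sketched in a few lines (this is an illustrative example, not from the episode): two agents with very different Beta priors on a coin's bias both update on the same flips, and with Beta–Bernoulli conjugacy the posterior is just Beta(a + heads, b + tails), so their posterior means are forced together as data accumulates. Note this thin-tailed setup is exactly the friendly case; Taleb & Pilpel's point is that with fat-tailed likelihoods this guarantee can fail.

```python
import random

random.seed(0)
true_p = 0.7  # true bias of the coin (unknown to the agents)

# Agent 1's prior leans heavily toward tails, Agent 2's toward heads.
a1, b1 = 1.0, 20.0
a2, b2 = 20.0, 1.0

def posterior_mean(a, b):
    # Mean of a Beta(a, b) distribution.
    return a / (a + b)

gap_before = abs(posterior_mean(a1, b1) - posterior_mean(a2, b2))

# Both agents observe the same 5000 flips and do conjugate updates.
for _ in range(5000):
    flip = 1 if random.random() < true_p else 0
    a1, b1 = a1 + flip, b1 + (1 - flip)
    a2, b2 = a2 + flip, b2 + (1 - flip)

gap_after = abs(posterior_mean(a1, b1) - posterior_mean(a2, b2))
print(f"gap before: {gap_before:.3f}, gap after: {gap_after:.5f}")
```

Since both agents see identical data, the gap between their posterior means shrinks deterministically from 19/21 (about 0.905) to 19/5021 (under 0.004), regardless of which flips come up.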
Video that mentions all the stats about the world getting better: