As suggested in this thread, to a general “yeah, sounds cool”. Let’s see if this goes anywhere.
Original inspiration:
The post-Xitter web has spawned so many “esoteric” right-wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).
If your sneer seems higher quality than you thought, feel free to make it a post, there’s no quota here
So today I learned there are people who call themselves superforecasters®. Neat!
The superforecasters® have had a melding of the minds and determined that covid-19 was 75% likely to not be a lab leak. Nifty! This is useless to me!
Looking at the website of these people with good enough judgement to call themselves “Good Judgement”, you can learn that 100% of superforecasters® agree that there will be fewer than 100 deaths from H5N1 this year. I don’t know much about H5N1, but I guess that makes sense given that it’s been around since 1996 and would need a mutation to become contagious among humans.
I found one of the superforecaster®-trainee discussion topics where they reveal some of the secrets to their (super)forecasting(®)-trainee instincts:
Riveting!
Let’s go next to find out how to give up our individuality and become a certified superforecaster® hive brain.
Fans of certain shonen anime may recognize this technique as Kodoku – a deadly poison created by putting a bunch of insects in a jar until only one remains:
“But what’s the catch, Saturn?” I can hear you say. “Surely this is somehow a way to grift nerds, or to fleece money out of governments.”
Nonono, you’ve got completely the wrong idea. Good Judgement offers a $100 Superforecasting Fundamentals course out of the goodness of their hearts, I’m sure! I mean, after all, if they spread Superforecasting to the world then their Hari Seldon-esque hivemind would lose its competitive edge, so they must not be profit-motivated.
Anyway if you work for the UK they want to hear from you:
Maybe they have superforecasted the fall of the British Empire.
And to end this, because I can never resist web design sneer.
Dear programmers: if you apply the CSS
word-break: break-all;
to the string “Privacy Policy”, it may end up rendered as “Pr[newline]ivacy Policy”, which unfortunately looks pretty unprofessional :(

[Image: A superforecaster making a prediction]
Is that from a cult classic I may not have run into? Because just the presence of Orbital in the background there has piqued my curiosity
Mean Girls! (the original)
Thanks, got a weekend watch then :)
I know what you’re thinking.
You’re thinking “Saturn, that could have been a post!”
I know, I know, but I can’t handle that kind of pressure. If someone else wants to make a post about this, or about prediction markets, don’t let me stop you. It’s an under-sneered area at the intersection of tech weirdos, that other kind of tech weirdos, and that third kind of tech weirdos.
I understood this reference. I know it as Gu poison, which is listed in the Wikipedia article you linked!
When I was a kid I read a vignette about a guy trying to scam people into thinking he was amazing at predicting things. He chose 1024 stockbrokers and picked one stock; in 512 envelopes he said the stock would be up by the end of the month, and in the other 512 he said it would go down. You can see where this story is going: he would be left with one person who thought he had predicted 10 things in a row correctly and was therefore a superforecaster. This vignette was great at illustrating to child me that predicting things correctly isn’t necessarily some display of great intelligence or insight. Unfortunately, what I didn’t know was that it was setting me up for great disappointment: from then on, I would see time and time again that people fall for this shit so easily.
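The arithmetic of the scam is just repeated halving. A quick sketch (the 1024-mark setup is from the story; the code itself is mine):

```python
def run_scam(marks: int = 1024) -> tuple[int, int]:
    """Simulate the envelope scam: each round, the scammer tells half the
    remaining marks 'up' and half 'down', so whatever the stock does,
    exactly half of them received a 'correct' prediction.
    Returns (rounds, survivors)."""
    rounds = 0
    while marks > 1:
        marks //= 2  # only the half that got the right envelope stays
        rounds += 1
    return rounds, marks

rounds, survivors = run_scam(1024)
print(rounds, survivors)  # 10 rounds leave 1 very impressed mark
```

Ten halvings of 1024 leave exactly one person who saw ten consecutive correct calls, with zero forecasting skill involved.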
(For some reason when I try to think of where I read that vignette, Vonnegut comes to mind. I doubt it was him.)
@swlabr @sailor_sega_saturn
“He chose 1024 stockbrokers, picked one stock, and in 512 envelopes he said the stock would be up by the end of the month, and in the other 512 he said it would go down.”
1024 stamps?
This guy is clearly already made of money so why is he even bothering.
/s
Stocks and stamps, 2 tastes that have gone great together since Charles Ponzi.
also: the Jen Barber storyline from that IT Crowd episode.
lmao this is one of my all-time favorite grifts. I’ve never understood why it isn’t more popular among us connoisseurs. It’s so bald-faced to say “statistically, someone probably has oracular powers, and thanks to the power of science, here they are. You need only pay us a small incense-and-rites fee to access them.”
Isn’t it weird these people came out of internet atheism of all things and go right into this stuff?
It’s really gotta be emphasised that these guys didn’t come out of internet atheism, and frankly I would really like to know where that idea came from. It’s a completely different thing which, arguably, predates internet atheism (if we read “internet atheism” as beginning in the early 2000s - but we could obviously push that date back much earlier). These guys are more or less out of Silicon Valley. Émile P. Torres has coined the term “TESCREALs” (modified to “TREACLES”) for - and I had to google this even though I know all the names independently - “Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalism, Effective Altruism, and Longtermism”.
It’s a confluence of futurism cults which primarily emerged online (even on the early internet), but also in airport books by e.g. Ray Kurzweil in the 90s, and it has gradually made its way into the wider culture, with EA and longtermism now the most successful outgrowths of its spores in the academy.
Whereas internet atheism kind of bottoms out in 1990s polemics against religion - nominally Christianity, but ultimately fuelled by the end of the Cold War and the West’s hunger for a new enemy (hey look over there, it’s some brown people with a weird religion) - the TREACLES “cluster of ideologies” (I prefer “genealogy”, because this is ultimately about a political genealogy) has deep roots in the weirdest end of libertarian economics/philosophy and rabid anti-communism. And therefore the Cold War (and even pre-Cold War) need for a capitalist political religion. OK the last part is my opinion, but (a) I think it stands up, and (b) it explains the clearly deeply felt need for a techno-religion which justifies the most insane shit as long as there’s money in it.
Yeah, I hung out a lot in Internet skeptic/atheist circles during the 2005-10 era, and as far as I can recall, the overlap with LessWrong, Overcoming Bias, etc., was pretty much nil. This was how that world treated Ray Kurzweil.
I read
and immediately thought someone should introduce PZ Myers to rat/EA as soon as possible.
Turns out he’s aware of them since at least 2016:
Are these people for real?
More recently, it seems that as an evolutionary biologist he apparently has thoughts on the rat concept of genetics: The eugenicists are always oozing out of the woodwork
FWIW I used to read PZM quite a bit before he pivoted to doing YouTube videos, which I don’t have the patience for, and he checked out of the new atheist movement (such as it was) pretty much as soon as it became evident that it was gradually turning into a safe space for islamophobia and misogyny.
PZ is aware and thinks they’re bozos. As a biologist, he was particularly pointed about cryonics.
Technoloons is a good word, going to have to remember that.
I wonder if the existence of RationalWiki contributes to the confusion. Even though it’s unrelated to and critical towards TREACLES, the name can cause confusion.
Imo because the whole topic of superforecasters and prediction markets is both undercriticized and kaleidoscopically preposterous, in a way that makes it feel like you shouldn’t broach the topic unless you’re prepared to commit to some diatribe-length posting.
Which somebody should. It’s a shame there is no single place yet you can point to and say “here’s why this thing is weird and grifty and pretend science while being strictly promoted by the scientology of AI, and also there’s crypto involved”.
If you have ever wondered why so many Rationalists do weird end-of-year predictions and keep stats on them, it is because they all want to become superforecasters. (And remember: by correctly forecasting trivial things that are sure to happen, you can inflate your percentage of correct forecasts and become a superforecaster yourself. For the same reason, never try to forecast black swans (or just predict they will not happen, for more superforecast points).)
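A toy illustration of that padding trick, using the Brier score (a common forecast-accuracy measure; the specific numbers here are my own made-up example, not anyone’s real scoring):

```python
def brier(forecasts):
    """Mean Brier score for a list of (probability, outcome) pairs,
    where outcome is 1 if the event happened, else 0. Lower = 'better'."""
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

risky = [(0.6, 1), (0.7, 0)]    # two genuinely hard calls, one badly wrong
padding = [(0.99, 1)] * 20      # twenty "the sun will rise tomorrow" calls

print(brier(risky))             # 0.325 - mediocre
print(brier(risky + padding))   # ~0.0296 - looks superb after padding
```

Same two hard predictions, but drowning them in near-certain trivia drags the average score from mediocre to superforecaster-grade.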
See also: you could have been a winner, and gotten a free sub to ACX!
this is what literary Bayesianism promises: the ability to pluck numbers out of your ass and announce them in a confident voice.