• elmtonic@lemmy.world · 1 year ago

    The cool thing to note here is how badly Yud misunderstands what a normal person means when they say they have “100% certainty” in something. We’re not fucking infinitely precise Bayesian machines; 100% means exactly the same thing as 99.99%. It means exactly the same thing as “really really really sure.” A conversation between the two might go like this:

    Unwashed sheeple: Yeah, 53 is prime. 100% sure of that.

    Ellie Bayes-er: (grinning) Can you really claim to be 100% sure? Do not make the mistake of confusing the map with the territory, [5000 words redacted]

    Unwashed sheeple: Whatever you say, I’m 99% sure.

    Eddielazer remains seated, triumphant in believing (epistemic status: 98.403% certainty) he has added something useful to the conversation. The sheeple walks away, having changed exactly nothing about his opinion.

    • maol@awful.systems · 1 year ago

      Mr Yudkowsky is supposedly able to understand advanced maths but doesn’t know what rounding is. I think we did rounding in 3rd or 4th grade…
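      For the record, the rounding in question takes one line (a throwaway illustration of my own, not anyone’s methodology):

```python
# To anyone not carrying four decimal places in their head,
# "99.99% sure" and "100% sure" are the same claim.
confidence = 99.99
print(round(confidence))  # prints 100
```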

      • bitofhope@awful.systems · 1 year ago

        You might be comfortable using a single significant digit for any probabilities you pull out of your ass, but Yud’s methods are free of experimental error. He does Aristotelian science, deriving everything from pure reason, the approach that brought you bangers like “men have more teeth than women”. In Yud’s case, most of his ideas are unfalsifiable to begin with, so why not have seventeen nines’ worth of certainty in them? Literally can’t be false! Not even AWS would promise these kinds of SLAs!
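        For scale, a quick sketch (my own numbers, standard SLA arithmetic) of how much annual downtime a given number of nines actually permits:

```python
def max_downtime_seconds_per_year(nines: int) -> float:
    """Annual downtime allowed by an availability of `nines` nines (99.9...%)."""
    seconds_per_year = 365 * 24 * 3600
    unavailability = 10.0 ** (-nines)  # fraction of the year you may be down
    return seconds_per_year * unavailability

# Three nines (99.9%) allows roughly 8.8 hours of downtime a year;
# seventeen nines would allow about a third of a nanosecond.
print(max_downtime_seconds_per_year(3) / 3600)
print(max_downtime_seconds_per_year(17) * 1e9)
```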

  • YouKnowWhoTheFuckIAM@awful.systems · 1 year ago

    I am absolutely astonished that anybody with the most basic understanding of relativity could ever take Yud for some kind of brain genius after he shows his entire mucky arse across just those two opening paragraphs.

    • YouKnowWhoTheFuckIAM@awful.systems · 1 year ago

      Btw…

      I am not going to discuss the actual experiments that have been done on calibration—you can find them in my book chapter on cognitive biases and global catastrophic risk[1]—because I’ve seen that when I blurt this out to people without proper preparation, they thereafter use it as a Fully General Counterargument, which somehow leaps to mind whenever they have to discount the confidence of someone whose opinion they dislike, and fails to be available when they consider their own opinions.

      lol

  • swlabr@awful.systems · 1 year ago

    I had 100% certainty that 53 was prime when I was 12. Does that mean I was smarter than Yud back then? (I may have become more stupid since learning about LW)

  • blakestacey@awful.systems (mod) · 1 year ago

    I didn’t expect that the repetition of a banal yet occasionally useful saying like “the map is not the territory” could make a person deserve being shoved into a locker, but life will surprise us all.

    Mixed in with the rank, fetid ego are amusing indications that Yud gave very little thought to what Bayesian probability actually means. I find that entertaining.

    • blakestacey@awful.systems (mod) · 1 year ago

      Suppose you say that you’re 99.99% confident that 2 + 2 = 4.

      Then you’re a dillbrain.

      Then you have just asserted that you could make 10,000 independent statements, in which you repose equal confidence, and be wrong, on average, around once. Maybe for 2 + 2 = 4 this extraordinary degree of confidence would be possible

      Yes, how extraordinary that I can say every day that the guy in front of me at the bodega won’t win the Powerball. Or that [SystemRandom().random() >= 0.9999 for i in range(10000)] makes a list that is False in all but one spot.
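      Spelling that one-liner out (my own sketch; the exact count varies run to run, but it hovers around one):

```python
from random import SystemRandom

rng = SystemRandom()
# Each comparison is True with probability 0.0001, so across 10,000
# independent draws we expect about one True entry and 9,999 Falses.
trials = [rng.random() >= 0.9999 for _ in range(10_000)]
print(sum(trials))  # usually 0, 1, or 2
```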

      P(x|y) is defined as P(x,y)/P(y). P(A|A) is defined as P(A,A)/P(A) = P(A)/P(A) = 1. The ratio of these two probabilities may be 1, but I deny that there’s any actual probability that’s equal to 1. P(|) is a mere notational convenience, nothing more.

      No, you kneebiter.
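      For anyone who would rather check the definition than argue about it, here is a toy sample space (my example, not Yud’s) where P(A|A) comes out as an honest-to-goodness 1:

```python
from fractions import Fraction

omega = set(range(1, 7))   # sample space: a fair six-sided die
A = {2, 4, 6}              # the event "roll is even"

def prob(event):
    """Probability of an event in the uniform toy sample space."""
    return Fraction(len(event & omega), len(omega))

# P(A|A) = P(A and A) / P(A) = P(A) / P(A) = 1:
# a real probability, not a "mere notational convenience".
p_a_given_a = prob(A & A) / prob(A)
print(p_a_given_a)  # prints 1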

      • titotal@awful.systems (OP) · edited · 1 year ago

        I roll a fair 100-sided die.

        Eliezer asks me to state my confidence that I won’t roll a 1.

        I say I am 99% confident I won’t roll a 1, using basic math.

        Eliezer says “AHA, you idiot, I checked all of your past predictions, and when you predicted something with 99% confidence, it only happened 90% of the time! So you can’t say you’re 99% confident that you won’t roll a 1.”

        I am impressed by the ability of my past predictions to affect the roll of a die, and promptly run off to become a wizard.
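        The arithmetic, simulated (my sketch, seeded so it is reproducible):

```python
import random

rng = random.Random(0)  # fixed seed: past predictions cannot reach in here
rolls = [rng.randint(1, 100) for _ in range(100_000)]
# A fair d100 avoids any particular face 99% of the time, regardless of
# how well calibrated the roller's other predictions have been.
frac_not_one = sum(r != 1 for r in rolls) / len(rolls)
print(frac_not_one)  # close to 0.99
```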

        • Sailor Sega Saturn@awful.systems · edited · 1 year ago

          Ah but the machine gods could be tinkering with your neural wiring to make you think you’re rolling a die when in reality the universe is nothing but the color pink. That’s right, reality is nothing but a shade of Fuchsia and dice don’t actually exist. You should take this possibility into account when adjusting your priors for some reason.

          Epistemic Status: Barbie.

    • blakestacey@awful.systems (mod) · 1 year ago

      Suppose I said, “I have this clock that I really like. It’s a very nice clock. So, I am going to measure everything I can in terms of the times registered on this clock.”

      “OK,” you might say, while wondering what the big deal is.

      “In fact, I am going to measure all speeds as the time it takes to travel a standard unit of distance.”

      “Uh, hold on.”

      “And this means that, contrary to what you learned in Big University, zero is not a speed! Because the right way to think of speed is the time it takes to travel 1 standard distance unit, and an object that never moves never travels.”

      Now, you might try to argue with me. You could try to point out all the things that my screwy definition would break. (For starters, I am throwing out everything science has learned about inertia.) You could try showing examples where scientists I have praised, like Feynman or whoever, speak of “a speed equal to zero”. When all that goes nowhere and I dig in further with every reply, you might justifiably conclude that I am high on my own supply, in love with my own status as an iconoclast. Because that is my real motivation, neither equations nor expertise will sway me.

      Yud argues that 0 and 1 are not probabilities in exactly this way. He says that you can’t turn a probability of 0 or 1 into an odds ratio, because you’d be dividing by 0. This and everything that followed is just getting high off his own supply. One could try showing how he presumes his own conclusion. One could try showing how he breaks the basic idea that probabilities by their nature add up to 100% (given an event E, what can Yud say is the probability of the event E-or-not-E?). One could even observe that the same E. T. Jaynes he praises in that blog post uses 1 as a probability, for example in Chapter 2 of Probability Theory: The Logic of Science (Cambridge University Press, 2003). If you really want to cite someone he admires, you could note that Eliezer Yudkowsky uses 1 as a probability when trying (and failing) to explain quantum mechanics, because he writes probability amplitudes of absolute value 1.
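      The complement argument takes three lines to check (exact rationals, my own toy numbers). The odds ratio blows up at p = 1 because the change of variables does, not because 1 stops being a probability:

```python
from fractions import Fraction

def odds(p):
    """Odds ratio p / (1 - p): undefined at p = 1, which says something
    about the odds-ratio transformation, not about probability itself."""
    return p / (1 - p)

p_e = Fraction(1, 4)    # some event E
print(p_e + (1 - p_e))  # P(E) + P(not-E) = 1, always; prints 1
print(odds(p_e))        # prints 1/3
```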

      As an academic, I have to hold myself back from developing all those themes and more.