Best 135 quotes of Eliezer Yudkowsky on MyQuotes

Eliezer Yudkowsky

  • A burning itch to know is higher than a solemn vow to pursue truth. To feel the burning itch of curiosity requires both that you be ignorant, and that you desire to relinquish your ignorance.

  • After all, if you had the complete decision process, you could run it as an AI, and I'd be coding it up right now.

  • And someday when the descendants of humanity have spread from star to star, they won’t tell the children about the history of Ancient Earth until they’re old enough to bear it; and when they learn they’ll weep to hear that such a thing as Death had ever once existed!

  • Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it's probably going to stay there.

  • "Boys," said Hermione Granger, "should not be allowed to love girls without asking them first! This is true in a number of ways and especially when it comes to gluing people to the ceiling!"

  • By and large, the answer to the question "How do large institutions survive?" is "They don't!" The vast majority of large modern-day institutions - some of them extremely vital to the functioning of our complex civilization - simply fail to exist in the first place.

  • By far the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.

  • Crocker's Rules didn't give you the right to say anything offensive, but other people could say potentially offensive things to you, and it was your responsibility not to be offended. This was surprisingly hard to explain to people; many people would read the careful explanation and hear, "Crocker's Rules mean you can say offensive things to other people."

  • Do not flinch from experiences that might destroy your beliefs. The thought you cannot think controls you more than thoughts you speak aloud. Submit yourself to ordeals and test yourself in fire. Relinquish the emotion which rests upon a mistaken belief, and seek to feel fully that emotion which fits the facts.

  • "Every time someone cries out in prayer and I can't answer, I feel guilty about not being God." "That doesn't sound good." "I understand that I have a problem, and I know what I need to do to solve it, all right? I'm working on it." Of course, Harry hadn't said what the solution was. The solution, obviously, was to hurry up and become God.

  • Existential depression has always annoyed me; it is one of the world's most pointless forms of suffering.

  • Have I ever remarked on how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives and give them nearly no support in doing so? Support like, say, spending a day apiece watching twenty different jobs and then another week at their top three choices, with salary charts and projections and probabilities of graduating that subject given their test scores? The more so considering this is a central allocation question for the entire economy?

  • He'd met other prodigies in mathematical competitions. In fact he'd been thoroughly trounced by competitors who probably spent literally all day practising maths problems and who'd never read a science-fiction book and who would burn out completely before puberty and never amount to anything in their future lives because they'd just practised known techniques instead of learning to think creatively.

  • I am a full-time Research Fellow at the Machine Intelligence Research Institute, a small 501(c)(3) public charity supported primarily by individual donations.

  • I am tempted to say that a doctorate in AI would be negatively useful, but I am not one to hold someone’s reckless youth against them – just because you acquired a doctorate in AI doesn’t mean you should be permanently disqualified.

  • I ask the fundamental question of rationality: Why do you believe what you believe? What do you think you know and how do you think you know it?

  • If cryonics were a scam it would have far better marketing and be far more popular.

  • If dragons were common, and you could look at one in the zoo - but zebras were a rare legendary creature that had finally been decided to be mythical - then there's a certain sort of person who would ignore dragons, who would never bother to look at dragons, and chase after rumors of zebras. The grass is always greener on the other side of reality. Which is rather setting ourselves up for eternal disappointment, eh? If we cannot take joy in the merely real, our lives shall be empty indeed.

  • If I'm teaching deep things, then I view it as important to make people feel like they're learning deep things, because otherwise, they will still have a hole in their mind for "deep truths" that needs filling, and they will go off and fill their heads with complete nonsense that has been written in a more satisfying style.

  • If people got hit on the head by a baseball bat every week, pretty soon they would invent reasons why getting hit on the head with a baseball bat was a good thing.

  • If the iron is hot, I desire to believe it is hot, and if it is cool, I desire to believe it is cool.

  • If you are equally good at explaining any outcome, you have zero knowledge.

  • If you handed [character] a glass that was 90% full, he'd tell you that the 10% empty part proved that no one really cared about water.

  • If you've been cryocrastinating, putting off signing up for cryonics "until later", don't think that you've "gotten away with it so far". Many worlds, remember? There are branched versions of you that are dying of cancer, and not signed up for cryonics, and it's too late for them to get life insurance.

  • If you want to build a recursively self-improving AI, have it go through a billion sequential self-modifications, become vastly smarter than you, and not die, you've got to work to a pretty precise standard.

  • If you want to maximize your expected utility, you try to save the world and the future of intergalactic civilization instead of donating your money to the society for curing rare diseases and cute puppies.

  • I keep trying to explain to people that the archetype of intelligence is not Dustin Hoffman in 'Rain Man'; it is a human being, period. It is squishy things that explode in a vacuum, leaving footprints on their moon.

  • I'm wondering if there's a spell to make lightning flash in the background whenever I make an ominous resolution.

  • Intelligence is the source of technology. If we can use technology to improve intelligence, that closes the loop and potentially creates a positive feedback cycle.

  • [...] intelligent people only have a certain amount of time (measured in subjective time spent thinking about religion) to become atheists. After a certain point, if you're smart, have spent time thinking about and defending your religion, and still haven't escaped the grip of Dark Side Epistemology, the inside of your mind ends up as an Escher painting.

  • I see little hope for democracy as an effective form of government, but I admire the poetry of how it makes its victims complicit in their own destruction.

  • It is triple ultra forbidden to respond to criticism with violence. There are a very few injunctions in the human art of rationality that have no ifs, ands, buts, or escape clauses. This is one of them. Bad argument gets counterargument. Does not get bullet. Never. Never ever never for ever.

  • Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can.

  • "Like that's the only reason anyone would ever buy a first-aid kit? Don't take this the wrong way, Professor McGonagall, but what sort of crazy children are you used to dealing with?" "Gryffindors," spat Professor McGonagall, the word carrying a freight of bitterness and despair that fell like an eternal curse on all youthful heroism and high spirits.

  • Litmus test: If you can't describe Ricardo's Law of Comparative Advantage and explain why people find it counterintuitive, you don't know enough about economics to direct any criticism or praise at "capitalism" because you don't know what other people are referring to when they use that word.

  • Lonely dissent doesn't feel like going to school dressed in black. It feels like going to school wearing a clown suit.

  • Many have stood their ground and faced the darkness when it comes for them. Fewer come for the darkness and force it to face them.

  • Maybe you just can't protect people from certain specialized types of folly with any sane amount of regulation, and the correct response is to give up on the high social costs of inadequately protecting people from themselves under certain circumstances.

  • Moore's Law of Mad Science: Every eighteen months, the minimum IQ necessary to destroy the world drops by one point.

  • Most Muggles lived in a world defined by the limits of what you could do with cars and telephones. Even though Muggle physics explicitly permitted possibilities like molecular nanotechnology or the Penrose process for extracting energy from black holes, most people filed that away in the same section of their brain that stored fairy tales and history books, well away from their personal realities: Long ago and far away, ever so long ago.

  • My experience is that journalists report on the nearest-cliche algorithm, which is extremely uninformative because there aren't many cliches, the truth is often quite distant from any cliche, and the only thing you can infer about the actual event was that this was the closest cliche. It is simply not possible to appreciate the sheer awfulness of mainstream media reporting until someone has actually reported on you. It is so much worse than you think.

  • Not every change is an improvement, but every improvement is a change; you can't do anything BETTER unless you can manage to do it DIFFERENTLY, you've got to let yourself do better than other people!

  • Our coherent extrapolated volition is our wish if we knew more, thought faster, were more the people we wished we were, had grown up farther together; where the extrapolation converges rather than diverges, where our wishes cohere rather than interfere; extrapolated as we wish that extrapolated, interpreted as we wish that interpreted.

  • Part of the rationalist ethos is binding yourself emotionally to an absolutely lawful reductionistic universe - a universe containing no ontologically basic mental things such as souls or magic - and pouring all your hope and all your care into that merely real universe and its possibilities, without disappointment.

  • Physiologically adult humans are not meant to spend an additional 10 years in a school system; their brains map that onto "I have been assigned low tribal status". And so, of course, they plot rebellion - accuse the existing tribal overlords of corruption, plot perhaps to split off their own little tribe in the savanna, not realizing that this is impossible in the Modern World.

  • Rationality is the master lifehack which distinguishes which other lifehacks to use.

  • Reality has been around since long before you showed up. Don't go calling it nasty names like 'bizarre' or 'incredible'. The universe was propagating complex amplitudes through configuration space for ten billion years before life ever emerged on Earth. Quantum physics is not 'weird'. You are weird.

  • Remember, if you succeed in everything you try in life, you're living below your full potential and you should take up more difficult or daring things.

  • Science has heroes, but no gods. The great Names are not our superiors, or even our rivals, they are passed milestones on our road; and the most important milestone is the hero yet to come.

  • Since the rise of Homo sapiens, human beings have been the smartest minds around. But very shortly - on a historical scale, that is - we can expect technology to break the upper bound on intelligence that has held for the last few tens of thousands of years.