Eliezer Yudkowsky famous quotes
Last updated: Sep 5, 2024
-
Many have stood their ground and faced the darkness when it comes for them. Fewer come for the darkness and force it to face them.
-- Eliezer Yudkowsky -
You are personally responsible for becoming more ethical than the society you grew up in.
-- Eliezer Yudkowsky -
Trying and getting hurt can't possibly be worse for you than being... stuck.
-- Eliezer Yudkowsky -
The police officer who puts their life on the line with no superpowers, no X-Ray vision, no super-strength, no ability to fly, and above all no invulnerability to bullets, reveals far greater virtue than Superman—who is only a mere superhero.
-- Eliezer Yudkowsky -
Not every change is an improvement but every improvement is a change; you can't do anything BETTER unless you can manage to do it DIFFERENTLY; you've got to let yourself do better than other people!
-- Eliezer Yudkowsky -
If dragons were common, and you could look at one in the zoo - but zebras were a rare legendary creature that had finally been decided to be mythical - then there's a certain sort of person who would ignore dragons, who would never bother to look at dragons, and chase after rumors of zebras. The grass is always greener on the other side of reality. Which is rather setting ourselves up for eternal disappointment, eh? If we cannot take joy in the merely real, our lives shall be empty indeed.
-- Eliezer Yudkowsky -
Through rationality we shall become awesome, and invent and test systematic methods for making people awesome, and plot to optimize everything in sight, and the more fun we have the more people will want to join us.
-- Eliezer Yudkowsky -
Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can.
-- Eliezer Yudkowsky -
If you want to maximize your expected utility, you try to save the world and the future of intergalactic civilization instead of donating your money to the society for curing rare diseases and cute puppies.
-- Eliezer Yudkowsky -
A burning itch to know is higher than a solemn vow to pursue truth. To feel the burning itch of curiosity requires both that you be ignorant, and that you desire to relinquish your ignorance.
-- Eliezer Yudkowsky -
It is triple ultra forbidden to respond to criticism with violence. There are a very few injunctions in the human art of rationality that have no ifs, ands, buts, or escape clauses. This is one of them. Bad argument gets counterargument. Does not get bullet. Never. Never ever never for ever.
-- Eliezer Yudkowsky -
You cannot rationalize what is not rational to begin with - as if lying were called truthization. There is no way to obtain more truth for a proposition by bribery, flattery, or the most passionate argument - you can make more people believe the proposition, but you cannot make it more true.
-- Eliezer Yudkowsky -
If people got hit on the head by a baseball bat every week, pretty soon they would invent reasons why getting hit on the head with a baseball bat was a good thing.
-- Eliezer Yudkowsky -
Most Muggles lived in a world defined by the limits of what you could do with cars and telephones. Even though Muggle physics explicitly permitted possibilities like molecular nanotechnology or the Penrose process for extracting energy from black holes, most people filed that away in the same section of their brain that stored fairy tales and history books, well away from their personal realities: Long ago and far away, ever so long ago.
-- Eliezer Yudkowsky -
Have I ever remarked on how completely ridiculous it is to ask high school students to decide what they want to do with the rest of their lives and give them nearly no support in doing so? Support like, say, spending a day apiece watching twenty different jobs and then another week at their top three choices, with salary charts and projections and probabilities of graduating that subject given their test scores? The more so considering this is a central allocation question for the entire economy?
-- Eliezer Yudkowsky -
Moore's Law of Mad Science: Every eighteen months, the minimum IQ necessary to destroy the world drops by one point.
-- Eliezer Yudkowsky -
I ask the fundamental question of rationality: Why do you believe what you believe? What do you think you know and how do you think you know it?
-- Eliezer Yudkowsky -
Do not flinch from experiences that might destroy your beliefs. The thought you cannot think controls you more than thoughts you speak aloud. Submit yourself to ordeals and test yourself in fire. Relinquish the emotion which rests upon a mistaken belief, and seek to feel fully that emotion which fits the facts.
-- Eliezer Yudkowsky -
If you've been cryocrastinating, putting off signing up for cryonics "until later", don't think that you've "gotten away with it so far". Many worlds, remember? There are branched versions of you that are dying of cancer, and not signed up for cryonics, and it's too late for them to get life insurance.
-- Eliezer Yudkowsky -
Science has heroes, but no gods. The great Names are not our superiors, or even our rivals, they are passed milestones on our road; and the most important milestone is the hero yet to come.
-- Eliezer Yudkowsky -
There are no surprising facts, only models that are surprised by facts; and if a model is surprised by the facts, it is no credit to that model.
-- Eliezer Yudkowsky -
Crocker's Rules didn't give you the right to say anything offensive, but other people could say potentially offensive things to you, and it was your responsibility not to be offended. This was surprisingly hard to explain to people; many people would read the careful explanation and hear, "Crocker's Rules mean you can say offensive things to other people."
-- Eliezer Yudkowsky -
Lonely dissent doesn't feel like going to school dressed in black. It feels like going to school wearing a clown suit.
-- Eliezer Yudkowsky -
The human brain cannot release enough neurotransmitters to feel emotion a thousand times as strong as the grief of one funeral. A prospective risk going from 10,000,000 deaths to 100,000,000 deaths does not multiply by ten the strength of our determination to stop it. It adds one more zero on paper for our eyes to glaze over.
-- Eliezer Yudkowsky -
Litmus test: If you can't describe Ricardo's Law of Comparative Advantage and explain why people find it counterintuitive, you don't know enough about economics to direct any criticism or praise at "capitalism" because you don't know what other people are referring to when they use that word.
-- Eliezer Yudkowsky -
If you want to build a recursively self-improving AI, have it go through a billion sequential self-modifications, become vastly smarter than you, and not die, you've got to work to a pretty precise standard.
-- Eliezer Yudkowsky -
The people I know who seem to make unusual efforts at rationality, are unusually honest, or, failing that, at least have unusually bad social skills.
-- Eliezer Yudkowsky -
We underestimate the distance between ourselves and others. Not just inferential distance, but distances of temperament and ability, distances of situation and resource, distances of unspoken knowledge and unnoticed skills and luck, distances of interior landscape.
-- Eliezer Yudkowsky -
If I'm teaching deep things, then I view it as important to make people feel like they're learning deep things, because otherwise, they will still have a hole in their mind for "deep truths" that needs filling, and they will go off and fill their heads with complete nonsense that has been written in a more satisfying style.
-- Eliezer Yudkowsky -
By and large, the answer to the question "How do large institutions survive?" is "They don't!" The vast majority of large modern-day institutions, some of them extremely vital to the functioning of our complex civilization, simply fail to exist in the first place.
-- Eliezer Yudkowsky -
My experience is that journalists report on the nearest-cliche algorithm, which is extremely uninformative because there aren't many cliches, the truth is often quite distant from any cliche, and the only thing you can infer about the actual event was that this was the closest cliche.... It is simply not possible to appreciate the sheer awfulness of mainstream media reporting until someone has actually reported on you. It is so much worse than you think.
-- Eliezer Yudkowsky -
Part of the rationalist ethos is binding yourself emotionally to an absolutely lawful reductionistic universe, a universe containing no ontologically basic mental things such as souls or magic, and pouring all your hope and all your care into that merely real universe and its possibilities, without disappointment.
-- Eliezer Yudkowsky -
Physiologically adult humans are not meant to spend an additional 10 years in a school system; their brains map that onto "I have been assigned low tribal status". And so, of course, they plot rebellion, accuse the existing tribal overlords of corruption, plot perhaps to split off their own little tribe in the savanna, not realizing that this is impossible in the Modern World.
-- Eliezer Yudkowsky -
If cryonics were a scam it would have far better marketing and be far more popular.
-- Eliezer Yudkowsky -
"Like that's the only reason anyone would ever buy a first-aid kit? Don't take this the wrong way, Professor McGonagall, but what sort of crazy children are you used to dealing with?" "Gryffindors," spat Professor McGonagall, the word carrying a freight of bitterness and despair that fell like an eternal curse on all youthful heroism and high spirits.
-- Eliezer Yudkowsky -
You will find ambiguity a great ally on your road to power. Give a sign of Slytherin on one day, and contradict it with a sign of Gryffindor the next; and the Slytherins will be enabled to believe what they wish, while the Gryffindors argue themselves into supporting you as well. So long as there is uncertainty, people can believe whatever seems to be to their own advantage. And so long as you appear strong, so long as you appear to be winning, their instincts will tell them that their advantage lies with you. Walk always in the shadow, and light and darkness both will follow.
-- Eliezer Yudkowsky -
I see little hope for democracy as an effective form of government, but I admire the poetry of how it makes its victims complicit in their own destruction.
-- Eliezer Yudkowsky -
World domination is such an ugly phrase. I prefer to call it world optimisation.
-- Eliezer Yudkowsky -
To worship a sacred mystery was just to worship your own ignorance.
-- Eliezer Yudkowsky -
You couldn't change history. But you could get it right to start with. Do something differently the FIRST time around. This whole business with seeking Slytherin's secrets... seemed an awful lot like the sort of thing where, years later, you would look back and say, 'And THAT was where it all started to go wrong.' And he would wish desperately for the ability to fall back through time and make a different choice. Wish granted. Now what?
-- Eliezer Yudkowsky -
I'm lazy! I hate work! Hate hard work in all its forms! Clever shortcuts, that's all I'm about!
-- Eliezer Yudkowsky -
I don't want to rule the universe. I just think it could be more sensibly organised.
-- Eliezer Yudkowsky -
Okay, so either (a) I just teleported somewhere else entirely (b) they can fold space like no one's business or (c) they are simply ignoring all the rules.
-- Eliezer Yudkowsky -
To confess your fallibility and then do nothing about it is not humble; it is boasting of your modesty.
-- Eliezer Yudkowsky -
Existential depression has always annoyed me; it is one of the world's most pointless forms of suffering.
-- Eliezer Yudkowsky -
...there's something in science like the shine of the Patronus Charm, driving back all sorts of darkness and madness...
-- Eliezer Yudkowsky -
When you are older, you will learn that the first and foremost thing which any ordinary person does is nothing.
-- Eliezer Yudkowsky -
Why does any kind of cynicism appeal to people? Because it seems like a mark of maturity, of sophistication, like you’ve seen everything and know better. Or because putting something down feels like pushing yourself up.
-- Eliezer Yudkowsky -
"Boys," said Hermione Granger, "should not be allowed to love girls without asking them first! This is true in a number of ways and especially when it comes to gluing people to the ceiling!"
-- Eliezer Yudkowsky -
Your strength as a rationalist is your ability to be more confused by fiction than by reality. If you are equally good at explaining any outcome, you have zero knowledge.
-- Eliezer Yudkowsky -
There is no justice in the laws of nature, no term for fairness in the equations of motion. The Universe is neither evil, nor good, it simply does not care. The stars don't care, or the Sun, or the sky. But they don't have to! WE care! There IS light in the world, and it is US!
-- Eliezer Yudkowsky -
And someday, when the descendants of humanity have spread from star to star, they won't tell the children about the history of Ancient Earth until they're old enough to bear it; and when they learn, they'll weep to hear that such a thing as Death had ever once existed.
-- Eliezer Yudkowsky -
What people really believe doesn't feel like a BELIEF, it feels like the way the world IS.
-- Eliezer Yudkowsky -
The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else.
-- Eliezer Yudkowsky -
The strength of a theory is not what it allows, but what it prohibits; if you can invent an equally persuasive explanation for any outcome, you have zero knowledge.
-- Eliezer Yudkowsky -
- With respect, Professor McGonagall, I'm not quite sure you understand what I'm trying to do here. - With respect, Mr. Potter, I'm quite sure I don't. Unless - this is a guess, mind - you're trying to take over the world? - No! I mean yes - well, NO! - I think I should perhaps be alarmed that you have trouble answering the question.
-- Eliezer Yudkowsky -
There were mysterious questions, but a mysterious answer was a contradiction in terms.
-- Eliezer Yudkowsky -
I'm wondering if there's a spell to make lightning flash in the background whenever I make an ominous resolution.
-- Eliezer Yudkowsky -
- Every time someone cries out in prayer and I can't answer, I feel guilty about not being God. - That doesn't sound good. - I understand that I have a problem, and I know what I need to do to solve it, all right? I'm working on it. Of course, Harry hadn't said what the solution was. The solution, obviously, was to hurry up and become God.
-- Eliezer Yudkowsky -
Every mystery ever solved had been a puzzle from the dawn of the human species right up until someone solved it.
-- Eliezer Yudkowsky -
There is light in the world, and it is us!
-- Eliezer Yudkowsky -
That which the truth nourishes should thrive.
-- Eliezer Yudkowsky -
If the iron is hot, I desire to believe it is hot, and if it is cool, I desire to believe it is cool.
-- Eliezer Yudkowsky -
Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it's probably going to stay there.
-- Eliezer Yudkowsky -
I am tempted to say that a doctorate in AI would be negatively useful, but I am not one to hold someone’s reckless youth against them – just because you acquired a doctorate in AI doesn’t mean you should be permanently disqualified.
-- Eliezer Yudkowsky -
Rationality is the master lifehack which distinguishes which other lifehacks to use.
-- Eliezer Yudkowsky -
Maybe you just can't protect people from certain specialized types of folly with any sane amount of regulation, and the correct response is to give up on the high social costs of inadequately protecting people from themselves under certain circumstances.
-- Eliezer Yudkowsky -
There's a popular concept of 'intelligence' as book smarts, like calculus or chess, as opposed to, say, social skills. So people say that 'it takes more than intelligence to succeed in human society.' But social skills reside in the brain, not the kidneys.
-- Eliezer Yudkowsky -
If you don’t sign up your kids for cryonics then you are a lousy parent.
-- Eliezer Yudkowsky -
After all, if you had the complete decision process, you could run it as an AI, and I'd be coding it up right now.
-- Eliezer Yudkowsky -
If you are equally good at explaining any outcome, you have zero knowledge.
-- Eliezer Yudkowsky -
I am a full-time Research Fellow at the Machine Intelligence Research Institute, a small 501(c)(3) public charity supported primarily by individual donations.
-- Eliezer Yudkowsky -
We tend to see individual differences instead of human universals. Thus, when someone says the word 'intelligence,' we think of Einstein instead of humans.
-- Eliezer Yudkowsky -
Since the rise of Homo sapiens, human beings have been the smartest minds around. But very shortly - on a historical scale, that is - we can expect technology to break the upper bound on intelligence that has held for the last few tens of thousands of years.
-- Eliezer Yudkowsky -
The purpose of a moral philosophy is not to look delightfully strange and counterintuitive or to provide employment to bioethicists. The purpose is to guide our choices toward life, health, beauty, happiness, fun, laughter, challenge, and learning.
-- Eliezer Yudkowsky -
By far the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.
-- Eliezer Yudkowsky -
I keep trying to explain to people that the archetype of intelligence is not Dustin Hoffman in 'Rain Man'; it is a human being, period. It is squishy things that explode in a vacuum, leaving footprints on their moon.
-- Eliezer Yudkowsky -
Intelligence is the source of technology. If we can use technology to improve intelligence, that closes the loop and potentially creates a positive feedback cycle.
-- Eliezer Yudkowsky -
Singularitarians are the munchkins of the real world. We just ignore all the usual dungeons and head straight for the cycle of infinite wish spells.
-- Eliezer Yudkowsky -
The purest case of an intelligence explosion would be an Artificial Intelligence rewriting its own source code. The key idea is that if you can improve intelligence even a little, the process accelerates. It's a tipping point. Like trying to balance a pen on one end - as soon as it tilts even a little, it quickly falls the rest of the way.
-- Eliezer Yudkowsky -
He'd met other prodigies in mathematical competitions. In fact he'd been thoroughly trounced by competitors who probably spent literally all day practising maths problems and who'd never read a science-fiction book and who would burn out completely before puberty and never amount to anything in their future lives because they'd just practised known techniques instead of learning to think creatively.
-- Eliezer Yudkowsky -
This is one of the primary mechanisms whereby, if a fool says the sun is shining, we do not correctly discard this as irrelevant nonevidence, but rather find ourselves impelled to say that it must be dark outside.
-- Eliezer Yudkowsky -
[...] intelligent people only have a certain amount of time (measured in subjective time spent thinking about religion) to become atheists. After a certain point, if you're smart, have spent time thinking about and defending your religion, and still haven't escaped the grip of Dark Side Epistemology, the inside of your mind ends up as an Escher painting.
-- Eliezer Yudkowsky -
When you think of intelligence, don't think of a college professor; think of human beings as opposed to chimpanzees. If you don't have human intelligence, you're not even in the game.
-- Eliezer Yudkowsky