Eliezer Yudkowsky
By far the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.
— Eliezer Yudkowsky
Ending up with that gigantic outsized brain must have taken some sort of runaway evolutionary process, something that would push and push without limits. And today's scientists had a pretty good guess at what that runaway evolutionary process had been. Harry had once read a famous book called Chimpanzee Politics. The book had described how an adult chimpanzee named Luit had confronted the aging alpha, Yeroen, with the help of a young, recently matured chimpanzee named Nikkie. Nikkie had not intervened directly in the fights between Luit and Yeroen, but had prevented Yeroen's other supporters in the tribe from coming to his aid, distracting them whenever a confrontation developed between Luit and Yeroen. And in time Luit had won, and become the new alpha, with Nikkie as the second most powerful... though it hadn't taken very long after that for Nikkie to form an alliance with the defeated Yeroen, overthrow Luit, and become the new alpha. It really made you appreciate what millions of years of hominids trying to outwit each other - an evolutionary arms race without limit - had led to in the way of increased mental capacity. 'Cause, y'know, a human would have totally seen that one coming.
— Eliezer Yudkowsky
Every mystery ever solved had been a puzzle from the dawn of the human species right up until someone solved it.
— Eliezer Yudkowsky
For an instant Harry imagined... Just for an instant, before his imagination blew a fuse and called an emergency shutdown and told him never to imagine that again.
— Eliezer Yudkowsky
He wanted to write someone and demand a refund on his dark side which clearly ought to have irresistible magical power but had turned out to be defective.
— Eliezer Yudkowsky
I am a full-time Research Fellow at the Machine Intelligence Research Institute, a small 501(c)(3) public charity supported primarily by individual donations.
— Eliezer Yudkowsky
I don't want to rule the universe. I just think it could be more sensibly organized.
— Eliezer Yudkowsky
If you once tell a lie, the truth is ever after your enemy.
— Eliezer Yudkowsky
I keep trying to explain to people that the archetype of intelligence is not Dustin Hoffman in 'The Rain Man'; it is a human being, period. It is squishy things that explode in a vacuum, leaving footprints on their moon.
— Eliezer Yudkowsky