Eliezer Yudkowsky

| Eliezer Yudkowsky | |
|---|---|
| Born | September 11, 1979 |
| Nationality | American |
Eliezer Shlomo Yudkowsky (born September 11, 1979[citation needed]) is an American blogger, writer, and advocate for friendly artificial intelligence.
Biography
Yudkowsky has said "My parents were early adopters, and I've been online since a rather young age. You should regard anything from 2001 or earlier as having been written by a different person who also happens to be named 'Eliezer Yudkowsky'. I do not share his opinions."[1]
He is a resident of the San Francisco Bay Area[citation needed]. Largely self-educated,[2]: 38 he co-founded the nonprofit Machine Intelligence Research Institute (formerly the Singularity Institute for Artificial Intelligence) in 2000 and continues to be employed there as a full-time Research Fellow.[3]: 599
Work
Yudkowsky's interests focus on artificial intelligence theory for self-awareness, self-modification, and recursive self-improvement, and on artificial-intelligence architectures and decision theories for stable motivational structures (Friendly AI and Coherent Extrapolated Volition in particular).[3]: 420 His most recent work is on decision theory for problems of self-modification and Newcomblike problems.
Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias[4] sponsored by the Future of Humanity Institute of Oxford University. In February 2009, he helped to found LessWrong,[5] a "community blog devoted to refining the art of human rationality".[2]: 37 LessWrong has been covered in depth in Business Insider.[6] Core concepts from LessWrong have been referenced in columns in The Guardian.[7][8] LessWrong has been mentioned briefly in articles related to the technological singularity and the work of the Machine Intelligence Research Institute (formerly called the Singularity Institute).[9] It has also been mentioned in articles about online monarchists and neo-reactionaries.[10]
Yudkowsky contributed two chapters to Oxford philosopher Nick Bostrom's and Milan Ćirković's edited volume Global Catastrophic Risks.[11]
Fan fiction
Yudkowsky has also written several works[12] of science fiction and other fiction. His wide-ranging Harry Potter fan fiction story Harry Potter and the Methods of Rationality illustrates topics in cognitive science and rationality.[2]: 37 [13][14][15][16][17][18] The New Yorker described it as "recast[ing] the original story in an attempt to explain Harry's wizardry through the scientific method."[19]
References
- ^ http://yudkowsky.net/
- ^ Miller, James. Singularity Rising.
- ^ Kurzweil, Ray (2005). The Singularity Is Near. New York, US: Viking Penguin. ISBN 0-670-03384-7.
- ^ "Overcoming Bias: About". Robin Hanson. Retrieved 2012-02-01.
- ^ "Where did Less Wrong come from? (LessWrong FAQ)". Retrieved September 11, 2014.
- ^ Miller, James (July 28, 2011). "You Can Learn How To Become More Rational". Business Insider. Retrieved March 25, 2014.
- ^ Burkeman, Oliver (July 8, 2011). "This column will change your life: Feel the ugh and do it anyway. Can the psychological flinch mechanism be beaten?". The Guardian. Retrieved March 25, 2014.
- ^ Burkeman, Oliver (March 9, 2012). "This column will change your life: asked a tricky question? Answer an easier one. We all do it, all the time. So how can we get rid of this eccentricity?". The Guardian. Retrieved March 25, 2014.
- ^ Tiku, Natasha (July 25, 2012). "Faith, Hope, and Singularity: Entering the Matrix with New York's Futurist Set. It's the end of the world as we know it, and they feel fine". BetaBeat. Retrieved March 25, 2014.
- ^ Finley, Klint (November 22, 2013). "Geeks for Monarchy: The Rise of the Neoreactionaries". TechCrunch. Retrieved March 25, 2014.
- ^ Bostrom, Nick; Ćirković, Milan M., eds. (2008). Global Catastrophic Risks. Oxford, UK: Oxford University Press. pp. 91–119, 308–345. ISBN 978-0-19-857050-9.
- ^ [1]
- ^ Brin, David (2010-06-21). "CONTRARY BRIN: A secret of college life... plus controversies and science!". Davidbrin.blogspot.com. Retrieved 2012-08-31.
- ^ Snyder, Daniel. "'Harry Potter' and the Key to Immortality". The Atlantic.
- ^ "Rachel Aaron interview (April 2012)". Fantasybookreview.co.uk. 2012-04-02. Retrieved 2012-08-31.
- ^ "Civilian Reader: An Interview with Rachel Aaron". Civilian-reader.blogspot.com. 2011-05-04. Retrieved 2012-08-31.
- ^ Hanson, Robin (2010-10-31). "Hyper-Rational Harry". Overcoming Bias. Retrieved 2012-08-31.
- ^ Swartz, Aaron. "The 2011 Review of Books (Aaron Swartz's Raw Thought)". archive.org. Retrieved 2013-04-10.
- ^ "Harry Potter and the Methods of Rationality". Esr.ibiblio.org. 2010-07-06. Retrieved 2012-08-31.
- ^ "No Death, No Taxes: The libertarian futurism of a Silicon Valley billionaire", p. 54.
Publications
- Creating Friendly AI (2001)
- Levels of Organization in General Intelligence (2002)
- Coherent Extrapolated Volition (2004)
- Timeless Decision Theory (2010)
- Complex Value Systems are Required to Realize Valuable Futures (2011)
- Tiling Agents for Self-Modifying AI, and the Löbian Obstacle (2013)
- A Comparison of Decision Algorithms on Newcomblike Problems (2013)
Further reading
- Our Molecular Future: How Nanotechnology, Robotics, Genetics and Artificial Intelligence Will Transform Our World by Douglas Mulhall, 2002, p. 321.
- The Spike: How Our Lives Are Being Transformed By Rapidly Advancing Technologies by Damien Broderick, 2001, pp. 236, 265–272, 289, 321, 324, 326, 337–339, 345, 353, 370.
External links
- Personal web site
- Less Wrong - "A community blog devoted to refining the art of human rationality", co-founded by Yudkowsky.