
Why The Future Doesn't Need Us

"Why The Future Doesn't Need Us" is an article written by Bill Joy (then Chief Scientist at Sun Microsystems) in the April 2000 issue of Wired magazine. In the article, he argues that "Our most powerful 21st-century technologies—robotics, genetic engineering, and nanotech—are threatening to make humans an endangered species." Joy warns:

The experiences of the atomic scientists clearly show the need to take personal responsibility, the danger that things will move too fast, and the way in which a process can take on a life of its own. We can, as they did, create insurmountable problems in almost no time flat. We must do more thinking up front if we are not to be similarly surprised and shocked by the consequences of our inventions.

While some critics have characterized Joy's stance as obscurantism or neo-Luddism, others share his concerns about the consequences of rapidly expanding technology.[1]

Summary

Joy argues that emerging technologies pose a far greater danger to humanity than any previous technology. In particular, he focuses on genetic engineering, nanotechnology, and robotics. He argues that 20th-century technologies of destruction, such as the nuclear bomb, were limited to large governments because of the complexity and cost of such devices, as well as the difficulty of acquiring the required materials. He uses Frank Herbert's novel The White Plague as a potential nightmare scenario, in which a mad scientist creates a virus capable of wiping out humanity.

Joy also voices concerns about increasing computer power. His worry is that computers will eventually become more intelligent than we are, leading to such dystopian scenarios as robot rebellion. He notably quotes Ted Kaczynski (the Unabomber) on this topic.

Joy also expresses concern that eventually the wealthy will be the only ones with the power to control the future robots that will be built, and that these people could decide to take life into their own hands and control how humanity continues to populate and reproduce.[2] He researched robotics and roboticists further and sought opinions on the topic beyond his own. Rodney Brooks, a specialist in robotics, believes that in the future humans and robots will merge.[3] Joy also cites Hans Moravec's book Robot: Mere Machine to Transcendent Mind, which predicts that robots will gradually take over normal human activities and that, in time, humans will come to accept living that way.[4]

Criticism

In The Singularity Is Near, Ray Kurzweil questioned the regulation of potentially dangerous technology, asking, "Should we tell the millions of people afflicted with cancer and other devastating conditions that we are canceling the development of all bioengineered treatments because there is a risk that these same technologies may someday be used for malevolent purposes?" John Zerzan and Chellis Glendinning, however, hold that modern technologies harm both freedom and human health, including contributing to the problem of cancer, and that the two issues are connected.[5][6][7]

In the AAAS Science and Technology Policy Yearbook 2001 article "A Response to Bill Joy and the Doom-and-Gloom Technofuturists", John Seely Brown and Paul Duguid criticized Joy's predictions as technological tunnel vision that fails to consider social factors.[8]

John McGinnis argues that Joy's proposal for "relinquishment" of technologies that might lead to artificial general intelligence (AGI) would fail because "prohibitions, at least under current technology and current geopolitics, are certain to be ineffective". Verification of AGI-limitation agreements would be difficult due to AGI's dual-use nature and ease of being hidden. Similarly, he feels that Joy's "Hippocratic oath" proposal of voluntary abstention by scientists from harmful research would not be effective either, because scientists might be pressured by governments, tempted by profits, uncertain which technologies would lead to harm down the road, or opposed to Joy's premise in the first place. Rather than relinquishment of AGI, McGinnis argues for a kind of differential technological development in which friendly artificial intelligence is advanced faster than other kinds.[9]

Extropian futurist Max More shares Kurzweil's view that "technological relinquishment" is impractical and ineffective, but adds a larger moral and philosophical component to the argument, contending that the perfection and evolution of humanity is not "losing our humanity" and that voluntarily sought increased capacity in any domain does not represent "a loss" of any kind.[10]

In his article based on an interview with Joy, Zac Goldsmith quotes Joy as saying that some of the dangers posed by emerging technologies are actually greater than he conveyed in his own article; Goldsmith attributes this to the developers of these machines giving them too much power.[11] Goldsmith also states his belief that scientists rarely consider the ways their inventions could go wrong, because acknowledging such risks would lead to less funding.

In her review of Joy's article, Sophie Tysom argues that Joy should not be single-minded about newer technology and should recognize that a "compromise" could be reached between him and those technologies.[12] She agrees that he has a point in worrying about long-term consequences, but does not believe these technologies will try to control us in the future. Joy responded that he was glad people were beginning to respond to his article, because it gave them input on the subject.[13]

Aftermath

After the publication of the article, Bill Joy suggested assessing technologies to gauge their implicit dangers, as well as having scientists refuse to work on technologies that have the potential to cause harm.

In Wired's 15th-anniversary issue in 2008, Lucas Graves reported that genetics, nanotechnology, and robotics had not reached the level that would make Joy's scenario come true.[14]

Conspiracy theorist Alex Jones cited the article during a discussion of the implications of transhumanism with comedians Joe Rogan and Tim Dillon on the October 27, 2020, episode of the Joe Rogan Experience.[15]

References

  1. ^ Khushf, George (2004). "The Ethics of Nanotechnology: Vision and Values for a New Generation of Science and Engineering", Emerging Technologies and Ethical Issues in Engineering, National Academy of Engineering, pp. 31–32. Washington, DC: The National Academies Press. ISBN 030909271X
  2. ^ Joy, Bill (2000-04-01). "Why the Future Doesn't Need Us". Wired. ISSN 1059-1028. Retrieved 2019-11-14.
  3. ^ Messerly, John (2016-02-17). "Critique of Bill Joy's "Why the future doesn't need us"". Reason and Meaning. Retrieved 2019-11-14.
  4. ^ Moravec, Hans (1998). Robot: Mere Machine to Transcendent Mind. Oxford University Press.
  5. ^ Zerzan, John (31 October 2002). "What Ails Us?". Green Anarchy. Federated Anarchy Inc (10). Retrieved 5 March 2012.
  6. ^ "Age of Grief". Primitivism.com. Retrieved 2009-07-08.
  7. ^ Cohen, Mark Nathan (1991). "Health and the Rise of Civilization". Primitivism.com (excerpt); Yale University Press. Retrieved 2009-07-08.
  8. ^ Brown, John Seely; Duguid, Paul (2001). "A Response to Bill Joy and the Doom-and-Gloom Technofuturists" (PDF). Science and Technology Policy Yearbook. American Association for the Advancement of Science.
  9. ^ McGinnis, John O. (Summer 2010). "Accelerating AI". Northwestern University Law Review. 104 (3): 1253–1270. Retrieved 16 July 2014.
  10. ^ More, Max (May 7, 2000). "Embrace, Don't Relinquish the Future". Extropy. Retrieved 22 July 2018.
  11. ^ Goldsmith, Zac (October 2000). "Discomfort and Joy: Bill Joy Interview". Ecologist. 30. Retrieved 20 November 2019.
  12. ^ Tysom, Sophie (January 2001). "Technological Utopias or Dystopias: Is There a Third Way?". 20 (1): 15–16. Retrieved 20 November 2019.
  13. ^ Joy, Bill (15 September 2000). "The dark side of technology". Vital Speeches of the Day. 66 (23): 706–709. ProQuest 221453952.
  14. ^ Graves, Lucas (24 March 2008). "15th Anniversary: Why the Future Still Needs Us a While Longer". Wired. Wired.com. Retrieved 2009-10-22.
  15. ^ Joe Rogan Experience #1555 - Alex Jones & Tim Dillon. YouTube. Archived from the original on 2021-12-11.

Further reading

  • Messerly, John G. "I'm glad the future doesn't need us: a critique of Joy's pessimistic futurism". ACM SIGCAS Computers and Society. Volume 33, Issue 2 (June 2003). ISSN 0095-2737.

External links

  • "Why the future doesn't need us", Wired, April 2000
  • Rants & Raves: "Why the Future Doesn't Need Us"
  • Bill Joy Hopes Reason Prevails
  • The Center for the Study of Technology and Society: Special Focus on Bill Joy's Hi-Tech Warning
  • Bill Joy – Nanotech, and Genetics, and Robots, Oh My!
