This is the pre-event announcement for an online lecture by Miles Brundage, a PhD student in Human and Social Dimensions of Science and Technology at Arizona State University.
Here is the full talk info:
A Social Science Perspective on Global Catastrophic Risk Debates: The Case of Artificial General Intelligence
Thursday 25 July 2013, 17:00 GMT (10:00 Los Angeles, 13:00 New York, 18:00 London)
To be held online via Skype or equivalent. RSVP required by email to Seth Baum (seth [at] gcrinstitute.org). Space is limited.
Abstract: Researchers at institutions such as the Machine Intelligence Research Institute (MIRI) and the Future of Humanity Institute (FHI) have suggested that artificial general intelligence (AGI) may pose catastrophic risks to humanity. This talk will contextualize such concerns using theories and frameworks drawn from science and technology studies (STS) and other social science fields. In particular, I will seek to answer the question: what conceptual and practical tools are available to a non-technical scholar, citizen, or policy-maker seeking to address global catastrophic risks from particular technologies? Using AGI as a case study, I will illustrate concepts that could inform future work on global catastrophic risks, including boundary work (the rhetorical techniques scientists use to demarcate their own work from the speculations of futurists and journalists, thereby cementing their credibility while distancing themselves from potential catastrophic consequences of their disciplines), visioneering (the articulation of, and attempt to bring about, speculative technological futures, such as those of Eric Drexler in the case of nanotechnology and Ray Kurzweil in the case of AGI), plausibility (a useful framework for assessing future outcomes from technology, as opposed to probability), and responsible innovation (a rapidly growing field of inquiry assessing the various ways in which the public and policy-makers can positively influence the social impacts of scientific and technological research).