I was interviewed recently on one of my favorite podcasts, Eric Molinsky’s Imaginary Worlds. Check it out:
Frankenstein: Annotated for Scientists, Engineers, and Creators of All Kinds
Mary Shelley’s Frankenstein has endured in the popular imagination for two hundred years. Begun as a ghost story by an intellectually and socially precocious eighteen-year-old author during a cold and rainy summer on the shores of Lake Geneva, the dramatic tale of Victor Frankenstein and his stitched-together creature can be read as the ultimate parable of scientific hubris. Victor, “the modern Prometheus,” tried to do what he perhaps should have left to Nature: create life. Although the novel is most often discussed in literary-historical terms—as a seminal example of romanticism or as a groundbreaking early work of science fiction—Mary Shelley was keenly aware of contemporary scientific developments and incorporated them into her story. In our era of synthetic biology, artificial intelligence, robotics, and climate engineering, this edition of Frankenstein will resonate forcefully with readers who have a background or interest in science and engineering, and with anyone intrigued by the fundamental questions of creativity and responsibility.
This edition of Frankenstein pairs the original 1818 version of the manuscript—meticulously line-edited and amended by Charles E. Robinson, one of the world’s preeminent authorities on the text—with annotations and essays by leading scholars exploring the social and ethical aspects of scientific creativity raised by this remarkable story. The result is a unique and accessible edition of one of the most thought-provoking and influential novels ever written.
Elizabeth Bear, Cory Doctorow, Heather E. Douglas, Josephine Johnston, Kate MacCord, Jane Maienschein, Anne K. Mellor, Alfred Nordmann
About the Editors
David Guston is Professor and Founding Director of the School for the Future of Innovation in Society at Arizona State University, where he also serves as Codirector of the Consortium for Science, Policy, and Outcomes.
Ed Finn is Founding Director of the Center for Science and the Imagination at Arizona State University, where he is also Assistant Professor with a joint appointment in the School of Arts, Media, and Engineering and the Department of English.
Jason Scott Robert is Lincoln Chair in Ethics, Associate Professor in the School of Life Sciences, and Director of the Lincoln Center for Applied Ethics at Arizona State University.
“This new, remarkable annotated edition of Frankenstein with its accompanying essays brings the ‘modern Prometheus’ flawlessly into our century in a manner sure to inspire scientists and nonscientists in a conversation that Shelley herself might not have foreseen but surely would have encouraged.”
—Arthur L. Caplan, Drs. William F. and Virginia Connolly Mitty Professor, founding head of the Division of Bioethics at the School of Medicine, New York University
“This wonderful new edition is a happy addition to the critical literature examining the meaning of the tale for our twenty-first-century commitments to heroic science, engineering, and technology.”
—Rachelle D. Hollander, Director, Center for Engineering Ethics and Society, National Academy of Engineering
“The Promethean tale of Frankenstein is a rich source of questions about the price that scientists and the public pay for knowledge. This annotated edition rescues the classic allegory from popular culture’s caricature and presents it with a framework for exploring the questions raised. Among the many questions, perhaps the most important is, when scientists either from amoral arrogance or negligent lack of foresight present a discovery society is not prepared to deal with—nuclear weapons, engineered gene lines, climate modification—what is the scientists’ responsibility going forward? Is it merely to watch in horror as the knowledge is unleashed on society?”
—Rush D. Holt, Chief Executive Officer, American Association for the Advancement of Science; Executive Publisher, Science Family of Journals
Two centuries ago, on a dare to tell the best scary story, eighteen-year-old Mary Shelley imagined an idea that became the basis for Frankenstein. Mary’s original concept became the novel that not only arguably kick-started the genres of science fiction and Gothic horror, but also provided an enduring myth that shapes how we grapple with creativity, science, technology, and their consequences.
Two hundred years later, inspired by that classic dare, we’re challenging you to create new myths for the 21st century along with our partners National Novel Writing Month (NaNoWriMo), Chabot Space and Science Center, and Creative Nonfiction magazine.
Don’t miss the announcement video, featuring yours truly in a role I’m sure to regret:
The recent scandal with Facebook’s Trending Topics news module goes deeper than the revelation that it was humans all along hiding behind the algorithm. It should come as no surprise that Facebook has bias — every organization does. It’s what you do about the bias, how you attempt to disclose it and manage it, that makes a difference. News organizations have been grappling with that question for a long time, creating formal and informal codes of conduct, oversight systems and transparency rules.
Facebook Trending story: The Wizard of Oz algorithm
CNN, May 14, 2016
In reality algorithms have to run on actual servers, using code that sometimes breaks, crunching data that’s frequently unreliable. There is an implementation gap between what we imagine algorithms do in a perfect computational universe and all the compromises, assumptions, and workarounds that need to happen before the code actually works at scale. Computation has done all sorts of incredible things, sometimes appearing both easy and infallible. But it takes hundreds or thousands of servers working in tandem to do something as straightforward as answer a search engine query, and that is where the problems of implementation come in.
Slate, February 26, 2016
What Algorithms Want: Imagination in the Age of Computing, MIT Press, Spring 2017.
The apotheosis of the algorithm is here. In the past several years we’ve hit a turning point, leaving endless debates about artificial intelligence behind in favor of tacitly accepting complex computational systems that tell us where to go, who to date and what to think about (to name just a few examples). The mythos of computation has become almost universal: with every click, every terms of service agreement, we buy into the idea that big data, ubiquitous sensors and various forms of machine learning can model and beneficially regulate all kinds of complex systems, from picking songs to predicting crime. Already these culture machines dominate the stock market, compose music, drive cars, write news articles, and author long mathematical proofs—and their powers of creative authorship are just beginning to take shape. This book proposes that we are missing the algorithmic sea change by focusing only on the crests of waves—we continue studying books, films and games when we should be paying much closer attention to search bars, mobile applications, text prediction systems and other rapidly evolving tools for thinking and authoring. Scholars and cultural critics assume algorithms are all about code. They’re actually about culture.
What Algorithms Want takes on the challenge by reading contemporary algorithms in the context of a long cultural history. The figure of the algorithm, which computer scientists use as convenient shorthand for “a method for solving a problem,” is a mythic concept much older than the invention of the computer, with deep roots in the Enlightenment and the philosophical tradition of rationalism. I excavate this historical narrative through a genealogy of the algorithm as a figure in contemporary culture, tracing its origins in cybernetics, symbolic logic and language philosophy. These foundations inform interpretive readings of a variety of algorithmically entangled cultural works: Apple’s Siri, Netflix’s House of Cards, Ian Bogost’s Cow Clicker and the cryptocurrency Bitcoin, among other objects of analysis. Though seemingly very different from each other, all of these works are algorithmic forms that have been authored by complex computational systems in collaboration with (often unwitting) humans. We work with and think through these culture machines, reinforcing and reinventing the mythos of the algorithm as we go. I develop a method I call “algorithmic reading” to offer original interpretations of these new modes of hybrid authorship, which involve millions of computer processes and human beings thinking, creating and enacting culture together. Algorithmic reading is reading by the lights and shadows of machines: the brilliant illumination of computationally enhanced cognition and the obfuscations of black boxes. As all culture comes increasingly under the sway of the algorithm, I argue that algorithmic reading will be a vital method for the humanities in the 21st century.
Beyond helping us develop a new reading method, these cultural works teach us something important about the nature of algorithms themselves: namely, that algorithms can never be separated from the conditions of their implementation. Not only are algorithms cultural all the way down, they are systems for belief as much as they are rational tools—the latest incarnations of a tradition that encompasses Leibniz’s quasi-spiritual mathesis universalis, medieval religious automatons and contemporary representations of god-like artificial intelligence. Coming to terms with this deep culture structure ultimately reveals the interpreter to be herself complexly enmeshed within algorithmic culture machines, from search engines and word processors to the social media platforms on which she shares her work.
The stakes of this conversation are high as algorithmic thinking reorders entire industries, cultures and creative traditions. Even the engineers behind some of the most successful and ubiquitous algorithmic systems in the world—executives at Google and Netflix, for example—admit that they only understand some of the behaviors these systems exhibit. But their rhetoric is transcendent and emancipatory, equating code, bandwidth and freedom. Our standard assumptions about algorithms are historically and critically shallow, and at best we comprehend them through layers of abstraction and analogy. To understand this sea change, we need to read and experiment with algorithms as they are: cultural machines of oceanic depth and complexity.
We spend an awful lot of time now thinking about what algorithms know about us: the ads we see online, the deep archive of our search history, the automated photo-tagging of our families. We don’t spend as much time asking what algorithms want. In some ways, it’s a ridiculous question, at least for now: Humans create computational systems to complete certain tasks or solve particular problems, so any kind of intention or agency would have to be built in, right?
Slate, December 9, 2015
No work of literature has done more to shape the way people imagine science and its moral consequences than Frankenstein; or The Modern Prometheus, Mary Shelley’s enduring tale of creation and responsibility. The novel’s themes and tropes—such as the complex dynamic between creator and creation—continue to resonate with contemporary audiences. Frankenstein continues to influence the way we confront emerging technologies, conceptualize the process of scientific research, imagine the motivations and ethical struggles of scientists, and weigh the benefits of innovation against its unforeseen pitfalls.
Arizona State University will serve as a network hub for celebration of the bicentennial of the writing and publication of Frankenstein, 2016-2018. The Frankenstein Bicentennial Project will encompass a wide variety of public programs, physical and digital exhibits, research projects, scientific demonstrations, competitions, festivals, art projects, formal and informal learning opportunities, and publications exploring the novel’s colossal scientific, technological, artistic, cultural and social impacts.
I co-chair the editorial board for Tomorrow Project USA, an ongoing collaboration with Intel designed to inspire science and fact-based conversations about the future.
I am the co-editor of Hieroglyph: Stories and Visions for a Better Future. The book is the product of a thriving community of science fiction writers, scientists, engineers and many others collaborating on ambitious, technically grounded visions of the near future.
About the Book
Inspired by New York Times bestselling author Neal Stephenson, an anthology of stories, set in the near future, from some of today’s leading writers, thinkers, and visionaries that reignites the iconic and optimistic visions of the golden age of science fiction.
In his 2011 article “Innovation Starvation,” Neal Stephenson argued that we—the society whose earlier scientists and engineers witnessed the airplane, the automobile, nuclear energy, the computer, and space exploration—must reignite our ambitions to think boldly and do Big Stuff. He also advanced the Hieroglyph Theory, which illuminates the power of science fiction to inspire the inventive imagination: “Good SF supplies a plausible, fully thought-out picture of an alternate reality in which some sort of compelling innovation has taken place.”