Building Cages Around the Imaginary
Brianna Dym / The Roux Institute at Northeastern University
Before I was a computer scientist, I taught English and wrote novels. When most people learn this, they respond with surprise. What a change in careers! To go from the world of literature and creativity to the industrialism of computers. From my perspective, though, the two disciplines share a great deal. Computer science is a science of the artificial. We fashion programs through a multi-layered process of abstraction, writing in languages that are interpreted by other layers of code and then reassembled into something that does not necessarily exist in a concrete sense, but that the people who observe it nevertheless treat as having some presence or sway in reality. Broadly speaking, I feel as though the craft of writing does much the same.
It is with this view of computer science—that it is a creative pursuit akin to creative writing—that I argue for a broader shift in the cultural perspective toward new and emerging technologies and what they ought to be capable of doing.
Currently (in the year 2023), we are experiencing a massive spike in both interest and worry around generative artificial intelligence—AI that uses large statistical models to generate something resembling a requested output (Epstein & Hertzmann, 2023). This kind of technology is the future of all technologies, or at least that is what it feels like in the middle of this constant fervor.
New technologies are always on the horizon, constantly changing both what they promise to be and what they can actually accomplish. Before generative AI existed, versions of it were envisioned in speculative narratives about the future. A particular subgenre of science fiction, design fiction, offers researchers and designers in computer science a means of imagining what is possible with emerging technologies (Blythe, 2014). Rather than speculating on the futures of these emerging technologies myself, I look back through nearly 50 years of essays written by Ursula K. Le Guin about fantasy, science fiction, and humanity’s capacity to imagine alternate possible worlds.
Originally published in 1969, The Left Hand of Darkness is a science fiction novel written by Le Guin that speculates on the nature of human beings if gender as a social condition were removed:
“Whatever was left would be, presumably, simply human. It would define the area that is shared by men and women alike…[A]s an experiment, it was messy. All results were uncertain; a repetition of the experiment by someone else, or by myself seven years later, would [certainly] give different results.” (1993, p. 160)
In two separate essays, Le Guin provided commentary on the story, her world-building, and how the novel connected with broader social issues. For those unfamiliar with it, The Left Hand of Darkness explores a fictional world where gender is fluid. That is, its inhabitants physically alter their sex organs and other traits traditionally tied to sexual dimorphism in humans.
Despite so much of the book’s focus on the fluidity of gender and sexuality, the characters are heterosexual by default. In her later essays, Le Guin remarked that there was no reason for heterosexuality to exist as a norm in the world she imagined, and that she did not entirely know why she had made the assumption:
“I quite unnecessarily locked the Gethenians into heterosexuality. It is a naively pragmatic view of sex that insists that sexual partners must be of opposite sex! In any [of the characters’ homes] homosexual practice would, of course, be possible and acceptable and welcomed—but I never thought to explore this option; and the omission, alas, implies that sexuality is heterosexuality. I regret this very much.” (1993, p. 169)
Le Guin is not the only person to bring her own biases into a speculative work, nor will she be the last. However, she also draws attention to a theme that runs across speculative fiction: in a world of wondrous imagination where anything could be possible, people gravitate toward the familiar. In her 1974 essay “Why Are Americans Afraid of Dragons?” Le Guin critiqued the cultural phenomenon of reinventing realism within fantasy and science fiction, writing, “Fake realism is the escapist literature of our time” (as published in The Language of the Night, 1993, p. 37).
I call upon Le Guin’s work to draw attention to an important possibility not only in the realm of science fiction, but also in that of design. Our capacity to imagine the role of new and emerging technologies—to define what those technologies ought to be and how they should be used—is shaped by our own cultural experiences in the world. In a world where artificial intelligence is often framed as either a scheming villain or a helpful friend (Nader et al., 2022), how do we begin to make sense of the tools actually being developed and their potential trajectories?
A person’s imagination and the assumptions that they bring into the imaginary will shape their speculations on the possible futures of technology. To Le Guin’s longstanding point: what are the assumed norms we bring with us into our speculative work? And how do we divest ourselves of those assumptions? I probe these questions from the perspective of someone educating people in the process of designing new technologies, but they remain salient to all creative work that delves into the imaginary.
To step into the role of a designer is to ask oneself to speculate on how technology development should progress. As designers, we have built cages around our imaginations.
Because computer science is a science of the artificial, nearly everything in it was first generated as a construct within the human mind (Simon, 1996). A program functions the way it does in part because a person argued that one mental architecture was better than the alternatives. In the realm of video games, large developers push ever harder for hyperrealistic graphics. Generative AI grows ever more accurate at mimicking human actions.
Just as Le Guin argued of speculative fiction, the cutting edge of our technical frontier cannot escape the bounds of faking a certain realism. And so we need to dismantle the cages we’ve constructed around our imagination.
To do so, I put forth what many other computer scientists have already argued: that we broaden participation in computing. For many years, researchers have made this case, not only to increase the number of skilled workers for an in-demand career, but also to include diverse perspectives in developing technologies for the betterment of all society (Peckham et al., 2007). As an educational discipline, computer science still struggles to integrate different perspectives and different ways of knowing. There is a stereotype of computer science as a discipline of the unimaginative, of pure logic, single-minded obsessiveness, and uniformity (Lewis et al., 2016, 2019).
It is unfortunate that these stereotypes plague the discipline when, much like the imagined worlds that Le Guin created, those norms exist only because we assume them to be there. Dismantling the cages around our imagination is merely a matter of shifting perspective and assuming different values and goals for the future of technology.
Image Credits:
- A Remixed Image by Brianna Dym
- Boat on the water, fantasy, great beyond. Image by Johannes Plenio on Unsplash.
- NASA Rocket Launch. Image by NASA on Unsplash.
- NASA porthole view. Image by NASA on Unsplash.
References:
Blythe, M. (2014, April). Research through design fiction: Narrative in real and imaginary abstracts. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 703-712).
Epstein, Z., & Hertzmann, A. (2023). Art and the science of generative AI. Science, 380, 1110-1111. DOI:10.1126/science.adh4451
Le Guin, U. K. (1969). The left hand of darkness. Ace Books.
Le Guin, U. K. (1993). The language of the night: essays on fantasy and science fiction. HarperCollins Publishers.
Lewis, C. M., Anderson, R. E., & Yasuhara, K. (2016, August). “I Don’t Code All Day”: Fitting in Computer Science When the Stereotypes Don’t Fit. In Proceedings of the ACM Conference on International Computing Education Research (ICER) (pp. 23-32).
Lewis, C., Bruno, P., Raygoza, J., & Wang, J. (2019, July). Alignment of goals and perceptions of computing predicts students’ sense of belonging in computing. In Proceedings of the ACM Conference on International Computing Education Research (ICER) (pp. 11-19).
Nader, K., Toprac, P., Scott, S., & Baker, S. (2022). Public understanding of artificial intelligence through entertainment media. AI & Society, 1-14.
Peckham, J., Harlow, L. L., Stuart, D. A., Silver, B., Mederer, H., & Stephenson, P. D. (2007). Broadening participation in computing: issues and challenges. ACM SIGCSE Bulletin, 39(3), 9-13.
Simon, H. A. (1996). The Sciences of the Artificial. MIT Press.