*Aarseth, Espen J. 1997. “Introduction.” In Cybertext: Perspectives on Ergodic Literature, 1–23. Baltimore: Johns Hopkins University Press.
Aarseth develops a theory of cybertext, with a focus on “ergodic texts”: works purposefully shaped by the reader’s tangible and visible actions and decisions. He grounds his theory in the observation that cybertexts are labyrinthine and user-dependent, and contain feedback loops. Aarseth addresses the counterargument that many texts can be read as cybertexts, but he does not concede that the distinction derives from cybertexts being necessarily electronic. The inherent performativity involved in reading cybertexts occurs in a network of various parts and participants, rather than in the more conventional reader/author/text model. Further, Aarseth argues, ergodic texts (primarily virtual games and MUDs) are defined by the agency and authority of the human subject, the reader, whose decisions affect the outcome of the text as a whole.
*Balsamo, Anne. 2011. Introduction: “Taking Culture Seriously in the Age of Innovation.” In Designing Culture: The Technological Imagination at Work, 2–25. Durham, NC: Duke University Press.
Balsamo studies the intersections of culture and innovation and acknowledges the unity between the two modes (“technoculture”). She argues that technological innovation should seriously recognize culture as both its inherent context and a space of evolving, emergent possibility, as innovation necessarily alters culture and social knowledge creation practices. Balsamo introduces the concept of the “technological imagination”—the innovative, actualizing mindset. She also details a comprehensive list of truisms about technological innovation, ranging from considering innovation as performative, historically constituted, and multidisciplinary to acknowledging design as a major player in cultural reproduction, social negotiation, and meaning-making. Currently, innovation is firmly bound up with economic incentives, and the profit-driven mentality often obscures the social and cultural consequences and implications of technological advancement. As such, Balsamo calls for more conscientious design, education, and development of technology, and a broader vision of the widespread influence and agency of innovation.
Chamberlin, Barbara, Jesús Trespalacios, and Rachel Gallagher. 2014. “Bridging Research and Game Development: A Learning Games Design Model for Multi-Game Projects.” In Educational Technology Use and Design for Improved Learning Opportunities, edited by Mehdi Khosrow-Pour, 151–71. Hershey, PA: IGI Global. doi:10.4018/978-1-4666-6102-8.ch00.
Chamberlin, Trespalacios, and Gallagher describe the learning games design model for multi-game projects as used to develop math games. The authors present a design approach that integrates content, instructional design, and gaming elements through a complete project team, and they go on to discuss the implications of their approach for other educational games. Although research on game development is available, the authors note that research on educational game development specifically remains relatively new and rare. They situate their study within research affirming the value of adapting gameplay for learning. The authors conclude that development teams may draw on the following critical components when producing new projects: immersion, guiding questions, and team formation.
*Clement, Tanya. 2011. “Knowledge Representation and Digital Scholarly Editions in Theory and Practice.” Journal of the Text Encoding Initiative 1: n.p. doi:10.4000/jtei.203.
Clement reflects on scholarly digital editions as sites of textual performance, wherein the editor lays down and privileges various narrative threads for the reader to pick up and interpret. She underscores this theoretical discussion with examples from her own work with the digital edition In Transition: Selected Poems by the Baroness Elsa von Freytag-Loringhoven, as well as TEI and XML encoding and the Versioning Machine. Clement details how editorial decisions shape the social experience of an edition. By applying John Bryant’s theory of the fluid text to her own editorial practice, she focuses on concepts of various textual performances and meaning-making events. Notably, Clement also explores the idea of the social text network. She concludes that the concept of the network is not new to digital editions; nevertheless, conceiving of a digital edition as a network of various players, temporal spaces, and instantiations promotes fruitful scholarly exploration.
Davidson, Cathy N. 2011. “Why Badges? Why Not?” [blog post]. HASTAC. https://badges-why-not.
In this much-debated HASTAC post, Davidson argues in support of the “Badges for Lifelong Learning” competition and for the use of badges as an alternate credential system in academia, training, and education. She notes that one of the key benefits of badges is that they recognize achievement and contribution over reputation or credentials, and thus offer alternatives to current institutional and educational credentials and evaluation standards. This blog post incited an extensive discussion about badges as a new credential system. In the comments section, Ian Bogost offers a critical view, pointing out issues such as the false dichotomy between badges and the current letter-grade system, the question of standardization of badges, and issues such as the labour metrics that go with badge systems.
*Davidson, Cathy N., and David Theo Goldberg. 2004. “Engaging the Humanities.” Profession: 42–62. doi:10.1632/074069504X26386.
Davidson and Goldberg argue that despite marginalization, humanistic approaches and perspectives remain significant for successful, holistic university environments. Rather than taking a field-specific approach, Davidson and Goldberg propose a problem- or issue-based humanities model that allows for a more interdisciplinary approach. In this way, the comprehensive interpretive tools and complex models of cultural interaction integral to humanities work may resolve varied and continuous issues. The authors suggest that a conceptual and physical shift toward interdisciplinarities within institutions (rather than interdisciplinary institutions, models, or methods) offers a realistic and flexible approach to transforming academia and education.
Drucker, Johanna. 2006. “Graphical Readings and the Visual Aesthetics of Textuality.” TEXT: An Interdisciplinary Annual of Textual Studies 16: 267–76. http://www.jstor.org/stable/30227973.
Drucker discusses design aspects and graphic features that often go unnoticed in print, manuscript, and electronic text formats. She states that the conception of design elements as autonomous entities is problematic, since it ignores the relational forms of expression in design systems. Drucker describes the space of the page as a system, or a quantum field, in which all graphical elements operate together in “a relational, dynamic, dialectically potential ‘espace’ constitutive of, not a pre-condition for, the graphical presentation of a text” (270–71). Defining the categories of graphic, pictorial, and textual space, Drucker performs a reading of a page from Boethius’s Consolatione to demonstrate her proposed reading and interpretive approach to materiality in textual studies.
—. 2011a. “Humanities Approaches to Graphical Display.” Digital Humanities Quarterly 5 (1): n.p. http://www.digitalhumanities.org/dhq/vol/5/1/000091/000091.html.
Drucker proposes a usability and interaction design approach to data visualization in humanities fields. She draws attention to the fact that many digital visualization tools presuppose an observer-independent reality and an unquestionable representation. Counter to traditional humanities thinking, these tools do not acknowledge ambiguity, interpretation, or uncertainty. Drucker urges humanists to recognize all data as capta (which is actively taken rather than given). Furthermore, she advocates for forms of visual expression that display information as constructed by human motivation and perceived according to the interpretation of the viewer or reader. Her argument also opens up space for more 3D representations in data visualization, adding subjective experience to otherwise 2D expressions of time and space. Drucker stresses that such graphical approaches are imperative for humanities tenets to be applied and implemented in digital graphical expressions and interpretations.
*—. 2011b. “Humanities Approaches to Interface Theory.” Culture Machine 12: 1–20. http://www.culturemachine.net/index.php/cm/article/viewArticle/434.
In her humanities theory of interface, Drucker argues that the interface is the predominant site of cognition in digital spaces and requires cognizant, intellectual design. Drucker’s theory is predicated on interface design that considers the constitution of a subject, not the expected activities of a user; on graphical reading practices and frame theory; on constructivist approaches to cognition; and on integrating multiple modes of humanities interpretation. She argues for a humanities approach to interface theory that integrates different forms of reading and analysis in order to allow readers to recognize the relations of the dynamic space between environments and cognitive events. Furthermore, while avoiding a descent into screen essentialism, Drucker insists that the study of electronic reading practices must be focalized through graphical user interfaces, as GUIs constitute reading (and thus the reading subject, or “subject of interface”).
*Guldi, Jo. 2013. “Reinventing the Academic Journal.” In Hacking the Academy: New Approaches to Scholarship and Teaching from Digital Humanities, edited by Daniel J. Cohen and Tom Scheinfeldt, 19–24. Ann Arbor: University of Michigan Press. doi:10.3998/dh.12172434.0001.001.
Guldi calls for a rethinking of scholarly journal practices in light of the emergence and affordances of Web 2.0. She argues that journals can re-establish themselves as forthright facilitators of knowledge creation if they adopt notions of interoperability, curation, multimodal scholarship, open access, networked expertise, and transparency regarding review and timelines. For Guldi, the success of the academic journal depends on incorporating social bookmarking tools and wiki formats. Journals should assume a progressive attitude predicated on sharing and advancing knowledge instead of a limiting view based on exclusivity, profit, and intellectual authority.
*Hayles, N. Katherine. 2008. Electronic Literature: New Horizons for the Literary. Notre Dame, IN: University of Notre Dame Press.
Hayles surveys the field known as electronic literature. She suggests that while electronic literature acknowledges the expectations formed by the print medium, it also builds on and transforms them. In addition, electronic literature is informed by other traditions in contemporary digital culture, including computer games. In this way, electronic literature embodies a hybrid of various forms and traditions that may not usually fit together. Hayles outlines a wide variety of electronic literature examples, and comments that new approaches of analysis are required—in particular, the ability to “think digital” and to recognize the aspects of networked and programmable media that do not exist in print literature. In electronic literature, neither the body nor the machine should be given theoretical priority. Instead, Hayles argues for interconnections that “mediate between human and machine cognition” (x). She sees this intermediation as a more playful form of engaging with the complex mix of possibilities offered by contemporary electronic literature.
Huizinga, Johan. 1949. Homo Ludens: A Study of the Play-element in Culture. London: Routledge and Kegan Paul.
Huizinga offers a thorough study and analysis of forms of play. His definition and characteristics of play are widely cited among game scholars and other theorists, demonstrating the significance of his initiative to acknowledge the value of studying the meaning of play. Huizinga carefully outlines characteristics of play: play is a free activity; play steps outside of “real” life; play is different from ordinary life because it is restrained by locality and duration; play consists of rules and has order; and play includes no material interests or profit. While the definition of games and play remains a much-debated topic, Huizinga’s categories offer an important starting point. One key term in contemporary game studies that has emerged from Homo Ludens is the concept of the magic circle: gameplay is isolated from “real” life through locality and duration—play starts and ends, and it is limited in terms of time and space. All play occurs within the realm of these playgrounds.
Jagoda, Patrick. 2014. “Gaming the Humanities.” Differences 25 (1): 189–215.
Jagoda claims that digital games require new critical approaches, as they have a wide impact on various twenty-first-century fields of study and practice. He reflects on games and gamification as a problematic tendency in the digital humanities, primarily through his project “Game Changer Chicago Design Lab.” The Chicago Design Lab experiments with transmedia narrative, collaborative design, and engaging with digital humanities through new artistic forms. When discussing games and gamification, Jagoda acknowledges the cultural spread of games and the growth in participants and stakeholders, as well as the effect of gaming as a pedagogical technique. He elaborates on a case study of the alternate reality game The Source (also included in this bibliography), which deals with varied social justice issues that include bullying, immigration, and health policy. Overall, Jagoda considers collaboration, social justice design research methods, and intersections between the humanities and the sciences, and concludes with the importance of gaming in the humanities as a method of research and collaboration between humanists, artists, designers, technologists, scientists, and educators.
Jones, Steven E. 2011. “Performing the Social Text: Or, What I Learned from Playing Spore.” Common Knowledge 17 (2): 283–91. doi:10.1215/0961754X-1187977.
Jones examines how texts and video games offer performative social system environments that allow for collaborative modelling toward knowledge development and acquisition. He sees video games as social objects that, similar to texts, attain their meaning only through engagement of the player or reader, where players take on a director/metaeditor role through content creation and content sharing. He describes the environment of the simulation game Spore “as a continually reedited universe of content-objects” (288). Jones goes on to compare game play in Spore to textual analysis, referring to Jerome McGann’s development of Ivanhoe as an example, and considers the ways in which both areas allow for modelling to visualize interpretation and rewriting by players. He calls for a cyberinfrastructure for the humanities that allows for interpretive consequences within a social and a structural space. In this space, players/readers/textual analysts learn through complex, collaborative modelling, and knowledge is acquired through the process of manipulating representations. A textual editing environment based on this premise would remain purposefully unfinished, open, shared, and perpetually capable of manipulation.
—. 2009. “Second Life, Video Games, and the Social Text.” PMLA 124 (1): 264–72.
Jones considers the similarities between the metaverse space in games such as Second Life and the social text and Web 2.0 generally. He explains that in these game spaces tagged objects exist in relation to users (who may also be metatagged through technologies such as RFID chips), thus forming structures in which interactions unite users and objects. Jones argues that these social spaces do not exist apart from the “real world” of meaning making and production. In games such as World of Warcraft, Second Life, Spore, and The Sims, and in certain alternate-reality games (ARGs), collaborative construction is already taking place to create objects and information. Jones concludes that such video game spaces provide humanists with models of networked, metatagged, multi-dimensional environments.
*—. 2013. The Emergence of the Digital Humanities. London and New York: Routledge.
Jones studies the emergence of digital humanities in response to changes in culture. He uses William Gibson’s concept of the eversion of cyberspace (that is, the boundary crossing, flipping, and erasure between cyberspace and non-cyberspace) as a way to describe the cultural change that has led to the current incarnation of digital humanities. Jones frames the emergence of digital humanities as a blending of textual studies and game studies. He provides readings of popular games such as Fez and Spore, as well as a number of indie games, to analyze the relation between digital humanities and game studies. Jones concludes with an overview of practices (such as desktop fabrication) that are relevant to both gaming and digital humanities. For a snapshot of Jones’s stated views on scholarly communication, please see the annotation of the “Publication” chapter from The Emergence of the Digital Humanities elsewhere in this bibliography collection.
*Latour, Bruno. 2009. “A Cautious Prometheus? A Few Steps Towards a Philosophy of Design (with Special Attention to Peter Sloterdijk).” In Networks of Design: Proceedings of the 2008 Annual International Conference of the Design History Society, edited by Fiona Hackney, Jonathan Glynne, and Viv Minto, 2–10. Boca Raton, FL: Universal Publishers.
Latour meditates on the form and function of the term “design,” and proposes a more comprehensive vision for the practice. He suggests that design practitioners focus more fully on drawing together, modelling, or simulating complexity—more inclusive visions that incorporate contradiction and controversy. Latour argues that we are living in an age of design (or redesign) instead of a revolutionary modernist era of breaking with the past and making everything new. Increasingly, design encapsulates various other acts, from arrangement to definition, from projecting to coding. Consequently, the possibilities and instances for design grow exponentially. For Latour, the concept of an age of design predicates an advantageous condition defined by humility and modesty (because it is not foundational or construction-based); a necessary attentiveness to details and skillfulness; a focus on purposeful development (or on the meaning of what is being designed); thoughtful remediation; and an ethical dimension (exemplified through the good design versus bad design binary).
*Losh, Elizabeth. 2012. “Hacktivism and the Humanities: Programming Protest in the Era of the Digital University.” In Debates in the Digital Humanities, edited by Matthew K. Gold, 161–86. Minneapolis: University of Minnesota Press. http://dhdebates.gc.cuny.edu/debates/text/32.
Losh scans the instantiations of, and relations between, hacktivism and the humanities. She contends, along with scholar Alan Liu, that through an increased self-awareness the digital humanities can actually effect real political, social, public, and institutional change. Losh examines the hacking rhetoric and actions of scholar Cathy Davidson (via the HASTAC collaboratory); the Radical Software Group and its director Alexander Galloway; and the Critical Art Ensemble, with a focus on CAE member and professor Ricardo Dominguez. Losh concludes by acknowledging criticism of the digital humanities, and suggests a solution: digital humanists should engage in more public, political collaborations and conversations.
McGann, Jerome. 2001. Radiant Textuality: Literature after the World Wide Web. New York: Palgrave.
McGann’s compilation of essays from 1993 to 2000 shows the development of his work in the digital edition, literary studies and interpretation, and digital scholarly work. He comes to regard critical gaming structures as environments that allow for new approaches to these areas of study. The essays move through McGann’s understanding of the potential of digital technologies as “thinking machines” that can go beyond the material limitations of the book. He describes scholarly work, editions, and translations as performative deformation that manipulates text and supplies a perceptual presentation for the reader. McGann explores the opportunity to leverage the digital ecosystem and enable interplay between multiple fields by using markup and databases to make “N-dimensional space” accessible. The final chapter reveals how the digital game Ivanhoe offers such an environment. In Ivanhoe, a digital role-playing game, a literary work is read and interpreted in a framework that combines primary and secondary texts, scholarship, and the players’ interpretations and commentaries in the same area, thus encouraging new forms of critical reflection. McGann calls this a “quantum field,” where textual objects and reading subjects operate within the same space, and which allows for algorithmic and rhetorical performative activity within rather than outside of the object of attention.
*Ramsay, Stephen, and Geoffrey Rockwell. 2012. “Developing Things: Notes Toward an Epistemology of Building in the Digital Humanities.” In Debates in the Digital Humanities, edited by Matthew Gold, 75–84. Minneapolis: University of Minnesota Press. http://dhdebates.gc.cuny.edu/debates/part/3.
Ramsay and Rockwell take up the “your database/prototype is an argument” conversation (notably championed by Lev Manovich and Willard McCarty). They assert that taking building seriously as scholarly work could productively dismantle or realign the predominantly textual focus of the humanities. Ramsay and Rockwell advocate for installing the user, reader, or subject at the level of building. Through this socially minded conceptual and physical shift, some of the abstractions and black boxing that render digital humanities tools theoretically insufficient could be avoided or amended.
Ryan, Marie-Laure. 1994. “Immersion vs. Interactivity: Virtual Reality and Literary Theory.” SubStance 28 (2): 110–37. doi:10.1353/sub.1999.0015.
Ryan examines the theoretical implications of virtual reality (VR) in relation to literary theory. She notes the similarities between literary devices commonly used to create a sense of reader participation in a fictional world, and the immersion and interaction devices used in VR to effect what Ryan calls “telepresence.” She identifies immersion (the realistic representation) and interaction or interactivity (the ability to not only navigate but modify) as the two key features that create experiences of reality. Ryan considers VR a semiotic phenomenon, and states that the VR effect is the “denial of the role of signs,” thus allowing for an unmediated environment by working toward the appearance of a transparent medium. She concludes that textual environments are limited in their ability to develop experiences of reality in the way VR does, because their tools of interactivity remain signs instead of physical, unmediated interactivity passing through the body.
*Vetch, Paul. 2010. “From Edition to Experience: Feeling the Way towards User-Focused Interfaces.” In Electronic Publishing: Politics and Pragmatics, edited by Gabriel Egan, 171–84. New Technologies in Medieval and Renaissance Studies 2. Tempe, AZ: Iter Inc., in collaboration with the Arizona Center for Medieval and Renaissance Studies.
Vetch explores the nuances of a user-focused approach to scholarly digital projects, arguing that the prevalence of Web 2.0 practices and standards requires scholars to rethink the design of scholarly digital editions. For Vetch, editorial teams need to shift their focus to questions concerning the user. For instance, how will users customize their experience of the digital edition? What new forms of knowledge can develop from these interactions? Moreover, how can rethinking interface design of scholarly digital editions promote more user engagement and interest? Vetch concludes that a user-focused approach is necessary for the success of scholarly publication in a constantly shifting digital world.