First published in Fibreculture Issue 10: New Media, Networks and New Pedagogies
Colleges and universities in the United States currently face a daunting challenge: how can we transform longstanding definitions of literacy to account not only for the vast social shifts wrought by the centrality of networked, visual and aural media, but for epistemological shifts as well? Calls for reconsidering literacy in light of digital tools are multiple and varied in approach and orientation. They range from then President Bill Clinton's 1996 declaration that every grade school student deserves access to a computer, to the articulation of multimodal literacy outlined by Gunther Kress in his seminal book Literacy in the New Media Age, to the taxonomy of skills characteristic of a new generation of students who currently inhabit a digital and participatory culture listed by Henry Jenkins in his 2006 paper for the MacArthur Foundation, “Confronting the Challenges of Participatory Culture: Media Education for the 21st Century.” There, Jenkins highlights the potential benefits of forms of participatory culture, including “opportunities for peer-to-peer learning, a changed attitude toward intellectual property, the diversification of cultural expression, the development of skills valued in the modern workplace, and a more empowered conception of citizenship.”
Scholarly work dedicated to expanding the definition of literacy and practices of reading and writing to include visual rhetoric and multimedia tools forms a rich, longstanding tradition, generally emerging from college and university programs in composition and rhetoric. These programs advocate expanded composition practices, working toward an understanding of the elements of visual rhetoric and the use of multimedia reading and writing practices that unite text, images and design. As Mary Hocks explains in her essay “Understanding Visual Rhetoric,” the key here is in avoiding an easy bifurcation between the visual and the written, acknowledging instead that “all writing is hybrid - it is at once verbal, spatial, and visual.” The incorporation of visual elements, then, is not a radical shift; it is instead part of the always dialogic relationship among divergent aspects of communication.
Further, the resounding proclamations for redefining literacy bring with them attention to the term “literacy” itself. As Anne Wysocki and Johndan Johnson-Eilola argue in their essay “Blinded by the Letter: Why Are We Using Literacy as a Metaphor for Everything Else?” literacy is a term that too easily slips off our tongues. Borrowing Stuart Hall's use of the term “articulation” (which in turn is borrowed from Antonio Gramsci), they argue for understanding literacy “not as a monolithic term but as a cloud of sometimes contradictory nexus points among different positions,” adding that “literacy here shifts away from receiving a self to the necessary act of continual remaking, of understanding the ‘unity' of an object (social, political, intellectual) and simultaneously seeing that that unity is contingent, supported by the efforts of the writer/reader and the cultures in which they live.” Many scholars have contributed significant work to this larger project of rethinking literacy alongside expanded acts of reading and writing, including Gail E. Hawisher, Nancy Kaplan, Stuart Moulthrop, Cynthia Selfe, and Scott deWitt, all of whom have made convincing arguments regarding the need to include visual rhetoric in any consideration of reading and writing practices in the 21st century, with attention to a redefinition that acknowledges this inclusion.
Many scholars have also studied the role of networks in educational settings, and while much of this work focuses on distance learning and forms of distributed teaching, with the preponderance of Web 2.0 tools, especially those designed specifically for students, critical analyses of Net-based opportunities are on the rise. Yochai Benkler's The Wealth of Networks: How Social Production Transforms Markets and Freedom outlines the shift from a mass-mediated public sphere to a networked public sphere, and suggests implications for education and students. Further, the work of Trebor Scholz interrogates the ways in which people actually use online tools and collaborative environments, and his Distributed Learning Project, which is a site for sharing resources for the teaching of and learning about new media art, offers a productive and real embodiment of distributed scholarship. And Cheryl Ball's 2007 examination of Michael Wesch's Web-based essay, “The Machine Is Us/ing Us,” and the potential of Web-based forms of composition points to future forms of net-based writing practices that unite text, graphics, images and sound in a kind of desktop-oriented compositional mode.
Along with these redefinitions of literacy and calls for its expansion, we are also witnessing a groundswell of interest in understanding how to teach, in light not only of changes in the abilities and skills of students, but with respect to vastly different social, cultural and economic contexts that both characterize contemporary educational settings and will greet students when they leave the university. These shifts range from the decade-old critique of the increasingly corporate nature of the university and its role in sustaining a notion of the nation-state to the challenges of fully understanding what Brian Goldfarb dubs a “visual pedagogy.” Goldfarb writes in his introduction, “Subjects ‘come to know' in institutional settings that rely increasingly on media forms to produce knowledge,” continuing: “As the twentieth century progressed, media became an integral part of any discussion about the ‘how' questions in education. How do we teach? Certainly with media. How do media function? Certainly as modes of pedagogy. Throughout the intensified globalization of the second half of the twentieth century, media technology made a firm union with the science of pedagogy broadly applied, and this union has come to symbolize technological life in the industrialized nations of late capitalism.” Goldfarb goes on to analyze the ways in which we might expand our notion of pedagogy to include institutions beyond schools - museums and advocacy groups, for example - and in the process complicates the often simplistic thinking behind deployments of media in the classroom.
Yet another challenge in considering literacy and pedagogical practices in the 21st century comes in offering instruction to a generation of students with fundamentally different abilities and needs than generations prior. Marc Prensky responds to the claims that education in the U.S. has declined dramatically by asking critics to remember the fundamental cause of that decline. He writes, “Today's students are no longer the people our educational system was designed to teach.” He adds, “Today's students have not just changed incrementally from those of the past, nor simply changed their slang, clothes, body adornments, or styles, as has happened between generations previously. A really big discontinuity has taken place. One might even call it a ‘singularity' - an event which changes things so fundamentally that there is absolutely no going back. This so-called ‘singularity' is the arrival and rapid dissemination of digital technology in the last decades of the 20th century.”
Rethinking teaching practices to accommodate the skills and needs of these students often points back to the very tools students are already using. Here are just three recent instantiations of this:
- The 2007 Horizon Report, published jointly by the New Media Consortium and EDUCAUSE, lists an array of “technologies to watch,” by which the authors mean tools and applications that they anticipate will be broadly adopted by universities across the United States in the coming year; among the technologies listed this year are social networking tools, virtual worlds and massively multiplayer games.
- Some scholars are also promoting the use of games as models for learning. Douglas Thomas and John Seely Brown, for example, in a working paper titled “The Play of Imagination: Extending the Literary Mind,” describe the ability of multiplayer online games to “allow players to construct vivid and meaningful ‘conceptual blends' by taking different worlds (such as the physical and the virtual) and combining them to create new and better ways to understand both the game world they inhabit and the physical world.”
- The multi-user virtual environment Second Life is now home to the sites of more than 200 colleges and universities around the world, and educators are working hard to develop new pedagogical practices designed for the affordances of immersive, virtual environments.
The demand for an expanded definition of literacy to accommodate visual and aural media, then, is not particularly new, but it now carries with it calls for rethinking not only what we teach but how we teach. It also gains urgency as college students transform, becoming producers of media in many of their everyday social activities. The response among those who grapple with these issues as instructors has been to advocate for new definitions of literacy and an emphasis on visual literacy. These efforts are exemplary, and promote a much needed rethinking of literacy and models of pedagogy. However, what I would like to argue here, in what is more akin to a manifesto than a polished argument, is the need to push farther: What if we moved beyond visual rhetoric, as well as a game-based pedagogy and the adoption of a broad range of media tools on campus, toward a pedagogy grounded fundamentally in a media ecology? Framing this investigation in terms of a media ecology allows us to take account of the multiply determining relationships wrought not just by individual media, but by the interrelationships, dependencies and symbioses that take place within the dynamic system that is today's high-tech university. An ecological approach allows us to examine what happens when new media practices collide with computational models, providing a glimpse of possible transformations not only of ways of being but of ways of teaching and learning. How, then, may pedagogical practices be transformed computationally or algorithmically, and to what ends?
What Is an Algorithm? What Is Computation?
To begin to answer this question, we need to consider the nature of algorithms and computation, acknowledging up front the fact that the desire for an algorithmic model is produced at the intersection of cultural, technological and social needs. Indeed, as the deployments of the term outlined below indicate, algorithm and computation here function as metaphors and bear the weight of the desire to articulate a still nascent practice within an emerging social sphere.
Algorithms are closely tied to generative art practices in that generative art sets up parameters and then allows for the unfolding that ensues. Philip Galanter defines generative art as any practice in which “the artist creates a process, such as a set of natural language rules, a computer program, a machine, or other procedural invention, which is then set into motion with some degree of autonomy contributing to or resulting in a completed work of art.” Marius Watz adds a distinction between generative processes that are deterministic and those that are more open: “A central aspect to the generative approach is the use of an externalized system, created by the artist but rarely completely under her control. Standard software tools are deterministic systems that always produce the same results, while generative systems are dynamic processes that must be harnessed and even farmed. The artist specifies the initial boundaries and strategies of creation, and then enters into a feedback loop of adjusting parameters in a search for optimal regions in parameter space. The moment of genuine surprise is often the moment of breakthrough.” Finally, pushing a bit further, Inke Arns suggests that we not look merely at the results of a generative process, but instead at the software itself. In “Code as Performative Speech Act,” Arns writes, “Software art does not regard software merely as a pragmatic, invisible tool generating certain visible results or surfaces, but on the contrary focuses on the program code itself - even if this code is not explicitly being laid open or put in the foreground. According to Florian Cramer, software art makes visible the aesthetic and political subtexts of seemingly neutral technical commands.” What we can take from the artworld's interest in generative and software-based practices is the desire to set something into motion, to relinquish some amount of control, and to reference directly the functionality and parameters of the tools - in this case software - that produce the conditions for the experience.
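Watz's distinction between deterministic tools and open generative systems can be sketched, if only schematically, in a few lines of code. The sketch below is my own illustration, not an example drawn from Watz's practice: the “artist” fixes the parameters of a one-dimensional random walk but not the outcome of any individual step, and supplying a seed collapses the open system back into a deterministic one.

```python
import random

def generative_walk(steps, step_size, seed=None):
    """Unfold a 1-D random walk: the parameters (steps, step_size)
    are fixed in advance, but each step's direction is not."""
    rng = random.Random(seed)
    position, path = 0.0, [0.0]
    for _ in range(steps):
        position += rng.choice([-step_size, step_size])
        path.append(position)
    return path

# With a fixed seed the system always produces the same result;
# without one, each run unfolds differently.
fixed = generative_walk(100, 1.0, seed=42)
open_ended = generative_walk(100, 1.0)
```

The point of the sketch is Watz's feedback loop in miniature: the artist adjusts `steps` and `step_size`, sets the process in motion, and then surveys what the parameter space yields.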
Algorithms have also recently inspired designers. In her graduate thesis titled “Allegorithm,” graphic designer Juliette Cezzar, for example, defines an algorithm as a “pre-programmed procedure for an expected or unexpected result,” adding that it is also an attitude, technique, perception and procedure. For Cezzar, algorithms are constitutive of decidedly non-computational practices, such as baseball games, the methodology of scientists and the essay structure deployed by journalists. In all of these cases, an algorithm functions by allowing its users to follow a set of predetermined rules toward expected results. However, the more interesting direction of algorithms is toward unexpected results, as conditions are set in place that generate unanticipated outcomes.
McKenzie Wark addresses the idea of the algorithm in his book, GAM3R 7H3ORY, which looks at gaming as a series of allegories for daily life. In his discussion of allegories and algorithms, Wark defines an algorithm as “a finite set of instructions for accomplishing some task, which transforms an initial starting condition into a recognizable end condition.” He also notes that “what is distinctive about games is that they produce for the gamer an intuitive relation to the algorithm.” He continues:
If the novel, cinema or television can reveal through their particulars an allegory of the world that makes them possible, the game reveals something else entirely. For the reader, the novel produces allegory as something textual. The world of possibility is the world of the linguistic sign. For the viewer, the screen allegory is something luminous. The world of possibility is the world of mechanical reproducibility. For the gamer, the game produces allegory as something algorithmic. The world of possibility is the world internal to the algorithm. So: a passage from the topic to the topographic, mediated by the novel; a passage from the topographic to the topological, mediated by television; a passage, mediated by the game, from the topological to as yet unknown geographies, a point where the gamer seems to be stuck.
Gaming, then, is a point on one vector of transformation, from the novel to the screen to the immersive world of the game, and from the textual sign to the world of mechanical reproducibility on to the world internal to the algorithm. Gaming takes us inside the algorithm, producing for the gamer “an intuitive relation” to that unfolding that is the algorithm. Rather than an allegory, the gamer experiences the putting into play of a set of elements determined in part by code.
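Wark's definition - “a finite set of instructions for accomplishing some task, which transforms an initial starting condition into a recognizable end condition” - can be made concrete with a canonical example of my own choosing, Euclid's procedure for the greatest common divisor, which neither Wark nor Galloway cites but which fits the definition exactly:

```python
def gcd(a, b):
    """Euclid's algorithm: a finite set of instructions that transforms
    an initial starting condition (a pair of integers) into a
    recognizable end condition (their greatest common divisor)."""
    while b != 0:
        a, b = b, a % b  # each pass strictly shrinks b, so the loop is finite
    return a
```

However the initial pair is chosen, the same handful of instructions carries it to a determinate end state; the gamer's “intuitive relation to the algorithm” is precisely a felt sense of such an unfolding from within.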
Finally, in the preface to his book Gaming: Essays on Algorithmic Culture, Alexander Galloway defines an algorithm simply as “a machine for the motion of parts.” He goes on to describe video games as an essentially active medium, by which he means a medium “whose very materiality moves and restructures itself.” Galloway's book is an often eloquent examination of video games as a cultural form, but for my purposes, his text, alongside the others noted above, offers a useful vocabulary for a set of computer-based actions that contribute to a new model for the sort of pedagogical practice I am trying to articulate, one similarly grounded on algorithmic unfolding and machinic processes. Following this lead, we should consider relinquishing pedagogical models based on representation, narrative and discourse and move toward an information-based model, one in which cultural objects are technologies and the reader/viewer becomes a user or player.
However, we can only undertake this consideration with the following recognition: as Florian Cramer points out so well in “Words Made Flesh: Code, Culture, Imagination,” algorithms not only date back well before computers, but they also play a part within a cultural imaginary. He writes, “With its seeming opacity and the boundless, viral multiplication of its output in the execution, algorithmic code opens up a vast potential for cultural imagination, phantasms and phantasmagorias. The word made flesh, writing taking up a life of its own by self-execution, has been a utopia and dystopia in religion, metaphysics, art and technology alike.” Algorithms promise almost magical possibilities and the fulfillment of utopian transformation, and their deployment here, as metaphor and model, is with the recognition of their function within a larger cultural imaginary.
Comparing Pedagogical Models
Models of pedagogy are complex, and range from the broad articulation of pedagogy by writers such as Paulo Freire who, in his 1970 book Pedagogy of the Oppressed, argued for replacing a “banking” model wherein an instructor makes scholarly deposits into the student with a model grounded in praxis, dialogue and attention to the contextual politics of any given pedagogical situation, to the work of Henry Giroux who argues for an analysis of “the meaning of knowledge, classroom social relationships, and the political and cultural nature of schooling.” This essay cannot accommodate a full explication of the field of pedagogical theory; instead, I want to examine some of the much more basic components of teaching practices in colleges and universities, and ask how an almost whimsical rethinking might point us toward a computational pedagogy. Below are six small incursions into pedagogical practices and models, offered in the hope of creating a computational or algorithmic pedagogy:
Select and Combine: Using the Database Instead of Narrative
In many liberal arts classes, the 16-week semester (or 10-week quarter) is arranged linearly as a narrative that begins with a set of questions and concludes with a set of answers, however provisional. The class often includes an arc that resembles that of a narrative, as issues and conflicts reach a climax, then conclude with some sense of resolution. Obviously this is a sweeping description of classroom models, but it fits many courses that strive to address a theme via a series of questions. A narrative model, however, embodies an outdated epistemology and speaks to an older generation. Indeed, if, as Fredric Jameson claims, narrative is the fundamental cultural model for the 20th century, networks and participatory culture define our current moment, bringing with them a different set of themes, structures, practices and ideological constraints. In place of the narrative model, we might imagine the content of our courses as a database that is open to multiple points of entry and innumerable patterns of selection and combination that are realized most fruitfully through a kind of collaborative remixing by student and professor. In this way, the process of learning aligns with the activities of processing and play, such that the course itself becomes “a machine for the motion of parts.” Within this machine, the instructor serves as both designer and player, setting initial rules and expectations in place, but recognizing the need for improvisation as a given enacting of possibilities unfolds.
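The database model can be caricatured in code. In the sketch below - a deliberately toy illustration, with hypothetical unit titles and tags of my own invention - course content sits in a flat collection rather than a fixed sequence, and a syllabus is just one of many possible selections and combinations realized from a student's point of entry:

```python
import random

# A hypothetical course "database": each unit carries tags
# rather than a fixed position in a linear, week-by-week narrative.
units = [
    {"title": "Visual Rhetoric",   "tags": {"media", "rhetoric"}},
    {"title": "Generative Art",    "tags": {"code", "art"}},
    {"title": "Networked Publics", "tags": {"media", "networks"}},
    {"title": "Software Studies",  "tags": {"code", "media"}},
]

def select_and_combine(units, entry_tag, k=2, seed=None):
    """Select units matching a point of entry, then combine k of them
    into one possible path through the course material."""
    rng = random.Random(seed)
    pool = [u for u in units if entry_tag in u["tags"]]
    return rng.sample(pool, min(k, len(pool)))

one_path = select_and_combine(units, "media", k=2, seed=1)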
From Stasis to Process: Allowing for Unexpected Outcomes
The algorithmic model dares us to relinquish the vaunted “learning objectives” dutifully listed on our syllabi in favor of unexpected outcomes. Once set in motion, the course opens up to unforeseen arrangements and convergences of student interests, needs and abilities. As we forego the specificity of learning objectives, we might gain ground in the larger project of teaching students how to learn, moving beyond the acquisition of information to the acquisition of the ability to learn. As learning objectives themselves become decentered, traditional approaches to evaluating student work must be rethought. One clear implication is an emphasis on process over product in student learning and a model of pedagogy that takes account of different learning styles and a range of possible goals and priorities that may vary from student to student.
Toward Distributed Authority
Another traditional pedagogical model places the instructor in the position of expert delivering knowledge to learners in an institutional structure that frequently remains transparent to students. Known generally as “direct instruction,” this model, too, is outdated in many classroom contexts, and does not align with the broader array of social practices of students, who know that the role of expert is contextual. Further, in technology-based classrooms, students often arrive with more expertise in certain areas than their instructors. An algorithmic pedagogy would examine the structure of the class itself, looking at the specific arrangement of institutional power alongside the expectations of participants. Obviously, instructors do bring fundamentally significant contributions to the classroom, as well as a degree of institutional power that, despite gestures toward divestiture, always remains in place. That said, however, how might one experiment with the deployment of expertise using a network as metaphor? Going further, how might we include networked experience in conjunction with class-based experience? How might we find ways to mobilize a user-oriented, participatory pedagogy in which students cycle fluidly through the roles of teacher, learner and synthesizer?
Soft Media Objects
In a 2006 presentation at Hyperwerk in Basel, Marius Watz noted that “things change when they become digital. A digital video is no longer a tape. It is suddenly a soft media object…” Soft media objects are those that are eminently mutable; situated in a vast online database of traded media, they become the fodder for tactical reuse as they are read, used and redeployed in sometimes eloquent ways akin in spirit to the critical analyses we favor as instructors. As we incorporate these media into the classroom, the modernist emphasis on medium specificity gives way to awareness of the function and context of any media object. The result is a transformation of “finished” art works, texts and media objects into raw materials that are ripe for reinterpretation and recontextualization. A student's coursework is thus inscribed in a process of knowledge production and transformation that continues beyond an individual class and may ideally be conceived as having relevance beyond the university.
Code Literacy: Understanding Dynamic Reading and Writing
Another traditional component of university teaching relies on reading and writing as central practices for gauging student learning. However, more and more what students need is the ability to read and write dynamically. The information around us moves and changes faster than ever, with much of that information existing not in stasis but in a constant state of flux, with ever changing relationships to other data. How can we understand writing and communication when we are dealing with dynamic information? In his graduate thesis, Ben Fry, who with Casey Reas developed the open source programming language known as Processing, asks, “What does the world economy look like? How can the continuously changing structure of the Internet be represented? It's nearly impossible to approach these questions because few techniques exist for visualizing dynamic information.” Processing allows designers who are non-programmers to play with code and begin to understand how to read and write dynamically, and this ability is one needed not only by designers but by our culture at large. The ability to engage directly with code, even on a fundamental level, allows for new forms of reading and writing that transform conceptions and definitions of literacy.
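What it means to “read dynamically” can be glimpsed even without Processing itself. The fragment below is my own minimal sketch, not Fry's: rather than summarizing a static text once, the reader yields a fresh summary of a data stream as each new value arrives, so the same piece of code produces a different picture whenever the underlying information changes.

```python
from collections import deque

def moving_average(stream, window=3):
    """Read dynamic information: yield a running summary of a stream
    as each new value arrives, rather than a single static snapshot."""
    recent = deque(maxlen=window)  # keep only the most recent values
    for value in stream:
        recent.append(value)
        yield sum(recent) / len(recent)

# The "text" here is in flux: extend the readings and the
# summary extends with them.
readings = [10, 12, 11, 15, 20]
summary = list(moving_average(readings, window=3))
```

Trivial as it is, the example marks the shift Fry points to: writing, in this mode, is the design of a process that stays in conversation with changing data, not the fixing of a finished statement.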
Toward Virtual Education: A Process of Invention
A computational or algorithmic pedagogy points to a form of virtual education, with “virtual” in this case being understood in the sense articulated by Gilles Deleuze who defines the term across several years and texts. For Deleuze, the virtual and real are not opposed, nor does the virtual correspond with the possible; rather, the virtual is that which generates a thing's actuality. In replacing the possible-real binary with virtual-actual, Deleuze posits a form of becoming that is not algorithmic in the sense of producing expected and predetermined outcomes but instead in generating the unexpected, and a sense of the “impossible.” In a review of Keith Ansell-Pearson's Philosophy and the Adventure of the Virtual: Bergson and the Time of Life, Daniel W. Smith writes that for Deleuze, “the ‘rules' of virtuality are no longer resemblance and limitation, but difference and divergence. The virtual is itself entirely differentiated; and in actualizing itself, it does not proceed by limitation or exclusion but rather must create its own lines of actualization in positive acts that require ‘a process of invention' (72).” The virtual classroom, then, is a space for the actualization of emerging desires, and an algorithmic pedagogy would enact this process of invention.
The model of algorithmic pedagogy proposed here is admittedly provisional, but not entirely metaphoric. As with attempts to understand the functioning of complex systems, algorithmic pedagogy knits together ideas across divergent territories of disciplinary thought and practice. It is no accident that the vocabulary and epistemological framework that enables this argument derives from computers and digital culture. Indeed, the work of university faculty is rapidly coming to resemble that of systems engineers, architects and designers, as opposed to mere experts. And while this article has focused on the implications of algorithms (and algorithmic thinking) for teaching and learning, an equally profound impact may be seen on the evolution of other realms of scholarly practice, particularly research and academic publication. As the whole of academia slouches grudgingly into the digital age, it may well be time to reexamine an even broader range of conventions, standards and expectations across the academic spectrum.