McCarty 2010

From Whiki
McCarty, Willard, ed. Text and Genre in Reconstruction: Effects of Digitalization on Ideas, Behaviours, Products and Institutions. Cambridge: Open Book Publishers, 2010.

Cybertextuality by the Numbers, by Ian Lancashire (37-69)

cybertextuality "theorizes the authorial process" by bringing together cybernetics, self-testimony, cognitive psychology and computer-assisted text-analysis (37-8)

"Computer text-analysis (stylistic counts and the repeating patterns in textual concordances) detects what even the author cannot perceive about himself: it reveals these chunks in the phrasal vocabularies of authors." (38)

text-analysis software, etc.

  • "Such tools extend the human mind in uttering. By availing ourselves of them, we become cyborgic and partake of the character of a cybernetic organism. Authoring of texts is a recursive process in which hand-shakes cycle between a sender who utters something and a receiver who perceives the sent message and feeds back information about it to the sender." (39)

death of the author and other postmodern ideas arise from a "legitimate perception of how unconscious the author is of cognitive creative process, of how mysterious it is" (41)

  • "from a cybertextual perspective, however, the author is partly alive in the work" (41)
"Cybertextuality asks us to read texts in a new way, to discover within them the stigmata of authoring -- the marks that distinguish its subjection to cognitive limitations. These are partly observed in uttering-feedback cycles as they create evanescent or frozen texts."

need to re-encode visual text phonetically while reading to understand (44)

error correction in speech through hesitation and paralanguage/fillers (47)

"Language self-consciousness appears to be a stream but, when examined closely, consists of staccato-like pulses in which a succession of chunks, proposed by a cognitive conceptualizer (which we experience as the gist of what we intend to say), are monitored for correctness by a parallel process before being articulated. The Muse who brings texts piecemeal into being from darkness, and the Editor who announces corrections to those texts and knits them together from much the same obscurity, feedforward and feedback our utterances in cybertextual cycles." (49-50)

expert cognitive capacity can be expanded in specific domains, but there are capacity limits

  • "So far, little attention has been paid to text genres as evidence of cognitive capacity." (53)
  • "The extent of text that an author can write before repeating himself might well signal his omega value." (56)

case studies in Shakespeare's Sir Thomas More and Woolf's The Waves

"Readers do not constrain an author's works; the author's cognitivity does." (68)
"When Christians search for the Logos in the texts of the Bible, and today when we use, in a revealing metonymy, an author's name for his collected works, we anthropomorphize an alien neurological entity that we also have within us but of which we are all, nonetheless, largely unconscious. Cybertextuality does not deny the loss of the creator with a photographed face and a pronounceable name but finds in all texts an anonymous entity. We need authorship attribution methods that can analyse more works than are orphaned in copyright limbo." (69)

The Human Presence in Digital Artifacts, by Alan Galey (93-117)

"This essay considers the tensions between the surface orderliness of scholarly resources and the stubborn irregularity of textual materials." (93)
"This essay argues it should be disquieting to see a deepening separation of material form from idealized content in our tools at the very moment when literary critics have established the materiality of texts to be indispensable to interpretation. As digital textual studies takes shape as a field, it finds itself caught between these divergent trends in computational practice and literary theory." (94)

textual scholarship "driven by an anxious desire to know what lies beneath the perceptual surface" (94)

Carpaccio's Vision of Saint Augustine juxtaposed with a Scientific American image showing the NYPL

  • "One could read the Carpaccio and Scientific American images' differences as emblematic of the digital humanities in its present state, which emphasizes abstract, large-scale approaches such as linguistic corpora and data mining, the social-science version of literary history practiced by Franco Moretti (so-called distant reading), and text analysis techniques that derive patterns from multitudinous low-level observations rather than situated acts of subjective interpretation. These approaches represent a movement away from the humanities' traditionally idiographic tendency (to seek local knowledge about specific cases) and toward the natural and social sciences' nomothetic tendency (to seek abstract patterns and general laws)." (99)
"The concerns this essay advances have tended to remain tacit in the digital humanities, a field whose sustaining progress narratives and investments in fundable projects foster a sense of itself as an onward march into the future -- an avant garde that was the first to embrace computing as a tool for humanities scholarship. Yet the tool-building enterprise risks falling into a binary in which digital tools represent innovation, dynamism, and provocative instability, while the materials they operate upon -- very often literary texts -- represent availability, continuity, and unproblematic stability. This binary makes it easy to forget textual work always has an interpretive dimension that depends upon the complexity of humanities materials" (100)

digital humanists are in the same position the New Bibliographers were (100-1)

"Although textual scholarship often presents itself in a conservative light as a conduit of tradition and guardian of cultural heritage, its own future depends upon recognizing, pace Greg, that all recorded texts are also of value in proportion as they provoke thought and change in the present." (105)

textual scholars can read beneath the surface, countering screen essentialism (106-7)

"Digital textual scholars have found themselves charged with building a new humanities archive using someone else's tools." (109)
"is it desirable, let alone possible, to divide the content of a text from its material form for the purposes of machine-readability and large-scale computations?" (110)

disconnect between form and content in interface design, critiqued by Kirschenbaum and McGann, "is not merely loose thinking on the part of designers, nor a matter of critical inattention to discipline-specific theoretical discourses, but a basic conflict of values between the text-oriented humanities and other, data-oriented disciplines. Methodology reflects epistemology, and tools can invisibly import assumptions from other fields into the humanities." (112)

"humanities computing has bypassed such moments of theoria in its tendency to think of interface design and text encoding as separate activities, each happening at the opposite ends of the research plan. ... The deferral of interface thus represents not so much work left undone as a missed opportunity to articulate what's at stake in how the humanities understand texts." (114)
"what does the conjunction and signify in a term like history and future of the book? Is it merely a hasty splice between disciplines, or an expansion of an established field into new territory?" (115)
"For a text encoder working under a computer science model to treat data as extricable from their presentation is consistent with best practice. For a literary scholar to treat texts as inextricable from their presentation is also consistent with best practice. This is the methodological crux facing digital textual scholars of the present and future." (116)