data and art: intention and chance #dighum1213

Finally home from a week in Europe that has been a crash course in DH. Actually, it’s been a crash course in the issues around DH: the opportunity to see some really cool projects, to think about how the digital in the humanities has the power to shift the paradigm, and also to hear how some within the field really don’t want that shift and would prefer that the digital remain a tool rather than become an epistemology.

On the second day of the Herrenhausen conference, “(Digital) Humanities Revisited–Challenges and Opportunities in the Digital Age,” Julia Flanders presented a thoughtful inquiry into the connection between art and data, and pointed out the false dichotomy between conceptualizing the digital as delimited by the pixel and analog art as constituting an infinite spectrum of creativity. The dichotomy fails, she argued, if we think of art as it has been aesthetically theorized, namely as play within constraints. These constraints can be generic, formal, and linguistic. In the same way, what we think of as the infinite play of signifiers in the process of semiosis, in the making of meaning, is also delimited by sign, signifier, and receptor. Flanders identified the real problem with the pixel as lying not in its boundedness but in the lack of connection between pixel and pixel, in its mere positionality and its lack of artistic intentionality. In a deftly strategic turn from the visual to the textual, Flanders sees encoded text as able to retain far more of (the verbal icon’s) signification. Referring to Johanna Drucker’s work on digital aesthetics, Flanders led us to a notion of XML-encoded TEI that is richer and more multidimensional than the Madonna rendered in a bitmap image.

In direct contrast with the previous panelist, Lev Manovich, who claimed to have thrown out the metadata and to have approached the visualization of big data with no preconceived ideas, the digital editor approaches the analog, manuscript text with judgments about genre, textuality, even authorial intention, in the markup decisions that are to be made about how to formalize and translate the text into a digital aesthetic. So multiple editors can lead to multiple layers of interpretation, exploding any accusation of univocality in the marked-up verbal icon.
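To make that point tangible, here is an invented illustration (a minimal TEI sketch, not drawn from any actual edition) of how two editors might encode the very same manuscript line, each markup choice registering a distinct interpretive judgment:

    <!-- Editor A reads the struck-through word as an authorial revision -->
    <l><del rend="strikethrough">bright</del> <add place="above">burning</add> lamps</l>

    <!-- Editor B, less certain of the deletion, records the first reading as doubtful -->
    <l><unclear reason="faded ink">bright</unclear> burning lamps</l>

Neither encoding is wrong; each formalizes a different reading, and that is precisely the interpretive multiplicity at stake.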

So, how practical is this high digital theory? One of the most refreshing moments of the trip was when Melissa Terras, back at UCL, related the history of the Transcribe Bentham project and her radical suggestion that the thousands of pages of Bentham manuscript could be transcribed through crowdsourcing. Early reactions voiced the doubt that, even if accuracy in transcription could somehow be monitored, what was produced would still not be an encoded text. Trusting in the human capacity to learn, however, the project gave transcribers the opportunity to learn how to mark up places and people, and now, with the project having reached a benchmark number of pages, the team is excited to hand the data over to the computer scientists to see what kinds of things might be learned from the pages.
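For illustration only (a generic TEI-style sketch, not the project’s actual transcription guidelines), marking up places and people in a transcribed sentence might look like this:

    <!-- hypothetical sentence; the tags mark entities a volunteer has identified -->
    <p>On Tuesday <persName>Mr. Bentham</persName> wrote from
    <placeName>Queen’s Square Place</placeName> about the
    <del rend="strikethrough">prison</del> <add place="above">panopticon</add> plans.</p>

Even this small amount of structure turns a free transcription into data that computer scientists can query at scale.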

The mistakes of editors are also enlightening; they give us the chance to recognize our blindnesses and insights. Working with the Fliegel Index to the Moravian papers of the North American missions, the biases of the time become evident in the categories of “knowledge” that the Reverend Carl John Fliegel, a research assistant at the Moravian Archives in the 1950s and 1960s, collated. Despite the amazing value of his index, we must use it with caution; like the digital textual editors of today and tomorrow, Fliegel really could not throw away (his) metadata.
