Hybrid Publishing: Scalar and watching reading write

Last week I attended a fascinating workshop organised by the Hybrid Publishing Lab on ‘Rewiring the Future of Publishing’. Tara McPherson and Robert Ochshorn both gave talks and my notes are underneath. Both Tara and Robert showed many more examples during their talks to illustrate and support their arguments than I describe here. So any sketchiness in the narrative underneath is completely my doing.

Tara McPherson – Designing for difference. Notes on the Humanities and software design

McPherson is interested in critiques of what she calls a specifically American instantiation of the Digital Humanities. 2003 saw a rebranding effort of what was previously known as the computational humanities, which, McPherson argues, was a much more honest term for what that endeavour was about. She starts by referring to Alan Liu’s article in Debates in the Digital Humanities, which asks ‘where is the cultural criticism in the digital humanities?’ The digital humanities rarely extends its critique to the full register of politics, society, economics and culture. The role of cultural theory within the digital humanities and software studies should be extended to a theoretically explicit form of digital praxis. She takes seriously recent claims by, for instance, Gary Hall, who states that the goals of critical theory and of quantitative and computational studies might be incommensurable, and that their productive combination will require far more time and care than they have been given so far. How can we design digital tools and applications that emerge from the concerns of cultural theory and in particular from a feminist concern with difference? This need to devote more time and care to the potential intersections of theory and the digital humanities has been the topic of recent lively and heated (online) debates, such as #transformDH and DHpoco (Adeline Koh). The latter asks whether the digital humanities is a refuge from race, class, gender, sexuality and disability.

McPherson is interested in this exchange in order to formulate the beginnings of a claim towards a specific enactment or re-enactment of the humanities (Katie King). Ian Bogost has argued that, no, computational platforms are not transparent, but that, on the other hand, a blind focus on identity politics above all other concerns has led humanists to neglect exploring the technical quality of media in depth. This argumentation resonates with a lot of digital media studies and digital humanities research, as well as with hardware and software studies. Sometimes we need to ‘bracket’ identity, Bogost argues, even if just for a moment; we cannot focus all the time on identity politics and social justice. McPherson argues, however, that we can talk about feminism and difference without falling into the trap of identity politics. McPherson refers to Anne Balsamo’s use of the work of feminist philosopher Karen Barad in this respect, and her theory of intra-actions, in which Barad understands the relationship between materiality and discursivity, objects and subjects, nature and culture to be fluid, open-ended and contingent. Balsamo builds on this to develop a model of design practice. In such a model, the design of technology, of software code, and of the self proceeds from a messy entanglement with matter and with each other, where entangled means that we lack an independent, self-contained existence.

McPherson then goes back to this idea of the ‘bracketing of identity’. There is a tendency, she claims, to frame computational models as a series of levels, increasingly abstracted and detached from culture, which is a quite prevalent notion. McPherson refers to Mark Marino’s critical code studies. There is a tendency to frame computational methods as maths abstracted from culture; see for instance the Platform Studies book series co-edited by Bogost. Here we can find a chart with the five levels of analysis in new media studies, down to the level of abstraction of the platform, which is seen as the foundational, primal level, alongside examples of ‘boxes’ of culture and context. This model of layers, McPherson argues, puts a conceptual framework in place that distracts from the entanglements and interactions, favouring individual layers instead of focusing on the complex and messy ways in which the layers themselves come into being. The models we work from are important, McPherson argues: if, as Barad argues, matter matters, then how we focus on matter also matters. The digital humanities, and code and platform studies, sometimes make it hard for intra-action to be discerned. This idea of conceptual bracketing, this singling out of code/computation from culture, is important in the discourse of knowledge production as it has been put forward by these disciplines.

As humans, McPherson argues, we see the world as analogue. The idea of the digital is then seen as linear and progressive. These teleological schemes make it hard to sustain genealogical models of the becoming and emergence of media. The question is then: how did the digital appear as such a dominant paradigm in contemporary culture, and how did it privilege this logic of modularity and seriality? We can see this focus on the discrete, amongst others, in digital technologies, in UNIX and in languages like C and C++. How did these discourses from coding culture translate into the wider social world? What is the specific relationship between context and code in this historical context? McPherson argues that there is something very specific happening here, in these forms of the digital and digital culture that encourage these forms of partitioning, but that it is also deeply entangled with culture. The digital is entangled in an analogue world, although it is increasingly determining this world. It is formed by discourses and particular histories, and we need to understand these more. We need to track the messy intermingling and becoming of code and context. And feminist scholarship, McPherson argues, is an important asset here. Humanist scholars should learn to code or acquire advanced technological literacy, but on the other hand feminist phenomenology, postcolonial theory and theories of difference need to be incorporated into digital humanities discourses. If we cannot study all discourse and matter at once, McPherson suggests that we can make the agential cut (Barad) instead of adopting the bracketing strategy (which brings modularity back in). The cut as a methodological paradigm is fluid and mobile. McPherson argues, following Kember and Zylinska, that the cut as a causal procedure performs the division of the world and becomes an act of decision: where and how we focus matters; it helps us understand the relationships between object, subject, discourse and matter, and between identity and difference.

So how do these models matter to the digital humanities? Alan Liu argues that we should use the tools, paradigms and concepts of digital technologies to help rethink the idea of instrumentality. We should design our tools differently: engaging power and difference, laying bare our theoretical allegiances and exploring the intra-actions of culture and matter. McPherson continues by showing some examples of people already doing this. She mentions for instance the Mukurtu CMS project, which rethinks database structures and ontologies from an indigenous perspective. She also mentions examples of feminist game design. Can software be feminist? How can this sort of practice-based work evolve, McPherson asks? Other feminist scholars offer models, such as Balsamo, Brown, Noviski, Juhasz, Cardenas etc. McPherson then continues to talk about her own practice of developing feminist practices and systems of knowledge production and circulation, respecting the paradigms of reading that disciplines have developed over time. She mentions the Vectors journal, established in 2005. This experiment revolved around questions like: can scholarship be multimodal, performative and immersive? What is the relationship of image and text in experimental screen languages? Vectors wanted to do just that: explore experimental screen languages, interactivity and open access. It formed an experimental test bed for the Scalar software: how to develop threaded arguments. As McPherson explains, it is very hard to do this sort of work, as it is not sustainable. The projects were speculative (Drucker), committed to pushing back against the cultural authority of rationalism in the digital humanities and digital design. Humanities scholars’ thinking is not easily amenable to the structures of the database, however. The question is then: how can we hack the database to better serve humanities thinking, and make it easier for scholars to iterate on the database structure?


McPherson narrates how she has been involved in setting up the Alliance for Networking Visual Culture, an organisation which grew out of the work they did on Vectors over the last five years, where they started thinking through and modelling a different scholarly workflow related to digital archives: from the digital objects scholars pin their work on, through research platforms, to publication (including partnerships with archives, scholarly presses and academics). Here the question was: how do scholars work with digital archival materials, and how can they publish them in different, lively ways? How might scholars activate the database and archive differently? This project thus tried to activate the archive not as an objective, neutral space, as just a repository for materials; it was interested instead in theories of difference, in the archive as a space for argumentation and difference, and in the cut as activated in the archive in a variety of ways, pushing towards new forms of publication (including the ethical issues of open access and fair use). The idea of the Critical Commons is important here, and how it ethically reframes cultural production. McPherson also mentions the Shoah project (managed by USC), which is an example of a humanities big data project. What kind of research questions can we draw from such an affectual, embodied ensemble?

McPherson narrates how out of these kinds of collaborations they have been building Scalar, an authoring platform which aims to think about publishing in a variety of ways. It is a horizontal, non-hierarchical, bottom-up platform, but also a human-integrated platform that is more top-down, with possibilities for scaffolding on this structure. An early prototype of this was Alex Juhasz’s ‘Learning from YouTube’. What Scalar does is take content and structure it in dynamic ways, but it also targets the ethical engagements of scholars with platforms: what are the agential cuts of the platform in this respect? Scalar also develops contextual relationships to the archives the content draws upon, making clear where the material comes from and where it is embedded from. It also asks: what can we do with the metadata this object relates to? What if your writing is not linear? Scalar can help with that. Its features are partly based on the idea of structured data, although you can do more experimental, Vectors-like things with the platform too. The software that underpins Scalar, McPherson states, was born out of the frustration of scholars with traditional database tools. Scalar wants to integrate feminist methodologies into the system at the level of software design, by reflecting and extending the research methodologies of scholars. It resists the modularity and compartmentalised logic of dominant computational design by flattening out these structures. In this respect Scalar takes seriously feminist methodologies drawn from theories of the cut, alliance, intersectionality and articulation, both in the support of individual projects and in its technical design principles: authors enter into the flow of becoming through the creation of databases on the fly, engaging with the otherness of the machine; Scalar respects database agency but does not cede everything to it. Scalar thus offers a way to explore the rich intra-actions that connect matter and discourse, to build technology in order to better understand technology and the way it interacts with culture. In this sense it allows us to rewire these circuits, to reactivate the humanist as a critical maker.
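To make this design idea concrete: the sketch below is my own gloss, not Scalar’s actual code, only an illustration of what a flat, non-hierarchical content structure with relations created on the fly might look like. All names in it are invented.

```python
# A speculative sketch of a flat content structure: every item is a page,
# and relations such as paths and tags are created on the fly rather than
# imposed as a fixed hierarchy. Not Scalar's implementation.
pages = {}      # title -> body text
relations = []  # (source, relation_type, target)

def add_page(title, body):
    pages[title] = body

def relate(source, relation_type, target):
    relations.append((source, relation_type, target))

add_page("Archive", "On the archive as a space for argumentation.")
add_page("The Cut", "Barad's agential cut as method.")
relate("Archive", "path", "The Cut")       # a reading path, not a hierarchy
relate("The Cut", "tag", "feminist method")

# Non-linear traversal: follow whatever relations exist from a page.
print([r for r in relations if r[0] == "Archive"])
```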

Robert Ochshorn – Watching Reading Write

Robert Ochshorn, in his talk ‘Watching Reading Write’, starts by asking how thought transmits from the printed page to the imagination of the reader. Our ability to read comes from the object-recognition parts of our brains, he states, so we read text visually, hieroglyphically. Ochshorn questions things such as encodings and representations in the digital realm. How do these get established across levels and platforms? Abstractions like these, Ochshorn states, leak: from bugs to assumptions. Even if we wish to problematise the layers, he states, multi-layering is essential in both software and society at large. One of the most widely held assumptions is that text should be stored numerically, as a string or array of numbers, from ASCII to Unicode. This fundamental assumption about written language persists although it is apparently incommensurable with the way we actually process language in our brains (as images).
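To make the assumption he questions concrete, here is a minimal sketch of what ‘text as an array of numbers’ means in practice (my illustration; the word chosen is arbitrary):

```python
# Text stored numerically rather than as the images we actually read.
word = "reading"

ascii_codes = [ord(c) for c in word]      # Unicode code points (ASCII range here)
utf8_bytes = list(word.encode("utf-8"))   # the same word as raw bytes on disk

print(ascii_codes)  # [114, 101, 97, 100, 105, 110, 103]
print(utf8_bytes)   # identical for ASCII text; multi-byte for e.g. "é"
```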

As an experiment, Ochshorn uploaded a paper as a continuous sequence of word forms. We can then ‘watch’ reading, in the sense of watching the machine move through words visually, based on the shape of the word, as an example of machine reading. Reading writes when the words move smoothly into each other. As Ochshorn states, we never have just an image but only the space in between images. PCA (principal component analysis) makes reading bidirectional: to read means an ability to write too. Analysis and synthesis are deeply entwined here, Ochshorn argues. In a collaboration with a sound artist, Ochshorn did a PCA analysis in which the resulting keys were mapped to detuning. The reading of sound then becomes a synthetic writing, and a recording into the instrument.
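A hedged illustration of that bidirectionality (my own sketch, not Ochshorn’s code, with random arrays standing in for rasterised word images): the same principal components that analyse an image into a short code can also synthesise an image back from it.

```python
# PCA's analysis/synthesis symmetry: reading an image into a code vector
# and writing a new image back out of that code.
import numpy as np

rng = np.random.default_rng(0)
# Stand-in corpus: 200 "word images" of 32x8 = 256 pixels, flattened.
X = rng.random((200, 256))

mean = X.mean(axis=0)
# Principal components via SVD of the centred data.
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
components = Vt[:20]               # keep the top 20 components

# Analysis: read an image into a short code vector.
code = components @ (X[0] - mean)

# Synthesis: write an image back out of the code.
reconstruction = mean + code @ components
print(reconstruction.shape)        # (256,) -- a new image, written by reading
```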

Ochshorn is interested in stagnant structures and representations of media. He is especially interested in video. How can we represent and manipulate media differently? Ochshorn wants to challenge the idea that files are discrete and stable objects, when we can actually manipulate them. Ochshorn also did an investigation into memory and dreams: dreams about data. Memory, Ochshorn argues, is what we see, hear, touch and sense; how do we make sense of that when memory in our brain is finite? Ochshorn is interested in time, in ‘zooming in’ on time. How do we store the past and think about the past and about memory, especially when memory is finite? What is the memory of digital databases, what is the dream of technology? If we see dreams as a manifestation of the unconscious, a function of memory, how can our digital memories enter into our dreams?


Ochshorn presents the video reader he made for the Video Vortex conference, using the open source software he designed called Interlace. You can tag the content and jump through it via a textual layer, which provides new ways of moving through the material and re-engaging with it. The creation tool and the browsing tool become identical here. Ochshorn also designed a hybrid book, which represents reading as writing in a physical object, and which he presented as a gift to the lab. He created a Hyperopia Thing, based on an earlier experiment he had realised at Marcell Mars‘s Public Library festival in 2012, where he had cross-referenced Wikipedia citations with an offline PDF library, such that following a citation link would open the referenced book directly inside the encyclopedia. Ochshorn wanted to physicalise and embody the idea of reading as a form of writing. He carved out a book and fitted it with a Raspberry Pi loaded with an offline copy of the English Wikipedia. He also put a receipt printer into the book. By surfing Wikipedia you leave a trail, similar to Vannevar Bush’s idea of writing on the Memex. Within the custom interface to Wikipedia, a horizontal trail is created of the trajectory you take through Wikipedia. The trail is maintained here and everything is stored in a sidebar too. The trail can then be printed via a receipt out of the book: the idea is that paper begets paper… Ochshorn is interested in this idea of actively engaging with material, where by watching it you may create a trail that you can get back to and add another layer of meaning to.
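A hypothetical sketch of the trail mechanism (all names invented, not Ochshorn’s code): each visited page is appended to a trail, which can then be rendered as receipt text.

```python
# Record each Wikipedia page visited and print the trajectory as a receipt.
trail = []

def visit(title: str) -> None:
    """Append a page to the reading trail."""
    trail.append(title)

def print_receipt() -> str:
    """Render the trail as receipt text; a real build would send this
    to the receipt printer inside the book."""
    lines = ["YOUR READING TRAIL", "-" * 24]
    lines += [f"{i + 1}. {t}" for i, t in enumerate(trail)]
    return "\n".join(lines)

visit("Memex")
visit("Vannevar Bush")
visit("Hypertext")
print(print_receipt())
```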

Ochshorn then goes on to engage us in a conversation about metadata. When we save data to disk, he states, we need to store metadata that allows for its familiar representation. He shows an experimental video game which represents what digital video looks like without metadata: it becomes a stream of raw digital bytes. If we see signal as an example of data without metadata, Ochshorn argues, then noise is even a lack of data. The objective here is to cancel out noise. Patterns emerge when the anti-noise meets the noise, and then there is nothing. The only pattern in noise is itself. One can notice some formal similarity between the appearance of noise and the appearance of signal: without metadata, signal and noise are hard to tell apart.
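A minimal sketch of the point (my illustration, with synthetic bytes): the same raw bytes only become a legible frame once width and height metadata frame them.

```python
# Why machine metadata matters: raw bytes are an image only once
# width/height/format metadata give them shape.
import numpy as np

raw = np.frombuffer(bytes(range(256)) * 96, dtype=np.uint8)  # 24576 raw bytes

# With the right metadata, the stream becomes a legible greyscale frame...
frame = raw.reshape(128, 192)          # height=128, width=192

# ...with the wrong metadata, the very same bytes shear into noise.
garbled = raw.reshape(96, 256)
print(frame.shape, garbled.shape)
```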

Ochshorn continues with the video he tried to decrypt, which was released by WikiLeaks under the name ‘Collateral Murder’. He developed this interface inspired by Assange saying that WikiLeaks had decrypted the video prior to its release. Ochshorn wanted to give others the possibility to decrypt the video, overcoming the formidable hurdles we face to bear witness. However, he says, other sources state that the soldier had access to an unencrypted version of this video. Even if Assange’s organisation did not decrypt the video in order to bring it to our attention, maybe it still decrypted it in a way. This video (including its title) was then not only a release of raw data, but also of metadata. Decryption here was not about making the video machine readable but about allowing the data to become human readable, about making human-readable claims about the video.

Although Ochshorn talks about being ‘against metadata’, perhaps, he argues, it should be called ‘beyond metadata’ or ‘towards a new metadata’. Metadata is too banal and too necessary, he states. Ochshorn pleads: stop designing software that assumes the prior existence of canonical metadata. He discerns three kinds of metadata (sketched in code after the list):

– Machine metadata: designed for the machine to interpret; the width and height of a video screen, etc.

– Meta-databases: relational columns suited to rigid database systems, which then come to stand in place of the original data: book covers, icons, timestamps, etc.

– Interpretative metadata: a layer of data describing what we make of the data
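An illustrative sketch of the taxonomy (my gloss, not Ochshorn’s code) as plain data structures; all fields and values are invented examples:

```python
# The three kinds of metadata as minimal data structures.
from dataclasses import dataclass, field

@dataclass
class MachineMetadata:
    """Designed for the machine: how to decode the stream."""
    width: int
    height: int
    codec: str

@dataclass
class MetaDatabaseRecord:
    """Relational columns that come to stand in for the data itself."""
    title: str
    cover_url: str
    timestamp: str

@dataclass
class InterpretativeMetadata:
    """A layer describing what we make of the data."""
    annotations: list = field(default_factory=list)

video = MachineMetadata(width=1280, height=720, codec="h264")
record = MetaDatabaseRecord("Collateral Murder", "cover.jpg", "2010-04-05")
reading = InterpretativeMetadata(annotations=["helicopter audio begins"])
```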

Ochshorn goes on to talk about compression: it is the basis and enabling force of networked media, and, as with all metadata, it is deemed successful when it is invisible. Compression is about deciding what information is most important and purging the rest. What kinds of assumptions underlie this, Ochshorn asks? He tries to disturb this. When we compress something it should become more understandable, not less so. What would compression look like if it were targeted at humans and not at machines? Ochshorn shows a series of images. The ideal of metadata would be to incorporate the meaning of that data, with compression as a proposal for that meaning. How can we use compression as a caricature, as an exaggeration of distinctive characteristics? The computer scientist Claude Shannon wrote that the meaning of a message is generally irrelevant. Entropy, or information, is relative to the average or expected message, he writes: information is surprise, measured against the encoding of a normal baseline. Is this a bias towards mediocrity, Ochshorn asks? But on the other hand, Shannon writes that a machine might not be able to comprehend meaning, yet could be programmed to let some of it slip through. In this respect, Ochshorn argues, meaning need not be encapsulated within a packet for it to be transported through the medium.
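A minimal sketch of Shannon’s point (my illustration): entropy measures surprise relative to expected symbol frequencies, regardless of what the message means.

```python
# Shannon entropy: average bits of surprise per symbol in a message.
from collections import Counter
from math import log2

def entropy(message: str) -> float:
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(entropy("aaaaaaaa"))   # no surprise at all: 0 bits
print(entropy("abababab"))   # two equally likely symbols: 1.0 bit
print(entropy("the meaning of a message is generally irrelevant"))
```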


Ochshorn then goes on to explore temporal media, sound and video: how can we skim, seek and survey them? What is the potential of the timeline here, Ochshorn asks? The timeline can function as graphic metadata, communicating the contents of a video without our having any prior idea of them. Can we use the timeline as an index to the video? How can we follow the trail of a dancer in the video? Ochshorn then asks whether metadata itself can be a film. He shows the example of the web-based documentary Montage Interdit. Is the film a set of tags? The tags are an entrance point to the material. Metadata as authorship is only the beginning in this respect. Ochshorn asks about reading and writing in this context: how can we connect the two? He explains that the interface you use to tag is the interface you use to represent: the interface is how you add tags and add meaning to the material.
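One hedged sketch of a timeline as graphic metadata (my illustration, in the spirit of ‘movie barcode’ visualisations, with random arrays standing in for footage): compress each frame to a single coloured column, so the whole film becomes a scrubbing index.

```python
# Build a barcode-style timeline: one column per frame, stacked in order.
import numpy as np

def barcode_timeline(frames: np.ndarray) -> np.ndarray:
    """frames: (n_frames, height, width, 3) -> image of (height, n_frames, 3)."""
    columns = frames.mean(axis=2)          # average out the width axis
    return columns.transpose(1, 0, 2)      # stack columns left to right

# Stand-in footage: 300 random 90x160 RGB frames.
frames = np.random.rand(300, 90, 160, 3)
timeline = barcode_timeline(frames)
print(timeline.shape)                      # (90, 300, 3)
```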

Ochshorn asks how we can bring video into the territory of text: how can we give it a spatial component? Ochshorn is interested here in the idea of maps and of space as a territory for mapping. He refers to the Borgesian problem of the map and the territory. Here we see the collapse of creation and authorship: you can never see or read everything; we need distillations, ways to zoom out through time. That was before the advent of slippy maps, however, Ochshorn states. We can now control our view of the past with a fluid zoom control. The video then becomes a map through which we can zoom in and out through time. These kinds of timelines are important here, Ochshorn argues, and they do not need to be images.
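A hedged sketch of what ‘zooming through time’ could mean technically (my illustration, not Ochshorn’s code): a pyramid of progressively coarser summaries, the way slippy maps pre-compute tiles for space.

```python
# A multi-resolution pyramid over time: each zoom level averages pairs
# of samples from the level below, enabling fluid zooming in and out.
import numpy as np

signal = np.random.rand(4096)   # stand-in for frame data over time

pyramid = [signal]
while len(pyramid[-1]) > 16:
    prev = pyramid[-1]
    pyramid.append(prev.reshape(-1, 2).mean(axis=1))

for level, view in enumerate(pyramid):
    print(f"zoom level {level}: {len(view)} samples")
```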

Ochshorn concludes that we need to design videos in such a way that they give context instead of removing it, so that we can open codecs up to our own perception whilst leaving a trail of what has been visually available before. He mentions the idea of stabilisers. But at the same time this is not stable, Ochshorn states; in a way it even renders the video impossible. Ochshorn is interested in this impossibility of indexes, codecs and representations. There is no such thing as a perfect interface that shows everything in any media form. He states that machine metadata can be designed to reveal rather than obscure, and that meta-databases can start to allow creation. He mentions for example annotating soundwaves with text to make them searchable.
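A hypothetical sketch of that closing example (all names and values invented): text annotations pinned to time ranges of a waveform, making the audio itself searchable.

```python
# Searchable interpretative metadata over a sound recording.
from dataclasses import dataclass

@dataclass
class Annotation:
    start: float   # seconds into the recording
    end: float
    text: str

annotations = [
    Annotation(0.0, 4.2, "ambient room noise"),
    Annotation(4.2, 9.8, "speaker introduces the archive"),
]

def search(query: str) -> list:
    """Return the time ranges whose annotations mention the query."""
    return [a for a in annotations if query.lower() in a.text.lower()]

for hit in search("archive"):
    print(f"{hit.start:.1f}s-{hit.end:.1f}s: {hit.text}")
```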
