What’s it like to edit Four Zoas? I think Sam Neill knows.
OK, so maybe we’re not quite at that level of John Carpenter horror-insanity. (Yet.) But the Four Zoas manuscript still presents an incredible challenge for any who dare to approach and attempt to define and contain — through editing — this unwieldy monster. It will drive you mad, eventually.
So at the outset of this new academic year, I thought it would be useful to revisit our progress on the project in a more reflective way, to identify where we’re at in the development of a digital edition and to elucidate the myriad complications and problems in working on such a project. If this comes across as a therapy session, well, reader, it basically is. Start the clock.
Some background: the Four Zoas manuscript has been “in the works” in the Blake Archive for well over a decade. New digital photography of the entire manuscript was secured from the British Library in 2005; we’ve since published those images in “preview mode” in the Archive. In that same time frame — and even before the new images were secured — several attempts were made at actually marking up the manuscript using our usual XML schema that’s in place for other works, like illuminated books. While those attempts were useful in learning more about the text, the complexity of the manuscript essentially pushed our editing and encoding protocols to the point of collapse. Madness. That history, along with a more recent round of prototyping a digital edition of Four Zoas, is documented in this article a few of us wrote for 19.
So why can’t it just be done? The way I see it, the challenges for editing Four Zoas break down into three categories: technical, institutional, and aspirational. I’ll explain.
Technical
As I mentioned above in the quick historical overview of the project, Four Zoas is too complex a manuscript for us to 1) use our usual encoding protocols and still 2) maintain our editorial principles in providing adequately diplomatic transcriptions for a scholarly documentary edition. As a documentary editing project (of sorts), our transcription and markup principles are “diplomatic” in the sense that we attempt to model what the original inscription looks like. “Transcribe what you see” is our motto. That means we do things like maintain mistakes from the original, as well as include various editing marks that may have been made, like cross-outs or erasures. If something is illegible, we don’t simply skip over it or even relegate it to a note — we have editorial markup for illegible characters that produces a specific symbol in the transcription. We spend an inordinate amount of time debating whether a mark is a comma or a period. You get the idea.
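To make that concrete, here is a minimal sketch of what a diplomatic transcription of a single line might look like. The tag names are invented for illustration only; this is not the Archive’s actual schema:

```xml
<!-- Hypothetical markup, for illustration only; not the Blake Archive's actual tag set. -->
<line>
  <del type="crossout">some earlier wording</del> some later wording
  <illegible chars="2"/>  <!-- displays as one placeholder symbol per unreadable character -->
</line>
```

The markup records the acts of inscription (a cross-out, two unreadable characters) rather than a cleaned-up reading text.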
The problem with the Four Zoas manuscript is that its revision history as a document is incredibly complicated. Not only are there multiple stages of revision present in the surface-level inscriptions of the page, but the pages have also undergone several stages of revision underneath the surface. (Blake reused paper from old projects.) And so, in order to simply render something coherent by way of a transcription using our usual method, we would need to ignore the vast majority of the manuscript’s physical characteristics. Any attempt would inevitably amount to an unacceptable swing towards “readability” and away from “reliability” or fidelity to the original document.
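To get a feel for what’s being asked of the markup, imagine tagging every piece of inscription with the revision stage it belongs to, so that a display could show or hide the stages independently. The sketch below is invented for illustration; it is not the Archive’s actual protocol:

```xml
<!-- Invented sketch, not the Archive's actual schema: each piece of
     inscription is assigned to a revision stage. -->
<zone stage="1" medium="pencil">earliest wording, later erased</zone>
<zone stage="2" medium="ink">
  <del>wording crossed out in ink</del>
  <add place="above">wording added above the line</add>
</zone>
```

Multiply that by every line of the manuscript, including the layers hiding beneath the surface, and the scale of the problem comes into focus.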
Realizing this early on, we knew we had to develop some new protocols. After much deliberation and experimentation (years of it), we have settled on relatively modest changes to our XML markup procedure, changes we feel live up to our editorial goals for a digital edition. So it’s all worked out, in theory. Where the rubber meets the road is where our next challenge begins.
Institutional
Our XML markup procedures are only one piece of producing a digital edition. One also needs infrastructure in place to translate that encoded markup into a usable display (usually in the form of a website). We have a great website. It was redesigned a few years ago, both in form and function, and I’d still rate the Blake Archive as among the best scholarly projects online today. (Yes, I’m terribly biased. Doesn’t mean I’m wrong.) But the website still runs off of our “standard” markup procedures; that’s the language it knows, and that’s the one it can read and transform into a website display. If we’re going to update the way we edit and encode a text, then we have to teach the website infrastructure the new language as well. That’s a difficult, and much different, task. And the changes needed for a new display are about as complex as FZ itself.
As with building anything big and complicated — a skyscraper, a race car, whatever — a big web-based project usually requires a lot of different skill sets and knowledge bases. Almost no single person has all of them. And even when someone does, it’s rarely feasible for that one person to do all the work alone. In the case of building a new web display that could accommodate our reworked encoding protocols, we needed help from someone who could do real web development.
Like so many DH-ers in academic departments, we went running to our university’s DH lab/center/cluster/thing for guidance and support. You know: the people in the library who always seem to be doing everything for everyone all of the time. Being the magnanimous folks that they are, of course they were willing to help. But the DH center as a resource, at least on our campus, is at a constant redline of activity. Maxed out. It is often difficult to schedule meetings or work sessions. And ultimately, there is only one programmer/web developer who’s really able to help us. (You have one of these folks on your campus, too, the one always summoned to sherpa sorry butts like ours up DH mountain. Ours is “Josh,” soon to be knighted and inducted into sainthood, I hope. He deserves it.)
This is to say nothing of our own work structure(s). The Blake Archive, in both of its group iterations at Rochester and Chapel Hill, is powered by the academic jet fuel that is PhD labor. We’re graduate students. We have our own research and projects to contend with as we progress to completion, not to mention trying to figure out just what we can do when we’re done (spoiler: probably not academia). In other words, we’re busy. Additionally, our funding model — at Rochester, at least — is set up to allow about 2-4 hours of work from assistants per week. (2-4, officially.) You can see how the patchwork nature of schedules and funding could begin to put a damper on things.
And even simply within the Blake Archive itself, we have multiple projects (dozens?) going at all times. We have a publication schedule set each year, and we work on and complete more “standard” projects accordingly. It’s difficult to maintain a robust R&D wing (which is how I think of the Four Zoas project within our workflow) while also keeping up with the day-to-day operations.
For those working in academia, this insane style of work is not news. It’s just the biz. And I’m not even complaining. Here, though, the sprawling collaboration needed to get this project done seems to complicate that already-complicated academic mode of productivity tenfold. I’m finding the usual way things get done in the university to be, at times, completely at odds with making progress on this project.
Couldn’t we outsource some of it? The raw web-development stuff, specifically. It’s certainly been discussed. And even more certainly, a person or firm in the private sector would likely have more practical expertise — and more time, as we would buy it — in the specific types of web-based programming we need. (In our case, we’re essentially rebuilding our display in JavaScript.) The problem with that, though, is that our project, like many DH projects, sits at an awkward crossroads of new and legacy tech. XML (and historically, SGML) isn’t really used out in the world the same way that university digital editing projects use it. So even if we find a capable and available JavaScript-er, it’s unlikely they’d be able to do anything with our markup. We’d necessarily spend all our time explaining the markup and FZ to them (driving them insane in the process). Indeed, the people who have the most experience at these crossroads are the people in the DH centers. And around we go.
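To give a sense of what that translation work involves, here is a toy sketch in plain JavaScript, using the invented tags from the sketches above. A display layer has to walk the XML and decide how each editorial element should render in the browser; none of this is the Archive’s actual code, and the real markup is far more involved:

```javascript
// Toy sketch only: render a (hypothetical) diplomatic transcription as HTML.
// The tag names are invented; this is not the Blake Archive's actual code.
const xmlString =
  '<line><del type="crossout">some earlier wording</del> some later wording <illegible chars="2"/></line>';

const doc = new DOMParser().parseFromString(xmlString, "application/xml");

function render(node) {
  if (node.nodeType === Node.TEXT_NODE) return node.textContent;
  const inner = Array.from(node.childNodes).map(render).join("");
  switch (node.nodeName) {
    case "del":       return `<s>${inner}</s>`; // cross-outs stay visible, struck through
    case "illegible": return "?".repeat(Number(node.getAttribute("chars") || 1));
    default:          return inner;             // anything unrecognized: pass the text through
  }
}

console.log(render(doc.documentElement)); // <s>some earlier wording</s> some later wording ??
```

Multiply that little switch statement by every element and attribute in our schema, then by the new FZ protocols, and you can see why “just hire a JavaScript developer” isn’t a quick fix.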
Furthermore, the Blake Archive uses XML for editing in a somewhat idiosyncratic way — similar to, but not compliant with, more widely adopted protocols like TEI. That has nothing to do with the Archive trying to be cool and rebellious; the Archive is simply that old. TEI was coming up at about the same time the Archive was getting going. This is significant because, in cases like prototyping FZ, the Archive loses out on a wider network of potential support and collaboration.
Aspirational
And in the end, we have only ourselves to blame. Right? Looking at Four Zoas, and also looking ahead at other challenging projects like Blake’s marginalia, or his Notebook, we saw an opportunity to expand the capabilities of the Archive as well as push the field of digital scholarly editing forward. One criticism of digital editing projects that I occasionally come across (and I specifically won’t link to examples, to protect the innocent) is that such projects often “simply” reproduce the representational strategies of print editing. The digital medium adds nothing new, allegedly. (I disagree, at least in the case of the Blake Archive, for a whole lot of reasons, of course.) With the FZ project, though, we have an entire history of potentially failed print editions — failed in the sense that none quite manages to balance fidelity to the original manuscript with accessibility for readers. Digital technology might be just what’s needed, and our ideas for a digital edition were/are fundamentally new in that they add elements of interactivity and simulations of three-dimensional structure to our presentation of the text. With this project, we really are aspiring to do something new.
Additionally, over the past 18 months, we’ve even doubled down on our data collection for the manuscript, pursuing multispectral imaging of FZ in a collaboration with the British Library and the Lazarus Project here at UR. This type of imaging is interesting and useful to us because it’s often used to peer into the buried layers of old, damaged, or heavily edited manuscripts. In other words, we could potentially read new things or learn new stuff about the FZ manuscript, especially with regard to its layers of revision. Our revised encoding protocols will allow us to model such layers; multispectral imaging would give us new data to fill out those models.
But collecting that data is yet another new enterprise that brings its own complications. Is it all worth it? I believe, firmly, yes. As that earlier critique implies, we don’t want to simply reproduce print editions. We want to create something new and useful in producing the first digital edition of the Four Zoas.
So we’re working through it all. But it takes time. And I’m sorry, our time is up for today. Thanks for listening.