
Digital Humanities 2013

Notes: These conference notes are being written live. I will get all sorts of things wrong and there will be gaps when I'm bored or too interested (or skipping.)

These conference notes are about the Digital Humanities 2013 conference in Lincoln, Nebraska.

Monday, July 15th

On Monday I had a full-day Association for Computing in the Humanities meeting. Some of the issues that we are all facing around the administration of the digital humanities include:

  • How to make our publications as open as possible.
  • How to deal with the regionalization of societies. Does ACH want to be regional?
  • What sorts of activities should ADHO offer to its members? We have journals and other activities - what is missing?
  • How to welcome people new to dh and our associations?
  • How to be more inclusive?
  • And the ongoing issue of multilinguality.

In general (as I have written before) my sense is that the digital humanities has gone from being a specialized and unimportant field to being central to some perceptions of the humanities, and that this shift has led to various forms of stress. When we were outsiders it didn't really matter that we were mostly white, English speaking and male. Now it matters, and lots of people are rightly asking questions about inclusiveness and the field. We need to be honest about our failings, unflinchingly welcome criticism, and examine our field. A field is only as healthy as its willingness to be honest with itself.

Tuesday, July 16th

On Tuesday Stéfan Sinclair and I ran a workshop on Teaching Text Analysis with Voyant. Our script is here.

A Storify by a participant is here.

I'll be adding my notes on the opening keynote later (I didn't have my laptop so I have to transcribe.)

Wednesday, July 17th

The Design of New Knowledge Environments

I was part of a panel that reported on research from the INKE project. I presented a short history of the interface to text analysis leading up to Voyant and how you can skin Voyant in different ways. Luciano Frizzera presented really cool ideas about multipoint-touch environments for reading variorum editions (see video). Jennifer Windsor presented her thoughts on typefaces for legibility in reading environments. Stan Ruecker presented about our workflow visual prototypes.


We had our Association for Computing in the Humanities Annual General Meeting during lunch. Bethany Nowviskie (our President) introduced all the activities of the year. Jeremy Boggs talked about the new web site. Susan Brown talked about the neat microgrants programme which gives small amounts of funding to new digital humanists. Stéfan talked about the mentoring programme. We had the job slam. Susan Brown then ran the Pedagogy Lightning Talks:

  • Simon Appleford with Jennifer Guiliano talked about which helps with training.
  • Barbara Bordalejo talked about a neat project that shows creativity in a course.
  • Jeremy Brown talked about connecting DH to the real world. BYU helps students by asking them to produce a tangible product that they can use to explain what they do.
  • Brian Croxall talked about a project where students designed an iPad app for "Bellocq's Ophelia". He told the students they would all get As and that worked well!
  • Johanna Drucker talked about students creating tutorials for her DH101@UCLA. Students who took the class were paid to build the tutorials for future students.
  • Matt Gold talked about the Commons in a Box that can be used to connect across institutions.
  • Mark Leblanc talked about MOOCs for credit. He lets students take courses at and then examines them, asks them to do a project, and then gives them credit. He expects to be challenged by colleagues.
  • Jennifer Guiliano and Trevor Muñoz talked about HILT (which used to be DHWI), which builds on the DHSI model.
  • Dan O'Donnell talked about the unessay, which he uses in the digital humanities. He asks students to just screw around and then helps them fix stuff up.
  • Mia asked about how one deals with the anxiety that people feel about failing at computing.
  • Jentery Sayers talked about the Teaching and Learning Multimodal Communication project and how students got a collection published.
  • Bob Scott talked about - the idea is that librarians need to practice being a digital humanist in order to support dh.
  • Erik Simpson talked about Ashplant - creating a sandbox for undergraduate digital projects. The idea is to build a framework by and for students around Ulysses.
  • Glen Worthey recited O DH Pioneers and the need to teach pioneers that are not white men.

Bethany closed the ACH AGM by inviting folk to join.


In the afternoon I went to the poster session where I was involved with two posters. One was led by John Simpson and was on a Framework for Testing Text Analysis and Mining Tools. The second was led by Stéfan Sinclair and it was on Voyant Notebooks: Literate Programming and Programming Literacy. The Voyant Notebooks video is here:


Kenny Chow: A Digital Approach to the Design of Gesture-Driven Interactive Narratives

Chow introduced the GeNIE system and gestures. At MIT in the Tangible Media Group they are looking at gesture interfaces in games and mobile devices.

GeNIE stands for Gestural Narrative Interaction Engine. They are trying to use ideas from the humanities and creating an open source platform. They don't just want to solve traditional media problems with new tech. They want to imagine new uses of gestures. They support both Input Gestures and Storyworld Gestures. Drawing on C. S. Peirce they use the idea of an indexical relationship. He had a great list of indexical relationships between gestures and meaning: Pantomimic, Iconic, Metonymic, Manipulative …

He surveyed studies of human gesture and of human-gestural input.

They draw on Labov for narrative structure. Gestures convey emotional tone that affects narrative. They use shot conventions from cinema for camera angle. He then showed a couple of videos of using the system on an iPad. One video was to show micro-aggressions to help people become aware of them.

There was an interesting question about whether this was about narrative or about ritual. Chow answered that GeNIE is interested in more than just narrative, but also the formation of identity.

Susan Wiesner: Computer Identification of Movement in 2D and 3D Data

They were looking at how to accurately identify movement so that they could recognize movement in 2D (film.) They looked at how dancers describe movement and found that it is very codified. There are genre and style issues between different traditions.

I wonder if they looked at Hachimura's work on Noh movement and using Laban notation.

They used Laban but at a higher level and they are using XML to capture over 200 movements. They did motion capture on dancers doing the movements and edited these to get model moves. They then trained a system with the model moves. The system then tries to recognize moves and classify them.
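I do not know the details of their classifier, but the pipeline they described (model moves as training data, then classification of new clips) can be caricatured with a nearest-neighbour sketch; the feature vectors and move names here are entirely invented:

```python
import math

# Hypothetical sketch: each "model move" is reduced to a feature vector
# (e.g. averaged joint positions); an unknown clip gets the label of the
# nearest model move. The real system is surely more sophisticated.
model_moves = {
    "rise":  [0.9, 0.1, 0.3],
    "sink":  [0.1, 0.9, 0.2],
    "twist": [0.5, 0.5, 0.9],
}

def classify(features):
    """Label a movement clip by its nearest model move (Euclidean distance)."""
    return min(
        model_moves,
        key=lambda name: math.dist(features, model_moves[name]),
    )

classify([0.8, 0.2, 0.3])  # closest to "rise"
```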

I'm not sure I caught the complexity of what they were doing. The system seems to be able to recognize a model move in a 2D video despite different angles and distance from dancer. They seemed to get a reasonable level of recognition.

This is based on codified movements, but now the challenge is to look at uncodified movement and see if the conceptual categorization they developed works. They are looking at concepts of struggle and dances that are supposed to be about conflict.

Jentery Sayers, Devon Elliott and Jeremy Boggs: Made to Make: Expanding Digital Humanities through Desktop Fabrication

They discussed the following:

  • What is desktop fabrication?
  • How is desktop fabrication relevant to digital humanities?
  • What are some example projects?

You can see the slides here.

Desktop fabrication personalizes the manufacturing process. DF is more about prototyping than creating replicas. DF is an umbrella term for more than 3D printing. Neil Gershenfeld has a good book called Fab. Fabrication will bring the programmability of the material world to the personal. At U of Victoria they are creating open kits inspired by Fluxus. They are "building" on talk about building in the digital humanities by Steve Ramsay and myself. Jentery gave two examples, including the "Telescribe Kit" from 1914, a failed technology. They are collaborating with museums to digitize artefacts in the history of science and technology. They also look at patents and science fiction. Another kit is the "Flash Jewellery Kit" from 1884. They are replicating jewelry women would wear to display technological innovation.

Some of the implications

  • Applied media history or science and technology studies. See work of Ratto (2013) and Turkel.
  • Look at material of old technology. Understand history as made artefacts. See Gitelman and Kirschenbaum.
  • Better understand history of inscription and transduction. See Chun, Sterne, and Vismann.
  • Also tacit knowledge and sense history. How do we understand the experience of the hand?
  • And understand embodiment. (I got behind here, but he made a connection to understanding women and technology that I missed.)

Jeremy then talked about the work at the Scholars Lab. People have now heard of making because of the stories around the printing of guns. Jeremy talked about "experimental humanities" - trying things and breaking them. The library isn't just a repository, it is a laboratory. Jeremy mentioned the "hermeneutics of screwing around." The library is a good place for this open ended exploration. They are also interested in playgrounds of loose parts.

Devon Elliott is at the Lab for Humanistic Fabrication, which is used by public history students who learn about interactive exhibit design. They have a lot of public-facing work. Students have to learn about tools and resources at hand, which is different than reading about them or ordering them.

How can these tools be used in research? Devon's PhD is on the history of stage magic. He can't easily build the stagecraft - the complicated sets. With desktop fabrication he can engage with the sources, that often include diagrams that are incomplete. By making them they can find the parts left out or the gotchas. It also allows them to identify the tacit knowledge and implicit assumptions in sources. Devon can create scale illusions and see what works and what doesn't.

Jentery then talked about next steps and needs. We need modular spaces for DIY. Jentery has talked about pop-up maker spaces. We should think of infrastructure as parts - they burn out and disappear (and get used.) We need coursework. We need best practices for circulating online (think about the gun schematics.) We need more attention to eco-friendly manufacturing. We need to connect with comparative media studies. We should be able to use computational methods to look at the history of media technologies.

This is my vote for the best talk of the conference.

Lindsay Thomas: 4Humanities: Designing Digital Advocacy

Lindsay Thomas presented a paper, which I was part of, on the work of 4Humanities. The slides are at:

She talked about our survey of arguments for and against the humanities.

Arguments against include:

  • The humanities are useless because there are no skills taught and you don't get jobs
  • The humanities have wandered away from what they should teach (the canon). The humanities have gotten lost.

Arguments for support include:

  • Real world examples of success and statistics that answer the "useless" arguments
  • Arguments about the need for humanities for civil society and for the preservation of culture

Finally, there are arguments that are neither for nor against, including:

  • Discussions that bemoan the passing of the humanities
  • Discussions about

She talked about our AllOurIdeas poll, Humanities, Plain and Simple, #WhatEvery1Says and other initiatives:

  • Humanities, Plain and Simple solicits arguments for the humanities in plain language
  • #WhatEvery1Says gathers what people say about the humanities to analyze the discourse
  • Humanities Infographics has produced an infographic promoting the humanities

We are trying to help campaigns advocate for the humanities. She ended with things people can do.

One of the key issues is that we don't have a tradition of muscular advocacy in the humanities. We tend to expect it to be done by someone else because it is hard to be critical and an advocate at the same time. Compared to the advocacy done by the science and medical community we are naive and untrained. We don't lobby local politicians, we don't challenge critics, we don't even think it is something we should do (and we think those who do it are dirtying their hands.) We are part of our problem - of course it really isn't a problem for those of us already tenured (until universities start laying off tenured profs.) Advocacy is needed for our students to have choices and for future students to have access to the humanities at all.

4Humanities is trying to figure out how to deploy the digital to advocate.

At the talk we unveiled an infographic, The Humanities Matter!, that was developed by Melissa Terras and @4Hum. See:

James Smithies: ‘State of the Art’: Negotiating a National Standards-approved Digital Humanities Curriculum

James talked about digital humanities in New Zealand, the challenges of the Canterbury earthquakes, and new initiatives. They have started an archive like the CHNM 9/11-type archive. The initiative is at

Like many regions, they feel in New Zealand that they are behind. Another challenge is the number of levels of approval that have to take place to develop curriculum. He described features of the programme.

He is trying to strike a balance between "hack and yack". One course is more about methods and one is about critical approaches. He talked about layers of process and the challenges of starting new things. We are going to be influenced by internal pressures and external forces like accreditation processes.

Katina Rogers: From Anecdote to Data: Humanities Scholars Beyond the Tenure Track

Katina talked about scholar practitioners at the Scholars Lab at U of Virginia. The Lab is an exception to

We are at a moment where we need transparency around alternative academic scholars (alt-ac) and the alternatives to the tenure track.

The Scholarly Communication Institute launched a study on alt-ac careers and training. Most people entering grad school expected to become professors. The expectations are not aligned with the realities of the job market. Further, students report getting little to no career advice. Few get prepared or advised about alternative careers. And yet many universities have great resources on campus. Why not list PhDs on campus in alt-ac jobs so they could mentor students? Further, departments should track the outcomes of their programmes. The time is right to measure prestige in other ways.

Katina discussed what alt-ac is - does it have to be alternatives in the university or could people do alt-ac outside the university. She talked about why people take alt-ac jobs: location, to acquire new skills, salary, contribution to society, benefits, and family considerations. There is a need to find a job, especially when you have a lot of debt.

She then talked about competencies employers want:

  • Collaboration and interpersonal
  • Communication skills (oral, writing)
  • Research skills (grad students tend to undervalue this compared to employers)
  • Analytical skills
  • Project management (employers valued this, but employees also needed training in it)

These skills are also useful to people who pursue professorial careers. Why not teach these skills to all graduate students? They have a public database that we can add to at

The Praxis Network is a new initiative that is equipping students for forward-looking scholarship and meaningful humanities careers. See

She ended with things we can do.

Paul Fyfe: Counting Words with Henry James: Towards a Quantitative Hermeneutics

Paul talked about James "In the Cage" and counting words. He mentioned a conference Surface reading / Machine Reading.

We are moving beyond the hermeneutics of suspicion. I wonder if we are seeing a suspicion of suspicion.

He is inspired by Ramsay's screwmeneutics. How can we play around? He is inspired also by Drucker on the performative. What might be our aesthetic reaction to data? Let's think about the performative dimensions of counting and of counting words.

I came to the talk late, but he made interesting connections to numeracy. How does counting become a performance? Can one just enjoy the surface characteristics?

I need to talk to Paul and make sure I understand what seems like a very intriguing idea. Could we perform older ways of counting like what Mendenhall wrote about in 1887 in "The Characteristic Curves of Composition"? What if we did the stylistic analysis by hand? What if we did multivariate analysis by hand the way Bertin describes with blocks and rods?
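Mendenhall's "characteristic curve" is just the frequency distribution of word lengths in a text, so performing it (by hand or otherwise) starts from a simple tally. A minimal sketch, with a throwaway sentence of my own as the sample:

```python
from collections import Counter

def characteristic_curve(text):
    """Relative frequency of word lengths, after Mendenhall (1887)."""
    words = [w.strip(".,;:!?\"'()") for w in text.lower().split()]
    lengths = Counter(len(w) for w in words if w)
    total = sum(lengths.values())
    # Return relative frequencies keyed by word length, in ascending order
    return {n: lengths[n] / total for n in sorted(lengths)}

curve = characteristic_curve(
    "In the cage she watched the words go by, counting them one by one."
)
```

Plotting `curve` (length on the x-axis, frequency on the y-axis) gives the curve Mendenhall drew by hand.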

Ed Finn: The Science Fiction of Science

Ed has his slides and stuff available. Project Hieroglyph is a project at the Center for Science and the Imagination at ASU. The project is about changing the way people think about the future.

This project was started by Neal Stephenson, who challenged ASU to think big. Where are the big ideas? ASU challenged Stephenson back to help. The project explores radical science fiction ideas. Stephenson's idea is the Tall Tower. Cory Doctorow had an idea for Lunar Stereography where 3D printers are sent to the moon. The Drone Commons is Lee Konstantinou's idea for drones distributing the internet.

They use Commons in a Box to create a sense of community. They found they got a lot of spam.

Willard McCarty: Busa Award Lecture

Willard started by reminding us of Lisa Lena Opas-Hänninen who passed away this year.

Willard's talk was a combination of retrospection and new research. His PhD was on John Milton and the plan was to become an English professor. He spent 11 years in academic limbo and fell into humanities computing. He became captivated by Ovid's Metamorphoses. He turned to computers to help him. He created his own markup using COCOA. He finally abandoned the project when he realized that markup was wrong for the job.

Willard then shifted to talk about Humanist. He threw himself into Humanist without thinking it would help him, and years later it helped him get a job at King's. He did it only out of love. He didn't walk a career path, but followed the smell of food on the wind.

Willard wonders, "why me?" He hasn't written code, he isn't on big projects, he doesn't get grants, and he teaches face-to-face. He has argued for failure. He rediscovered the truth of the hacker that modelling is the way to make progress. Modelling however, was too pat an answer.

He therefore turned to history drawing on Ian Hacking and Michel Foucault.

He wanted to call attention to the otherness of the computer. He wanted to recognize and draw attention to the uncanny otherness. It is not to be overcome.

We are surrounded by subtle and mature disciplines of inquiry. When will we be able to write with the power of philosophy? We may be smart and popular, but this is not enough. We need more than all the stuff. We need resonance with the intellectual traditions of the arts and humanities. Willard also believes we need the techno-sciences too. He encourages us to pay attention to media arts and how artists have worked with science and technology. Where are we?

The raw material is abundantly to hand - what do we do with it? Where will we be with digital scholarship in 20 years?

The point is the struggle with the limitations and otherness. We need to be scathed - changed by computing. He talked about the singularity as if he thinks we will get conversational computers. He reminds us how we co-evolve with tools - we are changed by what we invent. The pace of development is fast and traffic between invention and self-conception is dense. What can we do in the digital humanities?

Then he returned to history. History helps us make choices. He chose the "incunabular" period from the 1940s to the invention of the web. In that period there were tensions between the desire for the computer to save us work and the realization that it wouldn't save time, but create more problems.

There is mounting evidence that literature is probabilistic. Because of a fear of the mathematical our colleagues stay away. This fear tells us about our relationship to computers. People fear that possible success would dehumanize the humanities. Fear of being put out of work.

He talked about how people encountered computers. First, the computer as oracle - superhuman intelligence. Second, the computer as agent of job loss. The computer was defined by the cold war. We feared the computer causing nuclear war by mistake. The cold war gives us good reason to be fearful of computers. Humanists turned from computing to critical theory because of fear of the cold war. Willard, however, feels that students wanted theory and the mechanizers had nothing for them.

The web, coming when the cold war ended, offered a way out, but may have buried it. But the old problems are still there. We need to ask where the criticism is in the digital humanities.

Willard talked about a theoretical poverty that gets announced again and again. The amassing of data doesn't help us with the fundamental problems. Willard feels that we are not facing some basic question, but I don't get the question.

Willard talked about the expulsion from Eden. We live in a hostile universe that doesn't care about us. There is a mounting attack of ourselves as scientists on ourselves as humanists. What is left of us? Willard is fascinated by the ephemeral left when formalization fails. Is there a history of this fascination? I think there is a tradition in philosophy that often leads to calls for silence in the face of the failure to ever capture the truth.

We are involved and attacked. The digital machine has shifted the locus of engagement. Willard suggested that we should remain in the uncanny valley that our plus sign denotes (the + between computers + humanities.) Our structure is the crossroads.

What can we do with computers? Willard seems to think that AI and neurocomputation will surprise us.

It is the struggle, not the product, that counts.

Thursday, July 18th

In the morning I went to a session on publishing.

Fred Moody: Introducing Anvil Academic: Developing Publishing Models for the Digital Humanities

Moody presented a new imprint, Anvil Academic, that can provide publishing support.

He talked about the tension between digital scholarship and analog metrics. Young scholars doing innovative works are being judged by old standards. Some of the things they are trying to publish include:

  • Data-driven projects
  • Multi-modal titles
  • Networked authorship projects
  • Educational innovations

What is missing in the digital space is the support of a publisher:

  • Peer review
  • Editorial services
  • Distribution
  • Impact metrics
  • Imprimatur
  • Cataloging and Preservation

He talked about the criteria for selection: its scholarly contribution, rationale for being digital, contribution to the art, and balance.

They are now in a proof-of-concept phase, trying to prove the worth of the idea. They aren't doing hosting and they are doing preservation through

Anvil Academic has a number of supporters, but will they provide long term funding?

Christopher Long: eBook as Ecosystem of Digital Scholarship

Long took us back to Socratic ways of doing politics (as told to us by Plato.) Plato presents an "erotic" ideal of Socrates performing. Long is trying to likewise perform the dialogue, but digitally. He wanted to test new ideas in public, but without the rigidity of written word. Socrates decided not to write, so Long also decided not to write - instead he started a podcast. See

He then shifted to reflecting on Platonic writing as likewise a public and erotic engagement. Long quoted the "Phaedrus" to the effect that we need to bring writing to life. He talked about the dialogical in writing where, as Latour says, "The reader writes the text."

Could Long create a public sphere that performs the politics of collaborative reading and lives up to the ideas he wants to put forward about the public sphere?

He showed various forms of annotation tools that let some form of dialogue take place.

I asked Long whether he had any problems with the Socratic model of dialogue. He had a great answer about idealizing Socrates and funding and work.

Julianne Nyhan: Joint and multi-authored publication patterns in the Digital Humanities

Nyhan talked about how this comes out of her work on an oral history of the digital humanities. She wanted to see how publication patterns might show collaboration and evolution. They extracted bibliographic data from Computers and the Humanities and Literary and Linguistic Computing. They then looked at frequencies and interrelationships. In CHum single authorship stays stable and joint authorship goes up. In LLC single authorship drops and triple authorship goes up.

She then talked about gender analysis. They found similar proportions of men and women collaborated in LLC; in CHum there is a marked bias towards women being collaborative.

It would be interesting to look at conference proceedings.
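The underlying analysis is mostly tallying author counts per journal. A toy sketch over invented bibliographic records (not the actual CHum/LLC data):

```python
from collections import Counter

# Hypothetical records: (journal, year, number of authors)
records = [
    ("CHum", 1990, 1), ("CHum", 1990, 2), ("LLC", 1990, 1),
    ("LLC", 2000, 3), ("LLC", 2000, 1), ("CHum", 2000, 2),
]

def authorship_pattern(records, journal):
    """Tally single- vs. multi-author papers for one journal."""
    counts = Counter(
        "single" if n == 1 else "joint"
        for j, _, n in records if j == journal
    )
    return dict(counts)

print(authorship_pattern(records, "LLC"))  # prints {'single': 2, 'joint': 1}
```

Grouping by year as well (rather than over the whole run) would give the trend lines the paper reports.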

Hamed Alhoori: Identifying the Real-time impact of the Digital Humanities using Social Media Measures

Alhoori is looking at how one can use social media to get an early indication of research work worth following. Some people use citations, but concerns have been raised. Other approaches look at comments, bookmarks, downloads and so on. Bollen et al. concluded that scholarly impact is multi-dimensional.

Can reference management software do more than just handle references? Could social reference management systems predict a ranking and then be woven into software like Zotero? They used Mendeley.

He talked about altmetrics, which are based on numbers of tweets, Facebook likes and so on. This metric can be different than citation metrics.
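Altmetric-style scores of this sort are, at their crudest, weighted sums of social signals. A purely illustrative sketch with invented weights (no real provider uses these numbers):

```python
# Invented weights for illustration only; real altmetrics providers
# use their own (and more refined) formulas.
WEIGHTS = {"tweets": 1.0, "likes": 0.5, "bookmarks": 2.0, "downloads": 0.1}

def social_score(signals):
    """Weighted sum of whatever social signals we have for a paper."""
    return sum(WEIGHTS.get(k, 0.0) * v for k, v in signals.items())

social_score({"tweets": 12, "likes": 30, "downloads": 200})  # = 47.0
```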

He talked about how the humanities are different.

Martin Holmes and Janelle Jenstad: Encoding historical dates correctly: is it practical, and is it worth it?

Martin started by talking about the Glorious Revolution and the problem of dates when some countries are on the Julian calendar and others on other calendars. Before the general adoption of the Gregorian calendar we have dating chaos. When they asked colleagues they didn't find any solutions and yet we need interoperability in eventology.

Their project is the Map of Early Modern London that has 4,105 dates encoded so far. The ISO dating standard is meaningless as it presupposes the Gregorian calendar. They want dates to be computable, translatable and so on.

Their solution is to use the capability of the TEI and use the att.datable.custom class of attributes. They can markup the dating method they use and what calendar is used in the text (which could be different.)
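While the markup records which calendar a date uses, the conversion itself is standard: a Julian calendar date can be mapped to its Gregorian equivalent through the Julian Day Number. A sketch using the usual integer arithmetic (this is general-purpose, not code from the MoEML project):

```python
def julian_to_jdn(y, m, d):
    """Julian Day Number of a Julian-calendar date."""
    a = (14 - m) // 12
    y2 = y + 4800 - a
    m2 = m + 12 * a - 3
    return d + (153 * m2 + 2) // 5 + 365 * y2 + y2 // 4 - 32083

def jdn_to_gregorian(jdn):
    """Gregorian (year, month, day) for a Julian Day Number."""
    a = jdn + 32044
    b = (4 * a + 3) // 146097
    c = a - 146097 * b // 4
    d = (4 * c + 3) // 1461
    e = c - 1461 * d // 4
    m = (5 * e + 2) // 153
    day = e - (153 * m + 2) // 5 + 1
    month = m + 3 - 12 * (m // 10)
    year = 100 * b + d - 4800 + m // 10
    return (year, month, day)

# William of Orange landed on 5 November 1688 (Julian),
# which is 15 November 1688 in the Gregorian calendar.
jdn_to_gregorian(julian_to_jdn(1688, 11, 5))  # → (1688, 11, 15)
```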

Janelle nicely recognized the research assistants at the end.

Takahashi, Mito; Tezuka, Kana; Yano, Tamaki: Identifying the author of the Noh play by considering a rhythmic structure - Validating the application of multivariate analysis

A team presented on Noh. There are currently 250 Noh plays in the repertoire. The main actor is called the "Shite" player. There is a rhythm to Noh plays that differs across authors, so they are trying to use it for identifying authors.

Uesaka, Ayaka and Murakami, Masakatsu: Authorship problem of Japanese early modern literatures in Seventeenth Century

Ihara Saikaku was the most popular writer of 17th-century Japan. The problem is that we don't know exactly which novels are his. Recently computer processing of Japanese texts has become possible. One of the problems is tokenization of words. A second problem is the changing meaning of Chinese ideographs.

After Saikaku died his student edited his manuscripts and published them. They can compare disputed novels to known examples of Saikaku's work.

Deborah Anderson: Representing Texts Electronically in Lesser-used Languages: Current Issues and Challenges in Character Encoding

Anderson talked about the challenges of introducing new scripts into standards like Unicode. The process is much longer than people expect. It is more effective to work with a veteran, but that can lead to suspicion. I can imagine that this would be a real issue in First Nations communities. Anderson talked about Nepalese scripts.

Another problem is the name of the script like Tibetan or Old Turkic. There can be differences in how groups want to name their script. "Old Hungarian" is not acceptable to some people, despite being the normal English name (which itself can be a problem.)

There are membership fees if one wants to join a Unicode committee to understand the process of approval. Even when a script is approved, there is other stuff needed like fonts, locale data and so on.

Anderson concluded with some recommendations about including people into processes.

ADHO and centerNet General Meeting

At lunch we had a general meeting of ADHO and centerNet. Ryan Cordell talked about DH Commons that is becoming an official publication of centerNet. DHCommons will become a place for peer review of projects. Alex Gil talked about GO::DH and its initiatives to support global digital humanities. One such initiative is Around the World in 80 days - a project that will highlight a different project in the world every day for 80 days.

At the end we had the SIG slam.

Ian Johnson: From database to mobile app: scholar-led development of the Heurist platform

Ian described his very flexible database environment, Heurist. Heurist has timelines, maps and above all is fairly easy to use. The philosophy of making the interface easy goes against the grain for the developers who want to use Heurist to make web databases for others (and therefore don't want it to be too easy to use.)

Heurist Mobile works on top of Heurist to provide tours, stops, and connections based on your database, if it has that information. That strikes me as a neat way to make augmented reality tours from scholarly data.

Ian discussed some of the ways projects (or some projects) can lose track of their scholars. He sees three types of development:

  • Engineering-driven development
  • Scholar-Analyst + Tech Team
  • Scholar Programmers

Quinn Dombrowski: What ever happened to Project Bamboo?

Quinn started by talking about the history of Bamboo and the desire by Chris Mackie at Mellon to fund less technology and more scholarly agendas. They wanted to see less reinventing of wheels. They wanted to advance arts and humanities research through the development of shared technology services. The leadership came out of IT, not the humanities, which is one reason they ran a community design process. Even from the beginning they got a certain amount of criticism.

The design process brought together 650 participants from 114 organizations. Notably there was representation from small liberal arts institutions. Participants wanted to advance arts and humanities research (not so much shared technology services.) They were soon drowning in data and they thought they could categorize everything and come up with the critical needs. Alas, that didn't work - the attendees at workshop 2 felt that central IT had "decontextualized scholarly practice."

One thing that came out was the Bamboo Commons - a collection of different collections of information from recipes to services. This didn't quite fit what they had intended to make (a bunch of services in the cloud.)

The grand plan was to then ask for millions of dollars to build the wonderful beautiful thing. Then came the global financial crisis, which changed things. In addition the programme under Chris Mackie (and Ira Fuchs) disappeared. Quinn called for a critical study of funders!

Anyway, they re-scoped and tried to rebrand. Many were then left out when the project shifted to trying to build things. The big picture was to build a services platform, collections, tool directories and all sorts of stuff. Lots of universities were now involved.

In December of 2012 Project Bamboo went to the morgue. They have gathered stuff and tried to bury it gracefully.

Some things are still living on, including Perseus, which uses some things built in Bamboo. The cypher initiative is looking at some demonstration work. DiRT is continuing.

Lessons learned:

  • If you're Bamboo, who's your panda?
  • A demo of solving a real need is worth a thousand words
  • Beware of over-engineering
  • Euclid says that parallel tracks of work won't ever meet
  • Odds are that nothing you communicate will come off worse than communicating nothing

She quoted Brett Bobley's theory of Bamboo - that it was a way of bringing people together.

There is a reliquary of sorts at

Julia Flanders and Fotis Jannidis: A concept of data modeling for the humanities

Data modeling is one of the main tasks of digital humanists and it is important to our self-conception. Alas, there is almost no research on data modeling.

Do we need a theory of data modeling in the humanities? Is it one thing?

In computer science data modeling is based on the modeling of databases. It is a collection of conceptual tools for understanding data and relationships. A data model can also be seen as developing an abstract description. One definition is conceptual; the other is logical.

This got too dense for me to take notes ... sorry.

In CS data modeling is seen as capturing something in the real world. In the humanities we are not necessarily capturing objective reality. We are often modeling interpretations. This seems to me to be similar to discussions about interpretation and whether interpretations can be true.

There is a history of data modeling in the humanities. There is a tradition from which we draw, and the intellectual work of digital humanists in data modeling builds on it.

Humanists often concentrate on the individuality while computer scientists look for generalities. Digital humanists feel the pull in both directions!

I asked a question to try to see the connection with hermeneutics. It seems to me that data modeling is a form of interpretation - or is it something else? Wendell suggested that we are interpreting, but for a machine (which is then for a human.)

Steve Ramsay asked about UML and seemed to suggest that, as in agile programming, we can communicate by making something and saying, "does this work?" Fotis responded that people are using partly conceptual models all the time. We seem to need the semi-formalized models even if they don't work.

Isabel Galina: Is There Anybody Out There? Building a Global DH Community

Isabel Galina is from UNAM in Mexico and helped found RedHD - the Red de Humanidades Digitales (Network of Digital Humanities) in Mexico. She told the story of the journey. There is a consensus that there has been a boom in the digital humanities. DH has established itself, but defining DH is hard. Behind the problem of defining DH is the question of who "we" are. The community has worked hard at building and traditionally viewed itself as welcoming and open. Collaboration and openness have been seen as virtues of the digital humanities. And yet, the community has become aware that it is not as open as it thought it was. This openness has become a topic of discussion. The energy of DH, if applied to inclusiveness, would make it an example of change.

There is an argument that DH has concentrated on building and not on cultural critique. We are undertheorized and underimplemented. In RedHD they are looking at how resources are deployed and discussed. She referenced Fiormonte's argument that many out there are hidden in the official history of DH.

UNAM has 300,000 students and 45,000 academic staff. She reported on some of the research carried out at UNAM including digitization projects. She only discovered the digital humanities while in the UK. I think this is a common issue - people are doing what we call digital humanities, but call it other things. She is at the Instituto de Investigaciones Bibliográficas. RedHD set out to:

  • Raise awareness of DH
  • Identify key scholars and projects
  • Investigate key local issues

How could they do this? They have run scoping activities to find out the issues, starting from institutional recognition. Some of the comments they got:

  • "There are other people like me"
  • Vague support from administration
  • Lack of coherent policies or structures to have a real impact

There are lots of small enthusiastic projects, but nothing to help them grow. She hasn't, however, had problems getting funding. Neither is there really a lack of technological resources. Finding and training human resources is an issue - there is little learning support.

Another issue is sustainability from server hosting to preservation. Many were paying for server hosting from their own money because that was easier.

They also noticed an absence of a library community in the digital humanities.

She talked about some of the actions:

  • They are working on guidelines
  • Regular meetings
  • They organized a special issue of a journal
  • They are starting courses

They need to start larger and more complex projects. She mentioned ADHO projects like GO::DH, which is not an aid or outreach programme. It would be interesting to see how alternative discussion lists are different than Humanist. A lot of work is about finding people - I'm struck by the early efforts in Canada to build up lists of researchers. She talked about a Day of DH in Spanish and Portuguese. (Yay!!!) Many of the blogs were projects, not individuals.

So what? Or, now what?

She talked about life without reliable technology and adequate infrastructure. And yet lack of technology can help them think about technology. They are not tempted to play at the expense of thinking. The minimal computing project is part of this.

Language is at the forefront of the challenges out there. It is time consuming to participate in an English conversation if it isn't your first language. A few rules that make life easier for non-English speakers:

  • Avoid acronyms
  • Avoid colloquialisms
  • Provide extra context
  • Use slides with lots of text for people to read

She also argued that we must be realistic, and she believes there is good will in the digital humanities. She described initiatives that might help. Now that we want to be more inclusive it becomes important to think about how to do this. Peripheral countries can lead on inclusiveness, but the center needs to move out. This can't be left to the periphery.




Page last modified on July 19, 2013, at 03:36 PM