Main »

CSDH 2019

These are my notes on the CSDH-SCHN and CGSA conferences that are part of the Congress of the Humanities and Social Sciences.

As before, I have to warn readers that these notes are not an accurate record of the conferences, or for that matter, of my understanding of them. I have a tendency to get distracted, to skip over important things, and to not take notes when the power runs out.

If you want more complete notes written by many hands see the Google Drive area:

Day 1, Sunday, June 2nd

Deb Verhoeven: Making Sense of the Unfathomable: Digital Humanities in Desperate Times.

Dr. Verhoeven started by talking about how we can speak data to power.

She talked about the data and her Kinomatics project. Big data is a collection so large that it exceeds us in some way. It has an existential component. It forces you to think about something you don't know - what exceeds your knowledge. Big data implicates us and has epistemic implications. We are forced to lean into machines to deal with it.

She then talked about her Gender Offender conference. She talked about the Turner painting A Disaster at Sea and how it shows us without a perspective. When we walk away from the painting are we walking

She then talked about Robert FitzRoy who started gathering data about weather to assist captains with forecasts. His idea was to prevent shipwrecks. She also talked about Samuel Plimsoll who gathered data about ships to prevent the coffin ships. His data led to the "plimsoll line."

Dr. Verhoeven then shifted to ask if we can use data to forecast for patriarchy and to change things. Why is patriarchy so successful? She talked about the confronting data about women's participation in the Australian film industry. All the attempts to change things put the onus on women, those without any power. So she looked at what she could learn from the data to change things and found that it had to do with men not hiring and working with women.

She showed network graphs of who collaborates with whom and the gender offenders who work only with men. She mentioned how in criminal network research such studies show who are the people to "take out" to change things.

She then shifted to how we understand power in terms of positioning. She talked about how some strategies (like taking men out, or just adding women) would not work. She showed data about screening and how there are multiple forms of gatekeeping.

She then moved to a new theory of power. She argued that gender isn't enough. It is complex and co-constituted. She found a strong element of class - private boys schools. We need a theory of power that is intersectional. She argued that we have one and it is called Kyriarchy - "a tenacious social system that keeps all intersecting states in relations of domination rather than co-existence." This comes from Elisabeth Schussler Fiorenza, Wisdom Ways.

She then introduced WIDGET, her new tool for gender inclusion. She showed a video of it. It is coming soon and lets you create simulations of how to change things.

She closed by talking about how one can both understand the hopelessness of things while trying to change things.

Susan Brown: To Reify or not to Reify: A Linked Open Data Question

Dr. Brown talked about linked open data and ontologies. She talked about how with RDF triples one can start drawing inferences or querying the triples. The problem is that the form is confining. Triples float out there without any context when in fact we want to say things about the assertions themselves.

Reification is going beyond the binary relation between two things in a triple. She gave an example of LEL's cause of death which is uncertain. Brown showed how reification can work, but also how it makes things much more complicated. You get 4 times as many triples. The specificity of subject, object and predicate gets buried which then makes machine processing harder.
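The inflation is easy to see if you sketch reification in code. This is a minimal illustration using plain Python tuples; the statement id ("stmt1") and the cause-of-death value are made up for the example, not taken from the project's actual data.

```python
# A minimal sketch of RDF reification using plain Python tuples.
# The statement id ("stmt1") and the cause-of-death value are
# illustrative, not taken from the project's actual data.

def reify(subject, predicate, obj, statement_id):
    """Turn one triple into four triples that describe the statement
    itself, so further assertions can point at statement_id."""
    return [
        (statement_id, "rdf:type", "rdf:Statement"),
        (statement_id, "rdf:subject", subject),
        (statement_id, "rdf:predicate", predicate),
        (statement_id, "rdf:object", obj),
    ]

# The original assertion, whose certainty we want to qualify.
triple = ("LEL", "cause_of_death", "prussic_acid")
reified = reify(*triple, statement_id="stmt1")

# Now we can say something about the assertion rather than the person.
annotation = ("stmt1", "certainty", "uncertain")

print(len(reified))  # one triple has become four
```

This is exactly the trade-off Brown described: the uncertainty is now expressible, but the original subject, predicate and object are buried one level down, behind the statement node.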

Alternatives to reification include the Web Annotation Data Model which allows one to talk about annotations of texts. This has the same problem.

They are abandoning the subject-oriented triple for a description model. This model avoids reifying but means translating their data.

Geoffrey Rockwell: Exploring through Markup: Recovering Cocoa

I presented a paper about a project recovering the COCOA markup language through creating a tool that replicates the way one might work with it.

The projector turned off and the screen rolled up right when I was about to present the MicroOCP tool in Voyant. You can see the slides here.

ACCUTE Panel: “Where do Interdisciplinary Researchers Fit?”

Chair: Lai-Tze Fan

  • Aimée Morrison, University of Waterloo
  • Lourdes Arciniega, St. Mary's University
  • Leif Schenstead-Harris, Independent Scholar
  • Alison Hedley, McGill University
  • Maya Hey, Concordia University
  • Jason Lajoie, University of Waterloo

Some of the remarks included:

  • An interD researcher fits anywhere they can establish a generative conversation
  • Choose institutions before they choose you. Be suspicious of well-resourced academics who boost interdisciplinarity.
  • Think about working on policy for the government. Research can be a public good.
  • Is interdisciplinary research a luxury or a technocratic "solution" superimposed across the disciplines being cut?
  • The academy is not the only (or best) home for interdisciplinary work.
  • It is about the question and trying to answer it rather than stick to a discipline.
  • Junior people often run into trouble with gatekeepers. It is ironic that the worst disciplinary gatekeeping happens at the ground level, not higher up.
  • How do you convince students that an interdisciplinary programme is a better lens than a discipline? Labels like "liberal studies" don't explain much.
  • Being able to translate research into separate specific disciplinary languages is a skill one has to learn.
  • In interdisciplinary work it can be impossible to do a complete literature review. The research can appear diluted.
  • Often one is asked to demonstrate departmental allegiance in different forms.
  • Interdisciplinarity often fits better outside the academy than within it.
  • Why do we keep on returning to the challenges of interdisciplinarity? It seems like we keep on cycling around this.
  • The general public isn't interested in our endless discussions about interdisciplinarity.
  • What do we lose when we do interdisciplinarity?

There were some great suggestions about how to survive:

  • Get a committee that includes people who can help you figure out how to fit in each of the intersecting disciplines
  • Browse online syllabi and read through them - that will give you a sense of what are the key texts in a sub field
  • Take a course in the different fields to connect with faculty from the fields
  • There have been debates about interdisciplinarity for a while. There is a history of such debates. Learn it.

Day 2, Monday, June 3rd

Margaret Linley: Explorations in Machine Reading and Decolonial Methods

Dr. Linley started by talking about how colonial the digital humanities can be. She mentioned Jockers talking about big data and Alan Liu's (and others') response. The problem is whether we can apply machine techniques without replicating their Western assumptions.

She talked about a project trying to use text analysis to study travel writing, a genre that was used to access unknown places. Travel writing was fundamental to imperialism. Such writing organizes and systematizes knowledge about places in colonial ways. This project uses natural language techniques to identify and connect figures that occupy the landscape but are treated as marginal by the dominant narrative. Such approaches are crucial to decolonizing our digital reading.

The project builds on SFU's archive of travel writing. She reflected on reading from Burnaby mountain. She reflected on habits of spatial organization and representation that show up in writings. Parts of British Columbia can get structured as if they were the lake district in the UK that poets wrote about.

There are dangers using methods that extract named entities. These entities are not abstractions but can be real people, real places. What knowledge is assumed in the analytical processes? What knowledge is used to extract relationships?

They chunk sentences into triples without having to parse sentences completely. The chunks are useful.

They found that the entity extraction didn't work that well as it was trained on a different genre. They are interested in non-named entities like the shepherds or loggers. She is less interested in the named people, but wants to connect the marginal people to the named people. They now have to train for unnamed entities like "sons of the forest." They want to get triples like "Mr. Gray - listened to - the shepherd".
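A toy sketch of what pulling such a triple out of a sentence might look like; the real project uses NLP chunking over a whole corpus, and the hand-written pattern below is purely illustrative.

```python
import re

# A toy, purely illustrative pattern for pulling subject-verb-object
# chunks out of a sentence. The real project uses NLP chunking, not
# a hand-written regex like this.
PATTERN = re.compile(
    r"(Mr\. \w+|the \w+) (listened to|spoke with|met) (Mr\. \w+|the \w+)"
)

sentence = "Mr. Gray listened to the shepherd by the lake."
match = PATTERN.search(sentence)
triple = (match.group(1), match.group(2), match.group(3)) if match else None
print(triple)  # → ('Mr. Gray', 'listened to', 'the shepherd')
```

The point of the sketch is that the unnamed entity ("the shepherd") is captured on the same footing as the named one, which is what connecting marginal figures to named people requires.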

Kaitlyn Grant: Affective Collections: An Exploration of Methodologies in Digital, Community-based Research

Grant is asking, How does a feminist ethics of care help us understand how to better support community archives? Community archives are created by and controlled (to some extent) by the communities of origin. Care practices reinforce relationships that better people's lives.

Archives shape communities and the stories told about communities. Community archives change the relationship between the archive and the community. Participation, shared stewardship, activism are put ahead of traditional archival values.

"Digital storytelling emerged in the 1980s as a form of oral history used by arts practitioners committed to the democratization of culture..." Podcasts, blogs, and fan fiction are examples. Such storytelling change formats rapidly which can make it harder to gather community archives. One needs participatory methods.

The methods in the field include oral history, discourse analysis, ethnographic case studies and digital methods for gathering online materials. The methods change depending on the community and their needs. Case studies are therefore important.

She then asked how care practices would factor into all this. Care practices move away from generalizations about behaviour to looking at cases and how care would help in each situation. The approach looks at the webs of relationships (including those of the archivist). The researcher develops radical empathy. There is an emphasis on face-to-face relationships, but how do we scale up to deal with larger distributed networks? How can a project like TAPoR involve its users and care for them?

She talked about how care practices could intersect with the digital humanities mentioning Bethany Nowviskie "On Capacity and Care."

She closed by talking about "affective collections" and how there are emotions around documents. Community members often have strong feelings about the records. An Affective Collection takes care of this.

Emese Ilyefalvi: Looking through the Hungarian verbal charm corpus

Ilyefalvi started by talking about three types of folklore databases. The most popular is the genre database, which has the same problems as genre books. They are pragmatic, as genre, despite its problems, has uses. In her case they had to ask: what do they consider a verbal charm? There is a concern that it is hard to decide what is a charm and what is a prayer. Generally a verbal charm is uttered and intended to change the world, like trying to change the weather.

She gave examples that sometimes include ritual actions. The textualization of charms can influence the structure of collections. Are they organized by collector, function, rites and gestures? How does one include the physical actions/rites that may accompany the verbal charm?

She then talked about how a database should support many research perspectives. In her project they started with two printed collections. They found incantations in written collections (especially before the availability of recording equipment.) She showed a visualization of the distribution of charms showing different forms of collection.

Ilyefalvi then talked about the metadata gathered and how you can search by any field. In the text they annotated the different speech acts. They have published 1700 charms of 6000 gathered. She then showed how she used Voyant to study the text of the complete dataset. She found that the word "saint" and the word "eyes" had opposite diachronic patterns. Eyes don't show up before 1851. Saint disappears over time. Eyes connect with colours and often have to do with the "evil eye". Charms are designed to help to avert the harm of the evil eye.
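The kind of diachronic profile that a Voyant trends view shows can be sketched in a few lines. The miniature corpus of (year, text) records below is invented for illustration, not drawn from the actual charm database.

```python
from collections import Counter

# A hypothetical miniature corpus of (year, text) records standing in
# for the charm database; the texts are invented for illustration.
charms = [
    (1840, "holy saint protect us"),
    (1870, "blue eyes evil eyes turn away"),
    (1890, "saint help the eyes"),
    (1960, "barley barley sty be gone"),
]

def decade_counts(records, term):
    """Count occurrences of a term per decade - the kind of diachronic
    profile a trends view gives you."""
    counts = Counter()
    for year, text in records:
        counts[(year // 10) * 10] += text.split().count(term)
    return dict(counts)

print(decade_counts(charms, "eyes"))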

She then talked about how "barley" or "sty" become outliers. The same word means both. The two are connected because the way you cure a sty is a barley-type gesture. The word is most frequent in the 1960s. What happened in the 1960s? The phenomenon could be due to uneven collection practices. During the 1960s a particular survey (the atlas of Hungarian folklore) included a separate question about styes as a sample health question.

She concluded by asking about the interests of the collectors and how they influence the content of the collection. Knowledge is always situated.

Holly H. Pickering: Designing for Sustainability: Maintaining TAPoR and

Pickering started by talking about how we love to start projects in the digital humanities, but we don't think about sustainability or gracefully closing down projects. The paper focused on the projects of TAPoR and , two projects I'm involved in. The context of the paper was the merger of the DiRT directory into TAPoR. We had to renovate TAPoR to absorb DiRT. After the merger we found all sorts of problems:

  • There were empty tool entries
  • The metadata was inconsistent
  • The entries were uneven, with lots of entries for some kinds of tools and few for others
  • Spambots like to add tools

We used SQL queries to catch tools that haven't been updated or tools that have no description.
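A sketch of the sort of maintenance query involved. The tools table schema and the cutoff date below are hypothetical; I don't have the actual TAPoR database structure in my notes.

```python
import sqlite3

# A sketch of the kind of maintenance query described. The tools
# schema and the cutoff date are hypothetical; the actual TAPoR
# database structure may differ.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tools (name TEXT, description TEXT, updated TEXT)")
con.executemany(
    "INSERT INTO tools VALUES (?, ?, ?)",
    [
        ("Voyant", "Web-based text analysis", "2019-01-15"),
        ("OldTool", "", "2012-06-01"),
        ("GhostEntry", None, "2014-03-20"),
    ],
)

# Flag entries with no description, or not updated since the cutoff.
stale = con.execute(
    """SELECT name FROM tools
       WHERE description IS NULL OR description = ''
          OR updated < '2015-01-01'"""
).fetchall()

print(stale)  # → [('OldTool',), ('GhostEntry',)]
```

Queries like these turn curation from a manual browse into a repeatable check that research assistants can run each cycle.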

Then she talked about how we also need to add new tools, finding that which we don't know. Where do we find new tools especially since the lists out there have the same problems of not being maintained. We use spreadsheets to keep track of lists and places to check. We have Google alerts.

Pickering then shifted to talking about - a site with brief recipes for doing things. It is a companion to TAPoR, like a cookbook that helps people figure out how to achieve something. She then talked about how we are both gathering and creating recipes and tutorials. These recipes need to be checked as links can break or tools disappear. She then talked about our GitHub repository of tools.

Finally she talked about project management. Nothing gets maintained without some level of conscious management. She talked about a model of management that uses research assistants over the year in a cycle of activities. She also talked about the death of projects and planning for such deaths.

The maintenance of born digital research isn't really a task that is taken on at the university or recognized as scholarly work. Books are maintained by libraries - who will take care of digital projects?

At the end we had a spirited discussion about depositing projects and I mentioned the paper I and others wrote on Burying Dead Projects that dealt with the ending process.

Lena Krause: Visualizing and Editorializing a historical database

Krause started with the difference between a search and a research tool. Search is finding a needle in a haystack and research is studying haystacks. Searching is easy, research is harder. Different approaches produce different data which then produces different research.

She is creating a research database CONBAVIL that is about civil building projects examined by Paris city council from 1795 to 1840. You can understand what the city was trying to build by examining the database. The current form of the database is, however, limited in the types of questions you can ask of it. Databases can become obsolete.

Krause wants to recycle databases rather than build new ones. She is trying to prototype how the imagined research could be fostered using the database. She wants to reshape the database and add visualization to make the haystack visible. Visualization is an affordance that lets one spatialize and temporalize the data. She imagines an atlas and showed a demo. She started with a map of France showing where the building plans came from. She showed variant maps with underlying population data. She had a "Destroy Paris" button.

She then showed a timeline that was interactive. But again, the underlying data wasn't linked, which is needed so you can figure out what the actual records are.

Lastly she showed a sunburst that was interactive and really neat. If you clicked on a category (pie slice) of something like schools then you see the types of schools in a ring.

Kaarina Mikalson: Listen In! Podcasting as a Scholarly Tool for Public Engagement

Mikalson argued that podcasting is a valuable form of public outreach. Their project is Canada and the Spanish Civil War. She gave some background on the Canadian participation in the Civil War. The project is a web site with a digital repository, database of those Canadians who participated, teaching modules, and links to the podcasts. The war had a major impact on Canadian artists/novelists.

The scholarly podcast is called Listen In. They interview people who bring expertise to the issue. Episodes are introduced with material on the use of radio during the war. The Spanish Civil War was the first radio war. It was one of the first where there were live broadcasts from the front. Both sides had lots of propaganda broadcasts. They used volunteers to address fellow countrymen back in their home countries.

Listen In is also the title of a pamphlet that transcribed broadcasts. There were a lot of people listening in in Canada. Radio was the way to engage a mass audience in real time.

Podcasting is, by contrast, relatively cheap. Our scholarly print output is hard for people to access. Mikalson talked about how Hannah MacGregor has been showing the potential with Secret Feminist Agenda. MacGregor has developed a way to have their podcast peer reviewed.

The Spanish Civil War project engaged a lot of community researchers so podcasting is a good way to bring in these researchers. The podcast as a genre is more dialogical and enables them to tell stories about their research. They try to be rigorous, but can be more conversational and make links to contemporary issues that are tangential. They make the research relevant.

Mikalson talked about the skills needed and the problem of failure. It may be cheap to podcast, but it is hard to do well. Editing audio is time-consuming and involves many decisions. It takes time to learn to do it well.

In DH we talk about the importance of failure. Their failures include that they don't get a lot of listeners. It is hard to interview guests at a distance. It is time-consuming. A consistent editorial rationale is hard to maintain.

Panel on The SpokenWeb Project: Documenting Humanities-Oriented Spoken Collections

Jason Camlot started us off by talking about the SSHRC-funded SpokenWeb partnership project. He talked about the reel-to-reel tapes documenting the Sir George Williams Poetry Series (1965-1974). He started just by creating a guide to the collection and then realized it could go online. It is now at the SpokenWeb site. He then surveyed what else is out there and found lots. So now he (we) have scaled up the project. The goal of the project is to digitize and make accessible spoken poetry collections along with related materials.

Some of the aims include:

  • inventory and index existing collections
  • making collections usable for teaching and other uses
  • study literary collections in new ways through sound - audiotextual criticism
  • explore computational methods for analyzing literary sound
  • create online presentations

The collections mostly run from 1960 to 2000. The whole idea of what literary content is and how to "read" the text changes with audiotexts. When you listen you are often listening to a social event with more than the poet.

What is digital about the project? As we are digitizing the materials we have a lot of digital working groups looking at metadata, rights management, digitization itself, signal analysis, and so on. We are adding contextual information through, for example, oral history. We have a podcasting task force, a pedagogy task force and a community collections task force.

He ended by theorizing about circulation. These tapes circulated as "mix tapes" back in the day.

Michael O'Driscoll spoke next about SpokenWest: The UAlberta Collection. He highlighted some of the key challenges tied to the way audio is time-dependent. Some of what we have has to be fully inventoried and thought about. The management of the audio objects opens questions as we shift to the digital. There is an echo effect in remediation of reel to reel. Archiving audio changes things.

The U of Alberta collection is built on the longest-running writer-in-residence programme in Canada, which goes back to the early 1960s. In many cases things were recorded and never touched again. Dr. Devereaux and O'Driscoll rescued something like 3000 tapes that were going to be thrown out. The first step was to digitize the high-priority items.

When digitizing you have to read the audio object - including the handwritten notes on the tapes and the ephemera that drops out of the box. Each piece is documented and photographed. Then you have to do close listening, not just for the literary content, but also for all the other things that happen on the audiotext. There are clues as to the space, the audience, the situation. He then played Phyllis Webb reading in 1972. You faintly hear her saying that she has to catch her breath. O'Driscoll talked about extra-poetic things we can learn from the recordings. The extra-poetic makes these indexical performances that point to the context/history of production. The extra-poetic speech is an event-born genre.

One of the things we are doing is oral history interviews with recordists and others involved in the events and their archiving. Archival history is important as there is a whole para-textual dimension to these recordings. We can go to the print archives to sleuth more about

We are thinking about audiographic coding. Audiotextual taxonomies. The actual fact of the recording and technologies can add meaning.

He closed by saying that it is really hard to study durational (time-dependent) materials. It really takes a lot of time to listen to such materials and they are often really boring. We can help people used to the random access of print to deal with durational materials.

Sean Luyk talked about the infrastructure that he helps run in the library. We are using ERA A+V, which is a service built on Avalon. The U of Alberta library folk have developed a great service. Audiovisual materials are underrepresented in digital humanities projects because they lack detailed metadata or rights clearances. Systems like Avalon can make life easier. Having some metadata is better than none. An iterative approach of layering metadata is better than starting too complex.

The SpokenWeb metadata task force has developed a shared model to support the hub-and-spoke structure of the project, where we have to support different collections. SpokenWeb has developed Swallow, which lets RAs create metadata easily. Now the challenge is going beyond descriptive metadata to structural metadata about what is inside the tapes, to allow people to navigate them. Avalon has a new graphical tool that lets one create structural metadata about segments.

He ended by talking about the importance of user-centered design for the sites.

Holly Pickering then talked about Metadata for the Spoken Word. We know how to catalogue print materials but not how to deal with audio like what we have. She talked about the information that Avalon wants and the mismatch with the metadata that we need for the collection. The challenge we have at the U of A is how to work with the larger metadata scheme of SpokenWeb without having to do too much work. Holly has done a set of crosswalks that led to recommendations. She talked about the compromises and recommendations. She briefly showed the crosswalk spreadsheets and then the recommendations. There is a happy medium.

The recommendations have to now be implemented and be tested. We also have to map metadata onto user needs.

Ali Azarpanah then talked about designing an interface for the U of Alberta web site. He talked about why we need a site separate from the central one.

Then he talked about using a persona/scenario/wireframe method for prototyping. He talked us through each step, starting with creating believable personas. Personas are imagined users. The next step is to imagine concrete scenarios of usage. These help designers anticipate what a user expects or wants to do. Developing wireframes is the third step. They show content hierarchy and functionality. The designer uses the priority scenarios to imagine what the major pages should be, and then we negotiate the pages.

He closed by talking about implementation. We will have a separated model with the content on the library's ERA A+V server and the interface in a web site we develop.

Day 3, Tuesday June 4th

Emilio Calderon: Public Interest and Current Affairs in Colombia; Topic Modeling Razón Pública

Calderon talked about his thesis project. He is looking at all the articles published in RP, a public interest academic magazine similar to The Conversation. He is trying to track what topics are considered in the public interest by the authors and by subscribers.

He is following Nancy Fraser's rethinking of the public sphere. She says there is not a self-evident or universal criteria for establishing what is of public interest. She argues that there are multiple public spheres that overlap and compete.

He showed a graph of the number of publications per month. There are around 40-50 a month. This seems enough to sustain the not-for-profit magazine. He then showed the distribution of men and women publishing - there are significantly more men publishing. Most authors only publish 1 article. Most of the authors are university professors and they are from universities in Bogota.

Then he talked about LDA topic modeling to figure out what might be the topics. He talked about ways of figuring out which topics were significant. He found having 30 topics gave the best result. He used different metrics to identify topics that weren't useful. Ultimately his aim was to use the topics in surveys of the subscribers. He still had some topics that seemed very similar like two peace related topics.

He talked about some of the top topics which included culture, laws and justice, global conflicts, and conflict and narcotics.

The next step is to use cosine similarity between lists of top words, and he has to manually assign topics to articles. All this will lead to a survey of the subscribers.
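Cosine similarity over top-word lists can be computed directly. The two topics below are invented stand-ins for the similar peace-related topics he mentioned; the actual topic words were not in my notes.

```python
import math
from collections import Counter

def cosine_similarity(words_a, words_b):
    """Cosine similarity between two word lists treated as
    bag-of-words vectors."""
    a, b = Counter(words_a), Counter(words_b)
    dot = sum(a[w] * b[w] for w in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Hypothetical top words for two similar peace-related topics;
# invented for illustration.
topic1 = ["peace", "farc", "process", "agreement", "conflict"]
topic2 = ["peace", "negotiation", "agreement", "government", "conflict"]

print(round(cosine_similarity(topic1, topic2), 2))  # → 0.6
```

A high score between two topics, as here, is a signal that they may need to be merged or relabelled before being used in a survey.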

Zeinab Farokhi: The Case of Telegram Messenger for Social Change: An Iranian Case Study

Farokhi is looking at the way diasporic Muslim women are facing Islamophobia. She is looking at the Telegram messaging platform and how it might be used for activism. Scholars have different views about the effectiveness of cyberactivism. Some see it as useless and some see it as being useful along with other forms of action. It also

In Iran a social media platform like Telegram can provide a place for discussion of topics that can't be discussed elsewhere. She talked about a case study where someone imprisoned was freed after a viral online thread.

The case started when an entrepreneur ran into financial trouble due to tax changes and 1200 workers lost their jobs. He was blocked, then jailed, and died in jail. Then his wife was sent to jail for co-signing promissory notes. Petitions began to circulate to boycott the bank in question. There was an overwhelming public response and collective action. Even some public figures were critical. As a result of the account-closing campaign the bank withdrew their complaint. For public figures it was seen as an issue of ethics. For most Iranians it was an issue of corruption and of certain people controlling the banks. People pointed out how officials who had stolen were not being jailed.

Telegram is a German-based company founded by a Russian entrepreneur. It is the most popular messaging app in Iran. Users can create supergroups of up to 100,000 members. They can exchange images and texts. It has reached almost half of all Iranians. Iran has banned it, but Iranians have bypassed the ban.

The discussion unfolded on Telegram and had a tangible effect. Telegram served as a platform for an online conversation about justice and corruption. It has fostered a more democratic conversation.

Traditional forms of activism are generally agreed to be more effective, but that doesn't mean online activism can't also be effective. Instant messaging has the potential to transform users. In a society like Iran where there is institutional distrust social media can be used to mobilize.

I asked whether this case study is an anomaly and what about Telegram might make it suitable for cyberactivism. Telegram is fast and private. It is easy to use, which makes it accessible to a wider audience. When it was banned, Telegram responded with a version that bypassed censorship. See

As for social media and activism she suggested that it may be the case that certain social media may be effective for a while until the state learns to manipulate it.

Lynne Siemens: University-Industry Partnerships in the Humanities: The Partners’ Perspective

Dr. Siemens introduced herself as someone who studies humanists. Humanists don't have a lot of experience with partnerships so she is looking at what works with partnerships. Her case study is the INKE project that is working with partners like libraries or academy adjacent organizations. It is a many to many partnership. It isn't a university to business partnership.

There are lots of challenges to partnerships. From the partners' perspective humanists are disorganized. But they also like to work with humanists as a way to learn. Organizations can link to each other through partnerships with universities. They can raise their profile. They can expand their personal and professional networks.

Coordinating with academics is a real challenge. Partners have to learn to think like us. There is a need to navigate the differences between research and service. There is the challenge of finding time to attend meetings, which is a cost to organizations. There is also a lack of understanding that has to be overcome.

The measures of success are soft. There are no patents and so on coming out from the partnership. What partners see as a success also changes. Some things they look for are:

  • Policy impact from participating in research grant
  • Chance to contribute to research and be recognized as a researcher
  • The creation of new knowledge, perspectives, and tools that they could point to
  • Some sense of spinoffs

There was a general sense of outcomes including:

  • Strengthen communication
  • Creating new services
  • Potential for advocacy
  • New ways of thinking

INKE has been meeting for a while; they are well organized and hopefully will get funded. They need to make sure that there is good communication if the soft outcomes are to be achieved for both parties.

Luis Meneses: Establishing Criteria for Cases of Abandonment in Online Digital Humanities Projects

Dr. Meneses talked about how fragile the software things we depend on are. "Our civilization runs on software."

How do we know if things are abandoned? He showed two web sites that he created around 2015 that both run on Python 2. One is the Cervantes project that has been around since 1995, but

How does degradation happen?

  • Links stop working
  • Redirects
  • Error pages
  • Someone hacks your site and replaces the content

He started hacking at the DH books of abstracts, starting with 2008. He then used regular expressions to extract 8823 URLs and checked the links for redirects or errors. He showed a graph of URL decay by year. He classified the links and found that even a lot of recent web sites had bad links. By the time the abstracts are published about half of the links are bad.
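The URL-extraction step can be sketched with a regular expression. The pattern and sample text here are illustrative, not the actual regex or abstracts used in the study.

```python
import re

# A sketch of the URL-extraction step; the pattern and sample text
# are illustrative, not the actual regex or abstracts in the study.
URL_PATTERN = re.compile(r"https?://[^\s)\]>\"']+")

sample_abstract = """
Our tool is available at http://voyant-tools.org and the corpus
at https://example.org/corpus (see also http://voyant-tools.org).
"""

# Deduplicate, since the same URL often appears more than once.
urls = sorted(set(URL_PATTERN.findall(sample_abstract)))
print(urls)
```

In practice one would also strip trailing punctuation from matches and then request each URL to classify it as OK, a redirect, or an error.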

He then showed results on when URLs were last modified. There is a lot of digital debris being left behind. I'm not sure if this is a problem - a well designed site may not need to be modified regularly.

The average lifespan of our sites is 5 years. We need (better) preservation strategies.

They are now gathering web archives regularly so as to track changes of sites to study change better.

Julia Genevieve Polyck-O'Neill: Data Visualization and Visual Arts Specialist Perspectives: Towards New Interpretive Frameworks

Polyck-O'Neill started with Johanna Drucker, who suggested that visual disciplines like art history should beware of quantitative methods that come from science traditions. Digital humanities methods may not work well in art history. She talked about the one-to-one relationship between source and code.

She then talked about taxonomies of visualizations like that of Ben Shneiderman. She also examined definitions of visualization, like one from the Fondazione Bruno Kessler that suggests that visualization is more effective than verbal presentation for big data.

Digital humanities is text heavy and weak on visual literacy. Technology is widening the gap between those who can and those who cannot read and engage with new forms of communication like visualization. There is a need for visualization literacy. Often the focus is on the mechanics of generating visualizations, not their semiotics or rhetoric. There is a generalized impression that issues of interpretation are not relevant to visualizations, as if they were transparently clear and objective.

DH has expanded and become more dynamic. This includes the fields of art history and visual culture. DH and art history are being bridged in some GLAM contexts (galleries, libraries, archives, and museums).

The Object: Photo. The Thomas Walther Collection (2014-15) project might be an example of what can be done. The project brought digital humanities methods into a major museum. MoMA didn't have DH people inside - they had to bring them in. As a result there wasn't an ongoing discussion between the museum staff and the team (Lev Manovich's lab) that was brought in.

There are new pathways that bring visual studies people back into conversation:

  • Neri Oxman and the Mediated Matter Group at the MIT Media Lab - Vespers project that creates death masks. The project asks about the emotions around the visualizations printed onto death masks.
  • Theaster Gates, But To Be A Poor Race (2017), questions who or what is being represented and by whom. It draws on early visualizations by W.E.B. Du Bois. Gates's paintings of visualizations often omit information. Visualized data can take on an aesthetic life apart from the quantitative.
  • Coder le monde treated database results as art. How can the gallery space shift the interpretation of traditional methods?

These projects call upon the expertise of visual studies specialists to be interpreted.

Victor Temprano: Projects that matter

The closing keynote was by Victor Temprano, who created Native Land, a web site that attempts to map the different lands of indigenous peoples. The interactive map has an argument built into it.

The issue mattered to Victor personally, which is why he built the site and maintains it. Don't think something doesn't matter just because it isn't a major social issue. His talk aimed to give a sense of the journey of creating this site, which has taken off.

Growing up he didn't learn much about indigenous history. In university he was blown away to learn about the residential schools. He wanted to do something about what he was learning. He started building web sites. He built one called Pipe Watch (now all gone). That led to thinking about whose land the pipelines were going across. He started looking around for information.

He taught himself all sorts of technologies without knowing the formal names for the tools. It is clear he was very inventive in figuring out how to do this. The site got its viral start when A Tribe Called Red linked to it. He then did the US and then other countries. He prefers to have something rather than to be totally right. It isn't an academic site and he needs help to refine it. People email fixes all the time.

He wanted to make it more confrontational with "Whose stolen land are you on?", but decided it would be better to keep it less confrontational. It is a rhetorical tool.

He gets data from different sources. Ideally he gets the geospatial data from a nation. Failing that, he tries to get it from a study about the nation. Then there are conglomerate maps that include many nations. There is a question as to whether zones with boundaries are the best way to represent lands. Perhaps it should be more like overlapping fogs.

He now has research assistants and he developed a custom authoring interface for them. It uses React inside a WordPress plug-in. He uses Mapbox for the underlying maps and they have been great about supporting him. He now has apps and, alas, everything needs to be rebuilt regularly.

Be careful about doing digital for digital's sake. Don't feel you have to keep on adding things. For example, he feels people should just go to the territory rather than expect lots of multimedia on the Native Land site.

He acknowledged that he has lots of privileges that make this project possible. He has a support network. He has no student loans. He doesn't have any funding. This is a project that matters to him, and he knew he could put in the time. He sees it as an entrepreneurial project.

He is now trying to develop a long term sustainability plan. He set up a company and has an indigenous board and staff. He is a settler and wanted to make sure there was input from indigenous people. He isn't sure how he will survive the building of his own organization. He plans to step back and get back to the tech. He wants to have others take on the public roles.

Then he shifted to theoretical challenges. There are all sorts of problems like:

  • What is an indigenous nation? Who decides? Does it have to be a certain size?
  • What time period does one map?
  • What about different categories like the Métis?
  • What about other parts of the world, like South America?
  • What counts as a source for this map?
  • Who is going to use it and for what purpose?

Ultimately the map is there to support discussions about colonization. That is the final criterion. Sometimes he worries that mistakes could do harm. He doesn't want it to become tokenistic.

He talked about the issue of time. The map tries to represent where people now think of their nations as being.

He talked about how videogames are another way that indigenous peoples are talking about land.

Day 4, Wednesday, June 5

I now switched over to the Canadian Game Studies Association.

Kai Yin Lo: Deleuze, Cinema, and Videogames: A Study of Mikami’s The Evil Within

Lo applied ideas from Deleuze about cinema to The Evil Within, which was directed by Shinji Mikami. The game has cinematic aspects like letterbox format, film grain, and raindrops.

He then talked about Deleuze's adaptation of Bergson's notion of the body as a site of movement. There are three kinds of images: the perception-image, the affection-image, and the action-image. Lo then illustrated types of shots corresponding to perception, affect, and action.

Then he talked about how Deleuze draws on Peirce's ideas about signs, which are also tripartite.

In the videogame both the director and the player control the camera at different times. The director can redirect the gaze of the player.

Lo then talked about the large form of the action-image, or "heroic" form. You have a series of quests, similar to videogames where the quests are levels. Each quest has a series of situations in which the player has to take action. He then talked about the small form of the action-image, or "survival" form. Here the action precedes the situation. The action triggers the situation that then calls for another action.

He is also interested in the use of collectibles as a form of memory formation.

Lastly he talked about how we can bring cinema theory to videogames. Cinema is a classic genre that has been well theorized, and we can learn from that theory. For example, we can look at differences rather than essences. We can look at the plasticity of games.

Loïc Mineau-Murray: Agentivité et style de combat des personnages féminins dans les jeux de rôle japonais

Mineau-Murray talked about women characters' agency and combat styles in JRPGs. He began by talking about feminist game studies. There are two types of approach:

  • There is a discussion of the importance or lack of women characters
  • There are studies of the visual representation

There is a lack of theoretical tools. Janet Murray talks about how games give us agency - the ability to take meaningful action. Klevjer feels that the actions of the player are realistic when they tie to the character, not the procedures of the game. It would follow that women need to be given realistic characters where the character can make a difference.

He asserted that in game studies player agency is studied, while in film/literary studies character agency is studied. This raises issues.

Isabelle Boisclair adds an economic aspect to agency. An agent acts for themself; an object acts for an agent. This lets us critique token strong female characters when they act as objects. He gave the example of Trinity in The Matrix.

How can a character's agency be studied if that character is controlled by a player? I assume we want to think about the affordances - we have to look at what a player can do when playing a character. In Gore Galore: Literary Theory and Computer Games I propose a way of using Bakhtin's idea of the chronotope to study the pace of action/agency.

He then started talking about characters in the JRPG Tales of Phantasia. He talked about how narratology isn't enough to look at agency; we need to think of a game as the creation of a history. He compared female characters like Velvet and Mint. Mint can't help much in combat. You can play her but she is not designed to be interesting. Cless is the character around which the game is built; Mint is there so Cless can shine. Velvet is closer in agency but is sexualized.

He concluded by suggesting that studying the relationship between player and character agency is a way to study gender while avoiding narratology.

Kynan Ly: Work Culture in Early Japanese Game Development

Ly started by talking about how the videogame industry weaves together Japanese and Western influences. Our paper wants to recover the way Japanese game developers talked about their work culture. Part of the problem is the issue of historiography - what sources do we have and how reliable are they? Ly talked about some of the sources:

  • The Untold Histories archive of interviews that we are archiving at the U of Alberta
  • Japanese magazines like hobby game magazines
  • Talks by developers reflecting back on the early days

Ly talked about a number of issues that came up in our study:

  • The issue of the West and Japan. While the domestic market in Japan may not have welcomed Western games, the developers were definitely influenced by Western games they played, and games like Tetris were very popular
  • Work and stress and poor pay was another theme - Ly talked about the Hamachi room where developers were locked in at crunch time
  • What life was like for women designers

We ended by talking about the need for historiography.

We got a number of good questions:

  • Was the hobby community influenced by the Western community? Were Japanese magazines drawing on similar Western hobby magazines?
  • Is the theme of work and stress really that similar? Were work conditions really that similar? My sense is that Japanese game developers didn't get rewarded in the same ways as Western developers. There was a "salaryman" culture where you worked for the good of the company and your reward was the health of the company.

Jason Hawreliak: Games that Stink: Towards a Theory of Olfaction in Digital Games

Hawreliak started by asking why we should talk about olfaction in games. We know that scent is powerful - that it is evocative of memories. Games are also about memory. Albert "Skip" Rizzo at USC uses simulations and smells to help treat trauma. Scents can remind us of environments.

Among the books he draws on is What the Nose Knows (Gilbert 2008). In 1916 there were experiments with piping rose oil through a theatre during highlights of a Rose Bowl game. Smell-O-Vision was developed by Hans Laube in 1939. It was used for a movie called Scent of Mystery (1960). Smell was presented as the next level of immersion.

There is a 1999 article in Wired about DigiScents, which eventually failed. Scent is really complex and smell printers rarely work well.

Mochizuki (2004) created a game called Fragra which was meant to be an immersive game. More recently there is a Feelreal VR accessory that attaches to the Oculus Rift. Another game, Tainted by Ranasinghe (2019), uses four scents to "uncover the plot and guide players in the survival-horror game." An Arduino-driven unit controlled by Unity wafts the scents.

This reminded me of Melanie McBride's work that was presented at Replaying Japan 2014. See also public talks.

Elizaveta Tarnarutckaia and Astrid Ensslin: The Myth of the “Clarté Française” in Players’ Perception of Speech Accents in BioWare’s Dragon Age

This project looked at Reddit comments on speech accents in games like Dragon Age. She recommended her recent book Approaches To Videogame Discourse.

Tarnarutckaia then started by talking about language ideologies and debates about them. She talked about folk linguistics, which is when people share their intuitions about language use.

They looked at Dragon Age (BioWare). They gathered 191 individual Reddit comments about accents. Most of the comments were about French accents; French was a trigger. She then talked about the myth of the "clarté française", an unsubstantiated notion of the linguistically inherent clarity of French.

She then gave some examples of Redditors discussing a French voice actor and whether her accent is French or Canadian. The commenters don't take into account that French is widely spoken around the globe.

She concluded by talking about the Redditors' linguistic essentialism, which goes against the diversification trends in game development. Redditors echo pre-existing purist debates. Linguistic elitism still survives.

Marika Brown: The Mechanical Wilds: New Relations with an Unnatural Nature in Horizon Zero Dawn

Brown started by talking about the game Horizon Zero Dawn and the mechanical animals that were created to tend nature. The usual contrast of nature and machines doesn't make sense in this game. In the game there is an interdependency that challenges us to reconceptualize the division. All life on the planet has been made possible by the machines (the AI Gaia).

She talked about how indigenous peoples are used as models for peoples in the game. "Imagined Indians," in King's words, are used to flesh out the game with primitive tribes. She talked about how modernity is contrasted with indigeneity. The modern is tied to technology while the indigenous is primitive and out of time. The idea is that the primitive is a phase one has to go through to get to modernity. Much of the praise for the game is in terms that have been used to stereotype indigenous peoples.

The idea is that a game like this works out futures in a safe way. But, games like this inherit structures and can reinforce them.

Nicholas Hobin: Skin Deep: Getting to the Meat of Video Game Animals

Hobin started by asking about the representation of animals in videogames. He talked about Red Dead Redemption and how it shows hunting. There is a weight and physicality to the animals in this game. In the game you take control of a cowboy and go on adventures in a fictional Western frontier. Part of the game's immersiveness comes from the animal life. The designers have tried to make the hunting, skinning, and selling of animals realistic.

Why look at animals? Hobin talked about how animals are some of the first subjects of art. Thus it makes sense to include animals in simulations.

Hobin then compared Red Dead Redemption and how realistic the skinning is to how other games stay away from the realities of meat. Meat makes animal bodies knowable. Animals are known through their flesh. In the game one can kill animals as a way of learning through them (or follow them.)

In real life we conceal all the processes of butchering animals so we can eat meat without knowing too much. In the game the processes are exposed to some extent.

Joshua Call: Calories, Calories: Complicating Food Economies in the Survival Game Genre

Call started by wanting to make fun of how ridiculous the diets in games are. Now he is trying to think about how the in-game food systems reinforce cultural values.

If you trace the history of survival knowledge it often goes back to indigenous cultures. We can trace a history of survival representations, and there are many of them. There is also a history of survival games. A common trope in survival media is wilderness survival or primitive wisdom.

Then he talked about how survival games index survival. What is represented as health, and how? What do the heads-up displays show you as important? Food levels, health, calories. There are now FitBit-type watches that track such things. Green Hell is the most ambitious now - you have to track your diet.

He gave an example, The Forest. The scenario is so extreme (a plane crash after which you have to survive cannibals) that it authorizes us to give up ethics and do things we wouldn't do in real life. The game has you burning calories quickly and having to find things to eat. The amounts that you have to eat are not realistic. These games borrow tropes from bushcraft texts but aren't at all realistic. The Forest maps colonial fantasies about eating.

Day 5, Thursday, June 6

Julija Jeremic: Video Game Education in British Columbia

Jeremic talked about the situation of game education in BC. BC is the second largest site after Quebec.

Curricula are typically either skills-related or academic. The IGDA is developing a curriculum.

For her method she went through the official BC government site that lists all academic programs and chose all the programs that mentioned games. 13 of the 26 public institutions had game-related programs, one at the Masters level. Another approach she took was to look for advertisements. There are further programs in the private training institutions. There are a lot of animation programs (18 or so). There are a number (14) of programs that specifically mention games in their title, including a couple that are about games and writing.

She talked about the programs that list working in the game industry as a career option even though they don't have specific game courses. Communications, comics, audio, and graphic design programs would be examples.

There are a number of animation programs in BC; it was the most numerous type of program.


She looked at the topics in the courses. There are, obviously, a number of game programming programmes. Visual design is another. Game production was a third major topic. There are a number of courses related to social issues. The business of gaming doesn't get a lot of play.

  • 56 programs in 33 institutions
  • 34 diploma programs and 17 BAs
  • 14 are explicitly game programs

Her next step is to look closely at two programs, one degree and one at a private institution.

Anna Borynec and Cate Peter: The Work of Play: A survey of graduates from higher education programs in video games

Borynec and Peter talked about a survey, sponsored by HEVGA, of over 400 people from programs across the US and Canada (and elsewhere). The survey follows a 2015 survey. Their

Cate talked about methodology. They showed a map that showed a lot of programmes on the East coast.

About 70% are male, which is up a bit since 2015. 20% identify as LGBTQ+, which is significant as it is much higher than in the population in general.

Personal stories and experiences are also important. People talking about being gay in the industry point out how there is a bro culture in many studios.

They looked at the courses people were taking. There are a number of experiential learning courses, but only around 50% of students have internships. Most people get jobs in the games industry, though a lot get jobs in education. They are making more than the typical household. There was high job satisfaction; that said, job satisfaction drops over time.

Women graduates have gone up from 14% in 2008 to 33% in 2018. Women seem more likely to leave the industry over time. They talked about the

Alison Harvey: Making the Grade: Women in Games Higher Education

Dr. Harvey has been doing work on participation for a while. She talked about how there is a perception that women don't have the "real" skills and therefore don't get hired as much. Harvey wanted to figure out what those real skills might be. It is important to understand the tacit skills expected, along with higher education and what happens there.

She talked about how there is an explosion of demand for programmes. This makes it highly competitive.

She focused on 5 UK institutions that offered suites of programmes and did in-depth interviews. Most of her interviewees were men.

She found that many female-identified students mentioned how you are constantly told that you really have to fight in the industry. There is a heroic masculinity ethic circulating. This means that women have to learn to cope. They have to ignore things and just "get on" with things. They can cope by being more of a guy than the others. The ethic seems to encourage individualism. This research can be compared to research into physics students which found that they resist the sexism.

Another tactic was to distance oneself from others and become self-sufficient. This is strange given how much groupwork you have to do.

Lyne Dwyer (Concordia University): Left On Read Again: Mixed-Reality UI Design in Bury me, my Love and Emily Is Away Too

Dwyer talked about Emily is Away Too (2017) and its interface. Ideally interfaces should coexist and operate with the game world; the function of the UI and the fiction of the world should correspond. The interface is diegetic - it is part of the play world.

In Emily the platform tries to fit into your computing system as if it were part of everyday life.

Benjamin Unterman: Games Imitating Life

Unterman talked about Another Lost Phone: Laura's Story, which you play on a phone. The game mimics a smartphone interface, consistent with the minimalist design aesthetic of contemporary phones. The game ladders a set of puzzles where you have to find passwords in the variety of information you have access to. The interface prioritizes narrative coherence.

There are other similar simulated interfaces, like A Normal Lost Phone and Sara is Missing. There are also games that look like productivity tools. Cost Cutter looks like a graphing environment; Leadership, Crash Planning, and Breakdown are other examples. These are designed to have an interface that is at odds with the content. Crash Planning lets you pretend you have a productivity tool going on your desktop as you play. It is transgressive - a reaction to the gamification of productivity.

There is a whole class of games that you can play within Excel - programmed

Pro Office Calculator is the last game he talked about. It is an antagonistic simulated interface. It starts as a calculator and then disintegrates into puzzles. It actively stands in your way.

  • Realistic interface design uses design unity and established conventions. The designer takes advantage of what we already know about interfaces. It is/allows for augmented reality.
  • Complicit design uses other interfaces that hide things. The game signals to the player to bring them into the game.
  • Antagonistic design uses unreliable narrators or interfaces.

He closed by looking forward at other interface patterns.




Page last modified on June 06, 2019, at 12:05 PM - Powered by PmWiki