Short Guide To Evaluation Of Digital Work

Main.ShortGuideToEvaluationOfDigitalWork History


July 22, 2009, at 05:23 PM by 142.244.120.137 -
Added lines 1-4:
Back to [[MLA Digital Work]] Home

----

Changed line 122 from:
Back to [[MLA Digital Work]]
to:
Back to [[MLA Digital Work]] Home
April 10, 2009, at 11:30 AM by 142.244.120.147 -
Added line 117:
----
April 10, 2009, at 11:27 AM by 142.244.120.147 -
Changed line 48 from:
!!! Best Practices in Digital Work
to:
!!! Best Practices in Digital Work (Check List)
Changed lines 62-63 from:
->Generally speaking, projects should choose open and well documented standards (as opposed to proprietary standards like the WordPerfect file format) if they want their materials to be useful to scholars in the future. Electronic scholarly resources that use proprietary formats doom their work to be inaccessible to scholars once that format is superseded. Exceptions to this are projects exploring interactivity, which often calls for an authoring environment like Flash that can facilitate innovative interfaces. Such projects will typically keep the materials in open standard formats and use Flash to provide the interactive interface.
to:
->Generally speaking, projects should choose open and well documented standards (as opposed to proprietary standards like the [=WordPerfect=] file format) if they want their materials to be useful to scholars in the future. Electronic scholarly resources that use proprietary formats doom their work to be inaccessible to scholars once that format is superseded. Exceptions to this are projects exploring interactivity, which often calls for an authoring environment like Flash that can facilitate innovative interfaces. Such projects will typically keep the materials in open standard formats and use Flash to provide the interactive interface.
April 10, 2009, at 11:25 AM by 142.244.120.147 -
Changed lines 55-56 from:
->Once choices are made about the content then a digital scholar has to make choices about how the materials are digitized and to what digital format. There are guidelines, best practices and standards for the digitization of materials to ensure their long term access, like the [[http://www.tei-c.org/index.xml | Text Encoding Initiative guidelines or the [[http://www.getty.edu/research/conducting_research/standards/ | Getty Data Standards and Guidelines]]. These are rarely easy to apply to particular evidence so evaluators should look for a discussion of what guidelines were adapted, how they were adapted, and why they were chosen. Absence of such a discussion can be a sign that the candidate does not know of the practices in the field and therefore has not made scholarly choices.
to:
->Once choices are made about the content then a digital scholar has to make choices about how the materials are digitized and to what digital format. There are guidelines, best practices and standards for the digitization of materials to ensure their long term access, like the [[http://www.tei-c.org/index.xml | Text Encoding Initiative guidelines]] or the [[http://www.getty.edu/research/conducting_research/standards/ | Getty Data Standards and Guidelines]]. These are rarely easy to apply to particular evidence so evaluators should look for a discussion of what guidelines were adapted, how they were adapted, and why they were chosen. Absence of such a discussion can be a sign that the candidate does not know of the practices in the field and therefore has not made scholarly choices.
April 10, 2009, at 11:24 AM by 142.244.120.147 -
Changed lines 8-9 from:
This is an annotated expansion on the [[http://www.philosophi.ca/theoreti/wp-content/uploads/2007/06/EvalMediaCheckList.pdf | Evaluating Digital Work (PDF)]] which was prepared as a one page checklist for a presentation to the ADE/ADFL in 2007 (see [[http://www.philosophi.ca/theoreti/?p=1696 | blog entry]].
to:
This is an annotated expansion on the [[http://www.philosophi.ca/theoreti/wp-content/uploads/2007/06/EvalMediaCheckList.pdf | Evaluating Digital Work (PDF)]] which was prepared as a one page checklist for a presentation to the ADE/ADFL in 2007 (see [[http://www.philosophi.ca/theoreti/?p=1696 | blog entry]] about the event.)
December 02, 2008, at 02:21 PM by 75.156.164.38 -
Changed lines 93-94 from:
->A basic set of questions to ask about pedagogical scholarship is whether the learning innovation has actually been used and whether it has been used in real teaching and learning circumstances. As mentioned above, for pedagogical digital work evaluators should also ask if the use has been assessed and what the results were.
to:
->A basic set of questions to ask about pedagogical scholarship is whether the learning innovation has actually been used and whether it has been used in real teaching and learning circumstances. As mentioned above, for pedagogical digital work evaluators should also ask if the use has been assessed and what the results were. For more, see also [[DemonstratingTheScholarshipOfPedagogy | Demonstrating the Scholarship of Pedagogy]].
Added lines 113-116:
----

See also [[DemonstratingTheScholarshipOfPedagogy | Demonstrating the Scholarship of Pedagogy]]

December 02, 2008, at 02:02 PM by 75.156.164.38 -
December 02, 2008, at 02:01 PM by 75.156.164.38 -
Changed lines 111-113 from:
* For expertise in pedagogical innovation you can ''ask your local teaching and learning unit'' for advice or names of people who have developed similar learning interventions.
to:
* For expertise in pedagogical innovation you can ''ask your local teaching and learning unit'' for advice or names of people who have developed similar learning interventions.

Back to [[MLA Digital Work]]
December 02, 2008, at 01:45 PM by 75.156.164.38 -
Added lines 8-9:
This is an annotated expansion on the [[http://www.philosophi.ca/theoreti/wp-content/uploads/2007/06/EvalMediaCheckList.pdf | Evaluating Digital Work (PDF)]] which was prepared as a one page checklist for a presentation to the ADE/ADFL in 2007 (see [[http://www.philosophi.ca/theoreti/?p=1696 | blog entry]].
December 02, 2008, at 01:40 PM by 75.156.164.38 -
Changed line 84 from:
* Demonstration (Has it been shown to others?)
to:
* ''Demonstration (Has it been shown to others?)''
December 02, 2008, at 01:39 PM by 75.156.164.38 -
Changed line 14 from:
* Is it accessible to the community of study?
to:
* ''Is it accessible to the community of study?''
Changed line 17 from:
* Did the creator get competitive funding? Have they tried to apply?
to:
* ''Did the creator get competitive funding? Have they tried to apply?''
Changed line 20 from:
* Have there been any expert consultations? Has this been shown to others for expert opinion?
to:
* ''Have there been any expert consultations? Has this been shown to others for expert opinion?''
Changed line 23 from:
* Has the work been reviewed? Can it be submitted for peer review?
to:
* ''Has the work been reviewed? Can it be submitted for peer review?''
Changed line 26 from:
* Has the work been presented at conferences?
to:
* ''Has the work been presented at conferences?''
Changed line 31 from:
* Have papers or reports about the project been published?
to:
* ''Have papers or reports about the project been published?''
Changed line 34 from:
* Do others link to it? Does it link out well?
to:
* ''Do others link to it? Does it link out well?''
Changed line 37 from:
* If it is an instructional project, has it been assessed appropriately?
to:
* ''If it is an instructional project, has it been assessed appropriately?''
Changed line 42 from:
* Is there a deposit plan? Will it be accessible over the longer term? Will the library take it?
to:
* ''Is there a deposit plan? Will it be accessible over the longer term? Will the library take it?''
Changed line 49 from:
* Appropriate content (What was digitized?)
to:
* ''Appropriate content (What was digitized?)''
Changed line 52 from:
* Digitization to archival standards (Are images saved to museum or archival standards?)
to:
* ''Digitization to archival standards (Are images saved to museum or archival standards?)''
Changed line 57 from:
* Encoding (Does it use appropriate markup like XML or follow TEI guidelines?)
to:
* ''Encoding (Does it use appropriate markup like XML or follow TEI guidelines?)''
Changed line 62 from:
* Enrichment (Has the data been annotated, linked, and structured appropriately?)
to:
* ''Enrichment (Has the data been annotated, linked, and structured appropriately?)''
Changed line 69 from:
* Technical Design (Is the delivery system robust, appropriate, and documented?)
to:
* ''Technical Design (Is the delivery system robust, appropriate, and documented?)''
Changed line 74 from:
* Interface Design and Usability (Is it designed to take advantage of the medium? Has the interface been assessed? Has it been tested? Is it accessible to its intended audience?)
to:
* ''Interface Design and Usability (Is it designed to take advantage of the medium? Has the interface been assessed? Has it been tested? Is it accessible to its intended audience?)''
Changed line 81 from:
* Online Publishing (Is it published from a reliable provider? Is it published under a digital imprint?)
to:
* ''Online Publishing (Is it published from a reliable provider? Is it published under a digital imprint?)''
Changed line 87 from:
* Linking (Does it connect well with other projects?)
to:
* ''Linking (Does it connect well with other projects?)''
Changed line 90 from:
* Learning (Is it used in a course? Does it support pedagogical objectives? Has it been assessed?)
to:
* ''Learning (Is it used in a course? Does it support pedagogical objectives? Has it been assessed?)''
Changed lines 97-109 from:
* Ask the candidate. A candidate should know about the work of others in their field and should be able to point you to experts who can understand the significance of their work. If they can't then they aren't engaged in scholarship.

* Find a Computing and <your field> centre, association or conference and scan their web site. If you want names of people able to review a case there are centres for just about every intersection of computing and the humanities (like the [[http://chnm.gmu.edu/ | Center for History and New Media]]); there are national and international organizations (like the [[http://www.sdh-semi.org/ | Society for Digital Humanities / Société pour l'étude des médias interactifs]] in Canada and the international [[http://www.ach.org/ | Association for Computers and the Humanities]]); and there are conferences (like the upcoming [[http://mith.umd.edu/news/mith-to-host-digital-humanities-2009-conference | Digital Humanities 2009]] joint conference.)

* Check the [[http://www.digitalhumanities.org/ | Association of Digital Humanities Organizations]] and contact association officers. On their home page they list past joint Digital Humanities conferences.

* Join the [[http://www.tei-c.org/index.xml | Text Encoding Initiative Consortium]] or their discussion list [[http://listserv.brown.edu/archives/cgi-bin/wa?SUBED1=tei-l&A=1/ | TEI-L]] and ask for help with technical review.

* Ask [[http://www.mla.org/comm_id | MLA Committee on Information Technology]].

* Search [[http://www.digitalhumanities.org/humanist/ | HUMANIST archives]]. Humanist is a moderated discussion list that has been going since 1987; searching the list should provide ideas for experts.

* For expertise in pedagogical innovation you can ask your local teaching and learning unit for advice or names of people who have developed similar learning interventions.
to:
* ''Ask the candidate.'' A candidate should know about the work of others in their field and should be able to point you to experts who can understand the significance of their work. If they can't then they aren't engaged in scholarship.

* ''Find a Computing and <your field> centre, association or conference and scan their web site.'' If you want names of people able to review a case there are centres for just about every intersection of computing and the humanities (like the [[http://chnm.gmu.edu/ | Center for History and New Media]]); there are national and international organizations (like the [[http://www.sdh-semi.org/ | Society for Digital Humanities / Société pour l'étude des médias interactifs]] in Canada and the international [[http://www.ach.org/ | Association for Computers and the Humanities]]); and there are conferences (like the upcoming [[http://mith.umd.edu/news/mith-to-host-digital-humanities-2009-conference | Digital Humanities 2009]] joint conference.)

* ''Check the [[http://www.digitalhumanities.org/ | Association of Digital Humanities Organizations]] and contact association officers.'' On their home page they list past joint Digital Humanities conferences.

* ''Join the [[http://www.tei-c.org/index.xml | Text Encoding Initiative Consortium]] or their discussion list [[http://listserv.brown.edu/archives/cgi-bin/wa?SUBED1=tei-l&A=1/ | TEI-L]] and ask for help with technical review.''

* ''Ask [[http://www.mla.org/comm_id | MLA Committee on Information Technology]].''

* ''Search [[http://www.digitalhumanities.org/humanist/ | HUMANIST archives]].'' Humanist is a moderated discussion list that has been going since 1987; searching the list should provide ideas for experts.

* For expertise in pedagogical innovation you can ''ask your local teaching and learning unit'' for advice or names of people who have developed similar learning interventions.
December 02, 2008, at 01:35 PM by 75.156.164.38 -
Changed lines 8-9 from:
[[#Questions | Questions]] | [[#Check | Check List]] | [[#Experts | Finding an Expert]]
to:
[[#Questions | Questions]] | [[#Check | Check List]] | [[#Expert | Finding an Expert]]
December 02, 2008, at 01:35 PM by 75.156.164.38 -
Changed lines 2-3 from:
This short guide gathers a collection of questions evaluators can ask about a project, a check list of what to look for in a project, and some ideas about how to find experts in one place.
to:
This short guide gathers a collection of questions evaluators can ask about a project, a check list of what to look for in a project, and some ideas about how to find experts in one place. This assumes that evaluators who are assessing digital work for promotion and tenure:

* Are new to the review of digital scholarly work and therefore could use a framework of questions to start with,
* Are prepared to review the materials submitted by a candidate in the form it was meant to be accessed, but need ideas of what to look for, and
* Will also ask for expert reviews from others and therefore need suggestions on where to look for relevant expertise.

[[#Questions | Questions]] | [[#Check | Check List]] | [[#Experts | Finding an Expert]]

[[#Questions]]
Added line 45:
[[#Check]]
Changed lines 47-48 from:
Here is a short list of what to look for in digital work:
to:
Here is a short list of what to check for in digital work:
Added line 93:
[[#Expert]]
Changed lines 107-109 from:
* Search [[http://www.digitalhumanities.org/humanist/ | HUMANIST archives]]. Humanist is a moderated discussion list that has been going since 1987; searching the list should provide ideas for experts.
to:
* Search [[http://www.digitalhumanities.org/humanist/ | HUMANIST archives]]. Humanist is a moderated discussion list that has been going since 1987; searching the list should provide ideas for experts.

* For expertise in pedagogical innovation you can ask your local teaching and learning unit for advice or names of people who have developed similar learning interventions.
December 02, 2008, at 01:28 PM by 75.156.164.38 -
Changed lines 1-4 from:
!! Check List

Questions to ask about a digital work that is being evaluated:
to:
!! Questions, a check list and how to find an expert
This short guide gathers a collection of questions evaluators can ask about a project, a check list of what to look for in a project, and some ideas about how to find experts in one place.

!!! Questions
Some questions to ask about a digital work that is being evaluated:
Changed lines 38-40 from:
!! Best Practices in Digital Work
Here is a short list of what to look for in digital work.
to:
!!! Best Practices in Digital Work
Here is a short list of what to look for in digital work:
Changed lines 85-87 from:
!! How to Find an Expert
Places to start to find an expert who can help with the evaluation.
to:
!!! How to Find an Expert
Places to start to find an expert who can help with the evaluation:
December 02, 2008, at 01:24 PM by 75.156.164.38 -
Changed lines 18-19 from:
->The best way to tell if a candidate has been submitting their work for regular review is their record of peer reviewed conference presentations and invited presentations. Candidates should be encouraged to present their work locally (at departmental or university symposia), nationaly (at national society meetings) and internationally (at conferences outside the country organized by international bodies.) This is how experts typically share innovative work in a timely fashion and most conferences will review and accept papers about work in progress where there are interesting research results. Local symposia (what university doesn't have some sort of local series) are also a good way for evaluators to see how the candidate presents her work to her peers.
to:
->The best way to tell if a candidate has been submitting their work for regular review is their record of peer reviewed conference presentations and invited presentations. Candidates should be encouraged to present their work locally (at departmental or university symposia), nationally (at national society meetings) and internationally (at conferences outside the country organized by international bodies.) This is how experts typically share innovative work in a timely fashion and most conferences will review and accept papers about work in progress where there are interesting research results. Local symposia (what university doesn't have some sort of local series) are also a good way for evaluators to see how the candidate presents her work to her peers.
Changed lines 34-35 from:
->Digital work
to:
->Digital scholarly projects should deposit their data for archiving when they are finished. While few projects do this, because a) well managed repositories are just emerging and b) many projects, even moribund ones, dream of the next phase, we can expect projects to plan for deposit when they think they are finished. The reason for following guidelines for scholarly encoding or digitization is so that the editorial and multimedia work can be reused by other projects, but without the work being documented and deposited we risk losing a generation of such work. Further, digital scholars should be encouraged to deposit their work so they can move on to new projects, as one of the dangers of digital work is being buried in the maintenance of previous projects.
Changed lines 40-41 from:
to:
->A scholarly work that represents humanities evidence in a digital form is the result of a series of decisions, the first of which is the choice of what to represent. For example, a digital representation of a manuscript is first a choice of what manuscript to digitize and then what contextual materials to digitize. These decisions are similar to those any editor or translator makes when choosing what to represent in a new edition or translation. A content expert should be able to ask about the choices made and discuss these with a candidate.
Changed lines 43-46 from:
to:
->Once choices are made about the content then a digital scholar has to make choices about how the materials are digitized and to what digital format. There are guidelines, best practices and standards for the digitization of materials to ensure their long term access, like the [[http://www.tei-c.org/index.xml | Text Encoding Initiative guidelines or the [[http://www.getty.edu/research/conducting_research/standards/ | Getty Data Standards and Guidelines]]. These are rarely easy to apply to particular evidence so evaluators should look for a discussion of what guidelines were adapted, how they were adapted, and why they were chosen. Absence of such a discussion can be a sign that the candidate does not know of the practices in the field and therefore has not made scholarly choices.

->In many cases the materials may be digitized to an archival standard, but be made available online at a lower resolution to facilitate access. Again, the candidate can be expected to explain such implementation decisions.
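
->To make that master/derivative practice concrete, here is a minimal sketch in Python, assuming the Pillow imaging library and illustrative folder names and sizes (none of these are prescribed by any guideline): archival masters stay untouched while smaller JPEG derivatives are generated for web access.

[@
from pathlib import Path
from PIL import Image  # Pillow imaging library (an assumed dependency)

MASTERS = Path("masters")  # assumed folder of archival TIFF masters
WEB = Path("web")          # lower resolution derivatives for online access
WEB.mkdir(exist_ok=True)

for tiff in sorted(MASTERS.glob("*.tif")):
    with Image.open(tiff) as img:
        # Shrink a copy to a web friendly size; the archival master is untouched.
        img.thumbnail((1200, 1200))
        img.convert("RGB").save(WEB / (tiff.stem + ".jpg"), quality=85)
@]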

Changed lines 48-51 from:
to:
->As mentioned in the previous point there are guidelines for encoding scholarly electronic texts from drama to prose. The TEI is a consortium that maintains and updates extensive encoding guidelines that are really documentation of the collective wisdom of expert panels in computing and the target genre. For this reason candidates encoding electronic texts should know about these guidelines and have reasons for not following them if they choose others. The point is that evaluators should check that candidates know the literature about the scholarly decisions they are making, especially the decisions about how to encode their digital representations. These decisions are a form of editorial interpretation that we can expect to be informed, though we should not enforce blind adherence to standards. What matters is that the candidate can provide a scholarly explanation for their decisions that is informed by the traditions of digital scholarship the work participates in.

->Generally speaking, projects should choose open and well documented standards (as opposed to proprietary standards like the WordPerfect file format) if they want their materials to be useful to scholars in the future. Electronic scholarly resources that use proprietary formats doom their work to be inaccessible to scholars once that format is superseded. Exceptions to this are projects exploring interactivity, which often calls for an authoring environment like Flash that can facilitate innovative interfaces. Such projects will typically keep the materials in open standard formats and use Flash to provide the interactive interface.
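
->To make the encoding point concrete, here is a minimal sketch, using only Python's standard library, of the kind of open, namespaced TEI-style XML such guidelines call for. The element names follow TEI P5 conventions, but this is an illustration only; a real edition would carry a fuller header and be validated against a TEI schema.

[@
import xml.etree.ElementTree as ET

# TEI P5 documents live in this namespace; registering it keeps the
# serialized output free of generated prefixes.
TEI_NS = "http://www.tei-c.org/ns/1.0"
ET.register_namespace("", TEI_NS)

def el(parent, name, text=None):
    """Create a namespaced TEI element under parent."""
    node = ET.SubElement(parent, "{%s}%s" % (TEI_NS, name))
    if text:
        node.text = text
    return node

tei = ET.Element("{%s}TEI" % TEI_NS)
file_desc = el(el(tei, "teiHeader"), "fileDesc")
el(el(file_desc, "titleStmt"), "title", "A sample electronic edition")
body = el(el(tei, "text"), "body")
el(body, "p", "Transcribed text, openly encoded, goes here.")

# Open, documented XML remains readable by any future toolchain.
print(ET.tostring(tei, encoding="unicode"))
@]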

Changed lines 53-58 from:
to:
->One of the promises of digital work is that it can provide rich supplements of commentary, multimedia enhancement, and annotations that give readers appropriate historical, literary, and philosophical context. An electronic edition can have high resolution manuscript pages or video of associated performances. A digital work can have multiple interfaces for different audiences, from students to researchers. Evaluators should ask about how the potential of the medium has been exploited. Has the work taken advantage of the multimedia possibilities? If an evaluator can imagine a useful enrichment, they should ask the candidate whether they considered adding such materials.

->Enrichment can take many forms and can raise interesting copyright problems. Often video of dramatic performances is not available because of copyright considerations. Museums and archives can ask for prohibitive license fees for reproduction rights, which is why evaluators shouldn't expect it to be easy to enrich a project with resources, but again, a scholarly project can be expected to have made informed decisions as to what resources they can include. Where projects have negotiated rights, evaluators should recognize the decisions and the work of such negotiations.

->In some cases enrichment can take the form of significant new scholarship organized as interpretative commentary or essay trajectories through the material. Some projects like [[http://www.nines.org/ | NINES]] actually provide tools for digital exhibit curation so that others can create and share new annotated itineraries through the materials mounted by others. Such interpretative curation is itself scholarly work that can be evaluated as a form of exhibit or essay. The point is that annotation and interpretation take place in the sphere of digital scholarship in ways that are different from the print world, where interpretation often takes the form of an article or further book. Evaluators should ask about the depth of annotation and the logic of such apparatus.

Changed lines 60-64 from:

* Interface Design (Is it designed to take advantage of the medium?)

* Usability (Has it been assessed? Has it been tested? Is it accessible?)
to:
->In addition to evaluating the decisions made about the representation, encoding and enrichment of evidence, evaluators can ask about the technical design of digital projects. There are better and worse ways to implement a project so that it can be maintained over time by different programmers. A scholarly resource should be designed and documented in a way that allows it to be maintained easily over the life of the project. While a professional programmer with experience with digital humanities projects can advise evaluators about technical design, there are some simple questions any evaluator can ask, like "How can new materials be added?", "Is there documentation for the technical set up that would let another programmer fix a bug?", and "Were open source tools used that are common for such projects?"

->It should be noted that pedagogical works are often technically developed differently than scholarly resources, but evaluators can still ask about how they were developed and whether they were developed so as to be easily adapted and maintained.
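
->As one concrete sketch of the "How can new materials be added?" question, consider a delivery system that reads its catalogue from a folder of encoded files, so new items are added by depositing a file rather than by editing program code. The Python sketch below is illustrative; the folder layout and names are assumptions, not any particular project's design.

[@
from pathlib import Path

# Assumed layout: one encoded XML file per item in a collection folder.
COLLECTION = Path("collection")

def catalogue():
    """List the items the site serves; newly deposited files appear automatically."""
    if not COLLECTION.is_dir():
        return []
    return sorted(p.stem for p in COLLECTION.glob("*.xml"))

if __name__ == "__main__":
    for item in catalogue():
        print(item)
@]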

* Interface Design and Usability (Is it designed to take advantage of the medium? Has the interface been assessed? Has it been tested? Is it accessible to its intended audience?)
->The first generations of digital scholarly works were typically developed by teams of content experts and programmers (often students.) These projects rarely considered interface design until the evidence was assembled, digitized, encoded and mounted for access. Interface was considered window dressing for serious projects that might be considered successful even if the only users were the content experts themselves. Now best practices in web development suggest that needs analysis, user modeling, interface design and usability testing should be woven into large scale development projects. Evaluators should therefore ask about anticipated users and how the developers imagined their work being used. Did the development team conduct design experiments? Do they know who their users are and how their work will be used? Were usability experts brought in to consult or did the team think about interface design systematically? The advantage to a candidate of engaging in design early on is that it can result in publishable results that document the thinking behind a project even where it may be years before all the materials are gathered.

->It should be noted that interface design is difficult to do when developing innovative works for which there isn't an existing self-identified and expert audience. Scholarly projects are often digitizing evidence for unanticipated research uses and should, for that reason, try to keep the data in formats that can be reused whatever the initial interface. There is a tension in scholarly digital work between a) building things to survive and be used (even if only with expertise) by future researchers and b) developing works that can be immediately accessible to scholars without computing skills. It is rare that a project has the funding to both digitize to scholarly standards and develop engaging interfaces that novices find easy. Evaluators should therefore look for plans for long term testing and iterative improvement that is facilitated by a flexible information architecture that can be adapted over time. A project presented by someone coming up for tenure might have either a well documented and encoded digital collection of texts or a well documented interface design process, but probably not both. Evaluators should encourage digital work that has a trajectory that includes both scholarly digital content and interface design, but not expect such a trajectory to be complete if the scope is ambitious. Evaluation is, after all, often a matter of assessing scholarly promise, so evaluators should ask about the promise of ambitious projects and look for signs that there are real opportunities for further development.

->Finally, it should be said that interface design is itself a form of digital rhetorical work that should be encouraged. Design can be done following and innovating on practices of asking questions and imagining potential. Design, while its own discipline, is something we all have to do when developing digital works. Unlike books, where the graphic and typographic design is left to some poorly paid freelancer hired by the publisher after the book is written, in digital work it is all design, from the first choices of what to digitize. This is especially the case with work that experiments with form, where the candidate is experimenting with novel designs for information. In the humanities digital work has forced us to engage with other disciplines, from software engineering and informatics to interface design, as we ask questions about what can be represented. It is a sign of good practice when humanists work collaboratively with others with design expertise, not a sign that they didn't do anything. Evaluators should expect candidates presenting digital work to have reflected on the engineering and design, even if they didn't do it themselves, and evaluators should welcome the chance to have a colleague unfold the challenges of the medium.

Changed lines 72-73 from:
to:
->The nature of the organization mounting a web resource is one sign of the background of a digital project. Some organizations like the [[http://www.stoa.org/ | Stoa Consortium]] will "mirror" an innovative project, which typically involves some sort of review and the dedication of resources. Evaluators can ask about the nature of the organization that hosts a project, as the act of hosting or mirroring (providing a second "mirror" site on another server) is often a recognition of the worth of the project. While universities do not typically review the materials they host for faculty, a reliable university host server is one indication of the likelihood that the server at least will be maintained over time, an important concern in digital work as commercial hosts come and go.
Changed lines 75-76 from:
to:
->A simple sign that a project was designed to advance scholarly knowledge is that it has been demonstrated to peers, whether through local, national, or international venues. A candidate who doesn't demonstrate their work and get feedback is one who is not sharing knowledge and therefore not advancing our collective knowledge. Obviously some works are harder to demonstrate than others, particularly interactive installations that need significant hardware and logistical support. That said, just as university artists are evaluated on the public performances or shows of their work, so can a digital media artist be asked to document their computer installations or performances. Evaluators can ask about the significance of the venue of an installation just as they would ask about an art exhibit.
Changed lines 78-79 from:
to:
->As mentioned above, certain projects can be expected to be connected online to other projects. Learning materials can be connected to larger learning course systems; hypermedia works can link (reference) other works; and tools should have documentation and tutorials. Evaluators can ask how a work participates in the larger discourse of the field, whether by linking or by being subsumed. Do other projects in the same area know about and reference this project? Does it show up on lists of such works? For example, there are lists of tools on the net: does a particular tool show up on a well maintained list?
Changed lines 81-82 from:
to:
->A basic set of questions to ask about pedagogical scholarship is whether the learning innovation has actually been used and whether it has been used in real teaching and learning circumstances. As mentioned above, for pedagogical digital work evaluators should also ask if the use has been assessed and what the results were.
December 02, 2008, at 11:38 AM by 75.156.164.38 -
Changed lines 64-74 from:
* Ask the candidate

* Find a Computing and <your field> association or conference and scan their web site.

* Check the Association of Digital Humanities Organizations and contact association officers

* Join the Text Encoding Initiative Consortium and ask for technical review

* Ask MLA CIT committee

* Search HUMANIST archives
to:
* Ask the candidate. A candidate should know about the work of others in their field and should be able to point you to experts who can understand the significance of their work. If they can't then they aren't engaged in scholarship.

* Find a Computing and <your field> centre, association or conference and scan their web site. If you want names of people able to review a case there are centres for just about every intersection of computing and the humanities (like the [[http://chnm.gmu.edu/ | Center for History and New Media]]); there are national and international organizations (like the [[http://www.sdh-semi.org/ | Society for Digital Humanities / Société pour l'étude des médias interactifs]] in Canada and the international [[http://www.ach.org/ | Association for Computers and the Humanities]]); and there are conferences (like the upcoming [[http://mith.umd.edu/news/mith-to-host-digital-humanities-2009-conference | Digital Humanities 2009]] joint conference.)

* Check the [[http://www.digitalhumanities.org/ | Association of Digital Humanities Organizations]] and contact association officers. On their home page they list past joint Digital Humanities conferences.

* Join the [[http://www.tei-c.org/index.xml | Text Encoding Initiative Consortium]] or their discussion list [[http://listserv.brown.edu/archives/cgi-bin/wa?SUBED1=tei-l&A=1/ | TEI-L]] and ask for help with technical review.

* Ask [[http://www.mla.org/comm_id | MLA Committee on Information Technology]].

* Search [[http://www.digitalhumanities.org/humanist/ | HUMANIST archives]]. Humanist is a moderated discussion list that has been going since 1987; searching the list should provide ideas for experts.
December 02, 2008, at 11:10 AM by 75.156.164.38 -
Changed lines 3-4 from:
Questions to ask about a case:
to:
Questions to ask about a digital work that is being evaluated:

* Is it accessible to the community of study?
->The most basic question to ask of digital work is whether it is accessible to its audience, be it students (in the case of pedagogical innovation) or users (in the case of a research resource). A work that is hidden and not made available is one that is typically not ready in some fashion. It is normal for digital work to be put up in "beta" or untested form just as it is normal for digital work to be dynamically updated (as in versions of a software tool.) Evaluators should ask for the history of online publication of a work and ask if it has been made available to the intended audience so that there might be informal commentary available.

Changed lines 22-23 from:
* Have papers about the project been published?
to:
* Have papers or reports about the project been published?
->There are peer reviewed journals that will accept papers that report about the new knowledge gained from digital projects whether pedagogical scholarship or new media work. Further, there are venues for making project reports available online for interested parties to read about the academic context of a project. These reports show a willingness to present the results and context of a project to the community for comment. They also provide evaluators something to read to understand the significance of a project.

Changed lines 26-28 from:

* Is it accessible to the community of study?
to:
->The web is about connections and that is what Google ranks when they present a ranked list of search results. An online project that is hidden is one that users are not trying. One indication of how a digital work participates in the conversation of the humanities is how it links to other projects and how, in turn, it is described and linked to by others. With the advent of blogging it should be possible to find bloggers who have commented on a project and linked to it. While blog entries are not typically careful reviews they are a sign of interest in the professional community.
Changed lines 29-32 from:
to:
->A scholarly pedagogical project is one that claims to have advanced our knowledge of how to teach or learn. Such claims can be tested and there is a wealth of evaluation techniques, including dialogical ones that are recognizable as being in the traditions of humanities interpretation. Further, most universities have teaching and learning units that can be asked to help advise (or even run) assessments for pedagogical innovations from student surveys to focus groups. While these assessments are typically formative (designed to help improve rather than critically review), the simple existence of an assessment plan is a sign that the candidate is serious about asking whether their digital pedagogical innovation really adds to our knowledge. Where assessments haven't taken place evaluators can, in consultation with the candidate, develop an assessment plan that will return useful evidence for the stakeholders. Evaluators should not look for enthusiastic and positive results; even negative results (as in "this doesn't help students learn X") are an advance in knowledge. A well designed assessment plan that results in new knowledge that is accessible and really helps others is scholarship, whether or not the pedagogical innovation is demonstrated to have the intended effect.

->That said, there are forms of pedagogical innovation, especially the development of tools that are used by instructors to create learning objects, that cannot be assessed in terms of learning objectives but rather in terms of their usability by the instructor community to meet their learning objectives. In these cases the assessment plan would look more like usability design and testing. Have the developers worked closely with the target audience to develop something they can use easily in their teaching?

Changed lines 34-35 from:
to:
->Digital work
December 02, 2008, at 10:43 AM by 75.156.164.38 -
Changed lines 6-7 from:
to:
->Digital work is hard to review once it is done and published online, as our peer review mechanisms are typically connected to publication decisions. For this reason, competitive funding decisions like the allocation of a grant should be considered as an alternative form of review. While what is reviewed is not the finished work so much as the project and track record of the principal investigators, a history of getting grants is a good indication that the candidate is submitting her research potential for review where there is real competition. Candidates preparing for tenure should be encouraged to apply for funding where appropriate.
Changed lines 9-10 from:
to:
->Given the absence of peer review mechanisms for many types of digital work, candidates should be encouraged to plan for expert consultations, especially when applying for funding. It is common in electronic text projects to bring in consultants to review encoding schemes and technical infrastructure; such expert consultations should be budgeted into projects in order to make sure projects get outside help, but they can also serve as formal, though formative, opinions on the excellence of the work. Evaluators should ask candidates to set up consultations that can help contextualize the work and improve it.
Changed lines 12-13 from:
to:
->Certain types of online work can be submitted to reputable peer reviewed online sources. Online journals exist with review mechanisms comparable to print journals and there are new forms of peer reviewed venues like [[http://www.vectorsjournal.org/ | Vectors]] that accept submissions of new media work. There are concerns about the longevity of these venues, so candidates should also be encouraged to deposit their work in digital repositories run by libraries.
Changed lines 15-18 from:
to:
->The best way to tell if a candidate has been submitting their work for regular review is their record of peer reviewed conference presentations and invited presentations. Candidates should be encouraged to present their work locally (at departmental or university symposia), nationaly (at national society meetings) and internationally (at conferences outside the country organized by international bodies.) This is how experts typically share innovative work in a timely fashion and most conferences will review and accept papers about work in progress where there are interesting research results. Local symposia (what university doesn't have some sort of local series) are also a good way for evaluators to see how the candidate presents her work to her peers.

->It should, however, be recognized that many candidates don't have the funding to travel to international conferences and we should all, in this time of restraint, be judicious in our air travel. For that reason candidates should seek out local or regional opportunities to present their work wherever possible.

November 09, 2008, at 09:53 PM by 75.156.164.38 -
Changed lines 23-24 from:
!! Best Practices in the Evaluation of Digital Work
to:
!! Best Practices in Digital Work
Here is a short list of what to look for in digital work.

Changed lines 28-29 from:
* Digitization to archival standards (Are images saved to museum standards?)
to:
* Digitization to archival standards (Are images saved to museum or archival standards?)
November 09, 2008, at 09:52 PM by 75.156.164.38 -
Deleted lines 22-35:
!! How to Find an Expert

* Find a Computing and <your field> association

* Start at ADHO and contact association officers

* Join the Text Encoding Initiative Consortium and ask for technical review

* Ask granting council

* Ask MLA CIT committee

* Search HUMANIST archives

Changed lines 45-60 from:
* Learning (Is it used in a course? Does it support pedagogical objectives? Has it been assessed?)
to:
* Learning (Is it used in a course? Does it support pedagogical objectives? Has it been assessed?)

!! How to Find an Expert
Places to start to find an expert who can help with the evaluation.

* Ask the candidate

* Find a Computing and <your field> association or conference and scan their web site.

* Check the Association of Digital Humanties Organizations and contact association officers

* Join the Text Encoding Initiative Consortium and ask for technical review

* Ask MLA CIT committee

* Search HUMANIST archives
November 09, 2008, at 09:49 PM by 75.156.164.38 -
Changed lines 2-59 from:
• Did the creator get competitive funding? Have they tried to apply?
• Have there been any expert consultations? Has this been shown to other for expert opinion?
• Has the work been reviewed? Can it be submitted for peer review?
• Has the work been presented at conferences?
• Have papers about the project been published?
• Do others link to it? Does it link out well?
• Is it accessible to the community of study?
• If it is an instructional project, has it been assessed appropriately?
• Is there a deposit plan? Will it be accessible over the longer term? Will the library take it?
Find an Expert
• Find a Computing and <your field> association
• Start at ADHO and contact association officers
• Join TEI Consortium and ask for technical review
• Ask granting council
• Ask MLA ICT committee
• Search HUMANIST archives
Best Practices
• Appropriate content (What was digitized?)
• Digitization to archival standards (Are images saved to museum standards?)
• Encoding (Does it use appropriate markup like XML or follow TEI guidelines?)
• Enrichment (Has the data been annotated, linked, and structured appropriately?)
• Technical Design (Is the delivery system robust, appropriate, and documented?)
• Interface Design (Is it designed to take advantage of the medium?)
• Usability (Has it been assessed? Has it been tested? Is it accessible?)
• Online Publishing (Is it published from a reliable provider? Is it published under a digital imprint?)
• Demonstration (Has it been shown to others?)
• Linking (Does it connect well with other projects?)
• Learning (Is it used in a course? Does it support pedagogical objectives? Has it been assessed?)
to:

Questions to ask about a case:

* Did the creator get competitive funding? Have they tried to apply?

* Have there been any expert consultations? Has this been shown to others for expert opinion?

* Has the work been reviewed? Can it be submitted for peer review?

* Has the work been presented at conferences?

* Have papers about the project been published?

* Do others link to it? Does it link out well?

* Is it accessible to the community of study?

* If it is an instructional project, has it been assessed appropriately?

* Is there a deposit plan? Will it be accessible over the longer term? Will the library take it?

!! How to Find an Expert

* Find a Computing and <your field> association

* Start at ADHO and contact association officers

* Join the Text Encoding Initiative Consortium and ask for technical review

* Ask granting council

* Ask MLA CIT committee

* Search HUMANIST archives

!! Best Practices in the Evaluation of Digital Work

* Appropriate content (What was digitized?)

* Digitization to archival standards (Are images saved to museum standards?)

* Encoding (Does it use appropriate markup like XML or follow TEI guidelines?)

* Enrichment (Has the data been annotated, linked, and structured appropriately?)

* Technical Design (Is the delivery system robust, appropriate, and documented?)

* Interface Design (Is it designed to take advantage of the medium?)

* Usability (Has it been assessed? Has it been tested? Is it accessible?)

* Online Publishing (Is it published from a reliable provider? Is it published under a digital imprint?)

* Demonstration (Has it been shown to others?)

* Linking (Does it connect well with other projects?)

* Learning (Is it used in a course? Does it support pedagogical objectives? Has it been assessed?)
November 09, 2008, at 09:44 PM by 75.156.164.38 -
Added lines 1-59:
!! Check List
• Did the creator get competitive funding? Have they tried to apply?
• Have there been any expert consultations? Has this been shown to other for expert opinion?
• Has the work been reviewed? Can it be submitted for peer review?
• Has the work been presented at conferences?
• Have papers about the project been published?
• Do others link to it? Does it link out well?
• Is it accessible to the community of study?
• If it is an instructional project, has it been assessed appropriately?
• Is there a deposit plan? Will it be accessible over the longer term? Will the library take it?
Find an Expert
• Find a Computing and <your field> association
• Start at ADHO and contact association officers
• Join TEI Consortium and ask for technical review
• Ask granting council
• Ask MLA ICT committee
• Search HUMANIST archives
Best Practices
• Appropriate content (What was digitized?)
• Digitization to archival standards (Are images saved to museum standards?)
• Encoding (Does it use appropriate markup like XML or follow TEI guidelines?)
• Enrichment (Has the data been annotated, linked, and structured appropriately?)
• Technical Design (Is the delivery system robust, appropriate, and documented?)
• Interface Design (Is it designed to take advantage of the medium?)
• Usability (Has it been assessed? Has it been tested? Is it accessible?)
• Online Publishing (Is it published from a reliable provider? Is it published under a digital imprint?)
• Demonstration (Has it been shown to others?)
• Linking (Does it connect well with other projects?)
• Learning (Is it used in a course? Does it support pedagogical objectives? Has it been assessed?)