Current Quality Assurance Practices in Web Archiving [Poster]: Metadata
Metadata describes a digital item, providing (if known) such information as creator, publisher, contents, size, relationship to other resources, and more. Metadata may also contain "preservation" components that help us to maintain the integrity of digital files over time.
- Main Title: Current Quality Assurance Practices in Web Archiving [Poster]
- Author: Reyes Ayala, Brenda
- Creator Type: Personal
- Creator Info: University of North Texas
- Organizer of meeting: Texas Digital Library
- Contributor Type: Organization
- Creation: 2013-05-07
- Content Description: Poster presented at the 2013 Texas Conference on Digital Libraries. This poster discusses research on the current quality assurance practices in the web archiving community.
- Physical Description: 1 poster : col. ; 36 x 24 in.
- Keyword: web archiving
- Keyword: quality assurance
- Item is a Primary Source
- Conference: Texas Conference on Digital Libraries (TCDL), 2013, Austin, Texas, United States
- Name: UNT Scholarly Works; Code: UNTSW
- Name: UNT College of Information; Code: UNTCOI
- Archival Resource Key: ark:/67531/metadc159526
- Academic Department: Library and Information Science
- Academic Department: Digital Projects Unit
- Display Note: Abstract: Web archiving is the process of storing and maintaining Internet resources (such as websites) to preserve them as a historical, informational, legal, or evidential record. The process involves three stages: selecting relevant resources for preservation, gathering and storing them, and providing for their access. In recent years, it has become an increasingly common practice in libraries around the world, as national libraries, such as the Library of Congress and the National Library of Australia, seek to preserve their national digital heritage. Many universities have also begun archiving the web, usually to create subject-specific collections of websites that supplement their existing print and digital collections. Within the web archiving community, a step that often goes unmentioned is the Quality Assurance (QA) process, which measures the quality of an archived site by comparing it to a standard that must be met. Currently, each institution conducts its QA process independently, using a myriad of different standards and software tools. The result is a considerable knowledge gap: practitioners do not know if and how their peers are conducting a QA process and generally do not share this information. Consequently, there are no agreed-upon quality standards or processes. The study presented here attempts to address this information gap in the web archiving community. To this end, we investigated how several institutions conduct their quality control processes. It is worth noting that quality control procedures are often not publicly available and not thoroughly documented, if at all. Much of the information presented here has been obtained from reports, electronic communications, listserv discussions, and interviews with staff involved in the QA process. The results we obtained led us to design a survey instrument to gather information in a more thorough and structured manner. The results from this survey are included here.