Article Summary: “Pictures into Words” by Brian Stewart
“Pictures into Words” by Brian Stewart describes a study comparing user indexing, i.e. crowdsourced tagging, with professional indexing. The article first summarizes Panofsky’s and Shatford’s methods for classifying images. Panofsky outlined three levels of subject matter/meaning:
- Pre-iconographical description (objects or events)
- Iconographical analysis (themes)
- Iconological interpretation (intrinsic meaning)
Shatford emphasized two dimensions of images, ‘of’ and ‘about’ (sometimes called an image’s ofness and aboutness). Shatford also defined four facets of an image: who, what, where, and when.
Stewart argues that indexers have traditionally taken a positivist approach, labeling mostly specific, objective subjects. Today, however, users increasingly need abstract, interpretive subjects; the article gives “happy” and “peace” as examples of abstract terms.
Part I of the study examined 11 professional indexers. Stewart broke down the subject terms they chose into specific, generic, and abstract categories. The professionals’ chosen terms for the set of photographs were 52% specific, 46.2% generic, and 1.8% abstract. They were reluctant to use abstract terms for fear of being too general. Professional indexers refrained from selecting abstract terms even when those terms would have been useful, and Stewart argues that they made false assumptions about what users need.
Part II of the study examined social tagging, i.e. crowdsourcing on Flickr. 66 taggers tagged 33 photos, 11 of which were untitled and 22 of which were titled and attributed to a photographer. Interestingly, whether a photo was titled had little influence on taggers’ subject selections. The user group showed a strong preference for generic terms (60.3%) and roughly equal selection of specific and abstract terms (19.62% and 20%, respectively). Taggers were then given a “training intervention” that taught them the Panofsky/Shatford matrix and encouraged increased attention to abstract terms. After training, the users selected more than twice as many terms as before, and the number of abstract terms grew by a factor of 5.1.
Stewart believes that users know better than professionals what they are searching for and which tags are most appropriate. Users are more likely to select abstract terms, especially after targeted training. Crowdsourcing the work of indexing actively involves users in the process of discovery. Practically speaking, professionals may come to depend on crowdsourced indexing as the volume of digital materials continues to mushroom. The greatest challenge, as Stewart addresses in this study, is keeping users motivated and active: he found that tagging activity stagnated after ten days. For crowdsourcing projects to be sustainable, users need to feel invested in the project and part of a community. It is the professional’s task to mentor and foster that community.
Stewart, Brian. “Pictures into Words.” The Indexer 33, no. 1 (March 2015): 8–25.
Good work! This is a great article explaining the range of approaches to data entry, including crowdsourcing. It's interesting that he notes how "addicted" professionals can become to the crowds helping to organize our digital abundance!
— Dr. MacCall