HARD Clarification Form Feedback
During the second and third weeks of August 2003, the HARD annotators at the LDC had the opportunity to perform a new and refreshing task: completing clarification forms. These forms were submitted by participating sites to garner additional information from the original query issuer/topic creator. After completing the task, the annotators had some general feedback about the content and structure of the forms. These reactions are being shared so that sites can see the human impact of their work, and perhaps modify or revisit their approaches to the clarification forms next year.
What was good:
- Extra space: annotators preferred to have a text box in which to include additional relevant terms.
- Variety of choice: it was helpful to have both potentially relevant lists of terms and short text segments to choose from.
- Headlines: it was better to have a headline/title followed by a list of terms, rather than just a list of terms.
- "Tell us more": annotators enjoyed having a space in which to specify more of what they were looking for.
- Longer text/word selections: it was best to have the full word and/or phrase in order to make a more confident selection.
- Color and design: it was more pleasant to fill out several dozen forms when the design and color were varied a little.
- Ranking: it was helpful to rank sources according to preference. [If two (or three) sources are equally desired, can they receive the same rank?]
- Timeframe: some annotators enjoyed choosing a timeframe for the question.
- Negative option: in marking single terms or short phrases, a Neg. option (as in, the annotator is not at all interested in a certain result) is good to have, in addition to Yes/No.
- Clarity: clear (and brief) instructions were best. For example, "Select the terms that are relevant to your query." There is no need to add extra detail (like "Don't select the terms that are not relevant").
- Run variation: annotators enjoyed reading differently styled forms from the same site.
What was not as good:
- Scrolling boxes: they were time-consuming and cumbersome, often containing too much information to judge in each box. If sites choose to use scrolling boxes in the future, they could be a bit lighter in content.
- Partial words as terms.
- Large groups of terms judged together: more often than not, they were not all on topic.
Additional recommendations for the future:
- Check the text of the forms for spelling and syntactical errors. Some of the instructions were vague and difficult to understand because of wording problems or other mistakes. Brevity and clarity are probably best for an exercise like this.
- Check the hidden input field in each individual form to ensure that it matches the intended site, run, and topic numbers. We experienced a number of confusing moments because the topic clarification form was explicitly named one thing, while the result file (i.e., the hidden input value) was actually named something else.
- When testing the forms before submitting them to the LDC, *especially during or immediately after the LDC's completion of the forms*, be sure to do one of the following:
  - Send an email warning us that your site is testing X number of forms.
  - Change the hidden input value of the form to something that is obviously not a HARD topic number (e.g., XXX1 or TEST1). Some sites did do this, and it was very helpful.
  - Change the cgi script to something other than the one the LDC provided (probably not the most effective option).
It was quite confusing to differentiate "test" results from actual results. Again, some sites did choose to rename their test forms, which was extremely kind.
All in all, this was an entertaining and enjoyable task. It showed the annotators a little bit of the personality of each site; most of the staff found it interesting to see how creatively the sites interpreted the clarification form guidelines. We hope this will remain a part of the HARD track in the future.