Thursday, 25 February 2010

A Survey of Survey Tools

This page was last updated: 19 November 2008.
Dates of evaluation of survey tools are in the “Roll-Call of Tools” section.
A “Changelog” at the end of the document lists changes for the updates to this page.
The content and comments on this page are the professional opinion solely of the Web Accessibility Center and do not represent the opinion of The Ohio State University or any organizations or units affiliated with the WAC.

This document presents an overview of a number of popular tools for conducting surveys online. We have no hard statistics on which of these tools are currently being used at OSU. We have either taken surveys delivered by OSU entities via one of these tools or have been contacted by web administrators who have used or currently deploy the tool within their department or unit. At the end of this document we mention a few other tools we came across in our research.

In the comparisons and analysis below, we attempt to assess the degree to which the survey tools—for both the administrator and survey-taker—are accessible to keyboard alone and to a screen reader. Also important in this capacity, particularly for people with cognitive disabilities, are a logical page organization (visually and at the level of the HTML), easy recognizability of both the progress through the survey for the survey-taker and the various feature sets available for the administrator, and visual usability factors such as use of open/white space, proximity of related elements, and visual contrast between elements that might help in survey-takers making selections on multiple-choice and matrix question types.

For accessibility testing:
We ran a screen reader (JAWS) against all of the interfaces, both survey-taker and administrator. How difficult was it to navigate the interfaces and accurately fill in information?
We attempted to perform all input with keyboard alone, specifically using the tab key to navigate form elements. Can a user tab and arrow through the interfaces? Is it clear where the cursor is while doing so, that is, do form or other elements clearly visually indicate focus? Is it necessary to use the mouse for essential functionality?
We set the computer into a high-contrast rendering mode. Are all essential page elements still visible and is contrast uniformly enhanced?
We enlarged the screen fonts. Does enlarging fonts distort or “break” the interfaces?
We made subjective appraisals of the clarity, organization, and contrast of the visual presentation. Do styling and layout aid visual recognition of the functionality, purpose, and significance of the various elements of the interfaces?
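A concrete way to think about the keyboard-focus question above: visible focus usually comes down to a few style rules. Here is a minimal CSS sketch of the kind of styling we look for; the selectors and colors are our own illustration, not taken from any of the tools reviewed:

```css
/* Make the keyboard focus position obvious on links and form controls.
   outline is used rather than border so the layout does not shift
   as the user tabs through the page. */
a:focus,
input:focus,
select:focus,
textarea:focus {
  outline: 2px solid #000;    /* remains visible in high-contrast modes */
  background-color: #ffffcc;  /* illustrative highlight color */
}
```

Tools that suppress the browser's default focus outline (for example with `outline: none`) without supplying a replacement like this are the ones that fail our "is it clear where the cursor is?" test.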

It is relatively easy to make static form elements accessible. And, by and large, survey-taker interfaces in all of the tools lack dynamic interface elements. That is, there is little AJAX or use of JavaScript to manipulate display, though some use JavaScript for tooltips and to trigger selection of elements in question types. We found more extensive use of dynamic techniques on the administrator interfaces, which tended to make them less accessible. Here is some of the markup we looked for:
Use of headings to introduce questions or sections of content
Use of labels to associate descriptive text with form elements
Use of fieldsets and legends to group and identify related form elements
Proper markup of tables with headers in matrix question types
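To make the four patterns above concrete, here is a minimal sketch of what we hoped to find in the generated pages. The question text, `name`/`id` values, and heading levels are invented for illustration; each tool emits its own variants:

```html
<!-- A multiple-choice question: heading introduces it, fieldset/legend
     group the options, and each label is tied to its control. -->
<h2>Question 1</h2>
<fieldset>
  <legend>How satisfied are you with this survey tool?</legend>
  <input type="radio" name="q1" id="q1-sat" value="satisfied">
  <label for="q1-sat">Satisfied</label>
  <input type="radio" name="q1" id="q1-unsat" value="unsatisfied">
  <label for="q1-unsat">Unsatisfied</label>
</fieldset>

<!-- A matrix question: a real table with scoped column and row headers,
     so a screen reader can announce which cell a radio button sits in. -->
<h2>Question 2</h2>
<table>
  <caption>Rate each feature of the tool</caption>
  <tr>
    <th scope="col">Feature</th>
    <th scope="col">Good</th>
    <th scope="col">Poor</th>
  </tr>
  <tr>
    <th scope="row">Reporting</th>
    <td><input type="radio" name="reporting" value="good"></td>
    <td><input type="radio" name="reporting" value="poor"></td>
  </tr>
</table>
```

With markup along these lines, a screen reader reads the legend and label together for each radio button, and announces the row and column headers when navigating the matrix cells.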

There are some basic HTML techniques survey authors ought to be aware of, regardless of the accessibility of the tool they are using. So after discussing the survey tools, we offer a few tips for creating accessible surveys.
Roll-Call of Tools

Without further ado, the patients currently on the operating table:
SurveyGizmo (commercial, hosted, re-reviewed 25 July 2008)
SurveyMonkey (commercial, hosted, re-reviewed 25 July 2008)
Zoomerang (commercial, hosted, re-reviewed 25 July 2008)
Checkbox (commercial, server installed or hosted, re-reviewed 24 September 2008, version 4.4)
LimeSurvey (open-source, server installed, re-reviewed 24 September 2008, version 1.71+)
Snap Survey Professional Edition (commercial, survey configuration via desktop application (Windows only), manual publishing to web for data collection, reviewed 25 July 2008)

For our tests, we composed a survey in each tool. The first seven questions in each survey are identical, or as close to identical as we could muster given the idiosyncrasies of the tools and question types. We then imagined 10 survey-takers and had “them” answer the overlapping questions identically in every survey. We did this to get a sense of how the statistics collected by each tool stack up head to head.
Question Types

We list only those features available at all package levels and constrain our choices to question types common to more than one survey tool. All of the remotely managed (non-locally-installed) survey tools provide more question types and, in some cases, greater flexibility in data collection (uploads of data files, for instance) at the higher-cost package levels.
