Initial Expert Review

Six experts agreed to serve as reviewers for an initial analysis of the accessibility of TE items. These reviewers included three experienced teachers of students with visual impairments (TVIs) working at public schools and the state school for the blind, one blind reviewer who is also certified as a TVI, one parent of a child with vision and motor impairments who is also certified as a TVI, and one occupational therapist working as a university trainer.

Expert reviewers evaluated prototype TE items drawn from several sources, including sample items prepared by CETE content area item developers. While the review focused on students with visual and motor disabilities, the experts had experience working with students with a wide variety of sensory, linguistic, and cognitive challenges, so the review was designed to capture the breadth of their knowledge and experience. Experts were asked to evaluate each test item against six categories of processing: perceptual, linguistic, cognitive, motoric, executive, and affective. These processes are defined in the Universal Design for Computer-Based Testing (UD-CBT) Guidelines (Dolan, Burling, Rose, et al., 2010).

The experts conceptualized item barriers to accessibility in terms of presentation and response requirements and suggested alternative formats they deemed accessible. They concluded that the drag-and-drop interface used in many TE items is inaccessible to students with vision or motor impairments regardless of the item type in which it appears. To alleviate the motor demands of dragging and dropping, experts suggested alternative formats that use radio buttons, keyboard commands or switches, or a scribe to enter responses. These alternatives would significantly increase the accessibility of such items for students with motor disabilities.

Reviewers pointed out that radio buttons are fully accessible on screen, with audio, and with switches. Radio buttons are familiar to students from multiple-choice items and can be transcribed directly for print or tactile test forms. Reviewers also thought that drop-down menus and matching formats would offer greater accessibility because they enable responses by keyboard or switch rather than requiring a mouse or touchscreen. These formats preserve the integrity of the test questions while allowing students to interact with the test more independently.
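Neither the reviewers nor this report prescribes a particular implementation, but as a rough illustration of the kind of keyboard- and switch-operable format described above, the sketch below renders a single-select response with native HTML radio buttons instead of drag-and-drop. It assumes a browser-based delivery platform, and the names used (ChoiceItem, renderChoiceItem, getResponse) are hypothetical rather than part of any existing testing interface.

// Illustrative sketch only, not a project implementation: a single-select
// response rendered with native radio buttons as an alternative to drag-and-drop.
// Names (ChoiceItem, renderChoiceItem, getResponse) are hypothetical.

interface ChoiceItem {
  stem: string;       // the question text
  options: string[];  // the answer choices
}

function renderChoiceItem(item: ChoiceItem, container: HTMLElement): void {
  // A fieldset/legend pairing lets screen readers announce the stem
  // together with each choice.
  const fieldset = document.createElement("fieldset");
  const legend = document.createElement("legend");
  legend.textContent = item.stem;
  fieldset.appendChild(legend);

  item.options.forEach((text, i) => {
    const id = `choice-${i}`;
    const input = document.createElement("input");
    input.type = "radio";      // native radios are operable by keyboard,
    input.name = "response";   // switch interfaces, and screen readers
    input.id = id;
    input.value = String(i);

    const label = document.createElement("label");
    label.htmlFor = id;        // explicit label keeps each choice readable
    label.textContent = text;

    fieldset.appendChild(input);
    fieldset.appendChild(label);
  });

  container.appendChild(fieldset);
}

// Reading the response requires no mouse or drag gesture.
function getResponse(container: HTMLElement): string | null {
  const checked = container.querySelector<HTMLInputElement>(
    'input[name="response"]:checked'
  );
  return checked ? checked.value : null;
}

Because native radio inputs carry keyboard behavior and screen-reader semantics by default, an item built this way remains operable without a mouse or touchscreen, which is the property the reviewers emphasized.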

For items that require short-answer text entry, reviewers suggested that students who are blind could use the onscreen text box as long as they have sufficient keyboarding skills. Other constructed-response items, such as plotting points on a graph, were judged inaccessible online. Constructed-response items, even though they may be machine-scored in an online format, would require human scoring if presented in paper-and-pencil or braille tests. Nonetheless, expert reviewers supported the use of constructed-response items because they are accessible to students in offline formats and are consistent with instructional demands.

Finally, reviewers recommended that standard accommodations, such as screen magnification, large-print and braille hard-copy tests, auditory presentation, and the availability of switch systems, continue to be offered, as they remain useful and necessary for students with vision and motor disabilities. In some cases, these accommodations might increase cognitive demands or item difficulty. For example, memory load is greater when an item is presented auditorily instead of visually, and part of an item may move off screen when magnification is used.
