Methodology: Accessibility data generated with Bobby

The methodology used in this survey is essentially identical to that used in my previous studies (1999 and 2000). The following remarks are not meant to be exhaustive; they are meant to highlight and clarify some important aspects:

1. Scope of investigation:

Three sets of web pages were included in this survey: general campus pages, library pages, and academic department/unit pages. The campus and library sets each consist of the respective home page and the pages directly linked to it. If a campus web site contained a page whose links, for the most part, connected to the various academic department home pages, all directly linked pages were included in the academic department set. If no such page could be found, a page with links to academic departments was created and used for the purpose of this evaluation.

Text-only or non-frame versions were used in this survey only when they were accessible from the top of the home page.

As in the previous year, this survey focused on the thirteen UW campuses that offer four-year programs. However, for the first time, some data on the UW two-year campuses and their central institution ("UW-Colleges") were also collected.

2. Evaluation tool

I used the latest downloadable version of Bobby (3.2), an accessibility validator created by the Center for Applied Special Technology. Bobby was developed to assist people in checking the accessibility of their web pages. For each page checked, Bobby provides information pertaining to the type, number, and location of accessibility errors, both minor and major. Bobby also issues a summary report for each set of web pages. Web pages that contain any major ("Priority 1") error do not receive Bobby's approval.

In this study, the term "Bobby-approved" is used in a rather lax manner. It refers to those pages that passed the automated Bobby check. No systematic manual checks were performed. Similarly, the error data presented on this site refer exclusively to errors detected during the automated Bobby check.

3. Limitations associated with Bobby as evaluation tool

As Bobby's creators freely admit, their product is not a perfect tool. While Bobby checks for compliance with the W3C/WAI Web Content Accessibility Guidelines and Techniques for HTML documents, it automatically checks for compliance with only a subset of these. For the features not included in its automatic test, Bobby prompts the user to perform a "manual" check. Bobby is also unable to check the accessibility of scripts (such as JavaScript) or script-generated content. Some features can only be partially checked with Bobby. When encountering images, for example, Bobby will not report an error as long as some alternative text is provided, no matter how meaningless or non-descriptive this text may be. Thus, for various reasons, reliance on Bobby's automatic checking facility alone is prone to produce some falsely positive (error-free) findings.
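The limitation just described can be illustrated with a toy check written in Python. This is purely a hypothetical sketch in the spirit of Bobby's automatic image test, not Bobby's actual implementation: it flags only images that lack an ALT attribute entirely, and accepts any alternative text at all, however meaningless.

```python
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    """Toy sketch of an automated image check (hypothetical, not
    Bobby's actual code): flag <img> elements with no ALT attribute,
    but accept any non-missing alternative text, however meaningless."""
    def __init__(self):
        super().__init__()
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "alt" not in attrs:
                # Record the offending image by its source (if any).
                self.errors.append(attrs.get("src", "(no src)"))

def missing_alt(html):
    checker = ImgAltChecker()
    checker.feed(html)
    return checker.errors

# An image with no alternative text is flagged ...
print(missing_alt('<img src="chart.gif">'))          # ['chart.gif']
# ... but a meaningless one-letter ALT text passes unchallenged.
print(missing_alt('<img src="chart.gif" alt="x">'))  # []
```

Because the second page "passes" despite conveying nothing to a screen-reader user, a purely automatic check of this kind yields exactly the falsely positive (error-free) findings discussed above.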

In addition to falsely positive results, Bobby, on occasion, also produces falsely negative results (reported errors where none exist), as I found out during my earlier studies. For example, pages that provide a "text-only version" link at the very beginning may not receive Bobby's approval: Bobby simply checks the graphical version for violations of accessible design principles, and if it discovers a violation, it considers the page inaccessible, regardless of how perfectly accessible the text-only version may be.

Another problematic feature of Bobby is its inability to distinguish degrees of impact among different manifestations of the same error. For example, a bullet icon without an ALT attribute (containing alternative text) registers as equal in status (i.e., as a "Priority 1" error) to an image, likewise without an ALT attribute, that is packed with crucial information. Similarly, Bobby may classify different types of accessibility errors as equal in severity even when the barriers they constitute differ significantly. For example, the lack of alternative text for a purely decorative image registers as an error as much in need of correction as the lack of frame labels in multi-frame pages.

Despite its shortcomings, Bobby is a good evaluation tool for studies like this, where the accessibility of thousands of individual web pages is evaluated and a rough measure of accessibility suffices. In fact, the majority of web accessibility studies known to this author rely exclusively on Bobby's automatically generated data.

One final point: recent upgrades to Bobby (in response to changes in the evolving W3C standards) call for some initial caution in comparing previous years' data with this year's data. However, since there has been no significant change in the W3C standards with regard to the most frequently occurring errors (i.e., images without alternative text and inaccessible hot spots in image maps), a comparison of old and new Bobby-generated data is feasible and meaningful.


Researched and created by Axel Schmetzke, Library, University of Wisconsin-Stevens Point.
Last updated 04/11/01.
Comments are welcome!
