The methodology used in this survey is essentially identical to that used in my previous studies (1999, 2000 and 2001). The following remarks are not meant to be exhaustive; they are meant to highlight and clarify some important aspects:
1. Scope of investigation
Three sets of web pages were included in this survey: general campus pages, library pages and academic department/unit pages. The general campus and library sets each consist of the respective home page and the pertinent pages directly linked to it. If a campus web site contained a page whose links, for the most part, connected to the various academic department home pages, that page was used as the starting point for evaluating academic department/unit pages; all pages directly linked to it were included in the academic department/unit set. If no such page could be found, a page with links to academic departments was created and used for the purpose of this evaluation.
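The core step of this procedure, collecting every page directly linked from a given starting page, can be sketched roughly as follows. The campus URL and the page fragment are hypothetical stand-ins; an actual survey would fetch live pages over HTTP rather than parse a hard-coded string:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collects the href targets of all <a> tags on a page,
    resolving relative links against the page's base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical fragment of a campus home page (illustration only).
sample_home_page = """
<html><body>
<a href="/library/">Library</a>
<a href="/departments/">Academic Departments</a>
<a href="http://example.edu/admissions/">Admissions</a>
</body></html>
"""

collector = LinkCollector("http://example.edu/")
collector.feed(sample_home_page)
print(collector.links)
```

Each collected URL would then constitute one member of the page set to be evaluated.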
Text-only or non-frame versions were used in this survey only when they were accessible from the top of the home page.
As in the previous year, this survey focuses on the thirteen UW campuses that offer four-year programs. However, some data on the UW two-year campuses and their central institution ("UW-Colleges") were also collected.
2. Evaluation tool
I used the download version of Bobby 3.1.1, an accessibility validator created by the Center for Applied Special Technology. Bobby was developed to assist people in checking the accessibility of their web pages. For each page checked, Bobby provides information pertaining to the type, number and location of accessibility errors, both minor and major ones. Bobby also issues a summary report for each set of web pages. Web pages that contain any major ("Priority 1") error do not receive Bobby's approval.
When preparing for this study, much thought was given to which version of Bobby to use. I had used Bobby 3.1.1 for some of my previous web accessibility studies. Since then, three other versions had been released: Bobby 3.2, Bobby 3.3 and Bobby WW. Since Bobby WW differs from Bobby 3.3 mainly in its added checking for US Government Section 508 compliance, which is not utilized in this study, the choice was effectively between Bobby 3.1.1, Bobby 3.2 and Bobby 3.3/Bobby WW. I settled on Bobby 3.1.1 for three reasons.

First, a preliminary test run, in which 12 web sites (a total of 288 pages) were evaluated with each of the three Bobby versions, revealed the closest similarity between the Bobby 3.1.1 and Bobby WW evaluation results. While the correlations between the per-site error counts of Bobby 3.1.1 and Bobby WW, Bobby 3.1.1 and Bobby 3.2, and Bobby 3.2 and Bobby WW were all close to one (the Pearson product-moment coefficient was .9999 for all three pairs), the total numbers of errors detected (1105 by Bobby 3.1.1, 1098 by Bobby 3.2 and 1106 by Bobby WW) were most similar for the first pair.

Second, for unknown reasons, Bobby WW frequently crashed my computer (running Windows ME).

Third, unlike Bobby 3.2 and Bobby WW, Bobby 3.1.1 provides in its summary report a total count of the Priority-1 errors detected, broken down by error type. The summary report in the two later Bobby versions does not provide this information; collecting it with those versions is excruciatingly time-consuming, as it would involve looking at the individual page reports and then adding up the figures provided therein.
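The version comparison above rests on Pearson product-moment correlations between per-site error counts. As a rough illustration of that computation, the coefficient can be calculated as follows; the error counts below are made up for the example and are not the study's actual data:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two
    equal-length sequences of numbers."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / math.sqrt(var_x * var_y)

# Hypothetical per-site error counts from two validator versions
# (illustrative numbers only).
errors_v1 = [120, 85, 40, 200, 15, 95]
errors_v2 = [118, 86, 41, 198, 15, 96]

r = pearson_r(errors_v1, errors_v2)
print(round(r, 4))
```

Near-identical counts, as in the test run described above, yield a coefficient very close to one.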
In this study, the term "Bobby-approved" is used in a rather lax manner. It refers to those pages which passed the automated Bobby check. No systematic manual checks were performed. Similarly, the error data presented on this site refer exclusively to those errors that were detected during the automated Bobby check.
3. Limitations associated with Bobby as an evaluation tool
In addition to false negatives (accessibility problems that an automated check fails to detect), Bobby, on occasion, also produces false positives (reported errors where none exist), as I found out during my earlier studies. For example, pages that provide a "text-only version" link at the very beginning may not receive Bobby's approval. Bobby simply checks the graphics version for violations of accessible design principles. If it discovers a violation, Bobby considers the page inaccessible, regardless of how perfectly accessible the text-only version may be.
Another problematic feature of Bobby is its inability to distinguish between the degrees of impact of different manifestations of the same error. For example, a bullet icon without an ALT tag (containing alternative text) registers as equal in status (i.e., as a "Priority 1" error) to an image, also without an ALT tag, that is packed with crucial information. Similarly, Bobby may classify different types of accessibility errors as equal in severity even if the barriers they constitute differ to a significant degree. For example, the lack of alternative text for a purely decorative image registers as an error equally in need of correction as the lack of frame labels in multi-frame pages.
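This blanket treatment of every missing ALT text, decorative or informative, can be mimicked with a simple automated check of the following sort; the page fragment and the class name are invented for illustration and do not reflect Bobby's internals:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Flags <img> tags that lack an alt attribute -- the kind of
    blanket Priority-1 error an automated checker reports, with no
    regard to whether the image is decorative or informative."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and not any(name == "alt" for name, _ in attrs):
            self.missing_alt.append(dict(attrs).get("src", "(no src)"))

# Illustrative page: a decorative bullet and an information-rich map,
# both without alt text -- the automated check treats them identically.
sample = """
<img src="bullet.gif">
<img src="campus-map.gif">
<img src="logo.gif" alt="University logo">
"""

checker = MissingAltChecker()
checker.feed(sample)
print(checker.missing_alt)
```

Both offending images are reported with equal weight, which is precisely the limitation described above.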
Despite its shortcomings, Bobby is a good evaluation tool for studies like this, in which the accessibility of thousands of individual web pages is evaluated and a rough measure of accessibility suffices. In fact, the majority of web accessibility studies known to this author rely exclusively on Bobby's automatically generated data.
Researched and created by Axel
Schmetzke, Library, University of Wisconsin-Stevens Point.
Last updated 04/30/02.
Comments are welcome! email@example.com