ITTATC Web Accessibility

Diagnostic and Repair Tools

 

Section 508 Coordinators Meeting

Gettysburg, PA

November 2003

 

·        Except as noted below, slides have a white background with black lettering and a black rectangular bar across the bottom that fades to grey and then white toward the left and right edges. The lower right-hand corner states Section 508 Coordinators Meeting: Gettysburg, PA November 2003 in white over the gradient bar.  Slides have the ITTATC logo in the lower left-hand corner. This consists of a multi-hued brown globe with black continents.  “ITTATC.org” is written in black-outlined white letters that are a little greater in height than half the diameter of the globe. A number of small, thin rings surround and emanate from the globe in a non-uniform configuration.  The text is long enough that it extends beyond the globe and rings on each side.  The text is centered on the gradient bar previously described, while the globe and rings extend above and below it.

·        Title slides also have a white background.  The gradient bar is slightly larger than on the general slides and is located about a quarter of the way from the top of the slide.  The globe logo is situated similarly with regard to the gradient bar; however, the suffix “.org” is not present.  There is no additional text on the bar.

·        Text and titles throughout the presentation use the Verdana typeface in black on the white background.

 

Slide 1

Title:

Web Accessibility Diagnostic and Repair Tools

Text:

Section 508 Coordinators Meeting

Gettysburg, PA

November 2003

Image:

Title Slide

 

Slide 2

Title:

None

Text:

Information Technology Technical Assistance and Training Center

www.ITTATC.org

Toll free:  866-9ITTATC (948-8282) (V/TTY)

ITTATC promotes the development of accessible electronic and information technology products and services related to Section 508 of the Rehabilitation Act and Section 255 of the Telecommunications Act by providing Information, Training and Technical Assistance to Industry, Trainers, State officials and Consumers.

Image:

Title Slide Format.  NIDRR logo at bottom left of page with caption “National Institute on Disability and Rehabilitation Research #H133A000405”

 

Slide 3

Title:

Web Accessibility Tools

Text:

AccVerify DS2  with AccRepair

·        HiSoftware

A-Prompt 1.0.6

·        University of Toronto’s Adaptive Technology Resource Centre and the TRACE Center at the University of Wisconsin-Madison

Bobby™ 5.0/WebXACT

·        WatchFire®

InFocus 4.2.2 and AskAlice

·        SSB Technologies

Image:

None

 

Slide 4

Title:

Web Accessibility Tools (cont.)

Text:

LIFT Online

·        UsableNet

PageScreamer 5.0

·        Crunchy Technologies

UsableNet 508 Plugin 1.2.1 and WCAG Plugin 1.0.1 for Dreamweaver

·        UsableNet

Image:

None

 

Slide 5

Title:

Method: Pages Tested

Text:

Accessibility Forum “Pathologic pages” (http://www.accessibilityforum.org/docs/feb_mtg_02/af-denver-cdrom_feb02/Worst_Case_Section_508_Rules/index.htm)  (Case sensitive link)

Image:

Screen capture of above referenced page.

 

Slide 6

Title:

Method:  Standards

Text:

Section 508 1194.22 Guidelines. (http://www.access-board.gov/508.htm)

W3C WCAG 1.0  (http://www.w3.org/TR/WCAG10/)

Images:

None

 

Slide 7

Title:

Testing is still in progress

Text:

Ongoing testing

Results reported here are continuously being updated

·        New versions

Today is an overview

Image:

None

 

Slide 8

Title:

At a Glance Comparison Chart

Text:

None

Image:

Table showing feature comparison of tested applications.  This table is contained in the attached file Comparison of Web Accessibility Tools At A Glance.xls.

 

Slide 9

Title:

AccVerifyDS2: “Combined” report

Text:

None

Image:

Screenshot of the summary report produced by AccVerifyDS2.  Window is divided into three panels.  The left panel shows a hierarchy of files.  The middle panel shows a list of graphical icons related to testing success, and the right panel lists a summary of specific types of errors encountered.

 

Slide 10

Title:

AccVerifyDS2: Reporting

Text:

None

Image:

Screenshot of the detail report produced by AccVerifyDS2.  Window is divided into three panels.  The left panel shows a hierarchy of files.  The middle panel shows a list of graphical icons related to testing success, and the right panel lists instances and details of individual errors encountered.

 

Slide 11

Title:

AccRepairDS2: Table Element

Text:

None

Image:

Screenshot of the Table Utility window in AccRepairDS2, showing a list of check boxes for formatting repairs which can be made automatically.

 

Slide 12

Title:

A-Prompt 1.0.6: Reporting

Text:

None

Image:

Screenshot of the A-Prompt 1.0.6 Rule window.  The window includes a list of rules and contents required for conformance, a readout of the number of problems encountered, and a notice of the desired level of conformance.

 

Slide 13

Title:

A-Prompt Repair

Text:

None

Image:

Screenshot of the A-Prompt Rule window recommending changes to the alternate text of a specific web image being tested.

 

Slide 14

Title:

Bobby: Accessibility Tab

Text:

None

Image:

Screenshot of the Bobby Accessibility Window showing a summary of errors by priority level in a table at the top of the page, with a chart of specific errors at the bottom of the page.

 

Slide 15

Title:

Ask Alice

Text:

None

Image:

Screenshot of Ask Alice report listing errors and their WCAG priorities, with references to specific sections of the applicable standard being violated (in this case, specific subsections of Section 508).

 

Slide 16

Title:

InFocus Server: Report

Text:

None

Image:

Screenshot of the Executive Summary Report.  At the top of the report are links to specific data views (Executive Summary, Violations Summary, Pages Alphabetical, Pages by Total Violations).  At the bottom of the page is a listing of web pages ranked by number of violations.

 

Slide 17

Title:

InFocus Server: Report

Text:

None

Image:

Screenshot of InFocus Server report listing errors and their WCAG priorities, with references to specific sections of the applicable standard being violated (in this case, specific subsections of Section 508).

 

Slide 18

Title:

Lift Online: Report

Text:

None

Image:

Screenshot of the Lift Online Report showing a list of encountered errors with icons denoting severity, and a text column denoting guideline(s) violated.

 

Slide 19

Title:

Lift for Dreamweaver:  Repair

Text:

None

Image:

Screenshot showing two windows.  The upper right window is on top and shows a listing of images with their type, page address, and alt tag.  The lower left window is a table wizard that allows the user to define header cells, etc.

 

Slide 20

Title:

PageScreamer 5.0

Text:

None

Image:

None

Note:

Page intentionally left blank

 

Slide 21

Title:

UsableNet 508 Plugin for Dreamweaver

Text:

None

Image:

Screenshot of a web page opened in Dreamweaver with the properties tool open.  Beneath the properties bar is an additional tool that shows errors related back to the properties displayed.

 

Slide 22

Title:

Server vs. Desktop

Text:

Advantages

·        Provides consistent application of standards on large sites with many developers

·        Most provide automated testing at user-defined intervals (see the sketch in the note below)

·        Good for pages with dynamic content (checks content after it is rendered in the browser)

·        Promotes collaboration/others can see the reports

·        Often has a “project” definition option

Disadvantage

·        Repair features are limited

Image:

None
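 

Note:

The fully automated checking described above can be illustrated with a short sketch.  The example below is an illustration only, written in Python using the standard library html.parser module; it is not code from any of the tools in this comparison, and the file names in the sample HTML are invented.  It flags img elements that have no alt attribute at all, the kind of basic test behind Section 508 rule 1194.22(a).

from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags that have no alt attribute."""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            # An empty alt ("") is acceptable for decorative images;
            # only a completely missing alt attribute is flagged.
            if "alt" not in attr_map:
                self.violations.append(attr_map.get("src", "unknown source"))

def check_page(html_text):
    """Return the src of every image that is missing an alt attribute."""
    checker = MissingAltChecker()
    checker.feed(html_text)
    return checker.violations

# Hypothetical sample page: only the first image is reported.
sample = '<p><img src="globe.gif"><img src="logo.gif" alt="ITTATC logo"></p>'
print(check_page(sample))   # prints ['globe.gif']

A server-based product would typically crawl every page of a site, run checks of this kind at the scheduled interval, and publish the resulting report where the whole development team can see it.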

 

Slide 23

Title:

Desktop vs. Server

Text:

Cost: usually less expensive

Very good for less experienced developers

·        More detail is provided on what the standards are and how to fix the error

Often has some autofix features (see the sketch in the note below)

·        “Libraries” of images, tables, etc. that need titles, ALT tags, LONGDESC, etc.

Image:

None
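 

Note:

The “library” style of autofix mentioned above can be sketched the same way.  The example below is an illustration only, again in Python; the dictionary of image names and alt text is invented, and the desktop tools covered here (for example AccRepair or A-Prompt) are not claimed to work exactly this way.  Images found in the library get their alt text filled in automatically, and unknown images are left untouched for manual repair.

import re

# Hypothetical library of known site images and their alt text.
ALT_LIBRARY = {
    "globe.gif": "ITTATC globe logo",
    "spacer.gif": "",   # decorative image: empty alt text is correct
}

def add_missing_alt(html_text):
    """Insert alt attributes from ALT_LIBRARY into <img> tags that lack one.

    Assumes plain HTML <img ...> tags (no XHTML self-closing "/>" form).
    """
    def fix(match):
        tag = match.group(0)
        if re.search(r'\balt\s*=', tag, re.IGNORECASE):
            return tag                        # alt already present: leave it
        src = re.search(r'src\s*=\s*"([^"]+)"', tag, re.IGNORECASE)
        if src and src.group(1) in ALT_LIBRARY:
            return tag[:-1] + ' alt="' + ALT_LIBRARY[src.group(1)] + '">'
        return tag                            # unknown image: leave for manual repair

    return re.sub(r'<img\b[^>]*>', fix, html_text, flags=re.IGNORECASE)

print(add_missing_alt('<img src="globe.gif"><img src="photo.jpg" alt="Staff photo">'))
# prints <img src="globe.gif" alt="ITTATC globe logo"><img src="photo.jpg" alt="Staff photo">

This reflects why the desktop tools are described above as good for less experienced developers: the repair dialog can propose a fix and explain the rule, and the developer only has to confirm it.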

 

Slide 24

Title:

Looking for the “silver bullet”?

Text:

All tools have features that are useful

·        Personal preference (screen design)

·        Potentially helpful in applying uniformity to large sites/groups of developers

There will always be a need for “manual checks” and judgment

The accessibility evaluation of the tools themselves is not complete (ex: jsp)

Image:

None

 

Slide 25

Title:

Questions?

Text:

Mimi Kessler

404-894-0953

mimi.kessler@ITTATC.org

 

Deborah Buck

518-439-1263

Deborah.buck@ITTATC.org

Image:

None