Accessibility Checkers for the Web: How Reliable Are They, Actually?


Automatic, semi-automatic, and manual checkers are essential for assessing the degree of accessibility of web pages and web sites. In this work, a number of checkers are evaluated with regard to completeness, reliability, and user-friendliness. The objective part of the evaluation is based on a variety of test suites with more than 20,000 test cases in total, addressing topics such as HTML5 (including audio and video), SVG (stand-alone and inline), MathML, XLink, XPath, and XML in general, as well as WAI-ARIA, various CSS3-related technical recommendations, JavaScript, and HTTP headers. This part also comprises color-contrast checks and a series of textual and semantic checks. The subjective evaluation was conducted by expert testers using the tools on real-life web sites and tasks. The results show that more modern checkers are needed: more complete and more reliable tools that allow frequent, fast, and exhaustive monitoring. Given the current quality of these tools, human inspection and user testing remain unavoidable to achieve a high degree of accessibility of web pages and applications.
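Among the checks mentioned above, color contrast is one of the few that is fully mechanizable. As an illustration of what such a checker computes, the following sketch implements the relative-luminance and contrast-ratio formulas from WCAG 2.x (this example is ours, not part of the evaluated tools; function names are illustrative):

```python
def _channel(c8: int) -> float:
    """Linearize one 8-bit sRGB channel per the WCAG 2.x formula."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """Relative luminance L = 0.2126 R + 0.7152 G + 0.0722 B (linearized)."""
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """Contrast ratio (L1 + 0.05) / (L2 + 0.05), with L1 the lighter color."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible contrast, 21:1;
# WCAG level AA requires at least 4.5:1 for normal-size text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A checker applies this computation to every text/background color pair it can resolve from the computed styles, which is why contrast results tend to be among the most reliable automated findings.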