EASI Web Developer Perspective

This video shows how to evaluate an existing web page in terms of accessibility using the EASI (Expert Accessibility Support and Integration) tool. More specifically, it shows how to create an evaluation project, in which the user specifies the website or web page to be evaluated and the tests to be performed.

These are the steps that need to be undertaken, as shown in the video: (Note: The links included in the following text will open the video on the YouTube web page.)

  • To create a new evaluation project, the user first needs to log in and select the role she wants to take in the new project (developer, expert or commissioner). The roles she can select depend on her user account. In this video the user selects the "Developer" role.
  • After logging in and selecting a role, the user enters the project name and specifies the URL of the web page she wants to evaluate.
  • A new project is created in the user's local working directory, including a configuration file in which further project settings can be made.
  • In the project configuration the user can select the evaluation tests that should be performed. These tests include:
    • Tests grouped by the type of content they apply to, e.g. all tests related to forms, images, tables, links and so on.
    • Tests related to search engine optimization.
    • Tests implementing WCAG 2.0 (Web Content Accessibility Guidelines 2.0).
  • After the user has selected one or more tests, she can start the evaluation from within the configuration file. This also opens a new browser window within the EASI tool that displays the evaluated web page.
  • The results of the evaluation are displayed in two different ways:
    1. At the bottom of the EASI developer perspective, all results are displayed in a single table.
    2. On the right side of the EASI developer perspective (in the "Accessibility Report" view), the results are grouped by the content they address, e.g. form-related rules, image-related rules and so on. This view has three tabs:
      1. In the "Summary" tab, the following information is presented for each type of content (a sketch of how these counts could be derived is given after this list):
        1. Number of problems found by automated testing.
        2. Number of problems found by developer testing.
        3. Number of outstanding developer tests.
      2. The second tab, "Automated Testing", displays the evaluation results for all tests that can be performed automatically. For example, it can be checked automatically whether each form element has an associated label (see the first sketch after this list). Each test presented here returns either "pass" or "fail", and for each test the user can use the arrow icons provided to highlight the element associated with the test result in the browser window.
      3. The third tab, "Developer Testing", displays all tests that involve a manual check to be undertaken by the developer. In the form-label example mentioned above, only a manual test can determine whether the label of a form element is also meaningful.
        The tests are presented as questions to the user. For each test there are three possible answers: "Yes", "No" and "Could Not Test"; by default, all questions are set to "Could Not Test" (see the second sketch after this list).
        For each test the developer can again use the arrow icons to navigate through the elements related to the test and highlight them in the browser.
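
To make the automated form-label test mentioned under "Automated Testing" more concrete, the following is a minimal sketch of how such a rule could be implemented with standard DOM APIs. It is an illustration only, not EASI's actual implementation; the names RuleResult and checkFormLabels are made up for this example.

```typescript
// Hypothetical sketch of an automated form-label rule (not EASI's code).
// Each form control either has an accessible label ("pass") or not ("fail"),
// mirroring the results shown in the "Automated Testing" tab.

interface RuleResult {
  element: Element;          // the element the result refers to (highlightable)
  outcome: "pass" | "fail";  // automated tests return either pass or fail
}

function checkFormLabels(doc: Document): RuleResult[] {
  const controls = doc.querySelectorAll<HTMLElement>(
    "input:not([type='hidden']), select, textarea"
  );
  const results: RuleResult[] = [];

  controls.forEach((control) => {
    const hasLabel =
      // explicit association: <label for="...">
      (control.id !== "" &&
        doc.querySelector(`label[for="${control.id}"]`) !== null) ||
      // implicit association: control nested inside a <label>
      control.closest("label") !== null ||
      // ARIA-based labelling
      control.hasAttribute("aria-label") ||
      control.hasAttribute("aria-labelledby");

    results.push({ element: control, outcome: hasLabel ? "pass" : "fail" });
  });

  return results;
}
```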
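
The manual checks in the "Developer Testing" tab can be summarised in a small data model with the three answers described above. Again, this is only a sketch under assumed names (DeveloperTest, DeveloperAnswer); the actual EASI internals are not shown in the video.

```typescript
// Sketch of the three-state answer model described above (illustrative only).

type DeveloperAnswer = "Yes" | "No" | "Could Not Test";

interface DeveloperTest {
  question: string;        // e.g. "Is the label of this form element meaningful?"
  elements: Element[];     // elements the developer can highlight in the browser
  answer: DeveloperAnswer; // every question starts out as "Could Not Test"
}

function newDeveloperTest(question: string, elements: Element[]): DeveloperTest {
  return { question, elements, answer: "Could Not Test" };
}
```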
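
Finally, the three counts shown per content type in the "Summary" tab could be derived from the two result sets above. This sketch reuses the hypothetical RuleResult and DeveloperTest types from the previous examples and assumes that a "No" answer to a developer question indicates a problem.

```typescript
// Sketch: deriving the "Summary" tab counts for one content type
// (e.g. forms) from automated and developer results.

interface ContentTypeSummary {
  automatedProblems: number; // automated tests that returned "fail"
  developerProblems: number; // developer questions answered "No"
  outstandingTests: number;  // developer questions still at "Could Not Test"
}

function summarize(
  automated: RuleResult[],
  developer: DeveloperTest[]
): ContentTypeSummary {
  return {
    automatedProblems: automated.filter((r) => r.outcome === "fail").length,
    developerProblems: developer.filter((t) => t.answer === "No").length,
    outstandingTests: developer.filter((t) => t.answer === "Could Not Test")
      .length,
  };
}
```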