Manually Test EASI Web Project
This video shows the manual test of a local EASI web evaluation project.
These are the steps that need to be undertaken, as shown in the video: (Note: how to create such a project and perform an automatic evaluation is shown in other videos. If you haven't watched those videos yet, please do so first. Note 2: The links included in the following text will open the video on the YouTube web page.)
- Note: the local HTML source code that is tested in the video contains a login form with labels and text fields for the user name and password. However, some checks cannot be performed automatically, so a manual test is required.
- The user needs to select the "Developer Testing" tab of the "Accessibility Report" view. This tab lists all the tests that need to be performed during the manual evaluation. The tests are grouped by the type of content to which they apply, e.g. form-related tests, image-related tests and so on. Each test is presented as a question to the user (e.g. "Do you provide a checkbox in addition to a submit button for users to confirm their responses before submitting them?"). For each test there are three possible answers: "Yes", "No" and "Could Not Test". By default all questions have the value "Could Not Test".
- In the video the user answers one question with "No" and a second one with "Yes" and leaves the other questions unanswered.
- After the user has completed the manual evaluation, she can save the results by clicking the button provided above the questionnaire. All the results of the evaluation are then saved in a remote database using services provided by the "Web Compliance API".
- To get a quick overview of the saved results, the user can open the "EASI Web Compliance API Editor". Using this editor, the user can get an overview of all past evaluations for projects where she is a project member. For each evaluation a list of the evaluation results is provided, including the outcome of the test (passed, failed, cannot tell, not applicable), the type of test (automatic or manual) and further information about the test. In the video the user selects the evaluation that she just performed and saved, and verifies that the two tests she answered were correctly saved in the database.
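The manual questionnaire described above can be pictured as a simple data model: each test is a question in a content group, with three possible answers and "Could Not Test" as the default. This is a hypothetical sketch for illustration only; the class and field names are not taken from EASI itself.

```python
from dataclasses import dataclass
from enum import Enum

class Answer(Enum):
    """The three possible answers shown in the questionnaire."""
    YES = "Yes"
    NO = "No"
    COULD_NOT_TEST = "Could Not Test"

@dataclass
class ManualTest:
    group: str      # content type the test applies to, e.g. "Forms"
    question: str   # the question presented to the user
    # Every question starts out as "Could Not Test" until the user answers it.
    answer: Answer = Answer.COULD_NOT_TEST

tests = [
    ManualTest(
        group="Forms",
        question=("Do you provide a checkbox in addition to a submit button "
                  "for users to confirm their responses before submitting them?"),
    ),
]
print(tests[0].answer.value)  # → Could Not Test
```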
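Saving the answered questions to the remote database could look roughly like the following sketch, which serializes the results as JSON before they would be sent to the service. The payload shape, field names, and project identifier are all assumptions for illustration; the actual "Web Compliance API" schema is not shown in the video.

```python
import json

# Hypothetical payload mirroring the video: one question answered "No",
# one answered "Yes", the rest left at "Could Not Test" (not transmitted here).
payload = {
    "project": "local-easi-demo",       # assumed project identifier
    "evaluation_type": "manual",
    "results": [
        {"question_id": "forms-01", "answer": "No"},
        {"question_id": "forms-02", "answer": "Yes"},
    ],
}

# Serialize the results as they might be submitted to the remote service.
body = json.dumps(payload)
print(body)
```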
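The editor's result listing combines an outcome (passed, failed, cannot tell, not applicable) with a test type (automatic or manual) for each entry. A minimal sketch of that view, with made-up example records, might filter out just the manually answered tests the way the user checks them in the video:

```python
from enum import Enum

class Outcome(Enum):
    """Possible outcomes listed per evaluation result."""
    PASSED = "passed"
    FAILED = "failed"
    CANNOT_TELL = "cannot tell"
    NOT_APPLICABLE = "not applicable"

class TestType(Enum):
    """Whether the result came from an automatic or a manual test."""
    AUTOMATIC = "automatic"
    MANUAL = "manual"

# Illustrative records: two manual answers (as in the video) plus one
# automatic result from an earlier evaluation run.
results = [
    {"type": TestType.MANUAL, "outcome": Outcome.FAILED},
    {"type": TestType.MANUAL, "outcome": Outcome.PASSED},
    {"type": TestType.AUTOMATIC, "outcome": Outcome.PASSED},
]

# Keep only the manually answered tests, as the user verifies in the editor.
manual = [r for r in results if r["type"] is TestType.MANUAL]
print(len(manual))  # → 2
```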
Note: this video is part of a more complex use case which includes three videos: