When working with Accessibility Guidelines, algorithms only go so far
A cautionary tale of relying too heavily on algorithms to test your design's accessibility
Author: Kris Jeary
Published: 2nd May 2015
Over the last couple of months we have been working with a local council to design their new website. We can't talk about which council or show any work yet, as we are still in pre-release. It has been a great project to work on and very challenging at times: designing an interface that will work for such a wide range of people.
The only way to deliver on the project goals was to put accessibility at the heart of every decision we made with our client. We are fortunate to have a wide range of programs that let us test the site against accessibility criteria. We tested, we iterated, and after a lot of hard work we had a set of files that met AAA accessibility guidelines, the highest conformance level awarded.
We rejected creating an accessibility toolbar on the site; we wanted to rely on the tools people actually use. In my opinion an accessibility toolbar is a massive waste of time and can actually interfere with the assistive tools people already rely on. With that decision made, we had to ensure our solution worked in harmony with these tools. We tested our design with screen readers to ensure that content was read aloud in a logical order, and we tested the design for contrast. We thought we had been really thorough, and in truth we had been, as much as is possible without a first-hand understanding of how the internet works for those with a disability.
What happened next carried with it a very important lesson.
Once we had finished slapping ourselves on the back we had an opportunity to test our work with a group of blind and visually impaired volunteers, a group brought together by the Kent Association for the Blind (KAB).
We arrived at the KAB offices full of excitement and got chatting with our volunteers, discussing how they deal with the web and the tools they use to do so. The variety of tools and approaches was quite remarkable. Some were using screen readers, some magnification, others relied more on colour inversion. One of our volunteers used a screen-to-braille device, a quite remarkable piece of equipment that read the on-screen text and output it to a portable braille display.
What resulted from the testing was very encouraging: we received a fair amount of praise for the accessibility of our work. But what became apparent was that testing a website with a computer algorithm will only get you so far.
One key component of the new website was a large header area. Embedded within it is a search box that asks a simple question of the visitor: "What do you want to do?". The search box provides smart results based on the query; the aim is to encourage visitors to use this first and foremost, as opposed to displaying a list of categories or frequent tasks (these are also included, but less prominently). We used a fixed header which shrinks when the user scrolls. This was working wonderfully, appropriately scaled for the device and the screen real estate available. That is, it worked until one of our partially sighted volunteers zoomed in on a touch device to make the text visible. When that happened, the fixed header took up the entire screen and overlapped the content beneath.
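One common mitigation for this class of problem can be sketched in CSS (the class name here is hypothetical; the article doesn't show the actual markup). Browser zoom shrinks the viewport as measured in CSS pixels, so a max-height media query can switch the header from fixed to static once vertical space runs short:

```css
/* Default: a fixed header pinned to the top of the viewport. */
.site-header {
  position: fixed;
  top: 0;
  width: 100%;
}

/* When browser zoom (or a short screen) leaves little vertical room,
   let the header scroll with the page instead of covering the content. */
@media (max-height: 25em) {
  .site-header {
    position: static;
  }
}
```

Note the caveat, which rather proves this article's point: pinch-to-zoom on touch devices does not trigger media queries, so a sketch like this would not necessarily have caught the exact failure our volunteer found. Only testing on a real device with a real user did.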
We also ensured in our early testing that the contrast ratios of all components were correct and, again, met the AAA requirement. The main content area was dark text on a light background, and the footer area was set the other way around, white text on a dark background. This decision was put to the test when a volunteer showed us that the first thing they do when viewing dark text on a light background is invert the colours; in our demonstration, of course, they then had to switch the inversion off when they reached the footer area.
Without this user testing we would still have delivered a website that met AAA guidelines, but one that still had important flaws.
The lesson from this tale is not to rely solely on what a computer algorithm tells you. It would be easy to congratulate yourself on meeting a set of criteria, but they are guidelines only; nothing beats real user testing and in-depth conversations with the people who will be interacting with your work.
There are many blind charities across the country, many of which (I'm sure) would be delighted to arrange an opportunity for you to test your work with them and to help raise the accessibility bar for everyone. Be thankful for this help; it is always good practice to offer to pay for the volunteers' time or to make a donation to the charity.
Happy coding and thanks for reading.