There have been reports in the UK press of plans to reduce the speed limit on rural roads from 60 to 50 miles per hour (96 to 80 km/h). The main argument, of course, is to improve road safety, but there is also an argument that speed limits on their own do not necessarily lead to safer drivers:
- a speed limit may imply that driving at, or just below, that limit automatically means ‘safe’.
- a safe driving speed depends on context – weather, time, road condition, surrounding environment, visibility, to name but a few factors.
The reality is that:
- if I break the speed limit, then I risk being caught, and either fined or charged with dangerous driving. So I try(!) not to break the speed limit.
- if I stick to the speed limit on an urban or semi-urban dual carriageway where there are two lanes and good visibility, but a 40mph speed limit, I can guarantee that at some point I’ll be overtaken by drivers who are breaking the law yet who are overwhelmingly unlikely to be punished for it. (There are several roads round Dundee like this; the other day I was passed by a speeding bus, on the back of which was an advert for a road safety initiative illustrated by a large picture of a speed camera.)
So there’s a penalty for speeding, but it doesn’t stop people breaking the law, especially when they see little reason for keeping to it. I might get a bit annoyed, but so long as those other drivers are driving safely, is it really a big deal?
Which brings me to…the reawakened argument over web page validation and accessibility.
A post by Jeff Atwood questioning the merits of HTML code validity has caused a lot of debate amongst web standardistas, and on the GAWDS discussion list. Jeff’s argument revolves around the lack of code validity of the home pages of high-traffic sites like Google. He concludes by encouraging developers and designers to aspire to validation, but suggests that few will notice the difference if it is achieved.
The dominant reaction on GAWDS is one that often surfaces in this debate – accusing the organisations concerned of a lack of concern for accessibility. That includes the employer of accessibility pioneers like T.V. Raman, a company with a home page that a very, very large number of people visit and use (including disabled people) without undue problem. Wouldn’t we have heard about it if it were not so?
Large tech companies have a history of embarrassing themselves by undermining their own accessibility R&D expertise with corporate blunders (see Google Chrome or IBM’s Sydney Olympics site). But as Matthew Pennell points out on the GAWDS list, there are most likely compelling commercial reasons why Google limits the amount of code on its home page (and has no DTD), so as to reduce download times as far as possible. I don’t have experience of working for a large tech company, so I’m in no position to question their motives for failing to achieve validation across the board. But, based on the validation errors found, Matthew challenges anyone to prove that these errors on their own lead to exclusion for disabled people. I’d also like to see that evidence.
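To make the trade-off concrete, here is a hypothetical sketch – not Google’s actual markup – of the kind of byte-trimmed home page being discussed. It omits the DOCTYPE and attribute quotes, so it fails validation, yet the search field is still properly labelled for assistive technology:

```html
<!-- Hypothetical illustration of a minimal search home page.
     No DOCTYPE, no quoted attributes: invalid, but every byte saved
     matters at this scale, and the form is still accessible because
     the label is explicitly associated with the input. -->
<html><body>
<form action=/search>
<label for=q>Search</label>
<input id=q name=q>
<input type=submit value=Go>
</form>
</body></html>
```

None of the validation errors here removes information a screen reader relies on; that is the distinction Matthew is drawing between invalid code and inaccessible code.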
We had this debate in the drafting of WCAG 2 over the role of validity within the guidelines. It also reminds me of the reaction to the DRC’s survey of web accessibility conducted in the UK in 2004. The survey carried out automated testing, expert testing and user testing with disabled people. It found that some sites which did not conform to WCAG 1.0 could nevertheless be used for their intended purpose by the disabled evaluators. Instead of a debate over the validity of the guidelines, some people questioned the validity of the research. There seemed to be a preference for guideline conformance over the experiences of disabled people as evidence of accessibility.
For me, the sensible position to take is that aspiring to have as few validation errors as possible is an excellent design principle. But let’s not kid ourselves that validity means universal accessibility for disabled people, or that non-validity means inaccessibility, or illegality (as one comment on Jeff Atwood’s blog appeared to assert). As accessibility advocates, we do our cause no favours by haranguing people for behaviour we object to based on points of principle rather than solid evidence. Mike Davies (@isofarro), who has criticised this approach for years, has recently made a similarly scathing attack on CSS evangelism, and in doing so identifies examples of where informed use of invalid code can enhance accessibility.
In both cases, the standards-based approaches are the way to go – as far as possible. But in the messy world of the Web, this may – for valid reasons – not be all the way.
Just as that 50mph speed limit on an open, straight stretch of rural road does no more to make it safe than the 30mph limit makes the city road outside a school safe at 3pm, when it’s raining, your wiper blade has stopped working properly, you’re tired, and you’re stressed by the white van in your rear-view mirror.