Who cares about validation?

14 October 2004

20 comments


Mike's rather altruistic article about his efforts on the re-design of ABC News sparked Ethan's commentary about validation, which set Keith off about Web Standards zealotry and 73 comments ensued with little resolution. Who knows what other carnage was wrought on weblogs I didn't have the time to read. Essentially it all boils down to this:

Can you have Web Standards without validation?

Quite obviously, the term Web Standards means different things to different people. Recently, we have seen the popularisation of this term, which previously only members of the W3C ever thought about, let alone uttered. And as with all things that become popular, the masses have a way of morphing it into something they can use best – subtracting bits, adding bits and changing bits, until the only part that's recognisable from the original is the name.

Because of the open – some would say easy – nature of web development, you get a more varied range of people plying their trade in HTML than you do in other areas of programming: marketing staff, graphic designers, coders, lecturers, tech support staff, babysitters, parents, children and web professionals. Most of these people don't care how their code looks – or even how their pages look – so it's no surprise that when some of them become interested in "Web Standards", it's not because they like encoding all their ampersands.

The enduring benefits of "Web Standards" are smaller file sizes, user customisability, easier maintainability and better accessibility. (If you want a comprehensive list, there are several places you can look.) All four of those points are more related to the separation of content and style than to the validation of code. You should be able to understand, then, the tendency for the popular definition of "Web Standards" to mean semantic code and CSS, with valid code hanging somewhere off the end.

Compliance with semantic code and CSS has visible benefits – benefits measurable in time and in money. At the moment, any validation errors that don't affect those visible benefits are invisible to 99% of the population; the other 1% being the self-congratulatory Standards adherents who decry Mike Davidson's use of the phrase "Web Standards" in relation to "ABC News".

I personally like my code valid – it gives me a nice warm fuzzy feeling when I get up in the morning – but I also don't freak out when I accidentally post a URL on my site which has an unencoded ampersand in it. Why? Because it has no effect whatsoever. I also don't freak out when I find that someone else's site doesn't validate because they accidentally posted a URL on their site which has an unencoded ampersand in it. Why? Because I don't know why they have that ampersand there. It might be because in order to change that ampersand they have to reconfigure their entire CMS, which would entail a change to their quality system, requiring the re-training of over 200 staff, which would affect their inventory output for the month of October, causing them to lose their biggest client and sending the company into financial collapse. And what for? An ampersand that has no effect whatsoever.
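The ampersand situation above is easy to demonstrate. Here is a minimal sketch using Python's standard library (the URL and markup are made up for illustration): it shows the single round of escaping a validator wants, why browsers render the bare form anyway, and the double escaping that markup quoted inside an RSS feed needs – the exact wrinkle a couple of the comments below ran into.

```python
from html import escape, unescape

# A hypothetical URL with a bare ampersand: invalid inside an
# (X)HTML attribute, but harmless in practice.
url = "/news.php?section=tech&page=2"

# What the validator wants: the ampersand written as &amp;
encoded = escape(url)
assert encoded == "/news.php?section=tech&amp;page=2"

# Browsers error-correct the bare form, which is why nothing visibly
# breaks: unescaping the valid form recovers the same URL.
assert unescape(encoded) == url

# The feed wrinkle: markup quoted *inside* an RSS feed must be escaped
# twice, once to display it as a code sample and once for the feed's XML.
sample = '<div class="clear">'
once = escape(sample)    # what the page shows: &lt;div class=&quot;clear&quot;&gt;
twice = escape(once)     # what the feed must carry: &amp;lt;div ...
print(twice)
```

Lose one of those rounds of escaping – as feed aggregators and comment forms are prone to doing – and the sample degrades back into live markup, which is how a quoted `<div class="clear">` can pick up a stray stylesheet rule.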

In every other area of "Web Standards" we recognise the inherent imperfection of the system: CSS hacks for IE 5, <table cellspacing="0">, <div class="clear">, PNGs without transparency. These are parts of our technological environment which we cannot easily change, so we adapt to them, we loosen the rules. They mightn't be within a strict definition of Standards, but their detriment is minuscule compared to their benefits.

You might be clinging to that 1998 definition of what "Web Standards" are – hell, I like to spell colour with a "u" – but if you want good development to become popular, you also have to be willing to be flexible. Web Standards isn't just a set of rules anymore; it's a movement. It's about people wanting to change the way the Web works, for the better. But change is gradual – you can't have it all in one bite.


Comments

  1. 1/20

    Rob Mientjes commented on 14 October 2004 @ 05:44

    I'm not likely to hop onto the 'validation is required'-bandwagon, but it's all about personal choices, likes and dislikes. That's what I wrote in my article[1], and that's what I think. Web Standards are to be achieved step by step, and that's where the problem lies - many people think the switch is easy. It is not. Far from it. Please don't nag about an unencoded ampersand.

  2. 2/20

    Julian commented on 14 October 2004 @ 07:34

    "but I also don't freak out when I accidentally post a URL on my site which has an unencoded ampersand in it. Why? Because it has no effect whatsoever."

    You're using XHTML strict and you serve your pages as text/html. That's why it has no effect. Try "application/xhtml+xml" and it has one.

  3. 3/20

    Carey Evans commented on 14 October 2004 @ 07:50

    Ironically, this post is a bit messed up in your RSS feed because a couple of ampersands need to be escaped.

    For <div class="clear"> above, the < symbol is included in the RSS feed as <, but it needs to be escaped twice as &lt;. Bloglines has its own div.clear CSS rule, with a font-size of 1px, so the end of your post wasn't particularly easy to read.

  4. 4/20

    Carey Evans commented on 14 October 2004 @ 07:53

    I see that ampersands need to be escaped in your comments, too. I meant to say that < is included in your RSS as &lt;, but needs to be &amp;lt;.

  5. 5/20

    Luke Moulton commented on 14 October 2004 @ 08:25

    I too am tiring of the Web Standards priests pontificating perfect validation for every single page. The web has become a medium where the average Joe or Josephine can share information with the rest of the world with very little barrier to entry. It's not like in the past when you had to have a ham radio operator's licence and $4k worth of equipment (and a minority audience). This being the case, I think it's up to the web professionals to do the best they can (within the time/budget constraints they have) before letting websites and web tools loose on the rest of the population.

  6. 6/20

    Nathan commented on 14 October 2004 @ 12:00

    "I personally like my code valid – it gives me a nice warm fuzzy feeling when I get up in the morning"

    I agree with that; we all know validation is not mandatory (yet), so it really comes down to the individual level. If it makes you happy, go for it; if not, ignore it!

  7. 7/20

    Nick Finck commented on 14 October 2004 @ 14:00

    Keith's post and the comments that followed may have come to no solid conclusion, but there were some pretty good points made in the comments to my post about being <a href="http://www.digital-web.com/news/2004/10/stuck_up_on_validation/">Stuck up on Validation</a>. I don't think there is a true solution to this problem. People will be antagonists if that is the life they wish to lead and there is nothing to stop them... but I say here's to the optimist in all of us. I say we move forward, one little step at a time. We should show recognition for a good honest step in the right direction instead of crying foul when something triggers a few errors on the W3C Validator service.

  8. 8/20

    Ethan commented on 14 October 2004 @ 14:18

    Since writing my (rather ham-fisted) original post, I've been pointing folks at a comment I made on Keith's weblog that (I hope) clarifies my position a bit:

    http://www.7nights.com/asterisk/archive/2004/10/standards-equals-validation#comment18

    Yes, I absolutely believe that validation is a necessity -- technically speaking, it's a prerequisite to standards compliance. But of course, the realities of today's web frequently make it impossible. The separation of presentation from content is another component of standards compliance, and one to which Mike's work at ABC News and ESPN is a great testament.

    What my post (see "ham-fisted," above) was trying to get at wasn't the validation errors themselves, but rather to decry the idea that the "real world" and validation have to be diametrically opposed. Granted, as I said: more often than not, the former will make the latter impossible. But to say that one simply shouldn't "care" about working toward "100% validation" because of the issues we all face? That struck me as a bit discouraging, and unfair to the folks writing and teaching these standards: people in the "real world" that are actively working to unify the theory and practice of web standards.

    Oh, and yes: I'm rambling.

  9. 9/20

    Mike D. commented on 14 October 2004 @ 15:31

    Thanks Cameron. And I'll add to Nick's post that I think the discussion at Digital Web sums it up best... particularly Michael Almond's post:

    http://www.digital-web.com/news/2004/10/stuck_up_on_validation/#comment477

    Here's the thing. I care about what I can control. What I can control is seeing to it that coding standards are improved with every one of our redesigns. I can control what markup language we use. I can control what sort of scripting we employ. I can control whether we use CSS-P or tables. I can control whether we use smart, non-proprietary code.

    What I *can't* control is certain sorts of validation errors. I can request they be addressed, but in certain cases, this may take months or, yes, even years. I requested the removal of ampersands from our ad system during the ESPN redesign and it's just now being achieved.

    So why should "strict validation" be something I care about? If it can't be achieved because of environmental factors, why bother caring about it? Of *course* everyone working on our sites tries to keep their code clean. That's not the problem. The problem is that in a flexible publishing environment with an ever-changing group of contributors, things happen. I submit to you that it is our *duty* to be fault-tolerant in our environment. If something as seemingly insignificant as a poorly coded interactive ad can take our site down, that's an unacceptable vulnerability.

    I further submit that the W3C Validator would be a whole lot more useful if it sorted your pages not by how many errors they produce but by how many *types* of errors they produce. 1000 ampersands equals 1 error. 3 missing alt tags equals 1 error. 1 missing trailing slash equals 1 error.

    Wouldn't that be a lot more useful to people? I mean, if I know I have 1000 ampersands and I can't do anything about it, why make me wade through 2000 lines of errors just to get to the more reachable stuff?

    Why 2000, by the way? Because the W3C validator assigns two or more errors to each unencoded ampersand. Thanks. That's great. I really needed that.

    I think both a more human-friendly standards community and a more human-friendly validator would do more to eliminate faults than what is being done right now.

    Am I interested in eliminating faults? Yes. Am I going to beat people, or myself, up because some faults can't be eliminated? No way.

  10. 10/20

    The Man in Blue commented on 14 October 2004 @ 15:42

    Just wondering how much of your time over the past two days has been taken up by reading and responding to all this Mike? :o]

  11. 11/20

    Mike D. commented on 14 October 2004 @ 15:47

    Too much, Cameron... too much. Time better spent on squashing nagging display bugs to be honest, but I can do both. I have coffee.

  12. 12/20

    Unearthed Ruminator commented on 14 October 2004 @ 22:32

    I too go for the warm fuzzy of passing the various validation tests (both the W3C and Ask Cynthia). I used to have the logos and links and what not to the validators, but I stopped doing that on my sites. It really has nothing to do with the content, which is what I hope people are after (it does say something about the person behind the content, but that's another story).

  13. 13/20

    Mark Tranchant commented on 14 October 2004 @ 23:52

    Why not get it right? It's not that hard, and then you can use the validator to make sure you haven't missed anything.

    If you deliberately, or through negligence, write invalid code, you never know what effects it may have on browsers unknown to you. Debugging problems is also harder, as you can't eliminate the unknown effect of your little non-validating foibles. Unless you can churn through your browser's source code, you don't *know* what's happening.

  14. 14/20

    Mike D. commented on 15 October 2004 @ 02:00

    Mark,

    I don't disagree with you, but here's the thing: I can count on one hand how many times in the last year a validation error was the cause of problems on a particular layout I was working on. I comment my divs. I know when they are closed properly or not. 9 out of 10 times I run across a problem, it's due to a quirk in how different browsers interpret different CSS rules, so a focus on CSS (and hacks to CSS) is a thousand times more likely to find my problem quickly than the validator is.

    Also, one definitely shouldn't assume that just because a site is valid it won't fail in certain browsers. Safari, Firefox, and especially IEvil all interpret things just a bit differently, and actual testing is the only way to know for sure whether your sites will break. I think a more accurate statement would be that if you keep your site as simple as possible, it has a smaller chance of failing in unknown browsers. That is a piece of advice I'm definitely guilty of ignoring sometimes.

  15. 15/20

    Steven Champeon commented on 15 October 2004 @ 06:28

    It is pretty clear from the unending furor surrounding the issue of validation (and from the various sides taken over and over again by the same voices) that there needs to be some clarification of the terms "Web standards", "validation", and so forth.

    "Web standards" is a loose term of art referring to the various Recommendations issued by the W3C; the Web Standards project likes to refer to "baseline standards compliance" as a principle towards which browser vendors and authors alike should strive, but in different ways. Browsers should support, in as similar a fashion as possible, the HTML/XHTML standards, CSS (in its various incarnations), ECMAScript/JavaScript/JScript (in its various incarnations), the DOM (same), such that authors can build one site to that baseline and have it work in as many environments and for as wide an audience as possible.

    Whether browser vendors should introduce extensions, and whether authors should use them, is of little consequence. It is a personal, and business, decision whether the author chooses to avoid validation, cross-platform compatibility, and other mechanisms that tend towards ensuring a large audience and an inclusive design that allows one to benefit from the many strengths of today's powerful technologies.

    I believe that it is best for the future of the Web for there to be a strong argument for any browser vendor (in the form of a large body of standards-compliant documents and their authors' expectations regarding proper display, rendering and behavior) to continue to support the baseline we speak of.

    The furor nearly always erupts when a site is launched and claims some measure of "standards compliance", without conforming to many of what some consider the basic requirements of same: valid markup, valid stylesheets, and a reasonable attempt to follow the spirit of the law with respect to the strict separation of content/structure on the one hand (markup and its contents) and presentation (styles) and behavior (scripting and the DOM) on the other.

    It is clear to most professional Web designers and developers who do try to live up to the principle of "more valid sites, more browsers that respect standards", that when a site crows about their compliance, without actually observing that principle, they are undercutting the argument for future generations of browsers to bother supporting the baseline.

    It is also very (painfully, one might say) clear to many of the same professionals that the principle is observed more often in the breach, as many factors influence the success of a site; control over those factors varies widely. When Mike mocks the import of encoding ampersands, he's both correct (it currently has little impact) and at fault (if nobody lives up to the standards, then why should future browsers?) but it's his decision to make, and his battle to pitch.

    Standards advocates of all stripes recognize that in order to take advantage of the power of CSS and the DOM, the underlying markup must be clean and valid, and to take greatest advantage it must also be in some sense semantically meaningful, so that the stylesheets can latch onto that implied meaning and style the site consistently.

    What remains clouded during this whole debate is that modern browsers, having been largely born out of the insanity of the tag soup era, have extensive and fairly reliable error-correcting mechanisms. Try changing the HEAD in your document to BODY and vice versa, and look at the DOM tree in Safari's debug menu, or in Mozilla's DOM Inspector, and you'll see they've been auto-corrected before rendering.

    But this doesn't imply that the document is valid, rather it implies that we've become very reliant on browsers to fix our problems and failures for us.

    I remember a time when we'd done this before, back when Netscape Navigator accepted anchor HREFs and IMG SRC attributes missing their trailing quote character. Netscape tightened up their parser, and hundreds of thousands of pages "broke". The reality is that the pages were already broken, their authors responsible, and large chunks of the Web became unusable as a result.

    I think of validation as a way to avoid such nightmares. I should hope there are many of you who do, too.

  16. 16/20

    Mike D. commented on 15 October 2004 @ 08:20

    When Steve and I can agree on something, it *must* be right. I agree. And he's right.

  17. 17/20

    Anne commented on 16 October 2004 @ 17:21

    Page must be well-formed. http://annevankesteren.nl/archives/2004/09/business-failure

  18. 18/20

    Anne commented on 16 October 2004 @ 17:21

    Argh, s/Page/Pages/

  19. 19/20

    Richard@Home commented on 19 October 2004 @ 00:39

    I care about validation!

    Why? I'll tell you -> http://richardathome.no-ip.com/index.php?article_id=366

    (in a nutshell - if a page validates it shows the developer is concerned about the little things - they're a professional.)

  20. 20/20

    Keith commented on 19 October 2004 @ 13:23

    There are two things that bother me about this issue:

    1 -- The idea that validation is easy. I see this all the time and I hate to say it, people, but it really depends. There are a whole bunch of reasons why a page may be hard to keep valid. In my case it's comments. <a href="/asterisk/archive/2004/06/sustaining-validation">I spent an inordinate amount of time trying to rectify that</a> and it came down to a choice between usability and validation. I chose usability.

    As it stands right now, designing and building a Web site is about compromise and balance. There are times when validation gets the short end of the stick.

    If you want to lay blame, lay it on the browser and tool makers.

    2 -- The idea that someone is somehow more professional if their site validates. Again, it depends. I know many professionals who don't have a valid site.

    Look, the issue here, to me anyway, isn't one of validation. I agree with the idea that validation is important and I understand all the arguments. I agree with just about every point Steven and the WaSP make in principle.

    No, the issue is communication. For some reason, when these issues come up there are folks who take an all-or-nothing attitude toward it, and that, by its very nature, pits people who should be working together against each other.

    All I ask is that people spend less time nitpicking and more time praising and helping move people forward.

    Bring up the validation issue, but don't make it out to be easy, or disparage someone's professionalism.

    Positivity breeds positivity and in the end will bring more people on board. The exclusive, negative nitpicking just drives people out and turns them off.

  21. Leave your own comment

    Comments have been turned off on this entry to foil the demons from the lower pits of Spamzalot.

    If you've got some vitriol that just has to be spat, then contact me.
