GIANT ROBOTS SMASHING INTO OTHER GIANT ROBOTS

Written by thoughtbot

Once Bitten Twice Shy

As something of an obsessive perfectionist about certain technical matters, I found one of the conclusions of Mike Davidson’s Lessons From The Roundabout SEO Test interesting.

After doing a borderline legitimate (and certainly interesting!) analysis of why he’s the number 5 “Mike” on Google, Davidson concludes:

The findings do support my initial suspicions about web standards as they relate to SEO though: that they matter about as much as a cheap umbrella in a hailstorm. That is to say: kind of. Developers should write clean, semantic code as a matter of professionalism rather than search engine optimization.

That last part is what stuck with me, because it’s something I’ve had in the back of my head for a while. Having the leanest, most semantic markup isn’t going to save the world. Having tight, clean CSS isn’t going to save the dolphins. Validating isn’t going to save the giant panda. And so on.

There are legitimate business and usability reasons to do each of these things, but there is also a deeper sense of “being correct” which, I think, is genuinely valuable on projects with multiple contributors. When I get handed a collection of HTML and CSS that serve as views in an application, and it’s my job to hook them up to a database, I feel good about working on them if the HTML and CSS are put together logically, the way I would have done it myself.
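To make that concrete, here is a minimal sketch of the difference between markup that merely renders and markup that communicates its structure (the content and class names are invented for illustration):

```html
<!-- Presentational "div soup": the structure is invisible
     to the next developer and to any tool reading the markup -->
<div class="hdr">Recent Articles</div>
<div class="item">Once Bitten Twice Shy</div>
<div class="item">Lessons From The Roundabout SEO Test</div>

<!-- Semantic equivalent: headings and lists say what the content IS,
     so styling and behavior can hang off meaningful hooks -->
<h2>Recent Articles</h2>
<ul>
  <li>Once Bitten Twice Shy</li>
  <li>Lessons From The Roundabout SEO Test</li>
</ul>
```

Both versions can be styled to look identical; the second one is simply honest about what it contains, which is exactly what makes it pleasant to inherit.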

So, on top of the benefits it provides to customers, it’s a professional courtesy. You look at someone’s markup and stylesheets and you can say, “OK, this is the sort of person who thinks like me and whom I want to work with.”

In fact, next time I hire a designer I might just ask for some markup and skip the resume.