Is it Now Acceptable to Require JavaScript?

In this age of HTML5, CSS3, and anti-Flash it seems as though we may be slipping away from our roots. Or are we?

Back when standards were standards, if you were building anything that didn’t have a fallback plan in place for a lack of JavaScript, you were doing it wrong. Yes, it took more time and it took better planning, but that’s the point. When you’re a professional, you’re supposed to be doing it right, right?

The rise of (my) Web standards

Back when JavaScript was reborn, when its use began transforming into what we know and love today, the rules were still being written. I remember thinking about how I should approach learning this skill I knew would soon be essential. My first stop when teaching myself something is Google. Of course, back in the day, Google was still polluted with DHTML tutorial sites and wretched implementations. After reading countless tutorials composed mostly of “copy and paste this snippet here and that snippet there,” I knew it was time to hit the books.

It took about four seconds of poking around to realize that, for me, there was more to JavaScript than met the eye. I needed to learn about the DOM before I tried to manipulate it. If there’s one thing that drives me batty, it’s the notion of ‘learning the framework instead of the language,’ and this is no exception. JavaScript, though, has a completely new layer to work with.

I won’t detail here the issues I have with learning a framework or platform as opposed to the language itself, but they apply to every programming language, every markup language, CSS, and JavaScript as well. To sum it up: learning the framework gives the language a bad name. We’ll leave it at that and save the rest for another article.

I grabbed myself a copy of DOM Scripting by Jeremy Keith and to this day I’m glad I did. DOM Scripting was instantaneously followed by Bulletproof Ajax, also published by Mr. Keith. If I had to choose two books as a suggestion to someone looking to learn proper JavaScript I think these two are it. There are of course subsequent, more advanced books that I’d also suggest, but these two works will help you to realize which blog posts are junk and which are gold.

The theory behind writing JavaScript, as I understood it, can be summed up by DOM Scripting:

Separate behavior from structure using unobtrusive JavaScript. Add dynamic effects with progressive enhancement. Ensure backwards-compatibility through graceful degradation.

I lived by those rules. I still do. But am I stuck in the past while everyone is moving forward?
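Those three principles translate into a familiar pattern: start with markup that works on its own, then layer behavior on top only when the browser can support it. A minimal sketch of the idea, using hypothetical markup (a plain list of in-page tab links under `ul.tabs` that works fine as ordinary anchors without script):

```javascript
// Unobtrusive, progressively enhanced tab widget (sketch).
// The markup and class names here are hypothetical, not from the article.

// Feature detection: only enhance if the DOM methods we need exist.
function canEnhance(doc) {
  return !!(doc && doc.querySelectorAll && doc.addEventListener);
}

function enhanceTabs(doc) {
  // Graceful degradation: in an incapable browser, the plain HTML
  // anchors keep working and we touch nothing.
  if (!canEnhance(doc)) return false;
  var links = doc.querySelectorAll('ul.tabs a');
  for (var i = 0; i < links.length; i++) {
    links[i].addEventListener('click', function (e) {
      e.preventDefault(); // behavior lives here, not in the markup
      // ...show the target panel instead of jumping to the anchor...
    });
  }
  return true;
}
```

The behavior is attached from a script file rather than inline attributes (unobtrusive), it is only added when the environment supports it (progressive enhancement), and its absence leaves a fully working page (graceful degradation).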

The new Web

Has building a proper Web stack become old school? What I’ve been curious about lately is the trend of requiring JavaScript for a Web app, something we now hardly blink at. MobileMe has just recently relaunched with a new look:

Screenshot of the MobileMe login screen with JavaScript enabled

Looks really great, but if you hit the page with a JavaScript-incapable browser it’s a bit different:

Screenshot of the MobileMe login screen with JavaScript disabled

Back in the day we’d all have scoffed at such a thing, comparing it to a “This site looks best in IE6” badge straight out of the 1990s. What’s changed since then? Why is this now an acceptable practice? I understand that Apple is a progressive company, bringing the Web in its best form to the largest population possible, and making it look great, but that’s my point. Is this new school of thought going to transform itself into common practice?

The entire point behind breaking away from closed platforms such as Flash (yes, it’s a closed platform no matter how much marketing gets put behind it) is to build a universally accessible Web. Devices are advancing, yes. Browsers are advancing, yes. But does that give us the liberty to put the fruits of our labor on a back burner now that we’ve reached some sort of plateau in the evolution of the browser?

I’m not dogging Apple

Apple isn’t the only one producing Web applications that show similar messages if you’re using a JavaScript-incapable browser. In fact, other, more popular platforms are handling it even worse:

Screenshot of Google Docs with JavaScript disabled

There was a time, if I remember correctly, when Google Docs gave a notification similar to MobileMe’s, as opposed to loading a non-working pseudo-interface that does nothing but confuse a visitor. Gmail still retains its fully functional JavaScript-less implementation, albeit behind a quick message notifying you of what you’re in for:

Screenshot of Gmail's notification of JavaScript being disabled

Depending on your choice, you can end up at a really well put together version of Gmail:

Screenshot of Gmail's HTML only version

To me, Gmail remains one of the best-implemented modern Web applications because of this very attribute.

Where are we headed, really?

I’ve tried to wrap my head around these poorly implemented Web applications to find out the real inspiration behind them. Are companies rushing JavaScript-dependent platforms out the door simply to get things live before the competition? Are metrics showing that supporting JavaScript-less visitors simply doesn’t make fiscal sense? Are we at a point where leading Web companies care more about dollars and cents than users?

What about the Web applications themselves? We’re working with the richest implementations of JavaScript we’ve ever seen. Many times it doesn’t even make sense to offer a degraded version of an application, simply because the desired feature is built with JavaScript from the ground up; there’s nothing else to show. This is not a bad thing. I’m concerned in particular about those applications that could in fact have a decent degraded version. Does it change the opinion we have of the modern Web, though? I’m speaking from a front-end developer’s point of view here, a conversation between us professionals, not as users.

Do we need to move beyond this self-imposed requirement of providing a gracefully degraded version of our application? If so, would it not be a (short) matter of time before that school of thought trickles all the way down to the Coda Slider we plan to implement? In essence, what’s the difference? We’d all be thrilled if we could just make that AJAX request and call it a day without having to first build an alternate version, but is it the right thing to do as professionals?
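For reference, the alternative we’d be abandoning is the classic “hijack the link” approach: the request works as a normal page load first, and script intercepts it where available. A sketch, where the `a.hijax` class, the `#content` container, and the `partial=1` server convention are all hypothetical names for illustration:

```javascript
// Hijax sketch: links work without JavaScript (full page load);
// with JavaScript we intercept them and fetch a fragment instead.
// Class names, IDs, and the 'partial=1' parameter are assumptions.

function fragmentUrl(href) {
  // Ask the server for just the fragment rather than the whole page.
  return href + (href.indexOf('?') === -1 ? '?' : '&') + 'partial=1';
}

function hijackLinks(doc) {
  if (!doc || !doc.addEventListener) return false; // degrade gracefully
  doc.addEventListener('click', function (e) {
    var link = e.target.closest && e.target.closest('a.hijax');
    if (!link) return; // not one of ours: let the browser navigate
    e.preventDefault();
    var xhr = new XMLHttpRequest();
    xhr.open('GET', fragmentUrl(link.href));
    xhr.onload = function () {
      doc.getElementById('content').innerHTML = xhr.responseText;
    };
    xhr.send();
  });
  return true;
}
```

The point is that the AJAX version is a layer over a working request, not the only way the feature exists. Skipping straight to the XMLHttpRequest call is exactly the shortcut in question.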

The groundwork has been laid by some of the best thinkers in our industry, and neglecting to build a proper stack, to me, pushes all of that hard work aside in favor of (too) rapid deployment. If your metrics show a 99% JavaScript enabled audience, are you willing to forsake that 1%?

I’m not only concerned about that 1%; I’m (perhaps) more concerned about how it affects the overall implementation. Working from your degraded version is going to result in a much more stable environment upon which to build your behavioral layer. Skipping that valuable step can, and probably will, result in a less structurally sound document.

I have a tendency to remain loyal to the influential circumstances that have shaped me as a professional, but I’m curious how (if?) these events are affecting other designers and developers. Do you continue to be curious about degraded versions of modern Web applications? I could be way off base in even thinking about things at this level, and if that’s the case, by all means call me out on it, but there’s something under my skin about what’s going on. Thoughts?