Blocking JavaScript seems like a good idea

My fancy new website is broken due to a script blocking extension. I found this out when I showed a friend the Radial Blitz website. He complained that something was lacking. He turned on screen sharing to show me and sure enough my large screenshot slider was nowhere to be seen.

Turns out a script blocking extension was causing the problem. Obviously I didn’t expect the slide show to function, but I expected at least a static image. The theme designer likely never even considered that somebody would have JavaScript disabled. Should they have, or is JavaScript a critical web technology?

I don’t think it’s useful to argue for JavaScript; its features are well known and I find it invaluable. Instead I’d like to consider why somebody would disable it. To my own chagrin, my list seems to support their position.

Garbage and bloat

A reason to dislike JavaScript is the unnecessary bloat it causes. It makes pages take longer to download, often causes them to render more slowly, and often provides no benefit to the user, or even to the developer.

I’ll take Twitter here as an example. I have the standard follow button on my blog (well, had, since I’ve replaced it now). This is a button which you can simply click to follow me on Twitter. With the button in place I saw a lot of data being loaded: ~100K of JS and ~150K of HTML. Half of that HTML looks like a defect, as it appears to be the same resource loaded twice.

That’s 250K of data to transfer just to have a follow button!

I know that Facebook has a similar overload problem with their button. On one site I had to disable the Facebook widget because it was taking too long to load. These two simple social buttons were the primary reason the site was loading slowly. No wonder people want to disable this crap.

Tracking
A lot of the JavaScript loaded for a page isn’t visible to the user at all. It merely tracks the user for third-party services, like Google Analytics or Chart Beat. This information is valuable to the site owners, but let’s be honest, it is mostly of no value to our users.

On my own projects I’ve noticed that these external requests slow down loading. I’ve seen the tracking services respond slowly, or not respond at all, including Google’s.

What bothers me the most is that it should be totally unnecessary. This is something that could be done entirely server-side. There’s no reason why my server couldn’t talk to Google Analytics and Chart Beat — aside from it not being an option in those services.
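To make the idea concrete, here is a minimal sketch of what server-side tracking could look like. This is hypothetical code, not an actual Google Analytics or Chart Beat API: the point is only that the server already has everything needed to record a page view, without the browser running any third-party script.

```javascript
// Hypothetical sketch: server-side page-view tracking. Instead of the
// browser loading a tracking script, the server records each request
// itself; the record could then be forwarded to an analytics backend.

function recordPageview(req) {
  // Build an analytics record from data the server already has.
  return {
    path: req.url,
    referrer: req.headers["referer"] || "",
    userAgent: req.headers["user-agent"] || "",
    timestamp: new Date().toISOString(),
  };
}
```

Nothing here reaches the client, so there is no script to block and no extra request slowing down the page.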

As a user, having slower page performance just so an advertiser can create a profile of me doesn’t sound appealing. Disabling JavaScript at least partially solves that problem (though an ad-blocker is also effective here).

Nefarious injection

Some sites are just downright tacky with their ad systems. Sometimes you have a hovering popup, other times a wacky animation. Clever use of JavaScript can often defeat the user’s ad-blocker and display some poorly targeted ad to the user.

I recently read that some ISPs are also injecting ads into sites. It’s essentially a man-in-the-middle attack which hijacks a webpage and adds some JavaScript to the end. Site owners aren’t even aware of this issue. It is a point against JavaScript, but of course these corrupt ISPs could equally well attach HTML advertisements if they wanted.

The proper solution here is of course for everybody to start using HTTPS.

Security
Browser vendors have done a terrible job of providing website security. Exploits using JavaScript are very common. Improved security is one of the key promoted features for the script blockers I’ve looked at.

The current approach to site security is to continually bolt on ad hoc rules, restrictions, exceptions, and other nonsense. Clearly this will never work, and it’s just making it harder to write websites. CORS is a good example of this nonsense; I mentioned it in a previous article.

Until vendors provide a properly designed sandbox model, the security problems are not going away. It’s not that it’d be very hard to build this model; it just happens to break the web. Well, it really only breaks the web for advertising and tracking networks, so our users wouldn’t likely be too upset.

Or, you could just disable JavaScript. It’s overkill, but it’s the safest option.

Disable it?

When I started writing this article I was firmly convinced that disabling JavaScript was not a good idea. It is a critical web technology and a lot of standard features just can’t work without it. I was going to be fair though, and decided to think of the reasons why somebody would want to disable JavaScript. Unfortunately, reading my list now, disabling JavaScript doesn’t seem like such a silly concept anymore.

Disabling JavaScript removes the vast majority of crud polluting pages and hurting privacy. I can totally sympathise with users that want to disable it.

As a website designer it’s a big problem for me though. HTML and CSS alone are simply not adequate for a lot of basic presentation. Used correctly, simple interactive features can provide a huge benefit to the user experience: hiding and revealing information, interactive forms that guide the user, or voting and rating buttons for articles. I don’t imagine I’d much enjoy a web without these features.
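The hide-and-reveal case is also where graceful degradation is cheapest. A rough sketch, with hypothetical element ids: keep the content in the HTML regardless, and let the script do nothing more than toggle visibility, so a user with JavaScript disabled still sees everything.

```javascript
// Sketch of progressive "hide and reveal": the content exists in the
// HTML either way, and this script only toggles its visibility. With
// JavaScript disabled the section simply stays visible, so nothing is
// lost. Element ids below are hypothetical.

function toggleSection(el) {
  // `hidden` mirrors the HTML `hidden` attribute on real DOM elements.
  el.hidden = !el.hidden;
  return el.hidden;
}

// Browser wire-up (commented out so the sketch stays self-contained):
// document.getElementById("more-info-toggle").addEventListener("click",
//   () => toggleSection(document.getElementById("more-info")));
```

The same pattern applies to forms and rating widgets: start from markup that works on its own, then layer the interactivity on top.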

I do consider the problems I listed to be significant. But I don’t consider disabling JavaScript to be the correct solution.

11 replies

  1. Your sites should always be able to present their information with JavaScript turned off. The moment you require JavaScript just to view the basic site content in a basic view, you’ve broken the rule.

    Now, JavaScript might add fancy features and so forth, but the basic underlying function of the site should always work even with JavaScript turned off.

  2. >This information is valuable to the site owners, but let’s be honest, it is mostly of no value to our users.

    That’s some short thinking.

    That information is valuable to me, as a developer, which in turn allows me to make the website better for you, the user.

    • I said mostly to accommodate the few designers who are actually using tracking tools to improve their site for users. Most “improvements” I’ve encountered however have been to improve click-through, sign-up, turnaround, or other business-value aspects that aren’t truly of value to the user.

      I do agree that tracking user experience and improving the site is a good goal, but I just haven’t seen this done in practice. That is, not by using automated tracking programs. I’ve seen it done a lot with explicit user testing.

  3. Given existing precedent of separating content from presentation, that’s the wrong tree to bark at.

    As the number of hoops to jump through for the site to “degrade” nicely is extreme, costs rise.

    • Part of the issue is that I feel HTML + CSS has failed to provide a complete solution. There are too many limitations in CSS to even come up with proper layouts, so people have turned to JS. A lot of JavaScript is not intended to add behaviour, nor do people want to use it to add content; it’s just that the other features are lacking and they are forced to use it.

  4. Sorry but something about this article really bugged me after I found it on Reddit. There are some assumptions made here that I personally find don’t hold true, and that potentially have the same effect that older versions of IE had on holding the web back.

    Going point by point


    You might be pulling down some extra files in order to render your page, but you need to take into account two factors: the first being caching, the second being the alternative of server-side rendering.

    If you are regularly using CDN sources for JS files then over time browsers will cache those. If you go to one site using the same version of jQuery as another, pulled from the same CDN source, then you can load that resource from cache on the second site. Also, we have recently seen a boom of Single Page Apps, or SPAs, that load up a single HTML page and then change states through JS depending on user actions. Effectively, in this case your users load your JS files once, and then any time they change a page or use a function they don’t have to constantly reload or re-render the whole page.

    If we were to ditch all of the functionality that JS gives us in the front end then we would obviously have to deliver it through the backend, which means more processing on the server. In that case the extra time you spend downloading a JS asset is likely taken up instead by your server processing and assembling an HTML page to send back to you. On top of that, take into account that the device you are viewing the site on is likely considerably more powerful than the server being used to assemble the page, so purely from a resource point of view it’s considerably more efficient to send a bunch of JS files and some JSON than it is to assemble a full page in PHP etc. and then send it to the browser.


    While we may consider tracking to be of considerably more use to the site owner than the visitor, in reality it’s probably the other way round. Google Analytics data in the right hands has the power to shape and improve a user’s visits to your site. For site builders who actually analyse their GA data it’s a driving force in shaping what content and functionality is developed and how it’s delivered. If you think about it, the alternative is hitting users constantly with feedback surveys, or just plain guesswork, in order to work out how our users are interacting with our websites and apps and whether they are delivering desired outcomes.


    Yes, JS has been an issue in the past, but with the current crop of browsers it is significantly less of one, and using anti-virus, malware detection, HTTPS and tools like WOT significantly reduces the security issues that JS has posed in the past.

    Disable it?

    Want to be able to use the majority of sites and web apps in the future?

    • Bloat isn’t just about bandwidth, it’s about memory usage and speed on the resulting page. Caching can only address the bandwidth part. Using common CDNs for resources is also an avenue for undesired tracking of users. It gives the CDN a complete record of where a user has been online.

      While I agree tracking can be used for good, I have not seen this much in practice. The standard use is not to improve general UX, but to improve conversion. But, as I said, there is no reason why tracking can’t be done server-side, without passing data off to a third party.

      I don’t believe the issue with security is any better. You’re talking exactly about the ad hoc approaches I’ve mentioned in the article. Until a comprehensive solution is created the security will always be weak.

  5. Well, I’m with you; HTML and CSS alone don’t provide the tools I want to present the web pages I want to present. (And I rather like JavaScript. Until I fell in love with Python, I used to recommend JS as a good first language.)

    I think your objections are valid, but don’t really apply to JavaScript. They’re certainly solved by turning JS off, but it’s not JavaScript’s fault that buttons have either crap code or so many bells and whistles they become noticeable, let alone a problem. I agree, that stuff sucks, but it’s the designer’s fault, not JavaScript’s.

    And tracking can be accomplished by cookies and even by images. (The image URL can contain tracking info.) And in any event, the problem is spammers and advertisers, not JavaScript.

    Without JS, there’s no AJAX, so no feedback in search boxes or other nice features like that. And JS is the only way to expose and hide parts of a page without reloading it. Without JS, there’s no DHTML!!

    • I agree it isn’t the fault of JavaScript, but getting rid of JavaScript is actually a very effective remedy at the moment. In the same sense not driving a car rids you of all those crappy drivers on the road. It is an effective remedy, it just has a huge downside to it.

      I believe that a proper browser sandbox could fix the key problems of security and privacy. So far though the browser vendors have not shown any real interest in making the web safer and more secure. It’s still just a bunch of patchwork.

  6. Being happy user of NoScript, I am pleased to see that some site developers put some thought on the topic. And it is funny how much butthurt you generated with this post.

  7. REALLY!? It feels like the times when a microwave oven was invented…

    JS is definitely the way to go for any modern, responsive site. It’s like choosing not to install any apps on your desktop or smartphone because there are binaries out there that are trojans.

    JS is an essential tool for making web2 work and visitors actually love it when done right. Just remember, not all sites are text-centric where disabling JS would still render them useful, easy or fun to use.

    I don’t think anybody disagrees that there are sites that abuse visitor trust and privacy, and ideally there would be a method of flagging them. Perhaps a rating system/plugin directly integrated into browsers would be of great help here. That’s what we should concentrate on, not blocking standards-based improvements aimed at bringing liveliness into the web, now that Adobe Flash is (finally) going away.
