
A log of Alex Maughan’s thoughts.
Mostly focused on design, with scratchings below the superficial.

26 October 2014

Weak vs Strong Progressive Enhancement

For some of us, progressive enhancement (PE) is a big thing. Once you understand PE you become quite evangelical about it – rightly or wrongly so. The “well, it depends” caveat is of course always admissible. I don’t expect the likes of proto.io or Google Docs Spreadsheets to function on a perfect, browser-capable layer cake. I’m not a fundamentalist idiot.

However, that said, I find myself having to argue for PE where it is perfectly feasible to implement. I’m not going to rehash all the reasons for PE now. Just one of those reasons is the focus of my attention here, because if PE is not implemented properly, that very reason can end up undermining the argument for it.

The most common counter-argument I get faced with is, “Who consciously chooses to disable JavaScript in their browser?” At this point, the person delivering their rhetorical question believes they have won the argument. A rising smugness takes hold of their face, while I internally start to lose my shit. Firstly, this counter-argument neglects to realise that PE is not only applied to JavaScript (JS). It is a holistic, layer cake approach to HTML, CSS, and JS. Secondly, the question is not whether a user actively chooses to disable JS; it is whether that JS has actually run. There are three common reasons why it might not have:

  1. The browser is still busy downloading it. On slow connections this can take quite a while, and a user may start trying to interact with your site before everything has loaded. It is amazing how often I’ve seen this first-hand while shoulder surfing someone on whom I’m doing a quick guerrilla user test.
  2. While downloading all the client-side code and assets required for a page, the user’s connection may drop for a short period, causing the browser to stop loading things and simply settle on what it has downloaded thus far. This can be hugely deceptive to the user, especially if all your HTML and CSS has loaded but not all your JS. The page will look in perfect working order but, if not progressively enhanced, won’t actually work. Most users won’t know they need to refresh the page. They’ll assume your site is broken and your bounce rate gets another boost.
  3. Somewhere at the top, or even in the middle, or even towards the end, of your spaghetti pile of JS, an error occurs in a browser unknown and unforeseen by the developers, permanently halting the single-threaded execution of all the noodles below it.

I live in South Africa, where even if you have the money to pay for the best data deal on your phone and have a fast landline installed at home or the office, the connection can suddenly become nightmarishly slow and regularly drops. It’s amusing that the same people who complain about the standard of ISPs in this country fail to see how this bad service and infrastructure should affect how they build the things accessed over such an untrustworthy connection. Where I come from, points 1 and 2 above should be sufficient to win the argument.

Point 3 often instigates a rebuttal around having proper testing mechanisms and processes in place before deploying anything to production. Yeah, I agree, we should. But shoulda coulda woulda. We live in a world of move fast and break things. In fact, on too many occasions I’ve been a frustrated witness to move crazily slow and still break things, like some disaffected Patagonian tortoise in a china shop.

When your development team grows, so does regression. When deadlines are tight, and people are shouting at you from the top, and you’ve worked every day over weekends and public holidays for the last few months to get something out, errors occur and QAs can’t catch them all, no matter how slick their automation.

So, I’m going to make the hopeful assumption that the above argument holds some water for you. I’m guessing this same argument has been put forward by many others already. I’ve (potentially) rehashed it here to highlight a misleading failure point: something that many assume is PE, but which doesn’t actually guard against the above, and so weakens the argument for PE.

About a year ago, I kept finding myself falling into the trap of what I would like to dub Weak Progressive Enhancement (as a little nomenclature hat tip to Searle’s Weak versus Strong AI).

The difference between the two is very simple, and relates strongly to the original counter-argument I presented above (i.e. who chooses to disable JS?). Weak PE makes decisions based purely on whether JS is enabled in the browser. Strong PE makes decisions based on JS having been successfully run.

Modernizr is a great tool, but like every tool it can be used incorrectly. Embarrassingly, I have used it terribly in the past myself, so I don’t say this in a condescending way.

JS-detection, via something like Modernizr, is frequently used to detect if JS is enabled, but not if all JS has run. This is a trap I personally kept falling into quite a while ago, and much of this mistake is still wreaking havoc out there in the wild. I similarly see this happening on many other sites built by other developers who believe they’ve nailed the PE problem.

The problem is that decisions get made based on the green light Modernizr (or some other JS detection technique) gives. For example, we start to hide things that can only be accessed again by JS. We create interaction styles that only make sense if JS can play its part in those interactions.
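
To make the trap concrete, here is a minimal sketch of what Weak PE often looks like in the stylesheet. The selectors are hypothetical, and I’m assuming the common Modernizr setup where a no-js class on the html element gets swapped for js the moment Modernizr runs:

/* Weak PE: these rules kick in as soon as Modernizr has run, */
/* even if the script that powers the nav toggle never arrives. */
.js .site-nav { display: none; }
.js .site-nav.is-open { display: block; }

If any of the three scenarios above then strikes, the navigation stays hidden and nothing is left to open it.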

If the last thing you do, after all other scripts have been loaded, is run Modernizr or some other test for JS-readiness, then you’ll be okay. Strong PE for the win. However, there are good reasons to include Modernizr before everything else, and many people do. You may want to take advantage of Modernizr’s detection in your JS thereafter – performing certain enhancements or even loading certain non-essential assets conditionally (with something like yepnope.js). However, if you then start to make decisions in your CSS based on this green light for JS, you’ve fallen into the trap of Weak PE. Scripts after that initial detection may succumb to any of the three scenarios stated above, and your CSS will be working on false information.
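
To be clear, using that early detection for JS-side decisions is fine. Here is a rough sketch, assuming a Modernizr build that includes Modernizr.load (the yepnope.js wrapper) and a hypothetical fallback file:

// JS-side use of the early green light: conditionally load a
// non-essential fallback only where a feature is missing.
Modernizr.load({
  test: Modernizr.svg,
  nope: 'js/svg-fallback.js' // hypothetical asset path
});

The trouble only starts when your CSS treats that same early green light as proof that the rest of your scripts have run.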

Everyone’s setup and layer cake is somewhat different, so offering a sweeping solution to this would be wrong. However, it is often simply a case of giving the green light for JS after all of it has run, so something like this should be the last bit of JS the browser sees:

document.documentElement.className += " js-ready";
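
The stylesheet can then hang its JS-dependent decisions off that class rather than the early detection one. Continuing the hypothetical selectors from before:

/* Strong PE: these rules only apply once every script has run. */
.js-ready .site-nav { display: none; }
.js-ready .site-nav.is-open { display: block; }

If a script higher up fails or never finishes downloading, js-ready never appears and the plain HTML and CSS layer carries on working.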

Also, you can test without too much effort whether your site or app is weakly or strongly progressively enhanced. Add this bit of JS randomly into any part of your JS pile, preferably somewhere near the top, though it’s good to see what happens lower down too:

throw 'Testing for strong progressive enhancement.';

After adding this, can you still use your site? Do all the interactions fulfil their promised purpose?

It’s hard to believe that this hasn’t been brought up many times before by others, but I nonetheless felt the need to share my own thoughts on the matter, because it’s important to me that people at least acknowledge the benefits of PE. By realising there is a misleading kind of Weak PE, we can also make sure not to undermine our argument for it. Weak PE offers no protection against points 1 through 3, rendering that argument moot and potentially giving PE opponents the upper hand.

14 September 2014

The diminishing returns of being stubborn


In 1922, Alexander Friedmann published an article in the journal Zeitschrift für Physik, where he proposed three different ways in which the universe could be understood in response to a nascent concept, the cosmological constant. Put simply, the cosmological constant was a bit of scientific and mathematical hackery contrived by Einstein in order to maintain an unsubstantiated belief in a static and eternal universe.

At this time, Einstein was riding a huge wave of scientific rock stardom, having just published his general theory of relativity, which had now officially replaced Newtonian physics as the most accurate way of understanding gravity and how it governs the movement of objects in space (planets, moons, stars etc).

Despite replacing Newtonian gravity, Einstein’s gravity and his new flexible spacetime concept still predicted a universe heading for ultimate disaster. His equations implied that the attraction between every planet and star would cause all of them to eventually converge into a cosmic pile-up of epic proportions. While Newton chose to conclude that this almighty crash was being held back by the almighty himself (i.e. God), Einstein sought a less religious explanation by introducing the cosmological constant, an arbitrary number indicating the exact force of anti-gravity required to keep the universe static; to hold it steady for eternity.

Friedmann proposed that the universe could instead be understood in three different ways:

  1. A high-density universe in which the above-mentioned pile-up would eventually happen.
  2. A low-density universe in which objects in space were slowly but surely moving further away from one another, creating an ever-expanding universe.
  3. A perfectly dense universe poised in perfect symmetry between 1 and 2, in that an initial separating push had occurred with just the right amount of force to create a stabilising balance against the pull of gravity.

Friedmann’s article was a mathematical argument. Einstein took no time in shooting it down, even going as far as to publicly declare that Friedmann’s calculations were flawed. Friedmann’s calculations were in fact correct, but being denounced by the scientific rock star of the age meant his work was given no more consideration. Friedmann died only three years later after contracting an illness. He died away from his wife and child, unknown by most in the scientific community. Those who did know him saw him in a tarnished light, thanks to Einstein’s condemnation.

To make matters worse, Einstein would go on to prematurely dismiss yet another great mind. Georges Lemaître was a Belgian priest and cosmologist who had impressed the likes of Eddington with his mathematical prowess after a short stint at Cambridge. Lemaître studied theoretical physics in Belgium as well as spending some time at Harvard and MIT. Not knowing anything about Friedmann’s work, he had begun developing a theory of the universe that unwittingly reiterated Friedmann’s case for a non-stationary universe. Lemaître proposed that it all originated from an exploding primeval atom.

In 1927, at a conference in Brussels, Lemaître approached Einstein with his theory. After pointing out that something similar had already been proposed to him by Friedmann a few years before, Einstein once again dismissed the idea, saying, “Your calculations are correct, but your physics is abominable.” 1 Einstein had learned not to spuriously accuse proponents of a non-stationary universe of being mathematically incorrect, but persisted with his own beliefs by dismissing the idea all the same.

Thanks for the amateur history lesson, Alex, but what the hell is the point of all of this? Well, as you could have guessed from the title of this post, this little snippet of scientific history helped solidify some thinking I’ve been doing around stubbornness. You see, Einstein was blessed with a brilliantly stubborn attitude that helped him realise radical shifts in science, but he was equally cursed by this same stubbornness later in his life. By refusing to even consider that his own subjectively unsubstantiated belief in a stationary universe could be wrong, he acted as a horrible impediment to two very promising scientists. He was in many ways a modern equivalent of the Catholic church during the days of Copernicus and Galileo. This may sound hyperbolic, but I wager Friedmann and Lemaître would consider it spot on.

Now, I don’t for one second want to give off the idea that the likes of Friedmann, Lemaître, Einstein, Copernicus, and Galileo have any real equivalency to what I do on a daily basis. One of the reasons I enjoy reading about scientific history is that it consistently provides a jolting perspective on how laughably inconsequential both I and my work are – a humbling reminder that I really shouldn’t be so emotionally invested in the efficacy of a website interface. Although, this being said, I think I can say that stubbornness has had a similar(ish) part to play in my life, as it has with others around me.

With most things it’s a general stubbornness that gets you started and keeps you going. You have to be stubborn against your own doubts, against your own inability to learn fast enough. Additionally, all knowledge domains come pre-packed with contemporary trends, wrapped and sold on by fashionistas who instruct you on how that domain should be thought about, spoken about, and executed in practice. Your stubbornness, therefore, can also be an isolating influence, being that thing that makes you stick your head down and mumble “bah humbug!” to outside influences.

There’s a focused appeal to hunkering down in your own mind. It becomes a stubborn leitmotif day after day, year after year; periodically building in complexity, which crescendos before crashing down into moments of calm; a temporary y-axis flatline. It is in this calm that I have made my best and most sudden jumps in improvement – a beautiful irony of metaphors, as if the flatline provided the trampoline I needed. This improvement is all, in my opinion, thanks to stubbornness.

But there’s that unfortunate downside of stubbornness which can impede your own progress after a certain level of competency has been reached. The more confident I’ve become in a certain area of knowledge, the more subjectivity starts to masquerade as objectivity. It is easy to fall into the trap of knowing, instead of admitting that you continue to remain more in a state of not knowing than knowing. This false state of knowing is socially and economically enticing. Pay grades and social standing are structured around it.

Einstein kind of recognised this problem when stating, “To punish me for my contempt for authority, Fate made me an authority myself.” In all his spiteful arrogance and demanding authority, Newton also admitted to the vastness of not knowing in the face of only a tiny bit of knowing,

I do not know what I may appear to the world, but to myself I seem to have been only a little boy playing on the seashore, and diverting myself now and then in finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay undiscovered before me.

Being stubborn can help you uncover little gems of knowledge (or really big ones if you have the amazing talent and intellect of all of those scientists mentioned above). It can drive personal discoveries with an exacting joy at getting better at something. But sometimes instead of gems, that same stubbornness can result in you polishing your own little knowledge turd, trying to rub off the fecal imperfections inherent in that knowledge instead of simply admitting that you’re ultimately shining a piece of shit.

I think I need to be more wary of the diminishing returns of stubbornness. Letting go of my precious little turds may be easier said than done, but I guess I have to try.

Notes:

  1. This and other quotes in the post are taken from The Big Bang by Simon Singh.

17 August 2014

Art and technology

In the documentary Tim’s Vermeer, Tim Jenison, a complete newbie to painting, paints Johannes Vermeer’s The Music Lesson based on a real-life reproduction of the painting’s composition, which he painstakingly constructs himself in a warehouse. He does this using an optical setup of lenses and mirrors, built on the foundations of what many believe artists like Vermeer used back in the day.

Jenison believes he’s re-discovered the means by which Vermeer was able to create his unbelievable, photographic-like paintings. His method, although requiring creative ingenuity to think up and build, as well as huge amounts of patience and perseverance, transforms the act of painting itself into a mechanical and objective process – it transforms the human doing the painting into a non-subjective machine.

Whether or not his painting is proof that Vermeer’s long-acclaimed genius is actually mechanically replicable is not something I feel a need to explore. Rather, my interest lies more in our culturally emotive reactions to this debate. The very strong feelings so many of us have against the idea of reducing art to a technologically driven and mechanical process is significant (to me, at least).

Tim’s documentary, along with various books that advance the theory of artists using optical devices, has been criticised for missing the point of Art; that by focusing on technological trickery, one naively misunderstands what it is that makes Vermeer one of history’s most celebrated artists.

So what does it lack? The film implies anyone can make a beautiful work of art with the right application of science. There is no need for mystical ideas like genius. But the mysterious genius of Vermeer is exactly what’s missing from Tim’s Vermeer. It is arrogant to deny the enigmatic nature of Vermeer’s art.

I think the Art-loving outcry emanates from how we choose to define and, in turn, place cultural importance on Art, with a capital A.

Art is a brilliantly elusive concept. There’s much confusion and flexibility around its definition, yet most of us have very emotive and philosophical affiliations to it. It seems to transcend ideas around aesthetics, and scoffs condescendingly at the proposal that it is merely form without function. Art is a prevailing means of commentary and expression. Most of all, it is regularly seen as the end result of genius. Because of this, only a special few are celebrated from one generation to the next.

I really don’t think I’m imagining the very real rift that consequently exists in the minds of many, whereby Art is seen as a special human-only artefact, while technology is the clever, but otherwise ugly, Frankenstein child of our increasingly mechanised drudgery. The two are considered by many as immiscible, or rather that technology detracts from the true essence of Art; that it somehow robs Art of its expression, diminishing its human commentary.

In The Story of Art, E.H. Gombrich writes,

There really is no such thing as Art. There are only artists. Once these were men [sic] who took coloured earth and roughed out the forms of a bison on the wall of a cave; today some buy their paints, and design posters for hoardings; they did and do many other things. There is no harm in calling all these activities art as long as we keep in mind that such a word may mean very different things in different times and places, and as long as we realise that Art with a capital A has no existence.

There are three important things to take from this very popular introduction to Art (I’m persisting with the capital A, for reasons soon to be explained):

  1. The only thing that defines something as Art is whether a human artist produced it.
  2. There are many different definitions and these are mutable over time and geography.
  3. Art with a capital A does not exist, apparently.

Points 1 and 2 reaffirm my preceding assumptions.

Point 3 is where things become wonderfully contradictory. You see, Gombrich has chosen to denounce Art with a capital A because it brings with it too many intellectual pitfalls, which he’d rather avoid. By saying Art doesn’t exist, he avoids having to define it. This is fine, I guess, but as I’ll now try to explain, it is, I’m afraid, a contradictory cop-out.

It is impossible for Art not to exist. If there were only art with a lowercase a, then any person who calls herself an artist should be celebrated as one. I’m personally okay with this, but it seems the Art world, including old Gombrich, isn’t (my own emphasis added):

Praise is so much duller than criticism, and the inclusion of some amusing monstrosities might have offered some light relief. But the reader would have been justified in asking why something I found objectionable should find a place in a book devoted to art and not to non-art, particularly if this meant leaving out a true masterpiece.

Okay, so if I’m following all of this correctly, it seems there are and there are not things worthy of being called ‘art’. I’m sorry, but this is when art starts to eat all of its greens and grows into a big and healthy Art – it most certainly comes into existence when you say there is ‘art’ and ‘non-art’.

Gombrich’s hugely circulated book on the subject operates on the epistemological foundation that his subject matter can only be addressed in light of the humans who produce it, and that, as humans, we define it differently over time and space. This democratic foundation quickly crumbles under contradictory conservatism, however, because he clearly reaffirms well-known Art-type criticisms by deeming only some productions worthy of this categorisation.

Logically speaking, one is left confused by a book framed as an introduction to a select group of celebrated people who produce something that supposedly does not exist, but has simultaneously existed in various forms since troglodytes started vandalising their caves.

Considering Gombrich’s work is such a huge bestseller (the copy I own at home is a whopping 15th edition), is it safe to assume his framing of, and thinking around, Art is shared by a good many of us? I venture to think so. Most of us seem to share this judgemental and contradictory understanding of Art, even if we try to democratise the way we talk about it at times. I think the reason is that it fundamentally comes down to us using Art as a way to elevate ourselves; a means by which we point at our human uniqueness in relation to other animals and, in more recent times, as a cultural device to argue for our value over machinery and the ever-booming ingenuity of technology.

What is a bit concerning is that in creating this dichotomy between Art and technology, we seem to assume that technology is anti-human. The harsh criticisms of Tim Jenison’s painting of a Vermeer fail to recognise that here is a man of glorious creativity and talent who is, by all accounts, unbelievably dedicated to any undertaking he pursues. What he achieves is, in my mind, just as worthy of celebration as Vermeer’s unbelievable talent.

There are Vermeers everywhere. Some use paint, some use hammers, some use keyboards – but all of them use technology in some way or form. Something as simple as a paintbrush is technology. There’s nothing more human than the technology we invent and use to do great things.

As long as we don’t lose sight of the inherent humanity in our technology (which we unfortunately do from time to time), I think we’ll be okay. Art can continue to be mysterious, uplifting, and judgemental, but so can technology and it will continue to be involved in the production of Art, whether or not Art snobs like it or choose to admit it.