We live in a wonderful world. For geeks this is especially true, with the explosion of technology making things possible that were not even dreamed of a generation ago. These, truly, are exciting times.
Virtual reality! Instant worldwide communication! Millions of books available at the touch of a button! Stream all the music you could ever want, wherever you are, using the supercomputer in your pocket!
Yet, if we look closely, everything is not perfect. We may have the world's information at our fingertips, but we all know that network coverage is spotty and the capabilities of devices are inconsistent. Even we, in the highly developed Western world, regularly wear our "buffer faces".
With a billion more people due to come online over the next few years, mostly in the developing world, the strain on our networks will become more serious. No wonder the biggest technology companies are investing huge sums to beef up worldwide Internet infrastructure.
And these billion people won't be using the latest and greatest hardware, instead depending on low-cost, low-power phones. Think that's a long way off? Think again; back in 2013 a UN study showed that more people worldwide have access to a mobile phone than a toilet.
We don't have to travel to South America or Africa to see the effects of unreliable networks and poor hardware. Travel out of any large town or city in the UK and you'll see your Internet speed drop like a stone (browse around opensignal.com for visualisations of this). Pick up any cheap demonstration Android phone and try to use the most modern "web apps" and you'll soon experience levels of frustration previously unimagined. We geeks living and working in city centres using our high-end phones and fast networks don't know how lucky we are.
Random access errors
So, as the designers and developers of online systems, what should our response be? Clearly we have to recognise that users don't all experience our systems in the same way. Unlike code running on a server we control, web pages are requested and executed in environments over which we have absolutely no control. Online, things go wrong. A lot.
And this is 1 in 93 requests, not sessions or users, so a person may be able to use your site fine one minute, then have it fail the next. How many bad experiences will it take for that person to lose trust in your site, and possibly move to your competitor? On the web you're only one click away from your rivals.
How to make unhappy users
Let's try putting that in a user story:
As a user on a device which connects over the Internet
I want to randomly receive only part of your "web app"
So that I can better understand the inherent fragility of the Web
Because the Web is fragile. There are no guarantees of delivery, so the more cruft we load into our web pages, the more likely it is that something will go wrong.
Progressive enhancement to the rescue!
This approach is called progressive enhancement. At its heart, it's about asking "if" a lot.
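To make the "asking if" mindset concrete, here is a minimal sketch. The `canEnhance` helper and the feature names are illustrative, not from the article; `env` stands in for the browser's `window` object so the idea can be shown outside a browser.

```javascript
// Only enhance when every capability we depend on is actually present.
// `env` is the environment to probe (in a real page, `window`).
function canEnhance(env, features) {
  return features.every((name) => name in env);
}

// In a real page this might look like:
//   if (canEnhance(window, ['fetch', 'IntersectionObserver'])) {
//     attachFancySearch();  // layer the enhancement on
//   }
// Otherwise the plain HTML keeps working, untouched.
```

The key point is the default: if any "if" fails, we do nothing, and the simple version of the page carries on working.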
The fundamentals of progressive enhancement, as explained by Jeremy Keith in his Enhance! talk, are:
- Identify the core functionality of the page
- Deliver that functionality using the simplest technology possible (HTML)
- Enhance! Layer on more advanced functionality where the browser supports it
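As an illustrative sketch of those steps (the markup and function names here are my own, not from the talk): the core is plain HTML that works on its own, and the enhancement is only attached when the browser can support it.

```javascript
// Step 1-2: core functionality delivered as the simplest possible technology.
// This hypothetical search form works with no JavaScript at all, via a
// normal page load.
const coreHtml = `
  <form action="/search" method="get">
    <input name="q" type="search">
    <button type="submit">Search</button>
  </form>`;

// Step 3: enhance, but only if the environment supports what we need.
// If anything is missing we return false and the core HTML carries on.
function enhance(form, env) {
  if (typeof env.fetch !== 'function') return false; // no fetch? core still works
  // ...here we would attach a submit handler that fetches results in-page...
  return true;
}
```

Notice that failure to enhance is not an error state: the user still gets a working search form, just without the in-page niceties.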
We also enable access for the increasingly large number of people using bandwidth-saving browsers; Opera Mini, for instance, has a quarter of a billion users worldwide. That broader access extends to people using assistive technologies, and to people whose browsers are temporarily degraded (by, for instance, a network problem). It works in the same way that a ramp into a building not only makes access possible for wheelchair users, but also makes things easier for anyone who has trouble with steps.
Make It So
If you look around the web for examples of progressively enhanced sites, you'd be forgiven for thinking there aren't many to find. That's not because they don't exist, but because progressive enhancement at its best is transparent - often you won't even notice it unless something goes badly wrong. Still, there are some noteworthy examples we can learn from.
To websites and beyond...
But while providing unencumbered access to content is a no-brainer for text-heavy sites, even advanced "web apps" can be progressively enhanced. For example, the Google+ social network was rebuilt with progressive enhancement at its core.
This is how the web is meant to work
This approach is not new; it's built into the very fabric of the web itself. But it is gaining in popularity as companies see the advantages of providing a resilient interface for their content and products. We can't control the environments in which our web pages run, so it makes sense to defend as much as possible against the inevitable problems users will encounter in the real world. For more on this approach I can highly recommend the online book Resilient Web Design - which is itself a progressive web app that can be read offline.
The goals of progressive enhancement are resilience, broad access and speed. Measurable and achievable, these are the sort of things modern web system developers are beginning to think about. Not just to make websites faster and more robust for our current users, but to build a platform on which we can reach the millions of people around the world who could benefit from access to the web.