It’s been a few years since Google first released their study (PDF) about how page speed impacts user behavior. For me, it was the pivotal wake-up call that turned the light bulb on. I had not acknowledged, even to myself, the bias I felt against slower-loading web pages, but after reading that study I knew it was the truth: slow-loading pages really try my patience. There’s no beating instant gratification, but I still want things that stimulate me visually. Ask a friend: would you prefer a fast-loading page or a slow-loading one? All else being equal, not a single person in their right mind will say they prefer the slow one. Most people who read this will nod their heads and say yes, we are working on the challenge. The problem has been identified, the team is in place, and the relevant KPIs are in a routine dashboard sent around to all the executives and the relevant product, engineering, and analytics folks.

What if all of your speed benchmarks are wrong?

It’s hard for any of us to acknowledge that the very yardstick we cling to is flawed. In developing the fastest-loading question page in the category, we ignored many of the user experience elements I wanted so badly in order to satisfy another metric: revenue per visitor. That’s the wrong way to measure things, because it’s a single-session, one-time-only statistic. Far better to measure lifetime value, which is what Google was attempting to proxy in their study. Those who remember the original Google infrastructure will recall that they served search results from RAM, not disk, which saved milliseconds because the hardware didn’t need a disk seek to find the data before sending it to the requester. That innovation cut the response time of their retrieval process and helped cement the impression among searchers that Google was something different, something new. But speed is a behavioral aspect of web design; it’s not a static, never-changing number. According to some data, three seconds is the current normal.
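
The gap Google exploited is easy to quantify. Here’s a rough sketch; the latency figures are ballpark numbers for spinning-disk-era hardware, not measurements from Google’s actual systems:

```python
# Rough latency model: serving search results from RAM vs. spinning disk.
# The figures below are ballpark hardware numbers, not Google's actual timings.

DISK_SEEK_NS = 10_000_000  # one random disk seek: ~10 ms
RAM_ACCESS_NS = 100        # one RAM access: ~100 ns

def retrieval_time_ms(lookups: int, from_ram: bool) -> float:
    """Milliseconds to satisfy a query needing `lookups` index lookups."""
    per_lookup_ns = RAM_ACCESS_NS if from_ram else DISK_SEEK_NS
    return lookups * per_lookup_ns / 1_000_000

# A query needing five index lookups:
print(retrieval_time_ms(5, from_ram=False))  # 50.0   -- disk: user-visible lag
print(retrieval_time_ms(5, from_ram=True))   # 0.0005 -- RAM: effectively free
```

Roughly a 100,000× difference per lookup: milliseconds shaved off every single query, which is exactly the kind of margin users feel without being able to name it.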

Reading cutting-edge articles like those on 37signals’ blog about HTML5, the new Basecamp, and how they have cut the response time and latency of their application down to numbers unheard of a few short years ago, it’s obvious where we are headed. HTTP was always a stateless protocol. Everything from cookies to JavaScript to Ajax to server-side scripting languages has been layered on top to improve its latency, throughput, and connectivity. The inherent redundancy of HTTP, even in version 1.1, adds extra overhead that forces a slower loading experience on the normal internet architecture. Let’s imagine, just for a moment, that the old rules no longer apply. The hood has literally been popped; we can all look underneath and figure out how to make things just work. Quickly, easily, and without headaches. What would happen then?
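
That protocol overhead is easy to put numbers on. A back-of-the-envelope model, assuming one round trip for the TCP handshake and one per request/response (TLS would add more round trips; it’s omitted to keep the sketch simple):

```python
# Back-of-the-envelope model of HTTP/1.x connection overhead.
# Assumptions: 1 round trip for the TCP handshake, 1 round trip per
# request/response, and resources fetched serially over one connection.

def page_load_ms(resources: int, rtt_ms: float, keep_alive: bool) -> float:
    """Serial fetch time for `resources` objects at a given round-trip time."""
    handshakes = 1 if keep_alive else resources  # reconnect for every request?
    round_trips = handshakes + resources         # connection setup + transfers
    return round_trips * rtt_ms

# 20 resources at a 50 ms round-trip time:
print(page_load_ms(20, 50, keep_alive=False))  # 2000.0 -- new connection each time
print(page_load_ms(20, 50, keep_alive=True))   # 1050.0 -- one persistent connection
```

Even in this crude model, nearly half the load time is pure protocol ceremony, which is why persistent connections and the newer transport work matter so much.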

Standards such as the artificial three-second rule would get tossed out the window. Instead of a multi-second response time being considered world class, a sub-second response time would become the normal expectation, and a millisecond response time would become the new world class. The one complaint I have yet to hear is, “Can you make that work slower?” HTML5, CSS3, and modern web technologies have made it possible to deliver an incredible user experience to any device, on any platform, with millisecond response times, all while giving designers more flexibility to create truly inspiring, engaging user interfaces. Parallel downloads, persistent server connections, caching, newer distributed databases such as HyperDex, and more will drive the future. Sure, you might say, “We’re fast now,” but that’s based on the average experience of an audience that today has to contend with bloated, slow-loading pages. What happens when single-second response times are the norm? Sub-second response times will be the only ones which seem truly fast, and that will be the new standard by which you are judged.
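
Parallel downloads alone illustrate the point. A simple sketch, assuming each resource costs one round trip and connections are already open (real browsers cap parallelism per host, historically around six connections):

```python
import math

# Sketch of how parallel downloads cut fetch time. Assumes each resource
# costs one round trip and connections are already warm; browsers cap
# per-host parallelism, so `parallel` models that connection limit.

def fetch_time_ms(resources: int, rtt_ms: float, parallel: int) -> float:
    """Time to fetch `resources` objects over `parallel` connections."""
    waves = math.ceil(resources / parallel)  # batches of concurrent fetches
    return waves * rtt_ms

# 24 resources at a 50 ms round-trip time:
print(fetch_time_ms(24, 50, parallel=1))  # 1200.0 -- fetched serially
print(fetch_time_ms(24, 50, parallel=6))  # 200.0  -- six connections at once
```

A six-fold cut from concurrency alone; stack caching and persistent connections on top, and sub-second pages stop looking exceptional and start looking like the baseline.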

With HTML5 still a draft spec and no finality in sight, if you want to future-proof your application infrastructure while keeping it backwards compatible, embrace the new technology and develop for the next decade of amazing. The slow will lose; the fast will win. This time, the hare won’t fall asleep, and will win the race in a tuxedo and bow tie, bringing me a molten chocolate cake!
