Velocity wrap-up: Mobile, third parties, and browsers

The great thing about this year’s Velocity conference was that it almost doubled in size over last year. The downside was that there was no way to get to every session I wanted to check out. So for the past month, I’ve been using my spare time to pore over the videos and slide decks for the sessions I wish I’d had time to attend.

Similar to last year’s Velocity wrap-up, here’s a collection of my favourite slides from this year. These days, I’m particularly interested in third-party content, mobile, and browser performance, and these slides reflect that. Interestingly, if last year’s collection represented a snapshot of our industry, I’d say that this year’s is more like a series of microscope slides. Almost every one offers a granular perspective that reflects how our industry has entered a phase where the nuances are as important as the big picture.

The Impact of Ads on Performance and Improving Perceived Performance
Julia Lee, Senior Director of Engineering, Yahoo! Mail

With almost two billion page views a day, the cumulative effects of latency can hit Yahoo! Mail hard. They found that 73% of their overall latency was due to ads. No surprise when you look at how convoluted an ad’s server call can be (as I’ve written about here), as seen in this slide:

Velocity 2011 - Third-party ad call

What does this convolution add up to, performance-wise? Julia shared that in the old days, before redirects, the average ad experienced about 464ms of latency. Over time, that number grew to 2.7 seconds.

Performance Measurement and Case Studies at MSN
Paul Roy, Alex Polak, Gregory Bershansky
MSN Performance & Reliability Team

In that vein, MSN shared some case studies that showed how speeding up load time affected page clicks. I live for these kinds of case studies, so I was happy to learn that:

  • In an experiment with implementing synchronous jQuery load, they saw a +0.5% increase in search clicks and page clicks.
  • In another experiment with improving JavaScript execution time, they saw a +1.2% increase in search clicks and a +0.5% increase in page clicks.

But what I thought was particularly interesting was the case study around delaying ad loading. They experimented with delaying the loading of a major ad by 1 second, which improved the time to onload by 500ms. As a result, they saw an increase in page clicks and views, but a 15% dropoff in ad clicks.

Obviously this kind of ad performance hit isn’t viable for a site that relies on CTR for revenue, but what grabbed my attention was MSN’s takeaway from this exercise:

Velocity 2011 - Microsoft case study: Delay ad loading

What I like about this is that, rather than being scared off by the initial 15% hit to CTR, Microsoft persists in looking for the sweet spot that yields faster load time without hurting advertisers. This improvement may end up just being a hundred or so milliseconds, but the message here is that it’s a goal worth chasing.
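The general shape of the technique MSN was experimenting with, injecting the ad script only after the page’s own onload event so the ad can’t delay it, can be sketched in a few lines of JavaScript. This is my own illustration, not MSN’s actual code; the function name and ad URL are hypothetical:

```javascript
// Hypothetical sketch: inject the ad script only after the page's
// onload event has fired, so the ad can't delay time-to-onload.
// (Illustrative only; not MSN's actual implementation.)
function loadAdAfterOnload(adScriptUrl) {
  const inject = () => {
    const s = document.createElement('script');
    s.async = true;            // don't block parsing either
    s.src = adScriptUrl;
    document.body.appendChild(s);
  };
  if (document.readyState === 'complete') {
    inject();                  // page has already finished loading
  } else {
    window.addEventListener('load', inject);
  }
}
```

Adding a setTimeout inside `inject` would give you the tunable delay you’d need to hunt for the sweet spot Microsoft describes between load time and ad clicks.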

Mobile Web & HTML5 Performance Optimization
Maximiliano Firtman

If you’re in mobile web development, this slide deck is a must-see. Starting at slide 51, mobile performance expert Maximiliano Firtman gives an impressively thorough breakdown of optimization tips, from handling images to deferring content. There’s no one slide that sums it up better than this one:

Velocity 2011 - Why mobile websites are slow
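One concrete example of the “deferring content” tips Firtman walks through is the common deferred-image pattern: ship placeholder img tags with a data-src attribute and promote them to real src values only when you’re ready to pay for the download. This is a generic sketch of the pattern, not code from the talk; the attribute name is just a convention:

```javascript
// Generic deferred-image sketch: images are shipped as
// <img data-src="..."> placeholders, then promoted to real
// downloads on demand. (Illustrative; not from the slide deck.)
function loadDeferredImages(root) {
  const imgs = root.querySelectorAll('img[data-src]');
  imgs.forEach((img) => {
    img.src = img.getAttribute('data-src'); // start the download now
    img.removeAttribute('data-src');        // mark as promoted
  });
  return imgs.length; // number of images promoted
}
```

You’d typically call this from a scroll or orientation handler, so images below the fold never cost the mobile user a byte until they’re about to be seen.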

Web Site Acceleration with Page Speed Technologies
Bryan McQuade and Joshua Marantz, Google

This session was where I heard a nice little case study in which DoubleClick removed one client-side redirect and cut 1.5 seconds off ad load time. As a result, they increased click-throughs by 12% on mobile. (More details here.)

But what I thought was really cool was this slide, which included a nifty video that showed the delay caused by synthesized screen clicks on mobile devices:

Velocity 2011 - Mobile button click delay

The video demonstrates how the click event on touch-screen smartphones will not fire until after a delay of roughly 300-500 ms. (You can see the video yourself here.)

This is a good reminder of the little ways, beyond our control, in which perceived performance can be affected. And it’s a great reminder of why synthetic mobile browser tests can’t give you a complete performance picture.
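For what it’s worth, the usual workaround for this delay is to handle touchend yourself and fire immediately, suppressing the browser’s delayed synthetic click. A rough sketch of the idea (my own illustration, not from the session; names are illustrative):

```javascript
// Rough "fast click" sketch: react on touchend instead of waiting
// for the browser's synthesized click event, which is held back
// roughly 300-500 ms while the browser checks for a double-tap.
function fastClick(el, handler) {
  let handledByTouch = false;

  el.addEventListener('touchend', (e) => {
    handledByTouch = true;
    e.preventDefault(); // suppress the delayed synthetic click
    handler(e);         // fire immediately, no 300 ms wait
  });

  el.addEventListener('click', (e) => {
    if (handledByTouch) {
      handledByTouch = false; // swallow the synthetic click, if any
      return;
    }
    handler(e); // mouse and keyboard users still work
  });
}
```

The fallback click listener is the important design detail: without it, desktop and keyboard users would lose the button entirely.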

Modernizing Internet Explorer
Jason Weber, Performance Lead, Internet Explorer

I missed this session, but our CTO Kent Alstad went, and he was so impressed that afterward he insisted I check out this video (which is, unfortunately, only available by purchase). IE gets a lot of flak, but Jason does a great job of explaining why IE6 and IE7 made sense at the time. He goes on to talk about the fact that IE9 isn’t just a tarted-up version of IE8, but is in fact a complete teardown and rebuild. The bulk of the session is his explanation of exactly how his team did it. The most impressive part of his session is this pair of slides.

The first shows two websites that the IE team monitors internally every day, The New York Times and CNN. The different slices of the pie illustrate the CPU usage of various subsystems when the sites are viewed on Internet Explorer 8:

Velocity 2011 - Internet Explorer 8 CPU usage

And these are the same sites viewed on IE9:

Velocity 2011 - Internet Explorer 9 CPU usage

The grey areas show all the CPU time that disappears when you compare IE9 to IE8. According to Jason, because of the lighter CPU load, these sites load twice as fast in IE9, and his team monitors other sites that they say load up to ten times faster in IE9.

Creating the Dev/Test/PM/Ops Supertribe: From Visible Ops To DevOps
Gene Kim, Visible IT Flow

If you ever need to explain to naysayers why performance matters, this slide pretty much sums it up:

High performing websites
