Mobile Performance Testing – Questionable Value?

The hot topic is ‘Mobile’.  It’s finally hitting the mainstream and everyone is interested.  People are browsing and buying on mobile – and if an application performs poorly, the end user will simply ditch it and go elsewhere.  This focuses attention sharply on speed and generates a lot of interest in the performance testing of mobile.

So I ask myself the question: is it me – or is there something I’m just not ‘getting’ with mobile performance testing? If you have a website and want to performance test the mobile version, then the sensible thing to do would be to simply test at an API/protocol level, wouldn’t it? Record the traffic for the mobile version and then just test.  The desire to record or emulate from a device seems completely over the top and actually unworkable – there are far too many devices running too many different versions of the OSs.  Plus you can’t factor in what coverage a telecoms operator will actually provide to the end user; it depends on factors such as location, usage, reception strength and bandwidth capacity.  Let’s be clear here: I’m talking about mobile versions of a website – NOT Apps.
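The protocol-level approach really is this simple: from the server’s point of view, a mobile browser is just an HTTP client with a different User-Agent. A minimal Python sketch (the UA string and URL are purely illustrative, not tied to any particular test tool):

```python
import urllib.request

# An illustrative mobile User-Agent string (iPhone-style; purely an example)
MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 5_0 like Mac OS X) "
             "AppleWebKit/534.46 (KHTML, like Gecko) Mobile/9A334")

def mobile_request(url: str) -> urllib.request.Request:
    """Build a request that identifies itself as a mobile browser, so the
    server returns the mobile version of the site over plain HTTP."""
    return urllib.request.Request(url, headers={"User-Agent": MOBILE_UA})

req = mobile_request("https://example.com/")
# A load tool can record and replay this exactly like desktop traffic:
# urllib.request.urlopen(req)  # (not executed here)
```

Everything downstream – recording, parameterisation, replay at volume – then works exactly as it does for any other web traffic.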

So it’s with interest that I take a look at the HP TruClient white paper. I start reading, and after lots of marketing padding I come across the first piece of useful information (page 5):

“what about other network conditions such as network latencies, packet loss, and connections getting dropped? You need to be able to test for a mix of common network conditions since you can’t predict where your customers will be coming from. You need to be able to account for the various hops in the sky in order to determine how these may impact your systems.” 

This sounds interesting. HP acknowledge the complexity I outlined earlier. How are they going to tackle this?

“performance testing, though important, is nothing more than following the current testing strategies and practices that have been used for testing load under Internet usage. This couldn’t be further from the truth.”

That told me! But I’ll politely disagree until I come across something of more substance.

“Many of the skill sets developed for regular web testing are applicable to performance testing for mobile applications. But many new and innovative approaches are also required to accurately record the mobile traffic replay and to accurately represent the user experience. As we noted earlier, mobile applications bring their own set of testing requirements.”

Really? I thought that most mobile experiences were built on web services, straight HTTP and open standards. Most vendors are now finding that the headaches and costs associated with releasing an App version mean they are going back to HTML.  I’m still not quite sure what I’m required to do that is ‘new and innovative’.  I would class the SPDY protocol as new and innovative, but that’s an entirely different subject.

“When users come to the system through mobile devices, they create a different behavior on the system. Since connections stay open for longer, more concurrent connections are consumed. This may mean that in the worst-case scenario, 10 percent of mobile users may exhaust all the connections available, and regular Internet users may not be able to access the system. You need to be able to test for situations like this.” 

Finally, some meat on the bones.  I would think that if the application is well architected then this would not be an issue – isn’t this what connection pooling is all about? However, the reality is that some systems are not well built.
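To put rough numbers on the connection-exhaustion claim, a back-of-envelope model using Little’s law (concurrent connections ≈ arrival rate × hold time) shows how it happens. The figures below are invented purely for illustration:

```python
def connections_in_use(arrivals_per_s: float, hold_time_s: float) -> float:
    """Little's law: average concurrent connections = arrival rate x hold time."""
    return arrivals_per_s * hold_time_s

# Invented figures: desktop requests are short-lived, while mobile
# connections stay open far longer over slow or flaky links.
desktop = connections_in_use(100.0, 0.2)  # 100 req/s held 200 ms -> 20 connections
mobile = connections_in_use(10.0, 3.0)    # 10 req/s held 3 s -> 30 connections
# A tenth of the request rate, yet more connections held than all desktop traffic
```

This is exactly the kind of imbalance that sensible connection pooling and timeout settings are meant to absorb – which is the point.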

“The fact that users frequently drop off from their mobile connections, especially when they are traveling between locations and crossing various cell areas, can also impact the performance of the system.” 

It’s easy enough to simulate connections being dropped using a normal load test scenario, so I’m still feeling a little puzzled.
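For instance, dropping a connection mid-request is scriptable with nothing more than plain sockets. A self-contained Python sketch – the throwaway local listener here simply stands in for the system under test:

```python
import socket
import threading

def drop_mid_request(sock: socket.socket, request: bytes, send_bytes: int) -> None:
    """Send only the first `send_bytes` of a request, then close abruptly -
    mimicking a mobile client losing signal mid-transfer."""
    sock.sendall(request[:send_bytes])
    sock.close()

# Throwaway local listener standing in for the system under test
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]

received = []
def accept_once():
    conn, _ = listener.accept()
    data = b""
    while True:
        chunk = conn.recv(1024)
        if not chunk:  # client vanished mid-request
            break
        data += chunk
    received.append(data)
    conn.close()

server = threading.Thread(target=accept_once)
server.start()

client = socket.create_connection(("127.0.0.1", port))
request = b"GET /mobile HTTP/1.1\r\nHost: example.com\r\n\r\n"
drop_mid_request(client, request, send_bytes=10)
server.join()
listener.close()
# The server saw only a truncated request before the connection disappeared
```

Wrap that behaviour into a virtual-user script at volume and you have your dropped-connection scenario, no device farm required.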

HP then talk about Mobile TruClient and HP Mobile Applications, but this is very high level.  I’m none the wiser, and a little disappointed that I haven’t learnt anything new.

I’m guessing that the new protocol adaptor/tool allows capture and playback of traffic from emulators and devices.  But I seriously can’t work out the benefit of doing this, as opposed to setting a cookie/parameters to tell the service under test that we are a mobile device and then recording and playing back in the traditional, normal way.  Android and iOS are the main mobile operating systems out there, and browsing is done through a mobile browser – which, from the application’s point of view, sends traffic in the normal way.  I also question the value of recording and playback from mobile devices/emulators – the benefit seems marginal for the effort.  Let’s not forget the browser is wrapped in many variants of mobile OSs, firmware and implementations that are sandwiched in a telecom provider’s stack, all of which can affect the way the traffic is handled.  Performance testing is a risk-based exercise – and I think testing from mobile emulators will in most cases not justify the effort.

The biggest problem I see for website providers when sending to mobile clients is actually working out the client’s receive rate – it varies when they are on the move. It would be neat if somehow the client could tell the host what its receive rate is and allow content to be altered accordingly.  I’m sure there are a host of privacy issues associated with this, though. Following good practices and performance by design should largely mitigate most of the risks anyway.
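In the absence of the client reporting its rate, about all a host can do is infer it – for example, time the delivery of a known payload and pick a content tier accordingly. A hypothetical sketch (the tier names and thresholds are made up for illustration):

```python
def choose_variant(bytes_received: int, elapsed_s: float, tiers: dict) -> str:
    """Pick a content variant from a measured receive rate (bytes/second).
    `tiers` maps a minimum rate to a variant name; thresholds are invented."""
    rate = bytes_received / elapsed_s
    for min_rate in sorted(tiers, reverse=True):  # try the richest tier first
        if rate >= min_rate:
            return tiers[min_rate]
    return "text-only"  # fallback for very slow links

TIERS = {1_000_000: "full", 200_000: "lightweight"}
# e.g. 150 KB observed in 1 s -> 150,000 B/s -> below every tier -> "text-only"
```

The obvious weakness is that a rate measured a moment ago may be wrong a moment later on a moving client – which is exactly why a client-reported rate would be neater.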

So I still stand by my initial thoughts.  If you want the mobile segment of a site performance tested, then record using the normal methods through the standard APIs and services. It’s cheaper, effective and faster – and if that isn’t working, nothing else will.  If you test from the cloud or inside the firewall then continue to do so; I can’t see any significant and compelling reasons to change the techniques used for performance testing of mobile just yet.  Performance testing of Mobile Apps – now that’s a different beast, and I’ll talk about that in my next article.

One thought on “Mobile Performance Testing – Questionable Value?”

  1. Ignoring HP’s marketing material…

    In my experience most developers don’t consider the network/client-side aspects of web performance – they just think people need more bandwidth.

    While simulating the mobile experience is possible to some degree, we still need to measure it for real.

    RUM tools and things like 5o9’s browser help in this area; server-side RUM that can watch at the TCP level is another good thing.

    As far as performance testing goes, we could adopt a SOASTA-style test-in-live approach, i.e. monitor the performance of the mobile experience and incrementally improve.

