This involved installing Linksys WRT54G routers between the routers and modems of over 2,000 volunteer panel members. Each Linksys router was modified with a client which would run a pre-scheduled routine of diagnostic tests on a number of performance indicators at times when the panelists' networks were judged to be idle, and report this data back to home base for analysis.
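The general pattern described here, waiting until the panelist's link looks idle and only then running the diagnostic suite, can be sketched roughly as follows. This is a minimal illustration, not SamKnows' actual client: the threshold value, test names, and helper functions are all assumptions made for the example.

```python
# Hypothetical sketch of an idle-gated diagnostic test runner.
IDLE_THRESHOLD_BPS = 10_000  # assumed: treat the link as idle below this rate


def is_idle(bytes_before, bytes_after, interval_s):
    """Estimate throughput from two interface byte-counter readings
    taken interval_s seconds apart; call the link idle if it is low."""
    rate_bps = (bytes_after - bytes_before) * 8 / interval_s
    return rate_bps < IDLE_THRESHOLD_BPS


def run_tests(tests):
    """Run each diagnostic test and collect results for reporting."""
    return {name: fn() for name, fn in tests.items()}


# Placeholder test suite; the real probes measured throughput,
# latency, and other performance indicators.
tests = {
    "download_kbps": lambda: 4096,
    "latency_ms": lambda: 23,
}

# Only run the suite when the panelist's traffic is near zero, so the
# measurements aren't skewed by (and don't disrupt) normal usage.
if is_idle(bytes_before=1_000, bytes_after=1_500, interval_s=60):
    report = run_tests(tests)
```

The key design point is the idle gate: running a throughput test while the household is streaming video would both corrupt the measurement and degrade the panelist's service.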
The results, as seen in the OFCOM report (pages 30-32), show a wide range of variation in maximum and average speeds beyond what could be attributed solely to loop length. Possible explanations include everything from poor customer network configuration and wiring problems, to contention, throttling, and insufficient backhaul provisioning.
Some of the causes could probably be inferred from the data collected by the Linksys devices, but the mandate from OFCOM was to focus on speed. I find this ironic, given that I have heard members of the OFCOM consumer panel advocate incorporating other KPIs, such as latency, into ISP performance claims and marketing.
Sadly, there was no discussion of individual ISP performance in the session, despite the fact that SamKnows obviously has a high level of insight in this regard. My reading was that OFCOM is keen to avoid this sort of disclosure, because it might somehow distort the market.
I'm intrigued by how the outputs from SamKnows' data could be married with data from other sources, such as the Measurement Lab, Akamai, Level3, and the Internet Storm Center, to build a better-rounded real-time picture of what is actually driving the quality of the customer experience. I'm sure a number of telcos and broadband service providers wouldn't want to subject themselves to that sort of scrutiny for obvious reasons.
On the other hand, if exposed to the end-user community, it could also be a powerful marketing and customer care tool. Transparency of performance claims, backed up by hard evidence from a number of sources, would be a great selling point. And if customers have some visibility into what's behind their service problems, presumably they will be less inclined to bombard call centers, especially if they can see that the problem likely lies either in their own CPE/network or on the other side of the local access network.
Anyway, the thing that really excited me was that the SamKnows team clearly wants to expand their study methodology beyond the UK. I can think of a number of readers of this humble bloglet in various markets around Europe and elsewhere who would make great local partners, so don't be shy.
'Sadly, there was no discussion of individual ISP performance in the session, despite the fact that SamKnows obviously has a high level of insight in this regard. My reading was that OFCOM is keen to avoid this sort of disclosure, because it might somehow distort the market.'
Good point, James...
If I can get statistics on airline arrival delays and hotel star-ratings, it seems that it would not distort the market to publish a league table of the best and worst performing ISPs, at least as measured by the tests. In theory, price is a mediating factor for consumers, but so is quality.
It would be interesting for someone to file an FOI Act request for the underlying data and sort the results by ISP name.
I'm not certain Ofcom could withhold disclosure under the FOI Act, given that it published similar performance data for the DQ services market, if I recall correctly...
SamKnows did publish their first (pre-Ofcom) findings, which included ISP-specific data.
One of the issues is that the funding restricts the number of devices, and hence the number of ISPs with a statistically significant sample is small.