Current industry benchmarking is not fit for purpose.
Current performance tracking and benchmarking approaches for service providers are inadequate in today's globally interconnected internet landscape. They fail to accurately reflect the user experience, they incentivize service providers to skew results, and they promote an overly narrow focus on the final leg of broadband service delivery, specifically download and upload speeds.
This type of speed measurement is typically delivered by crowd-sourced web solutions such as Ookla's speedtest.net. It offers only a snapshot: a result produced when an end user initiates a test, which by default runs to the test server closest to that user. It does not capture how performance varies over time, which can be affected by temporary events, faults, or time of day, such as peak usage typically between 7pm and 11pm in the evening.
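As a minimal sketch of why a one-off snapshot misses this variation, the Python example below compares a single daytime measurement against samples taken across a full day. The `latency_probe` function and its numbers are simulated placeholders for illustration only, not real measurements or any particular tool's API.

```python
import statistics

def latency_probe(hour: int) -> float:
    """Simulated round-trip time in ms; higher during the 19:00-23:00 peak.

    Stand-in for any real measurement (ping, HTTP fetch, etc.)."""
    base = 20.0
    peak_penalty = 35.0 if 19 <= hour <= 23 else 0.0
    return base + peak_penalty

# A one-off snapshot taken outside peak hours looks healthy...
snapshot = latency_probe(hour=11)

# ...while sampling across the whole day exposes the evening degradation.
all_day = [latency_probe(hour=h) for h in range(24)]

print(f"Snapshot at 11:00:    {snapshot:.1f} ms")
print(f"24-hour mean:         {statistics.mean(all_day):.1f} ms")
print(f"24-hour worst (peak): {max(all_day):.1f} ms")
```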
Internet service providers tactically position their speed test servers near the customer edge, reducing the number of network elements traversed and, with them, latency and the likelihood of packet loss. In effect, this allows providers to keep potential issues such as core or border gateway congestion out of the data, so speed test results often only reflect performance up to roughly the halfway point of a provider's network. Furthermore, the Speedtest methodology is skewed towards the maximum theoretical throughput the line can support: it uses multiple parallel TCP threads, allows the line to ramp up to speed, and discards the bottom 30% of results. This further inflates the result relative to what users actually experience when they use an internet application.
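To make the effect of that methodology concrete, here is a purely illustrative Python sketch: it contrasts the mean throughput of a single TCP stream with the figure produced by summing several parallel streams and discarding the slowest 30% of samples, as described above. The sample values and the four-stream scaling are invented assumptions, not measured data.

```python
import statistics

# Simulated per-second throughput samples (Mbps) for one TCP stream,
# including the slow ramp-up and a congestion dip.
single_stream = [12, 35, 60, 72, 75, 74, 40, 73, 75, 74]

# Four parallel streams fill the line more aggressively (simplified as 4x).
parallel_total = [4 * s for s in single_stream]

def trimmed_mean(samples, discard_fraction=0.30):
    """Drop the slowest samples before averaging (illustrative only)."""
    kept = sorted(samples)[int(len(samples) * discard_fraction):]
    return statistics.mean(kept)

print(f"Single-stream mean:         {statistics.mean(single_stream):.0f} Mbps")
print(f"Multi-stream, trimmed mean: {trimmed_mean(parallel_total):.0f} Mbps")
```

The second figure approaches the line's theoretical maximum, while the first is closer to what a single application flow would see.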
Broadband speed test data currently forms the basis of many internet reports and rankings, which are in turn frequently cited by technical resources, media platforms, governments, and telecoms providers as a reliable measure of performance trends. However, given that the internet is a global network of interconnected networks, data points gathered from end users to a mid-way point within their own internet service provider's network cannot truly reflect the end user's internet experience. In cases of intermittent poor service, speed test data often continues to show flawless results, because issues in the service provider's core network and gateways, which are frequently the root of the problem and within the provider's control, never appear in the measurement.
As internet service providers around the world invest in next-generation network solutions, including FTTH (fibre-to-the-home) and 5G networks, the average speed test result of the typical user has risen sharply, and with regulators now pushing for gigabit services to the home this trend will continue. However, this has often just shifted the bottleneck from the home to the telecoms exchange; the same level of investment is needed on the interconnects between service providers and to the data centers from which popular services are delivered.
The Qualoo network is designed to test internet performance the way an end user experiences it, while also collecting data to support global digitization initiatives that call for true global connectivity and an interconnected population where no individual is left behind. This shifts the focus from a narrow view of last-mile service speeds to a globally connected network in which each stakeholder can not only improve their own network, but also make better decisions about which third-party networks to partner and share traffic with.
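As a hedged sketch of what measuring "the way an end user experiences it" can look like, the example below times real HTTP fetches to well-known public endpoints instead of running a synthetic transfer to the nearest on-net test server. The endpoint list and the simple single-sample timing are illustrative assumptions, not a description of Qualoo's actual methodology.

```python
import time
import urllib.request

# Illustrative set of real-world destinations a user actually visits.
ENDPOINTS = [
    "https://www.wikipedia.org",
    "https://www.google.com",
    "https://www.cloudflare.com",
]

def time_fetch(url: str, timeout: float = 10.0) -> float:
    """Return the wall-clock time in seconds to fetch the full response."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read()
    return time.monotonic() - start

if __name__ == "__main__":
    for url in ENDPOINTS:
        try:
            print(f"{url}: {time_fetch(url):.2f} s")
        except OSError as exc:
            print(f"{url}: failed ({exc})")
```

Because these requests traverse the provider's core, its gateways, and third-party interconnects, the timings reflect the full end-to-end path rather than only the last mile.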