Increases in average broadband speed have not brought a commensurate decrease in average web page load times, according to a new white paper from the American Enterprise Institute. Instead, average web page load times actually increased between the fourth quarter of 2015 and the same period of 2016, the report said.

“The emphasis on one facet of internet performance, such as last mile broadband networks, tends to minimize other factors that may be more important to the user, such as the performance impact of tracking networks, browsers, webpage design and web server performance,” wrote report author Richard Bennett in an executive summary of the report titled “You Get What You Measure: Internet Performance Measurement as a Policy Tool.”

Average Web Page Load Times
Average web page load times in 2015 were 2.6 seconds for fixed networks and 3.6 seconds for mobile networks, according to the report. For 2016 the average fixed and mobile network load times were 2.7 and 3.8 seconds, respectively. Yet both average fixed and average mobile download speeds increased during the same period.

Source: AEI Report

As the report notes, ad-supported websites face a delicate balancing act with regard to load times. “The revenue model for the web generally depends on ad sales, but ads are principal drivers of poor performance,” the report noted. “The more ads a page carries, the higher its revenue potential when all other factors – such as traffic and click-throughs – are equal, but slow page loads encourage bounces.” Bounces are sessions in which the user immediately leaves the site without going past the first page.

The report includes some eye-opening information from the HTTP Archive, which states that a typical webpage now accesses 19 domains and forms 34 TCP connections. The front pages of major news sites are particularly complex. The report cites data from the Ghostery plugin, which showed that the New York Times front page included 51 connections to trackers, analytics services and social media. Other news sites showed similar counts: the Washington Post (41), the Wall Street Journal (41) and the Los Angeles Times (81).
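Statistics like these can be reproduced for any page from a browser-exported HAR (HTTP Archive) file, which lists every request a page makes. As a minimal sketch — using a hypothetical handful of request URLs in place of a real HAR export — counting the distinct hostnames a page touches looks like this:

```python
from urllib.parse import urlparse

# Hypothetical stand-in for the request URLs recorded in a HAR export;
# a real news front page would list far more entries.
sample_request_urls = [
    "https://www.example-news.com/index.html",
    "https://cdn.example-news.com/styles/main.css",
    "https://tracker-one.example.net/pixel.gif",
    "https://ads.example-ads.com/slot?id=7",
    "https://social.example-widgets.com/like.js",
]

def count_distinct_domains(urls):
    """Count the distinct hostnames a page pulls resources from."""
    return len({urlparse(u).hostname for u in urls})

print(count_distinct_domains(sample_request_urls))  # prints 5
```

Each distinct hostname typically means at least one extra DNS lookup and TCP connection, which is why the domain count tracks so closely with load time.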

The sheer number of sources that a webpage draws upon is not the only concern. The more servers involved, the greater the likelihood that one of them will be slow or overloaded, dragging down the load time of the entire page.

The author sees website operators having a strong interest in knowing “which ads and which trackers make the greatest contribution to revenue while imposing the least burden on page load time.” End users also would like greater control over web page load times. “In the ideal scenario, users would be able to detect the proximate causes for slow page loads and take action to resolve them,” the report states.

Some tools have already been developed that provide at least some of this information to end users and webpage operators, the author notes. These tools include:

  • Passive web performance measurement tools from the World Wide Web Consortium
  • The Passive Indicator (PAIN) system developed at the Politecnico di Torino
  • Round trip time (RTT) measurements
  • Sandvine’s QoE (quality of experience) System
  • The SamKnows Whitebox
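Of the items above, round trip time is the easiest to illustrate: one common proxy is to time a TCP three-way handshake. The sketch below is a simple illustration (not any of the named systems) and, to stay self-contained, measures the connect time to a locally bound socket rather than a remote server:

```python
import socket
import time

def measure_tcp_rtt(host: str, port: int, timeout: float = 2.0) -> float:
    """Time a TCP connect (three-way handshake) as a rough RTT proxy,
    returning the elapsed time in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        elapsed = time.perf_counter() - start
    return elapsed * 1000.0

if __name__ == "__main__":
    # Self-contained demo: listen on an ephemeral localhost port,
    # then measure the connect-time RTT to it.
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.bind(("127.0.0.1", 0))
    listener.listen(1)
    port = listener.getsockname()[1]
    print(f"TCP connect RTT: {measure_tcp_rtt('127.0.0.1', port):.3f} ms")
    listener.close()
```

Pointing the same function at a real web server's port 443 would give a first-order estimate of the network latency underlying each of a page's dozens of connections.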

The author sees pros and cons with each of these systems, suggesting that elements of each could be combined for a more complete solution.

According to the author, an ideal measurement and remediation system “would involve broadband providers taking web QoE measurements and sharing them with web developers and operators.”

Many of the web’s performance issues “stem from the continued use of old technology for new tasks,” according to the author, who argues that good real-world data about web page load times could help in developing replacement protocols for TCP and HTTP that would be better tailored to today’s tasks.

The author also argues that making webpage QoE more visible has the potential to benefit consumers by intensifying competition between wired and wireless networks.

“The internet is built on the model of multi-stakeholder collaboration,” the report concluded. “Applying this model to real-world performance measurement can make the internet experience better for everyone.”