The Guardian breaks news at 1000ms
- The old version of guardian.com was not responsive and took 10 seconds to load on average.
- Used the guidelines below across the org to drive awareness and accountability during the migration to the new responsive version:
- Load times are measured as perceived load times (time to first paint).
- Their target budget was 1000ms, but much of that is spent on networking the app has no control over; the real target was closer to 400ms.
- For a guardian.com page, nothing is as important as the content, so an entirely server-side rendered system introduces unnecessary blocking waits, hurting both performance and reliability/resiliency:
- The new version only loads the content in the critical path; everything else is async.
- Browsers stream HTML, but block when they hit a script tag and wait until that resource has downloaded and executed.
- CSS can’t be streamed: the browser waits until the entire stylesheet has downloaded and then builds the CSS object model.
- Their solution was to inline all critical CSS in <head>, and load everything else asynchronously at the bottom of the initial payload.
- Ditto for fonts.
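A minimal sketch of the async-loading half of that solution (illustrative names, not the Guardian's actual code): critical CSS is assumed to already be inlined in <head>, and the remaining stylesheet is injected with a non-matching media type so it downloads without blocking render, then switched on once loaded. The document object is passed in so the logic can run outside a browser.

```javascript
// Sketch: load non-critical CSS without blocking render.
// A link with media="print" does not match the screen, so the browser
// fetches it at low priority without holding up first paint; the onload
// handler then flips it to media="all" to apply the styles.
function loadCssAsync(doc, href) {
  const link = doc.createElement('link');
  link.rel = 'stylesheet';
  link.href = href;
  link.media = 'print'; // non-matching media: download without render-blocking
  link.onload = function () {
    link.media = 'all'; // apply the stylesheet once it has downloaded
  };
  doc.head.appendChild(link);
  return link;
}
```

In a real page this would run from a small inline script placed at the bottom of the initial payload, after the content.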
- localStorage is used to cache these resources so, on subsequent page loads, they can be loaded along with the critical resources at the top of the page.
- HTTP caching can’t be used for this because it isn’t flexible enough to implement the two-pronged “if cached, load at the top of the page; if not, load at the bottom of the page” approach.
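The two-pronged flow above can be sketched as follows (a hedged illustration, not the Guardian's implementation; all names are invented, and storage, document, and fetch are injected so the logic runs outside a browser). On a cache miss the resource is fetched and stored for next time; on a cache hit its text is inlined immediately in the critical path.

```javascript
// Sketch of the localStorage two-pronged load:
//  - first visit:  defer the fetch, then store the text for next time;
//  - repeat visit: inline the cached text at the top of the page.
function twoProngedLoad(storage, doc, fetchText, key, url) {
  const cached = storage.getItem(key);
  if (cached !== null) {
    // Cache hit: inline immediately with the critical resources,
    // no network round-trip needed.
    const style = doc.createElement('style');
    style.textContent = cached;
    doc.head.appendChild(style);
    return 'inlined-from-cache';
  }
  // Cache miss: fetch at the bottom of the payload and store the text
  // so the next navigation can inline it at the top.
  storage.setItem(key, fetchText(url));
  return 'deferred';
}
```

HTTP caching alone can't express this, because the cached copy lives opaquely in the browser: the page can't ask "do I already have this?" and change where in the document the resource is loaded accordingly.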
- TL;DR: A lot of real eng work was needed to get the content loaded and rendered within 1000ms.
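The script-blocking point above is the reason scripts stay out of the critical path. One standard workaround, sketched here with an injected document stub and illustrative names (not necessarily what the Guardian shipped): scripts inserted via the DOM do not block the HTML parser the way a plain script tag in the markup does.

```javascript
// Sketch: inject a script through the DOM so the HTML parser keeps
// streaming instead of blocking on download + execution. Dynamically
// inserted scripts are async by default; the flag is set explicitly
// here for clarity.
function injectAsyncScript(doc, src) {
  const script = doc.createElement('script');
  script.src = src;
  script.async = true; // do not block parsing or delay first paint
  doc.head.appendChild(script);
  return script;
}
```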
- Beacon API / RFC
- HTTP/2 + server push
- TCP congestion window
- Service workers
- High Performance Browser Networking - Ilya Grigorik
- Render path / render blocking
- Fastly + onCookie