☔ Performance Score Is Different #10657

Open
patrickhulce opened this issue Apr 28, 2020 · 0 comments

patrickhulce commented Apr 28, 2020

tl;dr - make sure you're actually testing with the same settings and follow the advice in the variance docs

You're probably here because you filed an issue wondering why metrics or performance results were different between two different runs. We are deeply appreciative of your effort to improve Lighthouse by letting us know!

First, check that you're actually testing with the same settings, especially if you're using a non-default profile such as desktop. Different channels have different ways of configuring throttling. Look carefully at the available CLI flags, and consider using lr-desktop-config.js if you're trying to match the DevTools or PageSpeed Insights desktop profile. Assuming all settings are identical, the remaining differences are likely due to variability.
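
For example, from the Node API you can pass the same config object on every run so the settings genuinely match. This is only a minimal sketch: the deep require path matches older (6.x-era) Lighthouse releases and may differ in your version, and chrome-launcher is assumed to be installed alongside lighthouse.

```js
// Sketch: pin the desktop profile so two runs use identical settings.
// Assumptions: lighthouse + chrome-launcher are installed; the config path
// below matches Lighthouse 6.x-era layouts and may differ in newer versions.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');
const desktopConfig = require('lighthouse/lighthouse-core/config/lr-desktop-config.js');

async function auditWithDesktopProfile(url) {
  const chrome = await chromeLauncher.launch({chromeFlags: ['--headless']});
  try {
    // Passing the same config object every time removes "different settings"
    // as a source of score differences.
    const {lhr} = await lighthouse(url, {port: chrome.port}, desktopConfig);
    return lhr.categories.performance.score;
  } finally {
    await chrome.kill();
  }
}

auditWithDesktopProfile('https://example.com').then(score => {
  console.log(`Performance score: ${score * 100}`);
});
```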

Performance variability of webpages is a very challenging topic, especially if Lighthouse is one of your or your clients' first experiences with performance measurement. We've documented the most common sources of performance variability in our variance docs, along with steps you can take to limit their impact.

Lighthouse has a few internal mechanisms that attempt to limit the impact of variability, such as simulated throttling, the CPU/Memory Power estimate in the runtime settings section of the report, and companion projects like lighthouse-ci that can automatically run Lighthouse many times. Ultimately, though, there is only so much that can be done from within Lighthouse itself; the final results depend on the stability of the environment in which Lighthouse is run (as well as on the particular URL you're testing!).
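
If you can't adopt lighthouse-ci, even a simple "run it several times and take the median" loop goes a long way. The sketch below assumes the lighthouse and chrome-launcher Node modules; the run count and headless flag are illustrative choices, not recommendations.

```js
// Sketch: reduce run-to-run noise by running Lighthouse several times and
// reporting the median performance score. lighthouse-ci automates this more
// robustly; this is just the basic idea.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

async function medianPerformanceScore(url, runs = 5) {
  const chrome = await chromeLauncher.launch({chromeFlags: ['--headless']});
  const scores = [];
  try {
    for (let i = 0; i < runs; i++) {
      const {lhr} = await lighthouse(url, {
        port: chrome.port,
        onlyCategories: ['performance'],
      });
      scores.push(lhr.categories.performance.score);
    }
  } finally {
    await chrome.kill();
  }
  scores.sort((a, b) => a - b);
  return scores[Math.floor(scores.length / 2)];
}

medianPerformanceScore('https://example.com').then(score => {
  console.log(`Median performance score: ${score * 100}`);
});
```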

Comparing Lighthouse Results From Different Environments

Inevitably, you or your clients will try to compare Lighthouse results from different environments (PSI vs. local, your office machine vs. your home machine, etc.). These results may differ systematically, and never fully align, because of the same underlying variance factors described in our documentation. We highly recommend benchmarking in a consistent environment, but we understand this is going to happen. If you must compare across multiple environments, pick the one you'll measure yourself against as the standard, and then calibrate the other environments to match it as closely as possible using throttling settings. We are working on ways to make this process a little easier (e.g. #9085).
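
One way to do that calibration is to stop relying on default throttling and spell the parameters out in a shared config that every environment uses. The sketch below uses the documented custom-config mechanism; the specific numbers are illustrative placeholders you would tune to match your reference environment, not recommendations.

```js
// Sketch: a shared config that fixes the simulated throttling parameters so
// different environments are at least configured identically.
// The numeric values are illustrative; tune them to your reference machine.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

const calibratedConfig = {
  extends: 'lighthouse:default',
  settings: {
    onlyCategories: ['performance'],
    throttlingMethod: 'simulate',
    throttling: {
      rttMs: 150,               // simulated round-trip time
      throughputKbps: 1638.4,   // simulated download throughput
      cpuSlowdownMultiplier: 4, // adjust until results track your standard env
    },
  },
};

async function run(url) {
  const chrome = await chromeLauncher.launch({chromeFlags: ['--headless']});
  try {
    const {lhr} = await lighthouse(url, {port: chrome.port}, calibratedConfig);
    console.log(lhr.categories.performance.score);
  } finally {
    await chrome.kill();
  }
}

run('https://example.com');
```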

@GoogleChrome locked as resolved and limited conversation to collaborators on Apr 28, 2020
@patrickhulce changed the title from ☔ Performance Variability to ☔ Performance Score Is Different on Jun 24, 2020
@patrickhulce pinned this issue on Oct 22, 2020