
Lighthouse Performance Testing

Lighthouse is a great way to establish a build-measure-learn feedback loop, resulting in continuous value creation by testing ideas in the areas of SEO, performance, accessibility, and more. In this article, I’ll cover what Lighthouse is and how to add it to a project, with examples covering pure JavaScript (Node) projects and Gradle projects (with any front end).

What is Lighthouse Testing?

Lighthouse is an open-source, automated tool for improving the quality of web pages. You can run it against any web page, public or requiring authentication. It has audits for performance, accessibility, progressive web apps, SEO and more.

You can run Lighthouse in Chrome DevTools, from the command line, or as a Node module. You give Lighthouse a URL to audit, it runs a series of audits against the page, and then it generates a report on how well the page did. From there, use the failing audits as indicators on how to improve the page. Each audit has a reference doc explaining why the audit is important, as well as how to fix it.
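
For example, a quick way to try the command-line route (assuming Node.js is installed; the URL and output path are placeholders):

npm install -g lighthouse
lighthouse https://example.com --output html --output-path ./report.html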

You can also use Lighthouse CI to prevent regressions on your sites.

Lighthouse documentation

Add Lighthouse Testing to a Project

1. Add Lighthouse Configuration

Add lighthouserc.yml (or one of the alternative names) to the root of your frontend project.

The startServerCommand in lighthouserc.yml deserves particular attention. It must start the application; depending upon how your application works, particular environment variables, arguments, or other settings may need to be applied.

ci:
  collect:
    # Puppeteer is used to log in
    puppeteerScript: puppeteer-script.js
    puppeteerLaunchOptions:
      args:
        - '--no-sandbox'
        - '--headless'
        - '--ignore-certificate-errors'
    numberOfRuns: 3
    url:
      # Add your URLs here
      - https://localhost:8443
    startServerCommand: "./gradlew bootRun"
    # other examples:
    # startServerCommand: "java -jar build/libs/your-project-jar.jar"
    # startServerCommand: "npm start"
    settings:
      onlyCategories:
        - accessibility
        - best-practices
        - performance
  assert:
    preset: lighthouse:recommended
    assertions:
      offscreen-images: 'off'
      uses-webp-images: 'off'
      color-contrast: 'off'
      first-contentful-paint:
        - error
        - maxNumericValue: 2000
          aggregationMethod: optimistic
      interactive:
        - error
        - maxNumericValue: 5000
          aggregationMethod: optimistic
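
The assert section above turns some audits off entirely and fails the run (error) when the metric thresholds are exceeded. An audit can also be set to warn, which reports a problem without failing the build. A sketch, using the unused-javascript audit as an arbitrary example:

ci:
  assert:
    assertions:
      unused-javascript: 'warn'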

2. Add Puppeteer Login Script

Add a Puppeteer script named puppeteer-script.js that logs into the application.

Here’s an example:

/**
 * @param {puppeteer.Browser} browser
 * @param {{url: string, options: LHCI.CollectCommand.Options}} context
 */
module.exports = async (browser, context) => {
  const page = await browser.newPage();
  await page.goto(context.url);
  // fill in and submit the login form
  await page.type('input[name=username]', 'user');
  await page.type('input[name=password]', 'password');
  await page.keyboard.press('Enter');
  // wait for the post-login page to load before Lighthouse takes over
  await page.waitForNavigation();
  await page.close();
};

3. Install Dependencies

Using yarn:

yarn add --dev @lhci/cli puppeteer

Using npm:

npm install --save-dev @lhci/cli puppeteer

4. Run lhci

Using yarn:

yarn lhci autorun

Using npm:

npx lhci autorun

When lhci is run, it outputs a summary to the console and writes HTML- and JSON-formatted reports to the .lighthouseci directory created alongside lighthouserc.yml.

Automatically Run Lighthouse

Now that Lighthouse can be run manually, it’s time to automate it so it runs more consistently, often, and effortlessly. Lighthouse can be run automatically in a few different ways.

Only one approach should be used; there’s no value in running Lighthouse multiple times for a given build. For example, one wouldn’t want to run Lighthouse both as a JUnit test and as a GitLab job.

Ideally, results should be uploaded to a Lighthouse Server (which is straightforward to run on AWS, Azure, Heroku, on-premises, etc.) so they can be easily viewed and changes tracked over time. If a Lighthouse Server isn’t available, temporary-public-storage can be used instead to get something up and running quickly. Or, results could simply be written to a given filesystem path. Another option is to not persist the results at all, using Lighthouse only to fail the build if error thresholds are exceeded. See the Upload section of Lighthouse’s configuration documentation for details on the reporting options.
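
For example, the reporting destination is configured in the upload section of lighthouserc.yml. Here’s a sketch of the two no-server options (the outputDir path is a placeholder):

ci:
  upload:
    # quickest option: reports are uploaded to public, short-lived URLs
    target: temporary-public-storage

    # alternative: write reports to a local path
    # target: filesystem
    # outputDir: ./lighthouse-reports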

GitLab CI Job

To run Lighthouse as part of a GitLab CI pipeline, add these job definitions to .gitlab-ci.yml:

set CI_COMMIT_AUTHOR:
  # Request for GitLab to set this variable itself: https://gitlab.com/gitlab-org/gitlab/-/issues/284079
  stage: .pre
  image:
    name: alpine/git
    entrypoint: [""]
  script:
    - 'echo "CI_COMMIT_AUTHOR="$(git log --format="%aN <%aE>" -n 1 "${CI_COMMIT_SHA}")"" > CI_COMMIT_AUTHOR.env'
  artifacts:
    reports:
      dotenv: CI_COMMIT_AUTHOR.env
  rules:
    - if: '$CI_COMMIT_AUTHOR == null'

# Lighthouse CI Testing
lighthouse-ci:
  stage: test
  image: cypress/browsers:latest
  before_script:
    # Lighthouse should handle these variables itself, removing the need to manually set them here: https://github.com/GoogleChrome/lighthouse-ci/pull/568/
    - export LHCI_BUILD_CONTEXT__GIT_REMOTE="${CI_REPOSITORY_URL}"
    - export LHCI_BUILD_CONTEXT__COMMIT_TIME="${CI_COMMIT_TIMESTAMP}"
    - export LHCI_BUILD_CONTEXT__COMMIT_MESSAGE="${CI_COMMIT_MESSAGE}"
    - export LHCI_BUILD_CONTEXT__AUTHOR="${CI_COMMIT_AUTHOR}"
    - '[ "${CI_COMMIT_BEFORE_SHA}" != "0000000000000000000000000000000000000000" ] && export LHCI_BUILD_CONTEXT__ANCESTOR_HASH="${CI_COMMIT_BEFORE_SHA}" || true'

    # Set any additional environment variables you need to run the application, for example:
    # - export MY_ENVIRONMENT_VAR=value

    # Example of how to configure LHCI to upload results to a Lighthouse server. If you don't want to upload results to a Lighthouse server, don't set these variables.
    # - export LHCI_UPLOAD__TOKEN="" # don't store the token in a file in source control. Instead, save in a GitLab CI variable (and enable masking) using GitLab's UI: https://docs.gitlab.com/ee/ci/variables/#create-a-custom-variable-in-the-ui
    - export LHCI_UPLOAD__TARGET="lhci"
    - export LHCI_UPLOAD__SERVER_BASE_URL="https://lhci-server.example.com"
  script:
    - cd react-app
    - yarn install
    - yarn lhci autorun

See Lighthouse’s GitLab CI documentation for more information.

JUnit Test using Testcontainers

This approach consists of a JUnit test that uses Testcontainers to run lhci.

The advantages of this approach include:

- The application is started by the test framework itself, so no startServerCommand is needed.
- Lighthouse runs as part of the regular test suite, both locally and in CI, with no extra pipeline configuration.
- The lhci version is read from the frontend’s package management, so the test always runs the same version the project declares (see getLhciVersion below).

To publish results to a Lighthouse server, set the LHCI_UPLOAD__* environment variables when running the test; the test passes every environment variable starting with LHCI_ or CI_ through to the container.
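
For example, assuming the test is run via Gradle and using placeholder server and token values:

export LHCI_UPLOAD__TARGET="lhci"
export LHCI_UPLOAD__SERVER_BASE_URL="https://lhci-server.example.com"
export LHCI_UPLOAD__TOKEN="your-build-token" # keep the real token in a CI variable, not in source control
./gradlew test --tests LighthouseTest

Here’s the test: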

import static org.assertj.core.api.Assertions.assertThat;

import java.io.File;
import java.nio.charset.StandardCharsets;
import java.nio.file.Path;
import java.time.Duration;
import java.util.Map.Entry;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.context.SpringBootTest.WebEnvironment;
import org.springframework.boot.web.server.LocalServerPort;
import org.springframework.util.Assert;
import org.springframework.util.StreamUtils;
import org.testcontainers.Testcontainers;
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.containers.output.Slf4jLogConsumer;
import org.testcontainers.containers.startupcheck.OneShotStartupCheckStrategy;
import org.testcontainers.utility.MountableFile;

import lombok.extern.slf4j.Slf4j;

@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
@Slf4j
/* default */ class LighthouseTest {
	@LocalServerPort
	private int port;

	private static String lhciVersion;

	private static final String LHCI_ENVIRONMENT_NAME_PREFIX="LHCI_";
	private static final String CI_ENVIRONMENT_NAME_PREFIX="CI_";

	@BeforeAll
	/* default */ static void beforeAll() throws Exception {
		lhciVersion = getLhciVersion();
		log.info("Tests will run using lhci version {}", lhciVersion);
	}

	@Test
	/* default */ void testLighthouse() throws Exception {
		Testcontainers.exposeHostPorts(port); // allow the container to access the running web application
		try (GenericContainer<?> container = new GenericContainer<>("cypress/browsers:latest")) {
			container
				.withLogConsumer(new Slf4jLogConsumer(log))
				// pass through environment variables relevant to LHCI and GitLab CI
				// lhci needs the GitLab-provided CI_* variables to determine commit information so it can report it to the lhci server
				.withEnv(System.getenv().entrySet().stream()
						.filter(e -> e.getKey().startsWith(LHCI_ENVIRONMENT_NAME_PREFIX) || e.getKey().startsWith(CI_ENVIRONMENT_NAME_PREFIX))
						.collect(Collectors.toMap(Entry::getKey, Entry::getValue)))
				.withCopyFileToContainer(MountableFile.forHostPath(Path.of("frontend/lighthouserc.yml")), "/src/lighthouserc.yml")
				.withCopyFileToContainer(MountableFile.forHostPath(Path.of("frontend/puppeteer-script.js")), "/src/puppeteer-script.js")
				.withWorkingDirectory("/src")
				.withCreateContainerCmdModifier(c -> c.withEntrypoint(""))
				.withCommand("/bin/sh", "-c", String.format("npm install -g @lhci/cli@%s puppeteer && lhci autorun --collect.startServerCommand=\"\" --collect.url=\"https://%s:%d\"", lhciVersion, GenericContainer.INTERNAL_HOST_HOSTNAME, port))
				.withStartupCheckStrategy(
						new OneShotStartupCheckStrategy().withTimeout(Duration.ofHours(1))
						).start();
			assertThat(container.getLogs()).isNotBlank();
		}
	}

	/** Get the installed version of lhci.
	 *
	 * This approach ensures that as the version of lhci specified in package management changes,
	 * this test will always use that same version.
	 * @return installed version of lhci.
	 * @throws Exception if something goes wrong
	 */
	private static String getLhciVersion() throws Exception {
		final Process process = Runtime.getRuntime().exec("./yarn lhci --version", null, new File("frontend"));
		Assert.state(process.waitFor() == 0, "lhci version command did not complete successfully");
		final String output = StreamUtils.copyToString(process.getInputStream(), StandardCharsets.UTF_8);
		final Matcher matcher = Pattern.compile("^(?<version>\\d++(?:\\.\\d++)++)$", Pattern.MULTILINE).matcher(output);
		Assert.state(matcher.find(), "Could not determine lhci version from command output. Output: " + output);
		return matcher.group("version");
	}
}

Lighthouse Performance Testing by Craig Andrews is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
