Ideas for speeding up test execution

On GitHub Actions, running the whole unit test suite takes almost 30 minutes. Locally it obviously depends on the developer's machine, but it is also quite slow (understandably, it's a lot of code).

In my experience, long test execution times are demotivating for developers: they either stop running the tests for long stretches or spend most of their development time waiting for them. I would like to collect some ideas for speeding up the test execution.

The first thing that stood out to me is that it is not possible to run the tests in parallel. Maven (or the Maven Daemon) can do this while respecting the dependencies between modules in a multi-module project, but doing so currently causes tests to fail. This might be worth investigating.
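
For reference, this is roughly what a parallel multi-module build looks like on the command line; -T and mvnd's defaults are standard Maven features, the thread count is only an example:

# build modules in parallel, one build thread per CPU core
mvn -T 1C clean verify

# the Maven Daemon parallelizes the module graph by default
mvnd clean verify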

One approach would be to split individual modules into their own repositories. For example, the Spring or Quarkus extensions are definitely independent. If desired, one could still create a virtual monorepo, for example with tools like mani.

The biggest part is the test suite of a single module: operaton-engine.

Running multiple modules in parallel does not solve the issue, since most of them have to wait until the engine module is built.

So we have to look at how to speed up the engine tests themselves. Surefire also offers options for parallel test execution; it is worth examining whether that works reliably here.
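
As a starting point, Surefire's parallelism can be switched on from the command line; the property names are standard Surefire options, but the module name and the thread counts below are just placeholders and would need tuning:

# run test classes in several forked JVMs (one fork per core, forks reused)
mvn verify -pl engine -DforkCount=1C -DreuseForks=true

# alternatively, parallelize inside a single JVM (requires thread-safe tests)
mvn verify -pl engine -Dparallel=classes -DthreadCount=4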

Profiling the tests would also be an option. Each individual test runs quite fast, but there may still be potential, e.g. in faster XML parsing.
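
A cheap first pass before attaching a real profiler would be to rank test classes by the times Surefire already records; a rough sketch (the engine/ path and the exact report wording are assumptions and may need adjusting):

# list the slowest test classes based on the Surefire text reports
grep -H "Tests run" engine/target/surefire-reports/*.txt \
  | sed 's/.*Time elapsed: \([0-9.]*\).*/\1 &/' \
  | sort -rn | head -20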

Running multiple modules in parallel was not my intention, but rather running only the necessary ones. Doing this would require a different release train.

How about setting up a nightly build?

We could disable some modules for normal builds, e.g. building the distributions. We could also move QA tasks there, like Sonar analysis, or building and publishing reports (e.g. deprecations, available plugin updates).
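
In GitHub Actions terms this could be a scheduled workflow roughly like the following; the cron time, runner, and Java setup are placeholders, not a worked-out configuration:

name: nightly
on:
  schedule:
    - cron: '0 2 * * *'   # every night at 02:00 UTC
  workflow_dispatch:

jobs:
  full-build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-java@v4
        with:
          distribution: temurin
          java-version: '17'
      # full build including distributions; Sonar analysis and report jobs could hang off this workflow
      - run: mvn clean verify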

I think it’s good to run the full test suite on every merge right now. With our current rapid pace of commits, it’s crucial to catch issues immediately and address any hints of potential problems as they arise. Once we catch up on the backport commits, we can consider adjusting this approach.

I would agree there. We currently have a lot of PRs coming in and a lot of moving parts.

When we decide to get rid of the javax / jakarta shading, it should be possible again to run tests for single Maven modules independently. Maybe we can adapt the testing, monorepo-style, to only build changed modules and their dependents? I could not find a default approach for this in Maven, though.
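
There is no single built-in flag for it, but the "changed modules plus their dependents" half at least exists in Maven: given a list of changed modules (e.g. from a change-detection step like the workflow below), something along these lines would build only those and whatever depends on them (the module names are just examples):

# -pl selects the changed modules, -amd (--also-make-dependents) adds everything that depends on them
mvn verify -pl engine,spring-boot-starter -amd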

I’ve built a GH workflow for this recently; I can adapt it for this use case. In its current form it’s even more complex: it reads information from manifest files in the packages and builds Docker images from them. We can basically test only the packages that have changed and do a full test run nightly.

name: packages
on:
  push:
    branches:
      - main
    paths:
      - 'packages/**'
  workflow_dispatch:

jobs:
  detect-changes:
    runs-on: self-hosted

    # Expose step outputs as job outputs
    outputs:
      packages: ${{ steps.changed-packages.outputs.packages }}
      packages_json: ${{ steps.changed-packages.outputs.packages_json }}
      docker_builds: ${{ steps.collect-docker-builds.outputs.docker_builds }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v3
        with:
          fetch-depth: 0

      - name: Get changed files
        id: changed-files
        uses: tj-actions/changed-files@v45

      - name: List all changed files
        id: list-changed
        run: |
          echo "Changed files:"
          echo "${{ steps.changed-files.outputs.all_changed_files }}"

      - name: Get changed package names
        id: changed-packages
        run: |
          CHANGED_FILES="${{ steps.changed-files.outputs.all_changed_files }}"
          CHANGED_PACKAGES=""
          for FILE in ${CHANGED_FILES}; do
            if [[ $FILE == packages/* ]]; then
              PACKAGE=$(echo $FILE | cut -d'/' -f2)
              CHANGED_PACKAGES="${CHANGED_PACKAGES} $PACKAGE"
            fi
          done
          UNIQUE_PACKAGES=$(echo "${CHANGED_PACKAGES}" | tr ' ' '\n' | sort | uniq | tr '\n' ' ' | sed 's/^ *//;s/ *$//')
          echo "Changed packages:"
          echo "[${UNIQUE_PACKAGES}]"
          echo "packages=${UNIQUE_PACKAGES}" >> $GITHUB_OUTPUT
          
          # create a json parseable version that we can use as input for matrices
          JSON_ARRAY=$(echo "${UNIQUE_PACKAGES}" | awk '{for(i=1;i<=NF;i++) printf "\"%s\"%s", $i, (i<NF ? "," : "")}')
          echo "packages_json=[${JSON_ARRAY}]" >> $GITHUB_OUTPUT

      - name: Extract Docker build information from manifest files
        id: collect-docker-builds
        run: |
          CHANGED_PACKAGES="${{ steps.changed-packages.outputs.packages }}"
          DOCKER_BUILDS=""
          
          for PACKAGE in ${CHANGED_PACKAGES}; do
            PACKAGE_DIRECTORY="./packages/${PACKAGE}"
            PACKAGE_MANIFEST_FILE="${PACKAGE_DIRECTORY}/bo.plus.json"
          
            if [[ -f "${PACKAGE_MANIFEST_FILE}" ]]; then
              PACKAGE_VERSION=$(jq -r '.version // "latest"' "${PACKAGE_MANIFEST_FILE}")
          
              DOCKER_BUILDS+=$(jq -c --arg package "${PACKAGE}" --arg packageDirectory "${PACKAGE_DIRECTORY}" --arg packageVersion "${PACKAGE_VERSION}" '
                .build.docker[]? | .tag = (.tag // $packageVersion) | .buildContext = (.buildContext // "./") | .file = (.file // "./Dockerfile") | .registry = (.registry // "container.backoffice.plus") |
                {package: $package, file: .file, buildContext: .buildContext, registry: .registry, image: .image, tag: .tag}' "${PACKAGE_MANIFEST_FILE}")
          
              DOCKER_BUILDS+=" "
            fi
          done
      
          JSON_BUILDS_ARRAY=$(echo "${DOCKER_BUILDS}" | jq -s '.')
          echo "Docker Builds JSON Array:"
          echo "${JSON_BUILDS_ARRAY}"
          
          # remove newlines with jq -c (compact format)
          COMPACT_JSON=$(echo "${JSON_BUILDS_ARRAY}" | jq -c '.')
          printf 'docker_builds=%s\n' "${COMPACT_JSON}" >> $GITHUB_OUTPUT

  build-packages:
    needs: detect-changes
    if: ${{ needs.detect-changes.outputs.docker_builds != '[]' }}
    strategy:
      matrix:
        build: ${{ fromJSON(needs.detect-changes.outputs.docker_builds) }}
    uses: ./.github/workflows/docker-build.yml
    with:
      DOCKER_BUILD_REGISTRY: ${{ matrix.build.registry }}
      DOCKER_BUILD_IMAGE: ${{ matrix.build.image }}
      DOCKER_BUILD_TAG: ${{ matrix.build.tag }}
      DOCKER_BUILD_CONTEXT: ./packages/${{ matrix.build.package }}/${{ matrix.build.buildContext }}
      DOCKER_BUILD_FILE: ./packages/${{ matrix.build.package }}/${{ matrix.build.file }}
    secrets: inherit