Test Report Sprawl: Managing Results Across Multiple Frameworks

Modern applications don’t use a single test framework. You’re running Playwright for E2E, Jest for unit tests, maybe Vitest for the newer modules. Each framework generates reports in different formats, stored in different places. Finding “what failed” becomes an archaeology expedition.

The Multi-Framework Reality

A typical project might have:

Layer          Framework    Report Format
E2E            Playwright   HTML + trace files
Integration    Cypress      JSON + screenshots
Unit (React)   Jest         JSON or HTML
Unit (Vite)    Vitest       JSON or HTML
API            pytest       JUnit XML or HTML

Each framework has its own:

  • Report format and structure
  • CI artifact location
  • Viewing tool or command
  • Retention period

The Problems This Creates

1. Context Switching

To understand “did tests pass?” you need to:

  1. Open GitHub Actions
  2. Find the workflow run
  3. Download Playwright artifacts
  4. Extract and open locally
  5. Go back to GitHub
  6. Download Jest artifacts
  7. Extract and open those too
  8. Repeat for any other frameworks

This process takes 5-10 minutes and breaks your flow every time.

2. No Unified View

Each framework’s report only knows about its own tests. There’s no single place to see:

  • Overall pass rate across all test types
  • Which layer is most problematic
  • Trends over time for the full suite

3. Inconsistent Retention

CI artifacts expire at different rates. Your Playwright traces from last month are gone, but Jest JSON files from the same run might still exist. Historical analysis becomes impossible.
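On GitHub Actions, for example, each upload step sets its own expiry through the `retention-days` input of `actions/upload-artifact`, so reports from the same run can vanish at different times (the values below are illustrative):

```yaml
- name: Upload Playwright traces
  uses: actions/upload-artifact@v4
  with:
    name: playwright-report
    path: playwright-report/
    retention-days: 7   # traces are large, so teams often expire them quickly

- name: Upload Jest results
  uses: actions/upload-artifact@v4
  with:
    name: jest-results
    path: jest-results.json
    retention-days: 90  # small JSON files tend to keep a long default
```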

4. Team Communication Overhead

When someone asks “what’s the test status?”, you can’t point them to one place. You end up copying and pasting results from multiple sources, or scheduling a screen share to walk through different reports.

The Solution: Unified Test Report Hosting

Instead of fighting multiple formats, upload everything to one platform that normalizes the data:

  1. Single upload step - All reports go to the same place
  2. Automatic format detection - The platform parses each format correctly
  3. Unified dashboard - See all results in one view
  4. Consistent retention - Same history for all frameworks
  5. One link to share - Point teammates to a single URL

How Gaffer Handles Multiple Frameworks

Gaffer automatically detects and parses reports from all major test frameworks.

Supported Formats

Framework    Format                      What’s Extracted
Playwright   HTML                        Full report with traces, screenshots, videos
Jest         JSON                        Test cases, durations, failure messages
Jest         HTML (jest-html-reporter)   Visual report with embedded results
Vitest       JSON                        Test cases, durations, failure messages
Vitest       HTML                        Full report with test details
pytest       HTML (pytest-html)          Test cases, logs, captured output
Any          JUnit XML                   Universal format from most frameworks
Any          CTRF JSON                   Common Test Report Format (15+ frameworks)
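For reference, JUnit XML is a small, widely supported envelope that nearly every framework can emit. A minimal example (suite and test names here are illustrative) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="api-tests" tests="3" failures="1" skipped="0" time="1.42">
  <testcase classname="tests.auth" name="test_login" time="0.31"/>
  <testcase classname="tests.auth" name="test_logout" time="0.12"/>
  <testcase classname="tests.orders" name="test_checkout" time="0.99">
    <failure message="expected 200, got 500">AssertionError: expected 200, got 500</failure>
  </testcase>
</testsuite>
```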

Automatic Detection

You don’t need to specify the format. Upload your reports and Gaffer figures out what they are:

# GitHub Actions example - upload everything
- name: Upload all test reports
  uses: gaffer-sh/gaffer-uploader@v2
  with:
    api-key: ${{ secrets.GAFFER_API_KEY }}
    report-path: |
      ./playwright-report
      ./coverage/jest-report.json
      ./test-results/vitest.json
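Automatic detection generally works by sniffing file contents rather than trusting extensions. Gaffer’s actual parser isn’t shown in this article, but a minimal sketch of the idea might look like this (the heuristics and format labels are assumptions for illustration):

```python
import json
from pathlib import Path

def detect_report_format(path: str) -> str:
    """Guess a test-report format from file contents (illustrative heuristics only)."""
    text = Path(path).read_text(encoding="utf-8", errors="ignore").lstrip()
    if text.startswith("<"):
        # XML-like: JUnit reports contain <testsuite>; otherwise assume an HTML report.
        return "junit-xml" if "<testsuite" in text[:2000] else "html"
    try:
        data = json.loads(text)
    except json.JSONDecodeError:
        return "unknown"
    if isinstance(data, dict):
        if "tool" in data.get("results", {}):
            return "ctrf-json"   # CTRF wraps everything under a "results" object
        if "testResults" in data:
            return "jest-json"   # Jest's --json output uses a "testResults" array
    return "generic-json"
```

A real implementation would handle more formats and malformed files, but the principle is the same: the uploader never needs to be told what it is looking at.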

Normalized Analytics

Regardless of source format, Gaffer extracts:

  • Total tests, passed, failed, skipped
  • Individual test names and durations
  • Failure messages and stack traces
  • Pass rate trends over time

This lets you compare apples to apples across frameworks.
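Conceptually, every parsed test lands in one common shape regardless of origin. The field names below are a hypothetical illustration of such a normalized record, not Gaffer’s published schema:

```json
{
  "framework": "playwright",
  "name": "checkout > completes purchase with saved card",
  "status": "failed",
  "durationMs": 4210,
  "failureMessage": "expect(received).toBe(expected) ...",
  "run": { "commit": "abc1234", "branch": "main", "startedAt": "2024-05-01T12:00:00Z" }
}
```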

Per-Framework Filtering

The dashboard shows all results together, but you can filter by framework to drill into specific layers:

  • “Show me only E2E failures”
  • “What’s the trend for unit tests?”
  • “Which framework has the most flaky tests?”

Example: Multi-Framework CI Setup

Here’s a complete GitHub Actions workflow uploading results from multiple frameworks:

name: Tests
on: [push, pull_request]

jobs:
  unit-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
      - run: npm ci

      - name: Run Jest tests
        run: npm run test:unit -- --json --outputFile=jest-results.json

      - name: Run Vitest tests
        run: npm run test:vitest -- --reporter=json --outputFile=vitest-results.json

      - name: Upload to Gaffer
        if: always()
        uses: gaffer-sh/gaffer-uploader@v2
        with:
          api-key: ${{ secrets.GAFFER_API_KEY }}
          report-path: |
            ./jest-results.json
            ./vitest-results.json

  e2e-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
      - run: npm ci
      - run: npx playwright install --with-deps

      - name: Run Playwright tests
        run: npx playwright test

      - name: Upload to Gaffer
        if: always()
        uses: gaffer-sh/gaffer-uploader@v2
        with:
          api-key: ${{ secrets.GAFFER_API_KEY }}
          report-path: ./playwright-report

Both jobs upload to the same Gaffer project, giving you a unified view of all test results for each commit.
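The same pattern extends to non-JavaScript layers. A pytest job, for instance, could emit JUnit XML and reuse the same upload step (a sketch, assuming the uploader accepts JUnit XML as listed in the table above):

```yaml
  api-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
      - run: pip install -r requirements.txt

      - name: Run pytest
        run: pytest --junitxml=pytest-results.xml

      - name: Upload to Gaffer
        if: always()
        uses: gaffer-sh/gaffer-uploader@v2
        with:
          api-key: ${{ secrets.GAFFER_API_KEY }}
          report-path: ./pytest-results.xml
```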

Using CTRF for Universal Coverage

If your framework isn’t directly supported, CTRF (Common Test Report Format) provides a universal JSON standard with reporters for 15+ frameworks:

  • Mocha, Jasmine, Cucumber
  • Go test, PHPUnit, RSpec
  • .NET, Java (JUnit, TestNG)
  • And more

Install a CTRF reporter, generate the JSON, upload to Gaffer. Done.

# Example: Mocha with CTRF
npm install --save-dev mocha-ctrf-json-reporter
npx mocha --reporter mocha-ctrf-json-reporter
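The reporter writes a JSON file with a standard envelope. A trimmed example of CTRF output (test names and timestamps are illustrative):

```json
{
  "results": {
    "tool": { "name": "mocha" },
    "summary": {
      "tests": 2, "passed": 1, "failed": 1,
      "pending": 0, "skipped": 0, "other": 0,
      "start": 1714560000000, "stop": 1714560002500
    },
    "tests": [
      { "name": "adds two numbers", "status": "passed", "duration": 12 },
      { "name": "rejects bad input", "status": "failed", "duration": 48 }
    ]
  }
}
```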

Benefits of Consolidation

Once all your reports are in one place:

Before                               After
5+ minutes to check all results      10 seconds, one dashboard
Different retention per framework    Consistent history
Can’t compare across frameworks      Unified analytics
Multiple links to share              One URL for everything
Framework-specific tooling           One platform to learn

Get Started

Stop juggling multiple report formats. Gaffer normalizes everything into one dashboard with unified analytics across all your test frameworks.