Modern applications don’t use a single test framework. You’re running Playwright for E2E, Jest for unit tests, maybe Vitest for the newer modules. Each framework generates reports in different formats, stored in different places. Finding “what failed” becomes an archaeology expedition.
## The Multi-Framework Reality
A typical project might have:
| Layer | Framework | Report Format |
|---|---|---|
| E2E | Playwright | HTML + trace files |
| Integration | Cypress | JSON + screenshots |
| Unit (React) | Jest | JSON or HTML |
| Unit (Vite) | Vitest | JSON or HTML |
| API | pytest | JUnit XML or HTML |
Each framework has its own:
- Report format and structure
- CI artifact location
- Viewing tool or command
- Retention period (see the CI sketch after this list)
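In GitHub Actions terms, that fragmentation often shows up as a pile of unrelated artifact uploads, one per framework. A minimal sketch, with illustrative names and paths:

```yaml
# One artifact per framework, each with its own name, path, and viewer
# (artifact names and paths are illustrative)
- uses: actions/upload-artifact@v4
  with:
    name: playwright-report
    path: playwright-report/
- uses: actions/upload-artifact@v4
  with:
    name: jest-results
    path: coverage/jest-report.json
- uses: actions/upload-artifact@v4
  with:
    name: pytest-junit
    path: reports/junit.xml
```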
## The Problems This Creates
### 1. Context Switching
To understand “did tests pass?” you need to:
- Open GitHub Actions
- Find the workflow run
- Download Playwright artifacts
- Extract and open locally
- Go back to GitHub
- Download Jest artifacts
- Extract and open those too
- Repeat for any other frameworks
This process takes 5-10 minutes and breaks your flow every time.
### 2. No Unified View
Each framework’s report only knows about its own tests. There’s no single place to see:
- Overall pass rate across all test types
- Which layer is most problematic
- Trends over time for the full suite
### 3. Inconsistent Retention
CI artifacts expire at different rates. Your Playwright traces from last month are gone, but Jest JSON files from the same run might still exist. Historical analysis becomes impossible.
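With GitHub's own artifact storage, for instance, retention is configured per upload step, so different report types quietly age out at different times. A sketch with illustrative values:

```yaml
# Per-artifact retention in GitHub Actions; values are illustrative
- uses: actions/upload-artifact@v4
  with:
    name: playwright-traces
    path: test-results/
    retention-days: 7    # heavy trace files, expired quickly
- uses: actions/upload-artifact@v4
  with:
    name: jest-results
    path: jest-results.json
    retention-days: 90   # tiny JSON, kept for months
```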
### 4. Team Communication Overhead
When someone asks “what’s the test status?”, you can’t point them to one place. You end up copying and pasting results from multiple sources, or scheduling a screen share to walk through different reports.
## The Solution: Unified Test Report Hosting
Instead of fighting multiple formats, upload everything to one platform that normalizes the data:
- **Single upload step**: All reports go to the same place
- **Automatic format detection**: The platform parses each format correctly
- **Unified dashboard**: See all results in one view
- **Consistent retention**: Same history for all frameworks
- **One link to share**: Point teammates to a single URL
## How Gaffer Handles Multiple Frameworks
Gaffer automatically detects and parses reports from all major test frameworks.
### Supported Formats
| Framework | Format | What’s Extracted |
|---|---|---|
| Playwright | HTML | Full report with traces, screenshots, videos |
| Jest | JSON | Test cases, durations, failure messages |
| Jest | HTML (jest-html-reporter) | Visual report with embedded results |
| Vitest | JSON | Test cases, durations, failure messages |
| Vitest | HTML | Full report with test details |
| pytest | HTML (pytest-html) | Test cases, logs, captured output |
| Any | JUnit XML | Universal format from most frameworks |
| Any | CTRF JSON | Common Test Report Format (15+ frameworks) |
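Most of these formats are one flag away if your suite doesn't emit them yet. A sketch of CI run steps that produce them (the npm scripts and output paths are assumptions):

```yaml
# Illustrative steps that emit reports in the formats above
- run: npx playwright test                          # playwright-report/ (HTML) when the html reporter is configured
- run: npx jest --json --outputFile=jest.json       # Jest's built-in JSON output
- run: npx vitest run --reporter=json --outputFile=vitest.json  # Vitest JSON reporter
- run: pytest --junitxml=junit.xml                  # pytest's built-in JUnit XML
```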
### Automatic Detection
You don’t need to specify the format. Upload your reports and Gaffer figures out what they are:
```yaml
# GitHub Actions example - upload everything
- name: Upload all test reports
  uses: gaffer-sh/gaffer-uploader@v2
  with:
    api-key: ${{ secrets.GAFFER_API_KEY }}
    report-path: |
      ./playwright-report
      ./coverage/jest-report.json
      ./test-results/vitest.json
```

### Normalized Analytics
Regardless of source format, Gaffer extracts:
- Total tests, passed, failed, skipped
- Individual test names and durations
- Failure messages and stack traces
- Pass rate trends over time
This lets you compare apples to apples across frameworks.
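As a mental model, every report collapses into one common shape. The sketch below is a hypothetical illustration of such a record; the field names are ours, not Gaffer's actual schema:

```yaml
# Hypothetical normalized result record (illustrative field names,
# not Gaffer's actual data model)
framework: playwright        # detected automatically at upload
summary:
  total: 142
  passed: 139
  failed: 2
  skipped: 1
tests:
  - name: "checkout > applies discount code"
    status: failed
    duration_ms: 8412
    failure: "TimeoutError: locator('.discount-banner') not visible"
```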
### Per-Framework Filtering
The dashboard shows all results together, but you can filter by framework to drill into specific layers:
- “Show me only E2E failures”
- “What’s the trend for unit tests?”
- “Which framework has the most flaky tests?”
## Example: Multi-Framework CI Setup
Here’s a complete GitHub Actions workflow uploading results from multiple frameworks:
```yaml
name: Tests

on: [push, pull_request]

jobs:
  unit-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
      - run: npm ci
      - name: Run Jest tests
        run: npm run test:unit -- --json --outputFile=jest-results.json
      - name: Run Vitest tests
        run: npm run test:vitest -- --reporter=json --outputFile=vitest-results.json
      - name: Upload to Gaffer
        if: always()
        uses: gaffer-sh/gaffer-uploader@v2
        with:
          api-key: ${{ secrets.GAFFER_API_KEY }}
          report-path: |
            ./jest-results.json
            ./vitest-results.json

  e2e-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
      - run: npm ci
      - run: npx playwright install --with-deps
      - name: Run Playwright tests
        run: npx playwright test
      - name: Upload to Gaffer
        if: always()
        uses: gaffer-sh/gaffer-uploader@v2
        with:
          api-key: ${{ secrets.GAFFER_API_KEY }}
          report-path: ./playwright-report
```

Both jobs upload to the same Gaffer project, giving you a unified view of all test results for each commit.
## Using CTRF for Universal Coverage
If your framework isn’t directly supported, CTRF (Common Test Report Format) provides a universal JSON standard with reporters for 15+ frameworks:
- Mocha, Jasmine, Cucumber
- Go test, PHPUnit, RSpec
- .NET, Java (JUnit, TestNG)
- And more
Install a CTRF reporter, generate the JSON, upload to Gaffer. Done.
```bash
# Example: Mocha with CTRF
npm install mocha-ctrf-json-reporter
mocha --reporter mocha-ctrf-json-reporter
```

## Benefits of Consolidation
Once all your reports are in one place:
| Before | After |
|---|---|
| 5+ minutes to check all results | 10 seconds, one dashboard |
| Different retention per framework | Consistent history |
| Can’t compare across frameworks | Unified analytics |
| Multiple links to share | One URL for everything |
| Framework-specific tooling | One platform to learn |
## Get Started
Stop juggling multiple report formats. Gaffer normalizes everything into one dashboard with unified analytics across all your test frameworks.
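If your CI already produces reports, getting started is a single step. A minimal sketch, reusing the uploader action from the examples above (adjust `report-path` to wherever your reports land):

```yaml
# Minimal Gaffer upload step; add it after your existing test steps
- name: Upload test reports to Gaffer
  if: always()
  uses: gaffer-sh/gaffer-uploader@v2
  with:
    api-key: ${{ secrets.GAFFER_API_KEY }}
    report-path: ./playwright-report   # or any supported report path(s)
```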