JUnit XML is everywhere. Originally created for the JUnit Java testing framework, the format became the de facto standard for test result reporting across languages and frameworks. Most CI systems parse it natively. Most test frameworks can generate it.
## The Format
A JUnit XML file contains test suites, which contain test cases. Each test case has a name, execution time, and optionally failure/error details.
```xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="Test Results" tests="3" failures="1" errors="0" time="2.45">
  <testsuite name="LoginTests" tests="2" failures="1" time="1.23">
    <testcase name="should login with valid credentials" classname="LoginTests" time="0.52"/>
    <testcase name="should reject invalid password" classname="LoginTests" time="0.71">
      <failure message="Expected 401, got 200" type="AssertionError">
AssertionError: Expected status 401 but received 200
    at LoginTests.test (tests/login.spec.ts:24:10)
      </failure>
    </testcase>
  </testsuite>
  <testsuite name="SignupTests" tests="1" failures="0" time="1.22">
    <testcase name="should create new account" classname="SignupTests" time="1.22"/>
  </testsuite>
</testsuites>
```

## Key Elements
`<testsuites>` - Root element containing all test suites

- `name` - Overall test run name
- `tests` - Total test count
- `failures` - Tests that failed assertions
- `errors` - Tests that threw unexpected exceptions
- `time` - Total execution time in seconds

`<testsuite>` - A group of related tests

- `name` - Suite name (often the test file or class)
- `tests`, `failures`, `errors`, `time` - Same as above, scoped to this suite

`<testcase>` - An individual test

- `name` - Test name
- `classname` - Test class/file (used for grouping in CI UIs)
- `time` - Execution time

`<failure>` - A failed assertion (the test ran but didn't pass)

- `message` - Short failure description
- `type` - Error type (`AssertionError`, etc.)
- Body contains the full stack trace

`<error>` - An unexpected exception (the test crashed)

- Same attributes as `<failure>`
- Used for exceptions rather than assertion failures

`<skipped/>` - A test that was skipped

- `message` - Optional reason for skipping
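To make these elements concrete, here is a minimal Python sketch that reads a report with the standard library's `xml.etree.ElementTree` and tallies outcomes. It counts `<testcase>` elements directly rather than trusting the summary attributes, and works whether the root is `<testsuites>` or a bare `<testsuite>` (both appear in the wild):

```python
import xml.etree.ElementTree as ET

def summarize(path):
    """Tally test outcomes from a JUnit XML file."""
    root = ET.parse(path).getroot()
    counts = {"total": 0, "failed": 0, "errored": 0, "skipped": 0}
    failures = []
    # iter() walks the whole tree, so the nesting depth doesn't matter.
    for case in root.iter("testcase"):
        counts["total"] += 1
        failure = case.find("failure")
        if failure is not None:
            counts["failed"] += 1
            failures.append(
                (case.get("classname"), case.get("name"), failure.get("message"))
            )
        elif case.find("error") is not None:
            counts["errored"] += 1
        elif case.find("skipped") is not None:
            counts["skipped"] += 1
    return counts, failures
```

Run against the example report above, this yields 3 total tests, 1 failure, and the "Expected 401, got 200" message.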
## Generating JUnit XML
### Playwright
```ts
// playwright.config.ts
import { defineConfig } from '@playwright/test';

export default defineConfig({
  reporter: [
    ['list'],
    ['junit', { outputFile: 'results.xml' }],
  ],
});
```

### Jest
```bash
npm install jest-junit --save-dev
```

```js
// jest.config.js
module.exports = {
  reporters: [
    'default',
    ['jest-junit', { outputDirectory: '.', outputName: 'results.xml' }],
  ],
};
```

Or via command line:

```bash
jest --reporters=default --reporters=jest-junit
```

### Vitest
Vitest's JUnit reporter is built in, so no extra package is required:

```ts
// vitest.config.ts
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    reporters: ['default', 'junit'],
    outputFile: {
      junit: './results.xml',
    },
  },
});
```

### pytest
Built-in support:
```bash
pytest --junitxml=results.xml
```

### Mocha
```bash
npm install mocha-junit-reporter --save-dev
mocha --reporter mocha-junit-reporter --reporter-options mochaFile=results.xml
```

### Go (go test)
```bash
go install github.com/jstemmer/go-junit-report@latest
go test -v ./... 2>&1 | go-junit-report > results.xml
```

### RSpec (Ruby)
Install the `rspec_junit_formatter` gem, then register it:

```ruby
# spec/spec_helper.rb
RSpec.configure do |config|
  config.add_formatter('RspecJunitFormatter', 'results.xml')
end
```

## CI Integration
Most CI systems parse JUnit XML automatically for test result visualization.
### GitHub Actions
```yaml
- name: Run tests
  run: npm test

- name: Upload test results
  uses: actions/upload-artifact@v4
  if: always()
  with:
    name: test-results
    path: results.xml
```

GitHub Actions doesn't display JUnit results natively in the UI, but third-party actions like `dorny/test-reporter` can:
```yaml
- name: Test Report
  uses: dorny/test-reporter@v1
  if: always()
  with:
    name: Test Results
    path: results.xml
    reporter: java-junit
```

### GitLab CI
```yaml
test:
  script:
    - npm test
  artifacts:
    reports:
      junit: results.xml
```

GitLab displays JUnit results directly in merge request UIs.
### Jenkins
Jenkins parses JUnit XML with the JUnit plugin:
```groovy
post {
  always {
    junit 'results.xml'
  }
}
```

### Azure DevOps
```yaml
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/results.xml'
```

## Limitations
JUnit XML is old and shows its age:
### No Retries
The original format doesn't distinguish a test that passed on retry from one that passed on the first attempt. Some generators add custom elements like `<rerunFailure>`, but these aren't standardized.
### No Rich Metadata
No standard place for:
- Screenshots
- Video links
- Browser/OS info
- Custom tags
Generators often add `<properties>` or `<system-out>` elements, but parsing them is framework-specific.
### Flat Structure
Test names and classnames are strings. There’s no hierarchy for nested describe blocks or parameterized tests beyond what you encode in the name.
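When you control the test names, one workaround is to encode the hierarchy with a delimiter and split it back out downstream. A tiny sketch, assuming a `" > "` separator (frameworks differ: some join with spaces, others with `::`, so check what your generator actually emits):

```python
def split_hierarchy(name, sep=" > "):
    """Recover describe-block nesting from a flat JUnit test name."""
    return [part.strip() for part in name.split(sep)]

# split_hierarchy("Auth > Login > should reject invalid password")
# -> ["Auth", "Login", "should reject invalid password"]
```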
### Time Precision

The `time` attribute is a decimal number of seconds. Some implementations use milliseconds, others seconds at varying precision, so parsing can be tricky.
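A defensive reader treats `time` as optional and normalizes it before summing. A small sketch; the comma-decimal fallback is an assumption about locale-sensitive generators, not something the format specifies:

```python
def parse_time(value):
    """Parse a JUnit `time` attribute into float seconds.

    Tolerates a missing or empty attribute and a comma decimal
    separator. It does NOT try to guess seconds vs. milliseconds --
    there's no reliable way to tell from the value alone.
    """
    if not value:
        return 0.0
    try:
        return float(value.replace(",", "."))
    except ValueError:
        return 0.0
```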
## Alternatives
**CTRF (Common Test Report Format)** - A modern JSON schema that addresses JUnit XML's limitations. Supports retries, rich metadata, and a consistent structure. See our CTRF guide.

**Allure XML** - Allure's custom format with a richer data model, but it requires Allure tooling to parse.

**TAP (Test Anything Protocol)** - A text-based format popular in Perl/Node.js. Simpler than XML but carries less metadata.
## Using JUnit XML with Gaffer
Gaffer parses JUnit XML and extracts test results for analytics:
```yaml
- name: Run tests
  run: npm test

- name: Upload to Gaffer
  if: always()
  uses: gaffer-sh/gaffer-uploader@v1
  with:
    gaffer_api_key: ${{ secrets.GAFFER_UPLOAD_TOKEN }}
    report_path: ./results.xml
```

The parser extracts:
- Test names and statuses
- Execution times
- Failure messages and stack traces
- Suite groupings
For richer analytics (flaky detection, duration trends), consider also generating CTRF output.
## Related
- CTRF Guide - Modern alternative to JUnit XML
- Test Artifact Management - Store and share test results