Currents.dev is a mature Cypress-era test analytics product, now oriented around enterprise workflows (SSO/SCIM, Slack Connect, 1-year retention). Its Team plan is $49/month for 10 seats and 10,000 test results per month, with $5 per additional 1,000 results on top. Teams start shopping for a Currents.dev alternative when they cross 10 users, hit the test-result cap, or simply don’t want a per-usage meter on their CI output. Gaffer prices on storage, not seats or test volume: $49/month flat, unlimited users, 50 GB, 90-day retention.
Why Teams Look for Currents.dev Alternatives
Seat Count Ceiling on the Team Plan
Currents’ Team plan includes 10 seats. Past that, the path is Enterprise with custom pricing. For a 15-person engineering team that wants everyone to see test results, the Team plan doesn’t fit and the next step is a sales call.
Test-Result Metering
The 10,000 test results per month included on Team sounds generous until you do the arithmetic. A team running 300 CI runs per day at 50 tests per run produces roughly 450,000 test results per month. At $5 per additional 1,000 results, that’s around $2,200/month in metered overage on top of the $49 base. Usage scales with your test suite size and CI frequency, both of which tend to grow.
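The arithmetic above can be checked directly. A minimal sketch, using the workload numbers and Currents' published Team-plan overage rate from the text:

```javascript
// Currents.dev Team plan: 10,000 test results/month included,
// then $5 per additional 1,000 results.
const includedResults = 10_000;
const overageRatePer1k = 5;

// Example workload from the text: 300 CI runs/day, 50 tests/run, ~30 days.
const resultsPerMonth = 300 * 50 * 30; // 450,000
const overageResults = Math.max(0, resultsPerMonth - includedResults);
const overageCost = (overageResults / 1_000) * overageRatePer1k;

console.log(resultsPerMonth); // 450000
console.log(overageCost);     // 2200
```

At $2,200/month in overage, the $49 base is a rounding error; the meter is the bill.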
No Free Tier
Currents does not publish a free tier. Evaluation means a trial window, not a persistent free workspace you can leave running on a side project.
Gaffer: Same Outcomes, Different Economics
Flat Pricing, Unlimited Users
Gaffer has three tiers. Free is $0 with 500 MB and 7-day retention. Pro is $15/month with 10 GB and 30-day retention. Team is $49/month with 50 GB and 90-day retention. Every tier includes unlimited users. Overage on Pro and Team is $0.50/GB/month. There is no test-result cap; storage is the only billed resource.
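Since storage is the only billed resource, the monthly cost is a one-line function of gigabytes stored. A sketch using the Team-tier numbers above:

```javascript
// Gaffer Team tier: $49/month flat, 50 GB included, $0.50/GB/month overage.
const baseMonthly = 49;
const includedGb = 50;
const overageRatePerGb = 0.5;

function monthlyCost(storedGb) {
  const overageGb = Math.max(0, storedGb - includedGb);
  return baseMonthly + overageGb * overageRatePerGb;
}

console.log(monthlyCost(40)); // 49 (under the included 50 GB)
console.log(monthlyCost(60)); // 54 (10 GB over at $0.50/GB)
```

Note what's absent: seat count and test-result count never appear as inputs.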
Same CI Upload Pattern
One step in your CI pipeline:
```yaml
- name: Upload to Gaffer
  if: always()
  uses: gaffer-sh/gaffer-uploader@v2
  with:
    gaffer_api_key: ${{ secrets.GAFFER_UPLOAD_TOKEN }}
    report_path: ./test-results
```

Or with curl:
```bash
curl -X POST https://app.gaffer.sh/api/upload \
  -H "X-API-Key: $GAFFER_UPLOAD_TOKEN" \
  -F "files=@test-results/report.xml"
```

JUnit XML and CTRF are both first-class. Playwright, Jest, Vitest, pytest, and Cypress all work.

MCP Code-Mode for Agentic CI
Both products expose an MCP server. The shapes differ. Currents’ MCP provides roughly 25 CRUD tools for its Dashboard, GitHub PR, CLI, and Slack surfaces. Gaffer’s MCP uses code-mode: two primitives (search_tools and execute_code) plus a small set of named functions (get_project_health, get_flaky_tests, get_slowest_tests, get_failure_clusters, get_test_history, get_coverage_summary, find_uncovered_failure_areas, compare_test_metrics, search_failures, list_test_runs). The agent composes JavaScript that calls those functions, rather than issuing one MCP call per CRUD endpoint. A typical agent session is one execute_code call that does the work of 5 to 10 CRUD requests. See GitHub Agentic Workflows with Gaffer MCP for the reasoning behind this shape.
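A sketch of what one such execute_code body might look like. The function names come from the list above; the call signatures, return shapes, and the `gaffer` namespace are assumptions, stubbed here so the sketch runs standalone:

```javascript
// Stub of the named-function surface. Return shapes are illustrative
// assumptions, not Gaffer's documented API.
const gaffer = {
  get_project_health: async () => ({ score: 95 }),
  get_flaky_tests: async () => [{ name: "login spec" }],
  get_slowest_tests: async () => [{ name: "checkout e2e", p95Ms: 62000 }],
};

// One execute_code body answering three questions in a single round trip,
// instead of three separate CRUD-style MCP calls.
async function agentSession() {
  const [health, flaky, slowest] = await Promise.all([
    gaffer.get_project_health(),
    gaffer.get_flaky_tests(),
    gaffer.get_slowest_tests(),
  ]);
  return {
    healthScore: health.score,
    flakyCount: flaky.length,
    slowestTest: slowest[0].name,
  };
}

agentSession().then((summary) => console.log(summary));
```

The composition is the point: the agent writes ordinary control flow over the primitives rather than round-tripping per endpoint.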
Built-in Analytics Without Extra Configuration
Flaky detection, slowest-test surfacing, and failure clustering run on every upload. As a calibration point, Gaffer’s own 30-day metrics (pulled via the MCP server) currently sit at a 95 health score, 99.96% pass rate over 200 runs, 4 flaky tests, and a p95 duration of 62 seconds on the slowest test. Those are the same numbers the dashboard and MCP tools would return for any project with comparable activity.
Pricing Comparison
| Tier | Currents.dev | Gaffer |
|---|---|---|
| Free | Not offered | $0, 500 MB, 7-day retention, unlimited users |
| Entry paid | Not offered | $15/month, 10 GB, 30-day retention, unlimited users |
| Team | $49/month, 10 seats, 10K test results/month, +$5 per additional 1K results, up to 1-year retention | $49/month, unlimited users, 50 GB, 90-day retention |
| Overage | $5 per 1,000 test results over cap | $0.50/GB/month (Pro and Team) |
| Enterprise | Custom (SSO/SCIM, Slack Connect, data redaction) | Not offered |
Feature Comparison
| Feature | Currents.dev | Gaffer |
|---|---|---|
| Test report hosting | Yes | Yes |
| Historical trends | Yes | Yes |
| Flaky test detection | Yes | Yes |
| Slowest-test and failure clustering | Yes | Yes |
| MCP server | Yes (CRUD tools) | Yes (code-mode) |
| GitHub commit status | Yes | Yes |
| Slack notifications | Yes (incl. Slack Connect on Enterprise) | Yes |
| Webhook notifications | Yes | Yes |
| SSO / SCIM | Enterprise | Not available |
| Jira / MS Teams / GitLab / Bitbucket integrations | Yes | Limited |
| Self-hosting | Enterprise | Not available |
| Retention (paid) | Up to 1 year | Up to 90 days |
| User pricing | Per seat (10 included on Team) | Unlimited users, all tiers |
| Free tier | No | Yes |
When to Use Currents.dev
Currents is the better fit if you need:
- SSO or SCIM today, not on a roadmap
- Slack Connect, Jira, MS Teams, GitLab, or Bitbucket as first-class integrations
- A team larger than 10 users where per-seat pricing has already been negotiated
- Retention longer than 90 days (up to 1 year on Team)
- Self-hosting via an Enterprise contract
When to Use Gaffer
Gaffer is the better fit if:
- Your team is approaching or past 10 users and per-seat pricing is the binding constraint
- You want a free tier for evaluation or for side projects
- Storage-based pricing matches your workload better than test-result metering
- You want the MCP code-mode primitive for agentic CI workflows. See Test Intelligence for AI Tools
- You prefer flat billing without a per-usage meter on CI output
Migration from Currents.dev
No big-bang cutover needed.
- Keep uploading to Currents. Nothing breaks.
- Add a Gaffer upload step to CI using JUnit XML or CTRF output.
- Point the team at Gaffer for sharing and historical analytics. Retire Currents when you’re ready.
Other Comparisons
Evaluating multiple tools? See how Gaffer compares to Allure and ReportPortal.
Try It
Gaffer’s free tier includes 500 MB storage and 7-day retention. Enough to run your CI against it for a week and decide.