
How to get started with agile testing in Testpad



A practical guide to agile testing in Testpad – set up sprint test plans, run exploratory sessions, and keep regression under control.


By Pheobe

April 27, 2026

Test tools are often too rigid for agile, but Testpad gives you just enough structure to track what matters without restricting how you test or slowing down the sprint.

Agile testing is continuous – testing runs alongside development, sprint by sprint. If your team is already sold on the approach, this guide covers exactly how to get started with agile testing in Testpad: the practical setup, the sprint-by-sprint rhythm, and how to avoid the friction points that trip most teams up.

Here's how to get started with agile testing in Testpad – it comes down to five things:

  1. Create one project per product or team
  2. Add a folder for each sprint and duplicate it at the start of every cycle
  3. Build a separate regression script for the core functionality you retest every sprint
  4. Use exploratory testing sessions to cover what scripted tests miss
  5. Share the report link with your team – no logins needed

Each of these is covered in detail below – along with the common mistakes that trip agile teams up and how to avoid them.

How do you set up a sprint test plan in Testpad?

The cleanest approach is to create one project per product or team, then use a Folder per sprint to collect all the scripts you need for that sprint's testing.

Each sprint, duplicate the previous Folder. The tests copy across; the results don't. That gives you a clean starting point every time, without rewriting anything – and keeps a full history of what was tested and when in the folders behind it.

Name folders clearly – "Sprint 12" or "Sprint 12 – checkout flow" – so it's obvious what each one covers at a glance.

Inside the script, keep test prompts short and specific. How detailed you go is a matter of preference – some teams prefer broader areas to roam. But if you're new to this, specific prompts are a good starting point, such as:

  • Apply a valid discount code at checkout
  • Checkout with an expired card
  • Order confirmation email is triggered
  • Cart items persist after session timeout

These aren't rigid step-by-step instructions, but prompts – specific enough to keep testing focused, but open enough for the tester to explore and react to what they find. That's intentional, and is what makes Testpad work well for agile teams: you get structure without losing the ability to think on your feet.

How do you handle regression testing between sprints?

Regression is where agile testing gets messy for a lot of teams. The question is always: how much do we retest, and how often? In Testpad, the answer is a dedicated regression script (or a set of them, depending on the size of your product) – a separate, persistent checklist of the core functionality you re-verify every sprint. Simply put, a practical list of the things that have broken before, and the things that can't break.

Keep the regression script lean. Review it regularly (monthly or quarterly works for most teams) and cut tests where the risk has genuinely shrunk: functionality that hasn't changed, hasn't caused problems, and isn't likely to. What you want is a tight, fast-to-run set of checks that give the team real confidence before a release.

Each sprint, duplicate the regression script – or the folder containing your regression scripts – the same way you would for your sprint scripts. That resets the columns, gives you a clean run to work with, and means you can prune or adjust tests between sprints without touching the historical record of what happened last time.

How does exploratory testing fit into an agile sprint?

Scripted tests cover what you planned for, whereas exploratory testing covers what you didn't.

Exploratory testing doesn't have to wait until the end of a sprint. It can run throughout alongside development, on the latest additions to the product, or revisiting what was built last sprint. The question is less "when" and more "how."

If you want to bring exploratory testing into your sprints, Testpad works well with a charter-style approach: a loose script of areas to investigate, with time allocated per topic. Something like:

  • Checkout flow breaks with unexpected inputs
  • Checkout flow works with correct inputs

Results get logged as you go – pass, fail, or a note – and any bugs can be linked directly in Testpad. At the end of the session, you have a record of what was tested, even if the approach was freeform.

This is Testpad's sweet spot for agile teams: more structured than a blank page but less rigid than a formal test case.

Who should be doing the testing?

In agile, quality isn't just a testing problem. It comes from good design, clear specs, solid product decisions, good UX, and careful implementation. Testing is one part of that picture rather than a safety net at the end. In practice, the more perspectives involved, the better. Developers test differently to product managers, and clients use the product in ways neither would anticipate. Getting all of them involved puts more useful eyes on the product.

Testpad makes this easy. Guest testers can access a test run without needing an account or any training. That's particularly useful for user acceptance testing (UAT) at the end of a sprint, where a client or stakeholder needs to verify a feature before it ships.

A typical testing split might look like this:

  • Developers write automated unit tests for new work where feasible, extend automated system tests to cover new features, and manually verify anything that can't be automated
  • Testers or PMs run the sprint's scripts – a mix of manual regression, test prompts, and exploratory sessions
  • Clients or stakeholders, where involved in the sprint, run the UAT checklist or their own sanity checks before sign-off

Nobody needs to understand test case management to participate. They just need a link.
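To make the "automated unit tests" part of that split concrete, here's a minimal pytest-style sketch for the discount-code behaviour used as an example earlier. Note that `apply_discount` and its rules are hypothetical stand-ins invented for illustration – not part of Testpad or any real checkout codebase:

```python
# Hypothetical discount logic, invented purely to illustrate the kind of
# unit test a developer might write alongside the manual sprint checks.

def apply_discount(total, code, valid_codes):
    """Return the total after applying a percentage discount code."""
    if code not in valid_codes:
        raise ValueError(f"unknown discount code: {code}")
    percent = valid_codes[code]
    return round(total * (100 - percent) / 100, 2)


def test_valid_code_reduces_total():
    # Mirrors the manual prompt "Apply a valid discount code at checkout"
    assert apply_discount(50.00, "SPRING10", {"SPRING10": 10}) == 45.00


def test_unknown_code_is_rejected():
    # Mirrors the expired/invalid-code prompt from the sprint script
    try:
        apply_discount(50.00, "EXPIRED", {"SPRING10": 10})
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for unknown code")
```

Tests like these run automatically in CI every sprint, which is exactly why the manual scripts in Testpad can stay focused on the behaviour that automation can't reach.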

How do you track testing progress during a sprint?

Progress tracking in agile testing should answer one question quickly: are we ready to ship?

In Testpad, the report page gives you an up-to-date picture of where things stand – reload it at any point to see how many tests have passed, failed, or are still pending.

At the end of a sprint, Testpad's report gives the team a clear record of what was tested and what wasn't, which bugs were found, and whether any failed tests were resolved before release. You can share it as a link without the need for a Testpad account to view it. That's usually all a team needs for a sprint retrospective or a stakeholder update.

Does Testpad integrate with the tools your team already uses?

Yes. Testpad has lightweight integration with Jira, GitHub, and Pivotal Tracker. When a test fails, you record the issue number against the result and it becomes a clickable link – so anyone reviewing the run can jump straight to the issue without switching tabs and hunting for it.

It's not a deep two-way sync. Testpad doesn't automatically create issues on test fails or pull status updates back from your tracker. But for most agile teams, that's exactly the right level of connection – your test results stay clean, and your issue tracker handles the rest.

A few things that trip up agile teams early on

The most common mistakes are easy to avoid:

  • Testing too late in the sprint. If testing only starts once development is "done," you've lost the agile benefit. Start testing features as soon as they're stable enough to poke at.
  • Writing test cases for everything. Detailed test cases take time to write and maintain. Short prompts are faster to create and good enough for most agile contexts.
  • Letting the regression suite grow unchecked. A regression script that takes three hours to run will get skipped. Keep it focused on what matters most.
  • Skipping exploratory testing. Scripted checks catch what you planned for. Exploratory testing finds the gaps. You need both.

Ready to try it?

Testpad is built for exactly this kind of testing – fast-moving, practical, and light on process. You can set up your first sprint test plan in minutes, invite your team without any onboarding, and start tracking results straight away.

Not a Testpad user yet, but got this far and want to give it a go? Test us free for 30 days – no credit card required.


If you liked this article, consider sharing

