SAP Testing in S/4HANA Projects: Key to Success

Discover the importance of planning, scoping, and setting agreed objectives for SAP testing in S/4HANA projects. Learn how effective SAP project management can drive success.

Sundar Padmanabhan

1/29/2025 · 5 min read


SAP Testing in S/4HANA Projects: Get the Planning Right or Pay for It Later

By Sundar Padmanabhan | Experience Exchange | Jan 2025

I've sat in enough S/4HANA post-mortems to know how the story usually goes. The programme looked solid on paper. Good SI, engaged sponsor, decent budget. But somewhere between SIT and go-live, things started coming apart. Defect volumes nobody planned for. Business users turning up to UAT and seeing the system for what felt like the first time. Regression cycles that were scheduled for two weeks quietly becoming five. And then the blame game — was it the config? The data? The interface team?

Almost every time, when you actually dig into what happened, the answer isn't hiding in a transport log or a test script. It's sitting in a planning workshop that was rushed, a scope document nobody really agreed on, and a set of test objectives that were vague enough to mean different things to different people. The technical stuff is usually fine. It's the human and process decisions made in the first few months of the programme that determine what testing feels like at the end.

That's what I want to talk about here — not testing tools or automation frameworks, but the three things that matter most before you run a single test: planning, scoping, and agreeing what you're actually trying to prove.

Planning isn't a document. It's a series of hard conversations.

The test plan is usually the first thing a programme asks for and the last thing anyone looks at once execution begins. I've seen test plans that were beautifully formatted, ran to forty pages, and were completely disconnected from the reality of how the programme was actually running.

Real test planning for an S/4HANA programme means working through questions that are genuinely uncomfortable to ask early on. How many regression test cycles are we budgeting for, and who decides if that's enough? Who owns defect triage — the SI or the client? What happens when a P1 defect is raised three days before the exit criteria review — who makes the call on whether we proceed? What's the escalation path when the business can't get their UAT testers released from BAU?

These aren't questions about testing. They're questions about governance, authority, and accountability. But if you don't answer them during planning, you'll answer them under pressure during execution — and the answers you get then are rarely the ones you'd have chosen.

S/4HANA adds specific complexity here that an ECC upgrade doesn't. The Universal Journal changes how finance processes behave in ways that surprise even experienced SAP finance teams. Fiori apps perform and feel completely different to SAP GUI, which means user acceptance isn't just functional — it's behavioural. Interfaces with legacy systems often need to be retested from scratch because the underlying data model has changed even when the interface spec hasn't. All of this needs to be factored into your plan before the schedule is locked, not discovered halfway through SIT.

Scope is a risk decision, not a coverage decision.

Every stakeholder in an S/4HANA programme approaches test scope from a different angle. The business wants everything tested. The SI wants a lean scope that fits the timeline. The project manager wants a scope that fits the budget. The test manager — if they're any good — wants a scope that is honest about where the risks actually live.

The mistake most programmes make is treating scope as a coverage question: how much of the system can we test in the time we have? The better question is: which processes, if they fail at go-live, would genuinely hurt the business — and have we given ourselves enough runway to test those properly?

That changes the conversation entirely. It means you might agree to do lighter-touch testing on a low-volume process that's been in SAP for fifteen years and hasn't changed, while spending serious time on the new Fiori-based procurement workflow that nobody has used in production before. It means you're making explicit, documented, risk-informed decisions about where effort goes — rather than trying to boil the ocean and ending up with shallow coverage everywhere.

The other thing that kills scope is the assumption that everyone's agreed when they haven't. I've been in workshops where the SI lead, the client test manager, and the business process owners have all nodded at the same scope document and walked away with three different understandings of what it meant. Get it in writing. Get signatures. And then revisit it when — not if — the programme timeline shifts, because the first thing that gets squeezed when a programme falls behind is testing scope, and you want a documented baseline to defend against that.

Test objectives that nobody fights for are useless.

Here's a question I ask at the start of every S/4HANA engagement: what are we actually trying to prove through testing? Not what phases are we running, not what tools are we using — what are we trying to prove?

The answers I get are usually some version of "that the system works." Which is true, but it's not an objective — it's a hope. An objective is specific enough that you can look at it at the end of a test phase and say, with evidence, whether you've met it or not.

The most useful place to anchor test objectives is the programme's own business case. If the business case says the S/4HANA migration will reduce period-end close from five days to two, then one of your test objectives should be to validate that the S/4HANA Finance close process, under realistic conditions, actually supports that. If the business case promises real-time inventory visibility replacing overnight batch runs, then your integration and performance testing needs to specifically target that capability. Objectives connected to business outcomes are ones that sponsors and steering committees understand and care about. When testing falls behind schedule — and it will — those are the objectives you can use to hold the line on exit criteria, because the business case itself is at stake.
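One way to make objectives like these concrete is to express each as a target you can measure at the end of a test phase. The sketch below is hypothetical: the objective names, targets, and measured values are invented to show the shape of the exercise, not taken from any real programme.

```python
# Sketch: test objectives expressed as measurable exit criteria tied to
# business-case targets. All names, targets, and measured values are
# illustrative assumptions.

objectives = [
    # (objective, business-case target, value measured during testing)
    ("Period-end close duration (days)", 2.0, 1.8),
    ("Inventory visibility latency (seconds)", 5.0, 3.2),
]

def evaluate(objectives):
    """Return (objective, met?) pairs: measured value must be at or under target."""
    return [(name, measured <= target) for name, target, measured in objectives]

for name, met in evaluate(objectives):
    print(f"{name}: {'MET' if met else 'NOT MET'}")
```

An objective in this form either passes or fails with evidence, which is exactly what you need when defending exit criteria at steering level.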

Vague objectives, on the other hand, get quietly dropped when the schedule gets tight. And when that happens, you go to go-live carrying risk that nobody has formally acknowledged.

The boring truth about S/4HANA testing

None of what I've described above is glamorous. It doesn't involve AI-powered test automation or cloud-native testing platforms. It's workshops, documents, difficult conversations, and decisions made by people who are often already stretched thin on a complex programme.

But in my experience, the S/4HANA programmes that go well — that go live on or close to schedule, that don't haemorrhage budget in hypercare, that don't end up as a cautionary tale in an audit report — share one characteristic above all others. They invested seriously in the front end of the test lifecycle. They brought in a test manager early, before the schedule was baselined. They ran proper scoping workshops. They wrote objectives that were tied to the business case and agreed at steering level. And they treated their exit criteria as a contract, not a suggestion.

Testing near the end of a programme is too late to fix the decisions made at the beginning. The quality of your plan in month two will shape almost everything about what you experience in month twelve.

Get that part right, and the rest becomes manageable. Get it wrong, and all the test automation in the world won't save you.

Sundar Padmanabhan is the founder of Experience Exchange, Sydney. He has spent over 20 years in IT delivery, with specialist experience leading SAP test programmes across S/4HANA migrations, ECC upgrades, and enterprise transformation for Australian government and private sector clients.

If your S/4HANA programme is heading into a test phase and you'd like a second opinion on your approach — let's have a conversation →