A Case Study

Background

A FTSE 100 company, operating in over 140 countries, had embarked on a global Microsoft Dynamics AX 2012 implementation. Adopting the traditional, partner-led delivery model, they invested in an initial template development and roll-out to two pilot markets. Many lessons were learnt from this exercise.

With over four hundred and fifty thousand lines of customised code, this was a software development project, not a package roll-out. There was no testing team sitting between the partner's development team and the business; the system went straight into business testing. The business raised hundreds of defects, a mixture of basic software errors and mismatches between what was needed and what the team had been aiming to build. The business was ill-equipped to test, taking much of its guidance from the development team, and this led to ineffective testing. Once live, more issues emerged and a long period of stabilisation was required, setting the overall programme back by years.

Ultimately the organisation took stock. They brought in fresh leadership with enterprise-scale experience from the wider, non-AX, world. One consequence was a recognition of the need to move away from a partner-led model: to separate out design, build, test and deployment, and to have distinct expertise, led by the organisation itself, working in each area. In line with best practice for large-scale programmes, this led to the sourcing of a separate, autonomous testing capability.

Our Story

Our capabilities, as suppliers of specialist test consultancy and test management services, were known to a number of the new leadership team. We were asked to undertake a review of the programme, looking not only at testing but also at the approach to development: to propose development changes that would reduce the risk of major delays caused by late discovery of defects and quality issues, and to identify how testing should be approached. Following the review, we were then asked to provide not only the independent test leadership but also the test delivery capability that the programme needed.

Working to very tight timescales, driven by the business cycle and the need to get the major US market live within an acceptable timeframe, we faced major challenges:

  • Live markets had to be protected from regression issues within the next major release, but there was no fit-for-purpose regression test pack.
  • Live market business process documentation was incomplete or, having been written when the system was conceived, at odds with what the live businesses were actually doing with the system they had received.
  • System documentation was non-existent, out of date or, to all intents and purposes, inaccessible.
  • Knowledge and power lay with individuals, many of them contractors, and were hard to identify. Whole areas of the system were unknowns because people had left.
  • The programme reset necessitated a redesign of the template and its integrations with front-end systems. This was another large-scale software development: not only did live market features need to be tested, but so did a completely new system for a new market.
  • All of this took place in a stressed, multi-vendor environment, with the obvious downsides for effective collaboration.

Mobilising at scale, we tackled these challenges within three and a half months. The team:

  • Analysed the existing solution and its use, then developed and executed a regression test pack for the live businesses comprising approximately 700 business process and operations-oriented tests.
  • Tested cradle-to-grave interworking with the front-end business system of each of the live businesses.
  • Tested interworking with payment systems.
  • Undertook software testing on the changes being made for the new release.
  • Developed and executed around 400 additional business process tests applicable to the US business.
  • Delivered technical and functional testing of an upgrade to the BizTalk integration platform.
  • Built a simulator for the US business front-end system to facilitate integration testing.
  • Delivered performance testing of the core AX application using custom tools.
  • Facilitated business acceptance of the release by the live market.
  • Eliminated around one hundred and fifty defects from the solution.

Unlike in previous releases, business testing at the end of the programme did not find significant issues with the AX solution. Defects had been found and eliminated by our independent test team rather than remaining hidden in the system, waiting to frustrate the business users. Testing by the live businesses was very clean, and most of the issues raised by the US business related to data and previously unrecognised requirements. The benefits of having a disciplined, autonomous test team were clear in this outcome.

Success Factors

What made this release different? Many things including:

  • The creation of a robust autonomous test function.
  • Taking a service from an established organisation rather than attempting to use individual contractors.
  • Using hardened test specialists rather than generalist management consultants.
  • The ability to operate at scale.
  • The application of mature test analysis and definition practices.
  • The disciplined test approaches applied.
  • The technical capabilities of the team.
  • The team’s all round capabilities and ability to operate autonomously.
  • The ethos of the team: quality is important, not a nice-to-have.

Contact us