The Challenge: Decreasing Testing Time in Parallel and Agile Development Environments
Ignis Asset Management, an asset management company with head offices in Glasgow and London and over $100 billion (USD) in assets (since acquired by Standard Life Investments (Holdings) Limited), recently started a large project to outsource its back office while, in parallel, implementing the architecture and applications required to support the outsourcing model.
Aaron Martin, Programme Test Manager at Ignis, explained that meeting the business requirements meant developing and delivering a number of projects in parallel. However, the company lacked the resources, budget, and management capacity to set up and sustain multiple test environments internally.
Limited access to proper test environments obstructed their ability to validate each application under test's (AUT) integration with third-party architectures. Compounding the problem, their third-party suppliers also had limited test environment access, which restricted the time and scope of their joint integration testing.
Around the same time, the company was adopting an agile development methodology. To support it, they needed an automated testing solution that could provide fast feedback after every build.
It was clear that the company's existing testing process needed to be optimized to meet these new requirements. Executing the existing test plan took around 10 man-days, which involved not only manually entering transactions in the application but also manually building simple stubs to simulate interactions with third-party components that were not yet integrated. To enable complete testing in a more agile, parallel development model without building and maintaining additional test environments, the team had to:
Test the application against the company's test architecture before further integration into the complete system.
Efficiently simulate the AUT's interactions with third-party systems that are not yet integrated.
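The simple stubs the team had been building manually can be illustrated with a minimal sketch. This is not the team's actual implementation; the transaction format and field names ("order_id", "status") are hypothetical.

```python
def third_party_stub(transaction: dict) -> dict:
    """Simulate a not-yet-integrated third-party system by
    acknowledging every well-formed transaction it receives.
    The field names used here are hypothetical."""
    if "order_id" not in transaction:
        return {"status": "REJECTED", "reason": "missing order_id"}
    return {"status": "ACCEPTED", "order_id": transaction["order_id"]}


# The AUT's outbound call is routed to the stub instead of the
# real third-party endpoint during testing.
print(third_party_stub({"order_id": "ORD-001"}))
```

A stub like this answers only the happy path, which is exactly the limitation that motivated the move to richer virtual assets described below.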
Extensive Application Testing before Integration
API Testing and Service Virtualization solutions were implemented to establish a test automation framework that addressed the challenges outlined above while extending test automation across the SDLC.
Initially, the implementation of the API Testing solution concentrated on automating the generation of order management traffic at the application API level.
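Automated generation of order management traffic at the API level might look like the sketch below. The message schema, instrument identifier, and function names are assumptions for illustration, not the actual application API.

```python
import itertools
import json


def generate_order(seq: int, instrument: str, side: str, quantity: int) -> str:
    """Build one hypothetical order message for the application API."""
    return json.dumps({
        "order_id": f"ORD-{seq:05d}",
        "instrument": instrument,
        "side": side,          # "BUY" or "SELL"
        "quantity": quantity,
    })


def generate_traffic(count: int):
    """Yield a stream of alternating buy/sell orders for a regression run."""
    sides = itertools.cycle(["BUY", "SELL"])
    for seq in range(1, count + 1):
        yield generate_order(seq, "GB00TEST0001", next(sides), 100 * seq)


for message in generate_traffic(3):
    print(message)
```

Driving the AUT through its API rather than its UI is what lets a generator like this replace days of manual transaction entry.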
Service Virtualization was implemented in parallel with the test automation to simulate the transaction response messages from the dependent third-party components. Beginning with a simple virtual asset that returned a positive response to all generated transactions, they extended the virtual assets to handle more complex response cases and scenarios.
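A virtual asset that has grown beyond the always-positive case routes each request to a canned response based on its characteristics. The following is a minimal sketch under assumed scenario rules and field names; the real virtual assets would be defined in the virtualization tool, not hand-coded like this.

```python
def virtual_asset(transaction: dict) -> dict:
    """Return a canned third-party response selected by the
    characteristics of the incoming transaction (hypothetical rules)."""
    if transaction.get("quantity", 0) <= 0:
        return {"status": "REJECTED", "reason": "invalid quantity"}
    if transaction.get("instrument") == "GB00UNKNOWN1":
        return {"status": "REJECTED", "reason": "unknown instrument"}
    if transaction.get("simulate") == "slow":
        # Negative-path scenario: the third party responds late.
        return {"status": "PENDING", "retry_after_seconds": 30}
    return {"status": "ACCEPTED"}
```

Encoding rejection and delay scenarios lets the automated tests exercise the AUT's error handling without any access to the supplier's environment.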
They also established a "quality gate" of automated tests and virtual assets that must be passed before proceeding to the integration phase.
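A quality gate of this kind is, at bottom, a pass/fail decision over the automated test results. The sketch below assumes a result format and threshold of our own invention; the case study does not describe the gate's actual criteria.

```python
def quality_gate(results: list[dict], required_pass_rate: float = 1.0) -> bool:
    """Decide whether a build may proceed to the integration phase.

    `results` is a list of {"name": ..., "passed": bool} entries from
    the automated tests run against the virtual assets (hypothetical
    format). By default, every test must pass.
    """
    if not results:
        return False  # no test evidence, no promotion
    passed = sum(1 for r in results if r["passed"])
    return passed / len(results) >= required_pass_rate


build_results = [
    {"name": "order_entry", "passed": True},
    {"name": "settlement_ack", "passed": True},
]
print("proceed to integration" if quality_gate(build_results) else "build blocked")
```

Wiring such a check into the build pipeline makes the gate automatic: a failing regression run blocks integration without any manual sign-off.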
The Transformation from a Manual Testing Process to an Automated One
A major challenge in the transformation was the lack of test resources with experience in test automation and service virtualization. The company therefore opted to use automation developers to build out the test requirements, replacing the entire manual test resource structure with Supero automation resources.
The expertise of the Supero resources in building automated tests within the scrum teams proved to be a major factor in the company's successful adoption of the agile environment.
The Result: Testing Time Reduced by 20X
Integrated functional test automation combined with service virtualization reduced the execution and verification time for the regression test plan from 10 days to about 12 hours. The resulting test plan was not only automated but also considerably more extensive.
Earlier automation efforts had focused on test automation at the UI level, with inconsistent success; by concentrating on the core test requirements, the team got far more value from its investment in automation.
This not only addressed the project's original challenges but also enabled automated testing from the unit level up to system integration. To achieve this level of automation, testers cultivated close working relationships with the development team. The level and quality of testing in the company are now at an all-time high.