Alex Painter - 25 April 2019
The focus of software testing is shifting: away from simply checking that an application meets its technical requirements, and towards ensuring that it delivers better user experiences and business outcomes.
Or at least that's what we'd all like to believe.
The reality is that capabilities can lag behind the desire for change, and not everyone can agree on how best to represent the voice of the customer in the testing process.
Bridging that gap is what lies behind many of the recent improvements to Eggplant’s Digital Automation Intelligence suite. These include Release Insights, which can give you a better understanding of the likely business impacts of a given release. We can also now go beyond functional testing and automatically check for usability issues.
Another key part of the puzzle is how test models are built. Ensuring that the testing process truly reflects real user requirements lies at the heart of Eggplant’s philosophy.
But it’s more than just a philosophy. Eggplant acquired a number of performance monitoring solutions early in 2018, and these acquisitions have made it possible for us to draw on live customer data to drive testing.
This part of the Eggplant offering—branded Customer Experience Insights—is all about understanding and improving real user experiences, primarily through monitoring (both synthetic and real user). Synthetic monitoring provides independent assurance that key web pages, journeys, and other online services are both available and performing well. This is complemented by real user monitoring, which provides a wealth of information about how a website is performing for different groups of users. More than this, it can predict how changes in performance will affect KPIs such as conversion and bounce rate for those groups.
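To make the idea of a synthetic check concrete, here is a minimal illustrative sketch in Python (not Eggplant’s implementation): it requests a URL, records whether the page responded, and flags responses that exceed a performance threshold. The `slow_threshold` value and the injectable `opener` parameter are assumptions made for this example.

```python
import time
from urllib.request import urlopen
from urllib.error import URLError


def check_endpoint(url, timeout=5.0, slow_threshold=2.0, opener=urlopen):
    """Run one synthetic check: is the page available, and is it fast enough?

    `opener` defaults to urllib's urlopen but can be swapped out for testing.
    """
    start = time.monotonic()
    try:
        with opener(url, timeout=timeout) as resp:
            available = 200 <= resp.status < 400
    except URLError:
        available = False
    elapsed = time.monotonic() - start
    return {
        "url": url,
        "available": available,
        "seconds": round(elapsed, 3),
        # Only flag "slow" for pages that actually responded.
        "slow": available and elapsed > slow_threshold,
    }
```

A real synthetic monitor would run checks like this on a schedule, from multiple locations, and alert on failures or degradation; the sketch only shows the core measurement.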
The aim is to understand how customer experiences and business outcomes are affected by the technical behavior of the application. Beyond that, it's about identifying opportunities for improvement, and predicting the business impact of those improvements.
We’ve been busy tying these two threads—Digital Automation Intelligence and Customer Experience Insights—together, using live customer data to inform software testing, and we can now use real user journeys to build test models. This helps take the guesswork out of testing. We can begin to move away from treating the business analyst as the proxy for the end user and draw on real user data instead. As well as making testing more meaningful, it makes it faster too—creating a continuous loop of test, release, and monitor, and achieving that holy grail of seamlessly joining test, dev, ops, and the business.
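One simple way to picture building test models from real user journeys is as a transition-frequency model: count how often users move from one page to the next, then derive the most-travelled paths as candidates for testing. The sketch below is purely illustrative (the data format and greedy path selection are assumptions, not Eggplant’s model format):

```python
from collections import Counter


def build_transition_model(journeys):
    """Count observed page-to-page transitions across recorded user journeys.

    Each journey is an ordered list of page names visited by one user.
    """
    transitions = Counter()
    for journey in journeys:
        for src, dst in zip(journey, journey[1:]):
            transitions[(src, dst)] += 1
    return transitions


def most_common_path(model, start, length=3):
    """Greedily follow the most frequent outgoing transition from `start`."""
    path, current = [start], start
    for _ in range(length):
        candidates = {dst: n for (src, dst), n in model.items() if src == current}
        if not candidates:
            break
        current = max(candidates, key=candidates.get)
        path.append(current)
    return path


# Hypothetical recorded journeys from real user monitoring:
journeys = [
    ["home", "search", "product", "checkout"],
    ["home", "search", "product"],
    ["home", "offers", "product", "checkout"],
]
model = build_transition_model(journeys)
print(most_common_path(model, "home"))  # ['home', 'search', 'product', 'checkout']
```

Prioritizing tests by observed transition frequency is what replaces guesswork: the paths real users actually take, rather than a business analyst's best estimate, determine what gets tested first.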
Find out more about customer-driven testing in our webinar on 30 May.