During my 12-year career in software testing, I have had the privilege of working with different development methodologies. I have witnessed the transformation from Waterfall to Agile and have seen how the expectations, roles, and skills of testing professionals have taken a huge leap.

QAs are now evolving from being just a “Bug Finder” to a “Bug Preventer”. They are acquiring new skills like Automation, TDD, BDD, and White box testing, in addition to Black box testing. They are now more solution-oriented and collaborate more with the development team and business stakeholders.

I sincerely believe in continuous learning and utilizing a platform (like softwaretestinghelp.com) to share my experiences and learn new concepts and tools.

So, here I am again, to share my knowledge and experience as a “Tester in Agile” and to hear from all you guys about your thoughts and opinions.

Mindset Change of the Agile Tester

“Agile Tester – Change in Mindset” is not something that I can communicate effectively in just one section. Therefore, I plan to break my article into 3 knowledge areas:

  1. Aligning Agile testers with the Agile Manifesto
  2. Involvement of testers in TDD, BDD, and ATDD
  3. Implementing automation in Agile

Aligning Agile Tester with Agile Manifesto

The term Agile means “flexible”, or “able to move quickly”.

Agile testing is NOT a new testing technique; rather, being Agile means a change in mindset about how a testable piece of software is delivered.

Before we discuss Agile testing further, let’s flash back and try to understand the origin and the philosophy behind Agile.

The old story

Before the world moved to Agile, Waterfall was the dominant methodology in the software industry. I am not trying to explain the Waterfall model here, but will note some of the practices teams followed when implementing a particular feature.

All these pointers are based on my experience, and opinions may differ.

So here we go…

  • Developers and QAs worked as separate teams (Sometimes as Rivals 🙂 )
  • Developers and QAs referred to the same requirement document simultaneously: developers did their designing and coding, and QAs wrote their test cases, each working from that document. Planning and execution were done in silos
  • Test case reviews were done solely by the QA leads. Sharing the QA test cases with the developers was not considered a good practice. (Reason – the developers would code to the test cases and the QAs would lose out on defects)
  • Testing was considered the LAST activity of an implementation cycle. Most of the time, QAs would get the feature at the last stage and were expected to complete the entire testing in a very limited time. (And the QAs did it)
  • The only goal of the QAs was to identify bugs & defects. The overall performance of the QAs was judged based on the number of valid bugs/defects they submitted
  • The STLC and defect lifecycle were followed during execution, and email was the preferred mode of communication
  • Automation was considered an end activity and mostly focused on the UI. The regression suite was considered the best candidate for automation

Yes, following these practices did have its drawbacks:

  • Because the teams worked in silos, the only medium of communication between the developers and QAs was “Bugs & defects”
  • The QAs’ only scope was to write and execute test cases against the finished product
  • There was little to no scope for QAs to view the code or interact with the developers or the business
  • Because the entire chunk of the product was released at once, a huge responsibility fell on the QAs at release time. QAs were generally considered quality gatekeepers, and if anything went wrong in production, the entire blame was put on the QAs
  • Apart from functional testing, regression testing of the entire product, comprising a huge number of test cases, was an additional responsibility of the QAs

Out of all the drawbacks, the major disadvantage was that of “Loss of focus from the ultimate goal of delivering a good quality product at a sustainable pace”.

The ultimate goal of the team (developers + testers) is to deliver good-quality software that meets the customer requirements and is fit for use. Because of the long time span and increased time to market, the focus blurred, and the only objective that survived was to finish the implementation and move the code to UAT.

QAs concentrated only on executing test cases (putting a tick mark against the test case checklist), making sure the bugs/defects were closed or deferred, and moving on to a different project/module. So, from the QA’s perspective, the focus was not on the speed and quality of delivery but on completing the test case execution (and, of course, some automation).

Agile Philosophy

It all started in early 2001, when a group of 17 professionals met in Utah (USA) to ski, eat, relax, and have a quality discussion; what came out of it was the Agile Manifesto.

As quality professionals, it is imperative that we too understand the essence of the manifesto and try to shape our thought process accordingly:

The Agile Philosophy

Let’s try to align our thought process of testing the software with the agile manifesto, but before I do that, let’s understand one thing: in Agile, the teams are cross-skilled and everybody in the team is contributing towards the development of the product/feature.

Therefore, I prefer to call the entire team the “Development team”, which includes programmers, testers, and business analysts. Henceforth, I will refer to developers as “Programmers” and QAs as “Testers”.

#1) Working software OVER Comprehensive documentation

The ultimate goal in Agile development is to deliver a potentially shippable increment in a short period of time, which means that time to market is key. That said, it does not mean that quality is at stake. Because the time to market is short, it is important that the testing strategy and execution planning be focused on the overall quality of the system.

Testing is an endless job that can go on and on, so testers have to determine the parameters against which they can give a green signal for the product to move to production. To do so, it is imperative that testers involve themselves equally in deciding the “Definition of Ready (DoR)” and the “Definition of Done (DoD)”, and not forget to decide the “Acceptance Criteria” of the story.

Test scenarios and test cases should revolve around the Definition of Done and the Acceptance Criteria of the story. Instead of writing exhaustive test cases padded with information that is seldom used, the focus should be on precise, to-the-point scenarios. I have used the template below to write my test cases.

Template to write Test Case

The point here is to include in the test scenarios/cases only the information that is required and adds value.
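To make this concrete, here is a minimal sketch (in Python, with pytest-style assertions) of what a precise, acceptance-criterion-driven test case can look like. The login story, the lockout rule, and the `LoginService` stub are all hypothetical, invented purely for illustration:

```python
# Hypothetical acceptance criterion for an illustrative login story:
#   "A user is locked out after 3 consecutive failed login attempts."
# LoginService is a minimal stand-in for the real system under test.

MAX_FAILED_ATTEMPTS = 3  # assumed limit, taken from the acceptance criterion


class LoginService:
    def __init__(self, password="secret"):
        self.password = password
        self.failed = 0
        self.locked = False

    def login(self, password):
        if self.locked:
            return "locked"
        if password == self.password:
            self.failed = 0  # a successful login resets the counter
            return "ok"
        self.failed += 1
        if self.failed >= MAX_FAILED_ATTEMPTS:
            self.locked = True
        return "denied"


def test_lockout_after_three_failed_attempts():
    """One acceptance criterion, one precise scenario, no extra boilerplate."""
    svc = LoginService()
    for _ in range(MAX_FAILED_ATTEMPTS):
        svc.login("wrong")
    # Once locked, even the correct password is rejected
    assert svc.login("secret") == "locked"
```

The test name states the scenario, and the body maps one-to-one to the acceptance criterion; anything that does not help a reader verify the criterion is left out.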

#2) Customer collaboration OVER contract negotiation

Let’s communicate directly with the customer regarding our testing approach and try to be transparent in sharing the test cases, data and results.

Have frequent feedback sessions with the customer and share the test results. Ask the customer whether they are satisfied with the test results or want any specific scenarios to be covered. Let’s not restrict ourselves to asking questions and seeking clarification only from the product owner/business to understand the functionality and the business.

The deeper our understanding of the features, the more precise our test coverage.

#3) Responding to change OVER following a plan

The only thing which is Constant is Change!!

We cannot control change; we have to understand and accept that there will be changes in features and requirements, and we have to adapt and implement them.

Frequent changes in requirements are well accommodated in Agile; in a similar fashion, as testers, we need to keep our test plans and scenarios flexible enough to accommodate new changes.

Traditionally, we create a test plan and the same is followed throughout the lifecycle of the project. Instead, in Agile, the plan has to be dynamic in nature and moulded as per the requirements. Again, the focus should be on meeting the Definition of Done and the Acceptance criteria of the story.

I don’t see a need to create a test plan for every story; instead, we can create a test plan at the epic level. Just as epics are written and worked upon, simultaneous effort can be put into creating test plans for them. There may or may not be a defined template for this; the point is just to make sure we cover the quality aspect of the epic entirely.

Try to utilize the PI (Program Increment) planning days to determine the high-level test scenarios for each story based on the Definition of Done and the Acceptance Criteria.

#4) Communication & collaboration OVER process & tools

Testers are very process-oriented (which is perfectly fine), but we should keep in mind that, while following a process, the turnaround/response time for an issue must not be impacted.

If the team is co-located, many issues can be resolved through direct communication. We also have daily stand-ups, which provide a good platform to resolve issues. It is still important to log a defect, but that should be done for tracking purposes only.

Testers should pair up with programmers and collaborate to resolve defects; product owners can also be pulled in if needed. Testers should actively and proactively participate in TDD (Test-Driven Development) and collaborate with the programming team to share scenarios and try to identify defects at the unit level itself.
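As a sketch of how that collaboration plays out in TDD, the test is written first, from a scenario the tester and programmer agree on, and the minimal code to pass it is written afterwards. The discount story and the `order_total` function below are hypothetical, invented for illustration:

```python
# Step 1 – tester and programmer agree on a scenario and write the tests
# first (hypothetical story: "orders of 5 or more items get a 10% discount").
# At this point order_total does not exist yet, so these tests fail.

def test_discount_applied_on_bulk_orders():
    assert order_total(unit_price=10.0, quantity=5) == 45.0


def test_no_discount_on_small_orders():
    assert order_total(unit_price=10.0, quantity=4) == 40.0


# Step 2 – the programmer writes the minimal implementation that makes
# the tests pass; defects in the rule surface at the unit level.

def order_total(unit_price, quantity):
    total = unit_price * quantity
    if quantity >= 5:
        total *= 0.9  # assumed 10% bulk discount from the scenario
    return total
```

A refactoring pass would then follow with the tests kept green, closing the red–green–refactor loop.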


“Testing in Agile” is not a different technique, rather it’s a change in mindset, and change does not happen overnight. It requires knowledge, skill and proper mentoring.

In the second part of the series, I will discuss testers’ involvement in Test-Driven Development (TDD), Behavior-Driven Development (BDD), and Acceptance Test-Driven Development (ATDD) in Agile.

About the author: This article is by STH Author Shilpa. She has been working in the software testing field for the past 10+ years in domains like Internet advertising, Investment Banking, and Telecom.

Keep watching this space for more.
