Doctrine and context-driven testing

I recently attended a Rex Black webinar on the subject of “Strategies of Testing, Not Schools”. The content of this webinar seemed to be an extension of the lightning talk Rex gave at the end of the ANZTB 2014 conference in Sydney, and his basic argument was that it’s better to think in terms of test strategies than in terms of “schools of testing”. He claimed that belonging to a school means being bound to its doctrines/teachings, whereas choosing different strategies for different needs means that you better serve your stakeholders.

He mentioned that the idea of “schools of testing” was introduced by Kaner, Bach and Pettichord in Lessons Learned in Software Testing, the same book that outlined the seven principles of context-driven testing. Rex suggested that these principles are essentially “common sense”, apart from the “silly” idea that there are no best practices. He claimed that school membership is only ever talked about by CDT folks and that this community uses the differences between themselves and those of other schools as “an excuse to be exceptionally rude” and “yell at other people on Twitter”. The CDT community was (again) described as an “echo chamber” which, by not listening or talking to anyone outside its own community, has made itself “irrelevant”. Rex’s opinion seemed to be that CDT has had an overall negative impact on the software testing industry and has created a “schism” that doesn’t advance debate in this field. He also noted the CDT community’s opposition to ISO 29119 (which he said “captures the skills and wisdom of a large number of people”) and dismissed the impact that this standard will have on the way testing is actually performed.

Rex discussed the following seven test strategies:

  • Analytical (requirements-based testing and risk-based testing)
  • Model-based (operational profiling and UML modeling)
  • Methodical (standard set of test conditions that don’t vary across iterations or releases)
  • Process-compliant (follow set of processes defined by others)
  • Reactive (react when software is delivered, opposite of analytical)
  • Consultative (rely on input of stakeholders to determine the test conditions, e.g. outsourcing contracts)
  • Regression-averse (manage risk of regression through testing, extensive automation at one or more levels)

The reactive test strategies part of the webinar was very interesting. In my opinion, many of the risks or downsides of reactive approaches (exploratory testing was given as an example) have already been well addressed by the CDT community: problems such as lack of demonstrable coverage and poorly defined test oracles have been considered by many of us, and there are plenty of examples from real practitioners of how to deal with them, so much of the claimed risk/downside here is not real.

Having listened to the webinar and the Q&A that followed, it seems to me that there is much misconception about the CDT community. I can only speak for myself as part of the community, and I certainly don’t feel like it’s a community based on doctrine and blindly following a teacher/leader; quite the opposite, in fact. At the CDT-focused conferences like Let’s Test and CAST, there is plenty of debate and difference of opinion, and these differences are given the opportunity to be aired and discussed in every session. I don’t see the same open disagreement at the more “mainstream” testing conferences I attend, so the suggestion that it’s a “follow the leader” community seems misplaced in my experience. The idea that aligning with the principles of the CDT school binds me to a single test strategy doesn’t make any sense – the very essence of CDT is doing what makes sense in the situation you’re in, not blindly following the same process or technique in every project.

How do we counter such perception? After attending Let’s Test and CAST this year, I came away from both events feeling like the CDT community is strong, passionate and becoming more mainstream. I’ve already blogged about the ISO 29119 petition as perhaps being a turning point for the CDT community; the thoughtful opposition being voiced by many in the community serves us well in advertising the critical thinking skills that are a trademark of the people I meet and discuss testing with in this community.

5 thoughts on “Doctrine and context-driven testing”

  1. testsheepnz

    Interesting post Lee, thanks! Kind of ironic how he describes CDT people as “exceptionally rude” while being a little rude himself. There is of course a challenge in that statement: how can any tester be more persuasive and inclusive? Which is no bad thing for anyone to think about.

  2. Michael Bolton

    How do we counter such perception?

    One of the best ways I can think of is to post whip-smart blog posts like this one.

    Another is to engage on the blogs and forums and Twitter feeds with the object of countering any nonsense that is being spread. Yet another is to remain committed to the integrity of our arguments and our forms of debate, patiently and reasonably dispelling falsehoods and questioning mythology.

    Well done, Lee.

    —Michael B.

    1. therockertester Post author

      Thanks for taking the time to read my post, Michael, and for your ideas and encouragement.
      The words “patiently” and “reasonably” leap out of your comment and will serve as good guiding principles for my interactions.

  3. Pingback: Five Blogs – 8 September 2014 | 5blogs

  4. Pingback: Testing Bits – 9/7/14 – 9/13/14 | Testing Curator Blog
