Luminaries and those who have given me inspiration in my testing career

It’s that time of year again when the Software Test Professionals open the voting for their Software Test Luminary Award and this year’s candidates are:

  • James Bach
  • Michael Bolton
  • Rob Sabourin
  • John von Neumann (posthumously)

The first three of these candidates have been inspirational in my testing career as I’m sure they have been in so many others too.

I first met Michael Bolton in 2007, when Quest Software brought him in-house to run the Rapid Software Testing course at the Kanata office (near Ottawa in Canada). It was a long trip from Melbourne to Kanata to take part in this course, but I’m so glad that I did – it changed my view of what software testing was, a true turning point in my career – and I have Michael to thank for showing me the light in terms of context-driven testing. Looking back on this now, I realize that this is when I actually started to have some passion for my work as a tester, despite having been in software testing for seven years before that.

I had seen Rob Sabourin present many times at conferences before I finally had any personal interaction with him in 2013, when he acted as content owner for the Australian Workshop on Software Testing in Sydney. I presented an experience report (ER) about my work with an offshore testing team in China doing session-based exploratory testing, and he was very excited by my story (I know he says it a lot, but when he said “that’s so cool”, I was rather humbled) and encouraged me to share it more widely. He inspired me to submit a proposal to talk at Let’s Test 2014 in Sweden, and I gave an extended version of that ER at this great context-driven testing conference. Rob was so supportive of my proposal and was also “on the ground” at Let’s Test, so my thanks go to him once again for his inspiration and continued support. (He’s also great fun to have dinner with!)

When people think of the context-driven school of testing, James Bach is probably the first name they will mention. James has been instrumental in getting recognition for the ideas of context-driven testing, and he continues to push the boundaries and encourage others to do the same. I’ve watched him present many times and I always come away with new ideas and feeling inspired; his passion for his craft is infectious. I was lucky enough to take all of my testers to see James present Rapid Software Testing in Melbourne in 2011, so I got to see RST from his perspective and build on what I’d experienced with Michael Bolton in 2007. A dinner conversation over Indian food and red wine was a rare treat too, and it was great to see him in fine form at the recent Let’s Test Oz conference.

These are my personal experiences of three people I respect and from whom I have drawn great inspiration during my testing career. If you have had similar experiences and been inspired by one or more of these guys, then I’d encourage you to cast a vote in recognition of that. If you don’t know much about any of these luminary candidates, then please use this as a trigger to go and learn more about them and to devour the rich resources they provide to help you engage and become an even better tester.

Greetings, welcome to the context-driven testing community

I’ve just had the wonderful experience of attending the very first fully context-driven testing conference to be held in Australia, Let’s Test Oz. The three-day event was an excellent illustration of what makes this community so special – the spirit of sharing, the openness, the desire to learn and, above all, the passion are infectious and help to make the conferences under this banner truly remarkable.

This post is not really a review of Let’s Test, however (even though it was great). The opening keynote of the conference came from James Bach, “How Do I Know I’m Context-driven?”. It was, as always, an enlightening way to spend an hour, and I like the way that James genuinely tries to move things forward – there’s always new stuff in his presentations. He said several interesting things, but I want to focus on an important takeaway for me – how to attract and welcome people into our community.

James pointed out that CDT was originally “conceived to be open”, but that “no-one is entitled to an unchallenged opinion”. This presents us as a community with some challenges, and James acknowledged that his personality (and his stated desire within the community, “I want to push the state of software testing forward”) does not make him the best person for the job of welcoming people in.

He introduced the idea of “greeters and guides” to bring people in and ‘show them around’ our CDT world – these people need soft skills, such as empathy and politeness, as well as familiarity with CDT and a passion for sharing and learning. These greeters and guides need to clearly communicate the message that everybody is welcome provided you want to do good work and are prepared for constant learning – and that these are ways you will earn respect in this community.

The good news is, I think, that there are lots of greeters and guides already out there, doing great work in publicizing our ideas and building awareness of what CDT is all about. It is the responsibility of all of us with a passion for CDT to act in these roles to nurture the next waves of talent into this already incredibly rich community.

So, to anyone interested in learning more about context-driven testing, I hope the bloggers and twitterers provide you with a welcoming introduction to our community and, most importantly, show you an avenue to get involved in a way that you feel comfortable with. Greetings… and welcome (and hope to see you at a Let’s Test one day!).

Doctrine and context-driven testing

I recently attended a Rex Black webinar on the subject of “Strategies of Testing, Not Schools”. The content of this webinar seemed to be an extension of the lightning talk Rex gave at the end of the ANZTB 2014 conference in Sydney, and his basic argument was that it’s better to think in terms of test strategies than in terms of “schools of testing”. He claimed that belonging to a school means being bound to its doctrines/teachings, whereas choosing different strategies for different needs means that you better serve your stakeholders.

He mentioned that the idea of “schools of testing” was introduced by Kaner, Bach and Pettichord in Lessons Learned in Software Testing and that the same book also outlined the seven principles of context-driven testing. Rex suggested that these principles are essentially “common sense”, apart from the “silly” idea that there are no best practices. He claimed that school membership is only ever talked about by CDT folks and that this community uses the differences between itself and other schools as “an excuse to be exceptionally rude” and “yell at other people on Twitter”. The CDT community was (again) described as an “echo chamber” which, by not listening or talking to anyone outside its own community, has made itself “irrelevant”. Rex’s opinion seemed to be that CDT has had an overall negative impact on the software testing industry and has created a “schism” that doesn’t advance debate in this field. He also noted the CDT community’s opposition to ISO 29119 (which he said “captures the skills and wisdom of a large number of people”) and dismissed the impact that this standard will have on the way testing is actually performed.

Rex discussed the following seven test strategies:

  • Analytical (requirements-based testing and risk-based testing)
  • Model-based (operational profiling and UML modeling)
  • Methodical (standard set of test conditions that don’t vary across iterations or releases)
  • Process-compliant (follow set of processes defined by others)
  • Reactive (react when software is delivered, opposite of analytical)
  • Consultative (rely on input of stakeholders to determine the test conditions, e.g. outsourcing contracts)
  • Regression-averse (manage risk of regression through testing, extensive automation at one or more levels)

The reactive test strategies part of the webinar was very interesting. In my opinion, many of the risks or downsides of reactive approaches (exploratory testing was given as an example) have already been well addressed by the CDT community – problems like lack of demonstrable coverage and poorly defined test oracles have been considered by many of us in this community, and there are plenty of examples from real practitioners of how to deal with them, so much of the risk/downside here is not real.

Having listened to the webinar and the Q&A that followed, it seems to me that there is much misconception about the CDT community. I can only speak for myself as part of the community and say that I certainly don’t feel like it’s a community based on doctrine and blindly following a teacher/leader – quite the opposite, in fact. At CDT-focused conferences like Let’s Test and CAST, there is plenty of debate and difference of opinion, and these differences are given the opportunity to be aired and discussed in every session. I don’t see the same open disagreement at the more “mainstream” testing conferences I attend, so the suggestion that it’s a “follow the leader” community seems misplaced in my experience. The idea that aligning with the principles of the CDT school binds me to a single test strategy doesn’t make any sense – the very essence of CDT is doing what makes sense in the situation you’re in, not blindly following the same process or technique in every project.

How do we counter such perceptions? After attending Let’s Test and CAST this year, I came away from both events feeling like the CDT community is strong, passionate and becoming more mainstream. I’ve already blogged about the ISO 29119 petition as perhaps being a turning point for the CDT community; the thoughtful opposition being voiced by many in the community serves us well in advertising the critical thinking skills that are a trademark of the people I meet and discuss testing with in this community.