I recently had the pleasure of heading to Southern California to attend and present at the long-running STARWest conference. Although the event is always held at the Disneyland Resort, it’s a serious conference and attracted a record delegation of over 1200 participants. For a testing conference, this is just about as big as it gets and was probably on a par with some recent EuroSTARs that I’ve attended.
My conference experience consisted of attending two full days of tutorials, then two conference days, plus presenting one track session and doing an interview for the Virtual Conference event. It was an exhausting few days, but also a very engaging & enjoyable time.
Rather than going through every presentation, I’ll talk through a few highlights:
- Michael Bolton tutorial “Critical Thinking for Software Testers”
The prospect of again spending a full day with Michael was an exciting one – and he didn’t disappoint. His tutorial drew heavily on the content of Rapid Software Testing (as expected), but this wasn’t an issue for his audience of about 50, as hardly anyone was familiar with RST, his work with James Bach, Jerry Weinberg, etc. Michael defined “critical thinking” as “thinking about thinking with the aim of not getting fooled” and illustrated this many times with interesting examples. The usual topics familiar to those of us who follow RST and Bolton/Bach’s work were all covered: “checking vs. testing”, critical distance, models of testing, system 1 vs. system 2 thinking, and the “Huh? Really? And? So?” heuristic. It seemed that Michael converted a few early skeptics during this class. An enjoyable and stimulating day’s class.
- Rob Sabourin tutorial “Test Estimation in the Face of Uncertainty”
I was equally excited to be spending half a day in the company of someone who has given me great support and encouragement – and without whom I probably wouldn’t have made the leap into presenting at conferences. Whenever Rob Sabourin presents or teaches, you’re guaranteed passion and engagement, and he did a fine job of covering what can be a pretty dry subject. His audience of about 40 was split roughly 50/50 between people on agile and waterfall projects, and while some of the estimation techniques he outlined suited one or the other SDLC model better, others were generic. He covered most of the common estimation techniques and often expressed his opinion on their usefulness! For example, using “% of project effort/spend” to estimate the testing required was seen as ignoring many of the factors that influence how much testing we need to do, as well as the fact that small development efforts can result in big testing efforts. Rob also said this technique “belittles the cognitive aspects of testing” – I heartily agreed! Rob also cited Steve McConnell’s work on developer:tester ratios, in which he found wide variability depending on the organization and environment (e.g. NASA has 10 testers to each developer for flight control software systems, while in business systems Steve found ratios of between 3:1 and 20:1), making talk of an “industry standard” for this measurement seem futile. More agile-friendly techniques such as Wisdom of the Crowd, planning poker and T-shirt sizing were also covered. Rob finished off with his favourite technique, Hadden’s Size/Complexity Technique (from Rita Hadden), which seemed like a simple way to arrive at decent estimates and iterate on them over time (a rough sketch of the general idea is below).
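To make the size/complexity idea a little more concrete, here’s a minimal sketch of how a matrix-based estimate of this general kind might work. The buckets, hour values and the `estimate_testing_effort` helper are my own hypothetical illustrations rather than Rita Hadden’s actual method; in practice you’d seed the numbers from your own historical data and refine them each iteration.

```python
# Hypothetical sketch of a size/complexity matrix estimate: classify each work
# item by size and complexity, look up a baseline testing effort from (assumed)
# historical data, and sum. The hours below are placeholders, not real data.

# Baseline testing effort in hours, indexed by (size, complexity).
BASELINE_HOURS = {
    ("small", "simple"): 2,
    ("small", "complex"): 6,
    ("medium", "simple"): 8,
    ("medium", "complex"): 20,
    ("large", "simple"): 24,
    ("large", "complex"): 60,
}

def estimate_testing_effort(work_items):
    """Sum the baseline hours for a list of (name, size, complexity) items."""
    return sum(BASELINE_HOURS[(size, complexity)] for _, size, complexity in work_items)

if __name__ == "__main__":
    backlog = [
        ("login rework", "medium", "complex"),
        ("tooltip copy change", "small", "simple"),
        ("reporting module", "large", "complex"),
    ]
    print(f"Estimated testing effort: {estimate_testing_effort(backlog)} hours")
```

The value is less in the arithmetic and more in the conversation: comparing actuals against the matrix each iteration is what makes the estimates improve over time.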
- Mary Thorn keynote “Optimize Your Test Automation to Deliver More Value”
The second conference day kicked off with a keynote from Mary Thorn (of Ipreo). She based her talk around various experiences of implementing automation during her consulting work and, as such, it was full of really good practical content. I wasn’t familiar with Mary before this keynote, but I enjoyed her presentation style and pragmatic approach.
- Jared Richardson keynote “Take Charge of Your Testing Career: Bring Your Skills to the Next Level”
The conference was closed out by another keynote, from Jared Richardson (of Agile Artisans). Jared is best known as one of the authors of the GROWS methodology and he had some good ideas around skills development in line with that methodology. He argued that experiments lead to experience, and that we gain experience both by accident and intentionally. He also mentioned the Dreyfus model of skill acquisition. He questioned why we so often compare our industry to other “building” industries, when ours is very young compared to those with hundreds or thousands of years of experience behind them. He implored us to adopt a learner mentality (rather than an expert mentality) and to become “habitual experimenters”. This was an engaging keynote, delivered very well by Jared and packed full of great ideas.
Moving on to my track session, my topic was “A Day in the Life of a Test Architect” and I was up immediately after lunch on the second day of the conference (and pitted directly against the legendary – and incredibly entertaining – Isabel Evans).
I was very pleased to get essentially a full house for my talk, and my initial worries about it being a little short for the one-hour slot were unfounded as I ended up speaking for a good 45 minutes.
There was a good Q&A session after my talk too, which I had to cut short to make way for the next speaker to set up in the same room. It was good to meet some other people in my audience with the title of “Test Architect” and compare notes.
Shortly after my talk, I had the pleasure of giving a short speaker interview with Jennifer Bonine as part of the event’s “Virtual Conference” (a free way to remotely watch the keynotes and some other talks from the event).
Looking at some of the good and not so good aspects of the event overall:
Good
- The whole show was very well-organized, everything worked seamlessly based on years of experience of running this and similar conferences.
- There was a broad range of talks to choose from and they were generally of a good standard.
- The keynotes were all excellent.
Not so good
- The sheer size of the event was quite overwhelming, with so much going on all the time that it was hard to choose what to see when (and the resulting FOMO).
- As a speaker, I was surprised not to have a dedicated facilitator for my room, to introduce me, facilitate Q&A, etc. (I had made the assumption that track talks – at such a large and mature event – would be facilitated, but there was nothing in the conference speaker pack to indicate that this would be the case.)
- I’ve never received so much sponsor email spam after registering for a conference.
- I generally stuck to my conference attendance heuristic of “don’t attend talks given by anyone who works for a conference sponsor”, which immediately restricted my programme quite considerably. There were just too many sponsor talks for my liking.
In terms of takeaways:
- Continuous Delivery and DevOps was a hot topic, with its own theme of track sessions dedicated to it – there was a common undercurrent of fear about testers losing their jobs in such environments, but also some good talks about how testing changes – rather than disappears – in these environments.
- Agile is mainstream (informal polls in some talks indicated 50-70% of the audience were on agile projects), yet many testers are still not embracing it. There seems to be some leading-edge work from (some of) the true CD companies and some very traditional work in enterprise environments, with a big middle ground of agile/hybrid adoption rife with poor process, confusion and learning challenges.
- The topic of “schools of testing” again came up, perhaps due to the recent James Bach “Slide Gate” incident. STARWest is a broad church and the idea of a “school of schools” (proposed by Julie Gardiner during her lightning keynote talk) seemed to be well received.
- There is plenty of life left in big commercial testing conferences with the big vendors as sponsors – this was the biggest STARWest yet and the Expo was huge and full of the big names in testing tools, all getting plenty of interest. The size of the task in challenging these big players shouldn’t be underestimated by anyone trying to move towards more pragmatic and people-oriented approaches to testing.
Thanks again to Lee Copeland and all at TechWell for this amazing opportunity, I really appreciated it and had a great time attending & presenting at this event.