Monthly Archives: October 2016

My experience report of attending and presenting at STARWest 2016

I recently had the pleasure of heading to Southern California to attend and present at the long-running STARWest conference. Although the event is always held at the Disneyland Resort, it’s a serious conference and attracted a record delegation of over 1200 participants. For a testing conference, this is just about as big as it gets and was probably on a par with some recent EuroSTARs that I’ve attended.

My conference experience consisted of attending two full days of tutorials then two conference days, plus presenting one track session and doing an interview for the Virtual Conference event. It was an exhausting few days but also a very engaging & enjoyable time.

Rather than going through every presentation, I’ll pick out a few highlights:

  • Michael Bolton tutorial “Critical Thinking for Software Testers”
    The prospect of again spending a full day with Michael was an exciting one – and he didn’t disappoint. His tutorial drew heavily from the content of Rapid Software Testing (as expected), but this was not a big issue for his audience of about 50, as hardly anyone was familiar with RST, his work with James Bach, Jerry Weinberg, etc. Michael defined “critical thinking” as “thinking about thinking with the aim of not getting fooled” and illustrated this many times with interesting examples. The usual topics familiar to those of us who follow RST and Bolton/Bach’s work – “checking vs. testing”, critical distance, models of testing, system 1 vs. system 2 thinking, and the “Huh? Really? And? So?” heuristic – were all covered, and it seemed that Michael converted a few early skeptics along the way. An enjoyable and stimulating day’s class.
  • Rob Sabourin tutorial “Test Estimation in the Face of Uncertainty”
    I was equally excited to be spending half a day in the company of someone who has given me great support and encouragement – and without whose support I probably wouldn’t have made the leap into presenting at conferences. Whenever Rob Sabourin presents or teaches, you’re guaranteed passion and engagement, and he did a fine job of covering what can be a pretty dry subject. His audience of about 40 was split 50/50 between those on agile and waterfall projects; some of the estimation techniques he outlined suited one or the other SDLC model better, while others were generic. He covered most of the common estimation techniques and often expressed his opinion on their usefulness! For example, using “% of project effort/spend” as a way of estimating the testing required was seen as ignoring many factors that influence how much testing we need to do, as well as the fact that small development efforts can result in big testing efforts. Rob also said this technique “belittles the cognitive aspects of testing”, with which I heartily agreed! Rob also cited the work of Steve McConnell on developer:tester ratios, in which he found wide variability in this ratio depending on the organization and environment (e.g. NASA has 10 testers to each developer for flight control software systems, while in business systems Steve found ratios of between 3:1 and 20:1), making talk of an “industry standard” for this measurement seem futile. More agile-friendly techniques such as Wisdom of the Crowd, planning poker and T-shirt sizing were also covered. Rob finished off with his favourite technique, Hadden’s Size/Complexity Technique (from Rita Hadden), which seemed like a simple way to arrive at decent estimates to iterate on over time.
  • Mary Thorn keynote “Optimize Your Test Automation to Deliver More Value”
    The second conference day kicked off with a keynote from Mary Thorn (of Ipreo). She based her talk on various experiences of implementing automation during her consulting work, so it was really good practical content. I wasn’t familiar with Mary before this keynote but I enjoyed her presentation style and pragmatic approach.
  • Jared Richardson keynote “Take Charge of Your Testing Career: Bring Your Skills to the Next Level”
    The conference was closed out by another keynote, from Jared Richardson (of Agile Artisans). Jared is best known as one of the authors of the GROWS methodology and he had some good ideas around skills development in line with it. He argued that experiments lead to experience, and that we gain experience both by accident and intentionally. He also mentioned the Dreyfus model of skill acquisition. He questioned why we so often compare ourselves to other “building” industries when software is very young compared to industries with hundreds or thousands of years of experience behind them. He implored us to adopt a learner mentality (rather than an expert mentality) and to become “habitual experimenters”. This was an engaging keynote, delivered very well by Jared and packed full of great ideas.

Moving on to my track session presentation, my topic was “A Day in the Life of a Test Architect” and I was up immediately after lunch on the second day of the conference (and pitted directly against the legendary – and incredibly entertaining – Isabel Evans):

Room signage for Lee's talk at STARWest

I was very pleased to get essentially a full house for my talk, and my initial worries about the talk being a little short for the one-hour slot were unfounded as I ended up going for a good 45 minutes.

There was a good Q&A session after my talk too, which I had to cut short to make way for the next speaker to set up in the same room. It was also good to meet some other people in my audience with the title of “Test Architect” and compare notes.

Shortly after my talk, I had the pleasure of giving a short speaker interview as part of the event’s “Virtual Conference” (a free way to remotely see the keynotes and some other talks from the event), with Jennifer Bonine:

Lee being interviewed by Jennifer Bonine for the STARWest Virtual Conference

Looking at some of the good and not so good aspects of the event overall:

Good

  • The whole show was very well organized; everything worked seamlessly, reflecting years of experience of running this and similar conferences.
  • There was a broad range of talks to choose from and they were generally of a good standard.
  • The keynotes were all excellent.

Not so good

  • The sheer size of the event was quite overwhelming, with so much going on all the time that it was hard for me to choose what to see when (resulting in some FOMO).
  • As a speaker, I was surprised not to have a dedicated facilitator for my room, to introduce me, facilitate Q&A, etc. (I had made the assumption that track talks – at such a large and mature event – would be facilitated, but there was nothing in the conference speaker pack to indicate that this would be the case.)
  • I’ve never received so much sponsor email spam after registering for a conference.
  • I generally stuck to my conference attendance heuristic of “don’t attend talks given by anyone who works for a conference sponsor”, which immediately restricted my programme quite considerably. There were just too many sponsor talks for my liking.

In terms of takeaways:

  • Continuous Delivery and DevOps was a hot topic, with its own track of sessions dedicated to it. There seemed to be a common thread of fear about testers losing their jobs in such environments, but also some good talks about how testing changes – rather than disappears – within them.
  • Agile is mainstream (informal polls in some talks indicated 50-70% of the audience were on agile projects), yet many testers are still not embracing it. There seems to be some leading-edge work from (some of) the true CD companies and some very traditional work in enterprise environments, with a big middle ground of agile/hybrid adoption rife with poor process, confusion and learning challenges.
  • The topic of “schools of testing” again came up, perhaps due to the recent James Bach “Slide Gate” incident. STARWest is a broad church and the idea of a “school of schools” (proposed by Julie Gardiner during her lightning keynote talk) seemed to be well received.
  • There is plenty of life left in big commercial testing conferences with the big vendors as sponsors – this was the biggest STARWest yet and the Expo was huge and full of the big names in testing tools, all getting plenty of interest. The size of the task in challenging these big players shouldn’t be underestimated by anyone trying to move towards more pragmatic and people-oriented approaches to testing.

Thanks again to Lee Copeland and all at TechWell for this amazing opportunity – I really appreciated it and had a great time attending & presenting at this event.

Making the most of conference attendance

I attend a lot of testing conferences (and present at a few too), most recently the massive STARWest held at Disneyland in Anaheim, California. I’ve been regularly attending such conferences for about ten years now and have noticed some big changes in the behaviour of people during these events.

Back in the day, most conferences dished out printed copies of the presentation slides and audience members generally seemed to follow along in the hard copy, making notes as the presentation unfolded. It was rare to see anyone checking emails on a laptop or phone during talks. The level of engagement with the speaker generally seemed quite high.

Fast forward ten years and it’s a very different story. Thankfully, most conferences no longer feel the need to demolish a forest to print out the slides for everyone in attendance. However, I have noticed a dramatic decrease in note taking during talks (whether that be on paper or virtually) and a dramatic increase in electronic distractions (such as checking email, internet surfing, and tweeting). The level of engagement with the presentation content seems much lower (to me) than it used to be.

I’m probably old school in that I like to take notes – on paper – during every presentation I attend, not only to give me a reference for what was of interest to me during the talk, but also to practice the key testing skill of note taking. Taking good notes is an under-rated element of the testing toolbox and so important for those practicing session-based exploratory testing.

Given that conference speakers put huge effort into preparing & giving their talks and employers spend large amounts of money for their employees to attend a conference, I’d encourage conference attendees to make every effort to be “in the moment” for each talk, take some notes, and then catch up on those important emails in the many breaks on offer. (Employers, please give your conference attendees the opportunity to engage more by letting them know that those “urgent” emails can probably wait till the end of each talk before getting a response.)

Conferences are a great opportunity to learn, network and share experiences. Remember how fortunate you are to be able to attend them and engage deeply while you have the chance.

(And, yes, I will blog about my experiences of attending and presenting at STARWest separately.)

Testers and Twitter

I was lucky enough to attend and present at the massive STARWest conference, held at Disneyland in Anaheim, last week. I’ll blog separately about the experience but I wanted to answer a question I got after my presentation right here on my blog.

Part of my presentation was discussing my decision to join Twitter and how it has become my “go to” place for keeping up-to-date with the various goings on in the world of testing. (If you’re interested, I was persuaded to join Twitter when I attended the Kiwi Workshop on Software Testing in Wellington in 2013 – and very glad I made the leap!)

I think I made a good case for joining Twitter as a tester and hence the question after my talk, “Who should I follow then?” Looking through my list, I think the following relatively small set would give a Twitter newbie a good flavour of what’s going on in testing (feel free to comment with your ideas too).

Ilari Henrik Aegerter: @ilarihenrik

James Marcus Bach: @jamesmarcusbach

Jon Bach: @jbtestpilot

Michael Bolton: @michaelbolton

Richard Bradshaw: @FriendlyTester

Alexandra Casapu: @coveredincloth

Fiona Charles: @FionaCCharles

Anne-Marie Charrett: @charrett

James Christie: @james_christie

Katrina Clokie: @katrina_tester

David Greenlees: @DMGreenlees

Aaron Hodder: @AWGHodder

Martin Hynie: @vds4

Stephen Janaway: @stephenjanaway

Helena Jeret-Mäe: @HelenaJ_M

Keith Klain: @KeithKlain

Nick Pass: @SlatS

Erik Petersen: @erik_petersen

Richard Robinson: @richrichnz

Rich Rogers: @richrtesting

Robert Sabourin: @RobertASabourin

Paul Seaman: @beaglesays

Testing Trapeze: @TestingTrapeze

Santhosh Tuppad: @santhoshst & @TestInsane