One of the more popular job listing sites in Australia is Seek. A search on this site for Melbourne-based jobs under the Information & Communication Technology -> Testing & Quality Assurance classification yielded 70 results as I began writing this post on 29th May 2020.
Looking at these jobs, I’ve broadly categorized them as follows:
- Automation (32)
- Test Automation Engineer (x2)
- Automation Test Consultant (x2)
- Senior QA Automation Engineer
- Test Automation Specialist (x2)
- Automation Test Analyst (x5)
- Automation Test Analyst – Tosca
- Senior Automation Test Analyst
- QA Engineer (Automation)
- Automation Tester (x4)
- Senior Test Automation Engineer/Lead (x2)
- Java Development Engineer in Test (DevOps)
- Lead/Sr. Automation Tester
- Senior Test Automation Specialist
- Automation Test Engineer
- Automation Test Engineer (Embedded Software)
- Applications Testing Automation Engineers (x4)
- Technical Test Lead – Automation Lead Tosca
- Management (7)
- Test Manager
- Performance Test Manager (x2)
- Defect Manager (x4)
- Specialists (14)
- Performance Tester (x4)
- Penetration Testing
- Performance Test Consultant
- Infrastructure and Security Test Analyst
- Senior Test Analyst – Infrastructure testing
- Performance Test Analyst (x2)
- Performance Engineer (x2)
- Jmeter Tester
- Network Test Engineer (hardware-related)
- “Manual” / Other (17)
- Test Analyst
- Test Analyst – Business Intelligence
- Senior Test Analyst (x2)
- UAT Tester
- Mobile QA Engineer
- Quality Assurance Engineer
- QA Engineer
- Graduate Test Analyst
- Automation Specialists and Defect Managers
- Junior Quality Assurance Tester Traineeship
- Senior Software Test / V&V Engineers (Defence)
- Validation & Verification Lead
- Integration Testers (x4)
These ads don’t all represent unique jobs: exactly the same ad is sometimes posted at different times, and the same job is often advertised by a number of different recruiters (especially for government roles).
The breakdown in terms of the number of opportunities didn’t surprise me. The focus at the moment seems to be on automation, driven by CI/CD investments, DevOps/agile transformations – and, of course, the general overestimation of what automation can and can’t do. Similarly, performance and security-related testing are topics du jour, as represented by a swathe of ads in these areas. Test management doesn’t seem like a good place to be, with very few roles being advertised in recent times – a change that, in my experience, has been heavily driven by agile adoption.
I generally take more interest in the ads focused on human testing to see what companies are looking for in this area. Most of the more “traditional” human testing roles (e.g. in government departments) also now mandate some degree of proficiency in tools commonly associated with “automated testing”. It’s pleasing to see requests for ISTQB certification becoming much less common and I’ll occasionally even spot a reference to “exploratory testing”.
But there are often “red flags” in these ads and I proffer a few examples from this most recent search of the “opportunities” on offer.
First up, a “Senior Test Analyst” role “to take accountability for the testing capabilities within the Microservices team.” The “Technical experience” requirements are listed first and include “Linux/Unix Shell Scripting, Java Programming, Clean Coding, Git, Jenkins2, Gradle, Docker, REST APIs, SQL, Unit Testing (Junit), Component Testing, BDD, Integration Testing” and then finally the “Testing Practices” requirements are presented, viz. “Exploratory Testing, Mind Mapping, Requirements Analysis, Peer Collaboration, Continuous Integration, Continuous Deployment, Stubbing and Mocking”.
There are a few red flags here for me. If this were a true microservice team, then it would have a clear sense of ownership of its microservice and a whole team approach to the testing and quality of that service. I’d be looking for clarification of what “accountability for the testing capabilities” really means in the context of this team. Another issue is the lack of clarity about whether this is a human testing role or a development (automation code) role, or a hybrid, or something else. The fact that the technical (mainly development) skills requirements are listed before the testing ones would immediately lead me to believe that more value is placed on automation than on deep human testing here. While it’s good to see exploratory testing explicitly listed (albeit as a “testing practice”), the other requirements listed around testing are much less convincing as to whether this organization would truly value the contribution of an excellent exploratory tester.
Next up, a “Quality Assurance Engineer” role where the successful applicant (who becomes just a “Quality Engineer” as the ad goes on) will “play a critical role in transforming quality practices… and work to develop a world-class test automation solution. [They will] work as part of a cross functional squad, acting as the QA expert, across new features, enhancements and bug fixes. [They’ll use their] testing experience to advise on automation opportunities and review requirements to ensure a fit for purpose solution in (sic) delivered.”
In describing what the day-to-day work entails for this role, there are some positive signs: “You’ll be passionate about testing and use your experience to identify critical defects. You’ll be naturally curious and explore the latest tools and techniques to continuously improve testing”. But there are also some less positive signs: “You’ll work as a fully engaged member of cross functional squad including Developers, UX/UI Designers and Product Managers to ensure the quality of products. You’ll create, maintain and execute tests (manual or automated) to verify adherence to requirements.” Again, it’s good to see exploratory testing getting a mention, albeit in what comes across as a confused way (especially in light of the “verify adherence to requirements” elsewhere): “You’ll perform risk based exploratory testing.”
In terms of skills, they’re expecting “expertise in developing functional tests that cover 100% of requirements and execute manually or automated (with focus on least maintenance and faster script development time)” and possession of “strong skills on efficient test design”.
There are obviously a few red flags in this ad. It sounds like this organization has latched onto the so-called “Spotify Model”, since it mentions “squads” a number of times – even though this model was not actually adopted successfully within Spotify itself. It talks about “ensuring” quality and verifying “adherence to requirements”, while at the same time asking for exploratory testing skills (of the risk-based variety, of course). Covering “100% of requirements” completes the picture of an organization where verification, preferably by machine, is valued much more highly than human testing.
My final example is a “QA Engineer” role which asks for a background in “both manual and automated testing”. The red flags come very early in this one: “Can you imagine how it would feel to be responsible for ensuring that our key products work well in all conditions? Are you interested in working in a Global Centre of Excellence…?” I’ll choose not to imagine how it would feel to be given an impossible mission.
This lucky candidate “will be responsible for both manual and automated testing as we move towards complete automation.” At least there is no doubt left here about the value of deep human testing as this organization seeks to “automate everything.”
I like the requirement to be “Excellent at finding the problems others miss”, but I’m much less keen to see this followed by “Able to document test cases based on solution requirements provided. Understanding of Developing in an Agile environment”. Advanced-level “ISTBQ” (sic) is then included in their list of “desirable” skills.
I don’t understand why this organization believes this ad would attract an excellent human tester who could leverage exploratory testing to genuinely find “the problems others miss”. They describe themselves as Agile but want the successful candidate to write test cases, all the while trying to get to a point where “manual” testing is no longer required. Based on the ad alone, either this is an organization with a confused outlook on testing, or they’re really looking for someone to take on an impossible mission while devaluing their actual testing skills – either way, this doesn’t sound like a great way for a decent tester to spend their working life (though I acknowledge the ad could just be very poorly written and misrepresent the organization’s core beliefs about testing).
While it’s interesting to occasionally review what organizations are looking for when they’re advertising testing-related roles, it’s also pretty depressing to see just how little value now seems to be placed on deep human testing skill. The “automate everything” bandwagon rolls on and is taking over this market, while opportunities for those genuinely skilled in testing as a craft seem to become fewer and fewer, at least in the Melbourne market as represented by published job ads.
As the economic impact of COVID-19 takes its toll, a large number of folks are hitting the IT market at the same time. In just the last couple of weeks, large IT staff reductions have been reported in Melbourne by big tech names such as MYOB and Culture Amp – and more seem likely in the coming months. If there’s a bright side to this situation, it’s that there are no doubt lots of good people coming back into the market so it’s probably a great chance to pick up testing talent for those organisations able to do so. If you’re representing such an organisation and really want to hire skilled human testers, please take the time to construct better tester job ads! Making it sound like you understand the value of testing is likely to attract those really good testers who might have come back into the market thanks to COVID-19.
Remember that a good tester should be a strong critical thinker and will be testing your ad, so catch your own “red flags” before they do!