Why converting test teams to automation is a challenge

Originally published on TechBeacon, 1/24/2018

Paul Merrill

CEO, Test Automation Consultant, Beaufort Fairmont

 

In 10 years of test automation consulting, I have never seen the successful wholesale transition of a team of testers into automation engineers. Some approaches for moving toward "test automation first" do work, but converting entire teams of hands-on testers to test automation engineers is a fool's errand. Yes, you may have a few people who can make the cut, but you'll probably need to bring in experienced automation engineers to build up the new team.

Here's why such attempts usually fail, and a few strategies you can use to transition the right individuals from hands-on testers to test automation engineers.


The automated-test vision

You might be considering an initiative to transform your team. Or perhaps you have tried before, unsuccessfully, and are ready to give it another go.

After all, you're constantly hearing about how Microsoft and Google have only software development engineers in test (SDETs) as testers. There's even a book about it: How Google Tests Software. In it, co-author James Whittaker lays out his vision:

Google tests and releases hundreds of millions of lines of code distributed across millions of source files daily. Billions of build actions prompt millions of automated tests to run across hundreds of thousands of browser instances daily. Operating systems are built, tested, and released within the bounds of a single calendar year. Browsers are built daily. Web applications are released at near continuous pace. In 2011, 100 features of Google+ were released over a 100-day period.

Executives love the time and labor savings that test automation promises. So why not convert all testing to test automation? Some companies have proved that a test-automation-first strategy can work, but the transition to this strategy fails in most organizations.

The leaders' approach

Google and Microsoft have a secret: They don't convert testers into automation engineers. Instead, they hire the very best test automation engineers from the get-go.

I'm familiar with their SDET hiring processes, and the interviews are as difficult as those for any software engineering position. They want to gauge a candidate's ability to write algorithms, and applicants must demonstrate a clear understanding of computing concepts and algorithmic performance.

Genesys is another company that has successfully transitioned to continuous testing through test automation. "We have been careful to hire people who are great to work with and brilliant," said TR Buskirk, senior director of QA at the customer experience software provider. A learning culture is vital, he added. "The rate of change in technology is only increasing, so we need to be constantly learning and improving."

The difference between us and them

These companies filter talent at the door. They make sure candidates possess raw aptitude, base-level skills, and experience prior to hiring.

Contrast that with your team. Be honest: How many of your testers know how to code? And how many have had an opportunity to learn coding already but didn't pursue it? How many people on your team understand algorithmic performance? How many can solve difficult logic problems like those asked in interviews at Microsoft and Google? Not too many, I'd wager.

Bad conversions yield bad code

Attempt to convert testers with no coding experience into test automation engineers, and you'll wind up with entry-level software engineers with no experience. And software developers will tell you that you don't want too many entry-level folks on your team. You need varied levels of experience and leadership.

What's more, some of your converted testers will probably be bad at coding. Remember how many people dropped out of MIS and computer science programs in school after finding out they were just not good at writing code? Don't expect a better result with your team.

Unfortunately, bad coders do worse than contribute nothing to a team; they contribute nothing and create negative productivity, because the best programmers on the team have to stop their own work to fix the mistakes. And while this is happening, the normal testing still needs to get done.

Testers who try to convert too quickly tend to produce convoluted, copy/pasted code that's buggy, inefficient, and creates maintenance nightmares.

To make matters worse, a period of producing code like this is unavoidable. Novice programmers don't recognize duplicate code, so they don't understand its negative impact, the importance of avoiding it, or the maintenance nightmare they're creating; they simply don't have the experience yet.
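To make the duplication problem concrete, here is a minimal sketch, with invented names and a fake application standing in for the real system under test, of the copy/pasted style a novice tends to produce, next to a refactored version that keeps the shared steps in one place:

```python
# Minimal sketch (invented names). FakeApp stands in for a real
# application under test.

class FakeApp:
    USERS = {"admin": ("admin-pass", "dashboard"),
             "viewer": ("viewer-pass", "reports")}

    def __init__(self):
        self.username = None
        self.password = None

    def enter_username(self, name):
        self.username = name

    def enter_password(self, pw):
        self.password = pw

    def landing_page(self):
        pw, page = self.USERS.get(self.username, (None, None))
        return page if pw == self.password else "login-error"

# Copy/pasted style: every test repeats the same login steps, so any
# change to the login flow must be fixed in every copy.
def test_login_admin_copy_pasted():
    app = FakeApp()
    app.enter_username("admin")
    app.enter_password("admin-pass")
    assert app.landing_page() == "dashboard"

def test_login_viewer_copy_pasted():
    app = FakeApp()
    app.enter_username("viewer")
    app.enter_password("viewer-pass")
    assert app.landing_page() == "reports"

# Refactored style: the shared steps live in one helper, so a change to
# the login flow is fixed in one place.
def log_in(username, password):
    app = FakeApp()
    app.enter_username(username)
    app.enter_password(password)
    return app

def test_login_admin_refactored():
    assert log_in("admin", "admin-pass").landing_page() == "dashboard"
```

With two tests the duplication looks harmless; with two hundred, every change to the login flow becomes two hundred edits, which is exactly the maintenance nightmare described above.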

New test engineers don't test their own code

Junior developers and test automation engineers usually don't test their own code, for several reasons. 

  • They want to be done and move on to coding the next cool thing.

  • They may not understand the importance of high-quality products.

  • They don’t know how to write code that works or how to test it.

It is easy to assume that testers would be different. They aren't. Given the choice between production code with bugs in it and test code with bugs in it, which would you choose?

If the test code is buggy, you don't know that you have bugs in the production code. So you want your best code helping you understand the state of the product. With bad code, you introduce more risk and time into the development process.

The wrong motivation

Do you know which programmers end up being the most productive? The ones who would program even if it weren't their job. They have an intrinsic motivation, and it is powerful.

You may have a mutual understanding with your testers that the job has changed and that they must learn to code "or else." That threat works for a while, but it won't last as long, or be as powerful, as intrinsic motivation.

The testers I've helped convert into test automation engineers were all motivated to learn test automation on their own. 

Costs can run high

If you decide to train an entire testing department to code, know that it's going to take a while.

I earned a computer science degree and then worked for two years as a programmer before I felt truly proficient. And it was another eight years before I could knock out a task without much trouble.

Think about that. A degree plus about 4,000 hours of coding to feel proficient. Are you going to send every tester to college for a computer science degree? Are you going to have testers program for two years to gain a professional level of proficiency? And how are they going to continue learning—and from whom, if all their peers are novices too?

Another strategy is to have your developers train the testers, but this presents opportunity costs. If your developers are training testers to be coders for test automation, what are they not doing? Furthermore, if testers are busy learning to program, who is doing the testing?

For each of these strategies, creating blue-sky projections is easy. You just assume that new test automation efforts will bring a return on the time invested in training. But that's not going to happen. It is difficult enough for most groups to create test automation for new features, much less for existing regression test suites.

Either you have to hire more testers to do your existing testing, or you choose not to do some of the testing. But if the problem was that you weren't getting enough testing out of your testers to begin with, you'll get even less testing out of them as they pursue this path. The effort will increase your payroll and training expenses while slowing down product delivery.

But all is not lost

There are transition strategies that work, and you can plan them. You need clearly defined objectives and realistic expectations. You need realistic cost, opportunity cost, budget, and timeline estimates. And you need to do a realistic assessment of the capabilities, interests, and motivations of each person on your team.

You need safe transition plans for testers who want to continue to test without learning to code, with buy-in from testers and executives alike.

If you want to move into test automation after reading this, I'm here to help. I'm not saying individual testers can't do it. But the blanket transition of entire teams rarely works. 

 
