The Rational Tester represents IBM Rational at many trade shows and conferences. At these events, one of the most common questions I get asked is “How do I start a test automation project?” or “What’s your recommended approach for implementing test automation?”
In one sense, it’s a very difficult question. To answer it properly, we first need to gather a lot of context: the testers involved; the size, scope, and complexity of the project; office dynamics; the relationship with engineering; process orientation; and so on. It would take a 30-to-60-minute interview to gather enough information to formulate a properly tailored answer for each individual.
Alas, rarely is there time at such events to dive that deep. However, there is a general path to test automation success that can serve as a blueprint for most teams.
Start by automating the smoke test
The smoke test is the set of tests that are run immediately upon receipt of each and every build of the application. They are the core set of tests that validate the general functionality of the build. As an example, a smoke test for Microsoft Word might involve: launching the product, creating a basic document, saving, and printing. Maybe that’s too simplistic, but you get the concept.
The basic idea is to validate that you have a good build, more than to assess the quality of the application. Because so much time can be wasted installing and uninstalling a bad build, a smoke test can return a quick ROI (return on investment). Smoke tests are also generally simple, independent tests that can be run in any order. Most importantly, they are run frequently, which makes them perfect candidates for automation.
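To make the idea concrete, here’s a minimal sketch of a smoke test in Python. The `Document` class is a hypothetical stand-in for whatever object your application under test exposes; the three checks mirror the Word example above (create, edit, save), and each one is independent of the others.

```python
class Document:
    """Hypothetical stand-in for the application under test."""
    def __init__(self):
        self.text = ""
        self.saved = False

    def type_text(self, text):
        self.text += text

    def save(self):
        self.saved = True
        return True


def smoke_test():
    """Run each independent check; return a list of (name, passed) pairs."""
    results = []

    doc = Document()                       # "launch": can we create a document?
    results.append(("create", doc is not None))

    doc.type_text("hello")                 # basic editing works
    results.append(("edit", doc.text == "hello"))

    results.append(("save", doc.save()))   # save succeeds

    return results


if all(passed for _, passed in smoke_test()):
    print("good build")
else:
    print("bad build -- stop here; don't run the full suite")
```

The point of returning a verdict rather than asserting mid-run is that a smoke test’s job is a go/no-go decision on the build: if it fails, you reject the build rather than continue testing.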
Build a basic regression suite
A regression suite is a set of tests which validate that, after a given application update, “what used to work, still works”.
If you re-read the section on smoke testing above, you’ll discover the foundation for building a good regression suite. Good candidates to include in this suite, which in one way of thinking is just a broader, deeper smoke test, are:
- Tests that validate core application functionality
- Tests that are independent
- Tests that validate stable areas of the application
Focus first on tests that validate core application functionality. Tests that validate obscure areas of the application are not your target; do the important stuff first. Remember, you’re trying to validate that what used to work still works, so focus on the things your users will do most.
Making tests independent is a good idea. By independent, I mean tests that require no tests, or at least few tests, to be run before them. If Test C requires that Test A and Test B be run before it, it’s not independent. That’s not a horrible thing, but if you’re new to automation, keep it simple by making independent tests — that way they can be run in any order. You’ll want that later, as it will give you flexibility. If you really want to think ahead, you can make Test C check its state at the beginning of the test, and if it’s not in the right place, run Test A and Test B to get it there. That’s a little trickier, but an even better strategy. It’s a whole article on its own, so I’ll save that for another day.
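Here’s a rough Python sketch of that “self-sufficient Test C” idea. All names and the shared `state` dictionary are illustrative; in practice the state would be your application (a logged-in session, an open project, etc.):

```python
# Illustrative shared application state; in real life this is the
# application under test, not a dictionary.
state = {"logged_in": False, "project_open": False}

def test_a_login():
    state["logged_in"] = True

def test_b_open_project():
    state["project_open"] = True

def test_c_edit_project():
    # Check preconditions first; establish any that are missing by
    # running the prerequisite tests ourselves.
    if not state["logged_in"]:
        test_a_login()
    if not state["project_open"]:
        test_b_open_project()

    # Test C can now run in any order -- even first.
    assert state["logged_in"] and state["project_open"]
    return "edited"
```

Because Test C establishes its own preconditions, the suite no longer depends on execution order, which is exactly the flexibility you’ll want as the suite grows.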
Finally, automate stable areas of the application. If the application interface and logic are still undergoing frequent changes, it’s not a good candidate for automation: you’ll spend too much time doing test maintenance and not enough time running tests. As a basic starting recommendation, don’t automate any functionality for the “new” release. Automate only the functionality that existed in your previous release, and do manual testing for new functionality. Once that release gets out the door, or the functionality becomes stable, then automate — but not a second before.
Each of the above points is probably worth a deeper dive. I’ll see if I can attempt that in future postings. For now, hopefully that’s enough to get you started.