The primary reason people typically shortchange their test environments, or do not perform adequate performance tests, is money. It is expensive to perform these tests, and often difficult to convince senior managers that they need to double their hardware budget. The project manager must fight for adequate resources for performance tests, and plan for this hardware need from the very beginning (rather than tack it on at the end). The truth is that if it is too expensive to set up an adequate test environment, then the organization simply cannot afford to produce this software and should seek another solution.
Smoke Tests
A smoke test is a subset of the test cases that is typically representative of the overall test plan. For example, if a product has a dozen test plans (each of which has hundreds of test cases), then a smoke test for that product might contain only a few dozen test cases (just one or two from each test plan). The goal of a smoke test is to verify the breadth of the software’s functionality without going into depth on any one feature or requirement. (The name “smoke test” originally came from the world of electrical engineering. The first time a new circuit under development is attached to a power source, an especially glaring error may cause certain parts to start to smoke; at that point, there is no reason to continue testing the circuit.)
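Many modern test frameworks make this kind of selection mechanical. As a minimal sketch (assuming Python’s pytest and a hypothetical “orders” feature, neither of which comes from the text), a marker can tag the one or two representative cases from each test plan so that the smoke subset can be run on its own:

    import pytest

    def create_order(item):                      # stand-in for production code
        return {"item": item, "status": "open"}

    def cancel_order(order):
        order["status"] = "cancelled"
        return order

    @pytest.mark.smoke
    def test_create_order_basic():
        # One representative case from the hypothetical "orders" test plan:
        # it verifies breadth (the feature works at all), not depth.
        assert create_order("widget")["status"] == "open"

    def test_cancel_order_detail():
        # A depth case that stays out of the smoke run.
        order = create_order("widget")
        assert cancel_order(order)["status"] == "cancelled"

Running pytest -m smoke then executes only the marked case, while a full run still covers everything; the smoke marker would be registered in pytest.ini (markers = smoke: breadth-only checks) to silence pytest’s unknown-marker warning.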
Smoke tests can be useful in many scenarios. For example, after a product has been tested,
released, and deployed into production, a configuration management team member may
manually run through a smoke test each time a new installation of the software is put in
place at a client, in order to ensure that it is properly deployed. Another good use is to
allow programmers to judge the health of a build before they give it to the software testers
for testing: once a product has passed all of its unit tests, it may make sense to install the
build and manually run the smoke tests, in order to ensure that it is ready for testing.
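That hand-off could even be automated as a gate. The following is an illustrative sketch (again assuming pytest and the hypothetical smoke marker above, not anything prescribed by the text) of a short script run after the unit tests pass and the build is installed, which fails the build if any smoke case breaks:

    import sys
    import pytest

    # pytest.main returns 0 only when every selected test passes, so the
    # script's exit code tells the build whether the smoke run succeeded.
    exit_code = pytest.main(["-m", "smoke", "tests/"])
    sys.exit(int(exit_code))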
Unfortunately, smoke tests are often abused by senior managers or stakeholders who are
impatient for the software to be complete. Typically, they will learn of a reduced battery of
tests that takes very little time to run, but will fail to understand how these tests differ
from the complete regression tests that are normally run. Suddenly, there is a new option
that doesn’t take very long. The project manager will start seeing requests to cut down the
testing tasks by substituting the smoke tests for actual tests.
What’s worse, the deployment scenario, in which a new deployment is verified with a smoke test, will be abused. The idea behind the deployment scenario is that no changes have been made—the smoke test is simply to help verify that the act of deployment has not accidentally broken the environment. (It’s not uncommon for a complex deployment environment to have slight configuration or network differences that can break the software—or for someone to leave a network cable hanging!) The smoke test is there for the people responsible for deploying the software to be sure that the installation was successful. If, however, changes are made to the environment or (even worse) the code in production, the smoke test will almost certainly fail to uncover the problems that have been introduced.