It is surprising how many "test professionals" engage in software testing without a test strategy. We often come across QA professionals who promise to test everything and so ensure quality and bug-free systems. Dig into the idea of "testing everything" and you commonly find test efforts focused on working with business users to create detailed, business-oriented test scripts: happy-path testing with limited success. Hundreds of happy-path scripts are created, each valued equally by the business users, with no consideration of their relative importance, little or no risk analysis, and few traditional negative or exploratory tests. During test execution we discover that not all of the scripts are executed, and that the system defects that are found arrive late in the testing schedule, very often after delivery of the product. The net result is poor quality, or a team left wondering why its QA effort returned so little value. Defects reported late in the cycle catch the whole team unprepared: there are not enough resources to deal with them, and not enough time.

Does this story sound familiar? Are you encountering this pattern on a daily basis? Do you find yourself asking for more help, and if you are engaging vendors, aren't they always there to help you and take your money? Keep reading!

Our objective as test professionals is to discover system or application issues as quickly as possible, so that we allow sufficient time for resolution. We fail in that mission if we report the majority of system defects at the tail end of testing, repeating the pattern above. We can break the pattern by having a well-thought-out strategy in place, and for that we can turn to Sun Tzu and apply a lesson learned ages ago in his work "The Art of War": "To secure ourselves against defeat lies in our own hands." - Sun Tzu
We turn to Sun Tzu's work because application and system testing is like warfare: in both, we are required to coordinate resources to meet an objective.

1. Software testing, like warfare, depends on people and tools, and on the effective use of time and resources.
2. Software testing, like warfare, is based on assumptions.
3. In both, parallel activities take place. In war, a number of different battles occur concurrently; in testing, several types of testing and the execution of different test cases or scenarios occur simultaneously.
4. Leadership, the environment, communication skills, and the support of everyone involved, be they privates or testers, have a big impact on success.
5. Doctrines of careful preparation are fundamental to both military operations and software testing.

In this article, we will map a few key strategies from Sun Tzu's The Art of War onto software testing, and try to apply his ideas in a way that is readable, useful, and practical. Using Sun Tzu's principles, we can avoid failure and meet our testing objective: to discover system or application issues as quickly as possible, leaving sufficient time for resolution. This article is intended to help you, as a testing professional, open the door to sound test strategy and planning, which we believe are the keys to success.

According to Lionel Giles, whose translation of The Art of War is freely available on the web, Sun Tzu said:

1) "By method and discipline are to be understood the marshaling of the army in its proper subdivisions, the graduations of rank among the officers, the maintenance of roads by which supplies may reach the army, and the control of military expenditure."
· In software testing, good organization of the team, a clear understanding of tasks, well-maintained test cases and scripts (the roads in the analogy above), and control of testing expenditure are vital for successful project delivery.

2) "According as circumstances are favorable, one should modify one's plans."

· In software testing, think of the test plan as a plan of action. Planning without action is a waste of time; acting without planning can land you anywhere. While the test strategy remains constant during the project, tactics must be adapted to each new situation. Success in testing, as in war, requires simultaneous planning and action. The initial test-planning phase is important, but too much planning can also be disastrous (agile development draws on this idea, straight from Sun Tzu's book). Any test plan must be designed to allow changes to be implemented easily.

3) "All warfare is based on deception" - interesting and true.

· In software testing, be cautious with test automation; a common trap is automated test scripts that run for hours without doing anything. The key to success is to start with a limited number of automated scripts that are well designed, need to be run often, and are easy to maintain. Look at your ROI and it will guide you to the next step and to final victory.

4) "Attack when opposition is unprepared, appear where you are not expected."

· In software testing, start in the most vulnerable areas of the system or application in order to uncover showstoppers and defects that must be fixed first. Plan for exploratory testing.

5) "Military devices, leading to victory, must not be divulged beforehand."

· In software testing, you should have, and always be trained to use, alternative testing tools and methods that are prepared in advance as a contingency.

As the examples above show, war and testing have a lot in common when it comes to planning and strategy.
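The ROI advice above can be made concrete. A minimal sketch, using entirely hypothetical cost figures, of how a script that is cheap to maintain and run often pays back its build cost as the run count grows:

```python
# Rough automation ROI sketch (all numbers are hypothetical).
# Automation pays off once the manual effort it saves exceeds
# what it cost to build and keep running.

def automation_roi(build_cost, maintain_per_run, manual_per_run, auto_per_run, runs):
    """Return ROI as a ratio: (savings - investment) / investment."""
    investment = build_cost + maintain_per_run * runs
    savings = (manual_per_run - auto_per_run) * runs
    return (savings - investment) / investment

# A well-designed script run often breaks even quickly; the same
# script run only ten times barely covers its own cost.
roi_10 = automation_roi(build_cost=8, maintain_per_run=0.25,
                        manual_per_run=2, auto_per_run=0.1, runs=10)
roi_100 = automation_roi(build_cost=8, maintain_per_run=0.25,
                         manual_per_run=2, auto_per_run=0.1, runs=100)
print(round(roi_10, 2), round(roi_100, 2))  # prints: 0.81 4.76
```

The point of the sketch is only the shape of the curve: ROI climbs with run frequency, which is why the article's advice to automate the tests that "need to be run often" is the right starting filter.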
In war, let your objective be victory, not lengthy campaigns, and the same applies to software testing. Here are a few concepts to consider when test planning, in order to save time:

· Improve communication between developers and testers.
· Reduce the number of cycles needed to fix defects.
· Describe each defect thoroughly, with the steps necessary to reproduce it.
· Use automated tools where possible (this is usually highly effective).
· Have a contingency plan covering all available alternative testing methods.
· Improve time management and leadership practices.

Let's spend a few moments on time management and leadership. By time management we mean testing early: find critical issues quickly so your development teams have a chance to resolve them, hopefully with permanent solutions. Rapid decision-making produces rapid test execution, so review and streamline as many decision points in your project as possible. In general, software testing leaders must have profound knowledge of testing theory as well as hands-on testing experience. Testing tactics begin with execution and are modified as testing proceeds; methodology is only theory, and it is experience that will allow you to solve problems in difficult testing situations. We believe this is a key ingredient for success, nicely expressed by these three gentlemen:

· "When I give a minister an order, I leave it to him to find the means to carry it out." - Napoleon Bonaparte
· "What you cannot enforce, do not command." - Sophocles
· "Don't tell people how to do things, tell them what to do and let them surprise you with their results." - George S. Patton

We hope this article has provided you with some basic fundamentals of test planning. As a recap:

1) Discover the weak areas of your development environment, then begin by testing the most critical areas of the system and finding the key issues quickly.
2) Target vulnerable segments of the system or application and test (attack) there first.

3) Define test scenarios around key customer functionality, identify the system algorithms they exercise, and then test them.

4) Test at the boundaries between different systems or applications.

Once you have completed your first round of testing, you may apply the Pareto 80/20 principle to continue your testing mission. As a test leader, you may be required to change tactics if your first round produces few results or simply does not meet your expectations. Unfortunately, too often, test managers stick with the original plan and spend valuable testing time without finding system defects. Don't be one of them.
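One simple way to apply the 80/20 heuristic after a first round: tally reported defects by component and find the small set of components that holds most of them, then re-attack there. A minimal sketch with made-up component names and defect counts:

```python
from collections import Counter

# Hypothetical defect log: each entry names the component in which
# a defect was found during the first round of testing.
defects = (["checkout"] * 40 + ["search"] * 25 + ["login"] * 15 +
           ["profile"] * 10 + ["reports"] * 6 + ["help"] * 4)

counts = Counter(defects)
total = sum(counts.values())

# Walk components from buggiest to cleanest until the running tally
# covers 80% of all defects -- those components are the hotspots.
running, hotspots = 0, []
for component, n in counts.most_common():
    running += n
    hotspots.append(component)
    if running / total >= 0.8:
        break

print(hotspots)  # prints: ['checkout', 'search', 'login']
```

Here three of six components account for 80% of the defects, which is exactly the concentration the Pareto principle predicts and exactly where the next round of testing should go.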
Sources

Sun Tzu (c. 544 BC - c. 496 BC) was an ancient Chinese military strategist, philosopher, and author of The Art of War, an influential book on military strategy that is applicable to military thinking, business tactics, and software testing and development. It also works well as a general guide for solving a variety of problems. In 1998, Gerald Michaelson translated Sun Tzu's strategies for the business world in his book The Art of War for Managers, another great source of valuable strategies. Thanks to Project Gutenberg, you can download Sun Tzu's The Art of War for free from: http://www.gutenberg.org/files/132/132.txt

About the Authors

Joe Larizza, CSQA, is the QA Manager for CPP Investment Board, and previously held the positions of Director, QA for Loblaw Companies Inc., and Senior Manager, QA with RBC Dexia. He has also held support management roles with International Financial Data Services and its sister companies. During his career, he has undertaken a number of strategic initiatives, including the expansion of testing programs and establishment of testing standards and procedures, implementation of a Quality Metrics program, evaluation of an IT Division against the Capability Maturity Model, and implementation of automated testing using the Behaviour Model and data-driven scripts. Mr. Larizza has earned a reputation for competency and excellent leadership in the field of software quality assurance and testing. He is President of the Board of Directors for the Toronto Association of Software Quality and volunteers for the Quality Assurance Institute of Canada. He is a Certified Quality Analyst and holds a Bachelor of Arts degree in Economics, as well as the Canadian Securities Course.

Alex Samurin is a software tester and co-owner of Ersasoft (Toronto, Canada). He graduated from Leningrad (St. Petersburg) Polytechnic University and holds the equivalent of a Bachelor of Science diploma in Technical Science.
He has many years of international experience in system Research & Development, Information Technology, Quality Assurance and testing across various industries. For more information, please visit his website at www.extremesoftwaretesting.com
2010-2011 Toronto, Canada
I like the idea of applying ancient texts to modern situations. Personally, I prefer the lessons in the Book of Five Rings, since it's an ode to context-driven testing.
In warfare, two forces are in conflict. The goal for each is basically the destruction of the other. How exactly is that going on in testing? Certainly there are conflicts and tradeoffs, but there is no attempt to "destroy" or extinguish an opposing interest. So, talking about war seems a bit gratuitous and unnecessary. The parts that map to testing are pretty much the generic matters of solving complex problems with groups of people, not fundamentally anything about war.
|"All warfare is based on deception" - interesting and true.|
Okay, why is warfare based on deception? Well, because with deception you can inhibit your enemy's ability to oppose you. You can increase the probability of your attack succeeding, or discourage your enemy from attacking you.
There's no "enemy" in testing, that I can see. Are bugs the enemy? Not really. Bugs aren't agents that are trying to destroy us. And we certainly aren't trying to "deceive" the bugs, are we?
Perhaps we can apply it this way: we get fooled by bugs because we deceive ourselves. The purpose of the tester is partly to protect us from deception. Hence the testers serve the purpose of lone scouts or a cavalry brigade (General Lee sorely missed his wayward cavalry at Gettysburg). We need to get ground truth in order to plan our development.
|In software testing, be cautious with test automation, a common trap of automated test scripts that run for hours without doing anything. The key to success is to start with a limited number of automated test scripts that are well designed, need to be run often, and have easy maintenance. Look at your ROI and it will guide you to the next step and to the final victory.|
Automation is most often a sort of Maginot Line. The bugs just drive right around it, and you can't turn the guns around.
I think the value of automation is in the projection of power combined with mobility: think Agincourt and the role of the English archers. I design my tools to be nimble extensions of testers (portable missile launchers), rather than fixed gun emplacements.
I create cheap, throwaway, utilitarian tools. The Americans in WWII never tried to copy the German approach to making tanks (big, powerful, expensive); the Sherman tank was versatile, cheap, and available in large numbers, and they had other advantages, such as close air support, to draw upon.
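A "cheap, throwaway" tool in this spirit can be a few lines that throw random inputs at some code and check one invariant. A minimal sketch; `parse_quantity` is a hypothetical stand-in for whatever the tester is probing:

```python
import random

# Stand-in for the code under test: parse a string like "3 kg"
# into a (number, unit) pair. In real use this would be imported
# from the application, not defined here.
def parse_quantity(text):
    number, unit = text.split()
    return int(number), unit

# Throwaway prober: hammer the function with random inputs and
# check a single round-trip invariant. Cheap to write, cheap to
# discard when the hunt moves on.
random.seed(1)  # fixed seed so a failure is reproducible
for _ in range(1000):
    n = random.randint(-10**6, 10**6)
    unit = random.choice(["kg", "g", "lb"])
    value, parsed_unit = parse_quantity(f"{n} {unit}")
    assert value == n and parsed_unit == unit
print("1000 random probes passed")
```

The whole tool is the loop, not the function: point it at a different target tomorrow, change the invariant, or delete it. That is the mobility being argued for, as opposed to a fixed emplacement of scripted cases.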
|"Attack when opposition is unprepared, appear where you are not expected."
In software testing, you should start testing in the most vulnerable areas of the system or application in order to uncover showstoppers and defects that must be fixed first. Plan for exploratory testing.|
One of the charms of ET is you don't necessarily have to plan for it. You prepare for it, perhaps. You prepare for ET by training your troops to improvise. ET is like the Army Rangers or Seal Team 6. Train and train and train. Then when the president calls you can go anywhere and do anything.
As for striking when the opposition is unprepared, okay, I see your point. One thing I would add is that we use a lot of defocusing in testing: to find bugs we must keep our testing fresh. We must test in new, "unexpected" ways. We need to avoid getting into a rut. That's why submarine commanders do the "crazy Ivan" maneuver (see The Hunt for Red October).
|"Military devices, leading to victory, must not be divulged beforehand."
In software testing, as a tester, you should have, and always be trained to use, alternative testing tools and methods that are prepared in advance to use as contingency.|
Prepared in advance? I don't quite understand this advice. If you mean what I was talking about above, okay. But that's not clear. It sounds like you are suggesting that testers prepare additional, otherwise-unused test tactics that are only used... when? As a contingency? What sort of contingency?
What does any of that have to do with not divulging your military devices? Seems to me that this statement just goes back to deception. Anyway, I don't see a reason to keep our testing secret. The military metaphor doesn't track with testing very well in this case.