An Ingredients List for Testing - Part One
Friday, August 20, 2010
By James Whittaker
Each year, about this time, we say goodbye to our summer interns and bid them success in the upcoming school year. Every year they arrive knowing very little about testing and leave, hopefully, knowing much more. This is not yet another plea to universities to teach more testing; instead, it is a reflection on how we teach ourselves.
I like to experiment with metaphors that help people "get it." From attacks to tools to tours to the apocalypse, I've seen my fair share. This summer, I got a lot of aha moments from various interns and new hires likening testing to cooking. We're chefs with no recipes, just a list of ingredients. We may all end up making a different version of Testing Cake, but we better at least be using the same set of ingredients.
What are the ingredients? I'll list them here over the next couple of weeks. Please feel free to add your own, and I hope you don't steal my thunder by getting them in faster than I do. Right now I have a list of seven.
Ingredient 1: Product expertise
Developers grow trees, testers manage forests. An individual developer's focus should be on the low-level concerns of building reliable and secure components. Developers must maintain intellectual mastery of the features they code, from the UI down to low-level APIs and memory usage. We don't need them distracted and overwhelmed with system-wide product expertise duties as well.
Testers manage system-wide issues and rarely have deep component knowledge. As managers of the forest, they can treat any individual tree abstractly. Testers should know the entire landscape, understanding the technologies and components involved without actually taking part in their construction. This breadth of knowledge and independence of insight is a crucial complement to the developer's low-level insights, because testers must work across components and tie together the work of many developers when they assess overall system quality.
Another way to think about this is that developers are the domain experts who understand the problem the software is solving and how it is being solved. Testers are the product experts who focus on the breadth of technologies used across the entire product.
Testers should develop this product expertise to the extent that they cannot be stumped when asked questions like "how would I do this?" with their product. If I asked one of my Chrome testers any question about how to do anything with Chrome concerning installation, configuration, extensions, performance, rendering ... anything at all ... I expect an answer right away. An immediate, authoritative, and correct answer. I would not expect the same of a developer. If I can stump a tester with such a question, then I have cause for concern. If there is a feature none of us knows about, or knows only incompletely, then we have a feature that might escape testing scrutiny. No, not on our watch!
Product expertise is one ingredient that must be liberally used when mixing Testing Cake.
" If I asked one of my Chrome testers any question..."
Does this mean that EVERY Chrome tester should be able to answer EVERY question, or that there should exist A Chrome tester for EVERY question?
I suppose the key idea is that WIDELY is more critical than DEEPLY to a tester. For example, a Chrome tester should know every question (or most questions) about how to use Chrome from the view of end users.
I agree with your post. However, this implies that the same person may not be able to play both roles. This goes against much of the current thinking in agile, where team members are supposed to contribute based on need: developers doing testing and testers doing development.
Jac, I think every Chrome tester should be a slam dunk product expert. Everyone will have features they understand better than others, but I want them all to be experts on the entire breadth of functionality the product offers.
The ideal situation would be that if any of my testers moves to another team, we wouldn't miss them (at least professionally).
Hi James, it's an amazing post. I have come to believe that a tester is more of a generalist. A tester should be aware of all aspects of the application under test and should be ready to talk about it even at 4:00 AM.
Thanks for sharing ingredient 1, i.e. product expertise.
"Chrome tester should be a slam dunk product expert" - I agree.
How about a Chrome testing manager's level of product expertise? Should I expect the same level of expertise from him?
Hi James
Thanks for the post.
As you said, testers should be product experts who are good in both technology and business, but the challenge is: how will you develop these test product experts? There is a huge crowd of so-called testers who don't even know a single programming language. Even if you find one, they don't want to stay in testing and always want to be programmers, as testing is not considered part of software development (hopefully not at Google, at least :) ) and is a thankless job.
"The ideal situation would be that if any of my testers moves to another team, we wouldn't miss them (at least professionally)."
That raises the question of how Google testers share their knowledge.
Do you encourage them to talk about the product during their pool games and PS3 sessions? Do they take lunch or coffee (smoke) breaks together?
How do testers share knowledge apart from written material (product wikis and other written documentation)?