Blog Viewer

A follow-up to "Testing, Testing, 1, 2, 3"

By Mike Chambers posted 16 days ago


Mike Chambers: Hey, Brooke! We get to write a blog post together!
Brooke Barton: Awesome! What are we going to blog about?
MC: Testing, of course!
BB: Ha, ha! Of course we’re gonna blog about testing! I had so much fun presenting "Testing, Testing, 1, 2, 3" at Virtual Alliance last month. I think the topic really resonated with a lot of folks. Testing comes with a unique set of challenges that can seem a bit daunting. Our presentation focused on collaboration as a way to share knowledge and to promote adoption of the changes to the system.
MC: And for all the project managers out there, we had practical advice on streamlining the test plan and maximizing efficient use of resources.
BB: Hey, Mike, I’m thinking that not everyone was able to participate in Alliance this year, so maybe it would be a good idea to cover some of the high points of our presentation.
MC: We are on EXACTLY the same page, Brooke!
BB: Let’s talk about the core message in our presentation: how collaboration among team members can be useful for testing systems.
MC: I’m in! How do you get people to collaborate on testing?
BB: The most important step to make collaboration possible is to get buy-in from directors and supervisors.
MC: Laying the groundwork is so important. One way to do this is to approach leadership in your organization and get their acknowledgment of the problem. An easy way to do that is to use fact-based arguments that are so apparent they don’t need a lot of proof. Like “the testing doesn’t get finished in the 2 weeks it’s scheduled for.” Or “our team is demoralized and exhausted from the constant cycle of testing every 3 months.” Or “testing takes so long, people don’t have time to do it AND their day jobs.” Once leadership acknowledges these things, the next step is simple. You just ask “Will you support our efforts with your team members to reduce their testing time, increase their satisfaction with testing, and introduce benefits like cross-training and onboarding?”
BB: So, let’s say leadership agrees, but it’s a different ball game for the folks who actually do the testing. You may start by asking them to acknowledge that testing creates more work for them and that some testers might end up struggling to balance day-to-day responsibilities with their testing assignments.
MC: Yes! When it comes to gaining buy-in, acknowledgment is a key ingredient. Another key ingredient is reassurance or “approved flexibility.” It’s important for leadership to understand -- and be okay with -- the fact that, while testing is taking place, priorities shift and the normal work items may need to be postponed. Doing collaborative testing should not mean that a 40-hour work week becomes a 50-hour work week.
BB: Acknowledgment, understanding, and flexibility are all important. I absolutely agree. I’m curious, what do you recommend for situations when timelines are tight and the pool of testers is limited?
MC: Each situation is different, so there’s no “one size fits all” answer. This can be a good time to brainstorm with your team and get creative. To get you all thinking of possible solutions, consider questions like:

  • What opportunities does this present? Could we test AND cross-train employees, for instance?
  • Who might we ask to cover the day-to-day tasks normally assigned to someone who is essential for testing?
  • Which work tasks could be paused while testing is taking place? What’s not essential that could be eliminated, even if it might mean doing the work later?
  • What adjustments can happen internally so testers don’t feel burdened by their normal work tasks while testing is taking place? For instance, is there a weekly report that can be made monthly? Is there a team meeting that can be rescheduled?
  • What does the team need in order to succeed at this?
  • What can be done to make the testing more efficient?

In the spirit of acknowledgment, I want to acknowledge that evaluating internal processes and devising ways to incorporate testing without adding stress to the team can mean a lot of work up front. But if the goal is to create collaborative testing while reducing testing fatigue, the efforts are well worth it. They’re worth it because they pay off now AND with testing cycles in the future.
BB: Mike, during Alliance we also talked a bit about prioritizing the “nice to haves” vs. “must haves.” What suggestions do you have for teams that currently test “everything under the sun” and thus have a longer testing cycle?
MC: This is where the magic can really happen! I suggest that the team consider the following question:

  • What’s the worst that could happen if some items currently being tested are not tested in future cycles?

If the answer is “End users will have to use a workaround process if X, Y or Z breaks,” the team then gets to consider which is the lesser of two evils -- meaning, does it make more sense to spend months testing ad nauseam OR does it make more sense to prioritize testing so only the items deemed critical are tested? Each organization needs to weigh the risks and make the best decision for its situation.
BB: It sounds like this might be a good opportunity for a facilitated conversation and maybe even a pros and cons list.
MC: Yep! That approach worked for us, so I’m thinking it might be a good fit for others. Get the team together, get everything out on the table (or whiteboard) and talk through it. At the end of the day, the team might agree there is value in testing everything during each cycle or they might reach consensus about where to cut some tests out of the plan.
BB: What happens if the group comes up with a lot of great ideas that just aren’t practical?
MC: So glad you asked that, Brooke! When you have the facilitated discussion, just be sure to clearly state the decisions that your leadership has already made and the goals they want to achieve. This not only frames the discussion, it keeps things real. The job of the facilitator is to keep the focus on what the team can do to make these decisions and goals actually happen.
BB: I get it. If the conversation is about “How do we complete all testing in one day?” you’ll get more relevant ideas than if the conversation is about “What ideas do you have to make testing easier?”
MC: So, I’m looking at the word count on this blog post -- we just passed the 1000 word milestone. I’m a project manager, so I love hitting milestones.
BB: Well, I’m a process improvement analyst, so I’m all about making sure we push ourselves just a little bit more to make things better, and I think our conversation will do that for people.
MC: One way to know for sure is by getting a bunch of comments and questions from readers. So, Brooke, what number of comments would indicate to you that people got something out of this?
BB: Let’s shoot for the moon... a thousand comments. Or maybe a million! :)
MC: I hear ya. Let’s start with 10, okay?
BB: That works. Let’s say that if we get at least 10 comments, we’ll do another post. Deal?
MC: Deal!




13 days ago

This is an awesome post and summarizes the great content of the Alliance session so well! 

I love Nicole's additional topics. Perhaps they might indulge us with a webinar or community event in the coming months.

Testing is such a rich topic. I worked at a major corporation that had a dedicated Quality Assurance Team in IT, and it was awesome. Of course, in Higher Education, we don't have the luxury of tons of resources, so anything we can do to streamline while still ensuring full coverage of testing is always a welcome opportunity.

15 days ago

OK - other people had better start commenting!!! :-)

I'd really like to hear more from you two about:
1 - reviewing, testing, and rolling out new features. It's one thing to test pages/processes that many are familiar with - but when you've got a new page/process, how do you succeed in planning for that?
2 - can you talk about the challenges in coordination when everyone is remote? How well does the Zoom room work?
3 - I'd also like to hear more about what post-test review looks like. How do you improve your plan? Who do you review with? Do you have a business process list that gets added to (and then test scripts are created from it) for new processes?