User Stories and 5 Whys Analysis


Today I stumbled upon a very useful technique I felt warranted sharing. I am in the process of drafting a blog post about the frailty of introducing Agile techniques to a waterfall team; I'm confident I'll post that either this weekend or next. In the meantime, one thing I have learned from you, the blog readers, is that you appreciate practical solutions to present-day problems and forward useful techniques on to others. May this help you.


My love/hate relationship with User Stories


There are a lot of great User Story resources. My absolute favorite is Mike Cohn’s site, here. The term “User Stories” is often overloaded, so I recommend visiting Mike’s site to make sense of the rest of the post. My view is compatible with his (if not a blatant overlap).

One of the first things I try to do when I am helping a team shift to Agile is to teach them to stop scheduling work items and, instead, start delivering outcomes. This approach lets the team decouple the problem they are trying to solve from the solution they currently favor. There are a lot of benefits to doing this, not least the ability to define Done in terms of value added to the customer. There are other techniques for defining Done up front (such as ATDD), but I have found that User Stories are really palatable to newbies trying out Agile.

But even User Stories have problems. They can be hard to construct: when I know I need, say, a new report created, it is just easier to write "Create new report" on the ticket and place it on the task board in the backlog. They can also be rather verbose and hard to communicate succinctly: "As a decision maker for the release, I want to see fresh execution reports online, so that I can weigh in on readiness armed with the right data to make an informed choice." The friction created by the longer format makes the shorter format very alluring, especially for those who haven't yet experienced the value of the longer version.


The problem

Folks from the waterfall world get the work-item approach. "Create new report" seems easy to understand and execute on. Today's problem came from one of the testers on the team who, quite honestly, had gotten tired of me complaining about scope creep in their "stories". As the Agile Master for the team, I push hard on maintaining a high, consistent, and predictable velocity. Scope creep makes this very difficult, causes delays, and creates the potential for significant wasted effort within the system. My team is using Lean and Kanban; we do not timebox our iterations, but each story has a 2-week SLA. The story this tester was working on was about some tooling we are creating. They came to me to let me know that "the design has changed again" and wanted to know what to do about it. The ticket was already past its 2-week SLA.
In addition, the ticket was something like "enable performance thresholds". That is, it was ambiguously worded, and it was entirely unclear when we would be done. I had warned of this before, but my style of Agile Mastering is to let teams make the decision, and the mistake, in order to enable learning, so I let it stick.

The solution

It is insufficient to point out what not to do; if you want folks to learn, tell them what to do instead. Here I suggested that the problem with the ticket was that Done was not clear, and said to use a User Story instead, now as well as in the future. This particular tester had a hard time with that. Even after I explained Stories, they could not pivot the work item into a story; it was just unnatural for them to think of the outcome they needed. Unfortunately, this is a far too common experience for me.

However, during a rare flash of total insight, I fixed this for them by throwing in Six Sigma's 5 Whys technique. Primarily used to determine the root causes of things, it works simply: you ask 'why' five times.

The dialogue

I started off: "Okay, let's try a different thought process. What if I were to tell you I see no value in this 'enable performance thresholds' task, so I am going to cut it. How do you feel about that?"

Tester: "I hate that."

Me: “Why?”

Tester: “Because we need it”

Me: “Why?”

Tester: “Because dev needs it”

Me: “Why?”

Tester: “So they can decide if the product is good or not”

Me: “so what you are saying is that ‘as a dev on this team, you want performance thresholds enabled, so that you can decide if the product is good or not’?”

Tester: “yes”

Me: <blank stare>

Tester: “ooooohhhhh!!!!!”


We then talked about how adding additional whys adds precision to the desired outcome and clarity around Done while, in most cases, keeping the implementation decoupled from the outcome. We also talked about how to determine when a story is too vague (likely an Epic) and needs to be broken down into smaller stories.
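The transformation in the dialogue above can be sketched as a tiny template exercise. This is purely illustrative, not a tool we actually use; the function and variable names are my own invention, and the ticket text mirrors the conversation:

```python
# Illustrative sketch: turning a work item plus a 5 Whys chain into a
# User Story. All names here are mine; only the template is standard.

def as_user_story(role: str, need: str, benefit: str) -> str:
    """Render the classic User Story template."""
    return f"As a {role}, I want {need}, so that {benefit}."

# The work item we started from:
work_item = "enable performance thresholds"

# Each 'why' answer peels back one layer toward the real outcome:
whys = [
    "we need it",
    "dev needs it",
    "so they can decide if the product is good or not",
]

# The later answers supply the role and benefit the work item lacked:
story = as_user_story(
    role="dev on this team",
    need="performance thresholds enabled",
    benefit="they can decide if the product is good or not",
)
print(story)
# -> As a dev on this team, I want performance thresholds enabled,
#    so that they can decide if the product is good or not.
```

The point of the sketch is that the raw work item never contains the role or the benefit; the 5 Whys chain is what surfaces them, and Done becomes testable against the benefit rather than the implementation.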

I will see how well it plays out over the coming weeks, but the tester, at least, believed they would be able to confidently break work into smaller outcome-based stories and, with that, defend against scope creep while still handling undiscovered stories in an Agile fashion.