Retrospective Storytelling, a #RFG19 Session

It’s been a long while since I wrote… Well, that is not entirely true; I’ve been writing a considerable amount on Excella’s Insights. It’s about time I returned to posting on my own blog.

I had the honor of attending the Retrospective Facilitator’s Gathering this past week with 19 others. I was invited by my friend George Dinwiddie. He first invited me several years ago and checked in most years, but it just never worked out until this year. This is a week-long Open Space event that began under Norm Kerth. It’s a wonderful community.

I held a session on Storytelling in Retrospectives. I wasn’t very good at keeping notes, as the stories were too good 😉 This is my attempt to capture my memories of the session in the hope it may help others.

I kicked off the session by establishing its purpose: understanding storytelling techniques and their uses in Retrospectives. I started by sharing how I use Dixit cards. I utilize these beautiful cards in two ways. The first is for check-ins, and similarly for quick end-of-class/workshop/conference retros (for sessions that I run), where people select a card, share the card they selected, and explain why they selected it. It gives them a complex metaphor to work with. In the longer retros I have run, I anchor on this same selection and explanation, but ask people to go into the details of the card and how it represents what they have encountered over the period of time the retro is covering. I ask the others to listen to this story and capture the pertinent items (positive or negative events/issues) from the storyteller. This has the effect of getting people to truly focus and be present. The storyteller isn’t busy trying to do this capture, and there is an interesting side effect: because people have their own points of view, these get captured into their notes. The cards are beautiful, and most can be interpreted in both positive and negative ways.

Aino told us how she has used Rory’s Story Cubes to invite people to tell stories in a similar fashion. Kim mentioned that she has also done that, and that she has had people silently draw a story together. This digressed a bit into how improv can be used to tell the story of a particular event and outcome, with half the people being the audience and the other half being participants. For a little humor, while still keeping a storyline, this can be a silent improv. This does indeed lighten things up.

The drawing together reminds me of an Art Gallery retro, where different people that work together draw/diagram how they see the work processes from their points of view. This was a cool technique I learned at Problem Solving Leadership. You end up effectively telling stories as well.

Ainsley mentioned that storytelling has been a consistent topic at RFG since the early gatherings.

Somewhere along the line, I mentioned I like to close retros on occasion with the Hero’s Journey: “Once upon a time…”, “Until…”, “Then…”, “Happily ever after…”. The “Until” is the root causes that created problems for the team, while the “Then” is the retrospective’s actions. Only if they put these actions into play can the team see the “Happily ever after” part.

As Diana and George arrived in our session, they reminded us that, in many ways, any activity we do is a story being told. Sometimes with data, sometimes by people… And even the arc of the Retrospective itself creates a story.

Towards the end, I wanted to run through an idea I had for a Retrospective. I call this the Newspaper Retro. Newspaper articles tell stories based on facts. So my thought was to let people in a retro individually brainstorm news articles of 1-5 sentences, onto (probably larger) sticky notes, about various aspects of what went well or didn’t go well. Everything should be based on fact. These get categorized into the following news sections:

  • Politics is all articles about team collaboration
  • Technology is all about tool usage
  • Business is all about the process
  • Science is about new discoveries or learning the team has made
  • Foreign Affairs is all about things external to the team
  • The Nation is the section devoted to things about management

After the initial creation of stories, the team would come together and combine like stories. This is where multiple perspectives come together. The team would take these similar stories and provide analysis on them, collectively writing what they think is happening. Finally, they prioritize the 1-3 that will appear on the front page, analyze those as well (if not already analyzed because of similarity), and then create the planned elements or experiments that will be done to correct these (or amplify them if they are positive), adding these onto the story.

Final front-page stories then wind up being 1-3 factual statements of the problem, 1-3 sentences of analysis, and 1-3 actions that can be taken. A few other stories later in the newspaper wind up being factual statements and analysis, and most are just factual statements. These would be categorized into the appropriate sections, and the newspaper could be reviewed in future retros.

The group seemed to like this idea for a retro; just realize it is untested as of this writing. If you decide to try it, I’d LOVE to hear how it went! I will be looking for a time to use this, but as we know context is important, so I don’t want to use it with a team where it wouldn’t fit.

Shortly after this, we closed… I am certain I forgot some stuff, so I hope attendees will chime in and remind me of what was forgotten.

The Economics of Agile Communications for Requirements

This post originally appeared on my BoosianSpace blog on 28 October 2011. Some minor updates were made.

I’ve been reading Democratizing Innovation by Eric von Hippel.  One of the items he talks about is the cost of information transfer from innovation user to innovation creator.  In his context he’s demonstrating why uniqueness causes innovations to be grown internally by organizations as opposed to being bought off the shelf.

This got me thinking about a challenge we see in Agile adoption: explaining why we want lighter-weight documentation and more face-to-face collaboration. I had a small inspiration about the economics of the two opposite ends of the spectrum.

Let’s start with the sequential phased-gate (aka ‘Waterfall’, or as I prefer to call it, a ‘Canal’) approach.  Here’s what typically happens:

A set of business or systems analysts creates a document. This gets approved by the business user(s) and often management.  This then gets distributed to the development team.  They theoretically read the whole thing through once and understand it perfectly (a single set of communication paths: one document to N people reading it).  So here’s what the formula would look like for the communication of that information throughout the entire team:

Xfer$_W = Labor$_avg × [ [(N_creators × CreationHrs_avg) + (N_approvers × ApprovalReadingHrs_avg)] × Cycles_approval + ((N_team − N_creators) × ComprehensionReadingHrs_avg) ]

In words: the transfer cost is equal to the creation labor hours (number of creators × average creation time for the documents, as this is what communicates the information among the analysts creating it) plus the approval labor hours (number of approvers × average time to read the resulting documents, as this is the communication to the business representative(s)), multiplied by the number of approval cycles, plus the comprehension hours (number of remaining team members that need to read the approved document × average time to read), all finally multiplied by the average labor cost per hour.

Let’s see this in action with an example: a team of 6 and 1 business user that has to approve the requirements on a small application development effort:

Xfer$_W = $100 avg hourly rate × [(1 analyst × 120 hours creation time) + (1 approver × 4 hours reading to approve)] × 1 cycle + [5 remaining team members × 40 hours to read and fully understand the requirements] = $100 × [(120 + 4) × 1 + 200] = $100 × 324 = $32,400

Two primary assumptions here are that an approver won’t be as interested in reading it in detail, as they supposedly know the requirements and thus will not pay as much attention to the document he or she is signing off on… AND, more importantly, that the team can read the document ONCE and it contains EVERYTHING they need to know.  It is perfect; nothing is missing.  These numbers aren’t exactly realistic, of course; most projects would take longer and would involve more signatories and more cycles to get sign-off.  I’ll be discussing the costs of change using this model in a bit.
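For readers who prefer code, here is a minimal Python sketch of the canal formula; the function and parameter names are my own invention, and the numbers are just the example figures from above:

```python
# Transfer cost of requirements in the sequential ('canal') model.
def waterfall_transfer_cost(rate, creators, creation_hrs, approvers,
                            approval_hrs, cycles, team_size, reading_hrs):
    # Creation plus approval reading, repeated for each approval cycle
    creation_and_approval = (creators * creation_hrs
                             + approvers * approval_hrs) * cycles
    # Everyone else reads the approved document once to comprehend it
    comprehension = (team_size - creators) * reading_hrs
    return rate * (creation_and_approval + comprehension)

# The example: team of 6, 1 analyst, 1 approver, 1 approval cycle
print(waterfall_transfer_cost(rate=100, creators=1, creation_hrs=120,
                              approvers=1, approval_hrs=4, cycles=1,
                              team_size=6, reading_hrs=40))  # 32400
```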

Now let’s look at the same communications using an Agile approach…

In the Agile approach, the entire team is going to be involved with the creation, which will now include the business owner/manager.  There is no need for a sign-off, as he or she is directly involved.  There is also no need to have the development side of the team spend time reading the documentation, since they are also directly involved in creating it. To reflect the time, the cost for the people creating the knowledge (and artifacts) is equal to the number of paths of communication in the team, multiplied by the amount of effort (the average creation time) each person has to put in, divided by the number of people assisting in the communications (i.e. the number on the team).  Also, since the business owner is involved throughout the development process, there is only one cycle (for the life of this project).   Thus, our equation becomes the following:

Xfer$_A = Labor$_avg × [(CommPaths / N_team) × CreationHrs_avg]

Where CommPaths = (N_team − 1) + (N_team − 2) + … + 1 = N_team × (N_team − 1) / 2

The assumption here is that the average creation time per person is the same as the creation time in a canal environment; i.e. the scope is the same.  Since this is done throughout development by all members of the team, we know this will not be one solid time block and will involve more people.  The effort to distribute the information, however, is represented by the number of paths involved divided by the people trying to move the information along those paths. This is why the communication paths variable is the numerator and the team size the denominator.

For our example of a team of 7 (since the business owner is now a part of the team):

CommPaths = (7 − 1) + (7 − 2) + … + 1 = 21

Xfer$_A = $100 × [(21/7) × 120] = $100 × [3 × 120] = $100 × 360 = $36,000
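A similar Python sketch for the Agile side (again, the names are mine, and the figures are the example’s):

```python
def comm_paths(n):
    # (n-1) + (n-2) + ... + 1 pairwise communication paths = n*(n-1)/2
    return n * (n - 1) // 2

def agile_transfer_cost(rate, team_size, creation_hrs):
    # Effort spread over the communication paths, shared by the whole team
    return rate * (comm_paths(team_size) / team_size) * creation_hrs

print(comm_paths(7))                     # 21
print(agile_transfer_cost(100, 7, 120))  # 36000.0
```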

You’re probably wondering where the savings are…  This looks like a wash. It isn’t.  What comes into play is the cost of change as it occurs over the project.  To truly understand the costs, though, we need to discuss what happens over the life of the project.  In the ‘canal’ project, if we make a change, we have to go through the same expensive communication path as the initial development.

Xfer$_W = Labor$_avg × [ [(N_creators × CorrectionHrs_avg) + (N_approvers × ApprovalReadingHrs_avg)] × Cycles_approval + ((N_team − N_creators) × ComprehensionReadingHrs_avg) ]

Let’s use our example and say we had a change that requires roughly a quarter of people’s time to produce version 1.1 of the requirements specification:

Xfer$_W = $100 × [[(1 × 30 hours) + (1 × 1 hour sign-off)] × 1 + (5 × 10 hours comprehension)] = $100 × [(30 + 1) × 1 + 50] = $100 × 81 = $8,100

So now the total cost is $32,400 + $8,100, or $40,500; each time I go through a change, the cost will go up by some amount.
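The change cost is the same canal formula with correction hours swapped in; a sketch with the example’s numbers (function and parameter names are mine):

```python
# One change in the 'canal' model: rewrite, re-approve, re-read.
def waterfall_change_cost(rate, creators, correction_hrs, approvers,
                          approval_hrs, cycles, team_size, reading_hrs):
    return rate * ((creators * correction_hrs
                    + approvers * approval_hrs) * cycles
                   + (team_size - creators) * reading_hrs)

change = waterfall_change_cost(100, 1, 30, 1, 1, 1, 6, 10)
print(change)          # 8100
print(32400 + change)  # 40500 running total after one change
```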

Going back to the Agile side: because we are performing the requirements communication throughout the development, and we defer to discussing only the requirements needed for the next piece of work, changes, and more importantly the associated communications, are already baked in.  We haven’t defined it all upfront and then distributed it for use once.  Thus, the additional costs for the next distribution are near zero.

We expect requirements to change.  We defer unknown things to as late as we responsibly can (the last iteration, possibly, if the work can be done in one Sprint) so that the risk of needing to change them is minimized. Thus our costs are not going up with changes; they are remaining basically flat. In the sequential phased-gate scenario, one significant change could ‘wipe out’ the supposed savings you saw in the simple calculation, which optimistically presumed that everything worked perfectly the first time.


Note: I am not an accounting type by nature; this just seemed like a logical fit, and I am trying to find empirical evidence that supports or contradicts it. If you know of some, it would be appreciated!  Just post below with the sources you are using.

BTW, I have also toyed with the idea that requirements (stories) that need to change along the development cycle have the cost of the original one, multiplied by the probability that they are still in the backlog and not done yet.  If you added up the percentages as buckets of 10% along the project and divided by 10, you’d get the likelihood that this occurs as 50% (on average), and the cost would be akin to the following example:

CommPaths = (7 − 1) + (7 − 2) + … + 1 = 21

Xfer$_A = $100 × 50% of [(21/7) × 40] = $100 × 50% of [3 × 40] = $100 × 50% of 120 = $100 × 60 = $6,000, so the total cost of changes accumulates at a slower rate.
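A sketch of that probability-weighted change cost (names are mine; the 50% is the averaged likelihood above that a story is still open when the change arrives):

```python
def comm_paths(n):
    return n * (n - 1) // 2  # pairwise communication paths

def weighted_change_cost(rate, team_size, change_hrs, p_still_open):
    # Discount the Agile transfer cost by the chance the story
    # hasn't been built yet when the change arrives
    return (rate * p_still_open
            * (comm_paths(team_size) / team_size) * change_hrs)

print(weighted_change_cost(100, 7, 40, 0.5))  # 6000.0
```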

During project execution, you could actually use a real rolling percentage of stories closed over total stories.