No more Single Wringable Neck

As noted earlier, in ‘Why you (might) need a good Product Owner’, I expressed reservations about the Product Owner role.

It’s risky to have it all rest on one person. It could also be unfair. It could halt the delivery of value and of learning. Instead of ironing out behavioural issues that may arise from such a person-focused role, could we consider another view?

Here’s the view from Kanban Training that I got from David Anderson.

[Figure: Elevated role of the Product Owner]

Here the Product Owner has helped create a set of policies for the selection of work, noted in the diagram as risk management policies. These would include items like cost of delay, balanced against dependencies outside the team. They could include risks within the team as well. For example, should we schedule work when someone is out on holiday? Maybe you’d bring that work forward, or find ways to increase staff liquidity.
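
As a sketch only, here is what such a selection policy might look like in code. The CD3 (Cost of Delay Divided by Duration) scoring, the dependency rule and the backlog items are all assumptions for illustration, not the actual policies from the training.

```python
# A hypothetical work-selection policy: defer externally blocked items,
# then order the rest by CD3 (Cost of Delay Divided by Duration).
# All item data and scoring choices here are invented for illustration.

from dataclasses import dataclass

@dataclass
class WorkItem:
    name: str
    cost_of_delay: float       # e.g. value lost per week of delay
    duration_weeks: float      # estimated time to deliver
    external_dependency: bool  # blocked on someone outside the team?

def cd3(item: WorkItem) -> float:
    """Higher CD3 means select sooner."""
    return item.cost_of_delay / item.duration_weeks

def select_next(backlog: list[WorkItem]) -> list[WorkItem]:
    # Policy: skip items with unresolved external dependencies,
    # then order the remainder by CD3, highest first.
    ready = [i for i in backlog if not i.external_dependency]
    return sorted(ready, key=cd3, reverse=True)

backlog = [
    WorkItem("Reporting fix", 8000, 1.0, False),
    WorkItem("New checkout", 20000, 4.0, True),
    WorkItem("Search tweak", 5000, 0.5, False),
]
for item in select_next(backlog):
    print(item.name, round(cd3(item)))
```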

It’s observed that when this is done, meetings to replenish a ‘Ready To Start’ column are much quicker and less prone to argument.

Would these policies also mitigate a PO bottleneck? It might be something worth trying in your context. It may just end up producing more value for you, or rather for your customer.

 


Coach as ‘Pair of Hands’

What happens if you acquiesce to being an extra ‘pair of hands’ on your coaching assignment, agreeing to do things for the coaching client?

What might it look like if you resist doing ‘staff augmentation’?

Here’s what it may look like with various scenarios plotted: passage of time horizontally, and performance level (you choose the metric, e.g. throughput, cycle time, quality) vertically.

Note: a J-curve visualization is different; that represents the dip that occurs when learning something new. This is a model of the coach’s impact on a team or organisation overall.

No apologies offered for the hand-drawn nature of these drawings.

[Figure: Coach’s effect on team/organisation performance]

Gabe Abella’s presentation on self-organizing teams was a source of inspiration, as were various sources in the Lean literature.


Attention to Quality gets Results

Most managers, and some ‘Agile’ coaches, I encounter shudder when the subject of quality comes up. They think quality hampers delivery. In some ways they are right: you do want to achieve balance, avoiding too much polishing because it can make delivery late (though in mission-critical projects it may be better to over-polish if the cost of failure is high).

Some managers may shudder because of a quiet ‘oops, I may have been found out’: they’ve been pushing their teams to push out code which is not really ready.

Other managers, often but not always the inexperienced ones, have no idea what quality is. They tend to believe their development teams’ claims that the code is ‘unit tested’.

As a manager, quality is one of your biggest concerns, if not the biggest. Addressing quality is an economic concern, and managers are responsible for economic outcomes. Bad quality will slow down your team. But not just that: it will also slow down your organisation.

How does this occur? How does it slow down your organisation? Well, I’ve seen it time and time again when I come to a new client. Work is at a standstill, and when I dig a bit deeper I see the same issue appear.

Yes, teams are overwhelmed with work, so limiting WiP will help. Workflow mapping and lean techniques will identify bottlenecks; yes, address those. Importantly, within those bottlenecks I find that the reason for the delays is the negative feedback of poor quality.

Upstream, I see product ideas waiting to be developed and tested with the customer because a wave of poor quality is slowing down the delivery pipe. Overwhelmed with defects, teams struggle to pull new work into development. When they do eventually pull that work in, it is developed with such poor quality, or worse, the facade of quality (e.g. formal QA), that it further feeds the negative feedback loop.

That loop gets ever slower to run, until it turns into ‘three-month stabilization’ phases: an entire quarter in which the organisation freezes any new releases into production to avoid an outage.

These sorts of mitigating steps often occur after some disaster whose root cause is poor quality. One big example is releasing a new version of software that shuts down an assembly line at a factory and affects thousands of customers, who are prevented from using their mobile devices. The cost there was millions of dollars.

So asking for quality is easier said than done. Or does it have to be that hard? Usually workers are over-burdened, so limit WiP, as mentioned earlier. But then ask for quality. Now that they are not over-burdened, they can put their efforts into producing quality output.

Give teams the space to improve and they usually will. Sometimes, as a coach, I can give them some training and they will do it all by themselves (this interview demonstrates this), up to a point; sometimes being involved with them is needed from the outset, or when levelling out or stalling occurs.

In my experience, results can happen very quickly. For an organisation I was recently involved with, it took three months to go from a cycle time of 40 days per feature to 3 days. This involved addressing quality and bringing in aligned practices from Extreme Programming and Specification by Example. Reducing WiP and batch size also assisted. It happened in stages, guided by data from a Cumulative Flow Diagram that mapped the stages of delivery and guided improvement efforts. The first stage got cycle time down to 9 days (by applying WiP limits, policies on size, and changing the test strategy); the next stage, amplifying the unit testing practices to remove an archaic and slow UI testing bottleneck (and the sunk cost fallacy associated with it), really accelerated the flow of work.
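
For illustration, here is a minimal sketch of how Cumulative Flow Diagram data can be derived from work-item stage timestamps. The stage names, dates and items below are invented for the example; they are not the client’s actual workflow.

```python
# Build CFD band data: for each day, count how many items had reached
# each stage. Plotting these counts as stacked bands over time gives
# the Cumulative Flow Diagram. All data here is made up.

from datetime import date, timedelta

STAGES = ["Ready", "In Dev", "In Test", "Done"]

# date each item entered each stage (None = not yet reached)
items = [
    {"Ready": date(2024, 1, 1), "In Dev": date(2024, 1, 3),
     "In Test": date(2024, 1, 8), "Done": date(2024, 1, 10)},
    {"Ready": date(2024, 1, 2), "In Dev": date(2024, 1, 5),
     "In Test": None, "Done": None},
]

def cfd_counts(day: date) -> dict[str, int]:
    """How many items had reached each stage by the given day."""
    return {s: sum(1 for i in items if i[s] and i[s] <= day)
            for s in STAGES}

day = date(2024, 1, 1)
while day <= date(2024, 1, 10):
    print(day, cfd_counts(day))  # one row per day of the CFD
    day += timedelta(days=1)
```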

AgWorld, the published case study, did it in six months. They chose to do it for themselves, which is fine. Teams can achieve a lot more with a coach who can bring the practices in quicker and help avoid the stagnation which does tend to happen as well.

Find a slow process and, more often than not, you will find poor quality. Address quality and just observe how things get better for everyone.

 


Design Thinking and its Relevance to Agile Testing

[Image: Design Thinking book cover]

Here I present a review of Design Thinking: Integrating Innovation, Customer Experience, and Brand Value, connecting it to Agile Testing as I go. Quotations are from the book unless explicitly noted otherwise. The good news, I think, is that Agile Testing and the quality mindset have an important part to play. Does it resonate and connect with you? Can they work together? Are you doing something like it already? Can we learn from each other? Let’s see.

Many practices within Agile Testing map to Design Thinking. The techniques are being applied, knowingly or unknowingly. Let’s make them explicit.

We can connect this immediately to the book’s introduction:

‘The point of the story is that our innovation was successful, better products came to market, and people were more satisfied because a few open-minded designers teamed with some thoughtful engineers and with users who liked to experiment, developed and tested very rough prototypes, discovered flaws and reworked quickly, and included business analysis during the development process.  I call this design thinking…’

Well, I also find it a very good definition of Agile Testing! Or near enough; there is a missing piece for most teams. It has the key words: thoughtful engineers (testers and developers), users and business analysts combining with designers (the missing piece?) to create the process. The missing piece is that testers and developers rarely get to participate in the strategy-defining tests that design thinkers are doing. There may be cases where they do, and certainly in smaller companies it happens. In larger, more traditional organisations, I haven’t seen it. The real customer-facing side of the business is often obscured and obfuscated.

Here is some more on the missing piece for us from the book’s introduction:

‘design thinking is primarily an innovation process.  It is a way to help discover unmet needs and opportunities and create new solutions’

Phil Best says it’s ‘a five step process that includes immersion and understanding, discovery of opportunities, creating a vision, validation with key stakeholders, and finally, integration and activation’

Testing is not usually considered a player in innovation. I think that when it is allowed to flourish it can really excel. We may not even notice, but some very innovative ideas come out of good agile testing (requirements) workshops, which leads on to …

Everybody is Connected to a Goal

‘A big part of modern product development is the workshop exercise.’

And this too is what we do with ATDD, BDD and SBE. These are focused discussions of what the software for a product should do. They encourage discussion and, importantly, dissension. We call this a cross-functional gathering.

Design Thinking takes it further. It is more than cross-functional; it’s interdisciplinary. Design Thinking calls for a heightened view of customer needs. Often this is given only lip service. Many enlightened teams do mind mapping and paper prototyping, all of which fit nicely in the Design Thinking toolkit.

Can we take it further? Customer empathy generates real value. Do we understand what makes the customer happy, or are we just order takers? Design Thinkers spend their time on the options that determine this. As testers, can we take it to another level, amplifying the techniques we already use and supplementing them with others to achieve better value?

Let’s continue with more excerpts from the book, making direct connections to what we do:

‘start any new design activity or change program with an intent workshop.  This involves inviting all key stakeholders …’  p. 27

Liftoffs are one way to approach this; Strategy Deployment is a more detailed and focused approach.

Exploratory Testing

Exploratory testing is just testing, according to http://www.satisfice.com/blog/archives/1509:

‘The role of thinking, feeling, communicating humans became displaced.’ from http://www.satisfice.com/blog/archives/1509

We need to link testing to a human process and release it from the straitjacket of the script. Context-Driven Testing (CDT) seems to line up somewhat with Design Thinking.

To further align with CDT definitions would be to identify with a humanistic, even whole-of-body, experience of testing. It brings in abduction, what might be, with less emphasis on deduction, what it should be, since that could prevent innovation. Testing would still be somewhat aligned to induction, proving it does work, but with the continuum slanted towards a mindset of abduction, particularly early on.

From Elisabeth Hendrickson’s book Explore It!: Reduce Risk and Increase Confidence with Exploratory Testing,

‘No matter how many tests we write, no matter how many cases we execute, we always find the most serious bugs when we go off script’,

the message of going off script helps even when discovering what the user needs. That’s how we can debug the process. That’s how testers can help in Design Thinking.

Let’s turn Hendrickson’s definition of Exploratory Testing around to suit Design Thinking:

‘Simultaneously designing and executing tests to learn about the system, using your insights from the last experiment to inform the next’

Changing only a few words, we get:

‘Designing and executing experiments to learn about the system, using your insights from the last experiment to inform the next’

Almost the same, and perhaps I could have left it unchanged. Tests are experiments, but I put ‘experiment’ in to convey a greater gravity to the idea. The experiments do not have to be the real thing; they can be a role play, and they will generate immediate feedback from real customers.

Exploratory test charters may in fact be useful, and their template of Target, Resources and Information (to find) could be a useful model for Design Thinkers to use. Importantly, the charters are designed to leave enough room for exploration, avoiding prescriptive specifics.
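
As a small illustration, the charter template could be expressed as a simple data structure that testers and Design Thinkers share. The field contents below are invented examples, not a prescribed format.

```python
# A sketch of the Target / Resources / Information charter template,
# repurposed for a design-thinking experiment. Field values are made up.

from dataclasses import dataclass

@dataclass
class Charter:
    target: str       # explore <target>
    resources: str    # with <resources>
    information: str  # to discover <information>

    def __str__(self) -> str:
        return (f"Explore {self.target} with {self.resources} "
                f"to discover {self.information}")

print(Charter(
    target="the sign-up journey",
    resources="a paper prototype and two pilot customers",
    information="where new users hesitate or give up",
))
```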

During requirements sessions and design thinking sessions, candidate charters can be constructed for later on. You can be testing the requirements with an interdisciplinary group of people, shortcutting the assumptions that even they can hold. Formulating tests is an expression of requirements, and all their shortcomings are realised earlier.

Design Thinkers may not understand the ‘-ilities’ part of software. Here’s the chance to bring that to bear. Listen out for reliability, performance and load requirements, often expressed in the language they are familiar with.

Testers Question

Testers have the opportunity, through their knowledge of how products work, to be innovators. A frequent example I see on websites is not dealing with ethnographic qualities. Different font sets for non-English languages are often neglected. That can give rise to interesting bugs as well. But what about the experience for that group? Could that experience take them elsewhere? Was it even considered, and what of the cost if it was?

On accepting requirements as-is, from the book: ‘This form of research is really just being an order taker, not an innovator’.

Accepting requests as-is leaves out innovation. Testers know this, as do ‘test infected’ developers.

It means encouraging the proverbial dog with a bone:

‘Remember, the key is to solve the right problem, and the right problem is not often the first identified’ p. 89

‘Only if you take people out of their comfort zone you get meaningful answers’ p. 116

Use your tester’s doubting mind to check assumptions. Some examples are documented by Gojko Adzic in his book Specification by Example: How Successful Teams Deliver the Right Software. I include my own example in my training.

As mentioned previously, also make sure it is always connected to a goal:

‘Ideas need to be based on some relevant insight about connection to the desired outcome – otherwise they are just ideas’ p. 148

Here’s something where we can probably all improve; most of us are just not exposed to this. We should be. We should (be allowed to) take the plunge and learn more.

‘The more brands understand that it is not just the features of a brand that create consumer pull but the benefits as well, the more leadership they will attain within their categories’  p. 101

Leaders need to allow for this: there will be failures, and they will need to accept that. Risk-taking leaders will excel here, and their companies generally do better. Office furniture maker Herman Miller is an active risk-taker, factoring risk into its strategy. p. 166

Here is something that links to overall goals of good product development:

‘Designers, however, prefer to proceed with a flexible toolbox of heuristics and an agile, curious mind. They don’t know yet what the outcome will be of their creative explorations, and therefore cannot define what specific steps may be required to get there’  p. 25

So for testers to shine, following this is key. After this quote, the book gives a table of dysfunctions and the corresponding antidotes for a design-friendly environment; instead of copying and pasting it here, take a look yourself.

Behaviour Driven Development – BDD

BDD fits in very well with Design Thinking. They are two approaches to working in complex domains that sprang up, it seems, independently, yet share many of the same values. BDD brings in automation to shorten the feedback loop between the what and the how; however, the front end of BDD is the most valuable part when executed in the spirit it was intended.
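
For readers who haven’t seen BDD’s automation side, here is a minimal sketch using the behave library for Python. The feature text and steps are hypothetical examples, not from any real project.

```python
# A minimal BDD sketch with behave (pip install behave).
#
# features/withdraw.feature (Gherkin, shown as a comment):
#   Feature: Cash withdrawal
#     Scenario: Customer withdraws within balance
#       Given an account with a balance of 100
#       When the customer withdraws 30
#       Then the account balance is 70
#
# features/steps/withdraw_steps.py:

from behave import given, when, then

@given("an account with a balance of {amount:d}")
def step_account(context, amount):
    context.balance = amount  # set up the hypothetical account

@when("the customer withdraws {amount:d}")
def step_withdraw(context, amount):
    context.balance -= amount  # exercise the behaviour under discussion

@then("the account balance is {expected:d}")
def step_check(context, expected):
    assert context.balance == expected  # the shared, executable expectation
```

The value is as much in writing the feature text together, across disciplines, as in running it.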

Dan North, coiner of the term, has spoken and written about Deliberate Discovery. It means admitting to fallibility: in ego, in skillsets, and in knowing what the user truly needs. Ethnography is a key plank, but few people in the process get to witness what the user does and feel empathy for the user. Usually ‘budget’ and resource-utilization thinking mean this is siloed, with business people responsible for what the user needs (delivering personas, perhaps) and the devs and testers making sure it’s built and delivered. As an aside, Rachel Davies uses the word ‘research’ for user stories. She coaches the developers in XP teams to write the story and do the research to build it, thereby connecting fully with customer needs, rather than building all those stories and demoing them in a sprint review. That seems to fit in nicely, building on empathy for customer needs.

Often the disconnect between needs and delivery is not felt until later, when the unhappy user gets something they didn’t want or need. This can result in expensive rewrites, which I’ve seen and no doubt others have as well. Design Thinking and BDD both want to eliminate this possibility as much as possible. Testers and devs have a key role to play. (Some teams only have devs, so being design- and test-infected is even more helpful for them.)

Connecting to What is Human

Ultimately it comes down to what people feel; if you aren’t doing something that connects to them positively, then we haven’t passed. Connect the head, heart and gut. Usability testing, often neglected in favour of pure functional outputs, is therefore important.

[Image: head, heart, gut]

This is where innovation can arise, and testers can have the eye for it. Being connected from the beginning, during execution and through to the end fits right in. Here’s some more to back up starting at the beginning:

‘Customer experience mapping, or the process of storyboarding and documenting a variety of possible scenarios with detailed interactions and outcomes, is a useful means of probing and uncovering opportunities to design a better service’ p. 200

This example actually refers to service design (Design Thinking goes beyond ‘things’ to designing experiences), but the concepts are still valid. A tester is practised in thinking through scenarios and different outcomes, and following through to the details is another area where testers thrive:

‘This reminds us that service design is about attention to detail, as even a staff member’s failure to say “thankyou” can leave the customer with a perception of inferior service.  Looking for opportunities to influence positive perceptions should be part of the service design process.’  p. 200

Again it refers to service design, yet it is still applicable to the software world. The little details get noticed; they might not be obvious, but they still accumulate. Testers know this through asking questions, and with an exploratory mindset they can sniff these things out.

At the same time as you are building your technical skills in automation, keep this in mind:

‘If you want long-term profits, don’t start with technology – start with design’  p. 19

It is so tempting to resort to, and keep to, the norm of reductionist silos and let others do that. Rather, it is more relevant today than ever that interdisciplinary collaboration, in all its initial messiness, is where the insights, innovations and opportunities lie.

‘All too often, it seems, businesses either excel at the creative side, in which case innovations usually fail, or they excel at the analysis side, which generally leads to only incremental innovation, or more likely, stagnation.’  from the introduction of the book

What does it mean for testers? It means knowing that this is important:

‘a shift from using design to make things simple and easy to design being about making people care’  also from the book’s introduction.

It will mean that you’ll need to factor in the Social and the Economic along with the Technical. These SET factors are forces that interact in emergent ways, and only through probing can the preferred state for customers be discovered.

Role of Quality

The probes will not all be right! It is generally accepted that 50% will fail, and sometimes 80%, as observed at a recent client. Still, much learning occurs from either outcome. Testing will execute on quality, and that is still important. Quality will enable the emergence of the right product, and only after the market has had a chance to validate it. We can’t know that upfront; it may exceed or underperform expectations. Just know that quality has its part to play. That helps with the punt the business is taking.

Agile Testing therefore has a very relevant part to play in innovative product development.


Keeping up Technical Chops

I got asked to do this for a coaching job. It was nice to do.

There is an important and underestimated place for technical excellence. Heck, it’s a principle! Combine it with the others 🙂 Standing on the shoulders of giants: thanks to Beck, Rainsberger, Meszaros, Fowler, Wake, the Pragmatic Programmers and many more unnamed. The repository has 36 commits to demonstrate the process. Feel free to take a look at it. #agile #tdd #quality #emergent

https://github.com/nzdojo/spellchecker

Read the notes:

https://github.com/nzdojo/spellchecker/wiki/Notes-on-Implementation

‘Keep hands on, that includes the code!’

 


Right Size First, Then Split

There is a hell of a lot going on in the Agile space on story splitting: on the internet, and in interviews and conversations. It’s as if the only solution to smaller batches is to split ad infinitum.

Now don’t get me wrong, big batches are not good.  They hurt feedback mechanisms, create silos and delay delivery of value.

And there is the point: delivery of value. Getting value to the customer is important. Story splitting is one way, but it doesn’t need to be the starting point. We may be losing our focus on delivery of value by thinking about a process. Value is not defined by the splitting of stories.

Something to consider, then, is to understand the nature of your work. Are there different types of work, or to put it another way, sources of demand? Can you use that information to help?

Maybe your team is predictable in its delivery. That’s great, and congratulations to you on that. That data on delivery time can be leveraged. If you can say you deliver product enhancements on one product within a time frame, and be confident about that, say, 85% of the time, there you go: you have the makings of a Service Level Agreement, or SLA. Your customer may even be happy with that!

How can that help in sizing stories? Well, Dan Vacanti describes it in his book Actionable Agile Metrics for Predictability. It’s called right-sizing. In essence, it shortens the sizing conversation by asking: does the work we are about to take on fit within our SLA? If so, pull the work. If not, have that splitting conversation.
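
A minimal sketch of that right-sizing check, assuming you have historical cycle times to hand; the numbers and the 85% service level below are illustrative only.

```python
# Derive the SLA from historical cycle times, then use it as the
# right-sizing yardstick. All data here is invented for illustration.

import statistics

cycle_times_days = [2, 3, 3, 4, 5, 5, 6, 8, 9, 14]  # past delivered items

# 85th percentile of historical cycle time -> the SLA figure
sla_days = statistics.quantiles(cycle_times_days, n=100)[84]

def right_sized(estimated_days: float) -> bool:
    """Does this item fit within the SLA? If not, discuss splitting."""
    return estimated_days <= sla_days

print(f"SLA: 85% of items done within {sla_days:.0f} days")
print(right_sized(4))   # True  -> pull the work
print(right_sized(20))  # False -> have the splitting conversation
```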

It’s that simple. However, if you are experiencing problems with predictability and expanding delivery times, you’ll want to deal with those. Dan’s book describes how Little’s Law and its components give you ample information to help discover issues in your process.
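
For reference, Little’s Law relates the averages of WIP, throughput and cycle time. A tiny worked example, with made-up numbers:

```python
# Little's Law: Average Cycle Time = Average WIP / Average Throughput.
# The numbers below are invented purely to show the arithmetic.

avg_wip = 12          # items in progress, on average
avg_throughput = 4.0  # items finished per day, on average

avg_cycle_time = avg_wip / avg_throughput
print(f"Average cycle time: {avg_cycle_time:.1f} days")  # -> 3.0 days
```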

Story splitting is one option to start with; you can consider others like this. I think it will help you understand your work at a higher level of fidelity. Your customer will hopefully feel the difference as you deliver more meaningful increments of value from that increased understanding. Coincidentally, you’ll also be on your way to becoming a better systems and lean thinker, interesting subjects that will aid you in being of best service to your customers.


What makes you Fit for Purpose?

Can someone answer this question? Well yes, I can help. In short, it’s what you do, individually but mostly as a team, to ensure anyone you serve receives the service you provide in a reasonable time frame and with appropriate quality.

Where does that start?

It starts with understanding who your customer is, and taking steps to learn about the customer and what satisfies them. There are tools available to help, like customer surveys, customer interviews, customer empathy maps, personas, and going to see for yourself.

Somewhat lagging, but still providing information about the customer and their needs (which can be, and probably are, changing), is the Net Fitness Score, an alternative to the Net Promoter Score. If you are producing software, you can add code to capture data about how your customers are using the application. This is part of a process called instrumentation.
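
As a hedged sketch of what such instrumentation might look like, here is a Python decorator that records each use of a feature. The event sink (a log file via the standard logging module) and the feature name are assumptions for illustration.

```python
# Record which features customers actually use, and how long each
# invocation took. In practice the events would feed an analytics
# pipeline; a log file stands in for that here.

import functools
import logging
import time

logging.basicConfig(filename="usage.log", level=logging.INFO)

def instrumented(feature_name: str):
    """Wrap a feature entry point to record each use and its duration."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return func(*args, **kwargs)
            finally:
                elapsed = time.perf_counter() - start
                logging.info("feature=%s duration_ms=%.1f",
                             feature_name, elapsed * 1000)
        return wrapper
    return decorator

@instrumented("export_report")  # hypothetical feature name
def export_report():
    ...  # the real feature code would go here
```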

This takes us into the area of measurement. In addition to customer measures, a team can also use other metrics to ensure delivery is just right. By just right, we mean that any feelings of over-burdening (muri) are minimised, so that the team can sustainably deliver work.

Some measures here include service response times (cycle time) for the different types of requests a team gets. Are we able to deliver those reliably? By reliable we mean within the realms of probability, not exact measures. Knowledge work is naturally variable, so we tend to defer to probabilities, like a Service Level Agreement: 85% of the time we can deliver in 3 days, for example.

In aiming for better on-time delivery, you may need to eliminate muda, or wasteful activities. You may find that amplifying collaborative activities and learning new skills will help. These types of improvements stem from understanding the nature of different requests: demand (high and low periods), expectations of quality, and when requests are expected to be fulfilled (cost of delay).

Another measure is the defect level, with the aim of reducing it to a negligible level. Defects may need to be balanced against responsiveness. If you require greater responsiveness, then fit for purpose may mean accepting a higher failure load (another name for total defects). Responsiveness may also mean less predictability, and some work may have a wider range of delivery-date performance.

If failure load is high, then addressing quality also has a bearing on on-time delivery. In software development that means ensuring little or no technical debt. High levels of technical debt lengthen cycle times as a team deals with the complexity of software laden with it. Continually reducing and maintaining low levels of technical debt will help maintain reliable delivery. It will also allow innovation to occur, because the team is freed from the burden of low quality.

Addressing these areas and becoming reliable means you will have the confidence to communicate service level expectations within reasonable levels of probability. Doing this with appropriate quality will often result in plaudits for the team, reversing what may be many sources of dissatisfaction for customer and team alike. Find out what makes your system of work fit for purpose. Work hard on reaching that level. Agility will be a natural result.