Category Archives: Agile

Design Thinking and its Relevance to Agile Testing

[Image: Design Thinking book cover]

Here I present a review of Design Thinking: Integrating Innovation, Customer Experience, and Brand Value, connecting it to Agile Testing as I go. Quotations are from the book unless explicitly stated otherwise. The good news, I think, is that Agile Testing and the quality mindset have an important part to play. Does it resonate and connect with you? Can they work together? Are you doing something like it already? Can we learn from each other? Let's see.

Many of the things that shape Agile Testing map to Design Thinking. The techniques are already being applied, knowingly or unknowingly. Let's make them explicit.

We can connect this immediately to a passage from the book's introduction:

‘The point of the story is that our innovation was successful, better products came to market, and people were more satisfied because a few open-minded designers teamed with some thoughtful engineers and with users who liked to experiment, developed and tested very rough prototypes, discovered flaws and reworked quickly, and included business analysis during the development process.  I call this design thinking…’

Well, I also find it a very good definition of Agile Testing! Or near enough – there is a missing piece for most teams. It has the key ingredients: thoughtful engineers (testers and developers), users and business analysts combining with designers (the missing piece?) to create the process. The missing piece is that testers and developers rarely get to participate in the strategy-defining tests that design thinkers are doing. There are cases where they do, and in smaller companies it certainly happens. In larger, more traditional organisations I haven't seen it. The real customer-facing side of the business is often obscured and obfuscated.

Here is some more on the missing piece for us from the book’s introduction:

‘design thinking is primarily an innovation process.  It is a way to help discover unmet needs and opportunities and create new solutions’

Phil Best says it’s ‘a five step process that includes immersion and understanding, discovery of opportunities, creating a vision, validation with key stakeholders, and finally, integration and activation’

Testing is not usually considered a player in innovation. I think it is one, and when it is allowed to flourish it can really excel. We may not even notice, but some very innovative ideas come out of good agile testing (requirements) workshops, which leads on to …

Everybody is Connected to a Goal

‘A big part of modern product development is the workshop exercise.’

And this too is what we do with ATDD, BDD and SBE. These are focussed discussions of what the software for a product should do. They encourage discussion and, importantly, dissension. We call this a cross-functional gathering.

Design Thinking takes it further. It's more than cross-functional; it's interdisciplinary. Design Thinking calls for a heightened view of customer needs. Often this is given only lip service. Many enlightened teams do mind mapping and paper prototyping, all of which fit nicely into the Design Thinking toolkit.

Can we take it further? Customer empathy generates real value. Do we understand what makes the customer happy, or are we just order takers? Design Thinkers spend their time on the options that determine this. As testers, can we take it to another level, amplifying the techniques we already use and supplementing them with others to achieve better value?

Let’s continue with more excerpts from the book, making direct connections to what we do:

‘start any new design activity or change program with an intent workshop.  This involves inviting all key stakeholders …’  p. 27

Liftoffs are one way to approach this; Strategy Deployment is a more detailed and focused approach.

Exploratory Testing

Exploratory Testing is just testing, according to http://www.satisfice.com/blog/archives/1509

‘The role of thinking, feeling, communicating humans became displaced.’ from http://www.satisfice.com/blog/archives/1509

We need to link testing to a human process and release it from the straitjacket of the script. Context Driven Testing (CDT) seems to line up somewhat with Design Thinking.

To align further with CDT definitions would be to identify with a humanistic, even whole-of-body, experience of testing. It brings in abduction: what might be. It leans less on deduction, what it should be, since that could prevent innovation. Testing would still be somewhat aligned to induction, proving that it does work, but the continuum is slanted towards a mindset of abduction, particularly early on.

From Elizabeth Hendrickson’s book Explore It!: Reduce Risk and Increase Confidence with Exploratory Testing,

‘No matter how many tests we write, no matter how many cases we execute, we always find the most serious bugs when we go off script’,

the message of going off script helps even when we are discovering what the user needs. That's how we can debug the process. That's how testers can help in Design Thinking.

Let’s turn Hendrickson’s definition of Exploratory Testing around to suit Design Thinking:

‘Simultaneously designing and executing tests to learn about the system, using your insights from the last experiment to inform the next’

Changing only a few words, we get

‘Designing and executing experiments to learn about the system, using your insights from the last experiment to inform the next’

Almost the same, and perhaps I could have left it unchanged. Tests are experiments, but I put 'experiment' in to give the idea more gravity. The experiments do not have to be the real thing; they can be a role play and will still generate immediate feedback from real customers.

Exploratory Test Charters may in fact be useful here, and their template of Target, Resources and Information (to find) could be a useful model for Design Thinkers to use. Importantly, charters are designed to leave enough room for exploration by ignoring the specifics.
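As a minimal sketch (the field names follow the Target/Resources/Information template above; the example content is invented), a charter can be captured as something this lightweight:

```python
from dataclasses import dataclass

@dataclass
class Charter:
    """A lightweight exploratory test charter: deliberately broad, leaving room to explore."""
    target: str           # the area, feature or idea to explore
    resources: list[str]  # tools, data, personas or prototypes to explore it with
    information: str      # the kind of information we hope to discover

charter = Charter(
    target="New customer sign-up journey",
    resources=["paper prototype", "non-English locale test data"],
    information="Where first-time users hesitate, and why",
)
print(charter)
```

Design Thinkers could use the same three headings on a sticky note; the point is the structure, not the code.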

During requirements sessions and design thinking sessions, candidate charters can be constructed for later on. You can be testing the requirements with an interdisciplinary group of people, which short-circuits the assumptions even they can hold. Formulating tests is an expression of the requirements, and all their shortcomings are realised earlier.

Design Thinkers may not understand the '-ilities' side of software. Here's the chance to bring that knowledge to bear. Listen out for the reliability, performance and load qualities, often expressed in the language they are familiar with.

Testers Question

Testers have the opportunity, through their knowledge of how products work, to be innovators. A frequent example I see in websites is not dealing with ethnographic qualities. Different font sets for non-English languages are often neglected. That can give rise to interesting bugs as well. But what about the experience for that group? Could that experience take them elsewhere? Was it even considered, and if it was, what of the cost?

On accepting requirements as-is, from the book: 'This form of research is really just being an order taker, not an innovator'

Accepting requests as-is leaves out innovation. Testers know this, as do 'test-infected' developers.

It means encouraging the proverbial dog with a bone:

‘Remember, the key is to solve the right problem, and the right problem is not often the first identified’ p. 89

‘Only if you take people out of their comfort zone you get meaningful answers’ p. 116

Use your tester's doubting mind to check on assumptions. Some examples are documented by Gojko Adzic in his book Specification by Example: How Successful Teams Deliver the Right Software. I include my own example in my training.

As mentioned previously, also make sure it is always connected to a goal:

‘Ideas need to be based on some relevant insight about connection to the desired outcome – otherwise they are just ideas’ p. 148

Here's something where we can probably all improve; most of us are just not exposed to this. We should be. We should (be allowed to) take the plunge and learn more.

‘The more brands understand that it is not just the features of a brand that create consumer pull but the benefits as well, the more leadership they will attain within their categories’  p. 101

Leaders need to allow for that; there will be failures, and they will need to accept them. Risk-taking leaders will excel here, and their companies generally do better. Office furniture maker Herman Miller is an active risk-taker, factoring risk into its strategy. p. 166

Here is something that links to overall goals of good product development:

‘Designers, however, prefer to proceed with a flexible toolbox of heuristics and an agile, curious mind. They don’t know yet what the outcome will be of their creative explorations, and therefore cannot define what specific steps may be required to get there’  p. 25

So for testers to shine, following this is key. Following this quote in the book is a table of dysfunctions describing the cults and the corresponding antidotes for a design-friendly environment – rather than copy and paste it here, take a look yourself.

Behaviour Driven Development – BDD

BDD fits in very well with Design Thinking. They are two disciplines for complex domains that sprang up, it seems, independently, yet share many of the same values. BDD brings in automation to shorten the feedback loop between the what and the how; however, the front end of BDD is the most valuable part when executed in the spirit in which it was intended.
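As a minimal sketch (the scenario, the toy Basket class and the wording are invented for illustration, not taken from any real project), the front end of BDD turns workshop conversation into an executable specification along these lines, written here as a plain pytest-style test with the Given/When/Then kept visible as comments:

```python
class Basket:
    """Toy domain model, only to make the example self-contained."""
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)


def test_basket_total_reflects_every_item_added():
    # Given a basket containing two items
    basket = Basket()
    basket.add("notebook", 10)
    basket.add("pen", 2)
    # When the customer views the total
    total = basket.total()
    # Then the total reflects both items
    assert total == 12
```

The value is less in the assertion than in the shared language: the Given/When/Then lines are the part the whole interdisciplinary group can read, argue with and dissent from.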

Dan North, coiner of the term, has spoken and written about Deliberate Discovery. It means admitting to fallibility in ego, in skillsets and in knowing what the user truly needs. Ethnography is a key plank, but few people in the process get to witness what the user does and feel empathy for the user. Usually, 'budget' and resource-utilisation thinking means this is siloed: business people are responsible for what the user needs, perhaps delivering personas, while the devs and testers make sure it's built and delivered. As an aside, Rachel Davies uses the word 'research' for user stories. She coaches developers in XP teams to write the story and do the research to build it, thereby connecting fully with customer needs rather than just building all those stories and demoing them in a sprint review. That seems to fit in nicely, building on empathy for customer needs.

Often the disconnect between needs and delivery is not felt until later, when the unhappy user gets something they didn't want or need. This can result in expensive rewrites, which I've seen and no doubt others have as well. Design Thinking and BDD both want to eliminate this possibility as much as possible. Testers and devs have a key role to play. (Some teams only have devs – so being design- and test-infected is even more helpful for them.)

Connecting to What is Human

Ultimately it comes down to what people feel; if we aren't doing something that connects with them positively, then we haven't passed. It's about connecting the head, heart and gut. Usability testing, often neglected in favour of pure functional outputs, is important here.


This is where innovation can arise, and testers can have the eye for it. Being connected from the beginning, during execution and through to the end fits right in. Here's some more to back up starting at the beginning:

‘Customer experience mapping, or the process of storyboarding and documenting a variety of possible scenarios with detailed interactions and outcomes, is a useful means of probing and uncovering opportunities to design a better service’ p. 200

This example actually refers to service design (Design Thinking goes beyond 'things' to designing experiences), but the concepts are still valid. A tester is practised in thinking through scenarios and different outcomes, and continuing on into the details is another area testers thrive in:

‘This reminds us that service design is about attention to detail, as even a staff member’s failure to say “thankyou” can leave the customer with a perception of inferior service.  Looking for opportunities to influence positive perceptions should be part of the service design process.’  p. 200

Again it refers to service design, yet it is still applicable to the software world. The little details get noticed; they might not be obvious, but they accumulate. Testers know this through asking questions, and with an exploratory mindset they can sniff these things out.

At the same time as you are building your technical skills in automation, keep this in mind:

‘If you want long-term profits, don’t start with technology – start with design’  p. 19

It is so tempting to fall back on the norm of reductionist silos and let others do that. Rather, it is more relevant today than ever that interdisciplinary collaboration, in all its initial messiness, is where the insights, innovations and opportunities lie.

‘All too often, it seems, businesses either excel at the creative side, in which case innovations usually fail, or they excel at the analysis side, which generally leads to only incremental innovation, or more likely, stagnation.’  from the introduction of the book

What does it mean for testers? It means knowing that this is important:

‘a shift from using design to make things simple and easy to design being about making people care’  also from the book’s introduction.

It will mean you'll need to factor in the Social and the Economic along with the Technical. These SET factors are forces that interact in emergent ways, and only through probing can the preferred state for customers be discovered.

Role of Quality

The probes will not all be right! It is generally accepted that 50% will fail, and sometimes 80%, as observed at a recent client. Still, much learning occurs either way. Testing will execute on quality, and that is still important. Quality will enable the right thing to emerge, but only after the market has had a chance to validate it. We can't know that upfront; it may exceed or underperform expectations. Just know that quality has its part to play. That helps with the punt the business is taking.

Agile Testing therefore has a very relevant part to play in innovative product development.


Keeping up Technical Chops

Got asked to do this for a Coaching job.  It was nice to do.

There is an important and underestimated place for technical excellence. Heck, it's a principle! Combine it with the others 🙂 Standing on the shoulders of giants: thanks Beck, Rainsberger, Meszaros, Fowler, Wake, the Pragmatic Programmers and many more unnamed. The repository below has 36 commits to demonstrate the process. Feel free to take a look at it. #agile #tdd #quality #emergent

https://github.com/nzdojo/spellchecker

Read the notes:

https://github.com/nzdojo/spellchecker/wiki/Notes-on-Implementation

‘Keep hands on, that includes the code!’

 


Right Size First, Then Split

There is a hell of a lot going on in the Agile space around Story Splitting, on the internet and in the interviews and conversations I have. It's as if the only route to smaller batches is to split ad infinitum.

Now don’t get me wrong, big batches are not good.  They hurt feedback mechanisms, create silos and delay delivery of value.

And there is the point: delivery of value. Getting value to the customer is important. Story Splitting is one way, but it doesn't need to be the starting point. We may be losing our focus on delivery of value by thinking about a process. Value is not defined by the splitting of stories.

Something to consider, then, is understanding the nature of your work. Are there different types of work or, to put it another way, different sources of demand? Can you use that information to help?

Maybe your team is predictable in its delivery. That's great, and congratulations to you on that. That delivery-time data can be leveraged. If you can say 'we can deliver product enhancements on this product within a given time frame' and be confident about that, say, 85% of the time, there you go – you have the makings of a Service Level Agreement, or SLA. Your customer may even be happy with that!

How can that help in sizing stories? Dan Vacanti describes it in his book Actionable Agile Metrics for Predictability. It's called Right Sizing. In essence, it shortens the sizing conversation to a single question: does the work we are about to take on fit within our SLA? If so, pull the work. If not, have that splitting conversation.
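As a minimal sketch of the idea (the history, the helper names and the comparison against an expected duration are illustrative assumptions, not lifted from Vacanti's book), the SLA can be read straight off historical cycle times and the right-sizing question becomes a simple yes/no check:

```python
def sla_days(cycle_times_days, confidence=0.85):
    """Service-level expectation: the delivery time we have met `confidence` of the time."""
    ordered = sorted(cycle_times_days)
    index = round(confidence * (len(ordered) - 1))
    return ordered[index]

def fits_sla(expected_days, sla):
    """The right-sizing question: is this item expected to fit within our SLA?
    If not, that is the trigger for a splitting conversation."""
    return expected_days <= sla

history = [2, 3, 3, 4, 5, 5, 6, 8, 9, 14]    # days it took to deliver recent items
sla = sla_days(history)                       # 9 days at roughly 85% confidence
print(fits_sla(expected_days=7, sla=sla))     # True  -> pull the work
print(fits_sla(expected_days=20, sla=sla))    # False -> have the splitting conversation
```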

It's that simple. However, if you are experiencing problems with predictability and expanding delivery times, you'll want to deal with those. Dan's book describes how Little's Law and its components give you ample information to help discover issues in your process.
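For reference, Little's Law ties those components together: average cycle time = average work in progress ÷ average throughput. A back-of-the-envelope check (the numbers are invented) looks like this:

```python
avg_wip = 12            # items in progress, on average
avg_throughput = 1.5    # items finished per day, on average
avg_cycle_time = avg_wip / avg_throughput
print(avg_cycle_time)   # 8.0 days: if that feels too long, limiting WIP is the obvious lever
```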

Story Splitting is one option to start with; you can consider others like this. I think it will help you understand your work at a higher level of fidelity. Your customer will hopefully feel the difference as you deliver more meaningful increments of value from that increased understanding. Coincidentally, you'll also be on your way to becoming a better Systems and Lean thinker, both interesting subjects that will help you be of best service to your customers.


What makes you Fit for Purpose?

Can someone answer this question? Well yes, I can help. In short, it's what you do, individually but mostly as a team, to ensure that anyone you serve receives the service you provide in a reasonable time frame and with appropriate quality.

Where does that start?

It starts with understanding who your customer is and taking steps to learn about them and what satisfies them. There are tools available to help, such as Customer Surveys, Customer Interviews, Customer Empathy Maps, Personas and going to see for yourself.

Somewhat lagging, but still providing information about the customer and their needs (which can be and probably are changing), is a Net Fitness Score, an alternative to Net Promoter Score. If you are producing software, you can add code to capture data about how your customers are using the application. This is part of a process called instrumentation.
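As a minimal sketch (the event names and the plain-log destination are invented for illustration; a real product would more likely send these to an analytics service), instrumentation can start as simply as recording named usage events:

```python
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
usage_log = logging.getLogger("usage")

def track(event, **details):
    """Record one usage event so we can later see how customers actually use the product."""
    usage_log.info(json.dumps({
        "event": event,
        "at": datetime.now(timezone.utc).isoformat(),
        **details,
    }))

# Hypothetical call sites sprinkled through the application:
track("report_exported", format="csv")
track("search_returned_no_results", query_length=42)  # zero-result searches hint at unmet needs
```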

This takes us into the area of measurement.  In addition to customer measures, a team can also use other metrics to ensure delivery is just right.  By just right, we mean that any feelings of over-burdening (muri) are minimised to ensure that a team can sustainably deliver work.

Some measures here include service response times (cycle time) for the different types of requests a team gets. Are we able to deliver those reliably? By reliable we mean within the realms of probability, not exact measures. Knowledge work being naturally variable, we tend to defer to probabilities, such as a Service Level Agreement: 85% of the time we can deliver in 3 days, for example.

In aiming for better on-time delivery you may need to eliminate muda, or wasteful activities. You may find that amplifying collaborative activities and learning new skills will help. These types of improvements stem from understanding the nature of the different requests: demand (high and low periods), expectations of quality and when requests are expected to be fulfilled (Cost of Delay).

Another measure is acceptable defect levels, with the aim of reducing these to a negligible level. Defects may need to be balanced against responsiveness. If you require greater responsiveness, then Fit for Purpose may mean accepting a higher failure load (another name for total defects). Responsiveness may also mean less predictability, and some work may have a wider range of delivery-date performance.

If failure load is high, then addressing quality can also have a bearing on on-time delivery. In software development that means ensuring little or no technical debt. High levels of technical debt lengthen cycle times as a team wrestles with the complexity of debt-laden software. Continually reducing and maintaining low levels of technical debt will help maintain reliable delivery. It will also allow innovation to occur, because the team is freed from the burden of low quality.

Addressing these and becoming reliable means you will have the confidence to communicate service-level expectations within reasonable levels of probability. Doing this with appropriate quality will often result in plaudits for the team and reverse what may be many sources of dissatisfaction for customer and team alike. Find out what makes your system of work Fit for Purpose. Work hard on reaching that level. Agility will be a natural result.


Webinar: Case Study Agile Testing results in DevOps Success

I recorded a webinar for CodeGenesys on a case study in Agile Testing involving my Australian client, AgWorld.

Hopefully it conveys that quality is owned by everyone and quality starts well before a line of code is written.

Quality right through the value stream is a cornerstone of DevOps success. It carries on through the software development life cycle: starting with requests and turning them into executable specifications with BDD/ATDD, automating unit and integration tests, and using automation tools (build and deploy) to increase the delivery rate of completed software and the frequency of feedback loops.

 


The Importance of Shared Purpose

My latest blog is actually one written for my employer here in the United States, Code Genesys.

You can take a look at it here; it's on the importance of shared purpose. Keep practicing it, because it's hard for first-timers and, for that matter, for anyone who's experienced. Well worth the investment in time 🙂

I also have an example of a company that goes to great lengths to maintain its purpose, in an old blog article written two years ago.


Agile before it was cool

This is a page from a great book by an important author in the software development field, Tom Gilb, called Principles of Software Engineering Management. It came out in 1988.

[Photo: a page from Principles of Software Engineering Management]

The language is reminiscent of the time and of the practices in place then, but it was groundbreaking in that it talks a lot about what we call Agile values today. In fact, Tom Gilb invented his 'agile' and 'iterative' EVO methodology in the early 1970s; it's mentioned in the book as well. This was well before Agile became what it is today – a business. Tom was so far ahead of his time; I remember that back then iterative development was never mentioned in schools or in the workplace.

And therefore, I think this Bill of Rights still holds relevance. I made the challenge on Twitter, and some thought point 9 was not relevant, rather a relic of the past. I tend to think it was misinterpreted in the Twitter responses. Here's the link to the Twitter feed and a snapshot below.

 

There is power in this for the worker here, even on performance.  Leaders can use this to create an ethos of transparency.  Looking back to look forward.