As a manager, how do you find yourself making decisions about new initiatives?
Figuring out the right thing to do is challenging in dynamic environments where human behaviour plays a significant role in determining the success or failure of an initiative. We want to operate in a way that lowers the risk of failure, yet there is no reliable way to predict the very human behaviour on which that success depends.
We have seen iterative processes emerge over the last 50 years – from the Toyota Production System to Agile software development and design thinking in user experience. Why have these iterative approaches been so successful? The answer lies in recognising that there is no way to know all the details about the end at the beginning. In the Toyota Production System, it was recognised that all processes need to be adapted and optimised by front-line factory employees in constant pursuit of excellence – a level of detail and insight that no process concocted in a manager’s head could ever have. The same realisation drove the move away from the waterfall methodology in software development, and again underpins human-centred design. All of these systems recognise that there is simply too much we don’t know at the start – and an iterative process stops us pretending that we can somehow know this information upfront.
How then do we take this principle of iteration and apply it to business initiatives when we are expected to make annual plans and define annual budgets? How do we iterate in a corporate environment that implicitly expects certainty?
The answer is to approach these initiatives as questions rather than solutions. More precisely, treat them as data-driven hypotheses.
a supposition or proposed explanation made on the basis of limited evidence as a starting point for further investigation.
…or a possibly better definition from philosophy…
a proposition made as a basis for reasoning, without any assumption of its truth.
Example of a business initiative:
“Provide customer self-service to reduce operating costs and improve customer satisfaction”
Example of an initiative as a hypothesis:
“Would customer self-service reduce operating costs and improve customer satisfaction?”
Next, identify measures of success – the obvious ones here are operating cost and customer satisfaction. The real question is: which self-service activities could reduce operating cost and improve customer satisfaction the most? From this, we can direct our experimentation towards the smaller activities that provide the greatest possible impact for the organisation while we do less.
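As a minimal sketch of that prioritisation step, the snippet below ranks candidate self-service activities by a rough estimate of monthly cost saving. The activity names, contact volumes, costs and deflection rates are all illustrative assumptions – in practice they would come from your own contact-centre data.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    monthly_contacts: int    # how often customers contact us about this today
    cost_per_contact: float  # average handling cost of one contact
    deflection_rate: float   # estimated fraction self-service could absorb

    def monthly_saving(self) -> float:
        # Expected operating-cost reduction if self-service absorbs
        # the estimated share of contacts.
        return self.monthly_contacts * self.cost_per_contact * self.deflection_rate

# Illustrative figures only – real values come from your own operations.
candidates = [
    Activity("password reset", 12_000, 4.50, 0.80),
    Activity("invoice copy", 3_000, 6.00, 0.60),
    Activity("change of address", 1_500, 8.00, 0.40),
]

# Experiment first where the potential impact is largest.
ranked = sorted(candidates, key=Activity.monthly_saving, reverse=True)
for a in ranked:
    print(f"{a.name}: ~${a.monthly_saving():,.0f}/month")
```

Even a back-of-the-envelope ranking like this focuses the experimentation on the few activities worth prototyping first.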
Transforming statements to questions is something we want to keep working at every step of the way. This is where design thinking comes in. Now we can use a well-defined framework for better understanding a customer through empathy – observing where we perceive pain points to further develop our understanding. We can also use the ideation skills that are part of design thinking to further develop a hypothesis about how we might improve customer experience.
Now we have moved from a simple statement to using questions, data and research to drive us to a set of hypotheses on how we might improve the customer experience for the most valuable self-service areas. Wait a minute – now we have a bunch of possible solutions rather than one. Which hypothesis should we choose? The answer is to choose a few – maybe even all of the possibilities.
Now take those ideas and iterate through a prototyping phase on each one, designing rough implementations to the point where you can get real, measurable customer feedback. The first round might be as simple as paper sketches, digital wireframes or a conversational script. Whatever it is, measure the performance of the prototypes relative to one another so you can prioritise where your effort goes next. The winners move to the next stage. For the less successful ideas, poor results may simply reflect incomplete execution – so if you really believe in one, prove it: improve the quality, test again, and see whether that changed its performance. It always has to come down to data and performance.
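The relative comparison above can be sketched in a few lines. The prototype names, the task-completion metric and the numbers here are all hypothetical – substitute whatever success measure you defined for the hypothesis (task completion, satisfaction score, and so on).

```python
# Hypothetical prototype test results: (participants, successful completions).
results = {
    "paper sketch A": (20, 9),
    "wireframe B": (20, 14),
    "chat script C": (20, 11),
}

def completion_rate(outcome):
    participants, completions = outcome
    return completions / participants

# Rank prototypes against one another, not against an absolute target:
# the relative ordering tells us where the next round of effort goes.
ranking = sorted(results.items(), key=lambda kv: completion_rate(kv[1]), reverse=True)

winners = [name for name, _ in ranking[:2]]  # advance the top performers
revisit = [name for name, _ in ranking[2:]]  # re-work or retire the rest
```

The point of the sketch is the design choice, not the arithmetic: prototypes compete with each other for the next round of effort, so a losing idea either improves its measured performance on a re-test or gets retired.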
As we progress further, we have fewer yet more advanced experiments, and we are informed as much by the ones that did not succeed as by the ones that did. What do customers really respond to? We get a better and better idea as we develop our hypotheses to the point where we can measure them in depth against existing production environments.
If you consider what has occurred throughout this process, it is quite remarkable: we have dramatically reduced the risk of sinking money into an ineffective strategy by using an iterative process to converge, through experimentation, on a higher-performing solution.
So, what does this mean for the planning phase of corporate initiatives?
Fostering a data-driven culture that accepts experimentation is critical, and it is challenging in a very traditional corporate business culture. Try introducing the concept of ‘hypothesis-driven business planning’ without those words. Instead, focus on framing strategy and initiatives as questions, and work out how success will be measured first. Then, rather than making big commitments to very specific implementations, focus on quickly developing a portfolio of hypotheses that can be pursued in a time box with clear criteria for success. This approach is the easiest way to reduce risk in your corporate initiatives.
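A time-boxed portfolio of hypotheses can be as simple as a list with three agreed fields per item: the question, the metric, and the review date. The sketch below is one possible shape – the questions, targets and dates are invented for illustration.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Hypothesis:
    question: str        # the initiative framed as a question
    success_metric: str  # how we will measure the answer
    target: float        # what "success" means, agreed up front
    review_date: date    # the time box: when we decide to persevere or stop

# Illustrative portfolio entries only.
portfolio = [
    Hypothesis(
        question="Would self-service password reset cut support costs?",
        success_metric="fraction of support contacts deflected",
        target=0.5,
        review_date=date(2025, 9, 30),
    ),
    Hypothesis(
        question="Would proactive delivery updates raise satisfaction?",
        success_metric="CSAT uplift (points)",
        target=5.0,
        review_date=date(2025, 9, 30),
    ),
]

def due_for_review(portfolio, today):
    # Hypotheses whose time box has elapsed must be judged against
    # their pre-agreed success criteria – no quiet extensions.
    return [h for h in portfolio if today >= h.review_date]
```

Writing the success criterion and the review date down before any work starts is what turns an initiative into a genuine hypothesis rather than a commitment.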
Feature image thanks to Pixnio.com