This section of the portal is for supporting the Disciplined Agile Value Stream Consultant Workshop (DAVSC), currently under development. Discussions on the pages here will take place on the Disciplined Agile LinkedIn group.
Developing new products, services, and software is a complex endeavor, which means we can never know for sure what’s going to happen. Many layers of activity go on at the same time, and it’s hard to see how each relates to the others. Systems are holistic: they can’t be understood just by looking at their components. Instead, we must look at how the components of the system interact with each other. Consider a car, for example. A car has components, but the car’s behavior also depends on how those components interact. Putting a bigger engine in a car might make the car unstable if the frame can’t support it, or even dangerous if the brakes are no longer sufficient.
These relationships, however, come in different degrees of predictability:
- Simple – you do something and the result is obvious. For example, release a held ball and it falls.
- Complicated – the relationships are individually understandable, but there are so many of them that the overall picture is difficult to see, even when seeing it is possible. A Rube Goldberg machine is a great example of complicated.
- Complex – not all of the relationships may be clear, and even those that are may not interact the way you think they will. Complex systems are, almost by definition, unpredictable.
- Chaotic event – a small event causes a big result: the proverbial “straw that broke the camel’s back.” This is distinct from chaos, where one can’t tell what’s going on at all. In knowledge work, a misunderstood requirement is a common example.
(Note for the reader familiar with Cynefin: this is not intended to be a variant of it. These concepts predate Cynefin by decades and are used here in a different manner than Cynefin approaches them.)
Knowledge work can be thought of as an integration of several systems:
- How people interact with each other
- How work being done in one part of the system affects the work in others
- How people learn
- How people in the system interact with people outside of the system
These interactions are unique to a particular company. The principle of “context counts” means we must make intelligent choices based on the situation we are in. But how? We just stated that a large part of our system is unpredictable.
We first recognize that we’re just trying to improve the predictability of what will happen. That means we want to attend to what Don Reinertsen (Reinertsen, 2009) calls “macro-predictability” as opposed to “micro-predictability.” Micro-predictability is the degree of predictability of specific actions – for example, whether a roulette ball will land on black, red, or green. Macro-predictability is the degree of predictability over time – for example, we can be quite sure that, over many spins, the house keeps more money than it pays out.
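The roulette example can be made concrete with a small simulation (a hypothetical sketch, not from the source; the wheel layout and bet are standard European roulette, the function names are our own). A single spin is effectively unpredictable, yet the long-run average result of betting on red converges to the house edge of about −1/37 per unit staked:

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

def spin():
    """One spin of a European wheel: 1 green pocket, 18 red, 18 black."""
    pocket = random.randrange(37)  # 0 = green, 1-18 = red, 19-36 = black
    return "green" if pocket == 0 else ("red" if pocket <= 18 else "black")

def bet_on_red(stake=1):
    """Net result of a one-unit even-money bet on red."""
    return stake if spin() == "red" else -stake

# Micro-predictability: any single call to bet_on_red() is a coin-flip-like
# gamble. Macro-predictability: the average over many bets is reliably
# negative, close to the house edge of -1/37 (about -0.027 per bet).
n = 200_000
average = sum(bet_on_red() for _ in range(n)) / n
```

The individual outcomes stay unknowable; only the aggregate becomes dependable, which is exactly the level at which we try to make improvement decisions.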
Given our lack of micro-predictability, we want to take a scientific approach, be agnostic as to our methods, and avoid our cognitive biases. We offer up each potential improvement as a hypothesis that it will make a positive difference. When we try it, we treat it as an experiment that tests whether our understanding was correct. Either we get an improvement or we learn something.
Deciding on these hypotheses is often based on looking at workflows and how people interact with each other. We also have to attend to people’s experience levels and avoid pushing them beyond their abilities. For example, multi-tasking is bad for efficiency, creates additional work, and causes unpredictability, so we’d like to reduce it. But how? Multi-tasking is usually caused by people working on too many things at once; unable to finish any of them quickly, the items conflict with each other. Macro-predictability tells us that reducing this overload would be a good thing. We can reflect on our situation, make a choice based on principles applied to our context – for example, are people assigned to too many projects? – and see whether we get an improvement. Our choices are guided both by our understanding of the general principles of Lean and Flow and by our context. Our actions will lead either to improvement or to learning.
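The cost of multi-tasking can be seen in a toy model (hypothetical numbers and function names, ours not the source’s): one worker, three tasks of ten units each, finished either one at a time or by round-robin switching. Total elapsed time is the same, but switching delays every completion – and this sketch doesn’t even charge for the overhead of each context switch, which in practice makes multi-tasking worse still:

```python
def sequential_completion_times(tasks):
    """Work each task to completion before starting the next."""
    t, done = 0, []
    for size in tasks:
        t += size
        done.append(t)
    return done

def round_robin_completion_times(tasks, slice_=1):
    """Switch among all open tasks, one time-slice each, until all finish."""
    remaining = list(tasks)
    done = [0] * len(tasks)
    t = 0
    while any(r > 0 for r in remaining):
        for i, r in enumerate(remaining):
            if r > 0:
                work = min(slice_, r)
                remaining[i] -= work
                t += work
                if remaining[i] == 0:
                    done[i] = t
    return done

tasks = [10, 10, 10]
seq = sequential_completion_times(tasks)   # first task done at 10, last at 30
rr = round_robin_completion_times(tasks)   # nothing done until near the end
```

Sequentially, tasks finish at times 10, 20, and 30; under round-robin they finish at 28, 29, and 30, so the average completion time rises from 20 to 29 even though the worker was never idle. Reducing work in process moves value out the door sooner.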
Creating a series of small steps, each validated in the context of the organization, leads to effective emergent change. We can guide these steps with Dr. Goldratt’s concept of inherent simplicity (Goldratt, 2008): the presumption that, inherent in complex systems, there are rules that, once understood, enormously simplify how we can create potential solutions to the challenges in our system. Inherent simplicity already exists; we must find it and take advantage of it. Doing so enables us to increase performance and reduce or eliminate the challenges we are facing. In knowledge work we have found that looking at the following is very useful for understanding what’s happening:
- The extent of focus on customer value
- How workload relates to capacity
- Efficiency of the value streams
- The batch size being worked on
- Visibility of work and workflow
- Level of collaboration present
- Quality of the product
These “factors for simplicity,” as we refer to them, are reflected in many of the principles, promises, and guidelines discussed in this chapter – in particular: be enterprise aware, create a safe environment, improve predictability, improve continuously, validate our learnings, attend to relationships throughout the value stream, and adopt measures to improve outcomes. This does not mean we achieve full predictability, of course. Our goal is to improve both our process and our predictability through learning – improved behavior emerges as we learn. While we believe we don’t need to make random changes to see what improvement we’ll get, we are also guided by two maxims:
It is difficult to make predictions, especially about the future. – Mark Twain
For every complex human problem, there is a solution that is neat, simple and wrong. – H. L. Mencken
In other words, we move forward with caution while taking advantage of what we know.