Idea 43 - Systems thinking



Some farmers discovered systems thinking the hard way. With their crops being devoured by insects, they reached for the spray gun and blasted them with pesticide. And it worked - for a time. But then the crop damage returned, worse than ever, and the pesticide that had been so successful no longer had any effect. As it happened, the insect that was eating the crops had also been eating, or competing with, another insect. Now that insect no. 1 was out of the way, insect no. 2 was having a field day. 'Systems thinking' says that things are more complicated than they seem and that actions can have unforeseen and unintended consequences.

 

Systems thinking recognizes that no man - or insect - is an island, and that there is an interconnectedness in social and natural processes that is not always immediately apparent. 'Linear' thinking operates in a straight line. It says that if you do A to B, the result will be C. Systems thinking says that if you do A to B, it may also influence D and E, resulting in F - except that F may take some time to show up.

 

 

Systems thinking comes out of 'system dynamics', the child of American computer engineer Jay Forrester. He studied how even simple systems could behave in a surprisingly nonlinear way, publishing a paper on 'Industrial dynamics' in 1958. More recently, Peter Senge has looked at the way systems thinking and systems awareness can help people work together more productively towards common goals, in learning organizations.

 

 

Forming a circle

Systems thinking sees a process as a system, not as a straight line but as a loop or series of interconnected loops. The system links people, institutions, processes, and so on, but they themselves are not the point - the point is the influences they have on each other. Senge argues that the roots of the 'war on terror' lie not in rival ideologies but in a way of thinking shared by both sides.

 

 

The linear thinking of the US establishment is that terrorist attacks cause a threat to Americans, which causes a need to respond militarily. Terrorist thinking says US military activity causes a perception of US aggressiveness which makes people willing to become terrorists. In fact, those two straight lines form a circle, a system of variables that influence each other – a perpetual cycle of aggression. 'Both sides respond to perceived threats', Senge says. 'But their actions end up creating escalating danger for everyone. Here, as in many systems, doing the obvious thing does not produce the obvious, desired outcome.'

 

 

How often could you say the same about problem-solving in the workplace? A key idea in systems thinking is 'feedback' which, confusingly, is a very different thing from the feedback you might want to obtain from the customer. Instead, it's the flow of influence between each player in the system. Every influence is both a cause and an effect. The disappearance of insect no. 1 was the effect of pesticide application and the cause of a resurgence of insect no. 2. This chain of cause and effect eventually creates a loop.

 

 

Reinforcing feedback causes escalation, and small doses of it can amplify into large results, for better or worse - the vicious circle. Self-fulfilling prophecies are examples of reinforcing feedback at work, and so is the escalating tension between the US government and terrorists. Balancing feedback acts to stabilize the system and is the result of goal-oriented behaviour. If you are travelling at 60 miles per hour but want to drive at 50, that desire will 'influence' you to apply the brakes. If you're travelling at 40, it will cause you to put your foot down, but only until you reach 50. That's an explicit balancing feedback system. Implicit balancing feedback may be the reason why the system resists all your efforts to change it. Delays, another key idea, are often present in feedback, interrupting the flow of causes or influences so that consequences only appear gradually.
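
To make this concrete, here is a minimal sketch (in Python; the function name and parameters are illustrative assumptions, not anything from Senge) of the driver example: a balancing feedback loop that nudges speed towards a goal of 50 mph. The second run adds a delay, so each correction is based on a stale speed reading, and the speed overshoots and oscillates around the goal before settling - the kind of behaviour delays introduce.

```python
# Balancing (goal-seeking) feedback: each step the driver corrects part of the
# gap between the goal speed and the speed they currently perceive. With a
# delay, the perceived speed is several steps out of date, so corrections
# overshoot and the speed oscillates around the goal before damping out.

def balancing_loop(start_speed, goal=50.0, gain=0.5, delay=0, steps=20):
    """Simulate a balancing feedback loop (illustrative sketch, not from the book).

    gain  - fraction of the perceived gap corrected each step
    delay - how many steps out of date the driver's speed reading is
    """
    history = [start_speed]
    for _ in range(steps):
        observed = history[max(0, len(history) - 1 - delay)]  # possibly stale reading
        correction = gain * (goal - observed)  # brake if too fast, accelerate if too slow
        history.append(history[-1] + correction)
    return history


if __name__ == "__main__":
    print("instant feedback:", [round(s, 1) for s in balancing_loop(60.0)])
    print("delayed feedback:", [round(s, 1) for s in balancing_loop(60.0, delay=2)])
```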

 

 

You can spot system dynamics at work in many guises, including the way in which a solution turns into a problem somewhere else in the system. A new manager 'solves' the problem of high inventory costs by reducing stocks, but the sales force ends up spending more time dealing with customers upset over late deliveries. Or sales are way down in the fourth quarter because of the huge success of the discount programme in the third, which prompted customers to bring forward their purchases. Senge tells of how the seizure of a large narcotics shipment caused a new wave of street crime: it reduced the drug supply, which pushed up prices, which in turn drove desperate addicts to more crime to keep funding their habit.

 

Pushing back

Reinforcing feedback takes place when managers' expectations influence their subordinates' performance. You think someone has high potential, so you take special care to help them develop it. They do. You feel you were right all along and help them some more. It happens in reverse, too, with poor performers 'justifying' a lack of attention. 'The harder you push, the harder the system pushes back' is how Senge describes a typical balancing feedback system. He tells of a friend who tried in vain to reduce burnout among the professionals working for his busy training business, shortening the hours and locking the offices. No good. They took work home and defied the shortened hours. Why? Because of an unwritten norm: the organization's true heroes, the ones who got ahead, worked 70 hours a week - the example the boss himself had set.

 

 

These are simple cases. Systems within big organizations can be much more complex. Businesses have powerful and sophisticated forecasting, planning and analysis tools at their disposal and yet these fail to detect the causes of some of the most trying problems. That, says Senge, is because they are designed to handle the kind of complexity that has many variables - detail complexity. But there is another kind, which the tools are not designed to cope with - dynamic complexity - where cause and effect are subtle and the effects of interventions over time are not obvious.

 

 

Dealing with this demands a 'shift of mind', Senge declares. The essence of systems thinking, he says, is to see interrelationships rather than linear cause-effect chains, and to see processes of change rather than just snapshots.



