Moving from a system designed for robustness to one that supports resilience represents a significant strategic shift. Systems have commonly been designed to be robust, meaning designed to prevent failure; but increasing complexity, and the difficulty it poses for fail-proof planning, has made a shift to "resilience" strategically imperative. A resilient system, by contrast, accepts that failure is inevitable and focuses instead on discovering failures early and recovering from them quickly.
Expert entrainment can be either good or bad depending on the domain in which it is applied. Dave Snowden's video explains why. It is not only instructive but also humorous. The main points I took away were:
- Despite having a plausible theory and good empirical evidence, uptake of a new idea is not a slam-dunk. In many instances, external pressure is needed to drive change.
- Mental filters cloud our thinking. Adapting one set of mental tools to solve a problem in another domain can fail when you are operating in the complex domain.
- And finally, the Welsh discovered America.
Leading change effectively means countering expert entrainment by introducing competing perspectives. Enjoy…