The holy grail for most organisations offering executive education interventions is knowing, and being able to justify, whether they really work. Reflecting on what traditionally happens, this paper sets out to propose a more imaginative and relevant way of tracking the success (or otherwise) of executive education interventions. We suggest the focus needs to be on what is learnt, not on how people feel, and more in tune with the means of design, not just the ends.
A recent McKinsey report, "Why Leadership Development Programmes Fail", highlighted the following research findings:
- US companies spend almost $14 billion annually on leadership development.
- Customised leadership development offerings from top business schools can reach $150,000 a person.
- 500 executives ranked their top three human-capital priorities; leadership development was included as both a current and a future priority.
- Only 7 percent of senior managers polled by a UK business school think that their companies develop global leaders effectively.
- Around 30 percent of US companies admit they lack enough leaders with the right capabilities.
In his article "Getting Beyond the BS of Leadership Literature", Professor Jeff Pfeffer of Stanford University states that the US spend could actually be as much as $50 billion. He goes on to say: "Leaders aren't doing a good job for themselves or their workplaces, and things don't seem to be improving."
From this evidence, a conscientious Leadership Development professional may be perfectly entitled to conclude that investing in Leadership Development Programmes and interventions is an expensive waste of time and money with questionable returns.
The counter argument is that whilst the cost of Executive Education may appear high, the cost of Executive ignorance is significantly higher. That doesn’t, however, negate the need for the pursuit of more helpful, relevant and accurate measures of success.
In his article "The Corporate Leadership Landscape", Tim Coburn highlights the complex contextual environment today's Executives have to operate in. There is no doubt they live in a world where the shelf life of knowledge is getting ever shorter, and the need to learn continuously is a must, not a nice-to-have. If Executives are to remain valuable, they need to focus on the complex, not the repetitive; the unknown, not the known; and increasingly on the future, not today… because "today" is being standardised, outsourced, automated and digitised.
Whichever report you choose to read, there is little doubt that Executive Development is crucially important. The growing dilemma facing all those responsible for it is: how do we know if our investment will provide the value and returns we seek? The answer, of course, is that there is no simple answer. Having said that, we would like to show that there are many surrogate measures which can help point the way and, taken in the round, provide strong evidence of progress.
The Purpose of Executive Education
Given the challenges faced by the leaders of today’s companies, we believe the sole purpose of executive education should be to enable executives to learn and learn how to learn, and in doing so apply that to improving themselves and their organisations.
For individual leaders, the ability to learn has already been identified as the strongest factor in determining their potential to succeed. And for organisations, the ability to adapt and change is critical to their survival.
The Traditional Approach
The most widely used method of measuring the effectiveness of Executive Education Programmes is the classic “Happy Sheet”, or delegate feedback form. These are designed to gather feedback on the quality of content, speakers, learning experiences, the venue and administrative support. Using a combination of closed (rating scale) and open questions, delegates are asked to evaluate and report on their experience.
The fundamental design flaw, and perhaps unintended consequence, of this sort of measurement mechanism is that it invariably forces Executives to make a one-dimensional judgement: good/bad, yes/no, like/don't like. It does not invite more valuable self-reflection on their own learning experience and feelings.
If the purpose of executive education is to create people with the ability to learn how to learn, then reinforcing such a judgemental approach is damaging and counterproductive to that aim.
Evaluating a delegate's learning experience is crucial, but in doing so we should be asking questions with a known correlation to improved thinking and behaviour in job performance, such as those identified by ABDI's research:
- Were the personal and collective objectives achieved? (effectiveness)
- Was the content relevant to your role? (relevance)
- Do you have an implementable action plan to apply your new insights? (action plan)
- Were you challenged and supported in relation to your individual and collective assumptions? (insights)
- Would you recommend this programme to colleagues? (willingness to recommend)
As useful as these questions are, however, we believe they only go part of the way in addressing the real purpose of executive education.
A Different Perspective
Given the emphasis on learning – as well as the emphasis on improving performance – knowing if Executive Education Programmes work becomes a function of both means and ends. By means, we refer to the underpinning pedagogical and design principles and processes of any intervention. The way a programme is designed will dictate whether ‘learning to learn’ and ‘improving yourself and your organisation’ actually happens.
The design principles we believe in at Accelerance ensure we address the purpose of executive education as we see it. In doing so, they also provide an alternative set of measurement criteria that can be used for evaluating impact. These principles include:
- The importance of working on real issues – identifying real business challenges which allow teams to anchor their learning in "doing" and to produce outputs that may be adopted in the business. The goal is to apply new insights and learning in the creation and delivery of well-structured projects with clear measures of success that will change the business.
Measure: The number and quality of projects worked on. Did the programme help address and resolve real business issues effectively, with imaginative solutions based on new insights?
- Co-creation and the involvement of senior leaders/sponsors in the design and delivery – this facilitates a significant degree of ownership and intimacy with the programme and its delegates allowing line manager and sponsors to see the change in behaviour and thinking over time.
Measure: Were senior leaders engaged and involved in shaping the programme? How were their needs and perspectives considered? How were senior leaders and sponsors supporting delegates before, during and after the intervention? What changes did sponsors see in delegates?
- Live testing and experimentation – translating insight and ideas into action under experimental conditions is not only a way of understanding whether new ideas will work, but also another reference point for "ends": the number of ideas generated, experiments undertaken and new ideas implemented.
Measure: Did the programme include live testing and experimentation? How many experiments, new ideas or new ways of working were tested as a result of the programme?
- Emotional engagement and reflection – effective learning is a function of the level of both intellectual and emotional engagement. Facilitating interventions such as learning sets allows delegates to connect at both an intellectual and a deeply emotional level. Peer-to-peer interaction identifies reference points of change from the moment the learning sets start to the moment they finish. Providing ample time for reflection also allows delegates to record their thoughts, feelings, impressions and emotions before a programme starts and again at the end. This can be captured using interview techniques or video recordings.
Measure: Did the programme allow the development of learning relationships that enabled the development of thoughts and feelings as part of the learning journey? Were qualitative observations captured before, during and after the intervention to assess "shift"?
- Asking and enquiring, not telling – programmes designed to actively engage delegates in the learning process, rather than simply transmit knowledge, have the opportunity both to see engagement in action and to measure levels of engagement throughout the intervention.
Measure: Did the programme encourage and enable inquiry-led learning driven by participant curiosity? How did that process work for the delegates, and how did the quality of questioning change individually and collectively over the life of the intervention and beyond? How did the proportion of time shift over the life of the intervention from "transmit" to "converse"?
- Creating "thirsty learners" – programmes that help delegates learn how to learn have a greater probability of getting feedback from line managers and colleagues in both formal and informal contexts. Delegates who are comfortable actively seeking feedback as part of a learning process receive more insight into their own performance.
Measure: Did the programme improve your ability to learn in ways you can transfer and apply in your job? What was the level of active feedback-seeking during and after the intervention?
- Moving away from your comfort zone – delegates invariably learn more from what is unusual or unorthodox to them than from staying with the familiar. This can be an extremely rewarding but also very uncomfortable experience (hence our view that questions referencing a good/bad experience should be avoided). Learning through discovery and exploring new perspectives is one of the most powerful ways of helping leaders reframe their views.
Measure: Identify the specific insights that came from any discovery experience. How many new insights were generated? How relevant might they be to moving forward your personal and business challenges?
The list of design principles referenced is by no means exhaustive and is only a sample of those used by Accelerance to construct and deliver its work. When executive education is designed with principles like these, traditional “happy sheet” questions become less helpful when seeking to find out if a programme has achieved its real purpose. Given the importance of Executive Development to the future of every business, it’s important we move away from simplistic and often misleading measurement methods and adopt a more holistic view of what can be measured based on more enlightened design principles and practices.