
Friday, December 31, 2010

26 lessons about RBM from the 1990s remain valid today

[Updated November 2018]
Greg Armstrong --


Lessons learned about RBM in the last century remain valid in 2018.

Implementing Results-Based Management: Lessons from the Literature – Office of the Auditor-General of Canada

Level of difficulty: Moderate
Primarily useful for: Senior Managers of partner Ministries and aid agencies
Length: Roughly 18 pages (about 9,000 words)
Most useful section: Comments on the need for a performance management culture
Limitations: Few details on implementation mechanisms

The Office of the Auditor-General of Canada deals with the practical implications of results-based management, or with the failure of agencies to use RBM appropriately, as it conducts performance audits of a large number of Canadian government agencies. The Auditor-General's website, particularly the Audit Resources section, holds several documents under “Discussion Papers” and “Studies and Tools” that are a reminder that many of the lessons learned about RBM fifteen years ago remain relevant today.


26 Lessons on RBM - Reviewed by Greg Armstrong


The paper Implementing Results-Based Management: Lessons from the Literature  provides a concise and relatively jargon-free summary of lessons from practical experience about how to implement results-based management. Its purpose, as the introduction notes, was to “assess what has worked and what has not worked with respect to efforts at implementing results-based management”.   It is shorter and more easily read than some of the useful but much longer publications on RBM produced since, and could be useful to agency leaders wanting a reminder of where the major pitfalls lie as they attempt to implement results-based management and results-based monitoring and evaluation systems.

The lessons reported here about the implementation of results-based management remain as valid today as they were in 1996, when a first draft was produced, and in 2000, when this document was released.




Many of the lessons described briefly here are derived from studies of the field activities of agencies in North America, Europe, and the Pacific, going back at least twenty years. The 2000 paper is based on reviews of 37 studies on lessons learned about RBM, themselves published between 1996 and 1999, and builds on an earlier study, referenced briefly here, which reviewed 24 more studies produced between 1990 and 1995.

More recent reviews of how RBM or Managing for Development Results is -- or should be -- implemented in agencies such as the United Nations, including Jody Kusek and Ray Rist’s 2004 Ten Steps to a Results-Based Monitoring and Evaluation System and Alexander MacKenzie’s 2008 study of problems in implementing RBM at the UN country level, build on and elaborate many of the points made in these earlier studies, moving from generalities to more specific suggestions on how to make operational changes.

The 2000 paper from Canada's Office of the Auditor-General lists 26 lessons on how to make RBM work, many of which repeat and elaborate on the lessons learned earlier. The lessons on effective results-based management, as presented here, are organized around three themes:

  • Promoting favourable conditions for implementation of results-based management
  • Developing a results-based performance measurement system
  • Using performance information

A brief paraphrased summary of these lessons will make it obvious where there are similarities to the more detailed work on RBM and results-based monitoring and evaluation done in subsequent years. My comments appear as bullet points below each lesson:



Promoting Favourable Implementation Conditions for RBM


1. Customization of the RBM system: Simply replicating a standardised RBM system won’t work. Each organization needs a system customized to its own situation.

  • The literature on the implementation of innovations, going back to the 1960s, confirms the need for adaptation to local situations as a key element of sustained implementation.

2. Time required to implement RBM: Rushing implementation of results-based management doesn’t work. The approach needs to be accepted within the organization, indicators take time to develop, data collection on the indicators takes more time, and results often take more time to appear than aid agencies allocate in a project cycle.

  • Many of the current criticisms of results-based management in aid agencies focus on the difference between the time it takes to achieve results, and aid agencies’ shorter reporting timelines.

3. Integrating RBM with existing planning: Performance measures and indicators should be integrated with strategic planning, tied to organizational goals and management needs, and performance measurement and monitoring need high-level endorsement from policy makers.

  • Recent analyses of problems in the UN reporting systems repeat what was said in articles published as long ago as 1993. These lessons have evidently not been internalised in some agencies.

4. Indicator data collection: We should build management systems that support indicator data collection and results reporting and, where possible, build on existing data collection procedures.

5. Costs of implementing RBM: Building a useful results-based management system is not free. The costs need to be recognised and concrete budget support provided from the beginning of the process.

  • This is something most aid agencies have still not dealt with. They may put in place substantial internal structures to support results reporting, but shy away from providing implementing agencies with the necessary resources of time and money for things such as baseline data collection.

6. Location for RBM implementation: There are mixed messages on where to locate responsibility for coordinating the implementation of RBM. Some studies suggested that putting control of the performance measurement process in the financial management or budget office “may lead to measures that will serve the budgeting process well but will not necessarily be useful for internal management”. Others said that responsibility for implementation of the RBM system should be located at the programme level to bring buy-in from line managers, and yet another study made the point that the performance management system needs support from a central technical agency and leadership from senior managers.

  • The consensus today is that -- obviously in a perfect world -- we need all three: committed high-level leadership, technical support, and buy-in from line managers.

7. Pilot testing a new RBM system: Testing a new performance management system in a pilot project can be useful before large-scale implementation – if the pilot reflects the real-world system and participants.

8. Results culture: Successful implementation requires not simply new administrative systems and procedures but the development of a management culture, values and behaviour that really reflect a commitment to planning for and reporting on results.

  • Fifteen years after this point was made in some analyses of the implementation of results-based management, the lack of a results culture in many UN agencies was highlighted in the 2008 review of UN agency RBM at the country level, and the 2009 UNDP handbook on planning, monitoring and evaluating for development results reiterates the old lesson that building this culture is still important for the implementation of results-based management.

9. Accountability for results: Accountability for results needs to be redefined, holding implementers responsible not just for delivering outputs, but at least for contributing to results, and for reporting on what progress has been made on results, not just on delivery of outputs.

  • The need to focus on more than just deliverable outputs to make results-based management a reality was mentioned in some articles in the early 1990s, and reiterated in OECD documents ten years later, yet it remains an unresolved issue for some aid agencies, which still require reports only on deliverables, rather than on results.


10. Who will lead implementation of RBM: Strong leadership is needed from senior managers to sustain implementation of a new performance management system.

  • This remains a central concern in the implementation of results-based management and performance assessment. Recent reviews of aid agency performance, such as the evaluation of RBM at UNDP, continue to flag the absence of strong, consistent leadership -- committed to, and involved in, the implementation of a new RBM system -- as an issue.

11. Stakeholder participation: Stakeholder participation in the implementation of RBM -- both from within and from outside the organization -- will strengthen sustainability by building commitment and pointing out possible problems before they occur.

  • There is now a general acceptance -- in theory -- of the need for stakeholder participation in the development of a results-based performance management system but, in practice, many agencies are unwilling to put the resources -- again, time and money -- into genuine involvement of stakeholders in the analysis of problems, the collection of baseline data on those problems, the specification of realistic results, and ongoing data collection, analysis and reporting.


12. Technical support for RBM: Training support is needed if results-based systems are to be effectively implemented, because many people don’t have experience in results-based management. Training can also help change the organizational culture, but training takes time. Introducing new RBM concepts can be done through short-term training and material development, but operational support for defining objectives, constructing performance indicators, using results data for reporting, and evaluation requires time and sustained support.

  • A fundamental lesson from studies dating back to the 1970s on the implementation of complex policies and innovations is that we must provide technical support if we want a new system, policy or innovation to be sustained: we can’t just toss it out and expect everyone else to adopt it and use it.
  • Some aid agencies have moved to create internal technical support units to help their own staff cope with the adoption and implementation of results-based management, but few are willing to provide the same technical support to their stakeholders and implementation partners.


13. Evaluation expertise: Find the expertise to provide this support for management of the RBM process on a continuous basis during implementation. Often it can be found within the organization, particularly among evaluators.

14. Explain the purpose of performance management: Explain the purpose of implementing a performance management system clearly. Explain why it is needed, and the role of staff and external stakeholders.

The Auditor-General of Canada's web page on lessons learned about implementing RBM is available in both English and French.

Developing Performance Measurement Systems



15. Keep the RBM system simple: Overly complex systems are one of the biggest risks to the successful implementation of results-based management. Keep the number of indicators to a few workable ones, but test them to make sure they really provide relevant data.

  • Most RBM systems are too complex for implementing organizations to easily adopt, internalize and implement. Yet they need not be. Results themselves may be part of a complex system, but simpler language can be used to explain the context, problems and results, and jargon discarded where it does not translate -- literally into other languages, but also into the real-world needs of implementers and, ultimately, of the people who are supposed to benefit from aid.


16. Standard RBM terms: Use a standard set of terms to make comparison of performance with other agencies easier.


  • The OECD DAC did come up with a set of harmonized RBM definitions in 2002, but donors continue to use the terms in different ways, and, as I have noted in earlier posts, have widely varying standards (if any) on how results reporting should take place.  So simply using standardised terms is not itself sufficient to make performance comparisons easy.


17. Logic Models: Use of a Logic Chart helps participants and stakeholders understand the logic of results, and identify risks.

  • Logic Models (as some agencies refer to them) were being used, although somewhat informally, 20 years ago in the analysis of problems and results for aid programmes. Some agencies such as CIDA [now Global Affairs Canada] have since brought the visual Logic Model -- a chart linking activities to outputs and to expected outcomes -- to the centre of project and programme design, with some positive results. The use of the logic model does indeed make the discussion of results much more compelling for many stakeholders than the Logical Framework did.

18. Accountability for results: Make sure performance measures and reporting criteria are aligned with decision-making authority and accountability within the organization. Indicator data should not be so broad that they are useless to managers. If managers are accountable for results, then they need the power and flexibility to influence results. Managers and staff must understand what they are responsible for, and how they can influence results. If the performance management system is not seen as fair, this will undermine implementation and sustainability of results based management.


19. Credible indicator data: Data collected on indicators must be credible -- reliable and valid. Independent monitoring of data quality is needed for this.

  • This remains a major problem for many development projects, where donors often do not carefully examine  or verify the reported indicator data.

20. Set targets:  Use benchmarks and targets based on best practice to assess performance.

  • Agencies such as DFID and CIDA are now making more use of targets in their performance assessment frameworks.

21. Baseline data: Baseline data are needed to make results reporting credible and useful.

  • Agencies such as DFID are now concentrating on this. But many other aid agencies continue to let baseline data collection slide until late in the project or programme cycle, when it is often difficult or impossible to collect. Some even resort to the reconstruction of baseline data during evaluations -- a weak and often last-ditch attempt to salvage credibility from inconsistent and unstructured results reporting.
  • Ultimately, of course, it is the aid agencies themselves which should collect the baseline data as they identify development problems. What data do international aid agencies have to support the assumptions that, first, there is a problem, and second, that the problem is likely to be something that could usefully be addressed with external assistance? All of this should logically go into project design. But once again, most aid agencies will not put the resources of time and money into project or programme design to do what will work.

Using Performance Information



22. Making use of results data: To be credible to staff and stakeholders, performance information needs to be used -- and be seen to be used. Performance information should be useful to managers and demonstrate its value.

  • The issue of whether decisions are based on evidence or on political or personal preferences remains important today, not just for public agencies but, as has recently been argued, for private aid as well.


23. Evaluations in the RBM context: Evaluations are needed to support the implementation of results-based management. “Performance information alone does not provide the complete performance picture”. Evaluations provide explanations of why results are achieved, or why problems occur. Impact evaluations can help attribute results to programmes. Where performance measurement is seen to be too costly or difficult, more frequent evaluations will be needed; but where evaluations are too expensive, a good performance measurement system can provide management with data to support decision-making.

  • Much of this is more or less accepted wisdom now. The debate over the utility of impact evaluations, related primarily to their sometimes considerable complexity and cost, nevertheless continues.

24. Incentives for implementing RBM: Some reward for staff -- financial or non-financial -- helps sustain change. This is part of the perception of fairness, because “accountability is a two-way street”. The most successful results-based management systems are not punitive, but use information to help improve programmes and projects.

25. Results reporting schedule: Reports should actually use results data, and regular reporting can help staff focus on results. But “an overemphasis on frequent and detailed reporting without sufficient evidence of its value for public managers, the government, parliament and the public will not meet the information needs of decision-makers.”


26. Evaluating RBM itself: The performance management system itself needs to be evaluated at regular intervals, and adjustments made.


Limitations:

This study is a synthesis of secondary data (as have been many, many studies that followed it), a compilation of common threads; it is not a critical analysis of the data, and is not itself based on primary data.

It is apparently available only as a web page, not as a downloadable document. If you print it or convert it to an electronic document, it runs to about 18 pages.

The bottom line:

The basic lessons about implementation of RBM were learned, apparently, two decades ago, and continue to be reflected throughout the universe of international aid agency documents, such as the Paris Declaration on Aid Effectiveness, but concrete action to address these lessons has been slow to follow.

This article still provides a useful summary of the major issues that need to be addressed if coherent and practical performance management systems are to be implemented in international aid organizations, and with their counterparts and implementing organizations.


Further reading on Lessons learned about RBM



OECD’s 2000 study, Results-based Management in the Development Cooperation Agencies: A Review of Experience (158 pages), summarizes much of the experience of aid agencies to that point, and for some agencies not much has changed since then.

The World Bank's useful 2004, 248-page Ten Steps to a Results-Based Monitoring and Evaluation System, written by Jody Kusek and Ray Rist, is a much more detailed and hands-on discussion of what is needed to establish a functioning performance management system, but it is clear that some of their lessons, similar to those in the Auditor-General's report, have still not been learned by many agencies.

John Mayne’s 22-page 2005 article Challenges and Lessons in Results-Based Management summarises some of the issues arising between 2000 and 2005. He contributed to the earlier Auditor-General's report, and to many others. [Update, June 2012: This link works sometimes, but not always.]

The Managing for Development Results website has three reports on lessons learned at the country level during the implementation of results-based management, the most recent published in 2008.

The IBM Center for the Business of Government’s 32-page 2009 report Moving Toward Outcome-Oriented Performance Measurement Systems, written by Kathe Callahan and Kathryn Kloby, provides a summary of lessons learned on establishing results-oriented performance management systems at the community level in the U.S., but many of the lessons would be applicable on a larger scale and in other countries.

Simon Maxwell’s October 21, 2010 blog post, Doing aid centre-right: marrying a results-based agenda with the realities of aid, provides a number of links on the lessons learned, both positive and negative, about results-based management in an aid context.



_____________________________________________________________




GREG ARMSTRONG
Greg Armstrong is a Results-Based Management specialist who focuses on the use of clear language in RBM training, and in the creation of usable planning, monitoring and reporting frameworks.  For links to more Results-Based Management Handbooks and Guides, go to the RBM Training website
