Learning in Development - Olivier Serrat

Directions in Evaluation


The new modus operandi for OED that took effect in 2004 has implications for its performance. The structural changes place more emphasis on OED’s developmental role and make development impact a more explicit consideration for operational decisions. In a changing context, OED has focused its 3-year rolling work program on priority areas, has moved to capture synergies between OED and ADB’s operations departments, and has begun to select evaluation topics in a way that should ensure higher effectiveness and impact.

Assigning Resources. Fewer project or program performance evaluation reports are in OED’s forward work program. Efforts are made to select evaluation topics that are of strategic relevance to ADB and DMCs.21 There are changes, too, in the way that OED assigns staff to evaluation studies. In the past, OED specialists worked individually on separate evaluations, supported by consultants and national officers. Today, two or more evaluation specialists work together on one evaluation. This is expected to deepen the analysis.

Harmonizing Evaluation Standards. The ECG was chaired by the director general of OED in 2005–2006 and, in October 2005, the group met for the first time at ADB’s headquarters. The ECG has developed good-practice standards for various types of evaluations and undertakes benchmarking studies to assess how each member applies them. Standards have been completed for public and private sector project lending and program lending, and are being prepared for the evaluation of country strategies and TA. Two benchmarking studies have been completed for private sector project lending, and one is under way for public sector project lending. ADB took the lead in developing the standards for evaluating policy-based lending and is leading ongoing work to develop standards for country assistance program evaluations.

The ECG is examining the feasibility of developing a robust peer review of evaluation functions in the multilateral development banks that will address issues such as (i) the independence of the evaluation office and the role of Management and Board of Directors, (ii) the selection of evaluation topics, (iii) adherence to good-practice standards, (iv) the quality of evaluation reports, (v) use of lessons and recommendations, and (vi) staffing and budgets. The group has appointed ADB to a task force to develop the peer review methodology.

Box 12: OED’s Work Programa

In 2007–2009, OED will (i) undertake fewer evaluations of individual operations but validate ratings in all project or program completion reports; (ii) increase the number of broad evaluations, such as corporate and policy evaluations, country and sector assistance evaluations, thematic evaluations, and impact evaluations; (iii) improve the evaluative content of the Annual Evaluation Review and Annual Report on Loan and Technical Assistance Portfolio Performance; (iv) prepare or revise evaluation guidelines; (v) sequence evaluations so that those done early in the 3-year rolling work program feed those planned for the later years; and (vi) promote knowledge management, including dissemination of findings and recommendations in accessible and digestible ways.

a Available: www.adb.org/evaluation/2007-work-program.pdf

Evaluating Country/Sector Assistance Programs. Because country and sector assistance program evaluations are having an impact on the formulation of the subsequent country partnership strategies, these evaluations will receive priority in allocating OED staff resources. The DEC strengthened feedback between evaluation findings and formulation of country strategies by requiring country assistance program evaluations for major countries to be prepared and discussed by the DEC before a new country partnership strategy is completed. A new appendix has been added to the document template for country partnership strategies to indicate how the strategy addresses the country assistance program evaluation and the DEC’s recommendations. A new product, the country assistance program evaluation update, will be introduced because (i) completion reports for country partnership strategies are being prepared, which will provide a better basis for preparation of country assistance program evaluations; (ii) some country portfolios are relatively small, and do not merit the depth of analysis undertaken in past country assistance program evaluations; and (iii) OED will undertake second country assistance program evaluations for an increasing number of countries.

Jointly Evaluating Country/Sector Assistance Programs. The Evaluation Network of the Development Assistance Committee of the Organisation for Economic Co-operation and Development has identified the evaluation of total official development assistance flows to a country as an important topic that has not been addressed by evaluators. Besides the ECG, and as part of the international harmonization agenda, there is increasing interest in undertaking joint evaluations for greater consensus and usefulness of results. The Joint Evaluation of Global Environment Facility Projects of 2005–2006 was OED’s first involvement in such an evaluation. In 2007, work will begin on a joint country assistance program evaluation in Bangladesh in partnership with the World Bank, the Department for International Development of the United Kingdom, and the Japan Bank for International Cooperation. This evaluation is expected to be finished in 2009.

Validating Country Partnership Strategy Completion Reports. ADB is beginning to produce country partnership strategy completion reports. OED will pilot a new evaluation product in 2007 in the Maldives, the country strategy completion report validation, primarily for countries with small portfolios for which production of a full country assistance program evaluation would not be an efficient use of resources.

Validating Project/Program Completion Reports. OED will change the way that individual projects/programs are selected for evaluation. The strategic objective is for the quality of completion reports to be sufficient to rely on this self-evaluation. Efforts to improve the quality of completion reports appear to have removed the upward bias of ratings at completion. The changes include the following: (i) moving away from the 25% random selection of projects and programs for which evaluation reports are prepared to a smaller, purposeful sample of 10 per year; (ii) validating all ratings on the basis of a desk review, with the validation assessment attached to the completion report, rather than commenting on draft completion reports (much of this work will be outsourced using funds previously spent on staff consultants for the preparation of evaluation reports); and (iii) reporting success based on the combined completion and evaluation ratings, as has been done in recent issues of the Annual Evaluation Review. Projects and programs will no longer be selected randomly for evaluation; selection triggers will include disputes over a rating, raised by OED or external stakeholders, and special interest.

Evaluating Impact. OED agrees with the general conclusions of the debate in the international evaluation community about impact evaluations: (i) more rigorous impact evaluations are desirable; (ii) the methodology will be determined by issues related to data availability, time, and resources; and (iii) impact evaluations will be undertaken selectively, largely in the social sectors. OED is undertaking its first rigorous impact evaluation on microcredit in the Philippines as part of an evaluation study on the effectiveness of ADB’s microcredit operations. This will be completed in 2007. One impact evaluation is programmed per year.
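The essence of a rigorous impact evaluation is attribution: comparing outcomes for beneficiaries against a credible counterfactual rather than taking a simple before-and-after reading. A minimal difference-in-differences sketch illustrates the idea; the function name and all numbers below are purely illustrative and are not drawn from the Philippine microcredit study:

```python
# Difference-in-differences: a common design for rigorous impact evaluation.
# All figures are hypothetical, for illustration only.

def diff_in_diff(treated_before, treated_after, control_before, control_after):
    """Estimate impact as the treated group's change minus the control group's change."""
    return (treated_after - treated_before) - (control_after - control_before)

# Hypothetical mean household incomes before and after a microcredit program.
impact = diff_in_diff(treated_before=100.0, treated_after=130.0,
                      control_before=100.0, control_after=110.0)
print(impact)  # 20.0: the treated group gained 30, but 10 would have occurred anyway
```

The control group nets out changes that would have occurred without the intervention, which is why data availability, time, and resources largely determine whether such an evaluation is feasible.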

Developing Evaluation Capacity. Evaluation capacity development is part of OED’s mandate. As of January 2007, OED had formulated 15 TA operations to this end, totaling $4.35 million, in Bangladesh, the People’s Republic of China, Nepal, Papua New Guinea, the Philippines, Sri Lanka, and Thailand. Three phases can be distinguished in this assistance:

• Phase 1: TA focused on building a postevaluation capacity within a central agency and providing the means for disseminating postevaluation findings for decision making.

• Phase 2: TA aimed at establishing ADB’s project performance management system in central and sector agencies.

• Phase 3: TA aimed at building more generic results-based monitoring and evaluation capability.

In the future, OED expects to work with evaluation units in DMCs to provide on-the-job evaluation experience and knowledge transfer, building on lessons learned from the evaluation of the 15 TA operations, not all of which were successful.

Promoting Portfolio Performance. OED began to provide real-time feedback on portfolio performance in 2001. The 2005 Annual Report on Loan and Technical Assistance Portfolio Performance highlighted serious and fundamental corporate issues. At the DEC’s recommendation, ADB’s Management prepared an action plan to address these issues.

Evaluating Business Processes. In connection with ADB’s reorganization of 2002, a working group on business process change was appointed in 2001 to review country strategies and programs and subregional cooperation strategies and programs; public sector loans (including project preparatory TA operations), private sector loans, and nonlending products and services (including advisory TA operations); and portfolio management. In addition to its reporting on portfolio performance, OED has included business process-related evaluation studies in its work program. Forthcoming evaluation studies will examine, for instance, the effectiveness of ADB’s loan processing system, its approaches to policy dialogue and reforms, and the quality of the design and monitoring framework.

Box 13: Developing Evaluation Capacity in Developing Member Countries

• Stability of trained staff, high-level support, and the existence of a mandate for evaluation by decree are factors that contribute to success.

• More thorough preparation of future TA operations should ensure high-level ownership and commitment, and participation of key stakeholders in formulation and design.

• If the conditions for public sector capacity building are not met, an assessment must determine whether the systemic or underlying problems should be addressed first.

• Building DMC capacity requires a holistic approach, considering the needs at all levels.

• The location of responsibility for evaluation within organizational hierarchies is also important.

• During design and implementation of TA operations, care must be taken that performance evaluation systems do not become supply driven, complex, or too resource intensive to sustain.

• Establishing performance evaluation systems is a means to an end—benefits are obtained when the results are used in decision making. The design of TA should include specific features to encourage, facilitate, and formalize the incorporation of evaluation results in decision making.

• A case study approach is needed to develop staff competency and confidence to carry out evaluation.

• For larger TA operations, a firm or institution should be recruited, rather than individuals.

• The pace of TA should be driven by a sense of ownership and commitment in DMCs.

• The introduction of computerized information systems is not a solution to poorly performing manual systems. Various institutional, management, and social factors need to be taken into account.

Box 14: Perceptions of OED: Feedback from Members of the Board of Directors

Interviews with Board members revealed a general perception that the mission and functions of OED are to provide independent assessment with a direct link to operations. OED is seen as collegial, dedicated, and professional. While OED has generally changed with the changing focus of ADB, there is an inevitable lag as evaluation activities adjust to new organizational thrusts.

OED has been able to influence ADB’s operations at all levels by providing concrete recommendations based on solid and credible analysis. At the project/program level, the time lag between completion and evaluation is an issue, as evaluation findings can easily be dismissed as discussing the old way of doing things, while current practices may have changed. At the strategy and policy levels, the improved timing of country and sector assistance program evaluations has increased impact on the design of new country partnership strategies.

In its knowledge management, OED faces several interface problems. Within ADB, OED should open up channels of communication, become even more specific about actionable recommendations, and delineate accountabilities clearly. The most difficult interface is with DMCs: OED should emphasize development of evaluation capacity. In the eyes of wider clienteles, such as NGOs, civil society organizations, and the general public, OED should not only be independent, but be perceived as such. It should produce concise and insightful summaries of its work that people can access and understand easily.

To help ADB improve its development effectiveness, Board members invited OED to

• develop a comprehensive annual development effectiveness report—building on the Annual Evaluation Review and Annual Report on Loan and Technical Assistance Portfolio Performance—that presents a truly serious discussion of results and holds ADB’s Management accountable for what it promised to do;

• work in ways that enhance the link between development effectiveness and resource allocation;

• generally emphasize simplicity in project/program designs;

• keep the focus of ADB on poverty reduction, both income and non-income;

• further strengthen the design and monitoring framework of projects, in particular by identifying killer assumptions and risks; and

• promote more interaction and sharing among ADB departments and offices.

Disseminating Findings and Recommendations. Although there have been improvements, ADB is not yet a learning organization in terms of actively using the lessons documented in OED reports to improve future operations. OED is developing a better system to categorize and disseminate its findings and recommendations using information technology. However, technology by itself will not solve the problem. OED is investing resources in knowledge management to distill lessons and do a better job of disseminating them within and outside ADB. New knowledge products and services are being designed, tailored to specific audiences,22 in forms that present results in accessible and digestible ways. Objective indicators are being developed to assess whether ADB is becoming a learning organization by using OED findings and recommendations.
