
IV High-Level Intergovernmental Conference, Plenary Session 2: Status of the Independent Evaluation - Montevideo, Uruguay (November 2011)

Statement by Ms. Liliam Flores, Chair of the Evaluation Management Group

Excellencies, Ladies and Gentlemen,

After nearly five years of Delivering as One, Member States have been asking two questions: what has actually happened as a result of the initiative, and what lessons does it hold for the future of the United Nations development system?

It is a great pleasure for me and my colleagues in the EMG to attend this Conference and provide you with an update on progress made with the Independent Evaluation.

Mandate and modality

The mandate for this evaluation is contained in the 2007 resolution on the triennial comprehensive policy review (TCPR) and in the 2010 resolution on system-wide coherence. Member States emphasized the need for an independent evaluation of lessons learned from the voluntary efforts of pilot countries, for consideration by Member States, without prejudice to a future intergovernmental process.

The summary report of this evaluation will be submitted to the President of the General Assembly during the 66th Session, which ends in September 2012. The ultimate purpose of the evaluation is to inform the forthcoming quadrennial comprehensive policy review (QCPR) of operational activities for development, which will take place between October and December 2012.

Allow me a moment to give some important background information on the modality of the evaluation agreed in the 2010 resolution of the General Assembly.

In the first place, an Evaluation Management Group (EMG) was appointed by the Secretary-General to ensure the independence and credibility of the Independent Evaluation of Lessons Learned from Delivering as One. Once the evaluation is completed, the group will be dissolved.

The EMG is a group of nine evaluation experts drawn from the five regions, from the pilot countries, and from the Joint Inspection Unit and the United Nations Evaluation Group (UNEG). None of us has had previous involvement with Delivering as One, though some of us are familiar with the UN system in general. We are all committed to strict adherence to the UNEG Norms and Standards and other international evaluation standards.

We report directly to the President of the General Assembly.

The EMG relies on a team of consultants, who implement the evaluation under our overall guidance. The team comprises highly qualified evaluation specialists from both developed and developing countries.

Given its ad hoc nature, such a complex evaluation would not be possible without extensive financial support. I would like to acknowledge the extra-budgetary financial contributions made by many Member States as well as from within the UN system.

Generous financial contributions have been received from Australia, Canada, Denmark, Estonia, Germany, India, Sweden, Switzerland and the United Kingdom. The following UN organizations have also made contributions: UNDP, UNEG, UNESCO, UNFPA, UNICEF, UNIDO and WFP.

The total budget now amounts to US$ 2.2 million. UN-DESA has made a trust fund available for this purpose and provides secretariat services for administrative and technical support, ensuring a fully independent process. The EMG wishes to express its deep gratitude for this support and for the trust expressed in the proposed arrangements for the implementation of the evaluation.

Implementation of the mandate

Let me now turn to the way in which we have embarked on implementing our mandate, and give you an update on where we stand in the process.

We defined two strategic activities in two distinct stages. The first was a review of all documents related to the initiative, including the evaluations already conducted, which we named the inception phase; it was concluded in September 2011. The second is the implementation phase, during which the pilot countries as well as regional hubs and the headquarters of organizations will be visited and all relevant information will be collected. This phase has just started, and we hope to conclude it by March 2012.

During the inception phase, we first sought to understand the origins of the Delivering as One initiative. At the end of 2006, eight countries informed the Secretary-General of their intention to pilot the Delivering as One approach. They were inspired by the recommendations of the High-Level Panel on System-Wide Coherence appointed by the Secretary-General in 2006 as a follow-up to the 2005 World Summit Outcome.

We quickly understood that the “Four Ones” recommended by the Panel, namely “One Leader”, “One Programme”, “One Budget” and, where appropriate, “One Office”, were applied rather differently in the eight pilot countries, given their different national contexts. National ownership and leadership and respect for the principle “No One Size Fits All” drove the process.

We also understood that the recommendations of the High-Level Panel, visionary as they were, proved rather controversial among Member States. Indeed, they stimulated a great deal of debate on system-wide coherence in the General Assembly between 2007 and 2010.

In 2007, Member States agreed on a comprehensive set of policies concerning the operational activities for development of the UN system in General Assembly resolution 62/208, commonly referred to as the TCPR resolution. Interestingly, this resolution did not incorporate the recommendations of the High-Level Panel.

But the TCPR resolution does take note of the voluntary efforts to improve coherence, coordination and harmonization in the United Nations development system, including those of the pilot countries. The resolution also encourages pilot countries to evaluate and exchange their experiences. Last but not least, it emphasizes the need for an independent evaluation of lessons learned from such efforts, for consideration by Member States, without prejudice to a future intergovernmental decision.

This provision of the TCPR resolution provides the mandate for the independent evaluation. The TCPR resolution is also important for this evaluation in another way: it provides the overall, consensus-based policy framework for the operational activities of the UN system. In this sense, the resolution represents the overall benchmark for any evaluation of the development activities of the UN system.

The conceptual framework and approach for the independent evaluation that the EMG developed took the policy framework of the TCPR as its most important starting point. At the same time, we wanted to remain open to the innovative nature of the Delivering as One initiative as well as to the diversity of experiences and lessons learned in very different country contexts.

Before presenting this conceptual framework and the approach, I would like to mention that during the inception phase we carefully assessed the rather extensive documentation on Delivering as One and especially the reports of the Country-Led Evaluations conducted in 2010 in seven of the eight pilot countries.

These evaluations were also undertaken with a view to generating lessons learned and produced much useful evidence. The one country that did not conduct a Country-Led Evaluation, Pakistan, undertook a stock-taking exercise in 2010 and is in the process of conducting a review of its One Programme. These evaluations and reviews will be useful for the independent evaluation.

During the inception phase, we realized that efforts to apply Delivering as One principles were not confined to the eight pilot countries. A growing number of countries, many of which are represented in this room, have voluntarily adopted the same principles; they are commonly referred to as “self-starters”. Given time and resource constraints, it was decided to adhere to the original mandate of this independent evaluation and not to extend its coverage to these “self-starters”.

However, the EMG did consider it important to take into account systemic issues related to Delivering as One, i.e. initiatives at the headquarters or regional level to promote and support Delivering as One at the country level, as well as efforts to learn from the pilot experiences and incorporate Delivering as One at the systemic level.

Examples of such innovations triggered by Delivering as One are the Management and Accountability System of the United Nations Development Group (UNDG), innovative funding mechanisms such as the Expanded Funding Window, and efforts to simplify and harmonize rules, regulations and business practices. It should be mentioned that during the inception phase we did not have access to the recent review of the Management and Accountability System and the management response of the UNDG, but these documents will be given serious consideration during the implementation phase.

A major outcome of the inception phase is a detailed Framework Terms of Reference for the implementation phase, which is available to pilot countries and other stakeholders. Its main purpose is to facilitate the next step, the implementation phase.

Another outcome of the inception phase is a detailed working document, which has been submitted to pilot countries and other stakeholders for comments. Their comments will help us collect credible data and analyze results accurately.

The President of the General Assembly kindly allowed us to use his website to convey some basic information on the independent evaluation. We will periodically update our web pages to keep Member States and other interested parties informed throughout the evaluation process.

Let me now present to you the conceptual framework and approach for the implementation phase.

Conceptual framework and approach of the implementation phase

As a starting point, for each Delivering as One initiative, the original intent or intervention logic is established in close consultation with the stakeholders involved. Subsequently, it is important to assess what actually happened and to what extent this adhered to the intervention logic. Thirdly, it is necessary to understand what factors helped or hindered change and which were the drivers of change.

To make this clearer, it may be useful to note that the key evaluation questions of the independent evaluation follow this same logic: what was intended, what actually happened, and what factors helped or hindered change.

The Delivering as One evaluation is also challenged by the very nature of the initiative, as the following examples illustrate.

Delivering as One was also meant to reduce transaction costs, both for the UN system itself and for national and external partners. In practical terms, it will be challenging to evaluate to what extent this aim has been achieved, as a methodology to measure transaction costs was developed by the UNDG only in 2010, and the relevant data may not yet have been collected in the countries.

Evaluating complex, international, system-wide processes and their application and results in individual countries poses substantial difficulties in terms of counterfactuals, which would be needed to represent what the situation would have been without the interventions. Under such circumstances, the use of the common OECD/DAC evaluation criteria (relevance, effectiveness, efficiency, sustainability and impact) is by no means easy. It is recommended to overcome these difficulties through the development and use of a limited Theory-of-Change approach.

The overall analysis will have a strong focus on drawing out factors that may have contributed to or hindered the implementation of Delivering as One, which needs to be evaluated in the respective programme country contexts.

At the end of the day, it is not enough to ask whether the UN system has improved its performance; we must also assess whether countries are in a position, as a result of Delivering as One, to make better use of the UN system to achieve their national goals and priorities. These include the Millennium Development Goals and other internationally agreed development goals such as gender equality, women’s empowerment and human rights.

Next steps and timelines

As we speak, three pilot countries have already been visited by the evaluation team (Albania, Mozambique and Tanzania), and visits to the other pilot countries are scheduled to take place between November 2011 and January 2012. In addition, the evaluators will visit the regional hubs in Bangkok and Panama and will, at a minimum, conduct a video conference with the UNDG team in Johannesburg.

At the headquarters level, the evaluation team has already visited FAO and IFAD in Rome and plans to conduct interviews and stakeholder consultations with Specialized Agencies and other entities in Geneva, New York and Vienna. In New York, the team will also seek to interview Permanent Missions of UN Member States (both developed and developing countries) in order to understand their perceptions of the results and future of the Delivering as One initiative and of related changes in UN practices.

The final major task of the evaluation team during the implementation phase will be to analyze the evidence collected and to draw out key findings and lessons learned. Findings, conclusions and lessons learned will be shared with stakeholders, including the pilot countries, for validation and feedback during February and March 2012.

The full evaluation report is scheduled to be finalized by the end of March 2012.

The EMG will closely monitor the work of the evaluation team as well as stakeholder consultations. Our group will be supported by a Quality Assurance Panel composed of outstanding international experts in the areas of evaluation theory and methodology and UN development assistance.

As I said earlier, the full evaluation report will form the basis of a summary report to be submitted to the President of the General Assembly before the end of the 66th Session in September 2012. In accordance with the UNEG Norms and Standards, we will request the Secretary-General to provide a separate management response to the independent evaluation for the consideration of Member States.

Concluding remarks

Excellencies, Ladies and Gentlemen,

I would like to conclude my presentation by recalling the ultimate purpose of this evaluation, which is to inform the quadrennial comprehensive policy review (QCPR) of the operational activities for development of the UN system in late 2012, as well as other intergovernmental processes concerning system-wide coherence.

As I emphasized at the beginning, Member States have been asking two questions in the General Assembly: what has actually happened as a result of Delivering as One, and what lessons does it hold for the future of the United Nations development system?

To answer the first question, the independent evaluation will hopefully be able to crystallize what has happened in the pilot countries as well as at the regional and headquarters levels, how it happened and why it happened. In other words, the evaluation will contribute to an understanding of the underlying factors that have helped or hindered Delivering as One, as well as of the drivers of change.

The second question challenges us in the sense that the evaluation should not only be independent and credible, but also useful for policy-making. It will be important that lessons learned are articulated in such a manner that Member States find them useful when they negotiate the QCPR resolution in late 2012.

Last but not least, I would like to ask for the support of the many of you present here in this room: governments and other national stakeholders of pilot countries, governments of other countries, UN organizations at all levels, and many others. We consider evaluation to be a participatory process. The quality of the findings, conclusions and lessons learned will ultimately depend on the willingness and interest of all stakeholders to provide all relevant information and to remain actively involved in the process.

Thank you for your kind attention.