Uganda is often cited as a great example of a country whose government has made rapid strides in strengthening its evidence ecosystem, particularly over the past decade. Until 2020, the country’s development plans used a sectoral approach for setting priorities and monitoring the achievement of results. The recent reform to move from a sectoral to a programme approach in the third National Development Plan is meant to address several sticky challenges in government functioning: breaking sectoral silos, reducing duplication and promoting synergies in planning, monitoring, evaluation and budgeting. The ‘programme approach’ has involved reconstituting ministries, departments and agencies into eighteen programmes to deliver the national plan.
In 2021, the FCDO-funded Strengthening Evidence Use for Development Impact (SEDI) programme partnered with the Monitoring & Evaluation (M&E) directorate of the Office of the Prime Minister, Uganda, to co-design a diagnostic that examined evidence use within the Development Plan Implementation Programme. This is the programme that is responsible for ensuring the effective implementation of the national development plan and involves several key players including the National Planning Authority, the Ministry of Finance, Planning and Economic Development, the Office of the President and the Office of the Prime Minister. The SEDI diagnostic combined focus group discussions, participatory exercises, and interviews, while drawing on the learning from previous assessments of the government’s evidence system in Uganda. It examined how the key players worked with each other to generate and use evidence.
In this blog, we draw on our work on the diagnostic to share some broad insights about evidence use within government. (To read more about what we learned about the process of co-designing a diagnostic using the Context Matters framework, read our learning brief here.)
Uganda has made significant progress in institutionalising evidence but breaking the silos across a complex system needs continuous investments. The country’s existing policies and supportive institutional structures have given an impetus to the generation of evidence and promoted conversations on its use. The government has an M&E directorate, housed within the Office of the Prime Minister and many ministries have their own in-house M&E units. It has an M&E strategy and there are several coordination structures and forums that have been created for multi-stakeholder collaborations and discussions on evidence for informing planning, programming and budgeting. Local governments are also active in carrying out independent audits for assessing their performance.
With the transition to the programme approach, secretariats for coordinating across ministries, and annual learning forums for learning and uptake in decision-making, have been added to this mix. While some of these forums promote engagement within government as well as with non-government actors, coordination within a complex system is difficult and time-consuming. A challenge that was often flagged by our government partners was the lack of alignment between M&E, planning and budgeting. They emphasised the need for vertical alignment (between the national and local governments) and horizontal alignment (between ministries, departments and agencies, as well as with non-government stakeholders). With the move to the programme approach, our government partners are well aware of the challenges involved in breaking silos across the system and of the time and resources required to implement reforms. But their interest in having these conversations and considering solutions keeps the momentum for reform going.
To understand evidence use, you need to understand power dynamics at the macro, sectoral and institutional levels. Politics and powerful interests affect the generation and use of evidence in both formal and informal ways. SEDI carried out a political economy analysis of evidence use in Uganda (2020) that showed that, at the macro level, business associations, big business leaders, trade unions, and religious and cultural groups wield considerable influence on policy formulation. Religious and cultural leaders, for example, have stymied legislation, policy and programming related to family planning, sexuality education and gender equality for several years. If evidence does not favour political considerations, or counters deeply rooted traditional and gendered norms and beliefs, it is less likely to be considered.
Politics within and between institutions also affects the motivation to collaborate in the production and use of evidence. We learned through our work on the diagnostic that reorganising Uganda’s sectors into programme areas involved extensive negotiations between the different ministries, departments and agencies over leadership positions and access to resources. The time taken for setting up coordination structures and developing a shared understanding of a more harmonised evidence culture was longer than envisaged. What it showed us was that to identify spaces for change for strengthening evidence use, you need to think and work politically and consider the implications of power dynamics at different levels of an evidence ecosystem.
To strengthen evidence use, you need to consider both visible and invisible constraints. ‘Technical’ aspects such as the lack of budget, infrastructure, staff and skills are visible constraints that are often highlighted as factors preventing evidence use. However, there are many ‘soft’, invisible factors, such as organisational culture, motivation, incentives, trust, beliefs, norms and openness to change, that are also crucial for ensuring evidence-informed decision-making.
Drawing on Harvard University’s problem-driven iterative adaptation approach, we used the ‘authority, ability and acceptance’ framework to determine whether there are spaces for change within and between organisations. The use of this framework was important for unpacking both the visible and invisible constraints. By tailoring stakeholder mapping techniques, we also tried to understand both the formal and informal relationships between organisations and teams.
While there are ‘technical’ constraints to evidence use in Uganda, there are also ‘softer’ challenges. Evidence, for example, is mainly used to monitor performance and promote accountability rather than learning within government. We also found that in some instances evidence is not used when it is not generated internally. There was an ‘evidence acceptability’ issue, linked to the lack of coordination, as well as trust, between generators, users and brokers of evidence. Along with more investment in the technical aspects of the evidence ecosystem, a cultural shift is needed to build trust and embed evidence use and learning within the system.
Unpacking what ‘evidence’ means is important for identifying gaps and building on strengths. SEDI’s political economy analysis and the diagnostic work delved into the meaning of ‘evidence’ for different government stakeholders. We found that most stakeholders emphasised the use of statistics, monitoring data and administrative data as evidence. Government systems are set up to generate this type of evidence to meet reporting requirements. This emphasis on nationally representative quantitative data meant that qualitative research on small and marginalised populations was rarely part of the discourse.
Although a limited number of evaluations are carried out, they are often process evaluations that focus on implementation rather than effectiveness-related questions. Most evaluations are donor-funded and, since they are mainly used as mechanisms for accountability, they find limited use in decision-making. Although citizen feedback is actively sought by district governments through the Barazas, or community forums, the government’s follow-up mechanisms need strengthening.
We found great openness in our government partners to discuss both gaps and strengths. There was also interest in looking at new sources of evidence, such as big data, for informing policy and programming.
Through our work with the government in Uganda, we learned valuable lessons about the importance of building trust-based relationships across the evidence ecosystem. These relationships may be both formal and informal. But in most cases, it is the informal, trust-based relationships between the different actors (generators, brokers and users) that enable a quick response to policy windows where evidence can be useful. Investments in building relationships between actors in an evidence ecosystem therefore need to be valued, planned for and resourced. The complexity of evidence-informed decision-making within government can be mind-boggling. But the enthusiasm of our government partners and their commitment to making change happen remains a great source of inspiration for future work.
This blog is authored jointly with Radhika Menon, Lead of the Research Uptake Hub at Oxford Policy Management.