Background
Methods
Report identification
Quality appraisal
- Was the purpose of the evaluation clearly stated?
- Was the methodology described (including the analysis)?
- Were the indicators made explicit and justified?
Indicator extraction
Synthesis
Results
| Funder | Project, programme(s), organisation | Period (duration) covered by the evaluation | Relation of evaluator to funder | Timing | Main approaches/methods |
|---|---|---|---|---|---|
| | RCS programme – with streams, health sector | 1960+ (48 years) | Funder staff | Periodic review | Analysis of existing award data, alumni evaluation survey, 15 case studies, and 5 telephone interviews of selected scholarship recipients; impact assessment. |
| Carnegie[25] | RCS initiative with networks | 2008–2010 (2 years) | Contract evaluation organisation | Mid-term | Desk review of initiative and network documents; interviews and focus groups with stakeholders (key staff and students within each network). |
| Danida[26] | Health research programmes of which health RCS is a part | 1997–2006 (10 years) | Contracted evaluation organisation | Periodic review | Components were: a) country reports with visits; b) desk study review of projects; c) institutional questionnaires for Danish research groups; d) 'internal' [Danish organisations] individual staff questionnaires; e) 'external' [non-Danish/other HIC funder] questionnaires and interviews; f) literature review of publications supported; g) evaluation document analysis; and h) health-related project database analysis. |
| DfID[27] | Project – health research council | 2008–2010 (2 years) | External programme evaluation team | Mid-term review | Desk review of organisational, programme, and project documentation; site visit with interviews of stakeholders, beneficiaries, non-beneficiaries, funders, and secretariat; in-depth case studies of selected grantees and their institutions; and evaluation of the grants selection process. |
| EDCTP[28] | Health research partnership | 2007–2009 (2 years) | Independent external panel | Periodic review | Documentation analysis, meetings/discussions and interviews with organisational representatives, questionnaire survey of researchers, site visit, conference attendance, and country case study. |
| | Health research programme with projects | Roughly 2001–2008 (7 years) | Contracted evaluation team | Special review | Gender audit at three levels – institutional, programmatic, and project (review of 15 projects) – through documentation review; search of guidelines and strategies of other organisations working on policy, health, and gender issues; review of a previous internal gender survey; gender questionnaire to assess capacity development needs; and individual interviews with funder staff. |
| NIH-FIC (1)[33] | Health RCS programme | 1992–2003 (11 years) | Contract evaluators | Periodic review | Outcome evaluation using the NIH-FIC evaluation framework and FIRCA logic model: administrative data collection and review, interviews with programme stakeholders, census surveys of the US principal investigators and international research collaborators, bibliometric analysis of publications, and site visits. |
| NIH-FIC (2)[34] | Health RCS programme | 2002–2008 (6 years) | Contract evaluation team | Mid-term review | Programme implementation and preliminary outcomes. Data collection methods included two online surveys (GRIP awardees, unsuccessful applicants with scored applications), supplementary data from administrative sources and databases, MEDLINE, and interviews with US-based mentors, FIC staff members, and programme partners. |
| | Health RCS & health research programmes | 2005–2008 (4 years) | (2008) Committee of three experts and two secretariat members; (2009) contract evaluators | Mid-term review | (2008) Background document review, discussions with programme coordinators, site visits with interviews, formulation of recommendations, and discussion with the Programme Committee. (2009) Not specified, but included programme document review, programme logic construction, analysis of projects' progress reports, and stakeholder interviews. |
| Sida[37] | Linked health RCS project funding (three routes) | 1999–2005 (6 years) | Contract evaluators | Mid-term for re-formulation | Emailed questionnaires to institutions, individuals, and graduates; interviews during site visits; and an evaluation seminar at the main site. |
| | Organisation's entire set of health RCS programmes | 2000–2008 (9 years) | Contracted institute evaluation team | Periodic review | Questionnaires (individuals, research groups, and institutions), selected in-depth interviews, and institutional site visits with semi-structured stakeholder interviews. |
| Wellcome Trust[40] | Health RCS project – consortium | 2009–2011 (2 years) | Contract evaluation organisation | Mid-term (second annual) | Real-time monitoring and evaluation with a mutually agreed framework of qualitative and quantitative indicators; analysis in the light of all consortia within the programme of which this project is a part. |
Quality of the health RCS evaluation designs
Quality appraisal of evaluations – illustrative examples
Purpose of evaluation clearly stated
- To appraise the Swedish International Development Cooperation Agency's (Sida's) support to capacity building in the sub-Saharan Africa region. The most important purpose, from the evaluators' point of view, was to provide stakeholders with the opportunity to learn about and develop the ongoing project [37].
- To assess implementation and preliminary outcomes, focusing on awardees' careers, and to guide a future outcome evaluation [34].
- To assess European and Developing Countries Clinical Trials Partnership (EDCTP) programme performance, including economic, social, and environmental impacts; to address the role of EDCTP in the broader international research and development agenda; and to learn lessons and make recommendations for future initiatives [28].
Explicit evaluation design
- A feasibility study, including pilot tests, guided the evaluation survey design [33].
Data collection clearly described and validity checked
- Qualitative interviews were recorded, transcribed, and thematically coded. A self-assessment tool was used for research competency, but its provenance was not explained [38].
- Interviews solicited information on factors influencing post-grant careers; interviewees were selected to balance gender, research interest, and nationality [38].
- Online surveys for awardees and unsuccessful applicants [34].
Indicators explicit and justified
- Each bibliometric indicator provided insights into research quality, i.e., quantity of papers, citation rates, impact factor; norm-referencing [33].
- Indicators were stipulated in an evaluation framework and designed with stakeholders using intervention logic [40].
- The evaluation used EDCTP's indicators, but was limited by the absence of a priori measurable indicators for the expected outcomes set at the start of the programme [28].
Biases and limitations discussed
- The lack of a uniform monitoring and evaluation framework and reporting system resulted in collection of different types of data, and therefore different insights and conclusions [25].
- Limitations of using a self-assessment survey [25, p. 14] and the subjectivity of the evaluations and learning [40].
- Variables (e.g., linguistic, internet access) and potential biases in responses, recall, and classification were taken into account [33].
- Consideration was given to the feasibility of a comparative evaluation design and to the need for a longer-term and more rigorous design to assess outcomes and impact of the Global Research Initiative Program [34].
- Lack of pre-determined measurable indicators and independently verifiable data necessitated an opinion-based retrospective evaluation [28].
Indicators used for tracking progress in health RCS initiatives
Individual level indicators
Research skills training activities: PhDs, MScs, scholarships, fellowships, and salary supplementation. Training of research support staff, i.e., data managers, research laboratory personnel, statisticians, and research managers.

| Outputs | Outcomes |
|---|---|
| Feedback from recipients about career prospects. | Development of research skills, i.e., identification of a research problem, analytical review of a scientific article, research proposal, and scientific report writing. |
| Quality of training. | Quantitative and qualitative evidence of the effectiveness of the awards (from survey about careers, achievements, and impact). |
| Balance between training in research methods (i.e., protocol, methods, collection and analysis), research process (i.e., writing, communication, knowledge transfer), and advocacy, promotion, negotiation, and resource mobilisation. | Evidence that awardees returned to active and independent research in LMICs. |
| | Reasons why trainees did not return/stay in LMICs (e.g., poor career prospects; no opportunity to use skills). |
| | Development of sustainable research collaborations. |
| | For HIC researchers, improved understanding of international research issues and increased desire to collaborate with researchers in developing countries. |
| | New research funding obtained. |
Mentoring activities: Individual support for developing skills in research and supervision.

| Outputs | Outcomes |
|---|---|
| Number of trainees with a mentor. | Number of grantees working as senior researchers and their location (e.g., academia, government agencies, or private sector). |
| | Knowledge of reasons for lack of career development, i.e., lack of resources, supervision, and collaborators. |
| | Percent of time spent on research activities. |
Scientific conference and workshop activities: Health Economics Conference, EDCTP Forum, networking, sharing with colleagues and policy makers.

| Outputs | Outcomes |
|---|---|
| Number of meetings/workshops attended pre- and post-funding. | Research by awardees published in conference proceedings. |
| | Invitations to speak at meetings. Honours, awards, esteem, expanded social networks. |
| | Membership and/or leadership role (e.g., president, chair, secretary, editor) in professional societies, advisory groups, or scientific journals. |
Course and curricula development activities: Short courses/diplomas/degrees in research skills and methods, and scientific topics developed in response to a needs assessment and embedded within the university.

| Outputs | Outcomes |
|---|---|
| Partnerships used for course design, student supervision, mentoring, and bilateral recognition of credits. | Secondary benefits to students through training, travel, and education opportunities made them 'diffusers' of new techniques between institutions. |
| Database of courses; attendance register. | Courses (e.g., masters, PhD) run by university consortia promoted relationships between universities and/or across specialities (e.g., health economics). |
Institutional level indicators
Human resources strengthening activities: Staff training and recruitment (e.g., data management, laboratory scientists), including salaries. Strengthening inter-staff and inter-student relationships. Promoting inter-disciplinarity, diversity, and specialization.

| Outputs | Outcomes |
|---|---|
| Numbers of potential supervisors. | Recruitment and retention of researchers, supervisors, and core staff. |
| Capacity to mentor junior researchers, take on leadership and inspirational roles. | Clear research career paths/possibilities. |
| Institutional destination/return home of researchers and graduates. | Involvement of research managers in the collaboration/network. |
Activities for strengthening research infrastructure and management: Support for infrastructure (e.g., laboratory facilities, equipment, and maintenance; libraries, IT, computers). Setting up ethical review boards, engagement of stakeholders and secretariats. Improved governance, planning, strengthening of financial reporting, institutional evaluation capacity, and gender analysis.

| Outputs | Outcomes |
|---|---|
| Establishment of cross-cutting projects; sharing of equipment (e.g., fridge, freezer, thermocycler, microscopes, centrifuge, and computer), staff (e.g., lab technicians), and systems (e.g., data management) to facilitate integration of research activities. | Better access to resources (e.g., staff, libraries, journals, equipment). |
| Standard operating procedures, quality assurance mechanisms. | Research staff satisfied with institution's research services (i.e., workplace, library, internet access, journal access, lab facilities, purchasing system, maintenance, human resources). |
| A research support centre, scientific steering committee, institutional governance structure, and organisational chart. | Improved management, administrative, and technical capacity (e.g., for lab quality control, trial monitoring services, data management, and data analysis support). |
| Commitment to or implementation of strategic planning, management, new policies, resource allocations. | Achievement of international accreditation, e.g., of laboratories then also able to attract private funding. |
| | Evidence of a transferable, partly self-sustaining model (salaries externally supported) for a Research Support Centre. |
Scientific collaboration activities: Promotion of collaborations for North–South and South–South and/or regional partnerships, sometimes restricted to existing grantees, or projects led from the South.

| Outputs | Outcomes |
|---|---|
| Formal agreements, including for data sharing. | Collaborations characterised by trust and commitment that continue after the award concludes. |
| Site inspections, meetings together. | Joint PhD students, projects, and technologies shared between collaborators. |
| | Benefits for northern institutions (i.e., understanding LMIC health systems, engaging with research and training institutions). |
National/international level indicators
Engagement and communication activities for research uptake: Engagement with private and non-health organisations, NGOs, HIV programmes, research institutions, health ministries, and regulatory authorities, using journals, press, magazines, conferences/workshops, networks, face-to-face interaction, websites, consensus reports, policy briefs, and newsletters.

| Outputs | Outcomes |
|---|---|
| Skills development programme from public-private-academic partnerships. | Advocacy resulted in enhanced health RCS effort, or enhanced knowledge about neglected topics/diseases (e.g., fish-borne zoonotic parasites). |
| Systematic plan for acquiring and using research information, and for sharing and transferring knowledge. | Knowledge about the focus of health RCS efforts – these tend to be more on researchers and less on research users. |
| Media articles (i.e., press, magazines, reports, website). | Partnerships for research dialogue (e.g., with policymakers, research users, decision makers, national authorities, professional groups, private sector, NGOs, civil society) at local, regional, and international levels. |
| Communication/knowledge management strategy. | |
| Trends in website hits. | |
Activities to develop national health research systems or scientific councils: Promote financial sustainability in regional research activities.

| Outputs | Outcomes |
|---|---|
| Map of national research system. | Strong commitment and active engagement by national health research institutions and health ministries to review progress and determine research priorities. |
| | Knowledge about the contribution (or not) of national agencies to developing an effective national health research system and to creating demand for research. |
| | External funds provided more accessibility and flexibility than local funds. |
Networking activities for researchers and/or research users: Facilitation of collaborations and large-scale networks, sometimes through multi-disciplinary workshops, curricula, meetings, and seminars.

| Outputs | Outcomes |
|---|---|
| New programme and partnership for research to strengthen links between universities and policy making (e.g., systematic reviews for research). | Impact on policy, practice, and knowledge at different levels (i.e., international, regional, national, district) and on health and non-health sectors, through research and policy networks. |
| Project staff contributed to evaluations of health centres and systems and to motivating medical staff. | Estimated impact on disease control and prevention. |
| Harmonised regional research activities. | Commitment and communication with Northern and among Southern partners. |
| North–South and South–South networking activities. | |
| Active committees with institutional representation in each member country. | |
Discussion
Indicator coverage
Indicator quality
Health RCS contribution assessment
Limitations of our study
Directions for evaluation of health RCS
| Recommendation | Funding agencies: International | Funding agencies: National | Priority decision-makers: International organizations | Priority decision-makers: National research councils | Producers: Institutions (universities, research institutes, NGOs), networks | Producers: Researchers (established and learning) | Users: International organizations | Users: National and sub-national health services | Evaluators |
|---|---|---|---|---|---|---|---|---|---|
| Adequate allocation of resources to quality evaluation research alongside investments in the quality of the science, scientists, and science communication. | +++ | ++ | ++ | | | | | | |
| Systematic attention to indicator framing, selection, measurement (multiple data sources and valid standards to enhance quality), and analysis. | + | + | ++ | ++ | + | + | +++ | | |
| Development of indicators which better encompass relationships with knowledge users. | ++ | ++ | ++ | ++ | ++ | ++ | +++ | | |
| Disaggregation of indicator data according to equity categories. | + | + | ++ | ++ | ++ | ++ | +++ | | |
| Systematic consideration of assumptions, pre-conditions, or measurement confounders associated with the evaluations. | ++ | +++ | | | | | | | |
| Greater attention to evaluation design, use of clear conceptual frameworks, and systematic linkage of indicators in keeping with theories of change. | + | + | ++ | +++ | | | | | |
| Development of comprehensive, prospective systems for health RCS indicator monitoring and evaluation, in which long-term impact is considered throughout the entire project cycle. | ++ | ++ | + | + | ++ | ++ | + | + | ++ |
| Separation out of three components of the upper level – provincial-national research environment, international-global research environment, and research networks. | ++ | ++ | ++ | ++ | + | + | + | +++ | |