Background
Working definitions of key terms used in this paper
Methods
Literature review
Expert group validation
Organization | Specialty |
---|---|
Makerere University School of Public Health | Public Health/Health Systems Specialist (1) |
Makerere University School of Public Health | Public Health/Monitoring & Evaluation (M&E) Specialist (1) |
Private Consultant | Health Economist/M&E Specialist (1) |
World Health Organization Country Office | Health Systems Specialist/PhD Student (1) |
Ministry of Health/GFATM Coord. Office | Public Health/Health System Specialist (1) |
World Health Organization Country Office | Health Systems/M&E Specialist (1) |
CUAMM (NGO) Uganda Country Office | Health Systems/Hospital Performance Assessment (1) |
PEPFAR Monitoring Unit Uganda | Public Health/Programme & District Assessment (1) |
Uganda Catholic Medical Bureau | Health Systems/District & Hospital Assessment (1) |
Uganda National Health Consumers Org. | Social Scientist/Health-care Consumers Advocacy (1) |
Institute of Statistics, Makerere University | Statistician/PhD Student (1) |
Application of attributes to HSPA frameworks
Results
Attributes of HSPA frameworks derived from literature review
Expert group input
Attribute | Characteristics: from literature review | Characteristics: by expert group
---|---|---
Process of development | • Participation of various stakeholders to bring on board various perspectives, increase transparency, appreciation, and ownership | • Some categories of stakeholders indicated include public, communities, and funders
 | • Use of data to explain causal links | • Data use said to make framework more believable and more likely to be used for decision-making
Relating with health system framework | • Embedded in an explicit health system, with clarity of the HS conceptual framework including determinants of health, goals, elements, and actors | • May require working with stakeholders to develop a health system conceptual framework if not in place already
Relating with policy/organizational context, societal values and principles | • Relating to general model of government | • Derivation of health system performance assessment attributes in this paper should be recognized as a specific perspective and not as generic
 | • Relating to organization of the health system, inter- and intra-linkages at different levels of the system |
 | • Societal values and principles determine system goals and trade-offs |
 | • Governance and empowerment influence relationship between values and explicit policies | • Health financing – levels & structure – sources, mechanisms as one of the issues to monitor
 | • Governance related to levels of literacy |
The elaboration of the framework | • Includes conceptual framework, purpose, dimensions, sub-dimensions, and indicators | • Highlighting linkages and accountability relationships to facilitate attribution
 | • Dimensions and sub-dimensions should reflect linkages between different functions and elements of the system |
 | • Indicators – may require some flexibility & dynamism to allow for learning and ownership |
 | • Choice of indicators determined by perceived importance, scientific soundness, and feasibility |
Institutional set-up | • Appropriate institutional set-up, with linkages to other entities, champions, & provision of resources (infrastructural, financial, human) | • Information management system requirements should consider feasibility & costs versus benefits
 | • Regular and systematic application |
 | • Should be usable at lower levels for self-assessment |
Mechanisms for change | • Linking measurement of performance with changes in policy & management | • Packaging of information should consider types and needs of users
 | • Making comparisons across time, different levels, systems, and settings | • To consider negative/unintended effects of incentives, including on data quality and increasing inequity
 | • Analysis and use of complementary information from various sources |
 | • Incentives – financial, accreditation, recognition, name and shame |
Adaptability | • History of use over time and in different places and contexts |
Application of attributes
Performance assessment framework | Process of development & review | Health system framework | Policy, organizational, & societal context | Content of framework | Institutional set-up | Mechanism for change | Adaptability
---|---|---|---|---|---|---|---
Australia National Health Performance Framework NHPF | • Work on PAF since the 90s | • The Lalonde model, appreciating both the healthcare & non-healthcare determinants of health | • Healthcare intended to be universally accessible | • Purpose: provide structure for reporting at national level & for developing PI sets for lower levels | • Rationalized and converged previous efforts at PA including indicator definitions, data processes, and local needs | • Present information in performance reports and HCAs | • Adapted from CHIRII |
• Shared responsibility by federal & state governments for funding, regulation, & provision of services | |||||||
• Dimensions (2nd edition of NHPF): Effectiveness, responsiveness, accessibility, safety, continuity, efficiency, & sustainability | |||||||
• Led by national & state ministers & using technical experts | |||||||
• Dimensions: health status & outcomes, determinants of health, HS performance | • National & international comparison | ||||||
• Equity as key concern | |||||||
• Linkage with generic national bodies responsible for funding & PA | • Accreditation & professionalism | ||||||
• NHCAs outline goals & HS roles & responsibilities for government bodies | • Indicators emphasize: national standards, worthiness, relevancy, validity, reliability, priority (minority) groups, user understanding | ||||||
• NHPF developed in 2001 & reviewed in 2009 | |||||||
• Accountability & consumer participation
• Has been in use for more than 10 years – with review in 2009; | |||||||
• Involving a number of organizations: ACSQHC, COAG Reform Council, NHPAC, NHPC, NICS, National HCAs
• Quality of care initiatives | |||||||
• Epidemiological analysis linking inputs, processes, outputs, & outcomes | |||||||
• Learning process with adjustment of dimensions, indicators, & reporting given current priorities, data availability, & possibility of interpretation | |||||||
• Financial incentives for building capacity for quality & safety | |||||||
Canadian Health Indicator Framework CHIF | • Initiated in 1998, endorsed by First Ministers’ Meeting in 2000 | • Lalonde model – appreciating healthcare and non-healthcare determinants of health | • Federal, provincial, & territorial levels roles & responsibilities | • To provide governments, providers, & public with reliable, comparable data across entities & assist in its use & interpretation | • Integrated network of HIS initiatives & structures, across country & levels including CIHI, SC, HC, CCHSA, CMA, AIM | • Biennial National Report | • Has been in use, evolving over more than a decade
• Public (mainly) & private funding | |||||||
• Domains: acceptability, accessibility, appropriateness, competence, continuity, effectiveness, efficiency, safety | |||||||
• Built on previous work by CIHI and CCHSA | |||||||
• Defined up to 70 indicators | |||||||
• Various providers | |||||||
• Dimensions: health status, non-medical determinants, HS performance, community, & HS characteristics | • Provincial & regional governments link to plans & targets | ||||||
• Informed the development of frameworks for the OECD, Australia, & Netherlands | |||||||
• Periodic pan-Canadian surveys for consumer opinion
• Wide consultation at national, regional and local levels; | • Minority populations with equity concerns | ||||||
• Extensive use of evidence | |||||||
• Marked financial & logistical investment over the last decade through CHIRII | |||||||
• Benchmarking, CQI, Certification/Accreditation with professional bodies | |||||||
• Change in indicators given data availability & interest | |||||||
• Accountability, through making information available to the public
• Learning, innovation, sharing best practices | |||||||
• National Consensus Conferences on Indicators | |||||||
Ghana Holistic Assessment of Health System | • Developed by the MoH and discussed with sector stakeholders, first time at the April 2009 Health Summit | • Health in center of national development agenda | • The assessment relates to the Health Sector PoW & the GPRS, guided by National Health Policy & MDGs | • Provide balanced and transparent assessment of sector performance indicating factors that may have influenced performance and suggest corrective measures | • Carried out by MoH & stakeholders & external reviewers | • Presented in briefs and reports discussed at national and regional forums | • Has been used for 4 years, to be adjusted with development of new PoW |
• Data mostly from HMIS, surveys, and KIIs | |||||||
• Goals – child survival & RH, decreasing burden of disease, & health services availability & use | |||||||
• Dashboard approach, with 3-step process: assessment of indicators & milestones, assessment against goals & targets, & assessment of whole sector | |||||||
• Receives information from districts, regions, agencies, & MoH | |||||||
• Uses 22 out of 34 PoW indicators | |||||||
• Thematic areas: healthy lifestyle & environment, provision of health, RH and nutrition services, HS capacity development & governance & financing | |||||||
• Marked challenges in data availability and quality – sanctions proposed for those who do not submit data as required | |||||||
• Prizes proposed for good performers | |||||||
• High donor contribution to sector including through the Multi-donor Budget Support, MDBS | |||||||
• Decentralization, with geographical equity concerns | |||||||
Netherlands Dutch National Health System Performance Framework | • Consultative process between MoH & RIVM, & researchers over period 2002–2005 | • Lalonde model for health determinants & Balanced Score Card (BSC) model of HSPA | • Transition from budget-driven healthcare system to regulated market | • Focus on technical healthcare quality, keeping other dimensions in sight | • Close working relationship between MoH & RIVM & researchers for ownership, & evidence base | • To provide evidence to make appropriate policy decisions | • Adapted from experiences in Canada (Lalonde model); and UK, US and Dutch healthcare organizations (BSC model) |
• Not really designed to link information with management strategy | |||||||
• Used evidence in form of frameworks from elsewhere, consideration of roles of MoH & other stakeholders, & existing information infrastructure | |||||||
• Interface of Lalonde model & BSC is the consumer, relating population health & health management | • Emphasis on transparency & results oriented management | ||||||
• Linked existing databases; created new cost-effective sources of data as required | |||||||
• BSC – consumer, financial, internal business processes & innovation perspectives | • Adapted in Ontario & for OECD’s HCQI Project | |
• BSC model adapted to a non-corporate, market-oriented entity | |||||||
• Indicators selected in line with core questions posed on each perspective | |||||||
• Compares healthcare performance with healthcare needs | |||||||
South Africa District Health Barometer SA DHB | • Developed by the Health Systems Trust (HST), a non-governmental organization in consultation with DoH | • Equitable access to good healthcare as a major goal of the health system | • Decentralized, with bulk of primary health care services funded by government | • To monitor progress & support improvement of equitable provision of PHC | • Housed by HST a private entity with research & HSPA skills, working in close consultation with DoH | • Annual reports with tables, graphs and maps comparing all districts and within metro and rural districts; | • Has been in place with annual publications since 2005 |
• Adjustments made with improving data availability and quality and perceived needs for information | |||||||
• Post-apartheid inequality in access to healthcare | • Equity analysis | |
• Research and consultation with experts | |||||||
• Use of evidence | |||||||
• Information to policy makers and managers at national, provincial & district levels & public domain, including academic/research institutions
• Indicators: socioeconomic, input, process, output, outcome & impact, related to MDGs | • Uses secondary data from various government institutions | ||||||
• Geographical equity a major issue | |||||||
• Poor health information systems and quality of data cited | |||||||
• For comparison of all provinces & districts and within the categories of rural and metropolitan districts; | |||||||
• Equity as a major focus; | |||||||
• Trends studied | |||||||
World Health Organization Health System Performance Assessment Framework | • Developed by WHO technocrats, with wide stakeholder involvement only after the World Health Assembly of 2000 and marked criticism | • WHO introduced a number of concepts about a HS, including health actions, boundaries, goals, functions, and building blocks | • Intended as a tool for use by all member states and therefore meant to be generic and usable for assessment in widely varying contexts across the globe | • For the purpose of helping member states to measure their own performance, understand the factors behind it, and improve response | • Global and national support for HSPA, including establishment of EHSPI | • Presents information on member states in the World Health Report in league tables and plots | • Has been in place since 2000, with substantial consultations following its launch; some adjustments have been made, including dropping the composite goal performance index and elaboration of specific methodologies
• Utilise DALYs and DALEs as measures of overall population health; | |||||||
• Development of tools and approaches for data collection and analysis | |||||||
• Extensive use of evidence | |||||||
• Computation of indicator of composite goal performance in 2000. | |||||||
• Main (extrinsic) Goals indicated as: improving population health, responsiveness, & fair financial contribution | |||||||
• Assessment of 5 components of the HS using a number of indicators: population health level and distribution; responsiveness level and distribution; distribution of financial burden; | |||||||
• Use of WHO regional groupings, research institutions and international organizations for consultation; | |||||||
• Relates DALEs to health systems’ potential given country/health system resources | • Has been adapted and used for subnational assessments and also adapted for use by Health Systems 20/20 in several countries | |
• Benchmarking and competition | |||||||
• Public reporting & accountability | |||||||
• Highlighting stewardship as important for system design, performance assessment, priority setting, inter-sectoral advocacy, rule setting, and consumer advocacy |
Discussion
Attribute | Lesson | Identified gaps/Areas for further research
---|---|---
Process of development | • It is possible and useful to involve a range of stakeholders in the development and review of an HSPA framework, as done in Canada, and to involve researchers, as in the Netherlands | • Limited involvement of beneficiaries of health systems
 | • A private entity can act as lead agency, as seen in South Africa |
Clarity of HS conceptual model | • Explicit HS conceptual models facilitate relating the HSPA framework to the HS model; e.g., the WHO HS model was developed just prior to developing the HSPA framework | • In the absence of explicit HS models, it is difficult to determine system goals and whether the right things are being measured
 | • An explicit HS model, coupled with clarity in partitioning the HS for PA, highlights linkages and enables attribution – the Netherlands framework provides an example, with the Lalonde HS model and the Balanced Score Card for the HSPA framework | • Lack of delineation between HS and healthcare systems poses challenges for HSPA
 | | • The contribution of healthcare to health is often difficult to estimate, and responsibility for delivery and reporting on non-healthcare determinants is challenging
Relating to policy and organizational context | • Variations in context are reflected in the HSPA frameworks; two diverse examples provide different lessons for countries intending to develop HSPA frameworks: | • The effect of governance and various aspects of empowerment on HSPA, and their relationship to literacy, are not well documented
 | ○ Canada – very contextualized |
 | ○ WHO – intended to support HSPA in member countries and thus fashioned generically |
Elaboration | • Similarities noted between HICs, and differences between HICs and L/MICs, at the level of dimensions and lower-level indicators, with HICs emphasizing service- and provider-specific indicators & L/MICs emphasizing population-based indicators | • There are still challenges in relating the different pieces of data in most frameworks to tell a story and to determine what is not working well
Institutional set-up | • Canada and WHO made substantial investments in HSPA, including methodological aspects and technology for data collection, analysis, and dissemination, which have yielded results | • What is the right balance – how much should be invested in LICs given competing obligations?
 | • Ghana & South Africa demonstrate that you can start simple & build useful systems for HSPA |
 | • Champions for HSPA have been noted to have made an impact in Australia and Canada (ministers of health) and the Netherlands (researchers) |
Mechanism for eliciting change in the HS | • Working with various pieces of information from different sources validates, enriches, and supports interpretation for decision-making | • There is still limited information on what works in eliciting change in the HS using HSPA; more research needs to be done in specific contexts to learn more about this
 | • Use of appropriate technology and strategies for analysis and dissemination helps provide information to more people, as seen in Canada | • There is not much noted in these experiences about unintended/negative consequences of HSPA
 | • A combination of mechanisms (internal & external) facilitates change, as seen in Canada |
 | • Combinations of stakeholder groups and skills (e.g., statisticians/researchers/policy makers/health managers/generic managers/professional bodies) facilitate decision-making – different combinations noted in Australia, Canada, and the Netherlands |
Adaptability | • HICs adapt HSPA frameworks from other HICs; LICs adapt them from international agencies | • Given the contextual differences and their implications for HSPA, case studies of HSPA frameworks are likely to provide further understanding of what works (or does not) and why
 | • The frameworks that have been in place longer have evolved/changed with circumstances to remain relevant |