Background
Methods
Literature sources and search strategy
- [“health information system” OR “health surveillance” OR “health information network”] + “evaluation guidelines” + [methods OR tools]
- [“health information system” OR “health surveillance” OR “health information network”] + “evaluation framework” + [methods OR tools]
- [“health information system” OR “health surveillance” OR “health information network”] + “assessment guidelines” + [methods OR tools]
- [“health information system” OR “health surveillance” OR “health information network”] + “assessment framework” + [methods OR tools]
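The four search strings share a fixed structure: a single block of system terms, crossed with every combination of {evaluation, assessment} and {guidelines, framework}. A minimal Python sketch (illustrative only, not part of the review protocol) generates them:

```python
# Illustrative sketch: rebuild the four search strings from their
# combinatorial pattern (2 verbs x 2 noun phrases). Not from the paper.
from itertools import product

system_block = ('["health information system" OR "health surveillance" '
                'OR "health information network"]')
method_block = "[methods OR tools]"

queries = [
    f'{system_block} + "{verb} {noun}" + {method_block}'
    for verb, noun in product(("evaluation", "assessment"),
                              ("guidelines", "framework"))
]

for q in queries:
    print(q)
```

This makes the symmetry of the strategy explicit: any additional verb or noun phrase extends the search combinatorially rather than by hand.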
Study selection and data extraction
Classification of the approaches
- A method provides information about how to accomplish an end; it is a regular and systematic way of accomplishing something [12].
Results
Field of application and category of evaluation approaches
| References | Approach category (author’s definition) | Approach category (reviewed definition*) | Surveillance field | Main objective | Objective(s) of the evaluation as stated in the document | Case study application |
|---|---|---|---|---|---|---|
| [5] | Framework | Framework, Guidelines, Method | PH^a | Evaluate performance and effectiveness | To assess the quality of the information provided; the effectiveness in supporting the objective(s) and informed decision-making; and the efficiency of SS | - |
| [18] | Tool | Method, Tool | PH^a | Design efficient surveillance systems | To help plan, organise and implement SS | Not described |
| [19] | Tool | Guidelines, Method, Tool | PH^a | Design efficient surveillance systems | To establish a baseline and to monitor progress | - |
| [20] | Guidelines | Framework, Guidelines, Method | PH^a | Evaluate performance and effectiveness | To establish and maintain effective and efficient surveillance and response systems | - |
| [21] | Framework | Guidelines | PH^a | Evaluate performance and effectiveness | To assess existing SS and identify areas which can be improved | - |
| [22] | Framework | Framework | PH^a | Evaluate performance and effectiveness | To evaluate whether SS attain their objectives, and to provide information for further development and improvement | Military surveillance systems for early detection of outbreaks in duty areas |
| [23] | Framework | Framework | PH^a | Evaluate performance and effectiveness | To provide objective, valid and reliable information for decisions on which surveillance activities and functions should be continued | - |
| [24] | Framework | Framework, Guidelines | PH^a | Evaluate performance and effectiveness | To establish the relative value of different approaches and to provide the information needed to improve their efficacy | - |
| [25] | Tool | Framework, Guidelines | PH^a | Evaluate performance and effectiveness | To assess whether the surveillance method appropriately addresses the disease/health issues, and whether the technical performance is adequate | - |
| [26] | Guidelines | Framework, Guidelines | PH^a | Evaluate performance and effectiveness | To define how well the system operates to meet its objective(s) and purpose | - |
| [27] | Tool | Method, Tool | AH^b | Evaluate performance and effectiveness | To propose recommendations for the improvement of SS | Implemented in France: surveillance network for antimicrobial resistance in pathogenic bacteria of animal origin (also mentioned but not described: early detection of FMD; case detection of rabies in bats; poultry disease surveillance network; and Salmonella laboratory surveillance network) |
| [28] | Framework | Framework, Guidelines, Method | AH^b | Evaluate performance and effectiveness | To support the detection of disparities in surveillance and support decisions on refining SS design | Implemented in the UK: demonstration of freedom from Brucella melitensis; early detection of CSF; and case detection of TB |
| [29] | Method | Guidelines, Method | AH^b | Evaluate performance and effectiveness | To contribute to improving the management of epidemiological animal health SS | Implemented in France: evolution of mycoplasmosis and salmonellosis rates in poultry (RENESA network); and the FMD surveillance network in cattle |
| [30] | Framework | Framework | EH^c | Evaluate performance and effectiveness | To make evidence-based decisions regarding the future selection, development and use of data | Environmental public health surveillance programs |
| [31] | Method | Guidelines, Method | PH^a & AH^b | Evaluate the completeness of the surveillance system in terms of core components | To evaluate the completeness and coherence of the concepts underlying a health surveillance program | National Integrated Enteric Pathogen Surveillance Program, Canada |
Approach development processes and case study applications
Objectives of the evaluation and description of the evaluation process
| References | Organisation | Steps | Practical evaluation elements: presence | Practical evaluation elements: absence |
|---|---|---|---|---|
| [5] | Structured roadmap | Context of the surveillance system; Evaluation questions; Process for data collection and management; Findings; Evaluation report; Following up | List of evaluation attributes (13); Definitions of evaluation attributes | No case study presentation; Lack of visual representation of the results; Lack of information about evaluator(s); Lack of methods and tools for the assessment (only general questions); Lack of attributes’ selection matrix |
| [18] | Structured roadmap; Worksheets (checklist) | - | Methods and tools for the assessment (questionnaire and worksheets); Visual representation of the results (bar and radar charts) | No case study presentation; Lack of information about evaluator(s); Lack of evaluation attributes; Lack of definitions of evaluation attributes; Lack of attributes’ selection matrix |
| [19] | Structured roadmap; Application guide | Resources assessment; Indicators; Data sources assessment; Data management assessment; Data quality assessment; Information dissemination and use | Methods and tools for the assessment (scoring guide); Visual representation of the results (graphs) | No case study presentation; Lack of information about evaluator(s); Lack of evaluation attributes; Lack of definitions of evaluation attributes; Lack of attributes’ selection matrix |
| [20] | Structured roadmap | Plan the evaluation; Prepare to evaluate; Conduct the evaluation; Dissemination and use of the results | List of evaluation attributes (10); Definitions of evaluation attributes | No case study presentation; Lack of visual representation of the results; Lack of information about evaluator(s); Lack of methods and tools for the assessment (only general questions); Lack of attributes’ selection matrix |
| [21] | Structured roadmap | Preparation for the evaluation; Documentation and evaluation of the surveillance system; Evaluation of the capacity of the surveillance system; Outcome of the evaluation | Type/knowledge of evaluator(s): Ministry of Health (national, provincial or district levels); List of evaluation attributes (8); Definitions of evaluation attributes | No case study presentation; Lack of visual representation of the results; Lack of methods and tools for the assessment (general questions); Lack of attributes’ selection matrix |
| [22] | General roadmap | Initial evaluation; Intermediate evaluation; Final evaluation | List of evaluation attributes (16); Definitions of evaluation attributes | No case study presentation; Lack of visual representation of the results; Lack of information about evaluator(s); Lack of methods and tools for the assessment; Lack of attributes’ selection matrix |
| [23] | General roadmap | Usefulness of the activities and outputs; Technical performance; Fulfilment of contract objectives | Type/knowledge of evaluator(s): three to four evaluators (5 years of expertise in surveillance on communicable diseases for the team leader, plus a laboratory expert and an expert in epidemiology); List of evaluation attributes (7) | No case study presentation; Lack of visual representation of the results; Lack of definitions of evaluation attributes; Lack of methods and tools for the assessment; Lack of attributes’ selection matrix |
| [24] | General roadmap | System description; Outbreak detection; System experience; Conclusions and recommendations | List of evaluation attributes (9); Definitions of evaluation attributes | No case study presentation; Lack of visual representation of the results; Lack of information about evaluator(s); Lack of methods and tools for the assessment (general questions); Lack of attributes’ selection matrix |
| [25] | Structured roadmap; Questionnaire | Usefulness of the operation; Quality of the outputs; Development of the national surveillance system; Technical performance; Structure and management | Type/knowledge of evaluator(s): experts in international surveillance on communicable diseases; List of evaluation attributes (6) | No case study presentation; Lack of visual representation of the results; Lack of definitions of evaluation attributes; Lack of methods and tools for the assessment (general questions); Lack of attributes’ selection matrix |
| [26] | General roadmap | Engage the stakeholders; Describe the surveillance system; Evaluation design; Performance of the surveillance system; Conclusions and recommendations; Findings and lessons learned | List of evaluation attributes (10); Definitions of evaluation attributes | No case study presentation; Lack of visual representation of the results; Lack of information about evaluator(s); Lack of methods and tools for the assessment (general questions); Lack of attributes’ selection matrix |
| [27] | Structured roadmap; Questionnaire; Scoring guide; Worksheets | Design the evaluation; Implement the evaluation; Finalisation | Case study presentation (cf. Table 1); Visual representation of the results through diagrams (pie charts, histogram, radar chart); Type/knowledge of evaluator(s): requires little knowledge and experience related to surveillance; List of evaluation attributes (10) and performance indicators; Methods and tools for the assessment (questionnaire, scoring guide and worksheets) | Lack of definitions of evaluation attributes; Lack of attributes’ selection matrix |
| [28] | Structured roadmap; Application guide | Scope of evaluation; Surveillance system characteristics; Design the evaluation; Conduct the evaluation; Report | Visual representation of the results through colour-coding (green, orange, red); Type/knowledge of evaluator(s): “Anyone familiar with epidemiological concepts and with a reasonable knowledge of the disease under surveillance”; List of evaluation attributes (22); Definitions of evaluation attributes; Attributes’ selection matrix | Lack of methods and tools for the assessment (only references provided) |
| [29] | Structured roadmap; Questionnaire; Scoring guide | Description of the surveillance system; Identification of the priority objectives; Building of dashboard and indicators; Implementation and follow-up; Updates and audit | Case study presentation (cf. Table 1); Provides performance indicators | Lack of visual representation of the results; Lack of information about evaluator(s); Lack of evaluation attributes; Lack of definitions of evaluation attributes; Lack of methods and tools for the assessment; Lack of attributes’ selection matrix |
| [30] | General roadmap | Priority setting; Scientific basis and relevance; Analytic soundness and feasibility; Interpretation and utility | Provides performance indicators | No case study presentation; Lack of visual representation of the results; Lack of information about evaluator(s); Lack of evaluation attributes; Lack of definitions of evaluation attributes; Lack of methods and tools for the assessment; Lack of attributes’ selection matrix |
| [31] | General roadmap | Text analysis; Program conceptual model; Comparison; Validation | Case study presentation (cf. Table 1) | Lack of visual representation of the results; Lack of information about evaluator(s); Lack of evaluation attributes; Lack of definitions of evaluation attributes; Lack of methods and tools for the assessment; Lack of attributes’ selection matrix |
Description of the assessment process: evaluation attributes
Comparison between approaches
| Practical elements | Usefulness |
|---|---|
| List of evaluation attributes to be assessed | Design the evaluation |
| Definitions of the evaluation attributes to be assessed | Design the evaluation |
| Case study presentation | Ease of applicability |
| Visual representation of the results | Ease of communication |
| Information about evaluator(s) (e.g. required expertise level) | Design the evaluation |
| List of methods and tools to assess the evaluation attributes targeted | Design the evaluation; Ease of applicability |
| Guide for the selection of relevant evaluation attributes | Design the evaluation; Ease of applicability |