Background
Malaria prevention and control interventions have recently undergone major scale-up in Africa, and malaria disease burden is reported to be declining in several countries [1], including Ethiopia and other East African countries [2–5]. However, there is complexity within countries, including large geographical variation in incidence and differing upward or downward trends between indicators, hospitals or areas [2,6,7]. Repeated representative nationwide malaria prevalence surveys are now becoming the norm, and there is a welcome emphasis on improving estimates of the impact of control measures on malaria mortality [8]. However, comprehensive longitudinal data sources measuring several malaria indicators monthly or weekly at multiple sites are relatively few, and more are needed [9,10].
While most countries have routine morbidity and mortality reporting systems, distrust of their quality for malaria surveillance is widespread [11,12] and may sometimes be justified. However, there are many examples of carefully assessed routine malaria surveillance data making essential contributions to understanding the malaria burden, how it varies over space and time, and the impact of control measures and climate on malaria [5,13–20]. A more frequent problem is the collection of large amounts of data that are never collated or used, with consequent degrading of reporting as health workers receive no feedback for their surveillance effort. There is little alternative to routine surveillance for understanding inter-annual and seasonal trends over sub-national areas, and since there are often multiple sources, triangulation between them can assess which ones are providing high-quality data as well as lend reassurance that observed trends are real.
Recognition of the importance of improving the quality of routine surveillance for infectious diseases [21] led to major input by WHO and CDC into strengthening and streamlining this effort through the Integrated Disease Surveillance and Response (IDSR) system [22], starting in the 1990s. The IDSR matrix originally included 19 diseases (pandemic influenza was added later) in three categories: 1) major endemic diseases of public health importance; 2) diseases targeted for eradication and elimination; and 3) epidemic-prone diseases. Malaria comes under the first category in many countries but may fall in the third category in less endemic countries such as Ethiopia and Eritrea.
Some important features of IDSR and its training materials were the use of standard case definitions, the establishment of action thresholds for epidemics, the use of information in response, provision of feedback, and continuous evaluation [22]. Evaluations of IDSR systems have been conducted [21,23], but there has been no detailed use and evaluation of IDSR for monitoring trends in malaria in Ethiopia.
The total estimated population of Ethiopia in 2007 was 73,918,505 [24]. Administrative levels of Ethiopia consist of regional states, usually divided into zones, which have populations (in 2007) averaging 785,000 and ranging from 35,000 to 2,970,000. There are some ‘special districts’, administratively equivalent to zones. The next levels are woredas (districts), with populations averaging 140,000 and ranging from 8,000 to 340,000, followed by kebeles (the lowest government administrative unit, divided into villages or ‘development teams’). Kebeles have at least 900 families, or about 4,500 to 5,000 persons. The number of hospitals per region varies, with at least one in each region. Zones may or may not have a hospital, but have at least one health centre. Each woreda usually has one or more health centres and a number of health posts or stations in more rural areas. Starting in 2005, the health station category was phased out, and each kebele is now intended to have at least one health post staffed by two Health Extension Workers, reporting to the cluster health centre. The national health management information system has recently undergone major reorganization. Each health facility reports quarterly on morbidity, mortality, and health resource and preventive indicators through that system.
Ethiopia implemented IDSR reporting from all hospitals and health centres using a one-page form (see Additional file 1). Most diseases were reported on the monthly form, but certain high-priority indicators were to be reported immediately. Health stations and health posts were not formally included in the initial phase. While IDSR data exist on paper at the district level, they were available in electronic form at zone and Regional Health Bureau level in an Epi Info database entered in Addis Ababa, formerly under the Disease Prevention and Control Department, but now at the Public Health Emergency Management (PHEM) center at the Ethiopian Health and Nutrition Research Institute.
This paper presents and assesses national malaria surveillance data from the IDSR system at the zone (reporting unit) level in Ethiopia over the five-year period July 2004 to June 2009. The goals are: 1) to investigate the completeness of reporting for malaria indicators in the IDSR system; 2) to stratify malaria risk by area of the country (zone); and 3) to contribute to the information on whether and how the malaria burden in the country changed over the time period, which coincided with major scale-up of malaria control efforts in the country. After this time period (in 2009), the reporting format changed, health posts were included in reporting, and the reporting interval changed to weekly during a major Ministry of Health reorganization.
Methods
Malaria indicators in IDSR
The IDSR reporting form (see Additional file 1) includes 17 malaria items encompassing total malaria cases (clinical and confirmed) for out-patients, in-patients and deaths; confirmed out-patient malaria cases by species; in-patient cases and deaths for malaria with severe anaemia; and out-patients, in-patients and deaths for malaria in pregnancy. All are given by age group (<5 years and ≥5 years) except the malaria in pregnancy indicators, which only apply to women of childbearing age.
The Epi Info (Windows) IDSR database was imported into Access 2003 and expanded to include all eligible zone-months. Several variables were renamed for clarity, and some summary malaria indicators were created in Access 2007, such as combining different age groups and estimating total confirmed malaria by summing cases of both species (see Additional file 2).
Incidence estimates for summary malaria indicators by year and month were obtained by summing the appropriate indicator over the time period in question and using the relevant population denominator for each reporting unit (see Additional file 3).
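As a minimal illustration of this calculation (not the authors' Access queries; the case counts, population figure, and function name below are invented), the annual incidence per 1,000 persons for one reporting unit could be computed as:

```python
# Illustrative sketch only: annual incidence per 1,000 persons for one
# reporting unit, from monthly case counts and a census population denominator.
# The counts and population below are invented examples.
def incidence_per_1000(monthly_cases, population):
    """Sum reported cases over the period and scale by the population."""
    total = sum(c for c in monthly_cases if c is not None)  # skip missing months
    return 1000.0 * total / population

# e.g. 350 cases over 12 months (one missing) in a unit of 140,000 people:
print(incidence_per_1000([30, 25, None, 40, 55, 60, 35, 20, 15, 25, 20, 25],
                         140_000))  # 2.5 per 1,000 per year
```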
Reporting time periods, sites, units and population denominators
The Ethiopian Calendar (EC) is used in the country, rather than the Gregorian (Western) calendar (GC). There are 13 months in the EC: 12 of 30 days each and a short remaining month. The EC year runs from the equivalent of GC September to August, with a 7 to 8 year lag behind the GC, whereas the reporting year for the Ministry of Health is GC July to June. The time period of this report is July 2004 to June 2009, which, allowing for the lag, is equivalent to EC reporting years 1997 to 2001.
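The correspondence between Ministry of Health reporting years and their EC labels, as stated above, can be sketched as follows; this is only an illustration of that fixed offset (the July start always lags the EC label by seven years), not a general EC/GC calendar converter:

```python
# Sketch of the reporting-year correspondence described above: a Ministry of
# Health reporting year starts in GC July and carries the EC label of seven
# years earlier (July 2004 - June 2005 -> EC 1997). Not a full EC/GC converter,
# which would also need to handle the September new year and leap days.
def ec_reporting_year(gc_start_year):
    """EC label for the reporting year beginning in GC July of gc_start_year."""
    return gc_start_year - 7

print(ec_reporting_year(2004))  # 1997, first year of this study
print(ec_reporting_year(2008))  # 2001, last year (July 2008 - June 2009)
```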
In 2004 (EC 1997), there were 87 original IDSR reporting units entered in the database at national level. Reporting units are mostly zones, but also include special woredas (districts), sub-cities, towns and referral hospitals that report separately. For convenience, these are referred to as ‘reporting units’ or just ‘units’ rather than zones. The number of reporting units gradually increased to 108 by 2008–2009 (EC 2001) through the splitting of zones and the addition of referral hospitals and towns; the strategy to adjust for this change is described below. In July 2007, five referral hospitals in Addis Ababa were added, five zone splits occurred, and 10 towns were split from their local zones and added as separate reporting units. One more referral hospital in Addis Ababa was added in July 2008. One zone in Somali region started reporting in July 2005 (one year late). Nine zones in Somali region that stopped reporting after June or December 2007 were excluded from the count of ‘eligible months’ (reporting denominator) for the appropriate periods.
Reporting completeness was assessed based on the information available for the 108 units, after allocating eligible reporting time periods to each unit. Each monthly report showed the number of sites ‘expected to report’, which was taken as the number of eligible sites, totaled over each reporting unit.
For assessing malaria incidence, the 108 units reported by 2009 were collapsed into the original 87 units of 2004 to ensure consistent population denominators, estimated from the 2007 census. One of the original 87 units (Gambella Special Woreda) did not have a separately reported population, so it was combined with its surrounding area, giving 86 units. Recently added referral hospitals and newly demarcated towns were assigned to their original local zones for malaria incidence estimates. Split zones were reunited with their original zone as in 2004 (EC 1997) (see Additional file 3).
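The collapsing step might be sketched as below; the unit names and counts are invented, and the actual work was done in the Access database rather than in code like this:

```python
# Minimal sketch of re-combining later reporting units into their 2004 parent
# units so that 2007 census denominators remain valid. Unit names are invented.
from collections import defaultdict

def collapse_units(counts_by_unit, parent_of):
    """Aggregate {unit: cases} into {original 2004 unit: cases}."""
    collapsed = defaultdict(int)
    for unit, cases in counts_by_unit.items():
        # Units with no recorded parent (unchanged since 2004) map to themselves.
        collapsed[parent_of.get(unit, unit)] += cases
    return dict(collapsed)

# A referral hospital and a newly separated town fold back into their zone:
counts = {"Zone A": 120, "Zone A Referral Hospital": 40, "Town X": 15}
parents = {"Zone A Referral Hospital": "Zone A", "Town X": "Zone A"}
print(collapse_units(counts, parents))  # {'Zone A': 175}
```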
Most data were obtained in digital form from the database in the Public Health Emergency Management (PHEM) center, Addis Ababa; missing month reports from Amhara region that had never been submitted or entered were obtained by visiting the Regional Health Bureau in Bahir Dar in early 2010.
Data quality
Factors affecting the quality of IDSR data for reflecting disease trends include accuracy of diagnosis according to the case definitions, accuracy of data recording and entry, completion and correct summarizing of the monthly forms at all levels, submission of the forms to the next higher level, timeliness of reporting, and consistency in reporting over time.
While case definitions for the reporting form were clear and had been the subject of training, diagnosis is carried out by hundreds of staff members at health centres and hospitals, and its accuracy therefore cannot be assessed here. If accuracy of diagnosis remains relatively constant over time, it should not introduce bias. Occasional outliers or missing items within a unit-month, due to either reporting or data entry errors (unresolvable at this time), were observed. Clear outliers or inconsistencies, such as the number of confirmed cases of a disease being greater than the total number of cases, or cases of one species exceeding the total number of cases of both species, were re-coded to ‘missing’. If reports existed but were not submitted from woreda to zone level, or were received but never entered, this was reflected in the database as a completely missing unit-month report.
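A sketch of such consistency checks is given below; the field names are hypothetical, and which value to re-code in a given inconsistent record was a case-by-case judgement in the original analysis:

```python
# Hypothetical sketch of the logical consistency checks described above:
# impossible values (confirmed cases > total cases, or one species > the
# confirmed total of both species) are re-coded to missing (None).
# Field names are invented, not those of the IDSR database.
def recode_inconsistent(rec):
    """Return a copy of the record with impossible values set to missing."""
    r = dict(rec)
    def exceeds(a, b):  # True when field a is present and larger than field b
        return r[a] is not None and r[b] is not None and r[a] > r[b]
    if exceeds("confirmed", "total_cases"):
        r["confirmed"] = None
    if exceeds("p_falciparum", "confirmed") or exceeds("p_vivax", "confirmed"):
        r["p_falciparum"] = r["p_vivax"] = None
    return r

rec = {"total_cases": 100, "confirmed": 120, "p_falciparum": 50, "p_vivax": 30}
print(recode_inconsistent(rec)["confirmed"])  # None: confirmed exceeded total
```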
Data timeliness, completeness and consistency
While timeliness of reports is vital for detecting and responding to epidemics, it is not evaluated here since the focus is on completeness of reporting to observe relative incidence between zones as well as trends over a five-year period. The numbers of sites (health facilities) reporting on time and those reporting late were therefore combined to arrive at an overall proportion of sites reporting for each unit-month.
The first focus of this evaluation is reporting completeness and consistency based on the expected number of eligible sites, reporting units and months. Increases in the numbers of both eligible sites and reporting units (due to reconfiguration of boundaries) affected reporting consistency. One would expect a slow increase in the number of eligible sites as new health centres and hospitals were built (and such an increase was observed), but since the population served remains relatively constant, this increase in sites would not necessarily bias the disease trend unless access to care increased greatly. Boundary reconfiguration was addressed in the compilation of the incidence data by re-combining newly defined areas (e.g. split zones or towns reporting separately) with their previous reporting units for assessing disease trends, depending on availability of population denominators. Thus 108 reporting units were compressed into 86. Change in reporting completeness over time is the factor most likely to bias disease trends.
Since the database initially did not contain any zone-months lacking reports, a full list of eligible months for each year and reporting unit was first generated. The completeness of reporting was calculated as the product of the following two proportions:
1) percentage of unit-month reports: for each zone (reporting unit), the number of months with any report divided by the number of eligible months;
2) percentage of site reports: for each unit-month, the proportion of reports from eligible sites (health centres or hospitals) that were received at the reporting unit level and included in their reports.
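The two-level completeness measure defined above reduces to a simple product of proportions; the figures in the example below are invented:

```python
# Sketch of the completeness measure defined above: the proportion of eligible
# unit-months with any report, multiplied by the proportion of eligible site
# reports received within those unit-months. Example figures are invented.
def completeness(months_reported, months_eligible, site_reports, sites_eligible):
    unit_month_prop = months_reported / months_eligible
    site_prop = site_reports / sites_eligible
    return unit_month_prop * site_prop

# e.g. 11 of 12 eligible months reported, 180 of 200 eligible site reports:
print(round(completeness(11, 12, 180, 200), 3))  # 0.825
```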
Discussion
This study carefully examined the completeness of reporting of malaria indicators in the IDSR system in Ethiopia between 2004 and 2009, and concluded that, at over 80%, it was of sufficient quality to provide estimates of malaria incidence by reporting unit (usually zone) and month until the end of 2008. The results suggest a marked decline in the numbers of malaria and malaria-related in-patients and deaths over the period, perhaps as a result of the scale-up of interventions in 2007, although little drop in out-patient case numbers was observed.
It must be remembered that the IDSR system in 2004–2009 expected reports from health centres and hospitals only. During the last few years, there has been a large expansion in the number of health posts in the country [25], which diagnose and treat an increasing proportion of out-patient malaria cases. This could bias the number of out-patient cases seen at higher-level facilities downwards, although this factor may be counteracted if health post cases are reported through their supervisory health centre. However, health posts do not accept in-patients. It is possible that expansion of access to diagnosis and treatment through Health Extension Workers contributed to earlier care-seeking and more effective treatment, with a consequent reduced incidence of severe malaria cases and mortality at health centres and hospitals. Since 2004, reports of outbreaks or epidemics due to malaria in the study zones of the country have also been very low.
A limitation of the data set was that it was not possible to independently verify the accuracy of the reported number of eligible sites in each reporting unit, and under-reporting of the number of such sites would overestimate the completeness of eligible site reporting. However, examination of time trends did not indicate any significant changes over time that would bias the results.
Incidence of total malaria out-patients (clinical and confirmed malaria) in the overall population averaged 23.4 per 1,000 persons per year over 2005 to 2008, about threefold the incidence of confirmed malaria at 7.6 per 1,000 per year. Both indicators should be taken into account in stratification of risk between zones (reporting units), since the true incidence probably lies between the two. Differences in the ranking of units by the level of risk observed and by predicted absolute number of total and confirmed malaria cases (Figures 8, 9, 10, and 11) emphasize the need to consider more than one malaria indicator (including in-patient admissions) in stratification schemes and in predicting needs for preventive interventions and drugs. It would also help in assessing the effectiveness of malaria control if intervention planning were conducted in the same geographical units as the incidence estimates.
A separate source of malaria incidence estimates over the period of this study is the Health Management Information System (HMIS) of the Federal Democratic Republic of Ethiopia, in which the estimated annual incidence of reported malaria for the whole country was 107 per 1,000 in 2004 and 55 per 1,000 in 2009. Severe malaria cases reported through the HMIS declined from 148 per 100,000 in 2004 to 54 per 100,000 in 2009. Thus a steeper decline in severe cases than in out-patient cases was also observed in the national statistics reported through a different system, although incidences of overall reported cases and of severe cases were higher in the national HMIS data than in the IDSR reports. This is expected, given that the IDSR data compiled here are only from health centres and hospitals rather than all health facilities, and thus do not capture the cases treated at the large network of health posts and at private facilities. The value of the present dataset lies in showing trends and comparisons between zones/reporting units over time and in stratifying risk by reporting unit.
The IDSR dataset is a rich source of information for stratification of malaria risk by zone in Ethiopia, using reported empirical data rather than climate predictions. It is known that not all of Ethiopia is at malaria risk, since some areas are too high and others too dry for transmission [26]. However, it was not possible to adjust the incidence estimates for reporting units by a standard factor of population at malaria risk, because the proportion of each zone or reporting unit at risk was highly variable. The overall incidence by reporting unit is affected both by the proportion of the zone at risk and by the intensity of transmission in the area at risk. Incidence defined by sub-zonal units such as woredas would be highly valuable in future.
While the pattern of incidence stratification shown by these empirical data fits reasonably well with maps predicted from climate data [27–29], reviewed in [30], the data suggest that some areas of the country, such as Afar and Beneshangul Gumuz regions, have higher incidence than generally realized and deserve increased control programme attention. In arid regions where incidence is higher than expected, irrigation projects may be influencing malaria transmission.
It is also important to note that patients do not always seek treatment in their zone of residence. In most zones, boundary crossing may occur in both directions and hence is unlikely to introduce systematic bias, but it may inflate estimates in urban areas. For example, those living in the surrounding areas of towns like Harar or Dire Dawa may cross the reporting unit boundary and inflate the incidence estimates in such towns. While this may overestimate the risk of malaria within the town, it is nevertheless a useful estimate of need for services in those places. Without detailed studies of patients’ residence, this must be accepted as a limitation of the data.
Prediction of the absolute numbers of cases expected, generated here by reporting unit from the IDSR data, confirms the important contribution of highly populated zones, such as those in Eastern Amhara Regional State and the rift valley of Oromia and SNNPR, to the large numbers of cases reported in Ethiopia annually [26]. These large regions and zones tend to be the ones already given the most attention by the malaria control programme, but the current study may assist in refining the programme’s efforts. It would obviously be ideal to generate risk maps by woreda rather than zone, but even at zone level clear seasonal and inter-annual patterns are apparent and can be used to establish thresholds for defining future epidemics. In general, using both absolute case numbers and incidence when comparing between zones and regions will improve the ability of Regional Health Bureaus and the FMOH to plan resources appropriately, improve targeting of malaria control efforts, and allow better evaluation of the programme.
Since these data were reported, a new weekly system has been instituted, in which the malaria indicators have been condensed and reconciled with HMIS indicators to reduce workload for health staff in filling multiple forms. All health facilities down to health posts are now required to report. The number of persons tested for malaria confirmation (by slide or rapid test) has been included as well as total malaria and confirmed malaria cases to provide a denominator for test positivity rate. The change to the new system occurred during extensive business process restructuring in the Ministry of Health and was not implemented across all the states at once.
It remains to be seen whether these low rates, in many zones and for the country overall, remained at such levels in 2010 and 2011. Inclusion of climate information over this time period [31] and extension of the dataset to more years are needed to clarify the role of control measures compared with natural cycles. It is possible that the declines seen merely represent the downside of an epidemic cycle, but they may be real and sustainable, due to improved health system delivery, better malaria prevention (LLINs and IRS) and use of effective drugs. Given that Ethiopia is considering malaria elimination [32], it is useful to note that between 2004 and 2009, only 8 reporting units had an average annual estimated incidence of confirmed malaria (reported to IDSR) above 20 per 1,000 persons, while 26 units (excluding Somali region) were consistently below five reported cases per 1,000 persons per year. Elimination, at least from selected zones, thus appears a possibility, while attention is given to reducing incidence further in the other zones.
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
DJ, MW, AAl, AT, NA, and AAd set up and managed the IDSR database; DJ, WD, ZT, TG, FOR and PG devised data management and analysis plan and did data analysis; AWM did the mapping; PG and DJ wrote the draft manuscript with extensive input from TG, FOR, AT, MW, AAl and NA; all authors commented on the draft manuscript and approved the final version.