QA programme implementation and data collection
Health facilities for the initial pilot were identified by QA officers, in consultation with supervisory laboratory staff, based on distance from each QA officer's primary duty location; QA officers did not administer the pilot QA programme in the facilities to which they were permanently assigned. The three health facilities generally closest to each QA officer's primary duty station that offered malaria diagnostic services were purposively chosen for QA programme support. Twenty-seven QA officers were assigned three health facilities each, and one was assigned two facilities because of the distances involved in a remote area; the facilities therefore formed a convenience sample based on travel time for the QA officers. Over the 7-month pilot period in 2013, QA officers made 1-day visits to each health facility in June, July, November and December, for a total of four visits per facility.
At the initiation of the programme, each QA officer was given seed commodities (i.e., slides, slide mailers, slide boxes, Giemsa and immersion oil) to distribute to the health facilities during their first QA visit. Each QA officer was expected to cross-check five negative and five weak-positive slides (i.e., ten total slides) for accuracy at each facility during every visit, in accordance with national and WHO guidance [5, 14]. Thick films were examined for the presence or absence of parasites; a minimum of 100 high-power magnification fields were examined before a slide was classified as negative [5, 14]. No thin films were collected or examined for parasite density or speciation as part of the QA pilot programme, and no slides were collected during the first visit. Because the QA programme was implemented in low-transmission counties, five weak-positive slides were not always available to cross-check; in such cases, the QA officer collected additional negative slides to ensure ten slides in total were checked.
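The slide-selection rule above (five negative and five weak-positive slides per visit, topped up with additional negatives when weak positives are scarce) can be sketched in code. This is an illustrative sketch only, not part of the published protocol; the function name and slide labels are hypothetical.

```python
# Illustrative sketch (not part of the published protocol): select slides for
# cross-checking per the rule described above -- five weak-positive and five
# negative slides, topping up with extra negatives when fewer than five weak
# positives are available, so that ten slides in total are checked.
def select_slides_for_crosscheck(negatives, weak_positives, target=10, per_group=5):
    """Return up to `target` slides: at most `per_group` weak positives,
    with the remainder drawn from the negative slides."""
    chosen_pos = weak_positives[:per_group]
    needed_neg = target - len(chosen_pos)
    chosen_neg = negatives[:needed_neg]
    return chosen_neg + chosen_pos

# Example: a low-transmission facility with only two weak-positive slides.
negatives = [f"neg-{i}" for i in range(1, 9)]   # eight negative slides on hand
weak_positives = ["pos-1", "pos-2"]             # only two weak positives
selected = select_slides_for_crosscheck(negatives, weak_positives)
assert len(selected) == 10                      # eight negatives + two weak positives
```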
Each QA officer completed a standardized checklist with 17 discrete sections covering the eight QA components during each visit (Additional file 1). The QA officers recorded data on specific indicators and processes via observation and structured questions. The data were used in real time to tailor the interventions the QA officer provided during the visit and were analysed to evaluate the QA pilot. Sections 1–4 covered basic health facility information, including staffing, training and infrastructure. Sections 5–7 covered the availability of laboratory equipment, supplies and consumables, including RDTs. QA officers observed and visually confirmed the presence or absence and count of laboratory equipment, recording the results on the checklist. QA officers also visually confirmed the presence of supplies and consumables and asked the supervisory laboratory officer whether the laboratory had experienced a stock-out of seven or more consecutive days during the previous 3 months that prevented the laboratory from performing malaria diagnostics. In addition, dates were recorded when available (e.g., the last maintenance date for microscopes).
Section 8 of the checklist covered malaria reference materials, which were visually verified as present or absent, and the location of each material was documented. Reference materials included the national malaria policy and national guidelines covering laboratory services, diagnosis and treatment, and quality assurance. Job aids and standard operating procedures (SOPs) were also verified. QA officers documented the availability of one SOP for RDT use and nine SOPs related to microscopy: (1) collection of blood samples, (2) preparation of blood films, (3) preparation of buffered water, (4) preparation of Giemsa, (5) preparation of Field stain, (6) staining of blood films, (7) examination of blood films, (8) slide selection for QA/QC, and (9) use, care and maintenance of microscopes. The SOPs for microscopy are described in detail in the national guidelines for parasitological diagnosis of malaria [14]. QA officers documented the availability of one job aid for RDT use and six related to microscopy: (1) malaria microscopy images, (2) sample collection, (3) smear preparation, (4) staining, (5) smear examination and reporting, and (6) slide selection and validation. QA officers additionally documented whether the job aids and SOPs had been updated in the previous 12 months.
Sections 9 and 10 covered internal and external QA practices. Six internal QA processes were evaluated as present or absent, either by observation or by asking the supervisory laboratory officer structured questions. The six processes were (1) batch testing of stain using positive-control slides, (2) pH meter calibration, (3) slide cross-checking, (4) QA process and results recording, (5) slide filing, and (6) slide storage. For external QA, QA officers asked the supervisory laboratory officer three yes-or-no questions related to external QA programme participation: (1) participation in a malaria-specific external QA programme, (2) participation in any external QA programme [e.g., the WHO Stepwise Laboratory Improvement Process Towards Accreditation (SLIPTA) programme], and (3) whether feedback had been received from an external QA programme. The name or affiliation of the programme conducting the external QA and the last validation dates were documented where available. Section 11 covered the laboratory turnaround time for both slide and RDT results; those results are not reported.
Sections 12–14 were observations of laboratory staff preparing patients and slides, staining and reading slides, and using RDTs. For slide preparation, nine discrete procedures were observed, each with between one and seven steps. For slide staining and reading, six discrete procedures were observed, each with between one and five steps. For RDT use, six discrete procedures were observed, each with between two and six steps.
The procedures and steps are described in the standardized malaria diagnostics QA checklist (Additional file 1). The QA officers observed the procedures and recorded the number of steps completed correctly by laboratory staff for each. However, the results are not reported because the same laboratory staff were not observed serially over the course of the QA pilot.
Section 15 documented six laboratory safety issues including the presence or absence of two laboratory safety SOPs: infection prevention (i.e., use of personal protective equipment [PPE]) and post-exposure prophylaxis. The QA officers also observed the presence or absence of the following safety practices in the laboratory: (1) availability of material safety data sheets, (2) use of PPE, (3) segregation and disposal of waste, and (4) container labelling. Sections 16 and 17 of the checklist were a summary of findings identified during the QA visit, recommendations and signatures.
Inaccuracies or deficiencies identified by the QA officer were immediately addressed through on-the-job training and mentoring of health-facility personnel during the visit. The findings and recommendations were also shared with the health-facility laboratory manager and head administrator at the completion of each visit. Laboratory managers and administrators were responsible for implementing corrective actions recommended by the QA officer. Completed checklists and cross-checked slides were sent to MDC with a written report that included a summary of findings and recommendations after each visit.
All ten slides cross-checked on the second, third and fourth visits were sent for review by expert microscopists. Because of the large number of slides generated, slides were distributed among three reference laboratories: MDC, the National Malaria Reference Laboratory, and the Kisumu County Vector Borne Disease Laboratory. Reference laboratory microscopists were certified through the WHO External Competency Assessment for Malaria Microscopy scheme. NMCP and MDC staff conducted one supportive supervision visit with QA officers to 30% of health facilities during the pilot phase.