
Specimen origin, type and testing laboratory are linked to longer turnaround times for HIV viral load testing in Malawi

  • Peter A. Minchella,

    klf9@cdc.gov

    Affiliation Laboratory Leadership Service assigned to Division of Global HIV and Tuberculosis, Centers for Disease Control and Prevention, Atlanta, Georgia, United States of America

  • Geoffrey Chipungu,

    Affiliation Division of Global HIV and Tuberculosis, Centers for Disease Control and Prevention, Lilongwe, Malawi

  • Andrea A. Kim,

    Affiliation Division of Global HIV and Tuberculosis, Centers for Disease Control and Prevention, Atlanta, Georgia, United States of America

  • Abdoulaye Sarr,

    Affiliation Division of Global HIV and Tuberculosis, Centers for Disease Control and Prevention, Lilongwe, Malawi

  • Hammad Ali,

    Affiliation Division of Global HIV and Tuberculosis, Centers for Disease Control and Prevention, Atlanta, Georgia, United States of America

  • Reuben Mwenda,

    Affiliation Ministry of Health, Lilongwe, Malawi

  • John N. Nkengasong,

    Affiliation Division of Global HIV and Tuberculosis, Centers for Disease Control and Prevention, Atlanta, Georgia, United States of America

  • Daniel Singer

    Affiliation Division of Global HIV and Tuberculosis, Centers for Disease Control and Prevention, Lilongwe, Malawi

Abstract

Background

Efforts to reach UNAIDS’ treatment and viral suppression targets have increased demand for viral load (VL) testing and strained existing laboratory networks, affecting turnaround time. Longer VL turnaround times delay both initiation of formal adherence counseling and switches to second-line therapy for persons failing treatment and contribute to poorer health outcomes.

Methods

We utilized descriptive statistics and logistic regression to analyze VL testing data collected in Malawi between January 2013 and March 2016. The primary outcomes assessed were greater-than-median pretest phase turnaround time (days elapsed from specimen collection to receipt at the laboratory) and greater-than-median test phase turnaround time (days from receipt to testing).

Results

The median number of days between specimen collection and testing increased 3-fold between 2013 (8 days, interquartile range (IQR) = 6–16) and 2015 (24 days, IQR = 13–39) (p<0.001). Multivariable analysis indicated that the odds of longer pretest phase turnaround time were significantly higher for specimen collection districts without laboratories capable of conducting viral load tests (adjusted odds ratio (aOR) = 5.16; 95% confidence interval (CI) = 5.04–5.27) as well as for Malawi’s Northern and Southern regions. Longer test phase turnaround time was significantly associated with use of dried blood spots instead of plasma (aOR = 2.30; 95% CI = 2.23–2.37) and with certain testing months and testing laboratories.

Conclusion

Increased turnaround time for VL testing appeared to be driven in part by categorical factors specific to the phase of turnaround time assessed. Given the implications of longer turnaround time and the global effort to scale up VL testing, addressing these factors by increasing efficiencies, improving quality management systems, and generally strengthening the VL spectrum should be considered an essential component of controlling the HIV epidemic.

Introduction

In an effort to control the HIV epidemic and meet the ambitious “90-90-90” targets set forth by the Joint United Nations Programme on HIV/AIDS (UNAIDS) in 2014 [1], low- and middle-income countries (LMICs) have rapidly expanded the number of individuals on antiretroviral therapy (ART). Globally, the number of people on ART increased by an estimated 2 million between the end of 2014 and the end of 2015, to more than 17 million [2]. These increases in ART coverage have intensified the need to scale up ART monitoring, for which the current gold standard and World Health Organization (WHO) recommendation [3, 4] is viral load (VL) testing.

Treatment monitoring via VL testing is the standard of care in developed countries and is preferred over its predecessor, immunological monitoring, because it enables earlier and more accurate detection of treatment failure [5, 6]. However, a 2012/2013 WHO survey of LMICs indicated that just 50% of individuals on ART have access to VL monitoring [7], and in many countries there are significant barriers to scaling up VL testing. Among the most prominent of those barriers are the cost and complexity of VL testing [8], which limit the number of testing sites and trained technicians in resource-constrained settings [9, 10] and leave aspects of testing, such as turnaround time, vulnerable [11]. Longer turnaround time delays initiation of treatment adherence counseling and/or switch to second-line ART in patients experiencing treatment failure. These delays negate an advantage of VL testing over immunological monitoring and lead to poorer health outcomes, including increased risk of opportunistic infections [12], prolonged immune activation [13], development of drug resistance [14], and increased mortality [15].

Given the links between delayed switch to second-line ART and poor health outcomes, an improved understanding of the factors that contribute to longer VL turnaround time, which may cause such delays, is needed. This is particularly important in the context of VL scale-up [16], which may exacerbate the effects of such factors. This study therefore examined VL turnaround time and the factors that affected it, using nationally representative VL testing data collected in Malawi between January 2013 and March 2016, a period of rapid scale-up of VL testing.

Methods

Setting

The data utilized in this study were extracted from the Malawi Laboratory Information Management System (LIMS), established in 2012 and designed to collect laboratory data on all HIV viral load and early infant diagnosis tests conducted in Malawi. The extraction covered VL tests conducted from January 1, 2013 to March 31, 2016.

VL testing in Malawi is initiated at ART clinics, where a VL laboratory requisition form is completed for each specimen collected. The form includes patient identifiers, demographic information, the type of specimen collected, and whether the requisition is for a routine or targeted VL test. Following completion of the form, the specimen is transported to one of Malawi’s nine molecular laboratories capable of conducting HIV VL testing. Upon receipt at the VL testing laboratory, data from the requisition form are entered into the LIMS and the specimen is placed in a testing queue. Following testing, the result is entered into the LIMS and a report is printed for delivery back to the referring clinic and patient. Data from the LIMS at the nine VL testing labs are routinely synced with a central server via dedicated internet connections. For the period during which data for this study were collected, Malawi’s VL testing guidelines were in line with WHO treatment recommendations [3, 4].
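To make the requisition-to-result data flow concrete, the following minimal sketch models a single LIMS-style specimen record in Python. The class and field names are hypothetical illustrations of the fields described above, not Malawi’s actual LIMS schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class VLSpecimenRecord:
    """Illustrative record for one viral load specimen (hypothetical fields)."""
    patient_id: str                # patient identifier from the requisition form
    collection_district: str       # district where the specimen was collected
    specimen_type: str             # "DBS" or "plasma"
    test_type: str                 # "routine" or "targeted" VL request
    collection_date: date          # date collected at the ART clinic
    receipt_date: Optional[date] = None        # date received at the molecular laboratory
    testing_date: Optional[date] = None        # date the VL test was run
    testing_lab: Optional[str] = None          # de-identified testing laboratory
    result_copies_per_ml: Optional[int] = None # result entered into the LIMS after testing
```

In this illustration, a record is created when the requisition form is entered at the testing laboratory and is completed as the receipt, testing, and result fields become available, before being synced to the central server.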

Measures

Afferent turnaround time was defined as the number of days elapsed between the date of specimen collection and the date of specimen testing (Fig 1). More precise measures within the afferent period were pretest phase turnaround time, defined as the number of days elapsed between specimen collection and receipt at the laboratory, and test phase turnaround time, defined as the number of days elapsed between receipt at the laboratory and specimen testing. Data on efferent turnaround time were not systematically collected in Malawi by the LIMS or any other means and thus were not included in the analysis. Since Malawi’s LIMS collects data on all specimens undergoing VL testing, regardless of missing or implausible dates, it was necessary to distinguish between specimens with valid and invalid dates in order to accurately calculate turnaround time. Valid specimens were defined as those with present and plausible collection, receipt, or testing dates, depending on the turnaround time being calculated. All other specimens were excluded from turnaround time calculations. Overall medians for each phase of turnaround time were calculated and used as cut-offs to define longer turnaround times. The primary outcomes used in regression analyses were longer (i.e., greater-than-median) afferent, pretest phase, and test phase turnaround times.
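The turnaround-time measures and validity rules above can be expressed compactly in code. The sketch below is a Python/pandas illustration only (the paper’s analyses were performed in SAS), and the column names collection_date, receipt_date, and testing_date are hypothetical.

```python
import pandas as pd

def add_turnaround_times(df: pd.DataFrame) -> pd.DataFrame:
    """Compute afferent, pretest phase, and test phase turnaround times in days.

    Assumes hypothetical datetime columns collection_date, receipt_date, and
    testing_date. A turnaround time is valid only when both of its dates are
    present and the interval is plausible (non-negative); invalid values are
    set to missing and excluded from the median cut-offs.
    """
    out = df.copy()
    out["pretest_days"] = (out["receipt_date"] - out["collection_date"]).dt.days
    out["test_days"] = (out["testing_date"] - out["receipt_date"]).dt.days
    out["afferent_days"] = (out["testing_date"] - out["collection_date"]).dt.days

    for col in ["pretest_days", "test_days", "afferent_days"]:
        out[col] = out[col].mask(out[col] < 0)   # implausible date order -> missing
        median = out[col].median()               # overall median used as the cut-off
        # Greater-than-median ("longer") turnaround time, the regression outcome;
        # specimens with missing or implausible dates stay missing rather than False.
        out[f"longer_{col}"] = (out[col] > median).where(out[col].notna())
    return out
```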

Fig 1. Turnaround time definitions and specimens used in turnaround time calculations.

Afferent turnaround time is defined as the number of days elapsed between specimen collection and specimen testing. Two phases are defined within afferent turnaround time: Pretest phase, the number of days elapsed between specimen collection and date of receipt at the laboratory; and Test phase, the number of days elapsed between specimen receipt at the laboratory and the date of testing. Efferent turnaround time, which was not assessed in this study, is defined as the number of days elapsed between the date of testing and the date the result is received by the patient. Of 243,539 viral load specimens tested between 2013 and March 2016: 219,121 specimens had valid dates of both specimen collection and testing and thus were included in calculations of afferent turnaround time, 207,645 specimens had valid dates of both specimen collection and receipt at the laboratory and thus were included in calculations of pretest phase turnaround time, and 214,786 specimens had valid dates of both receipt and testing and thus were included in calculations for test phase turnaround time. Valid dates were defined as those that were present and plausible (e.g., an implausible testing date would be one that fell prior to the specimen collection date).

https://doi.org/10.1371/journal.pone.0173009.g001

Factors included in the analyses were hypothesized a priori to be predictors of longer turnaround time. Because pretest phase and test phase turnaround times assess different parts of the VL spectrum, predictors differed by phase. For the pretest phase, predictors were region, presence of a molecular laboratory in the collection district, and collection month. Region is a geopolitical distinction defined by the Government of Malawi; presence of a molecular laboratory in the collection district distinguishes districts with and without laboratories capable of conducting viral load testing; and collection month is the month during which the specimen was collected. Analysis of collection month used only 2015 data in order to better understand recent temporal trends in turnaround time. Predictors for the test phase were sample type, testing laboratory, and receipt month. Sample type indicates whether the specimen was collected as a dried blood spot (DBS) or plasma; testing laboratory is a de-identified variable for the laboratory that conducted the VL test; and receipt month is the month during which the specimen was received at the testing laboratory. As with collection month, analyses of receipt month were restricted to 2015 data.
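Continuing the same illustrative Python sketch, the phase-specific predictors described above could be derived as follows. The district set and column names are hypothetical, since the paper does not identify the districts that host molecular laboratories.

```python
import pandas as pd

# Hypothetical placeholder for the districts hosting one of the nine molecular
# laboratories; the actual district names are not listed in the paper.
DISTRICTS_WITH_MOLECULAR_LAB = {"District A", "District B", "District C"}

def add_phase_predictors(df: pd.DataFrame) -> pd.DataFrame:
    """Derive pretest phase and test phase categorical predictors (illustrative)."""
    out = df.copy()

    # Pretest phase predictors: region is assumed to be recorded directly, while
    # the molecular-laboratory indicator and collection month are derived here.
    out["molecular_lab_in_district"] = out["collection_district"].isin(
        DISTRICTS_WITH_MOLECULAR_LAB
    )
    out["collection_month"] = out["collection_date"].dt.month

    # Test phase predictors: sample type and testing laboratory are assumed to
    # come straight from the LIMS extract; receipt month is derived.
    out["receipt_month"] = out["receipt_date"].dt.month

    # Month-level analyses were restricted to specimens from 2015.
    out["collected_in_2015"] = out["collection_date"].dt.year == 2015
    out["received_in_2015"] = out["receipt_date"].dt.year == 2015
    return out
```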

Data analyses

Frequencies were generated for categorical variables; means and standard deviations were generated for normally distributed continuous variables, and medians and interquartile ranges (IQR) for non-normally distributed continuous variables. Wilcoxon-Mann-Whitney tests were used to compare specimen collection volumes and turnaround times.
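As a hedged illustration of these descriptive comparisons (again in Python rather than the SAS actually used), medians with IQRs and a Wilcoxon-Mann-Whitney test could be computed as follows; the example arrays are made-up stand-ins, not the study data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def median_iqr(values):
    """Median and interquartile range for a non-normally distributed measure."""
    q1, med, q3 = np.nanpercentile(values, [25, 50, 75])
    return med, (q1, q3)

# Made-up example: afferent turnaround times (days) for two collection years.
tat_year_a = np.array([8, 6, 16, 9, 12, 7, 10])
tat_year_b = np.array([24, 13, 39, 30, 22, 18, 27])

stat, p_value = mannwhitneyu(tat_year_a, tat_year_b, alternative="two-sided")
print(median_iqr(tat_year_a), median_iqr(tat_year_b), p_value)
```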

To examine the relationship between turnaround time and the volume of specimens collected, a variable was created for all valid specimens equal to the total number of VL specimens collected nationally during the same month. That variable was standardized and assessed for univariate association with longer afferent turnaround time. A similar approach was used to assess the relationship between longer turnaround time and both the volume of specimens collected per district and the volume of tests conducted per testing laboratory. Univariate and multivariable logistic regression were employed to assess associations between phase-specific categorical factors and greater-than-median pretest phase and test phase turnaround times. Factors for both phases were assessed independently and also included in multivariable models. Associations with specimen collection and receipt months were restricted to data collected during 2015. All analyses were performed using SAS 9.3 (SAS Institute, Cary, NC).
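A minimal sketch of the regression approach is shown below, assuming the hypothetical columns built in the earlier sketches plus 0/1 outcome indicators; it uses Python’s statsmodels purely for illustration, whereas the reported models were fit in SAS 9.3.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_models(df: pd.DataFrame):
    """Illustrative univariate and multivariable logistic regressions."""
    data = df.dropna(subset=["longer_afferent_days", "longer_pretest_days"]).copy()
    data["longer_afferent_days"] = data["longer_afferent_days"].astype(int)
    data["longer_pretest_days"] = data["longer_pretest_days"].astype(int)

    # Standardized national monthly collection volume, assessed univariately
    # against the longer-afferent-turnaround-time outcome.
    month_key = data["collection_date"].dt.to_period("M")
    monthly_volume = month_key.map(month_key.value_counts())
    data["std_monthly_volume"] = (
        monthly_volume - monthly_volume.mean()
    ) / monthly_volume.std()
    volume_model = smf.logit(
        "longer_afferent_days ~ std_monthly_volume", data=data
    ).fit(disp=False)

    # Multivariable model with the pretest phase categorical predictors.
    pretest_model = smf.logit(
        "longer_pretest_days ~ C(region) + C(molecular_lab_in_district)",
        data=data,
    ).fit(disp=False)

    # Exponentiated coefficients give the (adjusted) odds ratios.
    print(np.exp(pretest_model.params))
    return volume_model, pretest_model
```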

Ethics

The protocol for this analysis was approved by the Centers for Disease Control and Prevention (CDC) Institutional Review Board (IRB) and the Malawi National Health Science Research Committee (NHSRC).

Results

Monthly volumes of VL specimens collected increased significantly from 2013 (median: 2,171 tests, IQR: 1,557–3,481) to 2014 (median: 5,622 tests, IQR: 4,276–6,112; p<0.001) and from 2014 to 2015 (median: 10,296 tests, IQR: 9,001–12,025; p = 0.002). This increase was paralleled by year-to-year increases in the number of clinics referring specimens for viral load testing and in median afferent turnaround time. Overall, the vast majority (97.6%) of VL tests were conducted on specimens referred for routine testing, which had a higher median turnaround time (21 days, IQR: 10–41) than specimens referred for targeted testing (10 days, IQR: 6–19). Most specimens (86.1%) had a VL result less than or equal to 1000 copies/ml (Table 1).

Table 1. Test characteristics for viral load specimens tested, 2013-March 2016.

https://doi.org/10.1371/journal.pone.0173009.t001

Nationally, while greater volumes of viral load specimens collected per month were associated with longer afferent turnaround time (odds ratio (OR) = 2.65, 95% confidence interval (CI) = 2.64–2.68), greater volumes of specimens collected per district (OR = 0.45, 95% CI = 0.45–0.46) and greater volumes of specimens tested per laboratory (OR = 0.65, 95% CI = 0.65–0.66) were not. Fig 2 shows more precise breakdowns of these relationships. In Fig 2A, median pretest phase turnaround times track monthly changes in national specimen collection volumes, but that is not the case for district specimen collection volumes arranged in ascending order (Fig 2B). Similarly, median test phase turnaround time trends alongside monthly changes in the national volume of specimens received (Fig 2C), but no such relationship is evident between test phase turnaround time and the volume of specimens received at laboratories (Fig 2D).

Fig 2. Volumes of viral load specimens collected/received/tested and corresponding median pretest phase and test phase turnaround times.

2(A) National number of specimens collected and referred for viral load testing by month and corresponding monthly median pretest phase turnaround time. 2(B) Number of specimens referred by district and corresponding median pretest phase turnaround time by district. 2(C) National number of specimens received at the laboratory for viral load testing by month and corresponding monthly median test phase turnaround time. 2(D) Number of specimens received per laboratory and corresponding median test phase turnaround time by laboratory. All pretest and test phase turnaround times were calculated using only specimens with valid dates.

https://doi.org/10.1371/journal.pone.0173009.g002

Categorical predictors of longer pretest phase turnaround time were assessed (Table 2). In univariate models, the odds of longer turnaround time were slightly increased for Malawi’s Northern and Southern regions compared to the Central region and substantially increased for districts without molecular laboratories compared to those with molecular laboratories. For 2015 specimen collection month, only February had significantly decreased odds of longer turnaround time compared to the reference month of January.

Table 2. Factors associated with longer pretest phase turnaround time for viral load specimens collected in Malawi, 2013-March 2016.

https://doi.org/10.1371/journal.pone.0173009.t002

Adjusting for multiple predictors in the same model affected the associations between longer pretest phase turnaround time and categorical factors. Compared to the univariate model, the odds of longer turnaround time for Malawi’s Northern and Southern regions increased in the adjusted model, while the odds of longer turnaround time for districts without molecular laboratories decreased slightly upon adjustment (Table 2). In the adjusted model for collection month, the odds of longer turnaround time were decreased for some months but increased for others relative to the univariate model, reflecting inter-month variability in factors’ influence on turnaround time.

Categorical predictors of longer test phase turnaround time were also assessed (Table 3). The univariate odds of longer turnaround time for DBS were increased compared to plasma. For testing laboratories and 2015 receipt month, some laboratories and months had increased univariate odds of longer turnaround time while others had decreased odds compared to their respective reference categories. The months of July, October, November and December stood out with particularly high odds of longer test phase turnaround time.

Table 3. Factors associated with longer test phase turnaround time for viral load specimens received by testing laboratories in Malawi, 2013-March 2016.

https://doi.org/10.1371/journal.pone.0173009.t003

In an adjusted model, the odds of longer turnaround time for DBS were decreased relative to the univariate model, though still significantly higher than for plasma. For testing laboratory, Table 3 shows that after adjustment, the odds of longer turnaround time increased for some laboratories, but decreased for others.

Discussion

This study shows that increased turnaround time for VL testing appeared to be driven in part by categorical factors specific to the phase of turnaround time assessed. Indeed, factors such as the presence of a molecular testing laboratory in the collection district, specimen type, and testing laboratory appeared to contribute to longer turnaround time, potentially putting patients at risk of poorer health outcomes by delaying adherence counseling and/or switch to second-line therapy. Shortening afferent turnaround time and ensuring that health outcomes are not affected by laboratory or operational delays will require identifying the specific causes of longer turnaround time and implementing measures to mitigate those causes in the future.

While turnaround times were lower in districts and laboratories that collected and received higher volumes of specimens (Fig 2B and 2D), suggesting that factors such as familiarity with workflow shorten turnaround time, a similar relationship between increasing specimen volumes and shortened turnaround time was not evident at the national scale (Fig 2A and 2C). These data, along with the overall increase in median afferent turnaround time between 2013 and 2015, support the assertion that factors contributing to shorter turnaround time were outweighed by factors contributing to longer turnaround time. Understanding what those factors are, how they influence turnaround time, and in which ways they can be addressed is an important step in the effort to shorten turnaround time.

HIV VL testing’s dependence on expensive equipment, dedicated laboratory space and highly trained technicians limits its accessibility for many ART patients in LMICs [9, 10]. The current preferred strategy to overcome this accessibility problem is through centralized testing and a robust sample transport network. Use of DBS instead of plasma, which in many cases requires neither a phlebotomist nor a cold-chain, further enhances the cost-effectiveness and feasibility of centralized testing. WHO recommends that DBS can be used effectively at a threshold of 1000 copies/ml in most laboratory settings [17], and its feasibility has been documented as part of a centralized testing network in Malawi [18]. However, impacts of centralized testing and use of DBS on aspects of the VL spectrum such as turnaround time have been largely overlooked in published studies.

Of Malawi’s 28 districts, just seven have molecular laboratories capable of conducting viral load testing. The remaining 21 districts refer specimens to those districts with molecular testing laboratories. Our results indicated that districts without a molecular laboratory were associated with longer pretest phase turnaround time, suggesting that efficiency of specimen transfer to centralized testing sites may be one factor driving longer turnaround time in Malawi. In particular, the association with longer pretest phase turnaround time points to specimen transport time and/or time spent waiting for transport as possible drivers of the association.

We also found that use of DBS for VL testing had significantly higher odds of longer test phase turnaround time compared to plasma. While DBS typically travels further than plasma on its way to testing laboratories in Malawi, the association with test phase turnaround time, which measures time elapsed between receipt at the laboratory and testing, links DBS to more time spent in the laboratory. This association suggests that there are factors that delay testing of DBS specimens. Those factors may include technicians untrained in DBS preparation, bottlenecks related to the DBS sorting and specimen rejection process, or a preference for plasma due to pressure from nearby clinics, ease of preparation, or limited cold storage.

While centralization of testing and use of DBS increase access to VL monitoring, the approach is nevertheless susceptible to inconsistent specimen transport networks, staff shortages, weather, holidays, reagent stock-outs, equipment problems, and administrative delays. Several of these factors have been noted as barriers to VL scale-up [19, 20], and many have also been documented in perception and feasibility studies [11, 21]. Whether these factors affect VL testing, and thus turnaround time, depends largely on where and when the specimen was collected and where and when the testing occurred.

Our data indicated that certain testing laboratories had much stronger associations with longer test phase turnaround time than others, suggesting that factors such as those mentioned previously play a role in laboratory-to-laboratory variation. Factors related to staffing, supplies, or administration may be responsible for the differences between laboratories, but it is difficult to know without intimate knowledge of each laboratory. Our models also indicated that certain specimen collection months had higher odds of longer pretest phase turnaround time and certain receipt months had higher odds of longer test phase turnaround time. The collection months of April, May, November, and December of 2015, for example, stood out with increased odds of longer pretest phase turnaround time, suggesting that issues related to sample transport, such as poor weather or fuel shortages, may have been present during those months. Similarly, specimens received during November and December 2015 had substantially higher odds of longer test phase turnaround time, which may be linked to reagent stock-outs or staff leave due to holidays.

While our results do not pinpoint the exact causes of longer turnaround time, they reveal the scope of the problem, inform strategies for improvement, and provide baseline data for assessing future improvement efforts. Perhaps the clearest strategies to shorten turnaround time are to increase efficiencies and improve quality management systems (QMS) across the VL spectrum. Implementing these strategies in LMICs, however, is challenging, particularly implementation of comprehensive QMS. Thus, focused, data-driven investigations into laboratory quality issues, in lieu of comprehensive QMS, may be a more feasible option. The key to such investigations is a data collection system like Malawi’s LIMS, which is akin to the surveillance systems long utilized in public health to detect and respond to disease outbreaks. In the case of the LIMS, though, the disease is sub-standard laboratory quality, such as long turnaround time, and the response is a targeted investigation to identify and address the causes of that sub-standard quality. Future efforts in Malawi and elsewhere should treat such systems, many of which already exist, as they would traditional surveillance systems; that is, the data being collected should be regularly monitored and the system collecting the data should be periodically evaluated. Optimized use of these systems will enable LMICs to efficiently utilize resources to improve laboratory quality and ensure that health outcomes are not compromised.

The implications of shortened turnaround time are not limited to improved health outcomes. Other aspects of the VL spectrum, such as demand for and quality of VL testing are essential to reaching the 90-90-90 goals [1] and may also be positively affected by shortened turnaround time. A 1989 study conducted in an American hospital alluded to a phenomenon in which the number and type of tests ordered within the hospital were influenced by turnaround time, and suggested that decentralization of testing could shorten turnaround time [22]. While the setting is vastly different, that study is echoed by recent anecdotal reports from Malawi that suggest demand for VL testing and movement toward decentralized testing are influenced by turnaround time. Specifically, longer turnaround times may be decreasing demand for VL testing and driving decentralization via adoption of point-of-care (POC) technology, which, though promising, raises concerns about quality [23, 24]. These and other turnaround time-affected aspects lend additional urgency to turnaround time improvement efforts.

Key strengths of this study include the nationally representative data and the focus on factors that affect turnaround time for VL testing not only in Malawi but in many LMICs. These data contribute to a better understanding of VL turnaround time and provide evidence supporting the need for a stronger, more efficient VL spectrum. Limitations include access to data for the afferent turnaround time only, which limits our ability to connect findings to patient indicators such as adherence counseling, treatment switches, and patient notification of VL results. Additionally, an adequate duration for HIV VL testing turnaround time has not been defined by either the Malawi Ministry of Health or international health bodies such as the WHO. This lack of a proverbial ‘measuring stick’ limits our ability to judge the adequacy of the turnaround times observed in Malawi. Finally, a substantial number of VL specimens had missing and/or implausible date combinations and were omitted from turnaround time calculations.

Conclusions

Longer afferent turnaround times for VL testing were observed in Malawi and appeared to be driven in a phase-specific manner by factors such as the presence of a molecular laboratory in the collection district, the type of specimen collected, the month the specimen was collected, and the laboratory where the test was run. Given turnaround time’s impact on the VL spectrum, addressing the specific causes of longer turnaround time nested within these factors is critically important. Strengthening the VL spectrum through increased efficiency, improved QMS, and the capacity to conduct data-driven investigations into laboratory quality issues should be considered essential to reaching the 90-90-90 targets and controlling the HIV epidemic.

Author Contributions

  1. Conceptualization: PM GC AK RM JN DS.
  2. Data curation: PM GC DS.
  3. Formal analysis: PM.
  4. Investigation: PM GC HA AS.
  5. Methodology: PM AK.
  6. Visualization: PM.
  7. Writing – original draft: PM.
  8. Writing – review & editing: PM AK JN DS.

References

  1. Joint United Nations Programme on HIV/AIDS (UNAIDS). 90-90-90: A transformative agenda to leave no one behind. 2014.
  2. AIDSinfo [Internet]. UNAIDS. 2015.
  3. WHO. Consolidated guidelines on the use of antiretroviral drugs for treating and preventing HIV infection. Recommendations for a public health approach. Geneva: 2013.
  4. WHO. Consolidated guidelines on the use of antiretroviral drugs for treating and preventing HIV infection. Recommendations for a public health approach—Second edition. Geneva: 2016.
  5. Keiser O, MacPhail P, Boulle A, Wood R, Schechter M, Dabis F, et al. Accuracy of WHO CD4 cell count criteria for virological failure of antiretroviral therapy. Trop Med Int Health. 2009 Oct;14(10):1220–5. PMCID: PMC3722497. pmid:19624478
  6. Sigaloff KC, Hamers RL, Wallis CL, Kityo C, Siwale M, Ive P, et al. Unnecessary antiretroviral treatment switches and accumulation of HIV resistance mutations; two arguments for viral load monitoring in Africa. J Acquir Immune Defic Syndr. 2011 Sep 1;58(1):23–31. pmid:21694603
  7. WHO. The Availability and use of HIV Diagnostics: A 2012/2013 WHO Survey in Low- and Middle-Income Countries. WHO, 2014.
  8. Roberts T, Cohn J, Bonner K, Hargreaves S. Scale-up of Routine Viral Load Testing in Resource-Poor Settings: Current and Future Implementation Challenges. Clin Infect Dis. 2016 Jan 6.
  9. Fiscus SA, Cheng B, Crowe SM, Demeter L, Jennings C, Miller V, et al. HIV-1 viral load assays for resource-limited settings. PLoS Med. 2006 Oct;3(10):e417. PMCID: PMC1592347. pmid:17032062
  10. Roberts T, Bygrave H, Fajardo E, Ford N. Challenges and opportunities for the implementation of virological testing in resource-limited settings. J Int AIDS Soc. 2012;15(2):17324. PMCID: PMC3494160. pmid:23078767
  11. Rutstein SE, Golin CE, Wheeler SB, Kamwendo D, Hosseinipour MC, Weinberger M, et al. On the front line of HIV virological monitoring: barriers and facilitators from a provider perspective in resource-limited settings. AIDS Care. 2016 Jan;28(1):1–10. pmid:26278724
  12. Ramadhani HO, Bartlett JA, Thielman NM, Pence BW, Kimani SM, Maro VP, et al. The Effect of Switching to Second-Line Antiretroviral Therapy on the Risk of Opportunistic Infections Among Patients Infected With Human Immunodeficiency Virus in Northern Tanzania. Open Forum Infect Dis. 2016 Jan;3(1):ofw018. PMCID: PMC4776054. pmid:26949717
  13. Mugavero MJ, Napravnik S, Cole SR, Eron JJ, Lau B, Crane HM, et al. Viremia copy-years predicts mortality among treatment-naive HIV-infected patients initiating antiretroviral therapy. Clin Infect Dis. 2011 Nov;53(9):927–35. PMCID: PMC3189165. pmid:21890751
  14. Phillips AN, Pillay D, Garnett G, Bennett D, Vitoria M, Cambiano V, et al. Effect on transmission of HIV-1 resistance of timing of implementation of viral load monitoring to determine switches from first to second-line antiretroviral regimens in resource-limited settings. AIDS. 2011 Mar 27;25(6):843–50. pmid:21192233
  15. Petersen ML, Tran L, Geng EH, Reynolds SJ, Kambugu A, Wood R, et al. Delayed switch of antiretroviral therapy after virologic failure associated with elevated mortality among HIV-infected adults in Africa. AIDS. 2014 Sep 10;28(14):2097–107. PMCID: PMC4317283. pmid:24977440
  16. Lecher S, Williams J, Fonjungo PN, Kim AA, Ellenberger D, Zhang G, et al. Progress with Scale-Up of HIV Viral Load Monitoring—Seven Sub-Saharan African Countries, January 2015-June 2016. MMWR Morb Mortal Wkly Rep. 2016 Dec 2;65(47):1332–5. pmid:27906910
  17. WHO. Technical and operational considerations for implementing HIV viral load testing. 2014.
  18. Rutstein SE, Hosseinipour MC, Kamwendo D, Soko A, Mkandawire M, Biddle AK, et al. Dried blood spots for viral load monitoring in Malawi: feasible and effective. PLoS One. 2015;10(4):e0124748. PMCID: PMC4405546. pmid:25898365
  19. African Society for Laboratory Medicine (ASLM). Viral Load Monitoring in African HIV Treatment Programmes. 2013.
  20. Peter T, Ellenberger D, Kim AA, Boeras D, Messele T, Roberts T, Stevens W, Jani I, Abimiku A, Ford N, Katz Z, Nkengasong J. Early Antiretroviral Therapy Initiation: Access and Equity of Viral Load Testing for HIV Treatment Monitoring. Lancet Infect Dis. In press.
  21. Boillot F, Serrano L, Muwonga J, Kabuayi JP, Kambale A, Mutaka F, et al. Implementation and Operational Research: Programmatic Feasibility of Dried Blood Spots for the Virological Follow-up of Patients on Antiretroviral Treatment in Nord Kivu, Democratic Republic of the Congo. J Acquir Immune Defic Syndr. 2016 Jan 1;71(1):e9–e15. PMCID: PMC4679362. pmid:26413848
  22. Hilborne LH, Oye RK, McArdle JE, Repinski JA, Rodgerson DO. Evaluation of stat and routine turnaround times as a component of laboratory quality. Am J Clin Pathol. 1989 Mar;91(3):331–5. pmid:2923096
  23. O'Kane MJ, McManus P, McGowan N, Lynch PL. Quality error rates in point-of-care testing. Clin Chem. 2011 Sep;57(9):1267–71. pmid:21784764
  24. Pai NP, Vadnais C, Denkinger C, Engel N, Pai M. Point-of-care testing for infectious diseases: diversity, complexity, and barriers in low- and middle-income countries. PLoS Med. 2012;9(9):e1001306. PMCID: PMC3433407. pmid:22973183