Background
Iron deficiency (ID) is the most common nutritional deficiency worldwide, affecting up to 25% of the population [1-3]. A variety of causes can deplete iron stores, ranging from deficient dietary iron intake to increased blood loss (e.g. gastrointestinal cancers, peptic ulcers) [4, 5]. In addition, in chronic disease populations the accompanying pro-inflammatory state upregulates serum hepcidin, which blocks iron absorption from the gut and iron release from the reticulo-endothelial system, reducing iron availability despite adequate stores [6]. Indeed, in such populations, including chronic heart failure (CHF) and chronic kidney disease (CKD), ID has been shown to be highly prevalent and associated with an increased risk of morbidity and mortality, independent of potential confounders, including anemia [7-9].
The definition of ID is still a matter of debate [10, 11]. ID is generally divided into absolute ID (low iron stores) and functional ID (insufficient iron supply to the bone marrow despite sufficient iron stores). Because both forms exist and an unequivocal gold standard is lacking, it remains challenging to identify ID correctly [12]. Clinicians and epidemiologists alike predominantly rely on two markers: ferritin (for iron load) and transferrin saturation (TSAT, for iron transport availability) [13-15]. However, to date, no consensus has been reached on which cutoffs of these parameters should be used to define absolute and functional ID per population, except perhaps in cardiology, where absolute ID is defined as a ferritin level < 100 μg/L and functional ID as a TSAT < 20% accompanied by a ferritin level between 100 and 299 μg/L [7, 16]. In nephrology, the Kidney Disease: Improving Global Outcomes (KDIGO) committee currently recommends a trial of 1 to 3 months of oral iron therapy in non-dialysis CKD patients when TSAT is below 30% and ferritin below 500 μg/L. However, it is not known which cutoffs of ferritin and/or TSAT perform best in predicting anemia, response to iron treatment, or outcome [17].
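The threshold rules above can be made concrete with a minimal sketch. The cutoffs are taken directly from the definitions cited in the text; the function names and return labels are illustrative only, not part of any guideline.

```python
def classify_id_cardiology(ferritin_ug_l: float, tsat_pct: float) -> str:
    """Classify ID per the cardiology definition cited above:
    absolute ID at ferritin < 100 ug/L; functional ID at TSAT < 20%
    with ferritin 100-299 ug/L."""
    if ferritin_ug_l < 100:
        return "absolute ID"
    if tsat_pct < 20 and 100 <= ferritin_ug_l <= 299:
        return "functional ID"
    return "no ID"


def kdigo_oral_iron_trial(ferritin_ug_l: float, tsat_pct: float) -> bool:
    """KDIGO criterion for a 1-3 month oral iron trial in non-dialysis
    CKD: TSAT < 30% and ferritin < 500 ug/L."""
    return tsat_pct < 30 and ferritin_ug_l < 500
```

Note that a patient with, say, a ferritin of 400 μg/L and a TSAT of 25% would not meet the cardiology definition of ID, yet would still qualify for an oral iron trial under the KDIGO criterion, which illustrates how the two rules diverge.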
Establishing which cutoffs of ferritin and TSAT are associated with outcome would identify the patients at greatest risk of developing these outcomes, and thus the patients in whom correction of ID could have the greatest benefit. The present study was therefore performed to define which cutoffs of serum ferritin and TSAT perform optimally with respect to the risk of all-cause mortality, cardiovascular mortality, and development of anemia in CKD patients.
Discussion
In this study of a large cohort of CKD patients, we show the impact of using different cutoffs for ferritin and/or TSAT on the association with all-cause mortality, cardiovascular mortality, and the subsequent development of anemia. Remarkably, the highest risk of adverse outcomes in CKD patients was uniformly observed at a low TSAT level, i.e. lower than 10%, largely independent of the ferritin level. These results are important for defining ID in CKD patients and may help clinicians focus on these specific TSAT cutoffs in order to improve outcome.
To date, there is no consensus in the field of CKD on which cutoffs of ferritin and TSAT should be used to define ID, and major studies in CKD patients have applied different definitions. For example, the FIND-CKD study identified patients as iron deficient at a serum ferritin < 100 μg/L, or at a TSAT < 20% in combination with a serum ferritin < 200 μg/L, whereas Qunibi and colleagues defined ID as a TSAT ≤ 25% with a ferritin ≤ 300 μg/L, and Fishbane and colleagues used a TSAT ≤ 25% in combination with a ferritin ≤ 200 μg/L [21-23]. This plethora of ID definitions impedes comparability among iron studies and, as a result, hampers translation to clinical practice. It is therefore important to identify optimal cutoffs for CKD patients. Accordingly, in the present study we prospectively assessed which cutoffs for ferritin and TSAT performed optimally in their association with adverse outcomes, implying that, at least in terms of survival and development of anemia, these cutoffs are clinically the most relevant.
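The comparability problem can be illustrated by applying the three study definitions cited above to a single patient. The thresholds come from the text; the dictionary and function names are purely illustrative.

```python
# Each rule maps (ferritin in ug/L, TSAT in %) to an ID yes/no verdict,
# using the thresholds reported for each study in the text.
ID_DEFINITIONS = {
    # FIND-CKD: ferritin < 100, or TSAT < 20% with ferritin < 200
    "FIND-CKD": lambda f, t: f < 100 or (t < 20 and f < 200),
    # Qunibi et al.: TSAT <= 25% with ferritin <= 300
    "Qunibi": lambda f, t: t <= 25 and f <= 300,
    # Fishbane et al.: TSAT <= 25% with ferritin <= 200
    "Fishbane": lambda f, t: t <= 25 and f <= 200,
}


def id_status(ferritin_ug_l: float, tsat_pct: float) -> dict:
    """Return, per study definition, whether the patient counts as ID."""
    return {name: rule(ferritin_ug_l, tsat_pct)
            for name, rule in ID_DEFINITIONS.items()}


# A patient with ferritin 250 ug/L and TSAT 18% is iron deficient under
# the Qunibi definition but not under FIND-CKD or Fishbane:
# id_status(250, 18) -> {'FIND-CKD': False, 'Qunibi': True, 'Fishbane': False}
```

The same laboratory values can thus place a patient inside or outside a trial's ID population depending solely on the definition chosen, which is precisely why harmonized cutoffs matter.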
Few studies have previously evaluated the accuracy of serum ferritin and TSAT cutoffs to define ID in CKD patients in terms of sensitivity and specificity. Fishbane et al. determined which levels of serum ferritin and TSAT were most predictive of ID in hemodialysis (HD) patients, and concluded that in erythropoietin-responsive patients a ferritin level < 100 μg/L or a TSAT < 18% indicates inadequate iron status, whereas in erythropoietin-resistant patients a serum ferritin < 300 μg/L or a TSAT < 27% should be used [24]. Also in HD patients, Kalantar-Zadeh et al. identified high specificity for a cutoff of serum ferritin < 200 μg/L and high sensitivity for a TSAT < 20% [25]. To our knowledge, we are the first to assess the performance of different cutoffs for ferritin and/or TSAT in CKD patients in terms of prospective associations with adverse outcomes.
Our results identify a TSAT lower than 10% as the optimal cutoff associated with an increased risk of detrimental outcomes in the CKD population. In our population of early-stage CKD patients, i.e. CKD stages 1 to 3, the importance of an adequate iron status is evident from the increased hazard ratios for adverse outcomes. On careful assessment of the hazard ratios for all-cause mortality, cardiovascular mortality, and risk of anemia, the highest risk is clearly observed for TSAT < 10%; however, a significantly increased risk of adverse outcomes is also observed for TSAT < 15%. Notably, the cardiovascular mortality risk associated with ID is markedly higher (nearly double) than the risk of all-cause mortality. For cutoff values of TSAT < 20% and < 30%, the observed hazard ratios for all-cause and cardiovascular mortality are less impressive, whereas the risk of anemia decreases steadily with increasing cutoff levels. Conditional definitions such as those used previously in FAIR-HF and FIND-CKD did not strengthen the association with increased risk. This suggests that in CKD patients the main focus should be on a low TSAT, especially one lower than 10%. Based on these results, it may be speculated that failure to correct such low TSAT levels might jeopardize the survival of CKD patients.
Currently, ferritin and TSAT are the markers most commonly used in the clinical setting to evaluate iron stores and iron availability. However, both have important drawbacks as iron status parameters. Serum ferritin is an acute-phase reactant, and serum ferritin levels will therefore be elevated in chronic disease populations [26, 27]. TSAT also shows acute-phase reactivity: transferrin is elevated in the setting of acute inflammation, which lowers TSAT when circulating iron remains constant [28]. Other markers, such as the soluble transferrin receptor, the percentage of hypochromic red blood cells, and reticulocyte hemoglobin content, are not readily available in clinical practice, less well studied, or not used for other reasons.
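The arithmetic behind the transferrin effect on TSAT can be sketched as follows. The standard definition TSAT = serum iron / total iron-binding capacity (TIBC) × 100 is assumed, along with the conventional approximation of TIBC from transferrin (roughly 25.1 μmol of iron bound per g/L of transferrin, two Fe³⁺ per molecule); the function name and example values are illustrative only.

```python
def tsat_percent(serum_iron_umol_l: float, transferrin_g_l: float) -> float:
    """TSAT = serum iron / TIBC x 100, with TIBC approximated from
    transferrin via the conventional factor ~25.1 umol/L per g/L."""
    tibc_umol_l = 25.1 * transferrin_g_l
    return 100.0 * serum_iron_umol_l / tibc_umol_l


# With circulating iron held constant at 15 umol/L, raising transferrin
# from 2.5 to 3.5 g/L lowers TSAT from ~24% to ~17%: a change in the
# carrier protein alone can push a patient across a TSAT < 20% cutoff.
```

This is why an inflammation-driven shift in transferrin can alter the apparent iron status without any change in circulating iron.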
Our study has strengths and limitations. Its strengths are that it comprises a large cohort of CKD patients with available data on iron status, and that it is the first study to assess all combinations of cutoffs with respect to "hard" clinical endpoints. Its limitations include the observational design, the single-center setting, and the measurement of iron parameters at a single time point, which precludes assessing the impact of changes in iron parameters over time on clinical outcomes. Furthermore, the current study is valid only for early CKD, which precludes us from discerning whether similar results apply to more advanced CKD stages. Another limitation might be that we did not adjust for several potential confounders in the associations between ID and outcomes; however, the primary aim of this study was to examine the prospective associations of ID with adverse outcomes using several cutoffs for ferritin and TSAT, not to investigate the mechanisms involved.