Introduction
Material decomposition imaging (MDI) using dual-energy computed tomography (DECT) was first described by Hounsfield in 1973 [1]. Different materials that cannot be distinguished on the basis of their CT attenuation numbers alone can be differentiated with material decomposition algorithms applied to DECT acquisitions [2–4]. Materials with high atomic numbers, such as iodine (Z = 53) and gadolinium (Z = 64), show characteristically high attenuation at low energies owing to a substantial contribution of the photoelectric effect to the attenuation [5]. MDI uses these characteristic attenuation profiles to differentiate these contrast agents from other materials. Until recently, MDI had not been widely applied in clinical practice, but over the past few years several CT vendors have made DECT commercially available for daily clinical use. Recently, a novel DECT technique has become commercially available that uses a single tube with a dual-layer detector capable of differentiating between low- and high-energy X-ray photons; this technique is investigated further in this study.
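As a rough, textbook-level illustration (a standard approximation, not a result of this study), the photoelectric contribution to the attenuation coefficient in the diagnostic energy range scales approximately as

```latex
\mu_{\mathrm{PE}} \;\propto\; \frac{Z^{n}}{E^{3}}, \qquad n \approx 3\text{--}4,
```

which is why high-Z elements such as iodine and gadolinium attenuate disproportionately strongly at the low end of the CT energy spectrum.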
One of the most widely researched MDI applications is quantitative mapping of iodine distribution in tissues. The resulting maps can be used as a surrogate for tissue perfusion. Early evidence has shown the clinical capability of iodine quantification with DECT at a specified time point for the detection of myocardial [6–12] and pulmonary perfusion defects [13–16]. In addition, DECT iodine mapping is capable of tumour mass characterization and therapy response assessment [17–19]. However, iodine contrast administration, while safe in most patients, is associated with contrast-induced allergic reactions and nephropathy, which can cause acute renal dysfunction [20, 21] and significant morbidity and mortality, especially in high-risk patients [22, 23]. In patients with contraindications to iodinated contrast media, gadolinium-enhanced magnetic resonance (MR) angiography can be used as an alternative. However, depending on the indication, MR angiography may have poor diagnostic value compared with (DE)CT angiography. Gadolinium-based CT has been used off-label at higher doses as an alternative to conventional CT angiography with diagnostic image quality [24, 25]. With the use of DECT, higher attenuation can be achieved at low (monochromatic) energies, which could enable the use of much lower gadolinium concentrations [26, 27]. In addition, accurate gadolinium quantification using DECT could allow for a quantitative evaluation of contrast agent distribution in tissue as a surrogate for tissue perfusion using MDI. Therefore, accurate gadolinium quantification combined with increased attenuation could open up the possibility of using gadolinium as an alternative contrast agent for DECT imaging in patients with contraindications to iodinated contrast media.
The feasibility of gadolinium-enhanced DECT has been reported in several phantom and animal studies [28–31]. These studies described the capability of spectral differentiation and visualisation [28–30] and the accuracy of quantification [31] of gadolinium using DECT. However, the accuracy of gadolinium quantification using the novel dual-layer spectral detector CT (SDCT) system is unknown. Therefore, the aim of the current study was to evaluate the feasibility and accuracy of gadolinium quantification using an SDCT system.
Discussion
This study showed that it is feasible to quantify a commonly clinically encountered range of gadolinium concentrations in a phantom model with overall high accuracy and reproducibility using an in-house-developed material decomposition method on a novel clinical dual-layer spectral detector CT system.
Whereas conventional CT displays anatomical structures as a function of tissue density, DECT enables enhanced tissue characterization using MDI. Quantitative assessment of contrast agent uptake and the resulting distribution map can be used as a surrogate for tissue perfusion [6–12, 14]. In the current study we showed that clinically encountered low concentrations of gadolinium, down to 0.5 mg/mL, can be accurately quantified with a mean measurement error of 0.1 mg/mL using SDCT at both 120 and 140 kVp. In the ultra-low gadolinium concentration range (0.1–0.4 mg/mL), expected to be encountered in tissues with a perfusion defect, the mean measurement error remained around 0.1 mg/mL at both 120 and 140 kVp. However, at these low concentrations the margin of error increased substantially and approached the gadolinium concentration itself, indicating that the lower limit of reasonably accurate gadolinium quantification using SDCT lies between 0.5 and 1.0 mg/mL. In the range of clinically encountered gadolinium concentrations (0.5–5.1 mg/mL) after administration of 0.1–0.2 mmol/kg bodyweight, mean CT numbers at 40 keV ranged between 28 and 464 HU (Fig. 3). The combination of higher attenuation at lower monochromatic energies and accurate quantification of low gadolinium concentrations opens up the possibility of DECT scanning with gadolinium as a contrast agent. Potential clinical applications include detection of myocardial [6–12] and pulmonary perfusion defects [14–16] and the characterization of tumour masses and therapy response assessment [17–19].
In clinical routine, adequate tissue contrast and contrast agent density maps are important for the diagnosis and evaluation of organ perfusion defects. However, to create a gadolinium density map as a surrogate for tissue perfusion, accurate gadolinium quantification is essential, as the post-processing is based on these measurements. This is the first study to describe the accuracy of gadolinium quantification using MDI on SDCT. Gabbai et al. [28] described the capability of spectral differentiation of gadolinium using SDCT, which is in accordance with our study. However, no quantitative values were reported, and high gadolinium concentrations (4.7–187.6 mg/mL) were used, at least one to two orders of magnitude above the estimated range encountered in healthy cardiac, lung, liver, spleen and kidney tissue (0.58–4.66 mg/mL). Zhang et al. [30] showed a high sensitivity and specificity for gadolinium-enhanced dual-source DECT pulmonary angiography in detecting pulmonary embolism in rabbits. However, as in the study by Gabbai et al., the gadolinium concentration was not quantified. In addition, high intravenous doses of gadolinium contrast agent, 1.5 and 2.5 mmol/kg bodyweight, were administered. Bongers et al. [31] evaluated the potential of gadolinium as a CT contrast agent using dual-source DECT in a phantom setup. In accordance with our study, they found that monochromatic images at low energy (e.g. 40 keV) provide higher attenuation. Additional quantification was performed using the material-specific dual-energy ratio for gadolinium. For true gadolinium concentrations of 6.3, 3.2, 1.6, 0.8, 0.4 and 0.2 mg/mL, the relative measurement errors were 11.5, 12.0, 21.6, 21.6, 104.2 and 159.4%, respectively. In our study we found a higher accuracy, with relative measurement errors of less than 10% down to 2.0 mg/mL at 120 kVp and 1.0 mg/mL at 140 kVp. A possible explanation for this difference lies in the algorithm: the post-processing algorithm used by Bongers et al. [31] was originally designed for iodine, whereas our algorithm was specifically designed for gadolinium quantification.
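The relative measurement errors compared above reduce to a one-line calculation. As a minimal illustration (the function name and the example values below are ours, chosen only to mirror the magnitudes discussed, not measured data):

```python
def relative_error_pct(measured_mg_ml: float, true_mg_ml: float) -> float:
    """Relative measurement error as a percentage of the true concentration."""
    return abs(measured_mg_ml - true_mg_ml) / true_mg_ml * 100.0

# A fixed absolute error of 0.1 mg/mL translates into very different
# relative errors across the concentration range, which is why accuracy
# degrades toward the low end:
print(relative_error_pct(0.6, 0.5))   # ~20% relative error at 0.5 mg/mL
print(relative_error_pct(2.1, 2.0))   # ~5% relative error at 2.0 mg/mL
```

This also explains why absolute errors of around 0.1 mg/mL become prohibitive in the ultra-low (0.1–0.4 mg/mL) range, where they approach the concentration itself.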
We found a slightly lower measurement error, and thus higher accuracy, for scans acquired at 140 kVp compared with 120 kVp. When scanning at a higher tube voltage, more high-energy X-ray photons are produced. This decreases the spectral overlap between the high- and low-energy spectra and thereby improves the accuracy of material decomposition, which is in accordance with the findings of Gabbai and colleagues [28]. Moreover, 140 kVp acquisitions resulted in higher CT numbers for the different gadolinium concentrations on monochromatic 40 keV images (34–464 HU) compared with 120 kVp acquisitions (28–416 HU), indicating superior spectral separation at the higher tube voltage.
Even though gadolinium chelates are generally considered to be safe contrast agents, with acute reaction rates of approximately 0.001–0.07% [41], concerns have recently arisen about their long-term safety after the discovery that administration of multiple doses leads to detectable gadolinium levels in the brain [42, 43]. In addition, gadolinium contrast has been linked to an increased risk of nephrogenic systemic fibrosis (NSF) in patients with impaired renal function [44]. In both conditions, linear non-ionic and linear ionic contrast agents have primarily been implicated, whereas macrocyclic gadolinium agents, such as the one used in the current study, have not been conclusively linked to either condition [45–47]. Although both iodine and gadolinium contrast agents pose a risk for patients with impaired renal function, gadolinium is thought to be preferable in patients with renal failure and a glomerular filtration rate greater than 30 mL/min, since the risk of NSF is low in these patients while the risk of iodine contrast-induced nephropathy clearly exists [41]. Furthermore, using gadolinium could potentially obviate the need for the pre- and post-imaging hydration and premedication protocols that are commonly used in patients with impaired renal function who undergo contrast-enhanced CT scanning, or in patients with known allergies to iodinated contrast agents.
In the current study, a relatively simple method for material decomposition using in-house-developed software is proposed. Our method is based on the mass attenuation coefficient across monochromatic energies. Monochromatic reconstructions are computed as a function of two independent interactions: the photoelectric effect and the Compton effect [2]. The photoelectric effect is strongly related to the atomic number of a material in the CT energy range and is therefore material-specific [37]. Our method takes this material-specific effect into account by evaluating the attenuation across monochromatic energies.
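The principle of a two-material (water/gadolinium) decomposition of this kind can be sketched as solving a small linear system relating measured attenuation at monochromatic energies to known mass attenuation coefficients. The sketch below is a simplified two-energy illustration of the general idea, not the study's actual algorithm; the coefficient values are placeholders, not the calibrated values used in the study:

```python
# Hypothetical mass attenuation coefficients (cm^2/g) at two monochromatic
# energies (keV); placeholder values for illustration only.
E_LOW, E_HIGH = 40.0, 70.0
MU_RHO_WATER = (0.268, 0.193)   # (low-energy, high-energy)
MU_RHO_GD    = (6.0,   3.2)     # gadolinium: much steeper energy dependence

def decompose(mu_low: float, mu_high: float) -> tuple[float, float]:
    """Solve the 2x2 linear system
        mu(E) = rho_w * (mu/rho)_water(E) + c_gd * (mu/rho)_gd(E)
    at the two energies for the water density rho_w and the gadolinium
    concentration c_gd (both in g/mL)."""
    a, b = MU_RHO_WATER[0], MU_RHO_GD[0]
    c, d = MU_RHO_WATER[1], MU_RHO_GD[1]
    det = a * d - b * c
    rho_w = (mu_low * d - mu_high * b) / det
    c_gd = (a * mu_high - c * mu_low) / det
    return rho_w, c_gd

# Simulate a 2.0 mg/mL gadolinium solution in water and recover it:
true_w, true_gd = 1.0, 0.002  # g/mL
mu_low  = MU_RHO_WATER[0] * true_w + MU_RHO_GD[0] * true_gd
mu_high = MU_RHO_WATER[1] * true_w + MU_RHO_GD[1] * true_gd
rho_w, c_gd = decompose(mu_low, mu_high)
print(f"gadolinium: {c_gd * 1000:.2f} mg/mL")  # recovers the simulated 2.00 mg/mL
```

In practice the study's method fits attenuation across many monochromatic energies rather than exactly two, and real measurements carry noise, so a least-squares fit rather than an exact solve would be used; the gadolinium K-edge (around 50 keV) is what makes the two basis functions well separated.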
A strength of our study is that we evaluated the accuracy of gadolinium quantification in an optimally controlled setting with a wide and clinically relevant range of gadolinium concentrations, which provides the basis for further research and clinical applications. Our study also has some limitations. The most important is that we used a static phantom in which organ motion was not taken into account. In addition, a fixed concentration is not the same as a bolus injection. However, we tried to mimic the clinical situation as closely as possible by using low concentrations of gadolinium that are expected to be typically encountered clinically. A second limitation is that our study only takes water and gadolinium into account when calculating the gadolinium concentration. Since human tissue does not consist solely of water and gadolinium, future phantom and patient research will have to address (healthy) tissue attenuation as well, using a three- or multi-material decomposition method. A third limitation is the need for relatively high peak tube voltage settings (120 or 140 kVp) to ensure sufficient spectral separation. However, the higher radiation dose resulting from high kVp acquisitions can be addressed by reducing the tube current (mAs). The fourth limitation is that we evaluated only one DECT technique; therefore, our results may be limited to the vendor used in this study.
In conclusion, SDCT allows for accurate quantification of commonly clinically used gadolinium concentrations at both 120 and 140 kVp. The lowest measurement errors were found for 140 kVp acquisitions.