Background
Definitive radiotherapy (RT) with platinum-based doublet chemotherapy has been widely recognized over the past few decades as the standard of care for unresectable stage III non-small cell lung cancer (NSCLC). However, survival outcomes with this approach remain disappointing, with a 5-year overall survival (OS) rate of approximately 15% using conventionally fractionated radiotherapy (CFRT) at doses of 60–66 Gy in 30–33 fractions [
1,
2]. Following the promising results observed in the PACIFIC trial, consolidation treatment with immune checkpoint inhibitors (ICIs) is now recommended for patients with unresectable stage III NSCLC after definitive chemoradiation therapy (CRT) [
3]. However, with CFRT, escalating the total radiation dose by adding fractions and prolonging the treatment duration did not improve patient outcomes [
4]. By delivering a higher biologically effective dose (BED) over a shorter period, hypofractionated radiotherapy (HFRT) could counteract the accelerated repopulation of tumor cells during treatment and theoretically achieve better efficacy [
5,
6]. However, the clinical application of HFRT has been limited by the side effects of irradiation, particularly grade 3 or 4 toxicities [
6]. In recent years, advances in RT techniques have facilitated the safe and effective delivery of high doses of radiation to tumors while sparing the surrounding normal tissue. Several studies reported that HFRT could achieve good local control with well-tolerated toxicity in locally advanced NSCLC when modern RT techniques were utilized [
7‐
9]. HFRT is therefore gaining importance in the treatment of locally advanced NSCLC.
Numerous studies have demonstrated that the host immune system and peripheral lymphocyte populations play crucial roles in oncologic outcomes, and that a reduction in total peripheral lymphocyte counts (TLCs) impairs antitumor immunity [
10,
11]. Lymphopenia is a common consequence of RT in cancer patients, regardless of whether other lymphotoxic agents, such as corticosteroids or cytotoxic chemotherapeutics, are administered [
12‐
15]. A preclinical model suggested that decreasing the target volume and the number of fractions significantly reduced the radiation dose received by circulating blood [
16]. It is therefore possible that protracted radiation courses with more fractions increase the radiation exposure of circulating blood and, in turn, lymphocyte destruction. Such reasoning has led some researchers to hypothesize that HFRT, with its shorter treatment period, would reduce the radiation dose to circulating blood and spare peripheral lymphocytes compared with the more protracted CFRT.
Despite this theoretical advantage, the impact of different fractionation regimes and overall treatment time (OTT) on radiation-induced lymphopenia (RIL) in unresectable stage III NSCLC was largely unexplored until recently. In our institution, apart from CFRT, other fractionation regimes with increased doses per fraction and shorter OTT have been increasingly adopted since 2011 for locally advanced NSCLC. Therefore, we sought to evaluate the potential impact of fractionation regime and OTT on TLCs by retrospectively analyzing patients with stage III, locally advanced, unresectable NSCLC receiving definitive chemoradiation therapy (CRT) at our institution.
Methods
Patient population
After approval by the institutional review board, we retrospectively analyzed the medical records of consecutive patients with histologically verified NSCLC receiving definitive CRT (equivalent dose in 2-Gy fractions [EQD2] ≥ 60 Gy, where an α/β ratio of 10 Gy was assumed for lung cancer [
16]) at Zhongshan Hospital, Fudan University from January 2011 to December 2017. All patients were (re)staged according to the 8th edition of the AJCC TNM classification. The inclusion criteria were as follows: (1) unresectable stage III NSCLC; (2) age 18 years or above and an Eastern Cooperative Oncology Group (ECOG) performance status ≤ 2; (3) no history of a concomitant malignancy; (4) no prior RT or immunotherapy for any disease; (5) completion of the scheduled treatment, including four or more cycles of platinum-based doublet chemotherapy; and (6) complete and retrievable medical records, including at least 3 documented weekly TLC values during the RT course. Patients were excluded if they had received any other anticancer treatment besides CRT before disease progression, had used any systemic corticosteroids during RT except as pretreatment before chemotherapy, or had any ongoing infection, rheumatic disease, or other immune system disease. All patients underwent a comprehensive assessment within 3 weeks prior to treatment, including routine blood testing, pulmonary function tests, chest/abdominal CT with contrast, brain magnetic resonance imaging, abdominal ultrasonography, bone scan, and/or positron emission tomography/computed tomography (PET/CT). Informed consent was obtained from all patients.
Therapeutic interventions
Treatment planning for locally advanced NSCLC in our institution, including radiation targets, normal tissue dose constraints, and delivery techniques, has been previously described [
8]. The fractionation regime primarily depended on the treating physician’s preference, based on the clinical tumor size and location. Typically, 2.0 to 3.0 Gy per fraction over 20–30 fractions, to a total dose of 60–70 Gy, was adopted in our institution. The fractionation regimes used in our study population are shown in Table
1. The median calculated BED was 76.8 Gy (interquartile range [IQR], 72.0–78.0 Gy), assuming an α/β ratio of 10 Gy for lung cancer [
17], and the median EQD2 was 64.0 Gy (IQR, 60.0–65.0 Gy).
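For reference, the BED and EQD2 values above follow from the standard linear-quadratic conversions, with total dose D delivered in fractions of size d and α/β = 10 Gy:

```latex
\mathrm{BED} = D\left(1 + \frac{d}{\alpha/\beta}\right),
\qquad
\mathrm{EQD2} = D \cdot \frac{d + \alpha/\beta}{2 + \alpha/\beta}
```

As a worked check against the most common regimen in Table 1, 60 Gy in 20 fractions (d = 3 Gy) gives BED = 60 × (1 + 3/10) = 78 Gy and EQD2 = 60 × (3 + 10)/(2 + 10) = 65 Gy.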
Table 1
Details of fractionation regimes used in our study population (n = 115)
Fractionation regime | No. of patients (%)
6000 cGy/20 fractions | 30 (26.1%) |
5940 cGy/22 fractions | 2 (1.7%) |
6300 cGy/23 fractions | 1 (0.9%) |
6000 cGy/24 fractions | 1 (0.9%) |
6240 cGy/24 fractions | 1 (0.9%) |
6000 cGy/25 fractions | 11 (9.6%) |
6100 cGy/25 fractions | 1 (0.9%) |
6250 cGy/25 fractions | 9 (7.8%) |
6400 cGy/25 fractions | 1 (0.9%) |
6500 cGy/25 fractions | 1 (0.9%) |
7000 cGy/25 fractions | 1 (0.9%) |
6240 cGy/26 fractions | 1 (0.9%) |
6210 cGy/27 fractions | 4 (3.5%) |
6160 cGy/28 fractions | 5 (4.3%) |
6000 cGy/30 fractions | 30 (26.1%) |
6400 cGy/32 fractions | 8 (7.0%) |
6600 cGy/33 fractions | 6 (5.2%) |
7000 cGy/35 fractions | 2 (1.7%) |
Data acquisition
Clinical and laboratory parameters were extracted from the electronic medical record for all patients. Patient-specific variables included age, sex, ECOG performance status, smoking history, and baseline laboratory values. Tumor-specific variables included histological type and TNM classification. Treatment-specific variables included the RT technique, dose per fraction, number of fractions, treatment interruption, receipt of chemotherapy, chemotherapy type, and chemotherapy timing. Dosimetric parameters included the gross target volume (GTV), planning target volume (PTV), mean lung dose (MLD), mean heart dose (MHD), and the percentage volumes of the lung and heart receiving 5–60 Gy (V5–60 Gy) in 5-Gy increments.
TLCs were collected and investigated at the following time points: within 2 weeks prior to induction chemotherapy or RT (baseline TLC), within 2 weeks prior to the start of RT (pre-RT TLC), weekly during radiation treatment, and 1 and 2 months after RT completion. If a complete blood cell count was not available at the exact time point, no TLC value was recorded for that time point. The TLC nadir was defined as the minimum TLC value during the course of RT. Severe lymphopenia (SL) was defined as a TLC nadir below 500 cells/μL during the course of RT, consistent with grade 3 toxicity according to the Common Terminology Criteria for Adverse Events (CTCAE) version 5.0.
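The nadir and SL definitions above translate directly into code. This is a minimal sketch with made-up weekly counts; it also encodes the CTCAE v5.0 grade boundaries for "lymphocyte count decreased" below 800 cells/μL:

```python
def tlc_nadir(weekly_tlcs):
    """Minimum total lymphocyte count (cells/uL) recorded during the RT course."""
    return min(weekly_tlcs)

def ctcae_lymphopenia_grade(tlc):
    """CTCAE v5.0 grading for 'lymphocyte count decreased' below 800 cells/uL.
    Grade 1 (<LLN to 800) depends on the laboratory's lower limit of normal
    and is not distinguished from grade 0 here."""
    if tlc < 200:
        return 4
    if tlc < 500:
        return 3
    if tlc < 800:
        return 2
    return 0

def has_severe_lymphopenia(weekly_tlcs):
    """Severe lymphopenia (SL): TLC nadir below 500 cells/uL, i.e. grade >= 3."""
    return ctcae_lymphopenia_grade(tlc_nadir(weekly_tlcs)) >= 3

# Example: hypothetical weekly TLCs (cells/uL) over a 6-week RT course
counts = [1800, 1100, 760, 520, 430, 610]
print(tlc_nadir(counts))               # 430
print(has_severe_lymphopenia(counts))  # True
```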
Clinical follow-up was generally conducted every 3 months for the first year, every 6 months for the next 2 years, and yearly thereafter. Follow-up examinations included physical examinations, blood tests, chest CT scans, and/or PET/CT. Further examinations were performed when needed for clinical purposes. Progression-free survival (PFS) was calculated from the date of pathologic diagnosis to the date of disease progression, death or last follow-up. OS was calculated from the date of pathologic diagnosis to the date of death from any cause or last follow-up. Patients were censored at the date of the last available follow-up if alive and the data were updated in October 2018.
Statistical analysis
Patient, tumor, and treatment characteristics for the entire group and patient subgroups were summarized using descriptive statistics. To visualize trends in TLCs over the course of RT, TLCs were plotted against RT time (weeks). Chi-squared and non-parametric tests were used to compare differences in proportions or medians between groups. Survival rates were calculated using the Kaplan-Meier method, and comparisons were made using the log-rank test. Prognostic variables with p-values ≤ 0.2 on univariate analyses were entered as covariates in the multivariable Cox proportional hazards model using backward stepwise selection. Spearman correlations were used to examine correlations between dosimetric parameters and the TLC nadir. Receiver operating characteristic (ROC) curve analyses were used to determine the optimal cut-off points for continuous variables identified as influencing SL. Univariate logistic regression analyses were conducted to identify variables potentially associated with an increased risk of SL. Multivariate stepwise logistic regression (using variables with univariate p ≤ 0.2) was performed to assess the independent effects of such factors on the development of SL. In addition, age, sex, and ECOG performance status were retained in the initial multivariable model owing to their perceived clinical significance. All statistical analyses were conducted using IBM SPSS software version 23.0 (SPSS Inc., Chicago, IL, USA). Statistical tests were two-sided, and a p-value < 0.05 was considered statistically significant.
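The ROC-based cut-off search can be sketched in a few lines using Youden's J statistic (a common choice of optimality criterion; the variable name `gtv`, the sample values, and the "value ≥ cutoff predicts SL" direction below are illustrative assumptions, not the study's data):

```python
def youden_cutoff(values, labels):
    """Scan candidate cut-offs for a continuous variable and return the one
    maximizing Youden's J = sensitivity + specificity - 1, treating
    value >= cutoff as 'predicted severe lymphopenia (SL)'."""
    pos = sum(labels)           # observed SL cases (labels are 0/1)
    neg = len(labels) - pos     # non-SL cases
    best_cut, best_j = None, -1.0
    for cut in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= cut and y == 1)
        fp = sum(1 for v, y in zip(values, labels) if v >= cut and y == 0)
        j = tp / pos + (1 - fp / neg) - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

# Illustrative only: hypothetical GTV values (cm^3) and SL outcomes (1 = SL)
gtv = [40, 55, 60, 120, 150, 180]
sl = [0, 0, 0, 1, 1, 1]
print(youden_cutoff(gtv, sl))  # (120, 1.0)
```

For variables where lower values predict SL (e.g., pre-RT TLC), the comparison direction would be reversed.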
Discussion
Many studies have shown that RT can dramatically reduce TLCs and that low TLC nadirs correlate with poor survival in many solid tumors. Our results further suggest that the level of TLCs was related to the RT treatment duration in unresectable stage III NSCLC. TLCs declined steeply each week, reaching a nadir at approximately the 5th week of treatment. Increasing the dose per fraction to deliver the entire RT regimen within 4 weeks could significantly lower the risk of developing SL. This is in line with a recent study suggesting that stereotactic body radiotherapy (SBRT) could decrease the severity of RIL compared with CFRT in locally advanced pancreatic cancer [
18]. Finally, our results revealed a significantly reduced risk of disease progression and death in patients who did not experience SL during RT.
Radiation can suppress or stimulate the immune system. The contribution of lymphocytes to radiation-induced tumor control was shown in mouse models over 30 years ago [
19], and more recently, the availability of T cell receptor (TCR)-transgenic mice made it possible to unequivocally demonstrate that radiation can induce priming of T cells to exogenous model antigens expressed by tumors [
20,
21]. Those studies together with the demonstration that radiation induces immunogenic cell death [
22], have provided evidence that radiation can induce tumor-specific T cells. This study revealed an association between higher lymphocyte levels during treatment and better clinical outcomes. Maintaining an intact adaptive immune system during cancer therapy may be important for enhancing the effectiveness of cytotoxic therapies and improving cancer control. This may impact cancer recurrence by affecting the numbers of tumor-infiltrating lymphocytes, which correlate with the prognosis of multiple cancers [
23,
24]. This is consistent with our observation that TLC nadir was associated with worse PFS and OS.
The clinical implications of our findings may be even more pronounced in the context of the new therapeutic strategy of combining RT and immunotherapy in NSCLC. Given the promising results of the PACIFIC trial, consolidation treatment with ICIs is now recommended for patients with unresectable stage III NSCLC after definitive CRT [
3]. Previous work has shown that RIL may further compromise the therapeutic efficacy of ICIs through the loss of the effector cells that identify and destroy tumor cells [
25,
26]. The efficacy of ICIs relies on the modulation of lymphocyte activity and is dependent on circulating lymphocyte counts [
27]. SL at the onset of ICI was independently associated with poor survival [
25]. This further emphasizes the importance of preserving and maintaining circulating lymphocytes in the emerging era of clinical immunotherapy. Considering the potential therapeutic contribution of circulating lymphocyte populations, a hypofractionated regimen with fewer fractions and a shorter treatment course (within 4 weeks) may be a superior approach for combination with immunotherapies.
The pulmonary circulation is the portion of the cardiovascular system in which oxygen-depleted blood is pumped away from the heart to the lungs via the pulmonary artery. A separate system, the bronchial circulation, supplies blood to the tissues of the larger airways. The pulmonary arteries branch into very fine, thin-walled capillaries that cover the lung. Some tumors may also be adjacent to the heart, thoracic aorta, or great veins, all of which are filled with blood. Thus, the distribution of low-dose irradiation may include part of the heart or large vessels. One proposed mechanism for RIL is radiation exposure of the circulating blood pool, since lymphopenia occurs after irradiation of tissues, such as the breast and brain, that contain little bone marrow or lymphatic tissue [
28,
29]. Peripheral blood lymphocytes are known to be extremely sensitive to radiation despite mitotic inactivity [
30]. Nevertheless, an unexpected finding of our study was that the TLC nadir occurred at approximately the 5th week, rather than at the completion of RT or upon delivery of the maximal dose.
Multiple studies have demonstrated that pretreatment with low doses of irradiation can induce resistance to damage from subsequent high doses of irradiation [
31,
32]. Yovino et al. analyzed a modeled “typical” radiation plan for glioblastoma (60 Gy in 30 fractions) and found that a single fraction delivered 0.5 Gy to 5% of all circulating blood cells. Basler et al. likewise regarded the proportion of circulating lymphocytes exposed to at least 0.5 Gy as a surrogate parameter for radiation-induced immunosuppression [
33]. The circulating blood dose appeared to depend on the target volume and fraction number early in treatment. However, as treatment progressed to the 5th week of RT, nearly all circulating blood cells were projected to receive at least 0.5 Gy (mean dose, 2.2 Gy) [
16]. This simulation model might explain our finding that by the 5th week of RT, nearly all circulating lymphocytes had received a low dose (at least 0.5 Gy) of irradiation and developed resistance. The underlying mechanisms of this phenomenon are not well understood, and our findings provide an important basis for further investigation.
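The gradual accumulation of blood exposure over a fractionated course can be illustrated with a deliberately simplified toy model (our assumption for illustration, not the published model, which accounts for blood-flow kinetics): suppose each fraction independently exposes a fixed share of the circulating blood pool to at least 0.5 Gy, so the unexposed share decays geometrically with the number of fractions.

```python
def fraction_exposed(per_fraction_share, n_fractions):
    """Share of the circulating blood pool that has received >= 0.5 Gy after
    n fractions, assuming each fraction independently exposes a fixed share
    (per_fraction_share) of the pool. A toy model for intuition only."""
    return 1 - (1 - per_fraction_share) ** n_fractions

# With an assumed 5% of the blood pool exposed per fraction,
# the exposed share grows week by week (5 fractions per week):
for n in (5, 10, 15, 20, 25, 30):
    print(n, round(fraction_exposed(0.05, n), 3))
```

Under these assumptions, most of the blood pool has received a low dose by weeks 4–5 of a five-fraction-per-week schedule, qualitatively consistent with the nadir timing discussed above, although the absolute numbers should not be read as reproducing the published dose estimates.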
In addition to the treatment duration, pre-RT TLCs, GTV, and MLD were also found to strongly impact RIL in our study. A higher GTV and MLD tended to trigger greater depletion of peripheral lymphocytes owing to greater radiation exposure of those lymphocytes. To eliminate potential clinical factors influencing lymphopenia, we had excluded patients who received any other anticancer treatment besides CRT before disease progression, used systemic corticosteroids during RT except as pretreatment before chemotherapy, or had any ongoing infection, rheumatic disease, or other immune system disease. As for underlying respiratory system disease, our data showed that it had no impact on lymphopenia. The contribution of chemotherapy to lymphopenia has been disputed [
14,
34]. Our study showed that concurrent chemotherapy was not associated with SL, which was consistent with a previous study [
14]. Additionally, there was no difference in the median baseline TLCs or pre-RT TLCs in our study population, whether stratified by chemotherapy regimen or number of cycles (all
p > 0.05). This was consistent with the notion that the choice of chemotherapy regimen and the number of cycles had no correlation with lymphopenia in patients with stage III NSCLC [
35]. Thus, the chemotherapy regimens used in our study population did not play a major role in causing lymphopenia. Bone marrow suppression caused by chemotherapy primarily manifested as reduced peripheral neutrophil counts in our patients (data not shown). Given the potential effect of corticosteroids on circulating lymphocytes, patients who had received systemic steroids during RT were excluded from our analyses, unless steroids were used as pretreatment before chemotherapy [
36,
37]. However, logistic regression analyses showed that this dose of corticosteroids (equivalent to approximately 60–120 mg of prednisone) was not associated with an increased risk of developing SL. Clinical trials of ICIs typically excluded patients who received systemic steroids exceeding 10 mg of prednisone daily (or the equivalent) [
38]. Collectively, our findings indicate that transient corticosteroid administration and chemotherapy, at the doses used, had no obvious cytotoxic effect on circulating lymphocytes.
Several limitations should be noted in interpreting our results. First, the frequency of blood testing during RT was subject to variation, although the general practice at our institution was to perform blood testing prior to, and weekly during, RT. Second, given that the TLC nadir occurred at the 5th week of RT, we divided patients into two groups based on whether the RT course was completed within 4 weeks. However, the small number of patients in the STRT group reduced the statistical power of the analysis. For example, despite a large absolute difference in 3-year OS rates favoring the STRT group after adjustment, the difference did not reach statistical significance. Third, the selection of fractionation and dosing schemes was generally based on the treating physician’s preference. Fourth, unscheduled treatment interruptions due to malfunctions prolonged the OTT. Therefore, we cannot exclude the possibility that such occurrences biased the results.