Discussion
Our study highlights a worrying trend in malnutrition-related deaths among older adults. An initial decline in malnutrition mortality from 1999 to 2006 was not sustained: it was followed by a plateau between 2006 and 2013 and an eventual uptrend from 2013 to 2020, at a rate sharper than the initial rate of decline. As a result, malnutrition mortality rates among older adults in the U.S. now stand at a historic high.
The increase in mortality rates is likely attributable to a confluence of risk factors for malnutrition rather than a single culprit. Further, we suspect the changing mortality trends between 1999 and 2020 reflect a delicate balance between risk factors and protective factors for malnutrition, each of which, at different points in time, outweighed the other.
Because economic prosperity has long been established as a crucial determinant of population health [
17,
18], the potential role of economic factors in driving the trend is critical to note. At the beginning of the period of trend decline, the U.S. was still experiencing one of its longest economic expansions in history, coming out of the 1990s boom [
20]. A mild eight-month recession in 2001 was followed by several more years of economic growth [
20]. Furthermore, the U.S. at this point was spending more on healthcare than any other developed nation in the world, becoming a world leader in health technology [
21] and health research [
22]. The systemic factors supporting health in this period decreased all-cause AAMR in U.S. older adults from 5,220 per 100,000 in 1999 to 4640 per 100,000 in 2006 [
12].
The economic stressors that followed during the Great Recession may have placed tremendous financial and psychological stress on older adults, particularly the economically disadvantaged. This destabilization of protective factors for malnutrition was likely responsible for the plateau in the mortality trend from 2006 to 2013.
After years of progressive waning, protective factors could not prevent the 2013 uptrend in mortality. While the U.S. continued its astronomical spending on healthcare, the value derived from that spending in the older adult population has been decreasing over time [
23]. Upon closer scrutiny, high U.S. healthcare spending itself may have begun to drive up the mortality trend. Two drivers of the high cost of U.S. healthcare are care fragmentation (the dispersion of a patient’s healthcare across systems and providers) [
24] and unregulated fee-for-service. Care fragmentation can cause poor communication between healthcare providers with sometimes contradicting instructions being given to patients, while unregulated fee-for-service can lead to unneeded procedures, with risks of secondary complications [
23]. A recent study on U.S. medicine floors found that three-fourths of patients with malnutrition had at least one gap in care, most commonly involving discharge diet instructions, procedures, and inadequate communication, all of which hinder interventions [
25]. In line with the counterproductive nature of care fragmentation, screening of older adults for malnutrition is still not routinely performed – despite malnutrition being preventable with screening and appropriate interventions – because of uncertainty about which healthcare professional should perform it [
9].
Moreover, high healthcare spending could be diverting resources away from underfunded public health problems, thereby contributing to malnutrition. For example, despite several studies showing that training nursing home staff in diet quality improvement and feeding assistance improves residents’ malnutrition indices [
26,
27], funding for such quality improvement projects is still deficient [
28].
More broadly, the erosion of protective factors for geriatric health is reflected in other indices: all-cause mortality in older adults declined only from an AAMR of 4,267 per 100,000 in 2013 to 4,074 per 100,000 in 2019. The initial decline from 1999 to 2006 (~600 per 100,000) far outpaced that seen from 2013 to 2019 (~200 per 100,000) [
12]. Additionally, other specific causes of death among older adults were also rising over the same timeframe as malnutrition [
29,
30] and U.S. life expectancy, despite decades of persistent increases, decreased for three consecutive years after 2014 [
31]. This reinforces the argument that protective factors, which were once able to decrease malnutrition mortality, were systemic in nature.
Further contributing to the recent uptrend in malnutrition mortality are rising risk factors for malnutrition. Among the potential culprits is a growing burden of comorbidities. For instance, depression is an established risk factor for malnutrition [
32], due to its ability to decrease appetite and food intake [
33]. In the rural setting, older adults with depression were more than three times more likely to be malnourished than their non-depressed counterparts [
34]. Similar to the malnutrition mortality uptrend, the estimated prevalence of past-year major depressive disorder in older adults increased by a startling 60% from 2010/11 to 2018/19 [
35].
Diseases outside the realm of mental health have also been shown to increase the risk of malnutrition [
36]. Considering more than 80% of older adults in the U.S. have at least one chronic disease [
37], this may be an important additional contributor to our findings given the significant burden of malnutrition among these patients [
38‐
40]. Moreover, trends for chronic diseases showed an overall increase in the same timeframe as the start of the malnutrition mortality uptrend [
41].
The growing utilization of nursing homes, which more than doubled between 2000 and 2013 [
42], may also be an important factor to consider [
43]. A study collecting and analyzing data from seven nursing home rehabilitation wards found that more than one out of three patients were moderately to severely malnourished [
44]. It is difficult to attribute causality to nursing homes themselves, as they typically house vulnerable older adults [
45] who are inherently predisposed to malnutrition. Nevertheless, the adoption of effective interventions that can detect early signs of malnutrition is an important step in decreasing its prevalence in nursing homes.
Other maladies that defy biomedical labels, yet nevertheless crucially influence geriatric health, should also be considered. A previous study identified social isolation and subjective loneliness, two growing public health concerns [
46], as independent risk factors for malnutrition in older adults [
47]. A further pivotal risk factor for malnutrition is food insecurity [
48,
49], a state of limited access to food due to a lack of financial resources [
50]. A cross-sectional study on community-dwelling older adults reported that any category of food insecurity, even if mild, increased the risk of malnutrition relative to food-secure older adults [
51]. In line with the malnutrition mortality trend, analysis of the National Health and Nutrition Examination Surveys (NHANES) showed that food insecurity among older adults more than doubled, from 5.5% in 2007 to 12.4% in 2016 [
52]. This increase in food insecurity was most notable among those with lower incomes. Indeed, national reports show that 26.4% of U.S. older adults below the poverty line suffer from food insecurity compared to only 2.7% of older adults with incomes greater than twice the poverty line [
53]. A county-level analysis combining data from the American Community Survey, CDC WONDER, and the U.S. Department of Agriculture (USDA) has shown that household poverty is associated with increased rates of malnutrition-related deaths [
54].
Further, recent data indicate that, although poverty rates among older adults have declined significantly since 1975, this decline has plateaued since approximately 2011 [
55]. It is possible that the financial adversity incurred during the preceding recession adversely affected the ability of older adults to enter retirement with financial security [
56]. Unlike those of the overall population, rates of food insecurity among older adults had still not returned to their pre-Great Recession levels as of 2021 [
53]. Many Americans rely on fixed incomes as they age [
57], which can threaten their access to basic food given the ongoing retail food price inflation since 2013 [
58]. Because of the aforementioned links between economic adversity, the resulting food insecurity, and the increase in malnutrition mortality observed herein, tackling these trends will likely require stronger government-funded programs aiming to ensure food security among older adults.
The unequal distribution of modifiable risk factors, like poverty, is part of a larger problem of health disparity where certain socially disadvantaged groups systematically experience greater obstacles to health [
59]. Accordingly, stratification of malnutrition mortality by demographic characteristics showed that while all studied groups experienced the fall and rise of the trend, they did not experience it equally. Our study found the mortality trend disproportionately affected persons ≥ 85 years of age, females, Non-Hispanic Whites, those living in the West region of the U.S., and those living in urban areas.
The steep mortality uptrend in persons ≥ 85 years of age may be partly explained by a rising average age within this subgroup [
60]. As previously mentioned, aging is itself a risk factor for malnutrition [
3], with several observational studies showing the likelihood of malnutrition increasing continuously with age [
61‐
63].
As for gender, female older adults were more affected by malnutrition than their male counterparts in other studies as well [
1,
64,
65]. One possible explanation for this is life expectancy, with females in the U.S. living around five years longer than males [
66]. Additionally, females make up a greater share of nursing home residents than males (67% versus 33%) [
67]. Of note, some studies did not find a significant association between female gender and malnutrition [
68]. Instead, they reported that female gender is associated with functional limitation and disability, which are risk factors for malnutrition. This suggests that if the burden of functional limitation and disability, which more commonly affects females, can be overcome, then so can the resultant gender disparities.
Non-Hispanic Whites experienced the greatest rise in their malnutrition mortality trend. Indeed, they carry multiple risk factors, with reports of the highest rates of severe depression and decreased social contact compared with other race groups [
69]. Furthermore, Whites also make up the greatest share of nursing home residents (79%, versus 15% for Blacks and 6% for Hispanics) [
70]. However, it is worth noting that while Non-Hispanic Blacks had the smallest AAPC, their malnutrition mortality rate in 1999 was already high and remained consistently higher than that of any other included race group until the end of our study period. A previous cross-sectional analysis of community-dwelling older adults in adult day healthcare centers found that Blacks were at greater nutritional risk than any other race group, with close to 21% reporting eating fewer than two meals per day, compared with around 2% of Whites [
69]. Moreover, a review encompassing relevant literature from 1995 to 2021 found that persistent food insecurity was more than twice as likely in African-American homes than in the general population [
71].
Further, the West was the census region with the steepest rise in its malnutrition mortality trend, and many factors likely contribute. Recent data from the American Community Survey show that 11.6% of all counties in this region experienced an increase in poverty among those aged 65 and above [
72]. However, it is worth noting that the Northeast, which had a lesser uptrend in mortality in our analysis, had an even greater share of counties experiencing an increase in poverty (14.8%). This suggests that other risk factors could be at play in the West.
The higher uptrends of mortality in urban (as opposed to rural) areas may at first seem perplexing in light of known difficulties in accessing healthcare in rural areas and the consequently higher rates of mortality and shorter life expectancies [
73,
74]. This surprising finding may, in part, be related to differences in the degree of social and familial support between the two settings. Previous data have shown that older adults in rural settings report a greater degree of familial cohesion and support than those in urban settings [
75]. Therefore, older adults in rural settings who suffer food insecurity may be more likely than their urban counterparts to have their nutritional needs met via these more robust social and familial networks. Indeed, large central metropolitan and small metropolitan areas had the sharpest uptrends in malnutrition mortality rates in our analysis. Supporting this, data from the USDA show food insecurity rates to be highest in major metropolitan cities [
76]. However, it is worth noting that both urban and rural areas suffer substantial burdens of food insecurity, and differences in its prevalence may not entirely explain our results, as USDA data show that, on average, nonmetropolitan areas have slightly higher rates of food insecurity than metropolitan areas [
76].