Aspirin Use May Help Lower Colorectal Cancer Risk, Study Finds

Older male stops on side of road to drink water
Regular aspirin use is linked to a lower risk of colorectal cancer, but the risk is lowest among adults with healthy lifestyle habits, regardless of their aspirin use. AzmanL/Getty Images
  • People who use aspirin regularly have a lower risk of colorectal cancer compared to those who don’t use aspirin regularly, a new study shows.
  • The benefits were greatest for people with unhealthy lifestyles, such as moderate or heavy smokers and those with overweight or obesity.
  • People with the healthiest lifestyle — whether or not they used aspirin regularly — had a lower risk of colorectal cancer compared to people with the unhealthiest lifestyle who used aspirin.

In the United States, an estimated 152,810 people will be diagnosed with colorectal cancer in 2024, with more than 53,000 deaths this year due to this cancer, according to the National Cancer Institute.

While rates of colorectal cancer in the country declined by about 1% each year from 2011 to 2019, this has been mostly in older adults, the American Cancer Society (ACS) reports. In contrast, ACS said rates among people under 55 years old have increased by 1% to 2% since the mid-1990s.

Genetics plays a role in the development of colorectal cancer. For example, people whose parent, sibling, or child had colorectal cancer are at an increased risk.

However, lifestyle factors can also increase a person’s risk of colorectal cancer, including having overweight or obesity, having type 2 diabetes, eating an unhealthy diet, smoking tobacco, and drinking alcohol.

Now researchers from Massachusetts General Hospital and Harvard Medical School have found that regular aspirin use may lower colorectal cancer risk in people with unhealthy lifestyles.

The study was published Aug. 1 in JAMA Oncology.

Mixed evidence on the anticancer effects of aspirin

Previous research showed that regular aspirin use can lower the risk of colorectal cancer.

In 2016, the US Preventive Services Task Force (USPSTF) recommended low-dose aspirin for colorectal cancer prevention in adults ages 50 to 59.

However, in 2022, the USPSTF withdrew its recommendation, citing a lack of evidence showing that aspirin reduces a person’s chance of developing or dying from colorectal cancer.

Long-term use of aspirin can also cause gastrointestinal bleeding, ulcers and other complications.

Given that some earlier research showed that aspirin reduced the risk of colorectal cancer, the authors of the new study decided to look at whether this benefit was higher for people with certain lifestyle factors.

For the study, they examined data from more than 107,000 people who participated in the Nurses’ Health Study or the Health Professionals Follow-up Study.

The average age of participants was 49 years. They were all health professionals, and most were white. Additional research would be needed in more diverse populations to see if the results would be the same.

Researchers followed participants for more than three decades. During this time, participants completed surveys about five lifestyle factors: body mass index (BMI), whether they smoked tobacco or used alcohol, and their physical activity and diet. 

Participants also reported on their use of aspirin or other medications and whether they developed any diseases during the study period, including colorectal cancer. 

Researchers defined regular aspirin use as two or more standard-dose tablets per week or six or more low-dose tablets per week.
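
For rough context, assuming typical U.S. tablet strengths of 325 mg for standard-dose and 81 mg for low-dose aspirin (the article does not specify doses), the two thresholds imply broadly similar weekly amounts:

\[
2 \times 325\ \text{mg} = 650\ \text{mg/week}, \qquad 6 \times 81\ \text{mg} = 486\ \text{mg/week}
\]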

Aspirin lowers colon cancer risk for adults with unhealthy lifestyles

Overall, the risk of developing colorectal cancer over a 10-year period was 1.98% among participants who used aspirin regularly, compared with 2.95% for people who didn’t use aspirin regularly.

When comparing these two groups, researchers found that regular users of aspirin had an 18% lower relative risk of being diagnosed with colorectal cancer, compared with people who didn’t use aspirin or used it less often.
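
As a back-of-the-envelope illustration using the unadjusted 10-year figures above (the published 18% figure will not match this crude calculation exactly, likely because it comes from statistical models that account for other risk factors):

\[
\text{absolute risk reduction} = 2.95\% - 1.98\% = 0.97\ \text{percentage points}, \qquad \text{number needed to treat} \approx \frac{1}{0.0097} \approx 103
\]

On these crude numbers, roughly 100 people would need to use aspirin regularly for 10 years to prevent one additional case of colorectal cancer.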

Participants with unhealthier lifestyles benefitted the most from regular aspirin use in terms of lowering their relative risk of colorectal cancer. The greatest benefits occurred for moderate or heavy smokers and people with a BMI of 25 or greater.

BMI is a screening measure for having overweight or obesity. Generally, a healthy weight for adults 20 years and older is a BMI of 18.5 to less than 25. However, BMI is not always reliable during pregnancy or for athletes or older adults.
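
For reference, BMI is body weight in kilograms divided by the square of height in meters. A hypothetical example:

\[
\text{BMI} = \frac{\text{weight (kg)}}{\text{height (m)}^2}, \qquad \frac{77\ \text{kg}}{(1.75\ \text{m})^2} \approx 25.1
\]

So a 77 kg adult who is 1.75 m tall would sit just above the healthy-range cutoff of 25.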

The study also showed that people with healthier lifestyles benefitted from regular aspirin use, but less so. 

It also reinforces the overall benefits of a healthy lifestyle. People with the healthiest lifestyle — whether or not they used aspirin regularly — had a lower 10-year risk of colorectal cancer compared to regular aspirin users with the unhealthiest lifestyle.

The study does not show how regular aspirin use might help. Still, the authors point to previous research suggesting that aspirin may inhibit pro-inflammatory signals contributing to cancer growth.

Given the risks of long-term use of aspirin — such as gastrointestinal bleeding — the authors write that “these results support the use of lifestyle risk factors to identify individuals who may have a more favorable risk-benefit profile for cancer prevention with aspirin.”

A growing need for personalized medicine

Wael Harb, MD, a hematologist and medical oncologist at MemorialCare Cancer Institute at Orange Coast and Saddleback Medical Centers in Orange County, CA, said the new study’s results might prompt the US Preventive Services Task Force to reconsider the regular use of aspirin for the prevention of colorectal cancer. Harb wasn’t involved in the study.

However, Jason Zell, DO, MPH, a hematology-oncology specialist at the UCI Health Chao Family Comprehensive Cancer Center and associate professor at the UCI School of Medicine, questions whether the study is strong enough to change the USPSTF recommendations. Zell was likewise not involved in the study.

This was not a randomized controlled trial (RCT) that compared people who took aspirin to those who didn’t. “As such, this level of evidence is insufficient to change USPSTF recommendations,” he said.

In addition, “the USPSTF withdrew its recommendations for aspirin use — which was limited to a very small portion of the population anyway — in part due to complications such as bleeding,” Zell said.

The new study did not provide data on how many people taking aspirin regularly had bleeding or other complications, he said. This information could influence the assessment of the risks and benefits of regular aspirin use in people with unhealthy lifestyles.

Overall, “while this level of evidence is not enough to change physician recommendations about aspirin, it certainly could spawn more detailed future research,” Zell said, such as that “related to the benefits and risks of regular aspirin use in those with varying degrees of healthy or unhealthy lifestyles.”

Harb thinks the study, which implies that aspirin’s preventive benefits may vary based on individual lifestyle factors, “could lead to more personalized recommendations from physicians.”

Anton Bilchik, MD, PhD, surgical oncologist, chief of medicine and director of the Gastrointestinal and Hepatobiliary Program at Providence Saint John’s Cancer Institute in Santa Monica, CA, agreed. Bilchik wasn’t involved in the study.

“This study shows that taking two regular aspirins a week reduces the risk of getting colon cancer,” he said. “It also identifies groups of patients — particularly those that are obese or smokers — who are more likely to benefit from aspirin.”

Ask your doctor about regular aspirin use

Bilchik emphasizes that patients who are concerned about their risk of colorectal cancer should talk with their doctor before starting to take aspirin regularly.

“Although aspirin is a very safe drug, there is a group of patients that may be at higher risk of gastrointestinal bleeding,” he said. “It’s not a common side effect of aspirin, but no one should routinely take two large aspirins a week without consulting their doctor.”

When they do, they should also ask about other ways to reduce their cancer risk.

“This study adds to the growing body of evidence on the importance of diet and lifestyle modifications in preventing colorectal cancer,” Zell said.

These kinds of lifestyle changes can also reduce the risk of other types of cancer, cardiovascular disease, and other health problems, Harb pointed out, which is why doctors routinely recommend that patients address these risk factors.

However, “in reality, even with our best efforts, some of these factors might not be modifiable, or people might not be able to change them,” he said.

“So in the interim, while we’re trying to change this behavior, it’s reasonable to consider regular aspirin use as a way to reduce the risk of colorectal cancer,” he said.

Takeaway

Researchers examined data from more than 107,000 health professionals who took part in two long-term studies. Participants answered surveys about lifestyle factors and use of aspirin or other medications.

People who used aspirin regularly had a lower risk of being diagnosed with colorectal cancer, compared with people who didn’t use aspirin regularly. 

The largest benefit of aspirin was for people with the unhealthiest lifestyle, including moderate and heavy smokers, and people who had overweight or obesity.

People with the healthiest lifestyle — whether or not they used aspirin regularly — had a lower risk of colorectal cancer compared to people with the unhealthiest lifestyle who used aspirin regularly.

FDA Approves Blood Test for Colorectal Cancer Screening: What to Know

Guardant Health lab technician with a blood test
The FDA has approved a new colorectal cancer blood test by Guardant Health called Shield for Medicare coverage. Photo by Business Wire for Guardant Health
  • The FDA has approved a new colorectal cancer blood test by Guardant Health for Medicare coverage.
  • The test, called Shield, looks for “free-floating” fragments of cancer DNA in the bloodstream.
  • The Shield test also offers accessibility and convenience at the cost of reduced accuracy.
  • The ECLIPSE clinical trial, which preceded the FDA approval, found that the test accurately confirmed 83.1% of colorectal cancer cases.

A new blood test for colorectal cancer was approved this week by the Food and Drug Administration (FDA), making it the first test of its kind to meet requirements for Medicare coverage.

Guardant Health’s Shield is a noninvasive blood test touted as a simple, convenient alternative to other forms of colorectal cancer (CRC) screening. 

Colorectal cancer screening is a notoriously tricky issue: less than 60% of adults in the United States ages 45 to 75 get screened for the disease, despite it being the second leading cause of cancer-related deaths. Yet colorectal cancer is highly treatable when detected early.

Hesitancy around colorectal cancer screening may stem from the belief that colonoscopies and stool-based tests are onerous or unpleasant. However, finding a colorectal cancer screening test that is both accurate and noninvasive has proved challenging. 

“The persistent gap in colorectal cancer screening rates shows that the existing screening options do not appeal to millions of people,” Daniel Chung, MD, a gastroenterologist at Massachusetts General Hospital and professor of medicine at Harvard Medical School, said in a press release from Guardant Health.

“The FDA’s approval of the Shield blood test marks a tremendous leap forward, offering a compelling new solution to close this gap. This decision will help make screening tests more broadly accessible and propel blood-based testing and CRC screening into a new era,” Chung continued.

Can a blood test for colon cancer replace a colonoscopy?

Shield’s greater ease of use comes with a not-insignificant trade-off in screening accuracy.

Medical professionals contacted by Healthline said that the potential for missed diagnoses is concerning and that the test should not be viewed as a replacement for a colonoscopy.

“Most of us look at these types of tests as adjuncts or ‘helper’ tests to get more people screened and then possibly get more colonoscopies if they were positive,” Ben Park, MD, PhD, director of the Vanderbilt-Ingram Cancer Center at Vanderbilt University, told Healthline. Park was not affiliated with the Shield research but disclosed that he is currently engaged in separate clinical trials involving Guardant Health.

“The danger of these types of tests is if they’re negative, they’re not necessarily really negative,” Park said.

There is a shared sense of enthusiasm among doctors about the potential for Shield to get more people screened for colorectal cancer more frequently. But exactly how the test will fit into current recommendations is less clear. 

“The gold standard is still very much the colonoscopy. So, I would be cautious counseling patients that we don’t see this as an equal alternative to colonoscopy,” Christopher Chen, MD, an assistant professor of oncology and director of Early Drug Development at the Stanford Cancer Institute, Stanford Medicine, told Healthline.

Uri Ladabaum, MD, director of the Gastrointestinal Cancer Prevention Program at Stanford Medicine, told Healthline, “Guardant Shield is expected to have substantial net benefit if it extends CRC screening to persons who are unwilling or unable to undergo screening colonoscopy or stool-based screening.”

However, “if Guardant Health’s Shield were to substitute for screening colonoscopy or stool-based screening in those willing to use them, this is expected to worsen outcomes,” Ladabaum noted.

Blood test detects colon cancer with 83% accuracy

The approval of Shield arrives on the back of the ECLIPSE trial, the results of which were published in the New England Journal of Medicine in March 2024.

The trial included nearly 8,000 participants between the ages of 45 and 84. Shield is what’s known as a cell-free DNA test (cfDNA test), which looks for “free-floating” DNA molecules that are shed by cancer cells into the bloodstream.

The ECLIPSE trial found that in patients with CRC, confirmed by a colonoscopy, Shield accurately identified the cancer in 83.1% of cases. That means a significant number of participants with CRC (16.9%) received a false negative test result.
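
In screening terms, the 83.1% figure is the test’s sensitivity. A worked illustration using a hypothetical round number of patients:

\[
\text{sensitivity} = \frac{\text{true positives}}{\text{true positives} + \text{false negatives}} = 83.1\%
\]

So of every 1,000 people who actually have colorectal cancer, the test would be expected to detect about 831 and miss about 169.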

“You may not think that sounds like a big number, but when you translate it into the hundreds of thousands of patients who get diagnosed with colon cancer every year in this country alone, that is a significant number,” Park said.

“It could meaningfully risk missing a significant number of diagnoses,” Chen noted.

The concern for doctors is what happens when someone who isn’t also getting regular colonoscopies receives a false negative test result. 

“That’s where the potential danger is where, you know, we could potentially be giving false security or senses of security,” Park said.

The test also detects few precancerous lesions, identifying only about 13% of them. It is therefore not regarded as a preventive screen, since it mainly finds cancer that is already present.

Despite these caveats, the test represents a new opportunity to increase screening and adherence for a dangerous, pervasive form of cancer. But its true value will be found in expanding screening as an adjunct to colonoscopy, not replacing it.

The takeaway

A blood test for colorectal cancer, Guardant Health’s Shield, has just been approved by the FDA for Medicare coverage, putting the test in reach of many more people in the U.S. at risk for developing the disease.

The blood test is a convenient, non-invasive alternative to other forms of colorectal cancer screening that are perceived as time-consuming or unpleasant.

Doctors caution that the test should not be viewed as a replacement for the “gold standard” colonoscopy, but rather an adjunct test that can help to accommodate certain patients who might not otherwise be screened.

Gen X, Millennials in U.S. Face Higher Risk of 17 Cancers Than Older Generations

Young female adult with cancer sitting outside
A new study found that Gen Xers and Millennials have a higher risk of 17 types of cancer compared to previous generations. pocketlight/Getty Images
  • A new study investigates trends in 34 types of cancer in the United States.
  • For 17 cancer types, Generation X and Millennials have a higher risk than previous generations.
  • Similarly, for five cancer types, mortality risk is also higher in younger generations.
  • Many factors may be involved, but obesity likely plays a substantial role.

A new study published August 1 in The Lancet Public Health takes an in-depth look at cancer rates in the United States. The scientists measured incidence rates of cancer and cancer mortality in different generations.

They found that 17 cancer types were more common in recent generations, noting the incidence rate for some forms was 2–3 times higher in people born in 1990 than in 1955.

While the causes for these increases require more research, the authors cite obesity, diet, and environmental toxins as major contributing factors.

Higher cancer rates in younger generations

In a previous study published in 2019, the same authors found that the incidence of eight types of cancer had increased in younger generations compared with older generations.

However, no previous study had looked at both cancer incidence — the number of new cases — and cancer mortality by birth year. The latest study fills this gap.

As the authors explain, trends in cancer incidence in people aged 50 or younger mostly reflect an increased exposure to carcinogenic factors in early life or young adulthood.

So, they “foreshadow future disease burden as these young cohorts carry their increased risk into older age, when cancers most frequently occur.”

Increases in cancer incidence and mortality

To investigate, the scientists used data on 23,654,000 people diagnosed with 34 types of cancer and 7,348,137 deaths from 25 cancers between 2000 and 2019.

They found an increased incidence in 17 of the 34 cancers in progressively younger generations:

  • cardia gastric: a type of stomach cancer
  • small intestine
  • estrogen receptor-positive breast
  • ovary
  • liver and intrahepatic bile duct (in females)
  • non-HPV-associated oral and pharynx (in females)
  • anus (in males)
  • Kaposi sarcoma (in males): a form of cancer that starts in the lining of blood and lymph vessels
  • colorectal cancer
  • endometrial cancer
  • gallbladder and other biliary
  • kidney and renal pelvis
  • pancreas
  • myeloma: a blood cancer
  • non-cardia gastric: another type of stomach cancer
  • testis
  • leukemia: a blood cancer

The increase in incidence was particularly pronounced in cancers of the small intestine, thyroid, kidney and renal pelvis, and pancreas. Compared with people born in 1955, the incidence in those born in 1990 was two- to three-fold greater.

In five cancer types, mortality rates also increased:

  • liver and intrahepatic bile duct (in females)
  • endometrial
  • gallbladder and other biliary
  • testicular
  • colorectal

“These findings are sobering as they indicate the increased cancer risk in younger generations is not merely an artifact due to more frequent cancer detection and diagnosis,” explained study author Hyuna Sung, PhD.

“Instead, it points to a genuine increase in cancer risk at the population level, with the increase in incidence being substantial enough to outweigh improvements in cancer survival,” Sung told Healthline.

Why the sharp increase in cancer rates?

While this study was not designed to explain why these cancers increased, the researchers suggest that obesity likely plays a substantial role.

Ten of the 17 cancers listed in the study are associated with obesity. The authors explain that, since the 1970s, obesity has increased in all age groups, but the swiftest increase has been in younger people, aged 2–19 years.

This is backed up by other research demonstrating that excess weight and obesity at a younger age are associated with an increased risk of 18 forms of cancer.

According to the new paper, beyond overweight and obesity, other factors may also play a role, such as an increase in sedentary lifestyles, altered sleep patterns, and chemicals in the environment. However, much less is known about the importance of these factors.

How diet and the gut microbiome may affect cancer rates

The so-called Western diet, which is high in saturated fats, sugar, refined grains, and ultra-processed foods, is linked to increased cancer risk. 

“Emerging evidence suggests that ultra-processed food increases body weight but is also independently associated with the risk of some cancers, such as breast and colorectal,” Sung said.

Because some cancers affecting the digestive system are not related directly to obesity, the authors suggest that changes in the gut microbiome may also be a factor.

With the concurrent rise of the Western diet and widespread antibiotic use, the gut microbiome has changed substantially.

Although scientists do not fully understand the role of gut bacteria in cancer, the authors write that specific microbes and dietary patterns have now been linked to oral and gastrointestinal tract cancers.

How is epigenetics related to cancer?

Jennifer Dunphy, DrPH, a doctor of public health and co-founder of the Wellness Innovation Network, told Healthline she was particularly interested in the potential role of epigenetics in health.

“Epigenetic changes are changes to the expression of proteins from DNA, without changes to the DNA itself — usually as a result of environmental factors,” she explained. In other words, when an individual passes on their genes to their offspring, they also pass on certain changes to how these genes are expressed — how easily they are turned “on” or “off.”

“It seems that epigenetic changes are a large part of the equation here, meaning your DNA isn’t the only thing that matters — your behavior today is probably going to impact your offspring through heritable changes, just like the behavior and exposures of your parents and even your grandparents impact your health today,” Dunphy said.

This adds a fresh and rather bleak twist to the findings.

“While there are many harmful exposures we can and should prevent against,” Dunphy continued, “I do not think that, on average, we are doing all of this extensive damage to our bodies in one lifetime alone. I think we are carrying over vulnerabilities from our predecessors.”

Exposure to toxins may affect cancer rates

The authors suggest that certain environmental toxins may play a role in increasing cancer rates.

“The most harmful environmental toxin thought to contribute to carcinogenesis is the use of plastics and their breakdown products,” Walter Kim, MD, an integrative medicine physician with Brio-Medical, told Healthline. Kim was not involved in the study.

Dazhi Liu, PharmD, an oncology clinical pharmacy specialist and medical contributor for Drugwatch, not involved in the study, suggested other potential candidates, including:

  • aflatoxins
  • benzene
  • soot
  • arsenic
  • aristolochic acids
  • nickel compounds
  • radon
  • thorium
  • trichloroethylene
  • vinyl chloride
  • wood dust

Rates of some forms of cancer are declining

Despite the concerning implications of increasing cancer rates in younger generations, Sung noted a few silver linings.

“The accelerated downturn in the trend of cervical cancer incidence shows the effectiveness of HPV vaccination among women born around 1990, who were about 16 years old when HPV vaccination was first approved in the United States,” she said.

Sung added the recent decline in cancers of the lung, larynx, and esophagus was driven by the drop in smoking rates.

Mortality rates are also declining for many cancer types, even those with increased incidence. This, the scientists believe, is likely due to earlier detection through better screening, advances in treatment, or both.

“The five-year survival rate for pancreatic cancer in young adults increased significantly from 16.5% in 2000 to 37.2% in 2016,” Liu told Healthline. 

Sean Devlin, MD, chief medical officer for Brio-Medical, not involved in the study, told Healthline the silver lining is awareness.

“Further work and funding for earlier screening should be instituted in an effort to catch these diseases in a timely manner so that they can be adequately treated with the tools that we have,” Devlin said.

Takeaway

The incidence rates of 17 cancers and the mortality rates of five cancers are higher in Gen Xers and Millennials than in older generations, a new study reported. These increases are probably due to overweight and obesity, the Western diet, changes in the gut microbiome, and environmental toxins.

Red Meat Raises Dementia Risk, but Nuts and Beans May Have a Protective Effect

Person eats nuts from a container
A new study found a link between processed red meat and dementia risk but suggests nuts and legumes had a protective effect. Stefa Nikolic/Getty Images
  • A new study has found a link between processed red meat and dementia risk.
  • However, nuts and legumes appeared to protect against dementia.
  • The saturated fat and preservatives in processed meats might contribute to this risk.
  • Nutrients found in nuts and legumes, on the other hand, might protect brain health.
  • Experts suggest replacing processed red meat with healthier swaps like beans.

Eating at least one-quarter serving per day of processed red meats — such as hot dogs, lunch meat, and bacon — is linked to a greater risk of developing dementia.

This finding, presented on July 31 at the Alzheimer’s Association International Conference in Philadelphia, is based on a comparison with those eating less than one-tenth of a serving per day (equivalent to about three servings per month).
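
For context, the two intake thresholds convert to monthly amounts as follows, assuming a 30-day month:

\[
0.1\ \text{serving/day} \times 30 \approx 3\ \text{servings/month}, \qquad 0.25\ \text{serving/day} \times 30 = 7.5\ \text{servings/month}
\]

In other words, the higher-intake group ate roughly seven to eight or more servings per month.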

Additionally, the findings of the study suggested that a serving-for-serving replacement of processed red meats with plant foods like nuts and legumes, such as beans and peas, could help reduce dementia risk.

The Alzheimer’s Association describes dementia as an umbrella term for the loss of a person’s cognitive abilities — including memory, language, and problem-solving — that is severe enough to interfere with a person’s day-to-day functioning. This neurodegenerative condition occurs when brain cells become damaged and can no longer function properly.

Processed red meat raises dementia risk

To study how processed red meat influenced dementia risk, the team of scientists examined data from over 130,000 people who had participated in the Nurses’ Health Study and Health Professionals Follow-Up Study.

After following them for as long as 43 years, they found that 11,173 had gone on to develop dementia.

They were able to track what study participants ate by using food-frequency questionnaires that were done every two to four years. The questionnaires asked about how often people ate a serving of certain foods, including processed red meats and various nuts and legumes.

People who ate one-quarter serving or more of processed red meat daily had a 14% greater risk of dementia than those who ate less than one-tenth of a serving daily.

Additionally, the team looked at cognition in 17,458 people, finding that each added daily serving of processed red meat was associated with greater cognitive aging when it came to overall function and people’s ability to remember and understand language.

On the other hand, substituting nuts and legumes for processed red meat was associated with a 20% lower risk of dementia as well as fewer years of cognitive aging.

These findings highlight the benefits of a diverse diet to help reduce the risk of cognitive decline.

Why might processed red meat cause dementia?

Sham Singh, MD, a psychiatrist at Winit Clinic, who was not involved in the study, described the possible mechanisms that might underlie the link between processed red meat and dementia risk. He said the saturated fat and cholesterol content in foods like sausages and bacon are one way that these meats could play a role.

“Excessive intake of saturated fats can lead to the buildup of cholesterol plaques in the arteries, contributing to atherosclerosis and impairing blood flow to the brain,” Singh told Healthline.

“This reduction in blood flow can hinder the delivery of oxygen and nutrients to brain cells, potentially accelerating cognitive decline and increasing the risk of dementia.”

Singh further pointed to the cardiovascular impact of red meat consumption.

“According to my observations, the consumption of red meat has been strongly associated with cardiovascular diseases such as hypertension, coronary artery disease, and stroke,” he stated. “These conditions are related to vascular damage and inflammation throughout the body, including the brain.”

Chronic inflammation and vascular dysfunction can contribute to causing dementia, Singh added.

Finally, he discussed how cooking red meat at high temperatures, whether grilling, frying, or broiling, can form harmful compounds like heterocyclic amines (HCAs) and polycyclic aromatic hydrocarbons (PAHs).

“These compounds are known to induce oxidative stress and inflammation in the body, including the brain,” Singh said. “Oxidative stress contributes to cellular damage and accelerates aging processes, which are implicated in neurodegenerative diseases like Alzheimer’s and dementia.”

Varsha Khatri, a certified nutritionist at Prowise Healthcare, not involved in the study, agreed. She further noted that preservatives found in processed red meats, such as nitrates and nitrites, can form potentially harmful compounds in the body that might also increase dementia risk.

Nuts and legumes may preserve brain health

As the study indicated, replacing processed red meat with nuts and legumes could lower your dementia risk.

“Nuts and beans have important nutrients and antioxidants that support brain health,” Khatri said.

She noted these foods contain healthy fats like omega-3 fatty acids. These are anti-inflammatory and they aid in maintaining healthy cell membranes in the brain.

Khatri added that the fiber, vitamins, and minerals found in nuts and beans help to improve the health of our heart and blood vessels, which also reduces dementia risk by promoting better blood flow to the brain.

“Moreover, these plant-based foods also abound with polyphenols among other antioxidants needed to fight against oxidative stress, a key cause of dementia development,” she said.

Getting more nuts and legumes in your diet

To reap the benefits of eating more nuts and legumes, Khatri advised starting slowly as you incorporate more of these foods into your meals.

Some practical steps you can take include:

  • replacing the red meat in your recipes with beans
  • eating nuts as snacks
  • including more plant-based meals in your weekly menu
  • making new recipes that feature nuts and legumes

Finally, Khatri suggests doing your homework and being prepared to use these ingredients in your food preparation.

“Learn about the health advantages provided by nuts and beans, then ensure that you have enough stock for when you need to cook or snack on them,” she said.

Takeaway

A new study found that eating more processed red meat was associated with a greater risk for dementia. However, replacing processed red meat with nuts and legumes was linked to a reduced risk for the condition.

Substances found in processed red meat like saturated fat, cholesterol, and preservatives might contribute to dementia risk through various mechanisms. Nuts and legumes contain nutrients that can protect brain and cardiovascular health.

Start slowly and gradually replace the red meat in your diet with nuts and beans to help reduce your risk of cognitive decline.

NFL Brothers Travis and Jason Kelce Launching New Cereal ‘Kelce Mix’: What to Know

Travis and Jason Kelce holding boxes of cereal.
NFL players Travis and Jason Kelce have teamed up with General Mills to launch a new breakfast cereal, “Kelce Mix,” which is a unique blend of the brothers’ three favorite cereals. Image Provided by General Mills
  • Travis and Jason Kelce are collaborating with General Mills to launch a new breakfast cereal.
  • The cereal, Kelce Mix, contains a combination of their breakfast favorites.
  • However, the brothers have been called out for promoting a cereal that could be unhealthy.
  • The brothers responded to the criticism by claiming it was okay in moderation.
  • Nutritionists have weighed in, saying children would be better off eating less sugar.

Kansas City Chiefs tight end Travis Kelce and his brother Jason have announced that they are doing a collaboration with General Mills, bringing to market a cereal based on their childhood favorites.

The creation, which is called “Kelce Mix,” drew inspiration from a discussion the pair had on their “New Heights” podcast last year in which the siblings ranked their favorite cereals.

It features Travis and Jason’s three favorites, Cinnamon Toast Crunch, Lucky Charms, and Reese’s Puffs, all in one box.

There will also be four limited-edition Kelces’ Pick collectible boxes with these three cereals as well as Honey Nut Cheerios.

However, while Travis Kelce seemed stoked about the deal, calling it a “full-circle moment,” some, like Calley Means, have had a less rosy outlook on the celebrity endorsement.

Means, the best-selling author of “Good Energy: The Surprising Connection Between Metabolism and Limitless Health” and the founder of TrueMed, recently took to his X account with the blunt advice that “[a]thletes should stop sponsoring food that destroys kids’ metabolic health.”

Jason Kelce soon clapped back via his X account, saying, “I grew up on these products, Calley, and I was a perfectly healthy fit child because I enjoyed them in moderation and when on the go for quick meals when both my parents didn’t have time to cook.”

But is Kelce right? Can these cereals really be a part of a healthy diet?

How unhealthy is Kelce Mix cereal?

Varsha Khatri, RDN, with Prowise Healthcare, noted that, while Kelce Mix combines three familiar breakfast cereals, no specific information detailing ingredients has been made available.

“Nevertheless, such cereals are generally made up of a combination of sweet-tasting grains, man-made flavors and additives that may alter their nutrient content greatly,” she said.

Khatri went on to explain that children’s cereals tend to be made with a lot of sugar, sometimes containing as much as 12–15 grams per serving.

“This is already high, since it could be about half the American Heart Association’s recommended daily sugar limit for children, which should not exceed 25 grams between ages 2 and 18,” she said.
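
Taking the 12–15 gram range Khatri cites, the arithmetic against the American Heart Association’s 25-gram daily ceiling works out as follows:

\[
\frac{12\ \text{g}}{25\ \text{g}} = 48\%, \qquad \frac{15\ \text{g}}{25\ \text{g}} = 60\%
\]

So a single serving can supply roughly half to three-fifths of a child’s daily limit.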

Khatri additionally remarked that high levels of sugar intake have been associated with conditions like obesity, type 2 diabetes, and dental problems.

Sugary cereals are also problematic because eating large amounts of sugar can increase blood sugar and cause energy crashes, leading to cravings for even more sugary foods.

“This cycle might be extremely injurious to the metabolic health of children due to their still-growing bodies that can highly suffer from what excess consumption of sugar does,” said Khatri.

What can children eat that’s a healthier alternative?

Lesley Kumar, RD, CNS, a nutrition consultant with Ringside24, suggests that there are several breakfast options that are healthier for your child.

One is plain oatmeal that you’ve prepared yourself.

“Oatmeal is rich in fiber and can be naturally sweetened with fruits like bananas or berries, according to your preferences,” she explained.

Another possibility, per Kumar, is Greek yogurt with fresh fruit, which gives your child a healthy dose of protein and probiotics. “It can also be topped with nuts or seeds for added nutrients,” she suggested.

Nutritious smoothies made by blending fruits, vegetables, and a protein source such as Greek yogurt or protein powder are also a great choice, according to Kumar.

However, if you really can’t live without cereal, she advised going for whole grain cereals like Cheerios or bran flakes. These are low in sugar and high in fiber.

“These alternatives offer balanced nutrition without the excessive sugar found in Kelce Mix,” Kumar concluded.

Takeaway

NFL players Travis and Jason Kelce are collaborating with the cereal manufacturer General Mills to release a cereal called Kelce Mix.

The cereal combines three of their favorites: Cinnamon Toast Crunch, Lucky Charms, and Reese’s Puffs.

The announcement hasn’t been without controversy, however, with Calley Means condemning the cereal as unhealthy for children.

While the brothers were quick to defend the cereal as being healthy in moderation, nutritionists say breakfast cereals are often high in sugar, which can damage children’s growing bodies.

They say healthier options are foods like oatmeal, Greek yogurt, fruit smoothies, and whole grain cereals that are lower in sugar.

Eating a Healthy Diet with Less Sugar May Slow Signs of Biological Aging

Father and son cooking together.
Eating a healthy diet that is low in sugar may help slow signs of biological aging, a new study suggests. Frazao Studio Latino/Getty Images
  • A new study has linked sugar consumption with signs of biological aging.
  • Researchers observed how diet and sugar affected the epigenetic age of Black and white women in midlife.
  • Epigenetic age is not the same as chronological age and is indicative of how behavioral and environmental factors affect “wear and tear” on the body at a cellular level.

In one of the first studies to do so, scientists have linked dietary sugar intake to epigenetic (also referred to as biological) aging.

Excess sugar consumption is already known to increase the risk of chronic disease; now, it also appears to speed up signs of aging at a cellular level. The findings were published in the journal JAMA Network Open.

Researchers also examined the impact of a healthy diet on epigenetic markers and found the opposite effect: a higher-quality diet slowed the signs of aging.

These opposing effects also appear to be independent of one another, so both overall diet quality and sugar consumption are worth evaluating for their effects on health and aging.

“Our findings appear to fit well with the general nutritional epidemiology literature that finds added sugars to be related to chronic diseases such as cardiometabolic conditions and cancer and related processes such as inflammation, all of which track with aging and are one manifestation of wear and tear on our bodies over time,” Dorothy Chiu, PhD, a Postdoctoral Scholar at the UCSF Osher Center for Integrative Health, and first author of the study, told Healthline.

Heidi J. Silver, PhD, RD, a Research Professor of Medicine at Vanderbilt University Medical Center who wasn’t affiliated with the study, told Healthline, “While the current studies do not show cause and effect, improving overall quality of the diet might contribute to slowing age or environmentally-related epigenetic changes.”

Sugar speeds up the body’s epigenetic “clock”

Chiu and her team investigated a cohort of 342 women in midlife. Importantly, the cohort was evenly split between Black (171) and white (171) participants and composed of individuals from different socioeconomic backgrounds. Prior scientific studies on diet and epigenetics have historically been limited to white individuals, making them hard to generalize to a broader population.

The participants were selected from the NGHS, a study originally commissioned in the 1980s to examine cardiovascular health in white and Black females between the ages of 9 and 19. The same group was then recruited again between 2015 and 2019, when all of the women had entered midlife — the average age being 39.

To assess the role of diet and sugar on epigenetic age, researchers first utilized a number of indices to gauge diet quality. These included the aMED Index, which scores a diet based on how closely it adheres to the Mediterranean diet, and the Alternative Healthy Eating Index.

Researchers then compared scores from these dietary indices to a novel epigenetic clock known as GrimAge2. GrimAge2, like other epigenetic clocks, relies on interpreting DNA methylation, a natural process that affects gene expression.

“DNA methylation (DNAm) is one way genes are modified and are turned on and off. It is used as a reliable indicator of epigenetic age because these patterns in DNAm have been observed to accumulate over time and are related to biological age,” said Chiu.

Researchers anticipated that a higher quality diet would slow signs of epigenetic aging, while sugar consumption would do the opposite. They were right.

Interestingly, however, overall diet quality showed a far stronger association with epigenetic age than sugar intake did.

“Since the authors found a stronger association with diet quality and epigenetic age, it would be wiser to focus on the overall quality of the diet. Reducing added sugars intake would be one way to improve diet quality, especially if those calories are replaced with other ways to increase diet quality,” said Silver.

What is epigenetic age?

Chiu’s research is part of a growing field known as geroscience, which seeks to understand in scientific terms how aging, disease, and biology are all related. One of the important distinctions made in the field is between chronological age and biological or epigenetic age. 

When you celebrate your birthday, that’s representative of your chronological age; one year is the same amount of time for everyone. However, epigenetic age indicates the health of your body at a cellular level, and it doesn’t move at the same rate for everyone. 

Epigenetics refers to the impact that behavioral and environmental factors have on the age of your body.

For example, the epigenetic age of someone who eats healthy and exercises every day may increase more slowly than someone who is sedentary and consumes high amounts of sugar.

Epigenetic changes are also reversible, which means that your behavior, diet, and exercise can affect the aging process for better or worse.

“Epigenetic age reflects modifications of our genetic material or DNA that can result in changes in our gene and protein expression,” said Chiu.

 “These modifications end up turning genes on or off, which can have health implications depending on how the biological functions and physiology of our cells and systems are impacted,” she said.

The bottom line

Among a cohort of Black and white women in midlife, sugar intake and diet quality were predictors of epigenetic aging.

A healthy diet appears to slow the body’s biological “clock,” while consuming sugar does the opposite.

The role of diet in epigenetic aging is part of the growing field of geroscience, which seeks to understand scientifically how biology, diet, aging, and disease are all related.

Long-Term Exposure to Wildfire Smoke May Raise Your Risk of Dementia

Firefighter fighting a wildfire.
New research suggests long-term exposure to wildfire smoke may increase the risk of dementia more than other types of air pollution. Tayfun Coskun/Anadolu via Getty Images
  • Long-term exposure to wildfire smoke may increase the risk of being diagnosed with dementia, new research suggests.
  • The risk of dementia diagnosis was higher for wildfire smoke than for other types of air pollution, such as that emitted by motor vehicles and factories.
  • Experts recommended reducing your risk by using air filters at home and wearing a high quality mask when going outside during poor air quality days.

Wildfires can be devastating for communities and natural ecosystems. But wildfire smoke also poses a direct threat to human health, damaging not only the heart and lungs but also the brain.

New research suggests that when it comes to brain health, wildfire smoke may even be more harmful than other types of air pollution.

In a preliminary study reported on July 29 at the Alzheimer’s Association International Conference in Philadelphia, scientists found that long-term exposure to wildfire smoke increases the risk of being diagnosed with dementia.

The risk was higher for exposure to wildfire smoke than for other types of air pollution, such as pollution emitted by motor vehicles and factories.

This study “adds to the evidence that not only does [wildfire smoke] pose a risk to our long-term cognitive health, but it may also be more dangerous to our health than [fine particulate matter] from other sources,” said Stephanie Cleland, PhD, MSPH, assistant professor in the Faculty of Health Sciences at Simon Fraser University, who was not involved in the new research.

Last year, Canada experienced the highest wildfire carbon emissions on record, and Greece experienced the largest wildfire to date in the European Union. The fires in Canada led to poor air quality alerts in many parts of the United States. 

As human-driven climate change causes wildfires to grow more intense and destructive, wildfire smoke will become a problem for more people, with longer-term impacts.

“A lot of research on wildfire smoke has focused on short-term exposure because we previously viewed it as an intermittent, infrequent occurrence,” said Cleland.

“But with climate change playing a role in drying out forests, reducing snow levels, and increasing temperatures, we’re seeing an increase in the frequency and intensity of wildfires, and as a result, an increase in wildfire smoke,” she said.

“This study shows that it’s not just your exposure during the one week that it’s smoky [that matters for your health],” she said. “It’s your exposure over years of wildfire smoke, especially if you live in places like California, Washington, Oregon, or British Columbia that have frequent wildfires.”

Dementia risk of wildfire smoke: what’s the connection?

Wildfire smoke contains a type of air pollution known as fine particulate matter (also known as PM2.5), a mixture of solid particles and liquid droplets that are 2.5 microns or smaller in diameter — 20 to 30 times smaller than the width of an average human hair.
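
As a quick check on that size comparison, assuming an average human hair width of roughly 70 microns (a commonly cited figure):

\[
\frac{70\ \mu\text{m}}{2.5\ \mu\text{m}} = 28
\]

which falls within the stated 20-to-30-fold range.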

Air pollution emitted by motor vehicles and factories also contains fine particulate matter.

Because of their small size, these particles can travel deep into the lungs — and may even enter the bloodstream — where they can harm the lungs and heart. 

A growing number of studies suggest that fine particulate matter may also affect the brain, resulting in an increased risk of dementia, stroke, and headache.

Research by Cleland and her colleagues, published in 2022 in Environmental Health Perspectives, even found that short-term exposure to wildfire smoke may lead to decreases in certain types of cognitive function, such as attention.

In the new study, researchers examined the health records of over 1.2 million members of the Kaiser Permanente health system in southern California who were 60 years or older between 2009 and 2019. None had been diagnosed with dementia at the beginning of the study.

Researchers used air quality monitoring data and satellite imagery to estimate members’ exposure to fine particulate matter based on where they lived. They also looked separately at levels of wildfire and non-wildfire fine particulate matter.

People with greater exposure to wildfire smoke were more likely to be diagnosed with dementia, researchers found.

The risk of a dementia diagnosis increased by 21% for every increase of 1 microgram per cubic meter in the three-year average wildfire-related fine particulate matter exposure.

In contrast, for other types of fine particulate matter exposure, the risk of dementia diagnosis increased 3% for every increase of 3 micrograms per cubic meter in the three-year average fine particulate matter exposure.
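
Put on a common per-microgram scale, and assuming the reported associations scale roughly linearly (an assumption; the study reports them only at these increments), the two estimates compare as follows:

\[
\underbrace{21\% \text{ per } 1\ \mu\text{g/m}^3}_{\text{wildfire}} \qquad \text{versus} \qquad \underbrace{3\% \text{ per } 3\ \mu\text{g/m}^3 = 1\% \text{ per } 1\ \mu\text{g/m}^3}_{\text{other sources}}
\]

That is roughly a twentyfold difference in the per-unit association, consistent with the authors’ conclusion that wildfire smoke carries a disproportionately higher risk.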

The new study has not been published yet in a peer-reviewed journal, so the results should be viewed with caution.

Some communities are impacted more

For their analysis, researchers took into account many factors that might affect the results, such as age, sex, cigarette smoking, and the poverty level of the neighborhood. This and the large number of people included are strengths of the study.

However, while “the study accounted for as many factors as they could in the Kaiser database [that might affect the results], there are likely other environmental factors not captured in their data set,” said Keith Vossel, MD, director of the Mary S. Easton Center for Alzheimer’s Research and Care at UCLA, who was not involved in the research.

For example, “it is unclear how many individuals wore protective masks when wildfires were occurring,” he told Healthline.

The results of the new study also showed that people from racial and ethnic minorities and those living in high-poverty areas appeared to be more impacted by wildfire smoke.

Cleland said this fits with other research on air pollution, in general, which has found that people living in more disadvantaged areas or who have less access to resources tend to have a higher risk of health problems related to air pollution.

In terms of exposure to wildfire smoke, people living in communities at higher risk may live in homes that have less high quality air filtration, or they may not have access to other indoor spaces with clean air, she said.

Lilah Besser, PhD, MSPH, a research assistant professor at the Miller School of Medicine’s Comprehensive Center for Brain Health, said the results of the new study are in line with earlier research showing a connection between exposure to fine particulate matter and a risk of dementia.

“This study adds to that body of research by suggesting an even greater risk of dementia among those exposed to wildfire (versus non-wildfire) PM2.5 exposure,” she told Healthline.

However, researchers estimated people’s exposure to fine particulate matter, which may not always be accurate. Future research should measure people’s actual exposure to wildfire smoke and other types of air pollution, said Besser, who was not involved in the new research.

In addition, researchers should “investigate the components of wildfire smoke other than PM2.5 that may be the root cause of the amplified dementia risk,” she said.

Study author Holly Elser, MD, PhD, a neurology resident at the Hospital of the University of Pennsylvania in Philadelphia, said in a news release that wildfire smoke may be more harmful to the brain because fine particulate matter from wildfires is produced at higher temperatures, contains higher amounts of toxic chemicals, and has smaller particles.

How to reduce health risks from wildfire smoke

Cleland said there are several things people can do to protect their health from wildfire smoke. These will also protect you from other sources of air pollution.

“First, check the local air quality where you live,” she said. “That can help you make informed decisions.”

If the air quality is at a dangerous level, you might want to avoid going outside, she said, or if you do have to leave your home, wear a high-quality respirator like an N95 mask.

Another thing people can do is create a safe space indoors.

“If you’re able to, seal the doors and windows and run an air cleaner. That’s going to create a space in which, even if it’s smoky outside, you have clean, safe air to breathe,” she said.

High-efficiency particulate air (HEPA) purifiers can filter smoke, dander, and pollen from the air. Commercial devices are available, but you can also build an air filter using four high-quality filters and a box fan. One of the most popular DIY air purifiers is the Corsi-Rosenthal box.

If you don’t have access to an air purifier for your home, “public spaces, like malls, libraries or community centers often have built-in high-filtration air systems,” said Cleland. “So accessing those spaces during smoky periods can be a good way to reduce your exposure.”

Besser said community strategies are also needed to reduce the risk of exposure among the most vulnerable populations.

“This could include local agency strategies such as N95 mask distribution and educational efforts to community organizations about Air Quality Index (AQI) alerts and the risks of air pollution,” she said.

Takeaway

Researchers examined the health records of over 1.2 million members of the Kaiser Permanente health system. They also used air quality monitoring data and satellite imagery to estimate members’ exposure to fine particulate matter in wildfire smoke and other sources.

Fine particulate matter, also known as PM2.5, is small enough to travel deep into the lungs and may enter the bloodstream. Research shows that this type of pollution, which is also emitted by motor vehicles and factories, can damage the lungs, heart, and brain.

In the study, people who were exposed to greater levels of wildfire smoke had a higher risk of being diagnosed with dementia. This risk was even higher than that caused by exposure to non-wildfire sources of pollution.

Daily Supplements May Slow ‘Dry’ Form of Age-Related Macular Degeneration

A female wearing glasses taking a supplement with a glass of water.
A formulation of over-the-counter antioxidant supplements may help prevent the progression of dry age-related macular degeneration, a new study finds. filmstudio/Getty Images
  • Dry age-related macular degeneration is an eye disorder that can lead to blindness. It affects millions of Americans.
  • A formulation of over-the-counter antioxidant supplements appears to prevent the progression of the disease, even in its advanced stage, known as geographic atrophy.
  • There is no cure for AMD, and current pharmaceutical options can be cumbersome and expensive.

A widely available formulation of antioxidant supplements appears to slow the progression of dry age-related macular degeneration (AMD).

Dry AMD is a common eye disorder in individuals aged 55 and over that causes blurred vision and is a leading cause of legal blindness.

As its name suggests, the condition occurs naturally during aging and is more common in older individuals. It may have little to no effect on vision but may also progress to a more severe, vision-threatening form known as geographic atrophy.

Based on recent estimates, dry AMD is believed to affect nearly 20 million Americans aged 40 and older. However, less than 1% of them have a vision-threatening form of it.
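
As rough arithmetic on those two figures:

\[
1\% \times 20{,}000{,}000 = 200{,}000
\]

so fewer than roughly 200,000 Americans would be expected to have the vision-threatening form.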

There is no treatment to reverse damage from dry AMD. However, new evidence suggests that a cocktail of over-the-counter antioxidant supplements can slow progression of the disease significantly, even in individuals with geographic atrophy.

55% less disease progression over placebo

In an article published this month in the journal Ophthalmology, researchers from the National Eye Institute found that antioxidant supplementation slowed progression by as much as 55% over three years compared to a placebo.

“These findings are very significant as geographic atrophy affects approximately 5 million people worldwide and, up until recently, we had no treatments to prevent geographic atrophy occurring, slow its expansion, or restore vision to affected areas. The oral supplements have the advantages of a large treatment effect, excellent safety profile, ease of use, and low cost,” Tiarnan D. L. Keenan, BM BCh, PhD, a researcher in the Division of Epidemiology and Clinical Applications at the National Institutes of Health’s National Eye Institute, and first author of the study, told Healthline.

Theodore Leng, MD, an Associate Professor of Ophthalmology at Stanford Medicine who wasn’t affiliated with the study, told Healthline, “This study was great because it really confirms some of our suspicions about vitamin supplementation in this specific form of advanced macular degeneration.”

Antioxidant cocktail shows promise

Researchers conducted a post hoc analysis of two major clinical trials that previously investigated the link between antioxidant supplementation and dry AMD progression. Those trials, known as AREDS and AREDS2, homed in on six supplements believed to support eye health and slow disease progression:

  • vitamin C
  • vitamin E
  • lutein
  • zeaxanthin
  • zinc
  • copper

The original studies found that taking this combination of supplements reduced the risk of progression from intermediate to advanced AMD by one-fourth. However, there was nothing to suggest that the supplements would slow progression in individuals whose eyes had already advanced to geographic atrophy, the most extreme form of dry AMD.

Keenan and his team wanted to find out whether the AREDS2 supplement formulation could also help slow disease progression in individuals with the advanced form of the disease.

“Our study shows that the oral supplements have an important role even when geographic atrophy is present,” he told Healthline.

Specifically, they found that over a three-year period, the eyes of individuals with geographic atrophy who took antioxidants showed 39.8 microns of disease progression, compared with 73.2 microns in the eyes of those who took a placebo. That’s 55% less progression simply by taking an antioxidant supplement.

Wet vs dry AMD

The vast majority, about 90%, of all cases of AMD are the dry form. However, about 10% of AMD cases may progress to another form, known as wet AMD.

Dry AMD is associated with the presence of large yellow protein deposits under the retina, known as drusen. These deposits damage a small but important area at the back of the eye known as the macula, which allows your eyes to precisely focus on objects in front of you. Dry AMD may progress, but doesn’t always, to geographic atrophy, which can cause permanent loss of vision.

“Imagine you have a camera or a computer screen and you have dead pixels on the screen. That area of dead pixels just expands slowly over time. That’s basically the patient’s experience who has geographic atrophy. They have areas of the retina that are dead or not functional and that area grows in a kind of concentric manner,” said Leng.

Sometimes, wet AMD may develop. Wet AMD, or neovascular AMD, is defined by the abnormal presence of blood vessels under the retina that can cause swelling and bleeding. This bleeding and fluid accumulation are where the condition derives its “wet” moniker.

Wet AMD tends to be more severe, progresses more quickly, and always affects the central vision of the macula. It is always considered an advanced stage of AMD.

At this time, AREDS2 antioxidant supplements only appear to be beneficial for dry AMD, not wet.

“Individuals with geographic atrophy should benefit from the AREDS2 formulation supplement. In addition, our previous research has shown that a healthy diet (particularly a Mediterranean-type dietary pattern) and avoiding smoking are strongly associated with slower atrophy growth rates. So a healthy lifestyle is important alongside the supplements,” said Keenan.

The bottom line

A combination of antioxidant supplements known as AREDS2, which includes vitamin C, vitamin E, lutein, zeaxanthin, zinc, and copper, helps slow the progression of dry AMD, even in the advanced stage known as geographic atrophy.

The new analysis found that, compared with placebo, those taking the antioxidant formulation showed 55% less disease progression.

The supplements are not recommended for wet AMD, a less common but more severe form of AMD.