Study Suggests Link Between Regular Aspirin Use, Increased Risk Of Age-Related Macular Degeneration

Source: JAMA and Archives Journals

CHICAGO – Regular aspirin use appears to be associated with an increased risk of neovascular age-related macular degeneration (AMD), a leading cause of blindness in older people, and the association appears to be independent of a history of cardiovascular disease and smoking, according to a report published Online First by JAMA Internal Medicine, a JAMA Network publication.

Aspirin is one of the most widely used medications in the world and is commonly taken to prevent cardiovascular events such as myocardial infarction (heart attack) and ischemic stroke. While a recent study suggested that regular aspirin use was associated with AMD, particularly the more visually devastating neovascular (wet) form, other studies have reported inconsistent findings. Smoking is also a preventable risk factor for AMD, the authors write in the study background.

Gerald Liew, Ph.D., of the University of Sydney, Australia, and colleagues examined whether regular aspirin use (defined as once or more per week in the past year) was associated with a higher risk of developing AMD by conducting a prospective analysis of data from an Australian study that included four examinations during a 15-year period. Of 2,389 participants, 257 individuals (10.8 percent) were regular aspirin users.

After the 15-year follow-up, 63 individuals had developed incident neovascular AMD; 24.5 percent of these cases occurred in regular aspirin users, according to the results.

"The cumulative incidence of neovascular AMD among nonregular aspirin users was 0.8 percent at five years, 1.6 percent at 10 years, and 3.7 percent at 15 years; among regular aspirin users, the cumulative incidence was 1.9 percent at five years, 7 percent at 10 years and 9.3 percent at 15 years, respectively," the authors note. "Regular aspirin use was significantly associated with an increased incidence of neovascular AMD."

The authors note that any decision concerning whether to stop aspirin therapy is "complex and needs to be individualized."

"Currently, there is insufficient evidence to recommend changing clinical practice, except perhaps in patients with strong risk factors for neovascular AMD (e.g., existing late AMD in the fellow eye) in whom it may be appropriate to raise the potentially small risk of incident neovascular AMD with long-term aspirin therapy," the authors conclude.

(JAMA Intern Med. Published online January 21, 2013. doi:10.1001/jamainternmed.2013.1583.)

Editor's Note: This study was supported by project grants from the National Health & Medical Research Council Australia. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Commentary: Relationship of Aspirin Use with Age-Related Macular Degeneration

In an invited commentary, Sanjay Kaul, M.D., and George A. Diamond, M.D., of Cedars-Sinai Medical Center, Los Angeles, write: "This study has important strengths and limitations. It provides evidence from the largest prospective cohort with more than five years of longitudinal evaluation reported to date using objective and standardized ascertainment of AMD."

"The key limitation is the nonrandomized design of the study with its potential for residual (unmeasured or unobserved) confounding that cannot be mitigated by multivariate logistic regression or propensity score analysis," the authors continue.

"From a purely science-of-medicine perspective, the strength of evidence is not sufficiently robust to be clinically directive. These findings are, at best, hypothesis-generating that should await validation in prospective randomized studies before guiding clinical practice or patient behavior," the authors conclude. "However, from an art-of-medicine perspective, based on the limited amount of available evidence, there are some courses of action available to the thoughtful clinician. In the absence of definitive evidence regarding whether limiting aspirin exposure mitigates AMD risk, one obvious course of action is to maintain the status quo."

(JAMA Intern Med. Published online January 21, 2013. doi:10.1001/jamainternmed.2013.2530.)

Editor's Note: Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.
