In present-day Western Europe and North America, the dementia research field is in as much political turmoil as mainstream politics. And the contending forces at play in both domains are often the same: individual initiative or collective solidarity, technological solutions or community development and public health, for-profits or nonprofits, unbridled capitalism or regulatory constraint.
So too are the same types of questions often asked: where should we allocate resources? How much should we rely on capitalism, free markets, and science to solve our problems, and how much on community investments, infrastructure, and the initiative of nonprofit organizations? To what degree are outcomes guided by personal responsibility, and to what degree by socioeconomic determinants and the flawed (rigged) systems that perpetuate them? In short, both domains seem to reflect a ratcheting up of tensions between neoliberalism and social democracy in our current milieu.
These tensions were recently manifest in US President Barack Obama’s signing of the 21st Century Cures Act, passed by both houses of Congress with bipartisan support. The bill provides $4.8 billion in new funding for the National Institutes of Health, with $1.8 billion earmarked for cancer research, $1.6 billion for brain diseases including Alzheimer’s, $500 million for the Food and Drug Administration, and $1 billion to help states deal with opioid abuse. While this sounds like an occasion for celebration, any time a bill mentions the word “cure” we should ask ourselves what is meant by such a promised medical breakthrough, and who really benefits?
The easiest health product to sell is false hope, and the hardest to implement is real change. As Elizabeth Warren recently pointed out, the Senate passed a bill with good features, such as strengthening the budget of the NIH and finding ways to engage patients and their views in research. But by the time it was finished in the House, it had become the dream of lobbyists, loaded up with gifts for many in the private sector. The principal beneficiaries were the pharmaceutical and medical device industries, for which regulatory changes were made to ease the path of drugs and devices to market. The American Public Health Association also reported that money to find a cure could be taken away from funds for prevention. Cure, of course, is where the profit is: there is much less money to be made from preventing illness.
And yet, prevention is where the data increasingly tell us our focus ought to be. A recent JAMA article, based on national dementia prevalence data in the United States from 2000 to 2012, confirmed earlier findings that in some industrialized Western countries, particularly those with high standards of living (e.g. the United Kingdom, the Netherlands, and Sweden), the rate of dementia has actually been falling over the last decade. In such population-based studies it is admittedly difficult to determine all the interacting social, behavioral, medical, and other factors that likely contributed to this decline. Even so, the general consensus experts have drawn, from both the recent JAMA study and other such research, is that social policy and public health measures that have improved diet, created better educational and exercise opportunities, and ameliorated risk factors for cognitive impairment, such as cardiovascular disease, diabetes, and smoking, may be having downstream effects in lowering the dementia rate. Indeed, because years of schooling correlated with lower risk in the study, it has also been surmised that post-World War II social democratic policies that widened access to education in Western countries may have played a key role by increasing “cognitive reserve” (i.e. the brain’s capacity to maintain performance in the face of neuropathology) at the population level.
We don’t know for sure whether people born with more robust brains acquire more education or whether education itself is protective against dementia. It may ultimately be a combination of both. However, we do know that better-educated people are healthier in general and are less likely to suffer from dementia. We also know that better-educated people are, on average, more economically well off. In sum, the studies demonstrating trends toward lower incidence of dementia in multiple countries suggest that strong social democratic policies supporting socio-economic equality, public and environmental health, and a focus on collective wellbeing best serve the health of the human brain. Overall, the findings stand in pleasant contrast to the usual rhetoric we hear about the “oncoming tidal wave”, or “silver tsunami”, of elderly people with Alzheimer’s.
Indeed, it was quite interesting that just two days after JAMA published the research establishing the drop in dementia rates, the pharmaceutical company Lilly announced the failure of its investigational new drug solanezumab in phase III trials of patients with mild Alzheimer’s. This anti-amyloid drug had previously failed in persons with advanced Alzheimer’s, but had shown some small glimmers of hope in subgroups of persons with so-called “Mild Cognitive Impairment.” At the same time Lilly’s drug failed in phase III, the company Biogen was promoting new data suggesting some minimal benefit of its drug aducanumab, another amyloid antibody. As has been a common theme in the wake of consistent amyloid drug failures over the past decade, company spokespersons and others are shifting the goalposts, claiming that these drugs have not been given early enough, and have sometimes been given to the wrong population.
Despite this, Paul Aisen, the leader of the Alzheimer’s Therapeutic Research Institute at the University of Southern California, was quoted as saying that a (formally) negative study of solanezumab actually supports the amyloid hypothesis, because some small beneficial effects were seen. Further, the Alzheimer’s Association’s Heather Snyder was quoted at the same meeting as saying that, despite continued failures in Alzheimer’s drug development, the field is on track for approval of a disease-modifying drug by 2025.
Even if such a prediction were to come true, we may wonder, based on the drugs now in late-stage trials, what the real magnitude of the benefit to patients and families would be. If a compound were found that “prevented” Alzheimer’s but had to be administered across many decades, how would we pay for it, and who would have access? Would free market dynamics dictate which individuals were able to reduce their risk for dementia? The underlying point is that public health interventions benefit many, while expensive drugs benefit (perhaps) very few.
At a time of right-wing/neoliberal drift in many Western countries (as recently represented by Brexit, the election of Donald Trump, and the rise of other far right politicians in countries like France), we mustn’t lose sight of the fact that government investments that broaden access to quality education, increase economic opportunity and reduce poverty, and target known risk factors have complex but critically important downstream benefits for brain health at the population level.
As we face down not only Alzheimer’s disease but also major existential issues like climate change, growing income inequality, and lead in public drinking water, it appears to be imperative for both governments and the dementia field to not merely serve corporate agendas but rather take an active role in creating a fairer, healthier, and more humane society and providing security against the hazards and vicissitudes of life. In the final analysis, a more caring and egalitarian society where we look after one another, invest in shared infrastructure and socioeconomic opportunities, and help people live with dignity and purpose will result in healthier bodies and brains; consequently, social democracy should be seen as a major strategy in the so-called ‘war on Alzheimer’s’.
Featured image credit: Obama by Austen Hufford. CC BY 2.0 via Flickr.