The Lancet recently published a review paper emphasizing the importance of nutrition in a child’s first 1,000 days. New research supports this concept to a remarkable degree: a Brigham Young University study found that infant feeding practices predict adult obesity, and additional studies released in the last few weeks point to adverse health effects in formula-fed infants and enhanced brain development in breastfed infants.
What you consume as an infant impacts your health for the rest of your life. According to the Lancet paper, malnutrition as a baby can have particularly lasting effects on an individual, and may be a contributing factor to the growing epidemic of metabolic diseases like diabetes.
While the consensus that breastfeeding leads to better health is not new, the mounting evidence shows that breastfeeding deserves a strong public health push. What little money is available for public health right now would be well spent on an issue that has such broad potential health benefits.
A study in rhesus monkeys highlights the wide range of benefits of breast milk. Researchers at UC Davis found that, compared to breastfed monkeys, formula-fed monkeys had altered gut flora as well as (possibly related) reduced immune and metabolic function.
Additionally, a Brown University brain development study employed several methods to measure brain development in infants. The researchers found that children who had been breastfed had much stronger development in some brain regions. While this is just a single study, it supports previous findings in adolescents.
The figures examining myelin water fraction, a measure of white matter growth, are particularly remarkable. By that measure, some brain regions showed as much as 34% greater development in breastfed children than in their formula-fed counterparts.
While many studies have shown long term health benefits of breastfeeding, there is still a need for more research.
Ethical considerations make true randomized studies difficult in this field. As a recent WHO review of long-term breastfeeding effects points out, the well-recognized short term benefits to infant disease and mortality make it unethical to assign a formula-only group at random. Because of this, long term effects can be obscured by the effects of parenting and environment (parents who choose to breastfeed may share other characteristics).
Additionally, the current recommendations would benefit from more information on the cause of long term benefits. Not all mothers have the option of breastfeeding, so there needs to be a better alternative.
The real story in this new research is that the development of baby formula has not progressed as well as one would hope. It is currently a poor alternative and while we may never (in the near future) be able to reproduce all of the benefits of breastfeeding, it is hard to believe that we can’t do better.
Genetically Modified Organisms (GMOs) are in the news again: unwelcome GMO wheat was found in an Oregon wheatfield, the Senate recently voted not to allow states to mandate GMO labeling, and extensive global protests were held against GMO monopolist Monsanto. The issue of GMO foods is full of misunderstandings and misinformation, so it is important to learn the facts behind the rhetoric.
The term GMO itself is based on a misleading premise. In fact, genetic modification of crops has been going on in some form throughout the history of farming. We have been changing the genes of plants through selective breeding and other methods for thousands of years, possibly even in ways that make our food less healthy. The FDA uses the term GE, or Genetically Engineered, to “distinguish plants that have been modified using modern biotechnology from those modified through traditional breeding.”
Modern genetic engineering is certainly a new level of modification of our food supply. While many of these genetic modifications are simple and fairly predictable, there are still unknowns involved in the process. In addition, GE foods are very widespread in the US. The FDA has some figures: “In 2012, GE cotton accounted for 94 percent of all cotton planted, GE soybeans accounted for 93 percent of soybeans planted, and GE corn accounted for 88 percent of corn planted.”
While the European Union requires GMO foods to be labeled, the US has never had such a requirement, and most foods’ GMO status is not easily ascertainable. Last Thursday, the Senate rejected a measure that would have allowed individual states to require labeling of GMO foods.
The process of genetically engineering a plant usually involves inserting a gene that codes for a specific protein. A popular modification adds a gene that makes a plant resistant to the common herbicide Roundup (glyphosate). Another example is Golden Rice, in which genes for beta-carotene (a vitamin A precursor) are added to rice in order to prevent vitamin A deficiency (and the resulting blindness) in poor regions. Other modifications improve drought resistance, pest resistance, and crop growth and efficiency. The FDA provides an infographic showing how the modification process works in the ideal case:
There are potential problems with the modification process. New genes could cause unexpected allergies, have other unforeseen effects on humans, aid the rise of herbicide-resistant superweeds, or spread to non-GMO crops. Another concern is the overuse of glyphosate products on herbicide-resistant crops. Though glyphosate is widely considered one of the safest herbicides for humans, some toxicity reports raise concerns about the formulations used in commercial herbicide products (though no one has established a significant danger to humans at the glyphosate levels people might actually consume). Glyphosate’s herbicidal mechanism involves disrupting a protein (EPSPS) that humans do not produce.
Because genetic engineering of food has emerged so recently and has quickly become dominant in agriculture, there are concerns that the new products are not well vetted before being made available to the public. The FDA guidance document on genetically modified plants requests that data be submitted to the FDA before their use. It does say, however, that “during the consultation process, the FDA does not conduct a comprehensive scientific review of data generated by the developer.” In other words, the conclusions of the safety studies commissioned by the new plant owners are not independently verified. The WHO seems to leave safety assessments of these new plants to national regulators.
While there are always potential health risks in creating new organisms, there is a broad consensus in the scientific community that the widely used strains of genetically engineered crops do not impose significant risks over conventional crops. Numerous reports and meta-analyses have indicated that there is very little potential danger to humans. Studies which have found possible risks have been almost universally discredited (though scientists have been known to be particularly harsh on these findings for giving ammunition to GMO opposition which has otherwise had very little scientific backing).
In addition, the world’s population is increasing. It is difficult to produce enough food for the world’s population, and genetic engineering holds promise for increasing crop yields, allowing for agriculture in dry or otherwise inhospitable regions, and adding essential nutritional content to easily grown staple foods.
The scientific journal Nature’s May special on GMOs includes an article calling for nonprofit research to develop GE foods beyond the commonly used pest and herbicide resistant strains. Newer GE technology will allow for much more precise gene insertions, opening up new possible modifications while decreasing the chances of unintended effects.
The future of genetically engineered foods is full of promise, but is hindered by misunderstanding of the technology. If health and environmentally conscious groups can accept the idea of genetically engineered foods, they may be able to successfully push for higher safety standards and more consumer-focused modifications rather than the elimination of this area of research.
Pandemics are in the news again with two recent outbreaks beginning to cause alarm. How much danger do modern pandemics actually pose to the international community? Should Americans be concerned about a serious pandemic?
To answer these questions, we first need to introduce some history.
In November 2002 SARS (Severe Acute Respiratory Syndrome), a coronavirus originating near Hong Kong, began to be recognized as a serious threat and a potential pandemic. By March 2003, the international public health community was taking strong measures to contain the highly contagious disease. By the time the virus was under control in 2004, 8096 people had been infected and 774 had died, giving a mortality rate of 9.6%.
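That mortality rate is just the ratio of the two WHO counts; as a quick check:

```python
# Case-fatality rate for the 2002-2004 SARS outbreak, computed
# from the final case counts cited above.
cases = 8096
deaths = 774

cfr = deaths / cases * 100
print(f"SARS case-fatality rate: {cfr:.1f}%")  # prints 9.6%
```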
In 2009, a novel form of H1N1 began to spread in the US and Mexico. This subtype of the Influenza A virus was also responsible for the devastating 1918 “Spanish Flu” pandemic. Public health officials had been preparing for an outbreak of H5N1 and so had good procedures in place for an influenza outbreak, but lacked specific countermeasures for H1N1. Estimating the scope and mortality rate of this outbreak is much harder than for SARS because the disease resembles ordinary seasonal influenza, which is very common. According to a 2012 statistical analysis, approximately 201,200 people died of H1N1 in 2009-2010.
Currently in the news: a SARS-like virus, MERS, is proliferating in the Middle East, and a new influenza subtype, H7N9, is beginning to cause concern. At an annual World Health Organization (WHO) meeting in Switzerland on Monday, the WHO director called MERS a “threat to the entire world” and her “greatest priority.”
So what level of effort and concern should the average person invest in these threats? What can medical professionals do to keep up with the latest outbreaks? Let’s try to sort through the responses to an epidemic.
When potential pandemics arise, the WHO tracks outbreaks of these illnesses carefully and prepares an international response. In the United States, the CDC directs these efforts along with state and local public health infrastructure.
The CDC has a Health Alert Network which will email subscribers with updates on health events. It is open to everyone, though they describe it as their “primary method of sharing cleared information about urgent public health incidents with public information officers; federal, state, territorial, and local public health practitioners; clinicians; and public health laboratories.” The WHO, in turn, has its Disease Outbreak News feed to keep people informed of epidemic threats.
The key to epidemiology is early containment, and communication is an essential part of that. The battle against H1N1 was bolstered by a strong vaccination program, and SARS was contained with a massive public health effort which included widespread quarantines and even pre-departure screenings. But the front line of containment is letting doctors (and patients) know what symptoms to look for so that patients can be isolated and proper precautions taken.
We talked to Dr. David Hooper of Massachusetts General Hospital (MGH) about how the hospital ensures proper information is disseminated. Aside from research that doctors in the hospital may do on their own, Hooper and his colleague Paul Biddinger in the Emergency Department continually monitor email notifications from local, state, national, and international health authorities. “If an alert came in, we would make sure some version of it was disseminated to relevant staff,” he said.
Occasionally they are contacted directly for local issues but overwhelmingly this method depends on the diligence of doctors. “All of us are tied into email pretty much constantly with our smartphones and laptops,” Hooper said.
We also talked to a Connecticut physician in a smaller practice, Dr. Hugh Blumenfeld. He relies exclusively on CDC alert emails to keep up to date on epidemic warnings.
In the recent past, these methods of communication seem to have been successful. While the Spanish Flu of 1918 infected about 20% of the world’s population and claimed the lives of about 50 million people, we haven’t experienced a pandemic of nearly that proportion since.
The media tends to present stories in a way that will sell, and in the case of epidemics this means exaggerating the risk. But in this case, exaggeration is not necessarily a bad thing.
In reality, your chances of dying in a pandemic are quite low. During the SARS outbreak, 92% of recorded infections were in China. Given the media storm, you might be surprised to learn that not a single death from SARS was recorded in the US. Even H1N1, at the height of the 2009 pandemic, was nowhere near as deadly as cancer.
But unlike cancer, pandemics are an unknown risk. We never know when some newly mutated high-mortality virus could run out of control. Because of this, we are best served treating every outbreak as if it could be disastrous (at least in terms of communication). Global travel has upped the ante in terms of how fast these diseases can spread, and we need to be well-informed and prepared to act quickly.
While the new era of global travel complicates disease control, scientific research on pandemic prevention is becoming more complicated as well. A recent JAMA article on H7N9 preparedness concludes:
Another influenza pandemic is inevitable. Even with recent additional vaccine manufacturing capacity and improvements in potency testing, the global public health community remains woefully underprepared for an effective vaccine response to a pandemic. To be successful in meeting the challenge of a severe pandemic, the influenza vaccine enterprise must move forward with the development of novel antigen influenza vaccines that protect most individuals from multiple strains of influenza.
And even the development of these vaccines has become a contentious issue. The Dutch lab that identified the MERS virus recently filed a patent on some applications of the virus such as the creation of a vaccine. This may make it impossible for anyone to manufacture a vaccine for MERS without their permission. The lab insists that this will cause pharmaceutical companies to show more interest in creating a vaccine.
In order to be effective, these vaccines must be very widely distributed very quickly. If only one company can produce a vaccine for a particular virus, there must be a way to ensure that they can deliver large amounts in a timely fashion, and that there won’t be price gouging which makes it impossible to afford in the proper quantities. The WHO is currently investigating this issue.
So, should you be worried about pandemics? Not yet, but you should keep your eye on them. If you follow the scope and severity of outbreaks, you will know when it’s time to worry.
Several recent news items encourage optimism for the success of the sweeping 2010 federal health care reform law, the Affordable Care Act (“Obamacare”), but much of the law’s effects remain to be seen. Many early concerns with the law involved predictions that covering more people would drive up premiums. Thus far, this does not appear to be the case.
The Washington Post has a summary of promising early results of state insurance exchanges in California and Oregon. These exchanges are set up by the Affordable Care Act, and encourage competition between insurance providers.
In 2009, the Congressional Budget Office estimated that the Silver Plan, which covers 70% of medical costs for most subscribers, would cost over $430 a month by 2016 (a figure the Washington Post article misleadingly compares to the 2014 California rates). A California-specific study released just last month by the actuarial firm Milliman (retained by the state of California) predicted an even higher $450 premium for Silver Plan subscribers.
Well, the numbers are finally in. The California exchange, Covered California, has recently announced that it will be able to offer Silver Plan rates averaging $276 a month in 2014, even before the subsidies for which many of the subscribers will be eligible. This comes in much lower than the previous estimates and indicates a victory for the insurance exchange method of driving down costs.
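To put the announced rate in perspective against the two projections (a small sketch using the monthly premium figures from this post):

```python
# How far below the earlier projections did Covered California's
# announced average Silver Plan rate come in?
actual = 276  # announced average monthly premium for 2014

projections = {
    "CBO (2009 estimate, for 2016)": 430,
    "Milliman (California-specific)": 450,
}

for source, projected in projections.items():
    pct_below = (projected - actual) / projected * 100
    print(f"{pct_below:.0f}% below the {source} figure")
```

The announced rate comes in roughly 36-39% under the projections, which is what makes the result newsworthy.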
A Forbes article on the same story has a great explanation of how these exchanges work:
“Keep in mind that the entire idea of the exchanges is to require health insurance companies to compete openly with one another by offering identical coverage programs in the three created classes—each offering insurance coverage that actually delivers meaningful protection to customers—and then openly disclosing the price each insurance company will charge for that policy. Thus, shoppers can clearly see which company has the best price on an apples-to-apples basis.”
See Covered California’s comparison of new regional rates vs their current equivalents below:
Results in Oregon have been similarly impressive, with several insurance providers even lowering their proposed rates after seeing their competitors’ offers. With such a huge pool of previously uninsured people searching for plans, companies are competing fiercely for the increased market share.
So what do these results mean for the rest of the country? They indicate that a well-run exchange program can successfully use competition to prevent the immediate premium increases which many predicted. But caution is needed going forward: the sustainability of these premium rates depends on a variety of influences, including success of healthcare cost reduction measures, and convincing the young, healthy uninsured to sign up for plans. These successes will be necessary to balance out the increased burden from additional covered drugs and procedures as well as the mandated coverage of those with preexisting conditions.
A new study presented at the American Heart Association’s Quality of Care and Outcomes Research Scientific Sessions 2013 indicates that the 2006 Massachusetts health reform law, which was a model for the national reform law, has not resulted in increased hospital use or accompanying costs. The results were consistent in safety-net hospitals (hospitals with a high percentage of low-income patients) and in Medicare patients.
Because the Affordable Care Act is structured similarly to the Massachusetts law, this news is promising for the national outlook.
The new study may help alleviate concerns raised by a 2011 study in the Journal of the American Medical Association, which found that patients continued to seek care through safety-net providers even after they obtained health insurance. Those results validated warnings that previously uninsured people might continue to use expensive emergency room facilities for non-emergency care even after gaining coverage.
The National Center for Health Statistics today released an analysis of teen birth rates showing a dramatic decline since 1991. As of 2011, 15-19 year-olds of all races are having children at a rate of 31.3 per 1000 women, a record low and an impressive improvement from the 61.8 level in 1991.
As shown in the graph below, the decline is particularly apparent in non-white populations.
Source: CDC/NCHS, National Vital Statistics System.
You may notice that the two time intervals in this chart are very different. The drop of 20.3 births per 1,000 women from 1991-2007 represents a decline of about 1.3 per year, whereas the 2007-2011 fall of 10.2 actually represents a significant acceleration, at about 2.6 per year. The comparison is misleading, however: these endpoints appear to have been chosen for maximum dramatic effect, as both 1991 and 2007 sit at the peak of an atypical upward trend. While rates have genuinely declined over the years, the effect is exaggerated by the choice of years highlighted. This is illustrated by the following Congressional Research Service chart, which gives us the whole picture:
It is worth noting that an important ideological shift in federal sex education funding occurred in 2010. According to a report issued by the Congressional Research Service last month, the federal government underwent 3 major eras in their strategies to prevent teen pregnancy.
1981 heralded the first federal program to be tasked exclusively with adolescent issues of sexuality and pregnancy, the Adolescent Family Life program (AFL). They initiated a variety of intervention programs, encouraging contraception and abstinence strategies for avoiding pregnancies and STDs. In 1996, the Title V Abstinence Education block grant and other federal funding mechanisms began to focus on abstinence-only education strategies. The focus on abstinence-only policies continued until FY2010, when funding was provided for several evidence-based programs. This evidence-based approach included the acknowledgment that many high-schoolers (47.4% in 2011) have already experienced sexual intercourse and might not be receptive to the abstinence-only approach.
There are no doubt many factors involved in the change in teen pregnancy rate over the years, but the CRS chart does imply that the initial AFL approach was not very effective, and that we have gotten better at reaching teens with all of the messages we have chosen to use since 1991.
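The per-year decline rates discussed above follow from simple arithmetic on the two intervals (a quick sketch using the figures quoted from the NCHS analysis):

```python
# Reproducing the per-year decline rates in teen births
# (births per 1,000 women aged 15-19), from the totals above.
drop_1991_2007 = 20.3   # total decline over the 16 years 1991-2007
drop_2007_2011 = 10.2   # total decline over the 4 years 2007-2011

rate_early = drop_1991_2007 / 16   # ~1.3 per year
rate_late = drop_2007_2011 / 4     # ~2.6 per year, roughly double
```

The doubling of the per-year rate is real; the complaint above is only that the chosen endpoints sit at local peaks, which inflates the apparent totals.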
So which states are doing best at reducing teen pregnancy rates? First let’s look at the CDC’s heat map:
Source: CDC/NCHS, National Vital Statistics System.
The states with the most significant declines in teen birth rates are Arizona and Utah, both down 35% from 2007-2011. At the other end, West Virginia’s decline was not statistically significant, while the District of Columbia and Arkansas declined by only 15% and 16%, respectively (with Alaska also coming in at 16%). North Dakota’s decline was also insignificant, but it has remained below the national average in both years examined.
Looking at the 2011 teen pregnancy rates, the highest 5 states average 301% higher than the lowest 5. While it is hard to tell from this data what is causing the large disparities between states, the broad differences indicate that it will be worthwhile to perform analyses on a state-by-state level rather than focusing on federal policy alone.
A trio of new studies of the Mediterranean diet add to growing evidence of incredible health benefits:
A study released yesterday by Ohio State University indicates that apigenin, a molecule found in celery and parsley (among other foods), restores normal programmed cell death (apoptosis), reversing the immortality that cancer cells tend to develop.
Another pair of recent publications were released as part of the PREDIMED program, which was designed to “assess the efficacy of the Mediterranean diet in the primary prevention of cardiovascular diseases.”
One PREDIMED study shows that high-risk patients following the Mediterranean diet supplemented with either olive oil or nuts had a significantly lower incidence of major cardiovascular events compared to a low fat diet.
A smaller study in the PREDIMED program indicated that the Mediterranean diets also displayed cognitive benefits in relation to a low fat diet. Significantly lower rates of mild cognitive impairment and dementia were observed in the Mediterranean diet groups supplemented with olive oil or nuts.
Caveat: there is no small amount of bias inherent in all this study of the trendy Mediterranean diet. On PubMed, a simple search for “Mediterranean Diet” research since January yields 80 results. An identical search for “low fat diet” yields 49 results (this is the comparison diet in the PREDIMED studies because it is the one doctors commonly recommend). A search for the admittedly more obscure “indian diet” gives a single result. Perhaps an Indian diet is healthier still; we may not find out until Indian food becomes the next health trend and the research gets funded. The key takeaway from these studies is that diets high in plant-based fats have been shown to produce better health outcomes than the standard low fat diet currently recommended for patients with major cardiovascular risk factors.
If you work in the health or science sectors, you have probably noticed a flurry of “big data” initiatives. Finding ways to standardize and make accessible the massive amounts of data we accumulate is a growing challenge. Let’s look at some of the recent progress in this area and how we might move forward.
Healthcare has undergone massive policy changes in the last few years, but the data revolution in this field has just begun. 2010’s Affordable Care Act laid the policy groundwork for expanding the use and usefulness of Health Information Technology, notably by encouraging Electronic Medical Records. Standardizing health information represents a huge boon to public health research by allowing easy access to reliable health data.

In April, McKinsey & Company released a comprehensive report calling for increased integration of big data into healthcare processes. Notably, they estimate that expansion of current integration trends could lead to a $300-450 billion annual reduction in US healthcare spending, a 12-17% overall decrease. On the technological side, the McKinsey report points out that more than 200 new businesses have developed “innovative health-data applications” since 2010. A recent Technorati article confirms this trend of technological innovation, pointing to Samsung’s work with hospitals on new technology that better integrates Electronic Medical Records into patient care.
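The McKinsey savings and percentage figures are internally consistent; a quick back-of-the-envelope check of what total spending they imply:

```python
# Sanity check on the McKinsey estimates quoted above: savings of
# $300-450 billion described as a 12-17% decrease imply a total US
# healthcare spend in the mid-$2-trillion range.
low_savings, high_savings = 300e9, 450e9

implied_total_low = low_savings / 0.12     # ~$2.5 trillion
implied_total_high = high_savings / 0.17   # ~$2.65 trillion
```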
The data revolution is also becoming a greater part of scientific research. With last year’s launch of Obama’s $200 million “Big Data Research and Development Initiative,” the government has accelerated many projects which will allow big data pooling in science. The president’s FY14 budget proposal includes an additional $40 million to expand NIH involvement in the initiative.
As an example of this work, a big step forward was achieved in cancer research last week when the University of Chicago launched “The Bionimbus Protected Data Cloud” to allow for secure access to genetic cancer information from The Cancer Genome Atlas (TCGA). Previously, using The Cancer Genome Atlas involved weeks of downloading data and additional time setting up methods to manipulate that data. Cloud access to many of the giant data initiatives in the science world will be very helpful to increasing the use of data which is otherwise difficult for researchers to access.
Readers: How can we push smaller research communities to start using more big data methods to standardize and share data? Are you involved with a project that uses big data in an innovative way?
A report by the National Academies’ “Committee on the Consequences of Sodium Reduction in Populations” has found that the evidence supports limiting daily sodium intake to 2,300 mg. Surprisingly, the committee did not find sufficient evidence to support still lower sodium targets, even in populations at risk for heart disease. In fact, there is some (inconclusive) evidence that low sodium diets may be harmful to those with diabetes, kidney failure, or heart disease.
The current US guidelines give an Adequate Intake (AI) of 1,500 mg of sodium and a Tolerable Upper Intake Level (UL) of 2,300 mg, but adult intake in the United States averages 3,400 mg. There is a growing call for increased public health measures to rein in America’s runaway sodium consumption.
The focus of the study is on sodium’s direct effect on health outcomes rather than on intermediate biomarkers such as blood pressure, which are a more controversial measure of health impact. The committee refrained from making any direct recommendations of intake ranges due to a lack of data.
Readers, what can we do to prevent the high sodium intake which is the norm in American diets? Have you been recommending low sodium diets to patients?
Read the whole report here.