Archive | June 2010

Taking Stock of Our Genes: Ten Years Later

It has been ten years since scientists sequenced the vast majority of the human genome, and as noted in a recent New York Times editorial, this has not resulted in great strides in human health. Although this may sound disappointing, it is still important to acknowledge the significance of the feat, and to understand why it will take a long while for humanity to fully benefit from it.

At the time, it seemed likely that we would soon be able to take an individual’s genetic blueprint, put it into a computer, and receive a detailed view of that person’s health outlook. While this is technically possible, we now know that most diseases are caused by a multitude of factors, and genes play a decisive role in only some of them. For the rest, future risk of disease can only be expressed as probabilities, shaped by factors both known and unknown.

The creation of drugs using genetic engineering is possible, although it is not always the preferred method of synthesis. The treatment of diseases via gene therapy has only worked for a small number of conditions.

Applying genetic knowledge to disease diagnosis and classification is an area of potentially great promise, especially in cancer. Diseases that look identical on the surface often turn out to be several distinct subtypes. Genetic tumor analysis allows them to be “fingerprinted,” and treatments evaluated against these subtypes. As the editorial notes, genetic testing has helped predict which breast cancer patients benefit from certain chemotherapy regimens. Genetic testing can also be used to take a fresh look at old treatment trials that were thought to be failures; “ineffective” treatments may simply have been applied, unknowingly, to the wrong subtypes.

One of the reasons for the difficulty in translating knowledge of our genetic blueprint into actionable interventions is complexity. Genes do not act by themselves, but rather as part of an intricate concert within cells, some being switched on, others off, all at precise times. Consider a programmer able to see all of the code behind a program, but unable to read the language in which it was written. At our present state of knowledge, we hold the book and can view every page, but we are only beginning to learn to read.

Photo Credit:  Svilen Milev

Health Costs Increase 9% in 2010

With the oil spill rightly taking center stage these days, discussions of health issues have faded from the spotlight. But costs continue to increase, even when nobody is looking. A recent report estimates that costs will rise 9% in 2010, with 42% of companies passing those costs along to employees, often in the form of higher deductibles.

An increase in deductible costs, while a burden, is also an opportunity for consumers (patients) to start looking closely at what they are spending, and what they are receiving in return. The absence of this deliberation is what economists refer to as “moral hazard,” one of the primary drivers of escalating healthcare costs.

Most caregivers are not accustomed to discussing the costs and benefits of treatments with patients. It is not formally taught, and it is avoided because of the uncomfortable issues this type of discussion can raise. The third-party insurance system makes it easy for both parties to sidestep the dilemma: we provide the care, and the payments come indirectly through the employer in the form of insurance premiums.

Once patients start questioning the costs of care, doctors will have to provide answers. At first, few will be able to do so, especially those in institutional practice settings where they may not have easy access to this information. If patients are going to shoulder more of the costs of their care, it is only fair to provide them with honest information to help them make the choices that are in their best interests.