James Le Fanu

‘For every problem there is a solution: neat, plausible and wrong.’ H.L. Mencken

The fall of medicine

Despite medicine’s huge advances over the past 50 years, doctors are disillusioned and the public are neurotic about their health. Why? Medical invention ran out of steam in the mid-1970s and the vacuum was filled by two flawed theories: the social/environmental theory of disease and the new genetics.

A solitary statistic is a dreary thing; but when combined, statistics can reveal questions that previously remained hidden. Thus it is only of passing interest to learn that more than two thirds of doctors qualifying ten years ago have “regrets” about their chosen career. We all have spasms of self-doubt; doctors are no exception. But when the same “regrets” were expressed by only 28 per cent of doctors qualifying 20 years ago, and 16 per cent 30 years ago, then a pattern is clearly emerging.

This is not the only sign that medicine is in trouble. Surveys reveal that the proportion of the population claiming to be “concerned about their health” has increased from one in ten in 1968, to one in two last year. And the most curious aspect of this new phenomenon of the “worried well” is that it is medically inspired. The well are worried because repeatedly and systematically they have been told by experts that their health is threatened by hidden hazards. The commonsense advice of the past – “don't smoke and eat sensibly” – has metamorphosed into an all-embracing condemnation of every sensuous pleasure: food, alcohol, sunbathing and sex. And every week brings yet another danger. Who would have thought until a few weeks ago that night lights which have reassured generations of children should be damaging to their eyesight?

The paradox of modern medicine that requires explanation is why its spectacular success over the past 50 years has had such perverse consequences, leaving doctors less fulfilled and the public more neurotic about their health. These are, of course, complex matters with any number of explanations. But consider a list of significant medical advances in the past 60 years, starting in 1937 with blood transfusion, and moving through penicillin, kidney dialysis, radiotherapy, cortisone, polio vaccination, the oral contraceptive pill, hip replacement operations, kidney transplants, coronary bypass, the cure of childhood cancer, CAT scanners, test-tube babies and concluding with Viagra. Several themes are easy to identify: the assault on infectious disease (penicillin and childhood immunisation); major developments in the treatment of mental illness, cancer and heart disease; the widening scope of surgery (hip replacements and transplantation); and improvements in diagnostic techniques (the CAT scanner). But what is most noticeable about the list is the concentration of the important breakthroughs in a 30-year period from the mid-1940s to the mid-1970s. Since then, there has been a marked decline in the rate of therapeutic innovation. This “rise and fall” runs counter to the common view of an upward and onward march of medical progress, but it provides the key to understanding current medical discontents.

To understand the “fall”, it is important also to understand the “rise”. The second world war played a significant role in two respects. First, conflict is always a great spur to scientific innovation. To take just one example: the skills required by surgeons to remove bullets from the still-beating hearts of wounded soldiers proved invaluable for the pioneers of open heart surgery. Second, the surge of optimism released by the Allied victory in 1945 encouraged a belief in the limitless possibilities of science. Nowadays such unqualified enthusiasm might strike us as naïve, but it helps to explain why doctors and scientists during this period were prepared to spend decades tackling what seemed at the time to be insoluble problems, such as transplantation and curing cancer. The war may help to explain why the “rise” happened when it did, but the rise itself had two components: the cornucopia of new drugs that tumbled out of the research laboratories of pharmaceutical companies, and technology. The doctor setting up practice in the 1930s had half a dozen or so proven remedies at his disposal with which to treat the diseases he encountered every day. By the time his retirement loomed, this handful of drugs had grown to more than 2,000. You might assume that this staggering increase must have been underpinned by some important scientific insight permitting research chemists to design chemicals that could correct the defects of function caused by illness. But this is not how it happened. Rather, the origins of almost every class of drug discovered during this period can be traced to some fortuitous, serendipitous or accidental observation, whether it was antibiotics or cortisone, drugs for the treatment of schizophrenia or cancer, the immuno-suppressants that permitted transplantation or anti-inflammatory drugs for treating arthritis.

The role of chance in drug discovery is central to subsequent developments and so merits elaboration. Drugs, being chemicals, work by interfering in some way with the chemistry of cells. If a chemist were intentionally to design a drug, he would have to know the defect at a cellular level that his chemical was intended to correct. For this he would have to know something about the microscopic world of the cell; but, and this is the astonishing fact, during the golden age of drug discovery there was almost no understanding of how cells worked. Rather, the research chemists would begin with a lead about some chemical having an effect on a particular disease, and would then synthesise thousands of similar compounds in the hope that one might hit the jackpot. The great insight of modern therapeutics was that it was quite unnecessary for scientists to understand in any detail what was wrong. Instead, synthetic chemistry, blindly and randomly, delivered the remedies.

By contrast, the second component of the “rise”, technological innovation, was the antithesis of this chance method of discovery. Its solutions are, by definition, highly specific answers to well-defined problems: how to “take over” physiological functions, such as respiration, for long enough to allow a patient’s damaged organs to recover; how to replace worn-out joints; or how to sustain the circulation on a heart-lung bypass machine for long enough to permit surgeons to repair complex defects.

Taken together, the medical advances of the 30-year golden age add up to one of the most sustained epochs of human achievement of all time. It took a while for this pioneering research to feed through to everyday practice, but by the mid-1970s medicine had become a highly sophisticated enterprise capable of dealing with the full range of human illness. And yet, as Thomas Mann has the hero observe in his novel Buddenbrooks: "Often the outward, visible, material signs of success only show themselves when the process of decline has already set in."

In 1978, the first signs that medicine’s fortunes might be changing were presented in a prestigious public lecture, “The End of an Age of Optimism,” given by Colin Dollery, then professor of pharmacology at the Royal Postgraduate Medical School, the powerhouse of medical innovation in Britain over the previous three decades. Dollery began by wistfully recalling the momentous events of the recent past, but went on: “Problems seem larger, and solutions more elusive… Many senior medical researchers are pessimistic about claims for future advance.” The following year, the president of the Association of American Physicians, James Wyngaarden, commented on the “declining interest in medical research shown by junior doctors over several years,” and a year later, in 1980, came the first recognition that the great bastion of medical advance, the pharmaceutical industry, was also in trouble. There was “a dearth of new drugs,” observed John Maddox, the editor of Nature, commenting on the fall in the rate of genuine breakthroughs over the previous decade. What had happened?

Medicine, like everything else, is bounded by its own concerns (the diagnosis and treatment of disease), so success necessarily places a limit on the possibility of further progress. By the late 1970s, medicine’s success over the previous three decades meant that much of the do-able had been done. Once surgeons had started transplanting hearts, there was not much further to go in the field of cardiac surgery. The main burden of illness had been squeezed towards the extremes of life: infant mortality was heading towards its irreducible minimum, while for the first time in human history most people were now living out their natural lifespans to become vulnerable to diseases largely determined by ageing. These age-determined diseases, which are the dominant preoccupation of western medicine, are of two sorts: some, such as arthritis of the hips or narrowed arteries, can be relieved with drugs or operations; others, such as cancers and circulatory disorders, are the diseases people die from. These can be palliated, but death cannot be postponed indefinitely.

Furthermore, the random method by which most new drugs had been discovered meant that sooner or later research chemists would be scraping the barrel of useful chemical compounds. As for technology, the ability to keep the seriously ill alive almost indefinitely imposed financial constraints on the scope of future advance, with almost half of medical expenditure now being incurred in the last 60 days of a patient’s life. There remain unanswered questions, of course, the most important of which concern the causes of the illnesses which are not related to age but appear to strike out of the blue, such as diabetes, multiple sclerosis, Parkinson’s and others. They must have causes, just as peptic ulcers were found to be due to the bacterium Helicobacter pylori, but neither back in the 1970s nor now do we know what these are.

The contention that science has “reached its limits” has been made many times before, only to be repeatedly disproved. But it is a matter of empirical observation that since the late 1970s the rate of therapeutic innovation has declined dramatically. It would be absurd to suggest that medicine has reached a complete dead end; there is plenty of scope, at the very least, for refining the advances of earlier decades. But 25 years on, the most commonly prescribed drugs today are all variations of compounds discovered before 1975.

The intellectual vacuum of the late 1970s was filled by two different sets of ideas. They emerged from two specialities which had until then played only a marginal role in postwar medicine: epidemiology (the study of the patterns of disease) and genetics. The epidemiologists insisted that most common diseases, such as cancer, strokes and heart disease, were due to social habits: an unhealthy lifestyle or exposure to environmental pollutants. As for genetics, or “the new genetics,” a few truly astonishing developments had opened up the possibility of identifying the contribution of defective genes to disease. There was a beguiling complementarity between these two approaches, as they represented in a different guise the contributions of nature (the gene) and nurture (social and environmental factors) to human development. Both, however, have proved to be blind alleys, unable to deliver on their promises. Their failure explains “the fall” of modern medicine and is the source of its present discontents.

The human genome packs more than 100 trillion times as much information by volume as the most sophisticated computerised information system ever devised. It remained completely inaccessible until the late 1970s, when several technical breakthroughs allowed scientists first to cut up the DNA text into manageable sections, then to isolate the genes from those sections, and finally to identify the sequence of nucleotides that make up the genes. Like the mounted cavalry, the new genetics emerged in the early 1980s to rescue medicine from “the end of the age of optimism.” This was real science, of the sort that earned Nobel prizes. Little wonder it became the talisman of medicine’s future hopes.

Twenty years on, the new genetics has hugely expanded our understanding of the workings of the human genome, which now appears more complex and inscrutable than we could ever have imagined. But its three main projects (biotechnology, genetic screening and gene therapy) are in disarray, and its practical benefits are scarcely detectable.

The first, biotechnology or genetic engineering, promised an entirely new method of making drugs by inserting the gene for, say, human insulin into a microbe that would produce millions of copies of itself. This clever, if costly, technique made its pioneers wealthy, but has generated only a dozen or so new drugs. With a couple of exceptions, these are either, like insulin, more expensive variants of compounds already available, or of marginal efficacy.

Genetic screening has fared little better. The discovery of genes for cystic fibrosis and similar disorders raised the expectation that it would only be a matter of time before prevention would become possible by combining antenatal testing with selective abortion. But screening during pregnancy is a big undertaking and, like all antenatal tests, generates much parental anxiety. Thus it is not obvious that, at an estimated £100,000 per case, preventive genetic screening will ever become a viable option. As for the third and last practical application, gene therapy (the replacement of a “dud” gene with a normal one), it simply does not work.

The gap between the anticipated and actual achievements of the new genetics is shocking, given that genetics is still widely perceived as holding the key to the future. The trouble is that genes are not an important cause of illness, because natural selection ensures that those unfortunate enough to be born with deleterious genes are unlikely to survive long enough to procreate. As for the many illnesses that run in families, there may well be a genetic component to them, but it is invariably only one of several factors and, regrettably, there is not much that can be done about it.

However, the new genetics does have the virtue of being a genuine science, which is more than can be said for the social theory, the second radical idea to emerge in the late 1970s to fill the vacuum created by the decline in therapeutic innovation. Its appeal was simple. Prevention, as everyone knows, is better than cure; so, the argument went, doctors would be much better occupied encouraging everyone to adopt a “healthier lifestyle” than expensively, and often futilely, trying to treat people in their shiny palaces of disease. All that was necessary was to identify the damaging social habits which gave rise to illness in the first place. This has not proved difficult; over the past 20 years statistical associations have implicated almost every aspect of people’s everyday lives in some lethal disease or other. Food and drink are obvious culprits: salt pushes up the blood pressure to cause strokes; dairy products fur up the arteries to cause heart attacks; and coffee drinkers are more prone to die from cancer of the pancreas. Almost undetectable levels of pollutants in the air and water lower the sperm count and cause leukaemia. “The search for links between lifestyle or the environment and disease is an unending source of fear, but yields little certainty,” observed the journal Science in 1994.

But most of these alleged hazards, about which we read every day in the newspapers, cannot possibly be true. The human organism is, as it has to be, robust and impervious to small changes in the external world. The notion that subtle alterations in patterns of food consumption or undetectable levels of pollutants can be harmful is contrary to the fundamental laws of human biology. Rather, the social theory has had the regrettable consequence of undermining medicine’s reputation as a source of reliable knowledge. When Frank Dobson, the health minister, warned last year, on expert advice, that anyone eating three lamb chops a day or its equivalent was at increased risk of cancer, medical authority became indistinguishable from quackery.

Thus, the pattern of rise and fall is clear enough. From the 1940s onwards, the combination of fortuitous drug discovery and innovative technology impelled medicine forwards. By the late 1970s, these dynamic forces had become exhausted, creating an intellectual vacuum which was filled by two radical but ultimately unsuccessful approaches: the new genetics and social theory.

And yet the everyday practice of medicine belies this gloomy interpretation, because medicine still delivers so much more than it did 50 years ago. But it is precisely because medicine is so successful that the paradoxes of discontented doctors and burgeoning public neuroticism merit an explanation. It would be reasonable to infer that doctors’ discontents stem from the fact that medicine is not as exciting as it was in the past, while the growing numbers of the “worried well” are clearly the product of the anxiety-mongering generated by the social theory.

What of the future? There is no obvious reason why present trends should not continue. Future surveys will reveal yet more doctors with regrets and further legions of the worried well. It is, however, possible to imagine a better future were this pattern of rise and fall to be more widely acknowledged. It will not be easy to admit that much current medical advice is quackery, but it must be done. The problem raised by the pretensions of the new genetics is different. Its danger is that it threatens to push medical research down the blind alley of reductionist explanations when it should be looking upwards and outwards to find the causes of disease. The benefit of blowing away these illusions will be to liberate medicine to concentrate on its legitimate task of, in William Blake’s memorable phrase, “doing good in minute particulars”.