In the 1990s, scientists declared that schizophrenia and other psychiatric illnesses were pure brain disorders that would eventually yield to drugs. Now they are recognizing that social factors are among the causes, and must be part of the cure.
By the time I met her, Susan was a success story. She was a student at the local community college. She had her own apartment, and she kept it in reasonable shape. She did not drink, at least not much, and she did not use drugs, if you did not count marijuana. She was a big, imposing black woman who defended herself aggressively on the street, but she had not been jailed for years. All this was striking because Susan clearly met criteria for a diagnosis of schizophrenia, the most severe and debilitating of psychiatric disorders. She thought that people listened to her through the heating pipes in her apartment. She heard them muttering mean remarks. Sometimes she thought she was part of a government experiment that was beaming rays on black people, a kind of technological Tuskegee. She felt those rays pressing down so hard on her head that it hurt. Yet she had not been hospitalized since she got her own apartment, even though she took no medication and saw no psychiatrists. That apartment was the most effective antipsychotic she had ever taken.
Twenty years ago, most psychiatrists would have agreed that Susan had a brain disorder for which the only reasonable treatment was medication. They had learned to reject the old psychoanalytic ideas about schizophrenia, and for good reasons. When psychoanalysis dominated American psychiatry, in the mid-20th century, clinicians believed that this terrible illness, with its characteristic combination of hallucinations (usually auditory), delusions, and deterioration in work and social life, arose from the patient's own emotional conflict. Such patients were unable to reconcile their intense longing for intimacy with their fear of closeness. The science mostly blamed the mother. She was "schizophrenogenic." She delivered conflicting messages of hope and rejection, and her ambivalence drove her child, unable to know what was real, into the paralyzed world of madness. It became standard practice in American psychiatry to regard the mother as the cause of the child's psychosis, and standard practice to treat schizophrenia with psychoanalysis to counteract her grim influence. The standard practice often failed.
The 1980s saw a revolution in psychiatric science, and it brought enormous excitement about what the new biomedical approach to serious psychiatric illness could offer to patients like Susan. To signal how much psychiatry had changed since its tweedy psychoanalytic days, the National Institute of Mental Health designated the 1990s as the "decade of the brain." Psychoanalysis and even psychotherapy were said to be on their way out. Psychiatry would focus on real disease, and psychiatric researchers would pinpoint the biochemical causes of illness and neatly design drugs to target them.
Schizophrenia became a poster child for the new approach, for it was the illness the psychoanalysis of the previous era had most spectacularly failed to cure. Psychiatrists came to see the assignment of blame to the schizophrenogenic mother as an unforgivable sin. Such mothers, they realized, had not only been forced to struggle with losing a child to madness, but with the self-denigration and doubt that came from being told that they had caused the misery in the first place. The pain of this mistake still reverberates through the profession. In psychiatry it is now considered not only incorrect but morally wrong to see the parents as responsible for their child's illness. I remember talking to a young psychiatrist in the late 1990s, back when I was doing an anthropological study of psychiatric training. I asked him what he would want non-psychiatrists to know about psychiatry. "Tell them," he said, "that schizophrenia is no one's fault."
It is now clear that the simple biomedical approach to serious psychiatric illnesses has failed in turn. At least, the bold dream that these maladies would be understood as brain disorders with clearly identifiable genetic causes and clear, targeted pharmacological interventions (what some researchers call the bio-bio-bio model, for brain lesion, genetic cause, and pharmacological cure) has faded into the mist. To be sure, it would be too strong to say that we should no longer think of schizophrenia as a brain disease. One often has a profound sense, when confronted with a person diagnosed with schizophrenia, that something has gone badly wrong with the brain.
Yet the outcome of two decades of serious psychiatric science is that schizophrenia now appears to be a complex outcome of many unrelated causes—the genes you inherit, but also whether your mother fell ill during her pregnancy, whether you got beaten up as a child or were stressed as an adolescent, even how much sun your skin has seen. It's not just about the brain. It's not just about genes. In fact, schizophrenia looks more and more like diabetes. A messy array of risk factors predisposes someone to develop diabetes: smoking, being overweight, collecting fat around the middle rather than on the hips, high blood pressure, and yes, family history. These risk factors are not intrinsically linked. Some of them have something to do with genes, but most do not. They hang together so loosely that physicians now speak of a metabolic "syndrome," something far looser and vaguer than an "illness," let alone a "disease." Psychiatric researchers increasingly think about schizophrenia in similar terms.
And so the schizophrenogenic mother is back. Not in the flesh, perhaps. Few clinicians talk anymore about cold, rejecting mothers—"refrigerator" mothers, to use the old psychoanalytic tag. But they talk about stress and trauma and culture. They talk about childhood adversity—being beaten, bullied, or sexually abused, the kind of thing that the idea of the schizophrenogenic mother was meant to capture, though in the new research the assault is physical and the abuser is likely male. Clinicians recognize that having a decent place to live is sometimes more important than medication. Increasingly, the valuable research is done not only in the laboratory but in the field, by epidemiologists and even anthropologists. What happened?
The first reason the tide turned is that the newer, targeted medications did not work very well. It is true that about a third of those who take antipsychotics improve markedly. But the side effects of antipsychotics are not very pleasant. They can make your skin crawl as if ants were scuttling underneath the surface. They can make you feel dull and bloated. While they damp down the horrifying hallucinations that can make someone's life a misery—harsh voices whispering "You're stupid" dozens of times a day, so audible that the sufferer turns to see who spoke—it is not as if the drugs restore most people to the way they were before they fell sick. Many who are on antipsychotic medication are so sluggish that they are lucky if they can work menial jobs.
Some of the new drugs' problems could be even more serious. For instance, when clozapine was first released in the United States in 1989, under the brand name Clozaril, headlines announced a new era in the treatment of psychiatric illness. Observers described dramatic remissions that unlocked the prison cage created by the schizophrenic mind, returning men and women to themselves. Clozaril also carried the risk of a deadly side effect: in a small fraction of patients, it caused agranulocytosis, a collapse in the white blood cell count that left them fatally vulnerable to infection. Consequently, those who took the drug had to be monitored constantly, their blood drawn weekly, their charts reviewed. Clozaril could cost $9,000 per year. But it was meant to set the mind free.
Yet Clozaril turned out not to be a miracle drug, at least for most of those who took it. Two decades after its release, a reanalysis published in The Archives of General Psychiatry found that on average, the older antipsychotics—such as Thorazine, mocked in the novel One Flew Over the Cuckoo's Nest for the fixed, glassy stares it produced in those who took it—worked as well as the new generation, and at a fraction of the cost. Then there was more bad news, which washed like a tidal wave across the mental health world in the late 1990s, as if the facts had somehow been hidden from view. These new antipsychotics caused patients to gain tremendous amounts of weight. On average, people put on 10 pounds in their first 10 weeks on Clozaril. They could gain a hundred pounds in a year. It made them feel awful. I remember a round young woman whose eyes suddenly filled with tears as she told me she once had been slender.
The weight not only depressed people. It killed them. People with schizophrenia die at a rate far higher than that of the general population, and most of that increase is not due to suicide. In a now famous study of patients on Clozaril, more than a third developed diabetes in the first five years of use alone.
The second reason the tide turned against the simple biomedical model is that the search for a genetic explanation fell apart. Genes are clearly involved in schizophrenia. The child of someone with schizophrenia has a tenfold increase in the risk of developing the disorder; the identical twin of someone with schizophrenia has a one-in-two chance of falling ill. By contrast, the risk that a child of someone with Huntington's chorea—a terrible convulsive disorder caused by a single inherited gene—will go on to develop the disease goes up by a factor of 10,000: the child has a one-in-two chance of inheriting the mutation, against a baseline prevalence of only a few cases per 100,000 people. If you inherit the gene, you will die of the disease.
Schizophrenia doesn't work like that. The effort to narrow down the genes that may play a role has been daunting. A leading researcher in the field, Ridha Joober, has argued that there are so many genes involved, and the effects of any one gene are so small, that the serious scientist working in the field should devote his or her time solely to identifying genes that can be shown not to be relevant. The number of implicated genes is so great that the Schizophrenia Research Forum, an excellent Web site devoted to organizing the scientific research on the disorder—the subject of 50,000 published articles in the last two decades—features what Joober has called a "gene of the week" section. Another scientist, Robin Murray, one of the most prominent schizophrenia researchers in Europe, has pointed out that you can now track the scientific status of a gene the way you follow the performance of a sports team. He said he likes to go online to the Schizophrenia Research Forum to see how his favorite genes are faring.
The third reason for the pushback against the biomedical approach is that a cadre of psychiatric epidemiologists and anthropologists has made clear that culture really matters. In the early days of the biomedical revolution, when schizophrenia epitomized the pure brain disorder, the illness was said to appear at the same rate around the globe, as if true brain disease respected no social boundaries and was found in all nations, classes, and races in equal measure. This piece of dogma was repeated with remarkable confidence from textbook to textbook, driven by the fervent anti-psychoanalytic insistence that the mother was not to blame. No one should ever have believed it. As the epidemiologist John McGrath dryly remarked, "While the notion that schizophrenia respects human rights is vaguely ennobling, it is also frankly bizarre." In recent years, epidemiologists have been able to demonstrate that while schizophrenia is rare everywhere, it is much more common in some settings than in others, and in some societies the disorder seems more severe and unyielding. Moreover, when you look at the differences, it is hard not to draw the conclusion that there is something deeply social at work behind them.
Schizophrenia has a more benign course and outcome in the developing world. The best data come from India. In the study that established the difference, researchers looking at people two years after they first showed up at a hospital for care found that they scored significantly better on most outcome measures than a comparable group in the West. They had fewer symptoms, took less medication, and were more likely to be employed and married. The results were dissected, reanalyzed, then replicated—not in a tranquil Hindu village, but in the chaotic urban tangle of modern Chennai. No one really knows why Indian patients did so well, but increasingly, psychiatric scientists are willing to attribute the better outcomes to social factors. For one thing, families are far more involved in the ill person's care in India. They come to all the appointments, manage the medications, and allow the patients to live with them indefinitely. Compared to Europeans and Americans, they yell at the patients less.