The Depression Epidemic

Can Mood Science Save Us?

Magazine Issue
November/December 2014

In August, comedian Robin Williams’s tragic suicide rocketed depression into the headlines and presented an opportunity for people to get beyond simplistic notions about “chemical imbalances” and finally reckon with how deeply rooted depression is in our 21st-century consumer culture. But that reckoning never happened.

At first, everyone had something to say about Williams’s death, much of it heartfelt and intensely personal. The sudden loss of a beloved entertainer to a mental health struggle spurred the famous and the not so famous to reveal their own experiences of depression and their own brushes with suicide. But his death also aroused discomfort and confusion as people hungered for an answer to the question of why a gifted, universally admired cultural icon had killed himself. At times, the response took the form of a desperate search for an easy answer to a deeply unsettling loss. Talking heads on TV parroted psychiatric orthodoxies about brain illness, or blamed Williams’s death on vague psychological forces referred to as his “demons.” As time went on, and the coverage began to focus on Williams’s financial troubles, his career difficulties, and a recent Parkinson’s diagnosis, people quickly shifted their stance from “I can’t believe it” to “of course.” Despite the fact that much of what was said was pseudo-explanatory and clichéd—“He suffered from a disease against which he was helpless” or “If only he’d gotten the right treatment, his life might have been saved”—there was a sense of relief that, at last, we had a reason we could hold onto.

Yet on the day Williams died—as on every day in America—more than 100 people died by suicide and 2,500 attempted suicide. And amid the wall-to-wall coverage of his death, there was no serious attempt to explain why the suicide rate for adults has increased 25 percent since 1999 or what’s raising the incidence of depression in America and other affluent societies around the world. Soon after that August day, the public conversation about depression lost its focus as well as the sense of urgency it had temporarily assumed.

A Deficiency View of Depression

Depression, of course, is among the most common problems encountered in clinical practice, yet most psychotherapists still accept the cultural assumption that it’s primarily a problem of the individual. Myopically focusing on the individual has become a habit of mind. Best estimates are that about 35 million American adults will, at one time or another, struggle with depression. That’s nearly 1 in 5 people, so it’s no hyperbole to say that we’re in the midst of a depression epidemic. Is this alarmist? Here are a few reasons why such a strong term is warranted.

  • According to worldwide projections from the World Health Organization (WHO), by 2030, the amount of disability and life lost from depression will surpass that from war, accidents, cancer, stroke, and heart disease. In fact, WHO reports that for youth aged 10 to 19, depression is already the number-one cause of illness and disability.
  • The National Comorbidity Survey reported two decades ago that 18-to-29-year-olds in America were likelier to experience depression than those 60 and older, even though they’d been alive for less than half as long.
  • According to a 2012 survey by the Association for University and College Counseling Center Directors, 95 percent of college counseling-center directors in America reported an increase in the number of students with significant psychological problems. A 2012 survey of college students by the American College Health Association found that 33 percent of women and 27 percent of men identified a period in the previous year of feeling so depressed that they had difficulty functioning.
  • According to the Centers for Disease Control, antidepressant use has increased 400 percent since 1988 in America. In fact, 11 percent of Americans over the age of 12 take an antidepressant. A recent BBC news story reported that so many people are taking Prozac in the United Kingdom that scientists are concerned that active metabolites in human urine are running off into water and affecting the behavior of wildlife.

A paradox about depression is that, while more research and treatment resources have been poured into combating it than ever before, its personal and economic toll has actually grown. How can it be that—despite all the efforts aimed at understanding, treating, and educating the public about this condition—rates of depression continue to rise? Why have our treatments plateaued in their effectiveness, and why does the stigma associated with this condition remain very much with us?

Our reaction to Robin Williams’s death offers a clue about why we have yet to get a handle on the depression epidemic. We’ve come to accept the clinical view of depression as an individual’s defect. This view is partly enshrined in the biomedical model’s assertion that depression is a brain- or genetic-based illness, despite the fact that there’s no biological test to diagnose depression, nor do any genes predict its occurrence. Researchers, even as they work with the most advanced technology, can’t find this defect—because their search is premised on the wrong question: where is the disease?

Clinically, the biomedical model has failed to produce the results that billions of dollars of pharmaceutical advertising has promised. While 75 percent of those treated for depression receive antidepressants, the results remain disappointing. Despite 26 different antidepressants to choose from, only a third of people with major depression experience full remission after a round of treatment. Adding to these findings is the fact that newer antidepressants are no more effective than those developed nearly 60 years ago.

Many therapists are skeptical of the disease model, for lofty intellectual reasons and obvious economic ones, but psychotherapy’s track record isn’t much better. For two generations, no major breakthroughs in the treatment of depression have occurred, and the outcomes of therapeutic approaches are essentially comparable to those of antidepressants, conferring (depending on whom you listen to) only a small advantage or disadvantage.

Tens of millions of Americans have great difficulty managing low mood, but most people with clinical depression don’t seek psychotherapy, because they don’t know about it, don’t believe in it, or can’t afford it. And when psychotherapy is engaged, clients are typically left with residual symptoms, just as they are when taking antidepressants. The vast pool of people who have low-grade depression symptoms are a kind of walking wounded; many will be back for further treatment.

Psychotherapists may not like the reductionism of the disease model, but their basic assumptions often overlap with it, still holding to different versions of a deficiency view of depression. Instead of residing in the person’s brain, as the psychiatrist claims, the deficiency may reside in thoughts (says the cognitive therapist), in childhood (says the psychoanalyst), or in the person’s relationship with a significant other (says the marital or family therapist). But I’d argue that this very premise—that depression and its symptoms are proof that something fundamental is wrong with an individual—is itself fundamentally wrong. No perspective based on finding deficiencies within individuals can equip us to understand why a depression epidemic is at hand, nor can it give us the best set of tools to combat it.

The Rise of Mood Science

Depression has clearly been a tough nut to crack, but we haven’t focused much on what’s at the center of that nut: mood. The main approaches to depression have instead focused on other domains, such as cognition, biology, or social functioning. Diagnostically, though, the defining feature of depression is persistent low mood, and the typically depressed person reports moods that are excessively dull, empty, and sad, as well as lacking joy, excitement, or cheer.

A problem throughout most of the 20th century was that researchers doubted that something as evanescent as mood could be studied with precision or objectivity. Fortunately, this has changed. Just as CAT scans and functional magnetic resonance imaging allowed physicians to see the innermost recesses of the body, so, too, in the last 30 years, has an increasingly sophisticated assessment methodology enabled us to measure mood and emotion. The emerging field known as affective science now benefits from a wealth of measurement tools, with techniques for measuring the moods that people report, systems for measuring behavior in the lab and in the field, and ways to monitor the physiology of mood and emotion, from functional brain scans to miniature sensors that monitor the body as people go about their everyday lives. The result has been a wealth of insights about “normal moods” that we can use to understand why we have a depression epidemic—an important first step in bringing it under control.

One fundamental question is why we have moods at all. What possible evolutionary purpose could they serve? In recent years, it’s become increasingly apparent that moods are a key adaptation that we share with other animals. The architecture of the ancient mood system not only influences what we feel, think, and do, but it guides our bodily responses to the world as well. So what do moods do? Moods help us assess information about our external and internal worlds. They summarize what’s favorable or unfavorable with respect to accomplishing key evolutionary goals, such as survival and reproduction. They’re a clever adaptation because they integrate multiple aspects of how well or poorly we’re doing. They track key resources in our external environment (like food, allies, potential mates) and our internal environment (like fatigue, hormone levels, adequacy of hydration).

Our moods are integral to understanding what motivates us. They tune behavior to situational requirements, getting us to do the right thing at the right time. High moods energize behavior and activate us to pursue rewards more vigorously, to “make hay while the sun shines.” Low moods are best understood as a stop-and-think mechanism, which restrains behavior and focuses attention on threats and obstacles. When a bear spends hours fishing for salmon at a favorite bend of the river and finds no fish, low mood helps the bear pull back and move on. The value of low mood in humans is particularly clear when people face serious dilemmas in which rash action could be dangerous or lead to further losses. Think of the newlywed who discovers a spouse’s infidelity, the company man fired after decades of service, or the parent who loses a child. Stop-and-think is essential in these situations to help limit a person’s further losses or to reconstitute a life plan. A rich and expanding vein of psychological research, both in and out of the lab, shows how low mood can be a boost to realistic appraisal, improving the accuracy of perception and judgment.

While low mood can be a vital adaptation, like any adaptation, it’s not perfect. Bigger brains enabled humans’ higher cognitive ability, but also made childbirth a higher-risk proposition. Bipedal walking freed up our hands for improved hunting and craftsmanship, but a more upright posture placed new pressures on the spinal column, rendering our species prone to back injuries and pain. So, too, we see a mix of costs and benefits for even the most useful psychological adaptations. Anxiety both saves us from legitimate dangers and sets us up for disabling forms of fear, sometimes when no dangers are present. Similarly, low mood brings with it the possibility for the chronic and severe forms of depression we see occurring at increasing rates around the world today.

Depression in Context

Although low mood is a primitive adaptation built into our very makeup, the conditions of modern life can set it into overdrive, creating a perfect storm for mood. In just a few hundred generations, humans colonized the planet, built cities, and invented technologies beyond the wildest dreams of earlier times. Today, the dizzying speed of change in our physical and cultural environment has outstripped the capacity of our nervous systems to keep up. What was once a perfectly good adaptation to a less revved-up and stimulating world is out of step with the demands of modern life.

For example, our capacity for mood evolved in the context of life on a rotating planet, with its predictable 24-hour cycle of light-and-dark phases. Our species is diurnal, and as hunter-gatherers, we spent hundreds of thousands of years being active during the daylight hours. Why? Because the best chance of our finding sustenance and other rewards was in the light phase. Just try to find edible berries by moonlight! As a result, we’re configured with a strong 24-hour biorhythm. Driven by cues of light and dark, this rhythm makes us alert during the day and sleepy at night. Then, about 10,000 years ago, we abandoned our nomadic hunter-gatherer ways and took to permanent dwellings. Village living, by itself, didn’t change our lightscape. But in the last 150 years, as we’ve increasingly traded the outdoors lifestyle of the farmer for the indoors lifestyle of the urbanite, we’ve begun to get less and less daylight.

Recent data show dramatic light deprivation, even in sunny places. When small devices that measure light exposure and duration were attached to adults in San Diego, it was discovered that the average person received a paltry 58 minutes of sunlight a day. What’s more, San Diegans who received less light exposure during their daily routines reported more symptoms of depression. But what about the light we receive from light bulbs? Unfortunately, it’s no substitute for the sun: artificial light is fainter and provides fewer mood benefits. Our newfound reliance on indoor light has effectively turned most people into cave dwellers.

Not only are we not getting enough light during the day, we’re getting too much at night. For this, we can blame Thomas Edison and, more recently, Steve Jobs. Consumer electronics—particularly laptops, smartphones, and iPads—are shining light into our eyes until just moments before we doze off. According to the National Sleep Foundation, more than 90 percent of Americans regularly use a computer or an electronic device of some kind in the hour before bed. The light exposure from light bulbs, TVs, iPads, and phone screens is enough to fool the brain, tricking our 24-hour biological clock and delaying sleep. Plus, the games, shows, texts, and emails on these devices often provide intense stimulation just when we should be winding down. It’s no surprise that surveys show that the average American sleeps 1.5 hours less now than in 1900.

Modern lighting conditions are out of sync with how our moods evolved. A consequence of getting light in the wrong places and times is that, rather than feeling alert during the day and sleepy at night, millions of us feel like the walking dead, insufficiently alert during the day and insufficiently tired at night.

Depression and the Pursuit of Happiness

Likewise, rapid changes in our cultural environment are breeding chronic low mood. For example, the past 15 years have witnessed an ever-growing stream of self-help and pop psych books examining happiness and how people can increase it. We’re the only species to look to culture to guide us on what feelings are desirable and how to manage undesirable feelings. No other animal has had so much advice available—spiritual, medical, psychological—about what to do when feeling down. These resources should serve as bulwarks against depression, but perversely, the opposite is true. Our predominant cultural messages about mood, though surely well-intentioned, are worsening the depression epidemic.

This is particularly true in the United States. Indeed, it’s hard to think of anything more American than the pursuit of happiness. After all, along with life and liberty, it’s written into the Declaration of Independence as a fundamental right. Wanting happiness is as American as apple pie. But how happy should we expect to be? Happier than other people around the globe?

It would appear so. Analysis of thousands of survey responses reported in the Journal of Personality and Social Psychology found that when people in different countries were asked to rate how desirable and appropriate it is to experience varying psychological states, high-arousal positive states like joy and affection were rated as more desirable and appropriate in Australia and the United States than in Taiwan and China. So what’s the problem? Everyone I know wants to be alive, free, and happy. What’s wrong with pursuing happiness to the fullest extent possible? The more you value your happiness, the happier you’ll be, right?

Not necessarily! Research led by psychologist Iris Mauss found evidence for an alternative hypothesis: people who value happiness more are, in fact, less likely to achieve their goal of feeling happy. These findings help us understand why our predominant cultural norms about mood are worsening the depression epidemic. Today, it’s common for people to assume that achieving happiness is like achieving other goals. If we simply work hard at it, we can master happiness, just as we can figure out how to use new computer software, play the piano, or speak a different language. However, research has shown repeatedly that the goal of becoming happier is different from these other goals. Setting a goal to become happier is like putting yourself on a treadmill that goes faster the harder you run. Efforts devoted to augmenting happiness backfire, disappointing—and potentially depressing—us when we can’t achieve our expected goal.

Rising happiness standards widen the gap between what we want to feel and what we actually feel. People who set unrealistic goals for mood states may be less able to accept or tolerate negative emotional experiences like anxiety or sadness. Oddly enough, being able to accept negative feelings—rather than always striving to make them disappear—seems to be associated with feeling better, not worse, over the long run. There’s evidence that when people accept negative feelings, those experiences draw less attention and less negative evaluation than they would otherwise. Some research shows that people who report an ability to accept negative feelings when they arise are less likely to experience depressive symptoms in the future.

Ultimately, the implicit cultural imperative to judge one’s life on the basis of personal happiness bumps us up against another wall: our mood system isn’t configured to deliver durable euphoria. Euphoria is a reward the mood system metes out along the way, on the road to pursuing other evolutionarily important goals, a reward for having sex or for when your first-choice date to the prom says yes. Life, however, metes these rewards out sparingly. A bunny gets pleasure from eating a carrot as the reward for finding one, but a well-tempered bunny doesn’t stay satiated. It’s the end of the pleasure and the promise of more that sets the bunny hopping off to find more carrots and ultimately to survive long enough to make more bunnies. This cruel phenomenon, called hedonic adaptation, explains why buying a shiny, red sports car rarely wards off a mid-life crisis.

Our unattainable cultural imperative for happiness, combined with the basic principle of hedonic adaptation, is a cruel combination. When people fall outside the zone of their desired mood, as they inevitably do, they consider this a personal failure. An increasing percentage of the population reports itself as chronically dissatisfied and chronically overreacting to low mood. When many of the same people engage in mood-punishing routines that feature too little sleep, light, and physical activity, you have the seedbeds of the depression epidemic.

Psychotherapy and Mood Science

Mood science presents new ways to understand the depression epidemic by taking a fresh look at the big picture. In other words, we can’t understand why depressed mood is so prevalent until we understand the design of the mood system and why we evolved the capacity for low mood. Of course, this doesn’t mean we’ve found a single cause for the depression epidemic; there’s no single villain. And mood science is a body of theories and empirical findings, not a therapy. Nevertheless, insights from the evolutionary psychology of mood have clinical utility. Psychotherapists are in a position to harness these insights to enrich their approaches. The findings of mood science are already converging with principles behind therapy models that emphasize understanding the brain, the mind–body connection, and the psychological impact of broader cultural forces.

Low moods naturally call out for interpretation, and people struggling with low mood often turn to psychotherapists to find meaning in their experience. But unfortunately, our culture discourages deep inquiry into the sources of mood. Because of how pharmaceutical companies dominate the airwaves, many clients have been taught to interpret low mood solely as evidence of an underlying disease and to medicate it. Not surprisingly, they’re often confused about the real sources of their moods; in a sense, they’re mood illiterate.

It’s important for therapists to become more systematic in assessing the factors that shape mood. An inventory of a client’s mood environment would include lifestyle choices, sleep regimen, light exposure, physical-activity levels, stressors, and progress toward major goals. But it would also include a client’s expectations for mood and how he or she reacts to and copes with negative feeling states.

Perhaps the most striking thing about the depression epidemic is that it’s unfolded in an era of unprecedented comfort and high living standards. Many of the most depressed people have clothing, food, reliable shelter, and cars and can reasonably expect to live longer than people in previous eras. But mood science links mood to the contingent goals that humans create, and psychotherapists know that goal creation can be a self-defeating process. How many people are depressed because they don’t see themselves as rich enough, pretty enough, successful enough? The goals of status, power, and affiliation may be primitive evolutionary givens, but their exact expressions are open to culture and the idiosyncrasies of an individual’s life history. Though our moods may be a legacy of evolution, we aren’t prisoners of it.

A key hypothesis from mood science is that our cultural epidemic of low mood results from people becoming fixated on the pursuit of unattainable goals. How many people chase low-probability outcomes, such as becoming a bestselling author or a famous actress? From early on, parents, teachers, and the media have touted the idea that children can become anything they want, so long as they’re willing to work for it. I’m 5’ 7” tall, and I remember many afternoons in the gym practicing my jump shot for my future career in the NBA. Yet an extraordinarily deep-rooted ethos in our culture shames and discourages people from ever giving up on a goal—whether it’s an unrealistic NBA dream or a failing marriage. Many people’s default response is therefore to double down, often locking themselves into depression for weeks, months, and years. Psychotherapists often succeed in their work because they understand that the right goals are a cornerstone of psychological well-being.

The right goals should be applied to therapy, too. Therapy for depression should set the bar higher than just reducing symptoms, which is the standard goal of the disease model of depression. Rather, the mood science approach takes seriously the aspiration for wellness. This is because wellness and thriving are more robust and clinically meaningful endpoints than reducing symptoms. Epidemiological research shows that people who achieve full symptomatic relief for a significant amount of time are less vulnerable to subsequent relapse. This isn’t a vision of lives without any moments of low mood. Rather, achieving a period of wellness, one that lasts at least a few months, allows a person to rebuild critical resources—friendships, coping skills, and life purpose—that buffer the person against the shocks they’ll inevitably face. Without such a wellness period, improvement collapses like a house of cards.

Understanding the forces that are seeding low mood in the depression epidemic can help us better understand how to achieve better therapeutic outcomes. We know that mood is the great integrator: we can alter it in an upward direction by changing how we think, the events around us, our goals, our relationships, what’s happening in our bodies (by exercising or sleeping better), and, yes, our brains (through medications or diet). Mood science, at its core, advises resourceful experimentation. Most therapeutic approaches pull only one lever: brain change, relationship change, cognitive change. A mood science approach urges us to look to a broad array of levers that may be used to influence mood. My own story might provide an example of why that’s so important.

Twenty years ago, when I was studying history in Baltimore, I became severely depressed over a period of four years, when I lost virtually everything I had: my first career, my pride, my equanimity. I went from a self-assured Ivy League graduate to someone who spent most days lying in bed or on the floor. For the first two years, I was slow to recognize that I had depression, slow to enter treatment, and too stubborn to change anything about how I was living. When I finally entered treatment, the depression was quite severe, and it didn’t budge in the face of psychotherapy, medication, or a month in the hospital.

It was only after years of what seemed like meaningless suffering that I began to understand and act on the insight that my mood was trying to tell me something. My depression offered a warning that I had put all my eggs in a single basket, which was about to be dashed on the sidewalk. I was determined to become a historian, even though the field of history offered virtually no university-level jobs, and I’d chosen a subfield of interest in which hiring was even rarer. With all my self-esteem locked up in this enterprise, I ignored these ominous signs and forced myself to continue working on a massive dissertation, despite an escalating low mood, which eventually left me bedridden. I believe my low mood was trying to force me to consider other life paths, and when I refused, a violent depression forced me to stop pursuing my career choice, and much of anything else.

It was a long time digging out. Becoming well and keeping depression at bay was connected to restoring purpose in my life, or more accurately, purposes. I’ve stayed well in part because, in terms of my emotional investments, I’ve diversified my portfolio. By moving from history to research psychology, I recreated a career. Also, I got married, had a daughter, and even developed full-blown hobbies, like running marathons. Most recently, I spent several years writing a book that aims to help others understand depression. Each of these enterprises has given me purpose and connected me to deep evolutionary desires, such as attachment, procreation, health, and affiliation. These purposes serve as a kind of armor, increasing my resistance to serious depression.

In this sense, even the deepest depression can lead to a kind of creative destruction—a forced questioning of basic assumptions—leading to new meanings, new beliefs, new goals, new behaviors, and even a new life narrative, in which a person emerges stronger and better able to endure hardship. This, again, is a reminder that we may be better off if we regard recovery not simply as the absence of depression symptoms, but as the cultivation of a set of active qualities or practices that prevent low mood from taking root.

Public Health Problem Number One

Our tendency to see depression as a problem of flawed, disconnected individuals surely interferes with our ability to address it as a public health issue. When people locate the problem as pertaining exclusively to a stigmatized diagnostic category of individuals, it’s easy to say that their suffering has nothing to do with me or my society. A sense of social distance and a lack of sympathy may explain why many people just shrug when they hear WHO’s dire projections about the growing effects of the depression epidemic.

Ironically, the notion of a chemical imbalance was originally intended to reduce stigma, as in “depression is a flaw in chemistry, not character.” But in practice, this idea has had the opposite effect. Research suggests that telling affected people they have a brain- or genetic-based illness makes them feel helpless; it doesn’t increase their willingness to seek treatment. Other evidence shows that adherence to the defect model is connected to holding disparaging attitudes toward people who struggle with depression as alien, threatening, and potentially dangerous.

Fortunately, several lively and novel campaigns are aspiring to break down the social isolation of depression and challenge the defect model. These go well beyond the usual preachy public service announcements to “get the facts.” These are dynamic social media campaigns, which forge bonds among people struggling with depression, people who’ve recovered from depression, and people who care about others suffering from depression. For example, the organization To Write Love on Her Arms counters suicidality with a well-orchestrated campaign called No One Else Can Play Your Part. Many musicians and athletes wear this organization’s distinctive T-shirts in photographs and performances, aiming to encourage conversation and connect people with resources in their communities. The Come Out of the Dark Campaign, a student-centered organization, mails out thousands of free glow-in-the-dark wristbands emblazoned with the slogan “Come Out of the Dark” to people who’ve been touched by depression and who share their stories and wristband photos in online galleries. In England, the It’s OK campaign, led by May Gabriel, aims—via blogs and Twitter posts—to break down the stigma of depression and self-harm among youth and adolescents. Through shared stories, disparaging notions about depressed people are softened and solidarity is forged.

We also need a more enlightened public conversation in print, on TV, online, and in professional gatherings about the real sources of the depression epidemic, including a recognition of larger social and cultural forces that are shaping it. We need to broaden our understanding of depression to include politically loaded issues such as poverty, our culture’s obsession with status and wealth, women’s unequal status, and the decline of the traditional extended family. And psychotherapists have an important role to play in expanding our cultural awareness of what’s emerged as the number-one mental health challenge we face. To fulfill its obligation toward the tens of millions of Americans struggling with depression, the psychotherapy community must venture outside its comfort zone.

In the aftermath of Robin Williams’s death, mood science offers a new and hopeful perspective on the depression epidemic, revealing that low mood and depression are the natural products of an ancient and deep evolutionary process. Instead of assuming weakness or defectiveness in people who grapple with depression, it highlights the fact that depression, rather than being a permanent debility, can often be followed by thriving. Writing these words 15 years after my own dark episode, I can testify to the truth of that observation. To meet the challenge of the depression epidemic, our society needs to get beyond the defect model and honor the strengths of depressed people, seeing their potential for rebirth and the ways by which, once reborn, they can help others build their own enduring recoveries.

 

Illustration © Paul Schulenburg

Jonathan Rottenberg

Jonathan Rottenberg, PhD, is an associate professor of psychology at the University of South Florida, where he’s director of the Mood and Emotion Laboratory. He’s the author of The Depths: The Evolutionary Origins of the Depression Epidemic. In 2013, he started the Come Out of the Dark campaign.