Bipolar 2 From Inside and Out

Grippy Socks and Sour Candy

My husband is a great help when I write my blogs. He keeps an eye out for news stories that deal with mental health in some fashion. So when he saw an article on new words related to the topic, he made sure I saw it. Then he asked me how I felt about it.

The story was about new language that young people were using to describe various mental health concepts.

First and foremost among them was “grippy socks vacay”—a reference to the footwear issued to people who have been committed, voluntarily or otherwise, to psych wards. But “vacay” is short for “vacation.” I can just picture a conversation using it: “Where’s Janet been?” “Oh, she’s been on a grippy socks vacay.” Or “I’m stressed. It’s time I went on a grippy socks vacay.” It seems unlikely that the people who say these things are always referring to an actual stay in a psych ward.

I was more than slightly appalled. It’s true that grippy socks evoke the image of a hospital stay. But grippy socks are part of any stay in any department of a hospital, not just psych wards. And such a stay is hardly a vacation. It’s likely, I think, that people use the phrase to mean something like “relaxing getaway” or “time to clear my head.” An actual stay in a psych ward, however, is not a relaxing getaway. It’s intense. It’s not supposed to be relaxing. And while it does provide time to clear one’s head, even that description falls short. Medication, group therapy, and individual therapy may eventually clear one’s head or at least change one’s perspective, but a psych ward stay is hardly just time away from work and day-to-day stresses.

The article went on to discuss whether the phrase increased or decreased stigma. Some said one, some the other. I think it perpetuates stigma. It implies that someone who is in a psych ward is there to have a good time. “Grippy socks vacay” is demeaning when the hard work that mental patients must accomplish is considered.

If it’s used as a euphemism for an actual psych ward stay, it’s insensitive at the least. If it just means time off from daily cares, it’s still inaccurate and discounts the real experience. Those things can’t be good for reducing stigma.

Now, my friends and I have been known to use irreverent language to refer to our conditions. Robbin and I used to say on occasion that we needed a “check-up from the neck up.” We used it just between the two of us (well, I’ve also used it with my husband) to indicate that we needed to see our therapists. But I don’t see it as being demeaning, especially since we never used it in the context of anything but our own disorders, not a general description of someone the general populace would slangily describe as “crazy.” If we had said of any popular figure that they needed a check-up from the neck up, that would have been something else. But we didn’t.

Of course, you may disagree with this and I’d love to hear from you regarding your opinion.

The other article my husband shared with me reported a trend on TikTok of using sour candy to ward off anxiety. The article even said that experts backed up the theory.

The idea is that the intense sensation of sourness distracts the brain from the cause of the anxiety. It’s a distraction technique, like snapping a rubber band on your wrist to take your mind away from unwanted thoughts. One expert interviewed for the article said, “Panic ensues when our amygdala triggers the flight or fight response. One way to dampen our amygdala’s response and mitigate panic is by turning our attention to the present moment through our senses: taste, smell, touch, sight, and hearing.” Mindfulness through candy, I guess, would be a way to describe it. The experts also advise grounding yourself with other sensations such as the scent of essential oils.

Other experts noted that sour candy is a kind of crutch and not a long-term solution. One called it “maladaptive.” Sensory distractions, they said, were most effective in conjunction with acceptance rather than avoidance.

What’s the takeaway from this? Aside from the potential boost in sales for Jolly Ranchers, I mean. I think it’s a good reminder that there are ways to short-circuit anxiety and panic. And for people who only experience occasional, momentary anxiety, it’s probably a good thing. But for someone with an actual anxiety or panic disorder, it’s likely to be only one tool they use — and a minor one, at that.

What have you been reading recently about mental health trends? I’d love for you to share that, too.

Lately, I’ve been seeing articles with titles that say codependency is a myth or a hoax. They claim that the concept is not just wrong but harmful. Despite its almost 40-year history, codependency now seems invalid to many.

Codependency is defined as a mechanism whereby enablers are enmeshed with their child, spouse, sibling, or significant other to such an extent that they lose the ability to take care of their own emotional needs. The enabling also means that the person suffering from a psychological condition (originally addiction, but later other problems) does not have the motivation to work on themselves or change their behavior. In extreme cases, it means that one partner cannot tell where they end and the other begins.

My husband introduced me to the concept of codependency. He has a background in psychology and was greatly influenced by Melody Beattie’s writing. Her book Codependent No More (published in 1986 but still selling well), his work with Adult Children of Alcoholics (ACA), and attendance at seminars on the topic have made him a staunch believer. When I told him about the articles, he scoffed. In fact, he seemed offended. To him, codependency is a basic tenet that aligns with his experience in psychology.

So, what are the objections to the concept of codependency?

First of all, it’s not a recognized psychological condition in that it’s not an official diagnosis. There are no specific diagnostic criteria, though there is a list of symptoms including fears of rejection or abandonment, avoiding conflict, making decisions for or trying to manage the loved one, keeping others happy to the detriment of self, and generally a “focus on caretaking and caring for others to the point that you begin to define yourself in relation to their needs.” Admittedly, those are largely squishy criteria (there are others), some of which overlap with officially recognized diagnoses.

Another definition states, “The codependent person sacrifices their needs to meet the demands and expectations of the other person. These individuals may also strongly desire to ‘fix’ the other person’s problems. The individual often neglects their self-care and personal growth in the process.” This was developed in the context of addiction studies, and some people object to the concept being broadened to include other circumstances.

More significant is the idea that the concept pathologizes love and support. Interdependence is the natural function of intimate relationships, and depending on each other is the ideal. Codependency theory is said to downplay helping behaviors that are essential to good relationships. In addition, codependency is often viewed as a “women’s problem,” and that reinforces patriarchal stereotypes, such as that women are “needy.” Instead, a person labeled codependent should work on overwriting old scripts of anxious attachment and other negative feedback loops.

Codependency is said to have contributed to the “tough love” movement, which involved a hands-off approach to a loved one’s addiction, allowing them to experience the natural consequences of their behaviors. Tough love is discredited these days as a form of verbal abuse and a philosophy with no basis in psychological practice, one that reinforces the idea that an addict must hit “rock bottom” before they are able to accept help. Tough love also promoted a model of intervention as a process involving anger, blame, non-compassionate confrontation, and the use of psychologically damaging “boot camps” for troubled teens.

Then, too, it is said that there is no research validating the concept of codependency, no way to measure it, and no effective treatment for it.

There’s another point of view, though—that codependency is a real, serious problem.

Let’s take that last point first. Research on codependency has revealed specific behaviors associated with it and a tendency to repeat those behaviors in subsequent relationships. Research has also indicated sex differences in codependency, with women being more likely to suffer from it. (It should be noted that women also dominate in diagnoses such as depression. Both genders are affected by depression and codependency, however.) As with depression and other conditions, statistics can report how many people suffer from codependency, but not how severe their condition is. Also, codependency has its roots in attachment theory, family systems theory, and trauma studies.

Treatment for codependency is quite possible. Education, individual therapy, couples/family therapy, group therapy, CBT, and DBT have all had beneficial effects. Even 12-step programs such as Codependents Anonymous are possible ways to address codependency. And, like some other disorders, codependency responds to techniques such as boundary setting, building on strength and resilience, and self-care. It also has other characteristics common to other conditions—relapses and setbacks, for example.

As for the idea that codependency pathologizes love and support, it is true that these qualities are essential to the human experience and good things in and of themselves. But when those qualities get hijacked by excessive, misdirected, and exaggerated needs, they can become pathological. After all, moderate depression and anxiety are parts of the human experience too, but when they strike with extreme manifestations, they become pathological as well. To say that all expressions of love and support are good is to ignore the harm that they can do when they interfere with those normal experiences of human interaction.

And while the concept of codependence may have started in the field of addiction studies, there’s no indication that that’s the only place where it belongs. Plenty of psychological concepts begin in one area of study and expand into others. The idea of healing the inner child may have started with trauma studies, but it now applies to other areas as well, such as grief therapy and other abandonment issues (including codependency).

What does all this add up to? I think my husband and the proponents of codependency theory have a point. The fact that it hasn’t been sufficiently studied doesn’t mean that it doesn’t exist, just that it is a comparatively recent idea compared to other conditions and pathologies. It has demonstrable effects on relationships and makes logical sense. If two people become enmeshed, their behaviors are likely to become warped and dysfunctional. In fact, dysfunction is one of the hallmarks of codependency. It explains relationship dysfunction in a way that few other concepts do. It may not be the only relationship hazard, but it checks a lot of the boxes.

Sure, the term codependency has been overused, especially in the type of pop psychology promoted by assorted self-help articles and books. But so have other psychological concepts and societal problems. Just because gender studies has had limited usefulness in analyzing male and female communication styles doesn’t mean that it has nothing to tell us.

So, do I think that the concept of codependency is a myth? No. Do I believe that it’s a “hoax,” as some have claimed? Again, no. Is the concept itself toxic? Does it imply that love and support are invalid? No. Is it overused by people who don’t understand it? Certainly. Does codependency deserve more study and practice before we discard it? Definitely.

I’ve seen codependency working in people’s lives. Anecdotal evidence isn’t sufficient to prove its reality, of course, but it’s a starting point for further exploration by professionals. Just because something doesn’t appear in the DSM, a notoriously changeable document, doesn’t mean it’s not real.

Mental illness is not just an American problem. In fact, it’s a problem around the world, and perhaps much more acute in other nations, especially those plagued by poverty.

There’s no way to know for sure, but many – perhaps most – of the world’s mentally ill are undiagnosed, untreated, ignored. Because what do you do when you live where there’s no psychiatrist? No therapist. No medication. No help.

Your family may support you, shelter you, or shun you, depending on their financial and emotional resources and those of the community. But for many people, there is simply nothing.

Psychiatrist Vikram Patel, one of Time magazine’s 100 Most Influential People for 2015, is working to change that.

As a recent profile in Discover magazine put it, Patel and others like him have set out to prove “that mental illnesses, like bipolar disease, schizophrenia, and depression are medical issues, not character weaknesses. They take a major toll on the world’s health, and addressing them is a necessity, not a luxury.”

In 2003, Patel wrote a handbook, Where There Is No Psychiatrist: A Mental Health Care Manual, to be used by health workers and volunteers in poverty-stricken communities in Africa and Asia. A new edition, co-written with Charlotte Hanlon, is due out at the end of this month.

In his first job out of med school, in Harare, Zimbabwe, Patel says he learned that there wasn’t even a word for “depression” in the local language, though it afflicted 25% of the people at a local primary care clinic. There was little study of diagnosis and treatment in “underserved areas.”

Later, epidemiologists learned to their surprise that mental illnesses were among the top ten causes of disability around the world – more than heart disease, cancer, malaria, or lung disease. Their report was not enough to spur investment in worldwide mental health.

Patel developed the model of lay counselors – local people who know the local culture – guiding people with depression, schizophrenia, and other illnesses through interventions including talk therapy and group counseling. By 2016, the World Health Organization (WHO) admitted that every dollar invested in psychological treatment in developing countries paid off fourfold in productivity because of the number of people able to return to work.

One objection voiced about Patel’s model is that the real problem is poverty, not depression or other mental illness. The argument goes that the misery of being poor, not a psychiatric illness, leads to symptoms and that Westerners are exporting their notions of mental health to the rest of the world, backed up by Big Pharma. Patel responds, “Telling people that they’re not depressed, they’re just poor, is saying you can only be depressed if you’re rich … I certainly think there’s been a transformation in the awareness of mental illnesses as genuine causes of human suffering for rich and poor alike.”

Of course the problem of underserved mentally ill people is not exclusive to impoverished nations. There are pockets in American society where the mentally ill live in the midst of privilege, but with the resources of the Third World – the homeless mentally ill, institutionalized elders, the incarcerated, the misdiagnosed, those in rural areas far from mental health resources, the underaged, the people whose families don’t understand, or don’t care, or can’t help, or won’t.

I don’t know whether Patel’s model of community self-help can work for those populations as well as it does internationally. This is not the self-help of the 1970s and 80s, when shelves in bookstores overflowed with volumes promising to cure anything from depression to toxic relationships. It would be shameful if the rich received one standard of care for mental health problems, while the poor had to make do with DIY solutions, or none.

But, really, isn’t that what we’ve got now?

Control/No Control

When I was a kid, my family used to go to visit relatives in Campton and Beattyville, Kentucky. It was always a good time. There were barns to play in, fishing, berrying, eggs to gather, and so forth. To get there, we took what was then a toll road called the Mountain Parkway. I loved dropping change in the bucket as we passed through the toll stations.

The road wound and twisted up into the mountains. There were steep dropoffs along the sides. I don’t remember railings, though I suppose there were some. We visited there about once a year during summer vacation. My Dad drove.

I have a number of things on my List of Things I’ll Never Be Able to Do Again, and going to Campton is one of them. For one thing, I have no relatives left there anymore—most were quite aged back then and their children have scattered. But the more important reason is that I could not handle the drive.

When I was in Ireland with my husband, we rented a car and drove around the country. The GPS that came with the car was sketchy at best. It took us on one-lane roads that meandered through the hills. On the larger roads, there were many rotaries, which we had never driven through before. Eventually, we started relying on my phone and Google Maps, which didn’t get us lost as often or run us off into ditches. We still ended up going on twisty back roads.

But I was terrified the entire time we were driving. Dan had to drive since I couldn’t adjust to driving on the left (I tried once and gave up). My nerves couldn’t handle it. The entire time we were driving, I had my hand braced against the roof of the car. When it was particularly frightening, I made a peculiar humming noise that Dan had to learn to ignore. He’d remind me that I had anti-anxiety meds I could take, too. I did, but they didn’t stop my symptoms.

Fast forward a couple of years. We were in Gatlinburg, Tennessee, driving around looking for where we stayed and where we were going. Again, we used Google Maps on my phone. Again, we were traveling on twisty back roads with sudden hills and no shoulders to speak of. Again I clung to the Oh Shit handle and made the weird humming noise as we navigated the convoluted routes. Again I took anti-anxiety meds.

Then I had a revelation: I could never go to Campton again, even if Dan was driving. The bends in the road and the steep drop-offs would prove too daunting. I don’t want to put myself through that again if I don’t have to. And I don’t want to have to.

I don’t have trouble driving on surface streets or highways, even alone. Those I can handle—even for four- or five-hour drives.

When I’m driving, I feel in control of the vehicle and don’t have the massive anxiety. That is, unless the circumstances involve something that makes me feel out of control, like left-side driving or narrow roads with switchbacks and doglegs. Even if Dan drives and I navigate, I still do the clutching and humming thing. It’s exhausting. If I were driving, I would have to go 20 mph and mightily piss off the cars behind me.

The bottom line? I can drive myself places, but only under certain conditions when I feel in control. If there’s a factor—or more than one—that makes me feel out of control, I can’t do it.

I like to think that I’m not a control freak under other circumstances. There’s just something about a machine that weighs that much going at a speed that feels unsafe in terrain that strikes me as difficult. This still leaves me a lot of places I can go, even without Dan. But not everywhere. And that makes me feel sad and incompetent, two feelings that I don’t like and that there’s no medication for.

Pill-Shaming

When I first started taking Prozac, when it was just becoming ubiquitous, my mother said, “I hear it’s a ticking time bomb!”

“Oh, dear!” I thought. “Mom’s been listening to Phil Donahue again.” (She had been, but that’s not the point.)

Back in the day, Prozac was hailed as a miracle drug and condemned as a killer drug. On the one hand, it was said to be a “magic bullet” for depression. On the other, it was supposed to result in addiction and suicide.

It’s probably true that it was prescribed too often to too many people who may not really have needed it. And it may have led to suicides—not because Prozac prompted such an action, but either because it was improperly prescribed or because it activated people who were already passively suicidal and pushed them into action.

At any rate, Prozac was not an unmixed blessing.

For me, it was closer to a miracle drug. It was the first medication that had any significant effect on my depression. I noticed no side effects.

But Prozac is no longer the psychiatric drug of choice. Since that time, hundreds—maybe thousands—of psychotropic drugs have been introduced and widely prescribed. Many have proved just as controversial as Prozac. Indeed, the whole concept of psychiatric drugs is now controversial.

I belong to a lot of Facebook groups that encourage discussion on psychological matters and have a lot of Facebook friends with opinions on them, sometimes very strong ones. Some of the people with the strongest opinions are those who condemn certain classes of psychiatric drugs or that category of drugs altogether. They share horror stories of addiction, atrocious side effects, zombie-like behavior, and even death from the use of these drugs.

Benzos are the drugs that are most often condemned. And it’s true that they can be addictive if they’re misused. Whether that’s because a doctor overprescribes them or a patient takes more than prescribed I couldn’t say. But I maintain that benzos aren’t inherently harmful when prescribed appropriately and supervised professionally.

I have personal experience with benzos. They were the first psychiatric drug I ever took, meant to relieve a rather severe nervous tic that affected my neck and head during junior high school. I do remember walking off a short stepstool while shelving books in the library, but I was not injured and the misstep could be attributable to ordinary clumsiness, which was something I was known for (and still am). The benzos were discontinued when I got better. I also took benzos in college because of pain due to temporomandibular joint problems.

Now I have benzos that my psychiatrist prescribed “as needed” for anxiety and sleep disturbances. After all the years I’ve seen him and my history of compliance with prescribed medication, plus the very low doses, he had no hesitation prescribing, and I have no objection to taking them.

But some of the people I see online object to any psychiatric drugs whatsoever. Again, the most common complaints are addiction, side effects, and zombie-like behavior. Of course, I can’t—won’t—deny that they have suffered these effects. Psychotropics are known to affect different people differently. I’ve had side effects from many of the ones I’ve taken that were too unpleasant for me to continue taking the drugs. But after all the different meds I’ve tried during my journey to a combination of drugs that work for me, it would be a surprise if I objected to them altogether.

But I don’t. I’ve had cautious, responsible psychiatrists who’ve prescribed cautiously and monitored rigorously, listening to me when I reported side effects.

So, my personal experiences have been good. I know not everyone’s experiences have been, for a variety of reasons.

What I object to is the drumbeat of “all psychotropic medications are bad and ruin lives.” And the memes that show pictures of forests and puppies that say “These are antidepressants” and pictures of pills with the caption “These are shit.”

I hope those messages don’t steer people who need them away from psychotropic medications. And I hope that people who do need them find prescribers who are conscientious, cautious, and responsible in prescribing them. On balance, I think they’re a good thing.

I’m Sorry

The other day, my husband was putting together a magnifying lamp that I had bought to help me repair some jewelry. I was trying to adjust the lamp to a height where it would be usable and comfortable. The lamp was a cheap piece of shit and it broke.

Instantly, I apologized. The clamp broke. I apologized again. It turned out that the pin holding the clamp together broke. I apologized again. My husband determined that it was not fixable as it was. Guess what I did? That’s right — said, “I’m sorry.” I said I was sorry for ordering the cheap thing. I said I was sorry for wasting money. I was sorry for wasting my husband’s time. I was sorry for everything.

The week before, I wanted to go to an art house in a nearby town to see the documentary about Joan Baez. The whole way there, I was nervous — about the route we were taking, whether we would find parking near enough to the theater, whether we should eat dinner before or after the movie. And especially whether Dan would like the film. On the way home, I kept asking him, “Was that okay? Did you like it? Is it okay that I chose the movie? Is it okay that I chose that movie?”

On the way home, he reassured me. He liked the movie. He learned things he hadn’t known about Joan Baez. We were lucky to find a parking place so near the theater. It was a nice evening for a drive.

Then he said, “Where’s all this coming from?”

“I chose the movie and the time and bought the tickets and decided which theater to see it at. If anything went wrong, it was all my fault.”

“Ah. Old tapes.”

In these recent cases, things went right. Dan figured out a way to fix the magnifying lamp by cannibalizing another lamp. We got to the movie on time and got good seats. We found a handicapped parking spot open right across from the theater. The movie was great. I felt better after we got home.

Dan was right, though. The excessive apologies started in my past — not with Dan — further back in time than that. If something was my choice, and it didn’t turn out great, it was wrecked. I realize this is all-or-nothing thinking, which is counterproductive.

Even before the old tapes, though, I had a habit of feeling sorry for everything and saying so. I apologized for everything. And I punished myself. If I said something “wrong” or even a tiny bit off-color, I tapped my cheek with an open hand, symbolically slapping myself for doing something bad. (I think it’s important to note that my parents never slapped me as a child, so I don’t know where that came from.)

And I apologized endlessly. For everything. My friends noticed. They asked why I did it. They let me know that it was annoying. I tried consciously to stop. And after a while, after having friends who stuck with me, after practice, I did stop. For a while.

Then I got in a relationship with a gaslighter and again felt guilty for everything. He blamed me for things I did and things I didn’t do. Once, he even claimed that when I did something wrong in front of company, I had offended his honor. And of course, if I selected anything — where we went, what we ate, what music we listened to, I was at fault. I was at fault for liking mayo on my sandwiches and for not offering him a bite of my sandwich. I was seriously wrong not to wait for him even though he was past the time for a meet-up with friends. Wrong to hook up with a friend while he was hooking up with one of mine in the next room. Eventually, I shut down, afraid to do anything.

Years later, I got past the apologizing, for the most part. The past two weeks, I’ve been backsliding. I think it may be because money has been extra tight, which makes me extremely nervous, and I’ve had to tell Dan he can’t make some purchases now. That feels treacherous, even though he doesn’t complain or blame or shame me. But it puts me back into the mindset of blaming myself before someone else can. It’s not comfortable for either of us. It’s all I can do not to apologize for feeling this way, for my disorder having this effect.

I’m hoping that writing about it will help me work out how I feel. And maybe make the apologies back off. At least for a while.

Am I Neurodivergent?

Last week I wrote about language that has lost its technical meaning and passed into popular usage. This week I want to explore a term that may or may not apply to me—neurodivergent.

The dictionary definition I looked up said that neurodivergent means “differing in mental or neurological function from what is considered typical or normal.” It added that the term is “frequently used with reference to autistic spectrum disorders.” The alternate definition given is “not neurotypical,” which is no help at all.

I’m not on the autism spectrum, so I don’t “qualify” as neurodivergent that way. And I don’t have any of the other disorders, like ADHD, that typically are associated with neurodiversity. So where does that leave me?

Another definition: “Neurodiversity describes the idea that people experience and interact with the world around them in many different ways; there is no one ‘right’ way of thinking, learning, and behaving, and differences are not viewed as deficits.”

I like that better. It leaves room for a lot of varieties of neurodivergence.

A medical website explained it this way: “Neurodivergent is a nonmedical term that describes people whose brains develop or work differently for some reason. This means the person has different strengths and struggles from people whose brains develop or work more typically. While some people who are neurodivergent have medical conditions, it also happens to people where a medical condition or diagnosis hasn’t been identified.”

Wikipedia also notes, “Some neurodiversity advocates and researchers argue that the neurodiversity paradigm is the middle ground between strong medical model and strong social model.”

It’s true that my bipolar disorder means my brain and my behavior are not typical. I feel neurodivergent, even though I know that’s hardly a criterion. I’ve accepted that my bipolar is somewhere in the middle ground between the medical model and the social model. For a long time, I believed in the medical model absolutely. To me, my bipolar disorder was brought on by bad brain chemistry. I couldn’t see any glaring social problems such as abuse in my family. Mine was very much the traditional social model—working father, stay-at-home mother, one sister. I never suffered domestic violence or sexual abuse.

What I didn’t see was that there are other kinds of traumatic events, some of which I did experience as a child. Some of them were so painful that I remember having meltdowns because of them. Young adulthood brought more trauma. Since then, a combination of medication and therapy has helped. Perhaps the medication helped with the part of my disorder that was caused by my brain, while therapy helped the social trauma part.

Lately, I’ve been thinking about the idea of a spectrum. Although autism is often described using the convenient concept of a spectrum, I know that “being on the autism spectrum” is not accepted by all. High-functioning or low-functioning, people are, at base, autistic or non-autistic, not “a little bit autistic,” as the spectrum seems to imply. Other people find the spectrum idea useful.

I’ve also thought about the introvert/extrovert spectrum. It makes sense to me that no one is truly at either end of that spectrum—all introvert or all extrovert. Nor is anyone pure ambivert, evenly poised between the two ends of the spectrum. We’re all various degrees of ambivert, leaning toward one side or the other, but sharing some of the characteristics of each.

My brain has developed differently or works differently for some reason. But according to the spectrum philosophy, no one is totally neurotypical or totally neurodivergent. We’re all jumbled somewhere in the middle. A little to one side and we’re considered one or the other. So, I see myself as on the neurodiversity spectrum—neither one nor the other and not evenly balanced between the two. Somewhere in the relative middle, to one side or the other. Part neurodivergent and part neurotypical.

Whatever I am, I’m not 100% neurotypical or 100% neurodivergent. But I’m at least partly neurodivergent. And I’m comfortable with that.

Language Lost

There are many words that are specific to psychology, including diagnoses, symptoms, and therapeutic techniques. Many of those terms, however, have worked their way into general conversation. Some think this is a good thing as it makes society more aware of the language we as psychiatric patients use. Others object to this use of language. They see it as diluting the meaning of the terms.

Two of the most common words that have made this shift are bipolar and OCD. Instead of diagnoses, they’re often used as descriptions of people or things that are thought to share the characteristics of the disorders. “The weather is bipolar this month.” “Beth’s house is really tidy. She’s so OCD.” These usages are, of course, inaccurate. Weather can’t have a psychiatric disorder, and a neat house is not enough to diagnose a person with OCD.

The thing is, people aren’t using them literally. Weather being bipolar is a metaphor. It conveys the idea that the weather is changeable, seemingly at random. Calling weather bipolar expresses the concept more vividly, which is probably why it has become so popular. Calling someone OCD is an exaggeration used for effect. They’re saying that Beth is not just neat, but excessively neat. The people who use these expressions don’t really know what the terms mean. They’ve just heard them used and picked up a vague, superficial sense of them.

Spoons is another metaphor gone astray. Originally, it was used to describe the depletion of energy that someone with an “invisible illness” feels when they’re required to do more than they’re capable of on any given day. Spoons are a variable commodity. The neurodivergent or physically challenged never know how many “spoons” they will have at the beginning of a day or when they’ll run out of them. It’s a powerful metaphor that makes the concept easier to understand.

Nowadays, however, it’s used by people who don’t face these challenges to mean simply “I’m tired” or “I’m done for the day.” But these people don’t have a widely varying amount of energy at the start of each day. Oh, they may be more or less tired depending on the quantity and quality of their sleep. But they don’t begin with so few spoons that getting out of bed requires an enormous expenditure of spoons that depletes them for the rest of the day.

The word triggers is not a metaphor, but a word that has weakened over time. In psychological terms, a trigger is something that brings back vivid memories and sensations of a traumatic incident. The person who is triggered cannot control their reactions and will experience the event as if it were actually occurring in real time. In its new meaning, a trigger is anything that a person doesn’t like or that makes them uncomfortable. This discomfort is minor and fleeting, and does not cause sensory overload. People who use “triggered” this way betray a deep misunderstanding of the term and often make fun of the concept altogether.

These and other terms like neurodivergent and spectrum are also frequently misunderstood or misused. Some are still being defined and arguments about what they really mean often occur.

People who use the words in their specific, technical sense sometimes speak of “reclaiming” them. They are offended by the perceived misuse of the various terms and want to restrict them to their original, technical meanings. They want other people to stop using them in their new senses. They feel the new usage cheapens the words.

The thing is, language doesn’t work that way. Once a word or phrase has “escaped into the wild” and is being used with a different shade of meaning, there’s no getting it back. No matter how much you try to educate people about the “real” meaning of the word, most people will not even realize they are using it “wrong” and won’t stop using it in the new sense. In fact, the first dictionary definition of bipolar is “having or relating to two poles or extremities,” not the disorder. The non-psychiatric sense of OCD as an adjective hasn’t made it to the dictionary yet, but it’s only a matter of time now.

Personally, I can think of things a lot more heinous than describing me and the weather the same way. Is it ignorant? Yes. Is it insulting? Probably. I just think it’s a waste of time correcting one person at a time or trying to educate the masses about it. Millions of people are still going to do it, and there are more important things to educate them about.

Those of us with brain illnesses such as bipolar disorder, OCD, PTSD, and schizophrenia often speak of the stigma associated with our problems. It’s no wonder—stigma affects our lives in both predictable and unpredictable ways.

For instance, say you’ve become comfortable talking about your disorder. Then one day when you’re at a reunion or some other gathering, you happen to mention it and get the glazed-eyes-fixed-smile-back-away-slowly response. Sure, a lot of people don’t know what to say to you, but that reaction just makes it clear that you are different and, to that person, potentially a source of danger. Someone to be avoided. Someone not to engage with.

That’s stigma.

Prejudice is related to stigma. It’s just a short step away. Prejudice happens when people have a preconceived idea of what brain illness looks like. (Prejudice, after all, means judging in advance.) This could be a person who assumes that a serial killer or mass shooter is obviously “insane.” Their assumptions are reinforced when it’s revealed that the perpetrator had a history of psychological problems or had taken medication. They’re ignorant of the facts—that most killings are prompted by motivations such as rage, gender or racial hatred, jealousy, or fear. They don’t know that the mentally ill are much more likely to be the victims of violence than to be perpetrators.

People with prejudice against people with mental illness can also assume that psychiatric diagnoses are not “real.” They think people with these conditions can—and should—just “snap out of it,” “pull themselves up by their bootstraps,” or “get over” their problems. They look down on people who seek help. They make jokes about “crazies” and “lunatics.” They believe that anyone with a “real” mental illness is in a locked ward in an “insane asylum,” or should be. They don’t know that straitjackets aren’t used anymore and think they make funny Halloween costumes.

In other words, people who are prejudiced lack understanding and empathy.

Discrimination takes it one step further.

When people with brain illnesses suffer from discrimination, they lose opportunities because of their condition. If they are open about their diagnosis on applications, they may never receive a callback or an interview for a job. They may start receiving bad evaluations at work if they have to leave for doctor’s appointments, or be let go for not getting along with other workers, many of whom may be prejudiced against them. They don’t receive the accommodations required by the Americans with Disabilities Act (ADA).

Discrimination can also affect Social Security Disability decisions. It’s not supposed to work that way, but people with mental illnesses are likely to have more difficulty “proving” that their disability is severe enough to warrant supplemental income.

So what’s to be done? Education is the solution we always advocate. But it’s a hard ask. It’s difficult to get anyone to learn about the realities of brain illnesses. People don’t learn about them in school, and the messages they get from the media do little except reinforce, and often perpetuate, the stigma surrounding the various conditions.

Pushback is another strategy. We simply cannot let it pass when someone makes a prejudiced remark or demonstrates a lack of understanding. We can speak up about inappropriate Halloween costumes or assumptions about violence and the mentally ill. We can inform others that not all homeless persons are mentally ill. In fact, most homelessness is caused by a lack of affordable housing and low wages.

When it comes to discrimination, legislation and activism are often the solutions or at least the beginnings of them. Lobbying efforts regarding policy and treatment will help. Lots more needs to be done to inform legislators about the very basics, much less the possible ways to address the problems. Reporting violations of the ADA may not lead to resolutions, but it still needs to be done.

Of course, it’s difficult for many people with brain illnesses to do these things. We are frequently isolated and doubt our own abilities. Confronting legislators, educating them, and lobbying for their attention is daunting. Neurotypical people have trouble doing it, especially without an organization that gives them leverage. But it’s work that needs to be done. I admit that I’m not at the forefront, though these blog posts and my books are intended to help educate, and the groups I belong to try to do likewise.

It’s not enough. But it’s a start.

Positivity and Acceptance

Those who follow this blog have seen me rail against toxic positivity. When it’s not absurd, it’s insulting to those of us with mood disorders. No, we can’t just cheer up. If we could look at the bright side, we wouldn’t have depression or anxiety. You may be able to choose happiness, but I can’t. I’ve needed medication and therapy just to feel meh at times. If I could turn bipolar disorder off like a light switch, don’t you think I’d do it?

Toxic positivity can be seen nearly everywhere, in a lot of different situations: the self-help movement, of course, but also business, medicine, and even religion – as well as endless memes. American society is rife with toxic positivity. It appears in motivational business conventions and TED Talks. Salespeople are advised to think positively and envision success. Breast cancer survivors are advised to keep a positive attitude, to the extent that they are encouraged to tell how the disease has had a positive effect on their lives and relationships. (Expressions of fear, anger, and other natural emotions in response to the diagnosis are downplayed or discouraged.) Religions can exhort us to count our blessings or “manifest” our wants and needs by using positive thoughts to attract them.

Positivity becomes toxic when it is seen as the only method of coping with problems in life, even ones that have other solutions or none. Toxic positivity presents relentless cheer as the only acceptable reaction and a panacea for every difficulty. And toxic positivity leads people to demand that others take up the mindset and apply it to every situation, even devastating ones. As such, it denies the reality of human suffering and normal emotional responses. It’s a form of non-acceptance.

So, what is the alternative? What is a more natural – but still effective – technique for dealing with difficulties? How can those of us who have mood disorders or any other brain illness find ways to navigate through life without slapping on a smile and coercing our emotions to fit a certain mold?

Radical acceptance is one answer. Radical acceptance means that you accept your inner feelings and your outward circumstances as they are, especially when they are not under your control. You acknowledge reality without trying to impose a set of emotional mandates on it. Your acceptance and acknowledgment may involve pain or discomfort, but those are understandable, normal human conditions that evoke a natural response.

Rooted in Buddhist teachings and given a name by Marsha Linehan, the psychologist who developed dialectical behavior therapy (DBT), radical acceptance uses mindfulness to help people learn to face and regulate their emotions. Interestingly, one 2018 study found that accepting your negative emotions without judgment is a factor in psychological health.

With radical acceptance, when you encounter difficult situations and emotions, you note their presence without trying to suppress them. You accept them, as the name implies. This attitude can address – and reduce – feelings of shame and distress that you may feel, especially when you are not able to simply shut off those feelings and replace them with positivity. That doesn’t mean that you wallow in unpleasant feelings or allow unfortunate circumstances to stunt your responses.

Instead, you note the feelings – accept that they exist – and “hold space” for them within you. You appreciate that your emotions can lead you to new understandings of and reactions to your circumstances. For example, instead of adhering to the unattainable maxim that “Failure is not an option,” you can recognize when you have indeed failed and accept it as a natural part of life. You can then move on to a mindset of growth where you use that failure to inform your future actions. You develop a more accurate picture of the world and can begin implementing real solutions.

Of course, there are situations where radical acceptance is not appropriate. Abusive situations, for one, shouldn’t simply be accepted without being addressed. But neither will positive thinking resolve them. They require action, whether that means seeking help from a trusted individual, leaving the situation, or contacting law enforcement or an organization that can help.

But in other circumstances, radical acceptance may be an answer for some. For myself, I’ll just be satisfied if radical acceptance helps drive out toxic positivity. I don’t think it will, but a person can dream.