At Sunrise Mart, a small Japanese grocery with a branch in Brooklyn’s Sunset Park, you can’t miss the mountain of KitKats. The shop sells all kinds of fresh foods and imported snacks, but as soon as you step inside, you’re toe-to-toe with an enormous heap of colorful bags of the chocolate bars, rising up from the floor in the store’s most prominent real estate. The bags offer flavors such as lychee, chocolate orange, and cheesecake. At $10 each, they’re a little expensive. That doesn’t seem to matter. When I visited the store this spring in search of soup ingredients, multiple shoppers buzzed around me on an otherwise slow weekday afternoon, snapping up bag after bag. I’d never had anything but a standard American KitKat before, but I’d heard so many people rave about the Japanese versions that stumbling on the opportunity to try them myself seemed like money I couldn’t afford not to spend. I stuffed two bags of the pistachio and matcha flavors in my tote and headed for the subway, feeling like I’d just unearthed some kind of treasure. When I got home, I pulled out both, plus a few other packages of impulse-purchased Asian candy that I’d scooped up (you know, while I was there), and staged my own little taste test on my kitchen counter. Their flavors and textures differed from the candy I’d been eating for my entire life. They were all great. Matcha won. Without realizing it, I’d repeated a ritual that’s become pretty common, both online and in real life. YouTube and TikTok videos of Americans taste-testing candies from Europe, Asia, and Latin America rack up millions of views. At Economy Candy, a Manhattan confectioner that stocks a huge variety of sweets, new customers come in every day, brandishing their phones, fiending to try candies from far-flung locales that they heard about on the internet or that their roommate tried on vacation. 
Skye Greenfield Cohen, who runs the store with her husband, told me that as recently as five years ago, Economy Candy had only a few racks of imports. “That meant halvah from the Middle East, Turkish delight, those kinds of grandma-like candy that were more nostalgic for a homeland, rather than fun,” she said. Now imports from around the world make up about a third of the store’s inventory. On one level, it’s not difficult to understand why any type of candy, foreign or domestic, becomes popular. Candy is engineered to entice and delight, and it’s mostly pretty cheap. But American shoppers don’t exactly lack domestic candy options; any average grocery store’s checkout line is bursting with Snickers, Twizzlers, M&Ms, and Skittles. The hunger for foreign treats can’t be entirely explained by the vagaries of social-media virality, either. According to one estimate, since 2009, the annual value of America’s non-chocolate candy imports has grown by hundreds of millions of dollars; in 2019, it crossed the $2 billion threshold for the first time. Some logistical and cultural factors help explain the United States’ imported-candy boom. But first and foremost, Americans seem to love foreign sweets because they’re having the same revelation I had in my kitchen with my green KitKats: The international stuff puts most domestic candy to shame.

In the early 2010s, executives at the American division of the Japanese confectioner Morinaga & Company noticed something strange happening in Utah. The company’s Hi-Chew brand of fruit-flavored candies, which was then difficult to find in much of the United States, was selling extraordinarily well in Salt Lake City. The success was welcome—Morinaga wanted to expand its market in the U.S.—but it didn’t immediately make sense. At the time, the majority of the company’s American sales came from West Coast cities with large Asian populations, where the candies were stocked by grocers who catered to people who already knew and liked them. 
Salt Lake City, which is almost three-quarters white, was anomalously enamored of the intensely chewy little fruit nuggets. The company eventually figured out what was going on: According to Teruhiro Kawabe, Morinaga America’s president, missionaries from the Church of Jesus Christ of Latter-day Saints were coming home from stints in Japan, where Hi-Chew has been omnipresent for decades, and buying up as much of the candy as they could find. “They got to know the candy in the Japanese grocery stores, and they got addicted,” Kawabe told me. Soon their friends and families were, too. The Salt Lake City scenario wasn’t exactly replicable, but Kawabe said that it served as proof of concept: Americans would love the candy if the company could get it in front of them. Getting a particular product in front of shoppers, though, is much easier said than done, especially when it comes to things that are largely unknown or thought to have a niche audience. Candy purchases tend to be spur-of-the-moment decisions made at checkout counters, and that real estate is limited and has long been spoken for by conglomerates such as Hershey and Mars, which make much of the candy that Americans have been eating for their entire lives. To take a shot at mainstream American success, Hi-Chew’s makers did the usual stuff that consumer-products businesses do: They hired retail consultants, switched distributors, that kind of thing. But they also set their sights on a very important group: Major League Baseball players, the only people who routinely spend time chewing snacks in extreme close-up on TV. Morinaga supplied Japanese players in the league with Hi-Chew, Kawabe told me, focusing first on teams in markets where major retailers were headquartered. The gambit worked; ESPN reported on just how obsessed the 2015 Yankees squad was with the little fruit candies. Walgreens and CVS picked up the brand after it became popular with the Chicago Cubs and Boston Red Sox. 
Regular people tried the newly plentiful and suddenly trendy candy, and then insisted that their brother or spouse or co-workers try it. Hi-Chew’s U.S. sales grew from $8 million in 2012 to more than $100 million in 2021, according to Kawabe. This success story might feel a little bit too convenient, but baseball players’ mid-2010s Hi-Chew mania was well documented—and, apparently, ongoing. Moreover, explosive American growth in the past decade has been common among foreign candy brands. Sales of gummy candies from the German confectioner Haribo more than doubled from 2011 to 2017. Ferrero, the Italian parent company of Kinder chocolates, says that the line’s U.S. sales are growing by double digits annually. The European chocolate brands Milka and Cadbury are now owned by the American Oreo-maker Mondelez—an advantage over other confectioners when navigating import and retail red tape. None of these companies pulled off the same tactic with baseball players, but their rise seems to have followed similar patterns. Greenfield Cohen, from Economy Candy, said sales growth largely happens by word of mouth. This is helped along by the increasing popularity of international travel and the internet’s ability to serve niche products to a potentially large pool of previously untapped buyers. European candy in particular benefits from these dynamics—millions of American tourists visit the continent every year, and destination-specific candies are a common gift for returning travelers to bring home to loved ones. (That’s how I first tried Hi-Chew way back in 2002, although my high-school best friend had gone on a family trip to the exotic land of Tampa, not Japan.) Now the barrier between trying one piece of interesting candy—or even just hearing someone rave about it—and keeping a stockpile in your pantry or desk drawer is as low as it’s ever been. Of course, candy also needs to taste good for people to like it. 
All the word of mouth in the world won’t permanently increase sales of a bad product. Once people try candy from other parts of the world, they return to it because it is, in some very real ways, better than its domestic competitors. Have you ever had a matcha KitKat? Its physical form is identical to that of a regular KitKat, except instead of chocolate, it’s blanketed in bright green. Where many Americans would expect the familiar, slightly bland flavor of milk chocolate, there’s an earthy, creamy sweetness—perfect for people who, like me, get a little queasy after a few pieces of sickly sweet Halloween candy. With Hi-Chews, each wrapped in tiny squares of plain-white waxed paper, the flavors are important—and far more varied than in popular American fruit candies—but the primary feature is the texture. Chewing one feels like you’ve encountered a Starburst that fights back. It’s delicious. The reasons for foreign candy’s superiority are varied—and more surprising than you might expect. In some cases, yes, a candy is better because it is fundamentally different, on a chemical level, than what’s available in America. Europe’s strict regulations on chocolate quality mean that it offers something that’s not really comparable to a Hershey bar (and that Europeans are generally enthusiastic to tell you how much American chocolate sucks). The European Union also bans certain food additives that the FDA allows, which can yield slightly different results in all kinds of finished products, including candy. These cases seem to be the exception, not the rule, however. Ali Bouzari, a culinary scientist and co-founder of the product-development firm Pilot R&D, doesn’t buy the idea that inherently superior quality is the reason that so many people are charmed by imported sweets. “The basic tools of commercial candy manufacture are pretty universal, and the ingredients that people work with are fully globalized,” Bouzari told me. 
German brands, Japanese brands, and American brands likely all source their grape flavorings, for example, from the same vendors. What’s different—and what makes foreign candies so enticing—instead mostly seems to be in the implementation. Imported candies tend to embrace flavors and textures that American candies don’t. “I will always first go for the melon stuff” when shopping in an East Asian grocery store, Bouzari said. “This is candy inspired by a culture that thinks about melons more than we do.” Every part of the world has some kind of confection that it does particularly well: Scandinavians produce more flavors and textures of licorice than most Americans could dream of. Mexican candy frequently includes savory or spicy flavors. Candies from a number of East and South Asian countries tend to feature a far wider array of fruit flavors than are available in the West. The flavorings and ingredients that go into these candies are likely available to American manufacturers from the vendors they’re already using, according to Bouzari. Foreign producers develop products primarily for their domestic markets, so they make different choices and end up with results that can feel idiosyncratic—sometimes thrillingly so—to the American palate. As food culture has globalized, those palates have become more adventurous, especially in larger metropolitan areas, where more types of food have become more widely available in restaurants and grocery stores than ever before. Meanwhile, Bouzari told me, major U.S. manufacturers haven’t really kept up. They depend on appealing to as broad a swath of the country’s atypically diverse population as possible—not just across racial and ethnic lines, but across the country’s many local and regional food cultures. The results are candies that tend to be highly sweet and pretty bland, forgoing flavors and textures that brands believe might alienate white Americans in particular. 
All that being said, American tastes have a way of bending the world to their will. Once a foreign confectioner achieves a certain level of American success, it usually ends up adjusting its products for the American market, even if only a little. Kawabe, Morinaga America’s president, told me that some of the Hi-Chew flavors sold in mainstream U.S. retailers vary slightly from what’s available in Japan. When Americans buy grape candy, for example, their flavor expectations are just different from when the Japanese buy the same thing. Candy companies that want huge U.S. sales growth, for better or worse, need to meet people where they are. The most salient difference between foreign and domestic candy might not be chemical or methodological, but rather philosophical. New American products could theoretically embrace the lessons of imported candy and snatch up some of its growing domestic market share. But in Bouzari’s experience, much of the candy being developed domestically, such as low-carb candy from brands like Smart Sweets and Highkey, isn’t trying to delight consumers, but to placate their health fears by engineering it into diet food. “Candy is meant to be edible, ephemeral entertainment,” he said. “If you try to turn it into food, you get caught in a weird no-man’s-land where it’s neither the complete entertainment that it should be nor as nourishing as it should be.” For Americans who want something fun and novel and sweet, overseas might just be the most logical place to look right now. “In most other places I’ve been in the world, there is a more well-adjusted relationship to hedonism in food than we have here,” Bouzari said. “Other people spend less time trying to figure out how to eat gummy bears with no sugar.”
Pick a memory. It could be as recent as breakfast or as distant as your first day of kindergarten. What matters is that you can really visualize it. Hold the image in your mind. Now consider: Do you see the scene through your own eyes, as you did at the time? Or do you see yourself in it, as if you’re watching a character in a movie? Do you see it, in other words, from a first-person or a third-person perspective? Usually, we associate this kind of distinction with storytelling and fiction-writing. But like a story, every visual memory has its own implicit vantage point. All seeing is seeing from somewhere. And sometimes, in memories, that somewhere is not where you actually were at the time. This fact is strange, even unsettling. It cuts against our most basic understanding of memory as a simple record of experience. For a long time, psychologists and neuroscientists did not pay this fact much attention. That has changed in recent years, and as the amount of research on the role of perspective has multiplied, so too have its potential implications. Memory perspective, it turns out, is tied up in criminal justice, implicit bias, and post-traumatic stress disorder. At the deepest level, it helps us make sense of who we are. The distinction between first- and third-person memories dates back at least as far as Sigmund Freud, who first commented on it near the end of the 19th century. Not for another 80 years, though, did the first empirical studies begin fleshing out the specifics of memory perspective. And it was only in the 2000s that the field really started picking up steam. What those early studies found was that third-person memories were far less unusual than once thought. The phenomenon is associated with a number of mental disorders, such as depression, anxiety, and schizophrenia, but it is not merely a symptom of pathology; even among healthy people, it is quite common. Just how common is tricky to quantify. Peggy St. 
Jacques, a psychology professor at the University of Alberta who studies perspective in memory, told me that roughly 90 percent of people report having at least one third-person memory. For the average person, St. Jacques estimates, on the basis of her research, that about a quarter of memories from the past five years are third-person. (At least a couple of papers have found that women tend to have more third-person memories than men do, but a third study turned up no statistically significant difference; on the whole, research on possible demographic disparities is scant.) In certain rare cases, people may have only third-person memories. As you try to recall your own, be warned that things can get confusing fast. Perhaps you can call to mind early-childhood scenes that you picture from a third-person perspective. But it’s hard to know whether these are genuine memories translated from the first person to the third person, or third-person scenes constructed from stories or photographs. To some people, third-person memories are second nature; to others, they sound like science fiction. Why any given memory gets recalled from one perspective rather than the other is the result of a whole bunch of intersecting factors. People are more likely to remember experiences in which they felt anxious or self-conscious—say, when they gave a presentation in front of a crowd—in the third person, St. Jacques told me. This makes sense: When you’re imagining how you look through an audience’s eyes in the moment, you’re more likely to see yourself through their eyes at the time of recall. Researchers have also repeatedly found that the older a memory is, the more likely you are to recall it from the third person, and this, too, is fairly intuitive. If first-person recollection is the ability to adopt the position—and inhabit the experience—of your former self, then naturally you’ll have more trouble seeing the world the way you did as a 6-year-old than the way you did last week. 
The tendency for older memories to be translated into the third person may also have to do with the fact that the more distant the memory is, the less detail you’ll likely have, and the less detail you have, the less likely you are to be able to reassume the vantage point from which you originally witnessed the scene, David Rubin, a Duke University psychology professor who has published dozens of papers on autobiographical memory, told me. Less intuitive, perhaps, is the reverse: People are able to recall a scene in greater detail when they’re asked to take a first-person perspective than when they’re asked to take a third-person perspective. “Sometimes in a courtroom, an eyewitness to a holdup might be asked to recall what happened from the perspective of the clerk,” St. Jacques told me. But if her research is any indication, such tactics may blur rather than sharpen the witness’s memory. “Our research suggests that might actually be more likely to make the memory less vivid, make the eyewitness less likely to remember the specifics.” Even without an examiner’s instructions, such an eyewitness might be predisposed to recall the robbery in the third person: Researchers have found that people often translate traumatic or emotionally charged memories out of the first person. This may be because first-person memories tend to elicit stronger emotional reactions at the time of recall, and by taking a third-person perspective, we can distance ourselves from the painful experience, Angelina Sutin, a psychologist at Florida State University, told me. It may also be a function of the information at our disposal. In charged situations, Rubin said, people tend to zero in on the object of their anger or fear. Take the bank-robbery scenario: The police “want the teller to describe the person who’s robbing them, and instead he describes in great detail the barrel of the gun pointed at his head.” He can’t remember much beyond that. 
And so, lacking the information necessary to situate himself in his original perspective, he floats. This distancing effect has some fairly mind-bending potential applications, none more so, perhaps, than to the problem of near-death experiences. For many years, philosophers and psychologists have documented instances of people reporting that, in moments of trauma, they felt as though they were floating outside—usually above—their body. Such reports, though, Rubin points out, are not in-the-moment descriptions but after-the-fact accounts. So he has a controversial idea: What in retrospect seems like an out-of-body experience may in fact be only the trauma-induced translation of a first-person memory into a third-person memory, one so compelling that it deceives you into thinking the experience itself occurred in the third person. The recaller, in this theory, is like a person peering through a convex window, mistaking a distortion of the glass for a distortion of the world. Traumatic dissociations are dramatic but by no means isolated cases of what Rubin calls the “constructive nature of the world.” In a 2019 review article on memory perspective, St. Jacques noted that shifting your vantage and fabricating an entirely new scene rely on the same mental processes occurring in the same regions of the brain. So similar are recollecting the past and projecting into the future that some psychologists lump them into a single category: “mental time travel.” Both are acts of construction. The distinction between memory and imagination blurs. At some level, people generally understand this, but rarely do we get so incontrovertible an example as with third-person memories. If you and a friend try to recall the decor at the restaurant where you got dinner last month, you might find that you disagree on certain points. You think the wallpaper was green, your friend thinks blue, one of you is wrong, and you’re both sure you’re right. 
With third-person memories, though, you know the memory is distorted, because you could not possibly have been looking at yourself at the time. If, without even realizing it, you can change something so central as the perspective from which you view a memory, how confident can you really be in any of the memory’s details? In this way, third-person memories are sort of terrifying. But shifts in perspective are more than mere deficiencies of memory. In her lab at Ohio State University, the psychologist Lisa Libby is investigating the relationship between memory perspective and identity—that is, the way shifts in our memory play a role in how we make sense of who we are. In one experiment, Libby asked a group of female undergraduates whether they were interested in STEM. The students then participated in a science activity, some in a version designed to be engaging, and others in a version designed to be boring. Afterward, when she surveyed the undergrads about how they’d found the exercise, she instructed some to recall it from a first-person perspective, and others from a third-person perspective. The first-person group’s answers corresponded to how interesting the task really was; the third-person group’s corresponded to whether they’d said they liked STEM in the initial survey. Libby’s takeaway: Each type of memory seems to have its own function. “One way to think about the two perspectives is that they help you represent … two different components of who you are as a person,” Libby told me. Remembering an event from a first-person perspective puts you in an experiential frame of mind. It helps you recall how you felt in the moment. Remembering an event from a third-person perspective puts you in a more narrative frame of mind. It helps you contextualize your experience by bringing it in line with your prior beliefs and fitting it into a coherent story. Memory is the—or at least a—raw material of identity; perspective is a tool we use to mold it. 
Maybe the most interesting thing about all of this is what it suggests about the human proclivity for narrative. When we shift our memories from one perspective to another, we are, often without even realizing it, shaping and reshaping our experience into a story, rendering chaos into coherence. The narrative impulse, it seems, runs even deeper than we generally acknowledge. It is not merely a quirk of culture or a chance outgrowth of modern life. It’s a fact of psychology, hardwired into the human mind.

The most haunting memory of the pandemic for Laura, a doctor who practices internal medicine in New York, is a patient who never got COVID at all. A middle-aged man diagnosed with Stage 3 colon cancer in 2019, he underwent surgery and a round of successful chemotherapy and was due for regular checkups to make sure the tumor wasn’t growing. Then the pandemic hit, and he decided that going to the hospital wasn’t worth the risk of getting COVID. So he put it off … and put it off. “The next time I saw him, in early 2022, he required hospice care,” Laura told me. He died shortly after. With proper care, Laura said, “he could have stayed alive indefinitely.” (The Atlantic agreed to withhold Laura’s last name, because she isn’t authorized to speak publicly about her patients.) Early in the pandemic, when much of the country was in lockdown, forgoing nonemergency health care as Laura’s patient did seemed like the right thing to do. But the health-care delays didn’t just end when America began to reopen in the summer of 2020. Patients were putting off health care through the end of the first pandemic year, when vaccines weren’t yet widely available. And they were still doing so well into 2021, at which point much of the country seemed to be moving on from COVID. By this point, the coronavirus has killed more than 1 million Americans and debilitated many more. One estimate shows that life expectancy in the U.S. 
fell 2.41 years from 2019 to 2021. But the delays in health care over the past two and a half years have allowed ailments to unduly worsen, wearing down people with non-COVID medical problems too. “It just seems like my patients are sicker,” Laura said. Compared with before the pandemic, she is seeing more people further along with AIDS, more people with irreversible heart failure, and more people with end-stage kidney failure. Mental-health issues are more severe, and her patients struggling with addiction have been more likely to relapse. Even as Americans are treating the pandemic like an afterthought, a disturbing possibility remains: COVID aside, is the country simply going to be in worse health than before the pandemic? According to health-care workers, administrators, and researchers I talked with from across the country, patients are still dealing with a suite of problems from delaying care during the pandemic, problems that in some cases they will be facing for the rest of their lives. The scope of this damage isn’t yet clear—and likely won’t come into focus for several years—but there are troubling signs of a looming chronic health crisis the country has yet to reckon with. At some point, the emergency phase of COVID will end, but the physical toll of the pandemic may linger in the bodies of Americans for decades to come. During those bleak pre-vaccine dark ages, going to the doctor could feel like a disaster in waiting. Many of the country’s hospitals were overwhelmed with COVID patients, and outpatient clinics had closed. As a result, in every week through July 2020, roughly 45 percent of American adults said that over the preceding month, they either put off medical care or didn’t get it at all because of the pandemic. Once they did come in, they were sicker—a trend observed for all sorts of ailments, including childhood diabetes, appendicitis, and cancer. 
A recent study analyzed the 8.4 million non-COVID Medicare hospitalizations from April 2020 to September 2021 and found not only that hospital admissions plummeted, but also that those admitted to hospitals were up to 20 percent more likely to die—an astonishing effect that lasted through the length of the study. Partly, that result came about because only those who were sicker made it to the hospital, James Goodwin, one of the study’s authors and a professor at the University of Texas Medical Branch, in Galveston, told me. It was also partly because overwhelmed hospitals were giving worse care. But Goodwin estimates that “more than half the cause was people delaying medical care early in their illness and therefore being more likely to die. Instead of coming in with a urinary tract infection, they’re already getting septic. I mean, people were having heart attacks and not showing up at the hospital.” For some conditions, skipping a checkup or two may not matter all that much in the long run. But for other conditions, every doctor’s visit can count. Take the tens of millions of Americans with vascular issues in their feet and legs due to diabetes or peripheral artery disease. Their problems might lead to, say, ulcers on the foot that can be treated with regular medical care, but delays of even a few months can increase the risk of amputation. When patients came in later in 2020, it was sometimes too late to save the limb. An Ohio trauma center found that the odds of undergoing a diabetes-related amputation were almost 11 times higher after the pandemic hit in 2020 than earlier in the year. Amputation affected only a small percentage of Americans, but the lack of care early in the pandemic also helped fuel a dangerous spike in substance-abuse disorders. In a matter of weeks or months, people’s support systems collapsed, and for some, years of work overcoming an addiction unraveled. 
“My patients took a huge step back, probably more than many of us realize,” Aarti Patel, a physician assistant at a Lower Manhattan community hospital, told me. One of her patients, a man in his late 50s who was five years sober, started drinking again during the pandemic and eventually landed in the hospital for withdrawal. Patients like this man, she said, “would have really difficult, long hospital stays, because they were at really high risk of DTs, alcohol seizures. Some of them even had to go to the ICU because [the withdrawal] was so severe.” Later in the year, when doctors’ offices were up and running, “a lot of patients expressed that they didn’t want to go back for care right away,” says Kim Muellers, a graduate student at Pace University who is studying the effects of COVID on medical care in New York City, North Carolina, and Florida. Indeed, through the spring of 2021, the top reason Medicare recipients failed to seek care was that they didn’t want to be at a medical facility. Other people were avoiding the doctor because they’d lost their job and health insurance and couldn’t afford the bills. The problem, doctors told me, is that all of those missed appointments start to add up. Patients with high blood pressure or blood sugar, for example, may now be less likely to have their conditions under control—which after enough time can lead to all sorts of other ailments. Losing a limb can pose challenges for patients that will last for the rest of their lives. Relapses can put people at a higher risk for lifelong medical complications. Cancer screenings plummeted, and even a few weeks without treatment can increase the chance of dying from the disease. In other words, even short-term delays can cause long-term havoc. To make matters worse, the health-care delays fueling a sicker America may not be totally over yet, either. 
After so many backups, some health-care systems, hobbled by workforce shortages, are scrambling to address the pent-up demand for care that patients can simply no longer put off, according to administrators and doctors from several major health systems, including Cleveland Clinic, the Veterans Health Administration, and Mayo Clinic. Disruptions in the global supply chain are forcing doctors to ration basic supplies, adding to backlogs. Amy Oxentenko, a gastroenterologist at Mayo Clinic in Arizona who helps oversee clinical practice across the entire Mayo system, says that “all of these things are just adding up to a continued delay, and I think we’ll see impacts for years to come.” It’s still early, and not everything that providers told me is necessarily showing up in the data. Oddly enough, the CDC’s National Health Interview Survey found that most Americans were able to see a doctor at least once during the first year of the pandemic. And the same survey has not revealed any uptick in most health conditions, including asthma episodes, high blood pressure, and chronic pain—which might be expected if America were getting sicker. It’s even conceivable that the disturbing observations of clinicians are a statistical illusion. If for whatever reason only sicker people are now being seen by—or able to access—a doctor, then it can be true both that providers are seeing more seriously ill patients in medical facilities and that the total number of seriously ill people in the community is staying the same. The scope of the damage just isn’t yet clear: Maybe a smaller number of people will be worse off because of delayed cancer care or substance-abuse relapses, or maybe far more—tens of millions of Americans—will be dealing with exacerbated issues for the rest of their lives. None of this accounts for what COVID itself is doing to Americans, of course. 
The health-care system is only beginning to grapple with the ways in which a past bout with COVID is a long-term risk for overall health, or the extent to which long COVID can complicate other conditions. The pandemic may feel “over” for lots of Americans, but many who made it through the gantlet of the past two-plus years may end up living sicker, and dying sooner. This disturbing prospect is not only poised to further devastate communities; it’s also bad news for health-care workers already exhausted by COVID. Laura, the Manhattan internist who treated the colon-cancer patient, told me it’s disheartening to see so many people showing up at irreversible points in their disease. “As doctors,” she said, “our overall batting average is going down.” Aarti Patel, the physician assistant, put it in blunter terms: “Burnout is probably too simple a term. We’re in severe moral distress.” Nothing about this grim fate was inevitable. Laura told me that “going to the doctor mid-pandemic may have posed a small risk in terms of COVID, but not going was risky in terms of letting disease go unchecked. And in retrospect it seems that many people didn’t quite get that.” But there didn’t have to be such a stark trade-off between fighting a pandemic and maintaining health care for other medical conditions. Some hospitals—at least the better-resourced ones—figured out how to avoid the worst kind of delays. Mayo Clinic, for example, is one of a number of systems with a sophisticated triage algorithm that prioritizes patients needing acute care. In the spring of 2021, Cleveland Clinic launched a massive outreach blitz to schedule some 86,000 appointments, according to Lisa Yerian, the chief improvement officer. And the Veterans Health Administration provided iPads to thousands of veterans who lacked other means of accessing the internet in the spring of 2020, ensuring a more seamless transition to virtual care, Joe Francis, who directs health-care analytics, told me. 
Thanks in part to these efforts, Francis said, high-risk patients at the VHA were being seen at pre-pandemic levels a mere six months into the pandemic. These health-care systems also suggest a path forward. America may still be able to stave off the worst of the collateral damage by reaching the patients who have fallen through the cracks—and already the data suggest that these patients tend to be disproportionately Black, Hispanic, and low-income. Tragically, it’s too late for some Americans: People who died of cancer can’t come back to life; amputated limbs can’t regrow. Others still have plenty of time. Hypertension that’s currently uncontrolled can be tamped down before causing an early heart attack; drinking that’s gotten out of hand can be corralled before it leads to liver failure in a decade; undetected tumors can be spotted in time for treatment. An uptick in premature death and disability, summed over millions of Americans, could strain the health-care system for years. But it’s still possible to prevent an acute public-health crisis from seeding an even bigger chronic one.

In less than two weeks, you could walk out of a pharmacy with a next-generation COVID booster in your arm. Just a few days ago, the Biden administration indicated that the first updated COVID-19 vaccines would be available shortly after Labor Day to Americans 12 and older who have already had their primary series. Unlike the shots the U.S. has now, the new doses from Pfizer and Moderna will be bivalent, which means they’ll contain genetic material based both on the ancestral strain of the coronavirus and on two newer Omicron subvariants that are circulating in the U.S. These shots’ new formulation promises some level of protection that simply hasn’t been possible with the original vaccines.
“A bivalent vaccine will have some benefit for almost everybody who gets it,” Rishi Goel, an immunologist at the University of Pennsylvania, told me. “How much benefit that is, we’re still not exactly sure.” People who aren’t at high risk could end up only marginally more protected against severe outcomes, and no one thinks the shots will banish COVID infections for good. There is, however, a simple rule of thumb that nearly everyone can follow to maximize the uncertain gains from a shot: Wait three to six months from your last COVID infection or vaccination. Put that rule into action, and it plays out a little differently, depending on your circumstances.

If you haven’t had an Omicron infection: If you haven’t had COVID since about November 2021, the advantage of a bivalent booster over the original formula is obvious, and as long as you haven’t gotten boosted recently, there’s every reason to get the new one right away. (If you have been boosted in the past few months, your antibody levels are probably still too high for a new shot to do much for you.) Marion Pepper, an immunologist at the University of Washington, told me that Americans who have already gotten three or more doses “have probably maxed out the protective capacity” of the original shots. By contrast, the bivalent vaccines offer something new to those who have so far escaped Omicron: a lesson on the spike proteins of the BA.4 and BA.5 subvariants, which will help the immune system fight the real thing should it get into your body. “I’m just super excited to get the bivalent vaccine,” says Jenna Guthmiller, an immunologist at the University of Colorado who has not yet had COVID. “I think it’ll be really nice and ease my mind a little bit.” [Read: The pandemic’s soft closing]

If you have had an Omicron infection: Veterans of Omicron infections might still have something to gain from seeing the BA.4 and BA.5 spike proteins—especially if your goal is to avoid getting sick with COVID at all.
Past a certain number of shots, boosters’ impact on your long-term protection against severe disease is unclear, Goel told me. Paul Offit, the director of the Vaccine Education Center at Children’s Hospital of Philadelphia, told me he doesn’t plan on getting a booster at all this fall because, after three vaccine doses and an infection, “I think I’m protected against serious illness.” But if you want to stave off infection, Goel said, “the bivalent vaccines, or really any variant-containing vaccines, have real value.” That’s because formulas based on a given variant have been shown to temporarily increase your stock of antibodies that target that variant. How long that extra-protective state lasts, or whether it’s sufficient to prevent any infection whatsoever, is still a scientific puzzle. The original boosters were shown to increase antibody levels to a peak about two weeks after the shot, then decay steadily over the following three months. We don’t know yet whether a bivalent formula will change that timeline, Goel said. But you can still use it to estimate approximately when your protection will be at its highest. You might, for example, choose to err on the early side of that three-to-six-month timeline if you have a particularly high-risk event coming up in the next few weeks. “If all we had was the original booster and I was going to an indoor wedding or something, I think it would be reasonable to get that booster,” Pepper said. [Read: The BA.5 wave is what COVID normal looks like]

If you had an Omicron infection this summer: “You’re still riding the wave of antibodies that you generated as a result of that infection,” Guthmiller told me, so a shot won’t do much for you yet. That’s true regardless of which Omicron subvariant you might have been infected with, she said, because BA.2 infections have been shown to protect fairly well against today’s dominant strains, BA.4 and BA.5. (BA.2 became dominant in the United States back in March.)
The severity of your illness doesn’t really matter either, Goel said. A higher fever and more intense cough might indicate that your immune system got extra revved up, he said, but they could just as easily mean that your body needs more help responding to the coronavirus. In either case, once a little more time has passed, getting the bivalent vaccine could help extend your body’s memory of its last COVID encounter, and keep infection at bay.

If you’re at high risk: Certain groups of people should get any booster as soon as it’s available to them, the experts I spoke with emphasized: immunocompromised people, people over the age of 50 or so, and people with medical conditions that put them at high risk of severe disease. If you fall into one of these categories and haven’t received all the boosters you’re eligible for, “I wouldn’t wait for the bivalent,” Offit said. If you’re in one of these high-risk categories and have already gotten the recommended number of boosters, get the new one as soon as it’s available to you. (The FDA and CDC have not yet indicated whether they will recommend a waiting period between your most recent shot and the bivalent booster.) Goel recommended waiting at least a month after your most recent infection or shot, but if you’re very worried about your risk, you don’t need to stretch the delay to three months. Your body might still have extra antibodies floating around, but with no practical way to check at scale, “I’m honestly in favor of recommending boosting as a way to maximize individual benefit,” he said. [Read: America’s fall booster plan has a fatal paradox]

If you want to wait and see: Waiting is always an option if you want to know more about how the bivalent vaccines perform. The FDA and CDC are set to green-light the shots based on human data from the existing boosters and other experimental bivalent boosters that didn’t make it to market in the U.S.—plus trials on the new formula in mice.
Pfizer and Moderna simply haven’t progressed very far in their human trials. While there’s no reason to suspect that the new shots won’t be safe, Offit recommended opting for the original boosters until more safety and efficacy data are available, which could be as soon as a couple of months after the rollout—as long as the vaccine makers or the government collects that information and makes it public. But Guthmiller and Goel said they weren’t concerned about the lack of human data, and that the bivalent shot is almost certainly the better bet. There is one significant reason to avoid waiting too long for the bivalent shot: It offers the greatest protection against infection from the subvariants it’s actually designed around. BA.4 and BA.5 might be with us through the fall and winter—or they might give way to a different branch of Omicron, or even a variant that’s entirely unlike Omicron. You’d certainly be better off against this new variant with a bivalent booster than no booster at all. But if you want to maximize your anti-infection shield while you have it, consider putting it up against the enemy you know.

America’s first-ever reformulated COVID-19 vaccines are coming, well ahead of schedule, and in some ways, the timing couldn’t be better. Pfizer’s version of the shot, which combines the original recipe with ingredients targeting the Omicron subvariants BA.4 and BA.5, may be available to people 12 and older as early as the week after Labor Day; Moderna’s adult-only brew seems to be on a similar track. The schedule slates the shots to debut at a time when BA.5 is still the country’s dominant coronavirus morph—and it means that, after more than a year of scrambling to catch up to SARS-CoV-2’s evolutionary capers, we might finally be getting inoculations that are well matched to the season’s circulating strains.
Which is “absolutely great,” says Deepta Bhattacharya, an immunologist at the University of Arizona. In other ways, the timing couldn’t be worse. Emergency pandemic funds have been drying up, imperiling already dwindling supplies of vaccines; with each passing week, more Americans are greeting the coronavirus with little more than a shrug. The most recent revamp of the country’s pandemic playbook has softened or stripped away the greater part of the remaining mitigation measures that stood between SARS-CoV-2 and us. Calls for staying up-to-date on COVID vaccines are one of the last nationwide measures left—which puts a lot of pressure on shot-induced immunity to combat the virus, all on its own. [Read: The pandemic’s soft closing] The nation has latched on before to the idea that shots alone can see us through. When vaccines first rolled out, Americans were assured that they’d essentially stamp out transmission, and that the immunized could take off their masks. “I thought we learned our lesson,” says Saskia Popescu, an infectious-disease epidemiologist at George Mason University. Apparently we did not. America is still stuck on the notion of what Popescu calls “vaccine absolutism.” And it rests on two very shaky assumptions, perhaps both doomed to fail: that the shots can and should sustainably block infection, and that “people will actually go and get the vaccine,” says Deshira Wallace, a public-health researcher at the University of North Carolina at Chapel Hill. As fall looms, the U.S. is now poised to expose the fatal paradox in its vaccine-only plan. At a time when the country is more reliant than ever on the power of inoculation, we’re also doing less than ever to set the shots up for success. In terms of both content and timing, the fall shot will be one of the most important COVID vaccines offered to Americans since the initial doses. Since SARS-CoV-2 first collided with the human population nearly three years ago, it’s shape-shifted. 
The coronavirus is now better at infecting us and is a pretty meh match for the original shots that Pfizer, Moderna, and Johnson & Johnson produced. An updated vaccine should rejuvenate our defenses, prodding our antibody levels to soar and our B cells and T cells to relearn the virus’s visage. That doesn’t mean the shots will offer a protective panacea. COVID vaccines, like most others, are best at staving off severe disease and death; against BA.5 and its kin, especially, that protection is likely to be durable and strong. But those same shields will be far more flimsy and ephemeral against milder cases or transmission, and can only modestly cut down the risk of long COVID. And when partnered with a compromised or elderly immune system, the shots have that much less immunological oomph. Then say a new immunity-dodging variant appears: The shots could lose even more of their strength. Vaccine performance also depends on how and how often the shots are used. The more people take the doses, the better they will work. But no matter how hard we try, this reformulated shot “is not going to cover everyone, either because they choose not to get it or won’t be able to access it,” says Katia Bruxvoort, an epidemiologist at the University of Alabama at Birmingham. People who haven’t yet finished their primary series of COVID shots aren’t expected to be able to sign up for the BA.5 boosts—a policy that Bhattacharya thinks is a big mistake, not least because it will disadvantage anyone who seeks a first brush with vaccine protection this fall. “The better the degree of breadth right at the beginning,” he told me, the better future encounters with the virus should go. Most kids under 12 remain in that totally unvaccinated category; even those who have completed their initial round of shots won’t be eligible for the revamped recipe, at least not in this first autumn push. Among people who can immediately get the new booster, uptake will probably be meager and unbalanced. 
“Realistically, the boosters are going to be concentrated in the places that have been the least impacted by the pandemic” and in people who have already had at least one boost, says Anne Sosin, a public-health researcher at Dartmouth. Such widening gaps in protection will continue to offer the virus vulnerable pockets to invade. Crummy uptake isn’t a new issue, and some of the same deterrents that have plagued rollouts from the start haven’t gone away. Vaccines are a hassle and can come with annoying side effects. And in recent months, even more obstacles have been raised. The wind-down of COVID funding is making it much harder for people without insurance or other reliable health-care access to get boosted. And after nearly three years of constant crisis slog, far fewer people fear the virus, especially now that so many Americans have caught it and survived. A year into the Biden administration’s concerted push for boosters, fewer than a third of U.S. residents have nabbed even their first additional shot. With each additional injection Americans are asked to get, participation drops off—a trend experts anticipate will continue into the fall. “There’s a psychological hurdle,” says Gregory Poland, a vaccinologist at the Mayo Clinic, “that this is over and done.” [Read: New COVID vaccines will be ready this fall. America won’t be.] The reality that most Americans are living in simply doesn’t square with an urgent call for boosts—which speaks to the “increasing incoherence in our response,” Sosin told me. The nation’s leaders have banished mask mandates and quarantine recommendations, and shortened isolation stints; they’ve given up on telling schools, universities, and offices to test regularly. People have been repeatedly told not to fear the virus or its potentially lethal threat. And yet the biggest sell for vaccines has somehow become an individualistic, hyper-medicalized call to action—another opportunity to slash one’s chances at severe disease and death.
The U.S. needs people to take this vaccine because it has nothing else. But its residents are unlikely to take it, because they’re not doing anything else. If all goes as planned, COVID tests, treatments, and vaccines will be commercialized by 2023—making these fall shots perhaps the last free boosters we’ll get. And yet, officials have neither a new strategy for buoying vaccine uptake nor the ammunition for clear messaging on how well the shots will work. In service of speeding up the availability of the BA.5-focused shots, federal regulators are planning to green-light the new formulation based on antibody data from mice. (Both Pfizer and Moderna have human studies planned or under way, but results aren’t expected to be ready until after the rollout begins.) The reliance on animal experiments isn’t necessarily concerning, Bhattacharya told me; the approval protocol for annual flu shots doesn’t require massive human clinical trials either. But the shortcut does introduce a snag: “We know nothing yet about the efficacy or effectiveness of these Omicron-focused vaccines,” Poland said. Researchers can’t be sure of the degree to which the shots will improve upon the original recipe. And public-health officials won’t be able to leverage the concrete, comforting numbers that have been attached to nearly every other shot that’s been doled out. Instead, communications will hinge on “how much trust you have in the information you’re getting from the government,” UNC’s Wallace told me. “And that is very tricky right now.” Shots, to be abundantly clear, are essential to building up a properly defensive anti-COVID wall. But they are not by themselves sufficient to keep invaders out. Like bricks stacked without a foundation or mortar, they will slip and slide and crumble. 
Nor is a wall with too few bricks likely to succeed: If the goal is to preemptively quell a winter case surge, “a booster that will have maybe 30 to 40 percent uptake is not something we can expect to have a huge population-level impact,” Bhattacharya told me. [Read: Vaccines are still mostly blocking severe disease] All of that bodes poorly for the coming fall and winter, a time when respiratory viruses thrive and people throng indoors. The nation could see yet another round of “incredibly high surges,” says Jessica Malaty Rivera, a senior adviser at the Pandemic Prevention Institute, further sapping supplies of underutilized or tough-to-access tools such as tests and treatments, and straining a health-care system that’s already on the brink. Cases of long COVID will continue to appear; sick people will continue to miss work and school. And “God forbid we get another variant” that’s even more severe, George Mason’s Popescu told me, further overwhelming the few defenses we have. Pinning all of America’s hopes on vaccines this fall, experts told me, may have ripple effects on our future COVID autumns too. Asked to counter the virus alone, the injections will falter; they will look less appealing, driving uptake further down. If this fall is meant to set a precedent for subsequent vaccination campaigns, it may unspool one of the worst scenarios of all: asking shots to do so much for us that they hardly accomplish anything at all.

The transition from Monkeypox Inoculation Plan A to Monkeypox Inoculation Plan B has been a smashing success—at least, if you ask federal officials. Just a few weeks ago, the U.S. had nowhere near enough of the Jynneos vaccine to doubly dose even a quarter of the Americans at highest risk of monkeypox, roughly 1.6 million men who have sex with men.
Now that the administration has asked that every dose of Jynneos be split into five and delivered a different way, between the layers of the skin, the party line has changed. “Everyone that wants to get vaccinated within that group is going to have an opportunity to get vaccinated” by September’s end, Robert Fenton, the White House’s monkeypox czar, said on a podcast last week. But this new strategy of intradermal dosing “is a gamble,” says Caitlin Rivers, an epidemiologist at Johns Hopkins, and its weaknesses are already beginning to show. It may be high time to start acting on a fallback plan for our fallback plan, should Plan B’s high-stakes wager not pay off. [Read: America’s new monkeypox strategy rests on a single study] The Plan Cs on the table aren’t very palatable—which is probably why they’re Plan Cs. One option, largely dismissed early on, could entail turning to ACAM2000, a hypereffective smallpox shot, with sometimes dangerous side effects, that the U.S. has stockpiled in spades. Already, three jurisdictions, including the state of California, have ordered more than 800 doses of ACAM from the government, according to Timothy Granholm, a spokesperson for HHS. Simply anticipating the possibility of Plan B’s failure might count as atypical for modern American public health—getting ahead of the virus du jour, rather than taking a reactive stance, says Stella Safo, an HIV physician in New York. Too often in the past few years, the institutions of public health have observed rather than acted, allowing SARS-CoV-2, and now monkeypox, to run roughshod over the American populace. “It would be really nice to not be saying, ‘Let’s wait and see,’” Safo told me. ACAM2000 may not be the country’s best or safest option for curtailing monkeypox, but the risk of not considering it may soon outweigh the risks of the shot itself. There’s a world in which the U.S. didn’t even need a Monkeypox Inoculation Plan B. Had U.S. 
leaders been willing to invest resources in heading off the pathogen, by offering aid to countries where the virus has been endemic for decades or by focusing earlier this year on tests, treatments, vaccines, and public communications, maybe America’s original immunization plan—using the full, subcutaneous Jynneos dose—would have been all the nation needed on the injection front. That didn’t happen, and instead the country adopted intradermal delivery, without real clarity on how well such doses might guard against infection, transmission, or disease. The notion that intradermal shots will work as hoped rests on a “chain of assumptions,” says John Beigel, an immunologist at the National Institute of Allergy and Infectious Diseases, several of which may not hold during a large, fast-spreading outbreak that’s tightly linked to sex—a poorly studied form of monkeypox transmission. Jynneos’s original approval was based on an antibody correlate of protection, rather than efficacy against bona fide illness. And the FDA’s authorization of intradermal shots rests on a single study, which didn’t directly check the vaccine’s ability to stave off disease, either. The study also enrolled only healthy adults, most of them white—a poor reflection of the population now being hit. It’s a “big leap” to build a nationwide vaccine campaign on just those results, says Sri Edupuganti, a vaccinologist at Emory University and one of the study’s authors. (Beigel is now designing a clinical trial that will reevaluate the intradermal route among participants more relevant to the current outbreak. He and his team will also test one-tenth intradermal doses, which could further stretch supply.) [Read: America should have been able to handle monkeypox] The intradermal plan has logistical challenges, too. Administering in-skin shots requires extra training and special needles, burdening already stressed staff, especially in low-resource regions.
Several jurisdictions are struggling to extract more than three or four doses from some vials, rather than the government’s promised five—a shortchanging of those hoping to increase their stocks by a clean 400 percent. Plus, some bottle caps are breaking before all the doses are withdrawn. Intradermal vaccination can also come with grating side effects, including redness and swelling that can stick around for days, potentially deterring people from returning for the essential second shot. Fenton, from the White House, noted in a press briefing last week that the switch to intradermal “increased our supplies significantly without compromising safety or effectiveness.” But that assertion seems “disingenuous at best,” says Gregg Gonsalves, an epidemiologist and AIDS activist at Yale’s School of Public Health. Even the CEO of Bavarian Nordic, the vaccine’s manufacturer, criticized the FDA’s pivot as too hasty. (The FDA attempted to counter the company’s criticisms.) Meanwhile, demand may continue to grow, especially if the epidemic starts to concentrate less among men who have sex with men. “The longer the outbreak lasts, the longer you have for jumping to other populations,” Gonsalves told me. College campuses, reopening now, “seem like the most obvious next stop.” And “if this gets into other networks,” says Ina Park, a sexual-health expert at UC San Francisco, Plan B “just won’t be enough.” Equity, too, is becoming an issue. “If we lived in a world where we had plenty of vaccine, you would go with subcutaneous,” Beigel told me. But in North Carolina, for instance, where 70 percent of monkeypox cases have been among Black men, some two-thirds of the subcutaneous shots administered before August 8 went to people who are white; similar skews have been noted in New York City. Now “Black and brown gay men are really angry,” says Kenyon Farrow, a writer and public-health activist based in Ohio. 
“They watched white gay men get full doses … and now they feel like they are getting less of a dose.” Farrow has pushed for everyone to get at least one subcutaneous shot—a strategy that advocates in New York City also back—but the Biden administration seems set on moving all jurisdictions onto the intradermal route. Mapping out yet another vaccination strategy won’t address all of these problems. (And no matter what, the administration should keep ordering more Jynneos, stat.) But the forecast for fall is murky. And should the present situation worsen, a fresh tactic could give the U.S. a head start—something the country hasn’t had on the public-health playing field in a while. Already, some experts are mulling the nuclear option: ACAM2000, the smallpox shot that the government has been hoarding to counter a potential bioterrorism attack. Doses of the vaccine are available by the many millions, and thought to be both effective and durable. It’s also, Edupuganti told me, “one of the vaccines with the highest amount of adverse reactions,” occasionally triggering side effects as serious as heart inflammation. The shot contains a replicating virus, and shouldn’t be taken by immunocompromised people, including many of those who are living with HIV. And just about everyone who gets the shot sprouts an oozy lesion at the injection site that can pass the vaccine virus to others. Against something like smallpox—a far more contagious virus that killed up to 30 percent of its victims—ACAM2000 would be “a no-brainer,” says Rafi Ahmed, a vaccinologist at Emory University. With monkeypox, though, Johns Hopkins’s Rivers told me, the risk-benefit calculation “is really hazy.” [Read: What should worry most Americans about our monkeypox response] It’s not time to trot out ACAM yet, Safo, the New York physician, told me. But maybe autumn will bring many more cases. Maybe monkeypox’s symptoms could grow more severe. Maybe the virus will start to surge in new populations. 
Maybe intradermal Jynneos will fall short in effectiveness or safety. In any case, containment with the current tools isn’t a guarantee. “If things do get out of control,” Ahmed told me, “you want to have some ACAM stocks ready to go.” No clear, perfect threshold can yet denote “out of control.” Still, a trend toward a worse outbreak would inch the country closer to tapping into its ACAM2000 supply, Park told me: “I don’t think we have another choice.” Which means that the FDA and CDC should probably start poring over the ACAM data now, Rivers said. Resorting to ACAM2000 will also put the onus on officials to explain to the public what they’re getting into. If some are balking at intradermal shots, people further back in line could reasonably wonder why they’ve been stuck with a less-safe vaccine, Farrow pointed out. There could be a middle ground worth testing in a clinical trial: one shot of Jynneos, via either administration route, followed by a dose of ACAM2000, says Stephen Goldstein, a virologist at the University of Utah. One 2019 study hints that this shot-chaser approach could shrink infectious lesions, as well as cut down on ACAM2000’s side effects, while still offering an immunological boost—though that trial used two subcutaneous Jynneos doses first. In any case, the government would do well to pursue more options, even enroll people in trials comparing the different vaccines, Gonsalves told me. And transparency is paramount. “Back in the days of AIDS,” he said, “many of us were saying, as new drugs were coming online, we wanted access and answers” about the options at hand. Right now, the nation’s short on both. That “we’re even having to ask these questions about ACAM,” Farrow told me, is a sobering reminder that “we didn’t get our shit together” early on. Instead, the U.S. has backed itself into having to reckon with its appetite for risk.
Being too cautious with vaccines could allow the outbreak to further balloon; being too reckless with shots could compromise public trust. The administration firmly contends that Jynneos remains “the best available option,” according to Granholm, the HHS spokesman. (That said, ACAM2000 “is available upon request,” he told me.)
On the list of perfect pet parents, Mikel Delgado, a professional feline-behavior consultant, probably ranks high. The Ph.D. expert in animal cognition spends half an hour each evening playing with her three torbie cats, Ruby, Coriander, and Professor Scribbles. She’s trained them to take pills in gelatin capsules, just in case they eventually need meds. She even commissioned a screened-in backyard catio so that the girls can safely venture outside. Delgado would do anything for her cats—well, almost anything. “Guilty as charged,” Delgado told me. “I do not brush my cats’ teeth.” To be fair, most cat owners don’t—probably because they’re well aware that it’s weird, if not downright terrifying, to stick one’s fingers inside an ornery cat’s mouth. Reliable stats are scarce, but informal surveys suggest that less than 5 percent of owners give their cats the dental scrub-a-dub-dub—an estimate that the vets I spoke with endorse. “I’m always very shocked if someone says they brush their cat’s teeth,” says Anson Tsugawa, a veterinary dentist in California. When Steve Valeika, a vet in North Carolina, suggests the practice to his clients, many of them “look at me like I’ve totally lost it,” he told me. (This is where I out myself as one of the loons: My cats, Calvin and Hobbes, get their teeth brushed thrice weekly.) There certainly is an element of absurdity to all of this. Lions, after all, aren’t skulking the savannas for Oral-Bs. But our pets don’t share the diets and lifestyles of their wild counterparts, and their teeth are quite susceptible to the buildup of bacteria that can eventually invade the gums to trigger prolonged, painful disease. Studies suggest that most domestic cats older than four end up developing some sort of gum affliction; several experts told me that the rates of periodontal disease in household felines can exceed 80 percent.
Left untreated, these ailments can cost a cat one or more teeth, or even spread their effects throughout the body, potentially compromising organs such as the kidneys, liver, and heart. To stave off kitty gum disease, veterinary guidelines and professionals generally recommend that owners clean their cats’ chompers daily, ideally for at least a minute, hitting every tooth. “That’s the gold standard,” says Santiago Peralta, a veterinary dentist at Cornell University. Even a gap of two or three days can leave enough time for tartar to cement, Jeanne Perrone, a veterinary-dentistry trainer in Florida, told me. But brushing feline teeth is also really, really, really hard. Most cats aren’t keen on having things shoved into their mouth, especially not bristly, sludge-covered sticks. (Dogs don’t always love cleanings either, but they’re at least used to engaging their owners with their mouths.) My old cat, Luna, was once so desperate to escape a brushing that she shrieked in my face, then peed all over the floor. [Read: Why we think cats are psychopaths] A niche industry has sprouted to ease the ordeal for hygiene-conscious humans: poultry-flavored toothpastes, cat-size toothbrushes, silicone scrubbers that fit on fingers. Sometimes the gear helps; when Chin-Sun Lee, a New Orleans–based writer, purchased malt-flavored toothpaste for her cat, Tuesday, he went bonkers for the stuff. Every morning, he comes trotting over just so he can lick the brush. Krissy Lyon, a neuroscientist at the Salk Institute, told me that one of her cats, Cocchi, is so crazy for his toothpaste that she and her partner have to “restrain him or lock him in a different room” while they’re brushing the teeth of their other cat, Noma. But tasty toothpaste isn’t a sufficient lure for all. Valeika, who extols the virtues of feline oral health, admitted that even his own cat, Boocat, doesn’t reap the benefits of his brushing expertise. 
He “tried hard-core for a couple weeks” when he adopted her seven years ago. But Boocat was too feisty to stand for such a thing. “She can be a real terror,” Valeika told me. “We once saw her chase a bear out of our yard.” Maybe Boocat is picking up on how odd the whole toothbrushing ritual can be. Even most American people weren’t regularly scrubbing their dentition until around the time of World War II. Vet dentistry, which borrowed principles from its human analogue, “is a relatively new discipline,” Peralta told me. “Thirty years ago, nobody was even thinking about dog or cat teeth.” Nor was it all that long ago that people across the country routinely let their pets sleep outside, eat only table scraps, and run hog wild through the streets. Now pets have become overly pampered, their accessories Gooped. Experts told me that they’ve seen all kinds of snake-oil hacks that purport to functionally replace feline toothbrushing—sprays, gels, toys, water additives, even calls to rub cat teeth with coconut oil. A lot of these products end up just cosmetically whitening teeth, temporarily freshening breath, or accomplishing nothing at all. If a super-simple, once-a-month magic bullet for dental hygiene existed, Tsugawa told me, “we’d be doing it for our own teeth.” There are probably a lot of un-toothbrushed cats out there who could be s-l-o-w-l-y taught to accept the process and maybe even enjoy it. Mary Berg, the president of Beyond the Crown Veterinary Education, told me that one of her colleagues trained her pet to relish the process so much that “she could just say ‘Brusha brusha brusha’ and the cat would come running.” But getting to that point can require weeks or months of conditioning. Berg recommends taking it day by day, introducing cats first to the toothpaste, then to getting one or two teeth touched, and on and on until they’re comfy with the whole set—always “with lots of praise and reward afterward,” she said. 
And that’s all before “you introduce that scary plastic thing.” [Read: An offbeat approach to bonding with cats] That’s a big ask for many owners, especially those who went the cat route because of the creatures’ rep for being low-maintenance. The consequences of skipping toothbrushing are also subtle because they don’t directly affect humans, Delgado told me. Miss a nail trimming, and the couch might pay the price. But cat teeth aren’t often glimpsed. The potential downsides of brushing, meanwhile, can be screamingly clear. On cat forums and Twitter, the cat-toothbrushing-phobic joke about losing their fingers. But what a lot of people are really afraid of sacrificing is their cat’s love. Broken trust can mar the relationship between owner and pet, Perrone said; people simply can’t communicate to skittish animals that this act of apparent torture is for their own good. Some cats never learn to deal. Even among veterinary experts, toothbrushing rituals are rare. Peralta and his wife just try to clear the bar of “at least once a week” with their own cat, Kit Kat; Berg and Perrone don’t brush their felines’ teeth at all. (Tsugawa does not currently own a cat, but he wasn’t a brusher when he did.) I’m no pro, but I feel a bit torn too. I never took the time to teach Calvin and Hobbes to see toothbrushing as a treat, and they can get pretty grumpy during the ritual itself. Valeika, the North Carolina vet, told me that seeing Boocat’s horrified reactions was the main thing that prompted him to quit the brush. “She would hate it if we were always doing that to her,” he said. “She really would just not be our pet anymore.”
The last time I tried to wait out the pandemic, I drove south. My dog and I traveled nine hours from San Francisco to the Anza-Borrego Desert, which sprawls over more than half a million acres near the Mexican border. Most of that territory is untouched wilderness, rocky washes home to deer, pumas, and golden eagles. The place felt solitary. That’s why I chose it. I work as a doctor in an emergency room, a hospital, and an HIV clinic. I also take powerful immunosuppressants for autoimmune disease, one of which rendered the coronavirus vaccines far less effective in my body. My co-workers had tried to see all of the COVID patients to protect me, but as Omicron exploded in January, that became impossible. The woman who’d broken her ankle tested positive. The grandfather who’d lacerated his scalp did too, just like the middle-aged man who wanted to detox. Treatments for COVID were in short supply, and I wanted to get through the surge alive. So for several weeks, I canceled work, a privilege most can’t afford. Forced into isolation, I decided to spend a week where solitude felt deliberate. Back then I would have described my trip to the desert, and pandemic life broadly, as an intermission. The moment caseloads tumbled and hospitals stocked treatments, I would go hiking in Japan. I would brave the dating scene after a two-year hiatus. I would deploy with Doctors Without Borders. Meanwhile, I reassured myself that I just had to hold out a few months longer, even though the deadline kept retreating. Mine was an outlook equally comforting and wrong. [Read: The millions of people stuck in pandemic limbo] Kurt Vonnegut famously taught about six archetypes that underpin stories. In a video of one of his lectures, he draws on a chalkboard an x-axis for time and a y-axis for degree of good fortune, then traces a sine wave that plummets before rising again. 
“We call this story ‘Man in Hole,’ but it needn’t be about a man, and it needn’t be about somebody getting into a hole,” Vonnegut says. It’s a tale—of fall and salvation, of mettle forged through trials, of ultimate catharsis and victory—that humans tell naturally. And it needn’t be about a man and a hole. It could be about a world and a virus. People in the U.S. have heard this story repeatedly over the past two and a half years, the media and government casting the downturn of each surge or advent of each therapeutic as the ladder that would soon carry us from the hole of the pandemic. Until that deliverance, we could cultivate rooftop gardens and sourdough starters to stave off our impatience. It’s less scary to rewrite reality into a reassuring plot arc—one with a familiar contour and clean resolution—than to envision a story that doesn’t end, or one whose ending permanently reconfigures our world. But nearly eight months after my return from Anza-Borrego, the bridge of my nose is raw from my N95 mask. Yet another Omicron subvariant is spreading, as one strain supersedes another. Despite stunning progress in vaccines and drugs, COVID still threatens to hospitalize or disable me, and I don’t foresee that reality changing imminently. While the mirage of normalcy recedes, glittering and unattainable, I remain marooned in another desert, staring down the truth that a sense of closure won’t arrive anytime soon. [Read: The BA.5 wave is what COVID normal looks like] SARS-CoV-2 is only the latest pathogen to upend people’s lives. Working as a doctor who specializes in HIV—a virus that profoundly affects my patients yet is ignored by most Americans—has taught me some truths about pandemics. The first time someone asked me whether HIV was “still a problem,” at a Christmas party years ago, I almost choked on my drink. 
But the question made twisted sense in a country where the notion that a pandemic is over depends less on science and more on which communities are affected. The people I treat who gasp from pneumonia or seize from meningitis because they can’t access or adhere to HIV medications are invariably poor, and many are Black or Latino. My acquaintance at the party was a straight, white, wealthy man in his 60s. He could exist in a story where the man had climbed out of the hole. Tale concluded, the credits rolled. That conversation is the reason why, whenever someone says the coronavirus pandemic is over, my first question is always, “Over for whom?” Though I’ve endured a sliver of the adversity my patients have, I’m learning what it’s like to embody a less comfortable story than the one others are telling. I walk by packed bars. I scroll through photos of maskless crowds at concerts. I hear people use the phrase “during the pandemic,” as if it’s ended. After multiple false starts, the man in the dominant version of the story escaped the hole after the Omicron surge once and for all. That narrative has real consequences, including lax precautions, risky workplace policies, and woefully inadequate funds for global COVID efforts. It sidelines millions of Americans: not only people like me dealing with high-risk medical conditions, but also survivors confronting long COVID, frontline workers depleted by burnout, and loved ones grieving those who have died, disproportionately people of color. I don’t want my fellow San Franciscans to stop eating out or traveling; their lives will be freer than mine, a situation I accept as unavoidable even if it saddens me. I do wish, though, that the government would value my life by investing in preventing COVID transmission rather than issuing ever more anemic guidelines. 
And amid such policy failures, I wish people with less to fear from the virus would shift the burden off the shoulders of the more vulnerable, by wearing masks on public transit, staying home when they’re sick until a rapid test turns negative, and keeping up to date on boosters. [Read: The pandemic’s soft closing] After far too long, I have stopped clutching the myth of Man in Hole, in which I must either pretend the pandemic is over—a self-deception that could land me in the hospital—or else wait indefinitely for a ladder, watching clouds scud over desert lowlands as I forfeit plans and dreams. I need a story to replace it, and for that, I’ve turned to my patients. A few years ago, I treated a young man who had contracted HIV just out of college. A pandemic that had never touched him suddenly shaded his life, and for months, that paralyzed him. He didn’t look for work; he played video games all day and nearly lost his housing. Then, six months after his diagnosis, he started bringing a notebook to our visits. In it, he fashioned a plan. Nothing sweeping: Stop by two restaurants to ask about jobs. Get glasses. Post a dating profile. A year into our time together, he was working in a café, had an adoring boyfriend who knew his status, had undergone a long-overdue surgery, and had started graduate school. I started carrying a notebook recently. The plans I scribble down differ from those I might have conceived before the pandemic but share one feature: They are possible despite my constraints. I rode my bike from Seattle to Vancouver for an outdoor vacation. I attended a wedding in an N95 mask. I made enchiladas with friends after we all took rapid tests. I spoke on the radio about the injustices of pandemic policy, because adapting to my new reality doesn’t mean abdicating the battle for a better one. 
That, too, I learned from people with HIV, who formed committees to pressure the FDA and the NIH, demanded inclusion in policy decisions, and were jailed for protesting for effective antiretrovirals, including one used in COVID treatment. [Read: COVID long-haulers are fighting for their future] I still seethe whenever I show up to an event that’s too overcrowded and underventilated for me to stay, or board a plane where the overturned mask rule reminds me of the nation’s disregard for my health. But action is nonetheless a relief after spending so long stymied. If I were to chart my life on Vonnegut’s chalkboard now, I’d draw a steep plunge followed by a slow and bumpy incline that hasn’t yet neared the original precipice. It’s a tale less tantalizing than Man in Hole, and galling in its incrementalism, but it does have one advantage: It’s true. Some people visit Anza-Borrego only after the rains, in perfect conditions, when a riot of wildflowers suffuses the land with color. I never have. People tend to assume that this is when the desert is most alive, but in truth, even in the most arid conditions, bobcats prowl, coyotes slink, and foxes rear their kits. When the wild sheep can’t find water, they ram barrel cacti and devour the wet pulp. These animals know well that the rains don’t always come. During the dry spells, life carries on.

Nothing gets a female mosquito going quite like the stench of human BO. The chase can begin from more than 100 feet away, with a plume of breath that wafts carbon dioxide onto the nubby sensory organ atop the insect’s mouth. Her senses snared, she flies person-ward, until her antennae start to buzz with the pungent perfume of skin. Lured closer still, she homes in on her host’s body heat, then touches down on a landing pad of flesh that she can taste with her legs. She punctures her victim with her spear-like stylet and slurps the iron-rich blood within. 
The entire ritual is intricate and obsessive—and nearly impossible to disrupt. Of more than 3,500 mosquito species that skulk about the planet, fewer than 10 percent (and only the females, at that) enjoy nibbling on humans. But once they’re on the prowl for people, neither rain nor zappers nor citronella candles will deter them. From the tips of their antennae to the bottoms of their little insect feet, these human-loving mosquitoes bristle with human-sensing accoutrement, says Leslie Vosshall, a neurobiologist at Rockefeller University. “They really are in the business of finding us.” Even aggressive genetic interventions aren’t enough to deflect a mosquito’s bite. The genome of a species called Aedes aegypti—a striped skeeter that prefers to feed on humans and can ferry viruses such as dengue, Zika, yellow fever, and chikungunya into our blood—encodes more than 300 distinct types of chemical sensors that help the insects navigate their world. Researchers have managed to introduce tweaks that futz with more than 100 of those genes at once, and yet those mutant mosquitoes “still find and bite humans, which just blows my mind,” says Meg Younger, a neurobiologist at Boston University. The most progress scientists have made through these techniques is cutting the insects’ attraction to us roughly in half, says Joshua Raji, a sensory biologist at Johns Hopkins University. The reason is, frankly, depressing, as Vosshall, Younger, and their colleagues have found. Their recent work shows that mosquitoes’ odor-detecting systems are, unlike many other animals’, patchwork, chaotic, and riddled with fail-safes that make the insects’ sense of smell extraordinarily difficult to stump. It’s an essential adaptation for a creature that is hyper-focused on us: “They are finding a way to survive,” Raji told me. The insects are literally coded with backup plan after backup plan for stalking us. 
[Read: A pivotal mosquito experiment could not have gone better] For years, scientists were sure that mosquitoes’ odor detection didn’t work in such complicated ways. In the 1990s, researchers performed a set of experiments that suggested that animals across the tree of life, including us humans, subscribed to a pretty standard smelling MO: To deduce distinct scents, creatures manufacture many, many types of olfactory nerve cells, each of them sensitive to exactly one specific type of odor. When complex fragrances filter in, their individual components nestle into receptors atop distinct neurons, like plugs fitting into sockets. The revved-up neurons then shuttle signals to the brain on parallel, independent tracks—keeping their intel separate until a central hub in the animal’s noggin collapses it all together, says Margo Herre, a neurobiologist who trained with Vosshall. It’s an additive system of switches that, coded correctly, yields precision in spades: Tripping Neuron A might mean there’s something hazelnutty nearby. But add Neuron B and Neuron C to the mix, and that could suggest it’s actually Nutella. Scientists called this the “one receptor, one neuron” rule, and for decades, Raji told me, it’s what everyone figured they would find in just about any creature that possessed a sense of smell. But mosquitoes, scourges that they are, were delighted to take this nice, neat dogma and totally screw it up. Their olfactory neurons, Vosshall’s team discovered, don’t respond to just a single odor; many of them instead recognize several scents. Their surfaces are studded with multiple types of receptors, all configured slightly differently, like a universal outlet adaptor. No longer do neuron subtypes A + B + C all need to activate in order to tell the brain, Thar be a snack; each could potentially pass that info on alone. 
That comes in handy when human blood is on the menu: Thanks to the vagaries of genetics, diet, lifestyle, environment, and more, “we all smell very different,” says Andrea Gloria-Soria, an entomologist at the Connecticut Agricultural Experiment Station. An olfactory system that’s loosey-goosey with its wiring can substantially raise the chances that the average mosquito smell cell will react when something delectable saunters by. [Read: The parasite that lures mosquitoes to humans] Mosquitoes probably do lose some acuity by stacking their cells like multitools, Herre told me. Although a neuron that’s provoked by a ton of different things is more likely to detect prey, it’ll also have a lot of trouble distinguishing which of its many triggers is turning its gears. But for hungry mosquitoes, maybe that’s not such a terrible tax: As long as the insects can locate a viable host, they hardly care which of us it is. (Is it human, or is it dancer? Doesn’t matter—as long as there’s blood.) The system is “really redundant,” Younger told me, so much so that it’s quite challenging to break. Humans, who do smell according to the Traditional Rules of Sniff, are easy to dupe: A mutation that affects just one type of receptor can take out of commission every neuron that bears it. With mosquitoes, though, such sabotage would require an impractical number of genetic tweaks, Vosshall told me—which means there’s little hope for, say, engineering mosquitoes that can’t or won’t sniff our bodies out. “They’re really the ultimate predator,” says Omar Akbari, a biologist at UC San Diego. “You can’t find a single person on Earth that hasn’t been bitten at least once.” People-piercing mosquitoes might have good reason to be this clingy. Humans are super social and super hairless, a clean and convenient smorgasbord. Our blood helps nourish developing eggs, and our objects and architecture collect standing water, giving the insects a perfect spot in which to breed their young. 
Each of us is a mosquito “Walmart,” as Vosshall put it—a one-stop shop for all the creatures’ baby-rearing needs. The insects’ infatuation with us is costly: By way of the many, many deadly pathogens they carry, mosquitoes kill more people than any other animal on Earth does (except, well, us). Stopping certain species from biting us, by messing with their smell systems or by any other means, remains a key goal of global health. One path forward involves population control. Akbari’s team, for instance, is one of many that are engineering sterile male mosquitoes that, once released, will compete with unaltered males for mates but sire only unviable eggs. Other researchers are breeding strains that will introduce modified genes into disease-carrying species, rendering their offspring less able to chauffeur pathogens from person to person, or making them far less likely to survive. [Read: The worst animal in the world] Even if turning off mosquitoes’ smell cells is a dead end, cluing into how their olfaction works can still help with the design of new repellents that could target tons of their chemical sensors at once, Gloria-Soria told me. DEET, for instance, is thought to work at least partly in this way—although, after decades of research, scientists are still sussing out exactly how, and some species are now acquiring resistance to the stuff. Investigating skeeter smell could lead us to better-understood alternatives that aren’t quite so greasy and gross. Or perhaps the best solution lies not in repelling mosquitoes, but in baiting them better. Instead of slathering ourselves with gunk that turns our tasty skin toxic, maybe we could cook up traps that distract mosquitoes with something that smells even more alluring than a hot, sweaty, mouth-breathing human. Raji told me that some scientists are tinkering with recipes of lactic acid, ammonia, and carbon dioxide to entice female skeeters into parfum de people snares. 
If that’s the way of the future, it’ll be quite the olfactory flex: a way of leveraging how much mosquitoes love us to ensure that they never get too close.

Joseph Osmundson, a microbiologist at NYU, was walking home recently in New York City when a stranger abruptly shouted “Monkeypox!” at him. He wasn’t infected with the virus, which has been spreading largely through intimate contact between men, nor did he have the characteristic skin lesions. So he must have been targeted for this catcall, he told me, on account of his being “visibly gay.” From his perspective, the name of the disease has made a painful outbreak worse. “Not only is this virus horrible, and people are suffering,” he said, “but it’s also fucking called monkeypox. Are you kidding?” Since the global crisis started in the spring, efforts to contain the spread of monkeypox have developed in parallel with efforts to change its formal identity. In June, more than two dozen virologists and public-health experts put out a call for a “neutral, non-discriminatory and non-stigmatizing” nomenclature for the virus and its subtypes; World Health Organization Director-General Tedros Adhanom Ghebreyesus responded by announcing a formal process to create one. A month later, with monkeypox still mired in linguistic purgatory, the health commissioner of New York City issued an open letter to Ghebreyesus warning that a “public health failure of words with potentially catastrophic consequences” was imminent. “Words can save lives or put them at further risk,” the letter said. “The WHO must act in this moment before it is too late.” [Read: Asking gay men to be careful isn’t homophobia] As a practicing physician—and a gay one at that—I’ve felt devastated by the clumsy public-health response to monkeypox. The delays in rolling out tests, treatments, vaccines, and contact tracing have been a months-long source of frustration. 
But the name of the disease has never bothered me, let alone engendered premonitions of catastrophe. Sure, monkeypox sounded odd when I first started hearing it in conversation. But that feeling quickly went away as doctors had to deal with the scourge itself, and with a public-health failure of actions. After seeing lives literally put at risk by our government, I have a hard time believing that the word monkeypox can really do the same. I’ve been told I’m wrong about this point, many times and by many different people. Some say the term is silly, and that it makes a dreadful ailment seem unimportant. Others claim that it’s too scary, and causes panic we don’t need. I’ve also heard that monkeypox is racist, that it’s homophobic, and that, actually, it’s causing harm to monkeys. A single name for a disease is said to be, somehow, the source of all this evil. But medicine is full of terms that sound funny or disgusting or obscene. One can find “hairy cell leukemia” and “fish scale disease” and “cat cry syndrome” on the books. A common viral illness related to monkeypox is termed “molluscum contagiosum,” which seems like a Harry Potter curse; and then there’s “maple syrup urine disease”—much too sweet of a label for a debilitating condition. All these names are weird, but they hardly seem offensive. Why should monkeypox be different? The name for the current outbreak is, at the very least, inapt. It “genuinely bothers me every time I use it,” Neil Stone, an infectious-disease physician in the United Kingdom, told me. In addition to finding the name unserious and possibly racist, he’s hung up on the fact that monkeypox doesn’t actually have much to do with monkeys. Although the disease was first identified in primates, in 1958, small mammals like squirrels and rats are now thought to be more important viral reservoirs. The subtypes of the monkeypox virus, called clades, could be even more misleading. 
These were originally named after the regions in Africa where they’d first been identified, but the present crisis did not emerge from any of those places, Christian Happi, the director of the African Center of Excellence for Genomics of Infectious Diseases in Nigeria, told me. If we were being less hypocritical, he suggested, the 2022 epidemic would be attributed not to the West African clade of monkeypox but to the “European” clade—in reference to the continent where cases were first identified this year. Happi, who was the lead author on the demand for a less stigmatizing nomenclature, also takes issue with some media outlets’ use of archival photos of Africans to illustrate a disease that is now occurring in white men. Since I spoke with Happi, a group of virologists and public-health experts convened by the WHO reached an agreement to rename the clades. A statement issued Friday said the monkeypox subvariant behind this year’s global outbreak would henceforth fall within “Clade IIb.” That shift will be most significant within the scientific community, but the more pressing question, of what to do about the term on all of our lips, is unresolved. What will monkeypox become? Surely any change would have to be in line with the “Best Practices for the Naming of New Human Infectious Diseases,” put out by the WHO in 2015. Those guidelines are designed to minimize word-based harm to “trade, travel, tourism or animal welfare,” as well as to “cultural, social, national, regional, professional or ethnic groups.” To that end, they say, names should exclude all stigmatizing references to specific people (e.g., “Creutzfeldt-Jakob disease”), occupations (“Legionnaires’ disease”), or places (“Lyme disease”). Animal-based names, such as “swine flu” and “paralytic shellfish poisoning,” are also verboten. When I talked with Stone, he tossed out “human orthopoxvirus syndrome,” or “HOPS” for short, as a possible alternative for monkeypox. 
Happi said that “mundopox,” from the Spanish for world, was another. But if the WHO is to follow its own rules to the letter, it should stay away from any implication that the virus is a product of the Hispanophone world (or, I guess, that hopping rabbits are to blame). Surely global-health officials will be more inclined to fumigate the discourse with another odorless, colorless gas of pseudowords and digits—something in the lifeless spirit of COVID-19. Along these lines, the emergency-medicine physician Jeremy Faust has suggested “OPOXID-22,” short for “orthopoxvirus disease 2022.” Even a bland name, however, might not immunize the WHO against blowback. Boghuma Kabisen Titanji, an infectious-disease doctor, has already criticized Faust’s proposal as incorrectly implying that monkeypox is new to 2022. Call it “IgnoredPox (IPOX)” instead, she suggested, in light of the fact that outbreaks have been neglected for decades. [Read: We’re testing for monkeypox the wrong way] Granted, monkeypox is not a great name for a disease that spreads between humans, and nothing good can come of potentially racist associations or implications of bestiality. But the WHO’s “Best Practices,” if deployed across the board, would exclude many—maybe most—of the medical terms in use today. Taken in broader perspective, monkeypox isn’t even unusually off-base. Chickenpox has little to do with chickens, for instance, and, unlike monkeypox, it’s not a poxvirus but a herpesvirus. Maybe in a more perfect world, we’d refer to chickenpox as “chicken herpes”; but then again, the herpesviruses—named for the creeping spread of lesions they may produce—are already stigmatizing given their association with sexually transmitted infections. Nearly all of us contract a herpesvirus during our lives, via nonsexual spread. 
Just the same, I remember telling one patient that he had a disseminated herpesvirus infection only to watch him jump to the erroneous conclusion that his wife must have committed adultery. Even though monkeypox is being used to harass people right now, bad actors who truly wish to deepen victims’ shame will always find a way to do so. Earlier this month, two gay men in Washington, D.C., were allegedly berated, then beaten, by teenagers who included monkeypox among a string of homophobic slurs. If that particular word had been unavailable, I’ll bet the others would have sufficed. Tone of voice and body language can, by themselves, turn a good word bad; and there’s little reason to think that any term for a disease, no matter how generic it might seem, cannot be wielded for ill purposes. “The name per se is not a major issue,” Mike Ryan, the executive director of the WHO Health Emergencies Programme, said last month. “It’s the weaponization of these names. It’s the use of these names in the pejorative.” Indeed, HIV is no longer called “gay-related immune deficiency,” but gay men are still frequently ostracized over the condition. Connotation outlives denotation. Even COVID-19—a disease name that was designed from the very start to be as inoffensive as possible—can easily be turned into a slur. “Covidians” and “Covidiots” abound. Perhaps episodes of hate would occur less often if the WHO naming guidelines were universally adopted. Maybe the name monkeypox, which already sounds something like an insult, has a way of loosening the bigot’s tongue. Social scientists have struggled to assess the size of this effect. A number of preliminary studies suggested that the initial, China-centric framing of the new coronavirus in 2020 worsened bias against Asians and Asian Americans. But other research found no effect on anti-Asian sentiment; and one study concluded that the Trump administration’s effort to “scapegoat outgroups” actually backfired. 
Meanwhile, an increased level of anti–Asian American discrimination seems to have persisted for years. Any incremental consequences of the name monkeypox for anti-gay and anti-Black sentiment seem equally hard to predict. In any case, cruelty is nothing if not creative. Last month, the Fox News host Tucker Carlson ran a segment on the monkeypox-naming controversy in which he proposed a slew of other offensive names, including “Schlong COVID”—a term that manages to insult the victims of two diseases at once. The problem, as always, is people. The illness is new and mysterious to most of us, visibly apparent, and comes on the heels of the divisive coronavirus pandemic. It’s not the name; it’s the vibes. And the vibes are bad. Strangers are publicly accusing one another of having monkeypox. Medical influencers are playing up the possibility that monkeypox easily spreads through the air or will become common in children. Old political arguments over COVID have been rehashed. Bad vibes don’t wash off easily in medicine. In 2011, a rare form of blood-vessel inflammation called “Wegener’s granulomatosis” was renamed because it turned out that the condition’s namesake was a Nazi. Unfortunately, the disorder’s new name (“granulomatosis with polyangiitis”) is a mouthful. Doctors still prefer the shorter Wegener’s more than a decade later. Medical textbooks must awkwardly refer—Prince style—to the disease “formerly known as Wegener’s.” Will monkeypox also hang around? Consider the illness with the worst vibes of all: cancer. The name for these cellular growths brings to mind suffering and inevitable death. Yet many cancers diagnosed today are so small as to be practically harmless. Some doctors have been campaigning to remove the “cancer” label from such tumors, hoping to reduce fear and unnecessary treatment. 
But studies find that calling some mild breast and prostate tumors “lesions” or “abnormal cells” instead of “cancer” seems to have only a small impact on patient anxiety and overtreatment. A monkeypox rebrand may not do much more. Of course, proponents of the name change argue that getting rid of monkeypox wouldn’t have to save the world to be worth doing. “Nobody thinks changing the name is going to instantly end all stigma of people with the disease,” Gavin Yamey, a global-health professor at Duke, told me. It might still lower the social temperature, he said, and represent a proactive and important step to protect marginalized communities. For Osmundson, to assume that nothing whatsoever can be done to combat prejudice is giving in to nihilism. But a campaign to change the language of disease, based on the urge to do something, could be counterproductive. At worst, it could make semantics seem like the most important tool for addressing social wrongs. The American Medical Association, for example, recently declared that “a consideration of our language” is central to the work of improving health equity. “Pursuing equity requires disavowing words that are rooted in systems of power that reinforce discrimination and exclusion.” I don’t think that I’ve ever avowed allegiance to a word. Regardless, disavowing a particular word does nothing by itself to uproot injustice. Whatever we decide to call this Clade IIb virus, society has made plain which lives it values less: In the U.S., monkeypox is already spreading along the same racial, sexual, and economic fault lines as other sexually transmitted infections. An August 8 presentation from the Georgia Department of Public Health noted that most monkeypox patients in the state were young gay men; 82 percent were Black; and 67 percent were also HIV positive. Our actions, not our nouns, determine who will get sick. In 1993, Harvard scientists discovered a crucial gene for the growth of embryos. 
They decided that it would be fun to name it after the video-game character Sonic the Hedgehog. Other researchers at the time derided this choice as unserious. But today, the scientific literature is full of dry sentences like “Sonic Hedgehog plays a role in cell growth, cell specialization, and the normal shaping (patterning) of the body.” Words, like viruses, evolve as they move from host to host; and words, like viruses, may become more or less noxious over time. If the name monkeypox strikes listeners as funny or offensive right now, that could change in the future—irrespective of any committee.