The world has just seen the largest vaccination campaign in history. At least 13 billion COVID shots have been administered—more injections, by a sweeping margin, than there are human beings on the Earth. In the U.S. alone, millions of lives have been saved by a rollout of extraordinary scope. More than three-fifths of the population elected to receive the medicine even before it got its full approval from the FDA.

Yet the legacy of this achievement appears to be in doubt. Just look at where the country is right now. In Florida, the governor—a likely Republican presidential candidate—openly pursues the politics of vaccine resistance and denial. In Ohio, kids are getting measles. In New York, polio is back. A football player nearly died on national TV, and fears about vaccines fanned across the internet. Vaccinologists, pediatricians, and public-health experts routinely warn that confidence is wavering for every kind of immunization, and worry that it may collapse in years to come.

In other words, America is mired in a paradoxical and pessimistic moment. “We’ve just had a national vaccination campaign that has exceeded almost all previous efforts in a dramatic fashion,” says Noel Brewer, a psychologist at the University of North Carolina who has been studying decision making about vaccines for more than 20 years, “and people are talking about vaccination as if there’s something fundamentally wrong.”

It’s more than talk. Americans are arguing, Americans are worrying, Americans are obsessing over vaccines; and that fixation has produced its own, pathological anxiety. To fret about the state of public trust is rational: When vaccine adherence wobbles, lives are put in peril; in the midst of a pandemic, the mortal risk is even greater. More than 60 million Americans haven’t gotten a single COVID shot, and a few thousand deaths are attributed to the disease every week.
But the scale of this concern—the measure of our instability—may be distorted by the heights to which we’ve climbed. Evidence that the nation has arrived at the brink of collapse does not hold up to scrutiny. No one knows where vaccination rates are really heading, and the coming crash is more an idea—a projection, even—than a certainty. The future of vaccination in America may be no worse than its recent past. In the end, it might be better.

The first alarms about a widespread vaccination crisis—the first suggestions that a leeriness of COVID shots had “spread its tentacles into other diseases”—were raised by clinicians. Megha Shah, a pediatrician with the Los Angeles public-health department, told me that she began to worry in the spring of 2021, while volunteering at a medical center. Two years earlier, she recalled, working there had been uneventful. She’d meet with parents—mostly from low-income Latino families—to discuss the standard vaccination schedule: Okay, here’s what we’re recommending for your child. This protects against this; that protects against that. The parents would ask a couple of questions, and she’d answer them. The child would be immunized, almost every time.

But in the middle of the COVID-vaccine rollout, she found that those conversations were playing out differently. “Oh, I’m just not sure,” she said some parents told her. Or, “I need to talk this over with my partner.” She saw families refuse, flat-out, to give their infants routine shots. “It just was very, very surprising,” Shah said. “I mean, questions are good. We want parents to be engaged and informed decision makers.” But it seemed to her—and her colleagues too—that healthy “engagement” had gone sour.

Last year, she and her colleagues took a closer look. For a study published in Pediatrics, they drew on national survey data, collected from April 2020 through early 2022, on parents’ attitudes toward standard childhood vaccines.
In some respects, the results looked good: Parents endorsed the importance and effectiveness of these vaccines at a high and stable rate throughout the pandemic—in the vicinity of 91 percent. But over the same period, concerns about potential harms marched upward. In April 2020, about 25 percent of those surveyed agreed that vaccines “have many known harmful side effects” and “may lead to illness or death”; by the end of the year, that number had increased to 30 percent, and then to nearly 35 percent the following June. “Parents still seemed very confident overall in the benefits of vaccinations,” Shah told me, “but there was a huge jump over the course of the pandemic about the safety.”

[Read: What’s really behind global vaccine hesitancy?]

Those results jibed with a theory that has now been invoked so many times, it reads as common knowledge: “Perhaps this was a spillover effect,” Shah said, “from all of the vaccine misinformation that was circling during the pandemic.”

That effect—the spreading tentacles of doubt—can be seen around the world, says Heidi Larson, a professor at the London School of Hygiene & Tropical Medicine who has studied attitudes toward vaccination across Europe since the start of the coronavirus pandemic. “The public-health community was assuming that COVID would be a great boon to public confidence in vaccines, but it hasn’t worked out that way. The trend has been actually a negative knock-on effect,” Larson told me. In a troubling alignment, even anti-vaccine activists now endorse the notion of hesitancy spillover, calling it a “wonderful silver lining” to the pandemic.

But hold on a minute. Here in the U.S., it’s certainly true that vaccine worries have been broadcast and rebroadcast, at ever greater volumes, through a clamorous network of influencers and politicians.
This campaign of hesitancy is growing more open and insistent by the day, and the consequences can be atrocious: Americans with false beliefs about vaccines are falling sick and dying stubborn and alone. But even as these anecdotes accrue, misinformation’s greater sway—the extent to which it shapes Americans’ behavior toward vaccines for COVID, measles, or the flu—remains murky, if not altogether undetectable.

The best numbers to go on in this country, drawn from polls of people’s attitudes about vaccines and official vaccination surveys from the CDC, don’t hint at any comprehensive change. When concerning blips and mini-trends arise—shifts in parents’ attitudes, as seen in Shah’s research, or drops in local rates of children getting immunized—they’re set against a landscape with a flat horizon. It’s not a pretty view, for all that: The U.S. lags five points behind the average wealthy country in its rate of people fully vaccinated against COVID, and two points behind in its vaccination rate for measles. And even blips can translate into many thousands of at-risk kids, Shah pointed out.

Yet one might still be grateful for the sameness overall. A seedbed of resistance to the COVID shots, disproportionately Republican, was already present near the start of the pandemic, and hasn’t seemed to thrive despite two years’ worth of fertilizer runoff from Fox News and other outlets spewing doubt. In August 2020, the Harris Poll’s weekly COVID-19 tracker found that 15 percent of American adults said they were “not at all likely” to get the vaccine when it finally became available. In August 2022, Harris reported that 17 percent weren’t planning to be immunized. Other long-running surveys have found similar results. In September 2020, Kaiser Family Foundation’s vaccine monitor pegged the rate of refusal at 20 percent. In December 2022, it was … still 20 percent.
The most recent uptake numbers from the CDC suggest that children born in 2018 and 2019 (who would have been babies or toddlers when COVID first appeared) had higher vaccination rates by age 2 than children born in 2016 and 2017. Some of these kids did miss out on shots amid the pandemic’s early lapses in routine medical care, but they quickly caught up.

Another, more alarming batch of data from the CDC shows that measles-mumps-rubella coverage among the nation’s kindergartners has dropped for two years in a row, down from 95.2 to 93.5 percent, and is now lower than it’s been since at least 2013. Still, the proportion of kids who get exempted from school vaccine requirements for medical or philosophical reasons has hardly changed at all, and the headline-grabbing “slide” in rates appears instead to be at least in part a product of “provisional enrollments”—i.e., children who missed some vaccinations (perhaps in early 2020) and were allowed to enter school while they caught up. If there really is a wave of newly red-pilled, anti-vaxxer parents, then going by these data, they’re nowhere to be seen.

Some public-health disasters hit like hurricanes; others spread like rust. “We may not have a full picture yet,” Shah told me, referring to the latest evidence from the CDC on where vaccination rates are heading. “My gut and my clinical experience tell me that it’s too soon to say.”

Other experts share that view. Robert Bednarczyk, an epidemiologist at Emory University, has been estimating the susceptibility of U.S. children to measles outbreaks since 2016. National immunization surveys have not shown substantial drops in coverage for 2020 and 2021, he told me, “but there is a large caveat to this. These surveys have a lag time.” Any children from the CDC’s data set who were born in 2018, he noted, would have gotten most of their vaccines before the pandemic started, during their first year of life. The same problem applies to teens.
The government’s latest stats for adolescents—which looked as good as ever in 2021—capture many who would have gotten all their shots pre-COVID. Until more data are released, researchers still won’t know whether or how far kids' vaccination rates have really dipped during the 2020s.

The time delay is just one potential problem. Parents who are suspicious of vaccines, and angry at the government for encouraging their use, may be less willing to participate in CDC surveys, Daniel Salmon, the director of the Institute for Vaccine Safety at Johns Hopkins Bloomberg School of Public Health, told me. “Having studied this for 25 years, I would be surprised if we don’t see a substantial COVID effect on childhood vaccines,” he said. “These data are a little bit reassuring, that it’s not, like, an oh-my-god huge effect. But we need more time and more data to really know the answer.”

[Read: How many Republicans died because the GOP turned against vaccines?]

Uncertainty doesn’t have to be a source of terror, though. Early uptake data already provide some signs of a “vaccine-hesitancy spillover effect” happening in reverse, UNC’s Brewer told me, driving more enthusiasm, not less, for getting different kinds of shots. Just look at how the push to dose the nation with half a billion COVID shots goosed the rates of grown-ups getting flu shots: For decades now, our public-health establishment has pushed for better influenza coverage, even as the rate for older Americans was stuck at roughly 65 percent. Then COVID came along and, voilà, senior citizens’ flu-shot coverage jumped to 75 percent—higher than it ever was before. This all fits with a familiar idea in the field, Brewer said, that going in for any one vaccine makes you much more likely to get another in the future.
“There does seem to be a sort of positive spillover,” he said, “probably because the forces that led to previous vaccinations are still mostly in place.”

Even some of the scariest signals we’ve seen so far—reports that anti-vaccine sentiment is clearly on the rise—can seem ambiguous, depending on one’s breadth of view. Consider the finding from Heidi Larson’s group, that vaccine confidence has declined across the whole of the European Union throughout the pandemic, according to surveys taken in 2020 and 2022. The same report says that attitudes have now returned to where they were in 2018 and that confidence in the MMR vaccine, in particular, remains higher than it was four years ago. Given that the 2020 surveys were conducted mostly in March, at the very onset of the first pandemic lockdowns, they might have captured a temporary spike of interest in vaccines. After all, vaccines can seem more useful when you’re terrified of death.

In other words, America may truly have experienced a recent drop in vaccine confidence—but from an inflated and unsustainable high. That could help explain other recent findings too, including Shah’s.

“You need to take the long view,” says Douglas Opel, a pediatric bioethicist at Seattle Children’s Hospital who has been studying the ups and downs of vaccine hesitancy for more than a decade. For a paper published last July, he and colleagues looked at vaccine attitudes among 4,562 parents from late 2019 to the end of 2020. They found that the parents grew more enthusiastic about childhood immunizations when the pandemic started, but their feelings later returned to baseline. Larson told me that a “transient COVID effect” may well explain some of what her team has found, but said it was very unlikely to account in full for the worrying trend.
In any case, she told me, “we shouldn’t assume this and should instead make an extra effort to continue to build confidence.”

No crunching of the numbers can excuse the spread of vaccine misinformation, or suggest that those who peddle it are anything but a hateful scourge on individuals and a threat to public health. But you can’t simply ignore the fact that, as far as we can see, all the gnashing about vaccines’ supposed risks simply hasn’t changed a lot of people’s minds. It certainly hasn’t caused a steep and sudden rise in vaccine refusal. The idea that we’re in the midst of some new vaccine-hesitancy contagion is based as much on vibes as proven fact.

The problem is, bad vibes can leave us prone to misinterpretation. Take the recent measles outbreak in Ohio: It’s alarming, but not so relevant to recent trends in vaccination, despite many claims to that effect. More than one-quarter of the affected children were too young to have been eligible for the MMR vaccine, while others were old enough to have missed their first shot by 2020, before any hesitancy “spillover” could have taken place. And at least a meaningful proportion of the affected families, from the state’s Democratic-leaning Somali American community, wouldn’t seem to represent the GOP’s white, unvaccinated constituency.

The stark politicization of the COVID shots can be misread too. Despite the 30-point gap between Democrats and Republicans in COVID vaccination rates, those rates are much, much higher—for members of both parties—than they’ve ever been for flu shots. And interparty differences in flu-shot uptake seem to be long-standing. A preprint study from Minttu Rönn, a researcher at the Harvard T. H. Chan School of Public Health, and colleagues found a broadening divide in coverage between Democratic- and Republican-voting states, based on data going back to 2010. But this may not be a bad thing.
Rönn doesn’t think the change arises from a loss of trust among Republicans; rather, she told me, it looks to be related to rising flu-shot coverage overall, with proportionally greater gains in Democratic-leaning areas. (That difference could be the result of local attitudes, ease of access, or insurance coverage, she said.) In other words, red states aren’t necessarily falling behind on vaccination. Blue states are surging forward.

[Read: Vaccination in America might have only one tragic path forward]

Optimism here may seem perverse. COVID booster uptake is absurdly low right now, even for the elderly. The politicization of vaccines (whenever it began) certainly isn’t letting up. Given what would happen if trust in vaccination really did collapse, perhaps it makes more sense to err on the side of freaking out. As Larson said, every effort should be taken to build confidence, no matter what. But the truth of what we know right now ought to be important too. Maybe it’s okay to feel okay. Maybe there’s value in maintaining calm and taking stock of what we’ve accomplished or what we’ve maintained in the face of all these efforts to confuse us.

At the risk of trying way too hard to find some solace in disturbing facts, here’s another case in point. Remember Shah’s results, that parents’ concerns about the health effects of childhood vaccines have steadily gone up throughout the pandemic, even as their belief in vaccines’ benefits stayed high? That increase wasn’t clearly more pronounced in any specific group. Belief that vaccination can result in illness or death went up across the board for men and women in the survey, for young and old, for Black and white alike. It rose among Republicans and also Democrats—in just about the same proportions. If America’s parents have been getting more attuned to potential risks from vaccination, we’re doing it together. I’m in that number too.
As a scientist by training and a science journalist by trade, I’ve been reporting and editing stories about vaccination for years. Still, I’ve never thought so hard about the topic, and in such critical detail, as I have since 2021. At no point in my life has vaccination been this pervasive, perplexing, and important. When it came time to get my children COVID shots, I learned everything I could about potential risks and benefits. I looked at data on the incidence of myocarditis, I considered very rare but deadly outcomes, and I weighed the efficacy of different shots against their measured side effects. These investigations did not arise from distrust of authority, podcast propaganda, or a belief in microchips so small they fit inside of a syringe. I wasn’t fearful; I was curious. I had questions, and I got answers—and now every member of my family has gotten their shots.

We’ve all been forced by circumstance to think in different ways about our health. Before the pandemic, Larson told me, most people simply didn’t have to pay attention to vaccines. Parents with young children, sure, but everybody else? “I think they probably said, Yeah, vaccines are important. Yeah, they’re safe enough,” she said. But now the stakes are raised across the population. “I mean, there are these groups around the world where you’re like, ‘why do they care about vaccines?’ And it’s because of COVID.”

The emergence of so many groups with newfound interest in vaccines could end up being dangerous, of course—in the same way that newly minted drivers are a menace on the road. “A lot of people went online asking questions about vaccines,” Larson told me, in a tone that made it sound as though online were a synonym for “straight to hell.” But sometimes asking questions gets you useful information, and sometimes useful information leads to wise decisions. Debates about vaccines may be louder than they’ve ever been before, but that doesn’t mean that vaccination rates are bound to fall.
Even if the situation isn’t getting that much worse, the country might still be left to wallow in its status quo. Yes, more than 200 million Americans have been fully immunized against COVID—and more than 100 million haven’t. “This has been a problem for a long time,” Daniel Salmon told me. “It was already ‘a crisis in confidence’ a dozen years ago. We don’t see a free fall—that’s somewhat reassuring—but that’s very different from saying that we’re good to go.”

The fact of this crisis, however long it’s been around, will never matter more than its effects. After all, “confidence” itself is not the only factor, or even the most important one, that determines who gets shots. “Generally speaking, access to vaccination is a much bigger driver than what people think and feel,” Noel Brewer told me. Early in the pandemic, lots of parents wanted to vaccinate their kids and simply couldn’t. Now many of them can. But obstacles persist, and their effects aren’t evenly distributed. According to the CDC, toddlers’ vaccination rates are somewhat lower among those who live in poverty, or reside in rural areas, or don’t identify as white or Asian. Since the pandemic started, these gaps in opportunity appear to have increased.

A grand and tragic spillover of people’s vaccination doubts—the anti-vaxxers’ hoped-for “silver lining” to the pandemic—may or may not come. In the meantime, though, there are other problems to address.
For decades now, gay men have been barred from giving blood. In 2015, what had been a lifetime ban was loosened, such that gay men could be donors if they’d abstained from sex for at least a year. This was later shortened to three months. Last week, the FDA put out a new and more inclusive plan: Sexually active gay and bisexual people would be permitted to donate so long as they have not recently engaged in anal sex with new or multiple partners. Assistant Secretary for Health Rachel Levine, the first Senate-confirmed transgender official in the U.S., issued a statement commending the proposal for “advancing equity.” It “treats everyone the same,” she said, “regardless of gender and sexual orientation.”

As a member of the small but honorable league of gay pathologists, I’m affected by these proposed policy changes more than most Americans. I’m subject to restrictions on giving blood, and I’ve also been responsible for monitoring the complications that can arise from transfusions of infected blood. I am quite concerned about HIV, given that men who have sex with men are at much greater risk of contracting the virus than members of other groups. But it’s not the blood-borne illness that I, as a doctor, fear most. Common bacteria lead to far more transfusion-transmitted infections in the U.S. than any virus does, and most of those produce severe or fatal illness. The risk from viruses is extraordinarily low—there hasn’t been a single reported case of transfusion-associated HIV in the U.S. since 2008—because laboratories now use highly accurate tests to screen all donors and ensure the safety of our blood supply. This testing is so accurate that preventing anyone from donating based on their sexual behavior is no longer logical.
Meanwhile, new dictates about anal sex, like older ones explicitly targeting men who have sex with men, still discriminate against the queer community—the FDA is simply struggling to find the most socially acceptable way to pursue a policy that it should have abandoned long ago.

Strict precautions made more sense 30 years ago, when screening didn’t work nearly as well as it does today. Patients with hemophilia, many of whom rely on blood products to live, were prominent, early victims of our inability to keep HIV out of the blood supply. One patient who’d acquired the virus through a transfusion lamented to The New York Times in 1993 that he had already watched an uncle and a cousin die of AIDS. Those days of “shock and denial,” as the Times described it, are thankfully behind us. But for older patients, memories of the crisis in the ’80s and early ’90s linger, and cause significant anxiety. Even people unaware of this historical context may consider the receipt of someone else’s blood disturbing, threatening, or sinful.

As a doctor, I’ve found that patients tend to be more hesitant about getting a blood transfusion than they are about taking a pill. I’ve had them ask for a detailed medical history of the donor, or say they’re willing to take blood only from a close relative. (Typically, neither of these requests can be fulfilled for reasons of privacy and practicality.) Yet the same patients may accept—without question—drugs that carry a risk of serious complication that is thousands of times higher than the risk of receiving infected blood. Even when it comes to blood-borne infections, patients seem to worry less about the greatest danger—bacterial contamination—than they do about the transfer of viruses such as HIV and hepatitis C. I can’t fault anyone for being sick and scared, but the risk of contracting HIV from a blood transfusion is not just low—it’s essentially nonexistent.
[Read: Blood plasma, sweat and tears]

Donors' feelings matter, too, and the FDA’s policies toward gay and bisexual men who wish to give blood have been unfair for many years. While officials speak in the supposedly objective language of risk and safety, their selective deployment of concern suggests a deeper homophobia. As one scholar put it in The American Journal of Bioethics more than a decade ago, “Discrimination resides not in the risk itself but in the FDA response to the risk.”

Many demographic groups are at elevated risk of contracting HIV, yet the agency isn’t continually refining its exclusion criteria for young people or urban dwellers or Black and Hispanic people. Federal policy did prohibit Haitians from donating blood from 1983 to 1991, but activists successfully lobbied for the reversal of this ban with the powerful slogan “The H in HIV stands for human, not Haitian.” Nearly everyone today would find the idea of rejecting blood from one racial group to be morally repugnant. Under its new proposal, which purports to target anal sex instead of homosexuality itself, the FDA effectively persists in rejecting blood from sexual minorities.

The planned update would certainly be an improvement. It comes out of years of advocacy by LGBTQ-rights organizations, and its details are apparently supported by newly conducted government research. Peter Marks, the director of the Center for Biologics Evaluation and Research at the FDA, cited an unpublished study showing that “a significant fraction” of men who have sex with men would now be able to donate. But the plan is still likely to exclude a large portion of them—even those who wear condoms or regularly test for sexually transmitted infections.
An FDA spokesperson told me via email that “additional data are needed to determine what proportion of [men who have sex with men] would be able to donate under the proposed change.” Research done in France, Canada, and the U.K., where similar policies have been adopted over the past two years, demonstrates the risk. A French blood-donation study, for instance, estimated that 70 percent of men who have sex with men had more than one recent partner; and when Canadian researchers surveyed queer communities in Montreal, Toronto, and Vancouver, they found that up to 63 percent would not be eligible to donate because they’d recently had anal sex with new or multiple partners. Just 1 percent of previously eligible donors would have been rejected by similar criteria. The U.K. assumed in its calculations that 35 to 50 percent of men who have sex with men would be ineligible under a policy much like the FDA’s, while only 1.4 percent of previous donors would be newly deferred.

If the new rule’s net effect is that gay and bisexual men are turned away from blood centers at many times the rate of heterosexual individuals, what else can you call it but discrimination? The U.S. guidance is supposed to ban a lifestyle choice rather than an identity, but the implication is that too many queer men have chosen wrong. The FDA spokesperson told me, “Anal sex with more than one sexual partner has a significantly greater risk of HIV infection when compared to other sexual exposures, including oral sex or penile-vaginal sex.” If the FDA wants to pry into my sex life, it should have a good reason for doing so.

The increasing granularity and intimacy of these policies—specifying numbers of partners, kinds of sex—gives the impression that the stakes are very high: If we don’t keep out the most dangerous donors, the blood supply could be ruined. But donor-screening questions are a crude tool for picking needles from a haystack.
The only HIV infections that are likely to get missed by modern testing are those contracted within the previous week or two. This suggests that, at most, a couple thousand individuals—gay and straight—across the entire country are at risk of slipping past our testing defenses at any given time. Of course, very few of them will happen to donate blood right then. No voluntary questionnaire can ever totally exclude this possibility, but patients and doctors already accept other life-threatening transfusion risks that occur at much greater rates than HIV transmission ever could. When I would be on call for monitoring transfusion reactions at a single hospital, the phone would ring a few times every night. Yet blood has been given out tens of millions of times across the country since the last known instance of a transfusion resulting in a case of HIV.

[Read: How blood-plasma companies target the poorest Americans]

Early data suggest that the overall risk-benefit calculus of receiving blood isn’t likely to change. When eligibility criteria were first relaxed in the U.S. a few years ago, the already tiny rate of HIV-positive donations remained minuscule. Real-world results from other countries that have recently adopted sexual-orientation-neutral policies will become available in the coming years. But modeling studies already support removing any screening question that explicitly or implicitly targets queer men. A 2022 Canadian analysis suggested that removing all questions about men who have sex with men would not result in a significantly higher risk to patients. “Extra behavioral risk questions may not be necessary,” the researchers concluded. If there must be a restriction in place, then one narrowly tailored to the slim risk window of seven to 10 days before donation should be good enough.
(The FDA says that its proposed policy “would be expected to reduce the likelihood of donations by individuals with new or recent HIV infection who may be in the window period.”)

As a gay man, I realize that, brief periods of crisis during the coronavirus pandemic aside, no one needs my blood. Only 6.8 percent of men in the U.S. identify as gay or bisexual, so our potential benefit to the overall supply is inherently modest. If we went back to being banned completely, patients would not be harmed. But reversing that ban, both in letter and in spirit, would send a vital message: Our government and health-care system view sexual minorities as more than a disease vector. A policy that uses anal sex as a stand-in for men who have sex with men only further stigmatizes this population by impugning one of its main sources of sexual pleasure. There is no question that nonmonogamous queer men have a greater chance of contracting HIV. But a policy that truly treats everyone the same would accept a tiny amount of risk as the price of working with human beings.

Stephen B. Thomas, the director of the Center for Health Equity at the University of Maryland, considers himself an eternal optimist. When he reflects on the devastating pandemic that has been raging for the past three years, he chooses to focus less on what the world has lost and more on what it has gained: potent antiviral drugs, powerful vaccines, and, most important, unprecedented collaborations among clinicians, academics, and community leaders that helped get those lifesaving resources to many of the people who needed them most.

But when Thomas, whose efforts during the pandemic helped transform more than 1,000 Black barbershops and salons into COVID-vaccine clinics, looks ahead to the next few months, he worries that momentum will start to fizzle out—or, even worse, that it will go into reverse.
This week, the Biden administration announced that it would allow the public-health-emergency declaration over COVID-19 to expire in May—a transition that’s expected to put shots, treatments, tests, and other types of care more out of reach of millions of Americans, especially those who are uninsured. The move has been a long time coming, but for community leaders such as Thomas, whose vaccine-outreach project, Shots at the Shop, has depended on emergency funds and White House support, the transition could mean the imperilment of a local infrastructure that he and his colleagues have been building for years.

It shouldn’t have been inevitable, he told me, that community vaccination efforts would end up on the chopping block. “A silver lining of the pandemic was the realization that hyperlocal strategies work,” he said. “Now we’re seeing the erosion of that.”

I called Thomas this week to discuss how the emergency declaration allowed his team to mobilize resources for outreach efforts—and what may happen in the coming months as the nation attempts to pivot back to normalcy. Our conversation has been edited for clarity and length.

Katherine J. Wu: Tell me about the genesis of Shots at the Shop.

Stephen B. Thomas: We started our work with barbershops and beauty salons in 2014. It’s called HAIR: Health Advocates In-Reach and Research. Our focus was on colorectal-cancer screening. We brought medical professionals—gastroenterologists and others—into the shop, recognizing that Black people in particular were dying from colon cancer at rates that were just unacceptable but were potentially preventable with early diagnosis and appropriate screening. Now, if I can talk to you about colonoscopy, I could probably talk to you about anything.

In 2019, we held a national health conference for barbers and stylists. They all came from around the country to talk about different areas of health and chronic disease: prostate cancer, breast cancer, others.
We brought them all together to talk about how we can address health disparities and give more agency and visibility to this new frontline workforce. When the pandemic hit, all the plans that came out of the national conference were on hold. But we continued our efforts in the barbershops. We started a Zoom town hall. And we started seeing misinformation and disinformation about the pandemic being disseminated in our shops, and there were no countermeasures. We got picked up by the national media, and then we got the endorsement of the White House. And that’s when we launched Shots at the Shop. We had 1,000 shops signed up in, I’d say, less than 90 days. Wu: Why do you think Shots at the Shop was so successful? What was the network doing differently from other vaccine-outreach efforts that spoke directly to Black and brown communities? Thomas: If you came to any of our clinics, it didn’t feel like you were coming into a clinic or a hospital. It felt like you were coming to a family reunion. We had a DJ spinning music. We had catered food. We had a festive environment. Some people showed up hesitant, and some of them left hesitant but fascinated. We didn’t have to change their worldview. But we treated them with dignity and respect. We weren’t telling them they’re stupid and don’t understand science. And the model worked. It worked so well that even the health professionals were extremely pleased, because now all they had to do was show up with the vaccine, and the arms were ready for needles. [Read: The flu-ification of COVID policy is almost complete] The barbers and stylists saw themselves as doing health-related things anyway. They had always seen themselves as doing more than just cutting hair. No self-respecting Black barber is going to say, “We’ll get you in and out in 10 minutes.” It doesn’t matter how much hair you have: You’re gonna be in there for half a day.
Wu: How big of a difference do you think your network’s outreach efforts made in narrowing the racial gaps in COVID vaccination? Thomas: Attribution is always difficult, and success has many mothers. So I will say this to you: I have no doubt that we made a huge difference. With a disease like COVID, you can’t afford to have any pocket unprotected, and we were vaccinating people who would otherwise have never been vaccinated. We were dealing with people at the “hell no” wall. We were also vaccinating people who were homeless. They were treated with dignity and respect. At some of our shops, we did a coat drive and a shoe drive. And we had dentists providing us with oral-health supplies: toothbrush, floss, paste, and other things. It made a huge difference. When you meet people where they are, you’ve got to meet all their needs. Wu: How big of a difference did the emergency declaration, and the freeing-up of resources, tools, and funds, make for your team’s outreach efforts? Thomas: Even with all the work I’ve been doing in the barber shop since 2014, the pandemic got us our first grant from the state. Money flowed. We had resources to go beyond the typical mechanisms. I was able to secure thousands of KN95 masks and distribute them to shops. Same thing with rapid tests. We even sent them Corsi-Rosenthal boxes, a DIY filtration system to clean up indoor air. Without the emergency declaration, we would still be in the desert screaming for help. The emergency declaration made it possible to get resources through nontraditional channels, and we were doing things that the other systems—the hospital system, the local health department—couldn’t do. We extended their reach to populations that have historically been underserved and distrustful. Wu: The public-health-emergency declaration hasn’t yet expired. What signs of trouble are you seeing right now? 
Thomas: The bridge between the barbershops and the clinical side has been shut down in almost all places, including here in Maryland. I go to the shop and they say to me, “Dr. T, when are we going to have the boosters here?” Then I call my clinical partners, who deliver the shots. Some won’t even answer my phone calls. And when they do, they say, “Oh, we don’t do pop-ups anymore. We don’t do community-outreach clinics anymore, because the grant money’s gone. The staff we hired during the pandemic, using the pandemic funding—they’re gone.” But people are here; they want the booster. And my clinical partners say, “Send them down to a pharmacy.” Nobody wants to go to a pharmacy. [Read: The COVID strategy America hasn’t really tried] You can’t see me, so you can’t see the smoke still coming out of my ears. But it hurts. We got them to trust. If you abandon the community now, it will simply reinforce the idea that they don’t matter. Wu: What is the response to this from the communities you’re talking to? Thomas: It’s “I told you so, they didn’t care about us. I told you, they would leave us with all these other underlying conditions.” You know, it shouldn’t take a pandemic to build trust. But if we lose it now, it will be very, very difficult to build back. We built a bridge. It worked. Why would you dismantle it? Because that’s exactly what’s happening right now. The very infrastructure we created to close the racial gaps in vaccine acceptance is being dismantled. It’s totally unacceptable. Wu: The emergency declaration was always going to end at some point. Did it have to play out like this? Thomas: I don’t think so. If you talk to the hospital administrators, they’ll tell you the emergency declaration and the money allowed them to add outreach. And when the money went away, they went back to business as usual. Even though the outreach proved you could actually do a better job. And the misinformation and the disinformation campaign hasn’t stopped.
Why would you go back to what doesn’t work? Wu: What is your team planning for the short and long term, with limited resources? Thomas: As long as Shots at the Shop can connect clinical partners to access vaccines, we will definitely keep that going. Nobody wants to go back to normal. So many of our barbers and stylists feel like they’re on their own. I’m doing my best to supply them with KN95 masks and rapid tests. We have kept the conversation going on our every-other-week Zoom town hall. We just launched a podcast. We put out some of our stories in the form of a graphic novel, The Barbershop Storybook. And we’re trying to launch a national association for barbers and stylists, called Barbers and Stylists United for Health. The pandemic resulted in a mobilization of innovation, a recognition of the intelligence at the community level, the recognition that you need to culturally tailor your strategy. We need to keep those relationships intact. Because this is not the last time we’re going to see a pandemic even in our lifetime. I’m doing my best to knock on doors to continue to put our proposals out there. Hopefully, people will realize that reaching Black and Hispanic communities is worth sustaining.

When it comes to treating disease with food, the quackery stretches back far. Through the centuries, raw garlic has been touted as a home treatment for everything from chlamydia to the common cold; Renaissance remedies for the plague included figs soaked in hyssop oil. During the 1918 flu pandemic, Americans wolfed down onions or chugged “fluid beef” gravy to keep the deadly virus at bay. Even in modern times, the internet abounds with dubious culinary cure-alls: apple-cider vinegar for gonorrhea; orange juice for malaria; mint, milk, and pineapple for tuberculosis. It all has a way of making real science sound like garbage.
Research on nutrition and immunity “has been ruined a bit by all the writing out there on Eat this to cure cancer,” Lydia Lynch, an immunologist and a cancer biologist at Harvard, told me. In recent years, though, plenty of legit studies have confirmed that our diets really can affect our ability to fight off invaders—down to the fine-scale functioning of individual immune cells. Those studies belong to a new subfield of immunology sometimes referred to as immunometabolism. Researchers are still a long way off from being able to confidently recommend specific foods or dietary supplements for colds, flus, STIs, and other infectious illnesses. But someday, knowledge of how nutrients fuel the fight against disease could influence the way that infections are treated in hospitals, in clinics, and maybe at home—not just with antimicrobials and steroids but with dietary supplements, metabolic drugs, or whole foods. Although major breakthroughs in immunometabolism are just now arriving, the concepts that underlie them have been around for at least as long as the quackery. People have known for millennia that in the hours after we fall ill, our appetite dwindles; our body feels heavy and sluggish; we lose our thirst drive. In the 1980s, the veterinarian Benjamin Hart argued that those changes were a package deal—just some of many sickness behaviors, as he called them, that are evolutionarily hardwired into all sorts of creatures. The goal, Hart told me recently, is to “help the animal stay in one place and conserve energy”—especially as the body devotes a large proportion of its limited resources to igniting microbe-fighting fevers. The notion of illness-induced anorexia (not to be confused with the eating disorder anorexia nervosa) might seem, at first, like “a bit of a paradox,” says Zuri Sullivan, an immunologist at Harvard. Fighting pathogenic microbes is energetically costly—which makes eating less a very counterintuitive choice. 
But researchers have long posited that cutting down on calories could serve a strategic purpose: to deprive certain pathogens of essential nutrients. (Because viruses do not eat to acquire energy, this notion is limited to cell-based organisms such as bacteria, fungi, and parasites.) A team led by Miguel Soares, an immunologist at the Instituto Gulbenkian de Ciência, in Portugal, recently showed that this exact scenario might be playing out with malaria. As the parasites burst out of the red blood cells where they replicate, the resulting spray of heme (an oxygen-transporting molecule) prompts the liver to stop making glucose. The halt seems to deprive the parasites of nutrition, weakening them and tempering the infection’s worst effects. [Read: Why science can be so indecisive about nutrition] Cutting down on sugar can be a dangerous race to the bottom: Animals that forgo food while they’re sick are trying to starve out an invader before they themselves run out of energy. Let the glucose boycott stretch on too long, and the dieter might develop dangerously low blood sugar—a common complication of severe malaria—which can turn deadly if untreated. At the same time, though, a paucity of glucose might have beneficial effects on individual tissues and cells during certain immune fights. For example, low-carbohydrate, high-fat ketogenic diets seem to enhance the protective powers of certain types of immune cells in mice, making it tougher for particular pathogens to infiltrate airway tissue. Those findings are still far from potential human applications. But Andrew Wang, an immunologist and a rheumatologist at Yale, hopes that this sort of research could someday yield better clinical treatments for sepsis, an often fatal condition in which an infection spreads throughout the body, infiltrating the blood. “It’s still not understood exactly what you’re supposed to feed folks with sepsis,” Wang told me.
He and his former mentor at Yale, Ruslan Medzhitov, are now running a clinical trial to see whether shifting the balance of carbohydrates and lipids in their diet speeds recovery for people ill with sepsis. If the team is able to suss out clear patterns, doctors might eventually be able to flip the body’s metabolic switches with carefully timed doses of drugs, giving immune cells a bigger edge against their enemies. But the rules of these food-illness interactions, to the extent that anyone understands them, are devilishly complex. Sepsis can be caused by a whole slew of different pathogens. And context really, really matters. In 2016, Wang, Medzhitov, and their colleagues discovered that feeding mice glucose during infections created starkly different effects depending on the nature of the pathogen driving disease. When the mice were pumped full of glucose while infected with the bacterium Listeria, all of them died—whereas about half of the rodents that were allowed to give in to their infection-induced anorexia lived. Meanwhile, the same sugary menu increased survival rates for mice with the flu. In this case, the difference doesn’t seem to boil down to what the microbe was eating. Instead, the mice’s diet changed the nature of the immune response they were able to marshal—and how much collateral damage that response was able to inflict on the body, as James Hamblin wrote for The Atlantic at the time. The type of inflammation that mice ignited against Listeria, the team found, could imperil fragile brain cells when the rodents were well fed. But when the mice went off sugar, their starved livers started producing an alternate fuel source called ketone bodies—the same compounds people make when on a ketogenic diet—that helped steel their neurons. Even as the mice fought off their bacterial infections, their brain stayed resilient to the inflammatory burn. 
The opposite played out when the researchers subbed in influenza, a virus that sparks a different type of inflammation: Glucose pushed brain cells into better shielding themselves against the immune system’s fiery response. [Read: Feed a cold, don’t starve it] There’s not yet one unifying principle to explain these differences. But they are a reminder of an underappreciated aspect of immunity. Surviving disease, after all, isn’t just about purging a pathogen from the body; our tissues also have to guard themselves from shrapnel as immune cells and microbes wage all-out war. It’s now becoming clear, Soares told me, that “metabolic reprogramming is a big component of that protection.” The tactics that thwart a bacterium like Listeria might not also shield us from a virus, a parasite, or a fungus; they may not be ideal during peacetime. Which means our bodies must constantly toggle between metabolic states. In the same way that the types of infections likely matter, so do the specific types of nutrients: animal fats, plant fats, starches, simple sugars, proteins. Like glucose, fats can be boons in some contexts but detrimental in others, as Lynch has found. In people with obesity or other metabolic conditions, immune cells appear to reconfigure themselves to rely more heavily on fats as they perform their day-to-day functions. They can also be more sluggish when they attack. That’s the case for a class of cells called natural killers: “They still recognize cancer or a virally infected cell and go to it as something that needs to be killed,” Lynch told me. “But they lack the energy to actually kill it.” Timing, too, almost certainly has an effect. The immune defenses that help someone expunge a virus in the first few days of an infection might not be the ones that are ideal later on in the course of disease. Even starving out bacterial enemies isn’t a surefire strategy. 
A few years ago, Janelle Ayres, an immunologist at the Salk Institute for Biological Studies, and her colleagues found that when they infected mice with Salmonella and didn’t allow the rodents to eat, the hungry microbes in their guts began to spread outside of the intestines, likely in search of food. The migration ended up killing tons of their tiny mammal hosts. Mice that ate normally, meanwhile, fared far better—though the Salmonella inside of them also had an easier time transmitting to new hosts. The microbes, too, were responding to the metabolic milieu, and trying to adapt. “It would be great if it was as simple as ‘If you have a bacterial infection, reduce glucose,’” Ayres said. “But I think we just don’t know.” All of this leaves immunometabolism in a somewhat chaotic state. “We don’t have simple recommendations” on how to eat your way to better immunity, Medzhitov told me. And any that eventually emerge will likely have to be tempered by caveats: Factors such as age, sex, infection and vaccination history, underlying medical conditions, and more can all alter people’s immunometabolic needs. After Medzhitov’s 2016 study on glucose and viral infections was published, he recalls being dismayed by a piece from a foreign outlet circulating online claiming that “a scientist from the USA says that during flu, you should eat candy,” he told me with a sigh. “That was bad.” [Read: You can’t “starve” cancer, but you might help treat it with food] But considering how chaotic, individualistic, and messy nutrition is for humans, it shouldn’t be a surprise that the dietary principles governing our individual cells can get pretty complicated too. For now, Medzhitov said, we may be able to follow our instincts. Our bodies, after all, have been navigating this mess for millennia, and have probably picked up some sense of what they need along the way. 
It may not be a coincidence that during viral infections, “something sweet like honey and tea can really feel good,” Medzhitov said. There may even be some immunological value in downing the sick-day classic, chicken soup: It’s chock-full of fluid and salts, helpful things to ingest when the body’s electrolyte balance has been thrown out of whack by disease. The science around sickness cravings is far from settled. Still, Sullivan, who trained with Medzhitov, jokes that she now feels better about indulging in Talenti mango sorbet when she’s feeling under the weather with something viral, thanks to her colleagues’ 2016 findings. Maybe the sugar helps her body battle the virus without harming itself; then again, maybe not. For now, she figures it can’t hurt to dig in.

If you’ve ever been to London, you know that navigating its wobbly grid, riddled with curves and dead-end streets, requires impressive spatial memory. Driving around London is so demanding, in fact, that in 2006 researchers found that it was linked with changes in the brains of the city’s cab drivers: Compared with Londoners who drove fixed routes, cabbies had a larger volume of gray matter in the hippocampus, a brain region crucial to forming spatial memory. The longer the cab driver’s tenure, the greater the effect. The study is a particularly evocative demonstration of neuroplasticity: the human brain’s innate ability to change in response to environmental input (in this case, the spatially demanding task of driving a cab all over London). That hard-won neuroplasticity required years of mental and physical practice. Wouldn’t it be nice to get the same effects without so much work? To hear some people tell it, you can: Psychedelic drugs such as psilocybin, LSD, ayahuasca, and Ecstasy, along with anesthetics such as ketamine, can enhance a user’s neuroplasticity within hours of administration.
In fact, some users take psychedelics for the express purpose of making their brain a little more malleable. Just drop some acid, the thinking goes, and your brain will rewire itself—you’ll be smarter, fitter, more creative, and self-aware. You might even get a transcendent experience. Popular media abound with anecdotes suggesting that microdosing LSD or psilocybin can expand divergent thinking, a more free and associative type of thinking that some psychologists link with creativity. [Read: Here’s what happens when a few dozen people take small doses of psychedelics] Research suggests that psychedelic-induced neuroplasticity can indeed enhance specific types of learning, particularly in terms of overcoming fear and anxiety associated with past trauma. But claims about the transformative, brain-enhancing effects of psychedelics are, for the most part, overstated. We don’t really know yet how much microdosing, or a full-blown trip, will change the average person’s mental circuitry. And there’s reason to suspect that, for some people, such changes may be actively harmful. There is nothing new about the notion that the human and animal brain are pliant in response to everyday experience and injury. The philosopher and psychologist William James is said to have first used the term plasticity back in 1890 to describe changes in neural pathways that are linked to the formation of habits. Now we understand that these changes take place not only between neurons but also within them: Individual cells are capable of sprouting new connections and reorganizing in response to all kinds of experiences. Essentially, this is a neural response to learning, which psychedelics can rev up. We also understand how potent psychedelic drugs can be in inducing changes to the brain. Injecting psilocybin into a mouse can stimulate neurons in the frontal cortex to grow by about 10 percent and sprout new spines, projections that foster connections to other neurons. 
The injections also alleviated the animals’ stress-related behaviors—effects that persisted for more than a month, indicating enduring structural change linked with learning. Presumably, a similar effect takes place in humans. (Comparable studies on humans would be impossible to conduct, because investigating changes in a single neuron would require, well, sacrificing the subject.) The thing is, all those changes aren’t necessarily all good. Neuroplasticity just means that your brain—and your mind—is put into a state where it is more easily influenced. The effect is a bit like putting a glass vase back into the kiln, which makes it pliable and easy to reshape. Of course you can make the vase more functional and beautiful, but you might also turn it into a mess. Above all else, psychedelics make us exquisitely impressionable, thanks to their speed of action and magnitude of effect, though their ultimate impact is still heavily dependent on context and influence. [Read: A new chapter in the science of psychedelic microdosing] We have all experienced heightened neuroplasticity during the so-called sensitive periods of brain development, which typically unfold between the ages of 1 and 4, when the brain is uniquely responsive to environmental input. This helps explain why kids effortlessly learn all kinds of things, like how to ski or speak a new language. But even in childhood, you don’t acquire your knowledge and skills by magic; you have to do something in a stimulating enough environment to leverage this neuroplastic state. If you have the misfortune of being neglected or abused during your brain’s sensitive periods, the effects are likely to be adverse and enduring—probably more so than if the same events happened later in life. Being in a neuroplastic state enhances our ability to learn, but it might also burn in negative or traumatic experiences—or memories—if you happen to have them while taking a psychedelic.
Last year, a patient of mine, a woman in her early 50s, decided to try psilocybin with a friend. The experience was quite pleasurable until she started to recall memories of her emotionally abusive father, who had an alcohol addiction. In the weeks following her psilocybin exposure, she had vivid and painful recollections of her childhood, which precipitated an acute depression. Her experience might have been very different—perhaps even positive—if she’d had a guide or therapist with her while she was tripping to help her reappraise these memories and make them less toxic. But without a mediating positive influence, she was left to the mercy of her imagination. This must have been just the sort of situation legislators in Oregon had in mind last month when they legalized recreational psilocybin use, but only in conjunction with a licensed guide. It’s the right idea. [Read: What it’s like to trip on the most potent magic mushroom] In truth, researchers and clinicians haven’t a clue whether people who microdose frequently with psychedelics—and are thus walking around in a state of enhanced neuroplasticity—are more vulnerable to the encoding of traumatic events. In order to find out, you would have to compare a group of people who microdose against a group of people who don’t over a period of time and see, for example, if they differ in rates of PTSD. Crucially, you’d have to randomly assign people to either microdose or abstain—not simply let them pick whether they want to try tripping. In the absence of such a study, we are all currently involved in a large, uncontrolled social experiment. The results will inevitably be messy and inconclusive. Even if opening your brain to change were all to the good, the promise of neuroplasticity without limit—that you can rejuvenate and remodel the brain at any age—far exceeds scientific evidence. Despite claims to the contrary, each of us has an upper limit to how malleable we can make our brain. 
The sensitive periods, when we hit our maximum plasticity, are finite windows of opportunity that slam shut as the brain matures. We progressively lose neuroplasticity as we age. Of course we can continue to learn—it just takes more effort than when we were young. Part of this change is structural: At 75, your hippocampus contains neurons that are a lot less connected to one another than they were at 25. That’s one of the major reasons older people find that their memory is not as sharp as it used to be. You may enhance those connections slightly with a dose of psilocybin, but you simply can’t make your brain behave as if it’s five decades younger. [Read: What it’s like to get worse at something] This reality has never stopped a highly profitable industry from catering to people’s anxieties and hopes—especially seniors’. You don’t have to search long online before you find all kinds of supplements claiming to keep your brain young and sharp. Brain-training programs go even further, purporting to rewire your brain and boost your cognition (sound familiar?), when in reality the benefits are very modest, and limited to whatever cognitive task you’ve practiced. Memorizing a string of numbers will make you better at memorizing numbers; it won’t transfer to another skill and make you better at, say, chess. We lose neuroplasticity as we age for good reason. To retain our experience, we don’t want our brain to rewire itself too much. Yes, we lose cognitive fluidity along the way, but we gain knowledge too. That’s not a bad trade-off. After all, it’s probably more valuable to an adult to be able to use all of their accumulated knowledge than to be able to solve a novel mathematical problem or learn a new skill. More important, our very identity is encoded in our neural architecture—something we wouldn’t want to tinker with lightly.
At their best, psychedelics and other neuroplasticity-enhancing drugs can do some wonderful things, such as speed up the treatment of depression, quell anxiety in terminally ill patients, and alleviate the worst symptoms of PTSD. That’s enough reason to research their uses and let patients know psychedelics are an option for psychiatric treatment when the evidence supports it. But limitless drug-induced self-enhancement is simply an illusion.