As Republicans across the U.S. intensify their efforts to legislate against transgender rights, they are finding aid and comfort in an unlikely place: Western Europe, where governments and medical authorities in at least five countries that once led the way on gender-affirming treatments for children and adolescents are now reversing course, arguing that the science undergirding these treatments is unproven, and their benefits unclear. The about-face by these countries concerns the so-called Dutch protocol, which has for at least a decade been viewed by many clinicians as the gold-standard approach to care for children and teenagers with gender dysphoria. Kids on the protocol are given medical and mental-health assessments; some go on to take medicines that block their natural puberty and, when they’re older, receive cross-sex hormones and eventually surgery. But in Finland, Sweden, France, Norway, and the U.K., scientists and public-health officials are warning that, for some young people, these interventions may do more harm than good. European health authorities are not reversing themselves on broader issues of trans rights, particularly for adults. But this turn against the Dutch protocol has inflamed activists and politicians in the United States. Republicans who have worked to ban its recommended treatments claim that the shifts in Europe prove they’re right. Their opponents argue that any doubts at all about the protocol, raised in any country whatsoever, are simply out of step with settled science: They point to broad endorsements by the American Medical Association, the American Psychiatric Association, and the American Academy of Pediatrics, among other groups; and they assert that when it comes to the lifesaving nature of gender-affirming care, “doctors agree.” But doctors do not agree, particularly in Europe, where no treatments have been banned but a genuine debate is unfurling in this field. 
In Finland, for example, new treatment guidelines put out in 2020 advised against the use of puberty-blocking drugs and other medical interventions as a first line of care for teens with adolescent-onset dysphoria. Sweden’s National Board of Health and Welfare followed suit in 2022, announcing that such treatments should be given only under exceptional circumstances or in a research context. Shortly after that, the National Academy of Medicine in France recommended “the greatest caution” (la plus grande réserve) in the use of puberty blockers. Just last month, a national investigatory board in Norway expressed concerns about the treatment. And the U.K.’s only national gender clinic for children, the Tavistock, has been ordered to close its doors after a government-commissioned report found, among other problems, that its Dutch-protocol-based approach to treatment lacked sufficient evidence. These changes in Europe have so far been fairly localized: Health authorities in many countries on the continent—among them Austria, Denmark, Germany, Italy, and Spain—have neither subjected the Dutch approach to formal scrutiny nor advised against its use. Yet questions about the protocol seem to be spreading. At the end of March, for example, a Belgian TV report described a 42-fold increase in patients at a leading gender clinic in Ghent and raised questions about the right approach to care. Doubts about the protocol have even come to the country that invented it, at the Center of Expertise on Gender Dysphoria in Amsterdam. “Until I began noticing the developments in other EU countries and started reading the scientific literature myself, I too thought that the Dutch gender care was very careful and evidence-based,” Jilles Smids, a postdoctoral researcher in medical ethics at Erasmus University in the Netherlands, told me via email. 
“But now I don’t think that any more.” Kirsten Visser, a Netherlands-based advocate and consultant for parents of trans teens, says her own son, Sietse, started receiving “definitely lifesaving” care at the Amsterdam center in 2012, at the age of 11. Around the time that Sietse showed up at the clinic, the Dutch protocol was becoming established internationally, largely through the work of a child and adolescent psychiatrist there named Annelou de Vries. After completing a Ph.D. on gender dysphoria in Dutch adolescents, de Vries published two seminal papers with the clinical psychologist Peggy Cohen-Kettenis and other colleagues, in 2011 and 2014. The former looked at the psychological effects of puberty suppression on 70 young people over a period of two years, on average; the latter tracked outcomes for 55 of those people who had gone on to receive gender-reassignment surgery, over an average of six years. Taken together, the studies found that the teens showed fewer symptoms of depression after having their puberty suppressed, as well as a decrease in behavioral and emotional problems; and that the ones who went on to take gender-affirming hormones and have surgery grew into “well-functioning young adults.” De Vries’s expertise has since been widely recognized within the field: She served as a co-lead on the revision of the adolescent section of care guidelines recently published by the World Professional Association for Transgender Health, and is now president-elect of the European equivalent, EPATH. But in the years after her two studies were released, research done in other European countries led to concerns about their relevance. In 2015, for example, Finnish researchers described a phenomenon that “called for clinical attention,” as they put it: More children were reporting gender dysphoria, and a greater proportion of them had been assigned female at birth. 
The fact that three-quarters of those Finnish teens had been diagnosed with separate and severe psychiatric conditions appeared to be at odds with the data from the Netherlands, the paper argued. The Dutch studies had found that just one-third of adolescents with gender dysphoria experienced other psychiatric issues, suggesting they were in far better mental health. In Sweden, too, clinicians grew alarmed by the sudden increase in the number of teenagers seeking gender care. Mikael Landén, a professor of psychiatry at the University of Gothenburg, told me that this population has increased 17-fold since 2010. One explanation for that change—that more open-minded attitudes around gender have emboldened kids to seek the help they need—just doesn’t ring true to him. He’d studied those views in his early work, he said, and found that, on the whole, Swedish attitudes toward transgender people have been very positive for a long time. When the government asked Landén and a group of other scientists to write an evidence-based review of hormone-based treatments for young people, their verdict, after two years of study, was expressed definitively: The original research findings from de Vries were outdated, and do not necessarily apply to the group of teens who have been coming forward in more recent years. The Dutch protocol had been “a valuable contribution,” he told me, and “it was reasonable to start using it” in Sweden. But times had changed, and so had the research literature. In 2021, for instance, a team based at the U.K.’s Tavistock clinic published research showing no detectable improvements in the mental health of youngsters who had been put on puberty blockers and followed for up to three years. [Read: The war on trans kids is totally unconstitutional] De Vries acknowledged some concerns about the research when we spoke in February. 
“Our early outcomes studies were really from another time and comprised small samples,” she told me, and they looked only at trans youth who had experienced gender dysphoria from childhood. She granted that there is some research to suggest that kids who don’t arrive at the clinic until they’re older are worse off, psychologically, than their younger peers; but she also said her team has run studies including 16-year-olds, and that their findings were “not worrisome.” She agrees that other researchers have not replicated the long-term follow-up research on kids who went through the Dutch protocol, but she pointed out that the short-term benefits of such treatment have indeed been seen in other studies. Research conducted in the U.S., and published earlier this year, found that a group of 315 trans and nonbinary youth were on average less depressed and anxious, and better-functioning, after two years of hormonal treatment. In the meantime, de Vries and her colleagues have urged clinicians in other countries to do more of their own investigation, in part because the youngsters who receive care at gender clinics in the Netherlands seem to be in comparatively good mental health from the get-go. It’s not yet clear, she told me, that studies of this group will be applicable to youth in other countries. “Every doctor or psychologist who is involved in transgender care should feel the obligation to do a good pre- and post-test,” one of de Vries’s co-authors on the 2011 and 2014 studies said to a Dutch newspaper in 2021. “The rest of the world is blindly adopting our research.” De Vries is now working on a research project, funded by an $864,000 grant, that will try to answer newly forming doubts about the Dutch protocol. Her proposal for the grant, filed in 2021, described its subject as a “once so welcomed but now sharp[ly] criticized approach.” That such criticisms are becoming mainstream even in her own country is itself a startling development. 
After all, the Netherlands has long been at the vanguard of progressive health-care practices. When the Dutch approach to transgender care for adults first started taking shape during the 1970s (many years before the protocol for kids would be established), the country’s politics were dominated by a steadfast opposition to taboos. James Kennedy, an American-born professor of modern Dutch history at Utrecht University, has described this as the country’s “compassionate culture”: In a radical departure from its traditional Christian conservatism, long-standing policies were being spurned; and even touchy subjects such as death and sex were made the subject of broad public-policy debates. Sex work, for example, was widely tolerated, then legalized in 2000. Similarly, the Royal Dutch Medical Association offered formal guidelines for the practice of euthanasia in the 1980s, and a corresponding national law—one of the world’s first—codified the rules in 2002. Against this backdrop of openness, in which doctors were seen as authoritative figures who were well equipped to decide what was best for their patients, one of the first dedicated clinics for transgender people was established in Amsterdam in 1972. It offered an array of services—blood tests, hormone therapy, and surgeries—to trans adults. According to a recent book by the historian Alex Bakker, Dutch surgeons, some of them inspired by their Christian beliefs, developed techniques that would reduce patients’ psychological suffering. “Helping those in need trumped ‘taboos’ about the sanctity of life or fixed gender roles,” Kennedy told me. The Dutch protocol for treating gender dysphoria in children, as established in the 1990s, reflected a further extension of this philosophy, aiming to smooth adult transitions by intervening early. 
[Read: Take detransitioners seriously] Nevertheless, in December, a journalist named Jan Kuitenbrouwer and a sociologist named Peter Vasterman published an opinion piece in a leading daily newspaper, NRC, that took aim at the Dutch protocol and its “shaky” scientific foundations, and alluded to the international scrutiny of the past few years. “It is remarkable that the media in our neighboring countries report extensively on this reconsideration,” the article said, “but the Dutch hardly ever do.” Like critics elsewhere, Kuitenbrouwer and Vasterman pointed to the rising numbers of children seeking care, from 60 to 1,600 in the Netherlands across a dozen years, and the unaccounted rise in those assigned female at birth; and they suggested that this new generation of people seeking treatment is not analogous to those included in the studies conducted by de Vries a decade ago. De Vries and some colleagues countered that their more recent research addresses this concern. “Scientific evaluation has always been an integral part of this challenging model of care, where young people make early decisions about medical interventions with lifelong implications,” they wrote in the same newspaper. Also in December, a clinical psychologist at Radboud University’s gender clinic in Nijmegen named Chris Verhaak told a different Dutch outlet that puberty blockers affect children’s bones, and maybe also their brain development. “It is not nothing,” she said. Verhaak is currently running a government-funded study to understand the source and nature of the increase in the number of patients. (Results are due to be presented to the Dutch House of Representatives this year.) In another interview that month, she said that for up to half of cases, the gains in suppressing puberty are not clear. “I worry about that,” she told the newsweekly De Groene Amsterdammer. 
“Especially because we also experience enormous pressure to provide these puberty inhibitors as quickly as possible.” Verhaak’s comments in particular sparked dismay among trans groups, which saw them as promoting destructive narratives about social contagion. Verhaak and her direct collaborators say that they are no longer speaking to the media until their study is released, but Hedi Claahsen, a professor and principal clinician on the Radboud center’s gender team, told me that practitioners are cautious and follow national guidelines. When I asked if her center’s approach differed from the one used in Amsterdam, she told me, “No clinic is exactly the same.” Individual providers, who are working at different institutions, may end up providing care that reflects “a different vision.” Another, more significant round of criticism arrived at the end of February, when another widely read Dutch newspaper, de Volkskrant, published a 5,000-word article under a headline reading: “The treatment of transgender youth in the Netherlands was praised. Now the criticism of ‘the Dutch approach’ is growing.” The authors spoke with Iris, a 22-year-old woman who spent five years on testosterone and had a double mastectomy that she now regrets; they pointed to a new population of kids assigned female at birth seeking care only in their teens; and they noted reservations about the protocol in Finland and Sweden. “Is the ‘Dutch approach’ still the way to go?” the story asked. The article prompted debate on Twitter, where Michiel Verkoulen, a health economist working with the government of the Netherlands to address the long-standing problem of ever-expanding waiting lists and their impact on young people’s mental health, accused the Dutch protocol’s critics of ignoring what he described as the elephant in the room. “What to do with the people for whom transgender care is critical?” he asked. 
“You can put every research aside, keep asking for more, and argue that diagnostics and treatments should be stricter … But the question remains: What then?” “In the Netherlands there are more and more people saying that gender diversity is woke and it’s nonsense and it’s bullshit,” Visser, the consultant for parents of trans teens, told me. Sam van den Berg, a spokesperson for an Utrecht-based trans-rights organization called Transvisie, argued that this debate does not need to happen. The quality of care for children with gender dysphoria is better in the Netherlands than almost anywhere else, she said. “We don’t feel it’s necessary to change anything.” Indeed, doctors in the Netherlands are still free to provide gender-affirming care as they see fit. The same is true of their colleagues in Finland, Sweden, France, Norway, and the U.K., where new official guidelines and recommendations are not binding. No legal prohibitions have been put in place in Europe, as they have been in more than a dozen U.S. states, where physicians risk losing their medical license or facing criminal sanctions for prescribing certain forms of gender-affirming care. But the trend toward more conservative application of the Dutch protocol is likely to have real effects in European countries, in terms of which kids get treatment, and of what kind. Louise Frisén, an associate professor at Karolinska Institute and a pediatric psychiatrist at the child and adolescent mental-health clinic in Stockholm, Sweden, told me she worries that under her country’s new guidelines, many of her teenage patients will find it harder to access medical care. The benefits of treatment are clear, she said, and she further claimed that the policy change has caused anguish for some patients who are panicking at the looming prospect of puberty. 
As for de Vries, when I spoke with her a few weeks before the article in de Volkskrant was published, she agreed that clinicians should be cautious, but not to the point where treatment becomes inaccessible. Outcomes for those with later-onset dysphoria do need to be investigated further, she acknowledged, but “if we are going to wait ’til the highest-standard medical evidence provides us the answers, we will have to stop altogether.” In that sense, Europe’s brewing disagreement over treatment could turn into paralysis. “That’s what worries me,” she said. “You will always have to work with uncertainties in this field.”
Imagine you need to send a letter. The mailbox is only two blocks away, but the task feels insurmountable. Air hunger seizes you whenever you walk, you’re plagued by dizziness and headaches, and anyway, you keep blanking on your zip code for the return address. So you sit in the kitchen, disheartened by the letter you can’t send, the deadlines you’ve missed, the commitments you’ve canceled. Months have passed since you got COVID. Weren’t you supposed to feel better by now? Long COVID is a diverse and confusing condition, a new disease with an unclear prognosis, often-fluctuating symptoms, and a definition people still can’t agree on. And in many cases, it is disabling. In a recent survey, 1.6 percent of American adults said post-COVID symptoms limit their daily activities “a lot.” That degree of upheaval aligns with the Americans With Disabilities Act’s definition of disability: “a physical or mental impairment that substantially limits one or more major life activities.” But for many people experiencing long COVID who were able-bodied before, describing themselves as “disabled” is proving to be a complicated decision. This country is not kind to disabled people: American culture and institutions tend to operate on the belief that a person’s worth derives from their productivity and physical or cognitive abilities. That ableism was particularly stark in the early months of the pandemic, when some states explicitly de-prioritized certain groups of disabled people for ventilators. Despite the passage of the ADA in 1990, disabled people still confront barriers accessing things such as jobs and health care, and even a meal with friends at a restaurant. Most of our cultural narratives cast disability as either a tribulation to overcome or a tragedy. Consequently, incorporating disability into your identity can require a lot of reflection. 
Lizzie Jones, who finished her doctoral research in disability studies last year and now works for an educational consultancy, suffered a 30-foot fall that shattered half of her body a week before her college graduation. She told me that her accident prompted “radical identity shifts” as she transitioned from trying to get the life she’d imagined back on track to envisioning a new one. These are the sorts of mindset changes that Ibrahim Rashid struggled with after contracting COVID in November 2020, when he was a graduate student. He dealt with debilitating symptoms for months, but even after applying for disability accommodations to finish his degree, he “was so scared of that word,” he told me. Rashid was afraid of people treating him differently and of losing his internship offer. Most terrifying, calling himself disabled felt like an admission that his long COVID wasn’t going to suddenly resolve. [Jennifer Senior: What not to ask me about my long COVID] Aaron Teasdale, an outdoors and travel writer and a mountaineer, has also been wrestling with identity questions since he got COVID in January 2022. For months, he spent most of his time in a remote-controlled bed, gazing out the window at the Montana forests he once skied. Although his fatigue is now slowly improving, he had to take Ritalin to speak with me. He was still figuring out what being disabled meant to him, whether it simply described his current condition or reflected some new, deeper part of himself—a reckoning made more difficult by the unknowability of his prognosis. “Maybe I just need more time before I say I’m a disabled person,” he said. “When you have your greatest passions completely taken away from you, it does leave you questioning, Well, who am I?” Long COVID can wax and wane, leaving people scrambling to adapt. It doesn’t mesh with the stereotype of disability as static, visible, and binary—the wheelchair user cast in opposition to the pedestrian. 
Nor does the fact that long COVID is often imperceptible in casual interactions, which forces long-haulers to contend with disclosure and the possibility of passing as able-bodied. One such long-hauler is Julia Moore Vogel, a program director at Scripps Research, who initially hesitated at the idea of getting a disabled-parking permit. “My first thought was, I’m not disabled, because I can walk,” she told me. But if she did walk, she’d be drained for days. Taking her daughter to the zoo or the beach was out of the question. Once she got over her apprehension, identifying as disabled ended up feeling empowering. Getting that permit was “one of the best things I’ve done for myself,” Vogel told me. She could drive her kid to the playground, park nearby, and then sit and watch her play. After plenty of therapy and conversations with other disabled people, Rashid, too, came to embrace disability as part of his identity, so much so that he now speaks and writes about chronic illness. [Read: The future of long COVID] Usually, the community around a disease—including advocacy among those it disables—arises after scientists name it. Long COVID upended that order, because the term first spread through hashtags and support groups in 2020. Instead of doctors informing patients of whether their symptoms fit a certain illness, patients were telling doctors what symptoms their illness entailed. And there were a lot of symptoms: everything from life-altering neurocognitive problems and dizziness to a mild, persistent cough. As long-COVID networks blossomed online, members began seeking support from wider disability-rights communities, and contributing fresh energy and resources to those groups. People who’d fought similar battles for decades sometimes bristled at the greater political capital afforded to long-haulers, whose advocacy didn’t universally extend to other disabled people; for the most part, though, long-haulers were welcomed. 
Tapping into conversations among disabled people “has shown me that I’m simply not alone,” Eris Eady, a writer and an artist who works for Planned Parenthood, told me. Eady, who is queer and Black, found that long COVID interplayed with struggles they already faced on account of their identity. So they sought advice from disabled Black women about interdependence, mutual aid, and accessibility, as well as about being dismissed by doctors, an experience more prevalent among women and people of color. Disabled communities have years of experience supporting people through identity changes. The writer and disability-justice organizer Leah Lakshmi Piepzna-Samarasinha told me that when she was newly disabled, she was dogged with heavy questions: Am I going to be able to make a living? Am I datable? Her isolation and fear dissipated only when she met other young disabled people, who taught her how to be creative in “hacking the world.” For long-haulers navigating these transitions for the first time, the process can be rocky. Rachel Robles, a contributor to The Long COVID Survival Guide, told me she spent her early months with long COVID “waking up every day and thinking, Okay, is this the day it’s left my body?” Conceiving of herself as disabled didn’t take away her long COVID. She didn’t stop seeing doctors and trying treatments. But thinking about accessibility did inspire her to return to gymnastics, which she’d quit decades earlier because of a heart condition. If she couldn’t lift her hands over her head sometimes, and if a dive roll would never be in her future, then so be it: Gymnastics could be about enjoying what her body could do, not yearning for what it couldn’t. Before she identified as disabled, returning to gymnastics “was something I would have never, ever imagined,” Robles said. And she never would have done it had she remained focused only on when she might recover. 
[Read: Long COVID is being erased—again] Hoping for improvement is a natural response to illness, especially one with a trajectory as uncertain as long COVID’s. But focusing exclusively on relinquished past identities or unrealized future ones can dampen our curiosity about the present. A better way to think about it is “What are the things you can do with the body that you have, and what are the things you might not know you can do yet?” Piepzna-Samarasinha said. “Who am I right now?”
The Ozempic craze shows no signs of slowing. Demand for the drug, popularly used for weight loss, is so monumental that it is already changing the diet industry and spurring a “marketing bonanza” among the dozens of telehealth start-ups that now prescribe it. A highly public ad campaign from one start-up, Ro, banks on the drug’s simple premise: “A weekly shot to lose weight.” Never before has a weight-loss treatment been hyped this way and been able to deliver on its promise. Ozempic itself is technically a diabetes drug, but its active ingredient, semaglutide, has been approved by the FDA for weight loss under the brand name Wegovy, and can reduce a person’s body weight by up to 20 percent through a weekly injection. An even more powerful drug, known as tirzepatide, or Mounjaro, may soon be approved for weight loss, and a host of new medications are coming down the pipeline. All signs suggest that America is on the verge of a weight-loss revolution. But for people with obesity, semaglutide isn’t even the most effective weight-loss treatment around—not even close. Bariatric surgery, which has existed for many decades, is still significantly more potent. This class of procedures, which, broadly speaking, reconfigure the digestive system so people feel less hungry and more full, is considered to be the “gold standard” for treating obesity, Holly Lofton, an obesity-medicine physician at NYU, told me. 
Most people experience weight loss of 50 percent and, with one procedure, up to 80 percent, according to the Cleveland Clinic. Despite the impressive abilities of the new crop of weight-loss drugs—and bold assertions that such drugs could someday replace surgery outright—several doctors told me that surgery will likely continue to be the top-line treatment for obesity, even as the medications improve. People may seek out treatment with the new drugs because they’re so popular, but “long term, there will be an increase in surgery,” Shauna Levy, a professor specializing in bariatric surgery at Tulane University School of Medicine, told me. The new drugs, however potent, may be less a revolutionary fix for obesity and more a powerful tool for treating it—one of many that already exist. Unlike semaglutide, bariatric surgery, first introduced in the 1950s, took several decades to become accepted by the medical community. Initial attempts made people so sick that, at times, the surgery had to be reversed. The term bariatric surgery refers to several different procedures that reshape the gastrointestinal tract so that it absorbs fewer nutrients, holds less food, or both. These days, the most commonly performed surgery is called a Roux-en-Y, which shrinks the stomach to the size of a walnut—so people need less food to feel satisfied—and then reconnects it to the small intestine in a Y shape, rather than linearly. This gastric bypass lets food circumvent most of the stomach, leaving fewer opportunities for the body to harvest nutrients. In another common procedure, surgeons sculpt the stomach into a banana-size “sleeve” and toss the rest; another common type involves rerouting the intestines in a way that minimizes the area where calories can be absorbed. But bariatric surgery does more than shrink gastrointestinal real estate. It exerts a less visible but equally powerful effect on the many different hormones that control hunger. 
Some procedures remove the part of the gut that produces the “hunger hormone,” ghrelin, while the rerouting of food through a Roux-en-Y ramps up the release of “incretin” hormones that create the feeling of fullness after eating. In a sense, the new weight-loss drugs are essentially trying to re-create the effects of bariatric surgery: The success of these drugs is due to their ability to mimic the incretin hormones and get people to feel satisfied with less food. Semaglutide masquerades as the hormone GLP-1, whereas Mounjaro poses as both GLP-1 and GIP. But these are just two hormones; bariatric surgery “touches on multiple different hormones and different pathways” and, as such, is “more comprehensive,” Levy said. In one study, Mounjaro, considered the most powerful of the current crop of medications, led to 20 percent or more weight loss in 57 percent of people who took the highest dose—an impressive feat, but still a far cry from what is possible with surgery. Similarly, Ozempic and Mounjaro, both technically diabetes drugs, have powerful effects on blood-sugar levels over time, but many surgery patients “leave the hospital already in remission from their diabetes,” Levy said. In addition to sheer potency, surgery is also much more affordable than these weight-loss drugs. Unlike the drugs, bariatric surgery is covered by Medicare if the patient meets certain criteria, including having a BMI equal to or greater than 35 and at least one comorbidity related to obesity. Many private insurers cover it too, albeit to varying degrees. Out of pocket, surgery costs $15,000 to $25,000—not cheap, but still cheaper than shelling out more than $1,000 a month indefinitely. “The patient must understand that they have to continue taking medication forever,” Lofton said. People who stop taking semaglutide generally regain the weight they lost. 
Lofton told me about one patient who had to forgo rent just to pay for the drugs: Factoring in insurance, “you can pay for three months of medicine and then have surgery at the same price.” Neither treatment, of course, is without its potential downsides. Semaglutide can cause temporary but nasty side effects such as nausea, vomiting, and diarrhea—and though it is considered safe for treating obesity, long-term data on this usage span just two years. Because many surgeries are done laparoscopically, using only tiny incisions, mortality is vanishingly low, and many patients go home after two or three days; full recovery usually takes four to six weeks. In the long term, complications such as hernias, gallstones, and low blood sugar can develop. But there’s a reason bariatric surgery has not led to a weight-loss revolution of the kind that now gets associated with semaglutide. Despite its dramatic effects, and obesity’s prevalence across America, only 1 percent of people eligible for surgery actually get it. People hesitate for many reasons, medical and otherwise, but the most pervasive issue is a lack of awareness that surgery is even a safe or realistic option for weight loss. Bariatric surgery is plagued by stigma even within the medical community: In the 1990s, it was dismissed as a “barbaric” way to address an issue that, many believed, could be treated with diet and exercise. “There are a lot of primary-care doctors who are not talking enough about surgery” because they were trained with that old mindset, Levy said. It doesn’t help that bariatric surgery hasn’t exactly been a media sensation, with few high-profile patient advocates beyond Al Roker and Mariah Carey. In contrast, stories of celebrities on weight-loss drugs abound. Unlike surgery, semaglutide has the potential to be taken recreationally. The advantages that surgery has over weight-loss drugs may change as the drugs become more potent and eventually cheaper. 
But for now, semaglutide won’t dramatically shift the way obesity is treated, doctors told me—in fact, these new drugs may act as a conduit to surgery itself. Levy predicts that their sheer popularity will trigger a brief dip in the bariatric-surgery rate, but as price remains an issue, and people with obesity are unable to reach their weight-loss goals on the drugs alone, “they may start opening their mind to surgery.” Certainly, in some patients, weight-loss drugs alone could lead to lasting weight loss. And they can benefit those who are overweight but don’t qualify for surgery. But more widely, these drugs will likely be used in tandem with bariatric surgery to produce more dramatic, longer-lasting results, experts told me. “I don’t see this as an either/or,” Fatima Cody Stanford, an obesity-medicine physician at Massachusetts General Hospital and Harvard Medical School, told me. “I see it as surgery plus medicine.” Drugs can help fill in any gaps that surgery leaves behind. Weight can rebound after a procedure, because the body has a way of rebalancing itself; hormones that were tamped down due to bariatric surgery, Stanford said, can “start to reemerge with a vengeance.” About a fifth of people, and perhaps even more, regain a significant amount of weight—15 percent or more—two to five years after surgery. All of the doctors I spoke with said that medication could be a powerful tool to prevent post-surgery weight rebounds—though to keep that weight off, the medication would still have to be taken in perpetuity. Stanford estimated that more than 90 percent of her patients are on weight-loss drugs after surgery—and not necessarily semaglutide; older medications often suffice. Drugs could also be used to help people prepare for surgery, Lofton said. Some doctors encourage patients to lose weight beforehand to decrease the risk of complications such as blood clots, heart attack, and infection. 
Despite the hype, weight-loss drugs were never a perfect treatment for obesity. Neither is bariatric surgery, for that matter. “It is not a cure,” Lofton told me. A cure, she explained, would ensure that hunger doesn’t return and that fat cells don’t get bigger, a hallmark of obesity: “We have nothing that does that”—not even more potent next-gen drugs will provide a permanent fix. But the effect of combining surgery and medication could come close, she said. That no cure for obesity exists is evidence of its complexity. All of the experts I spoke with pointed out that obesity has long been misunderstood as a failure of personal will, as laziness or gluttony. That misunderstanding has led to inadequate care: Many people who regain weight after a bariatric procedure are made to feel by their doctors like they “wasted the surgery,” even if human biology is to blame, Stanford said. Ozempic and other weight-loss medications frame obesity as a condition that can be treated with drugs—in other words, a disease. Patients on those medications may realize, “Hey, maybe it’s not just me being lazy this whole time—maybe there is science to it and an actual disease here,” said Levy. A collective understanding of obesity as an illness that exists alongside heart disease and cancer—diseases routinely treated with medication and surgery—instead of as a matter of personal inadequacy will have far more profound impacts on people with obesity than any drug alone.
Charlie McCone has been struggling with the symptoms of long COVID since he was first infected, in March 2020. Most of the time, he is stuck on his couch or in his bed, unable to stand for more than 10 minutes without fatigue, shortness of breath, and other symptoms flaring up. But when I spoke with him on the phone, he seemed cogent and lively. “I can appear completely fine for two hours a day,” he said. No one sees him in the other 22.
He can leave the house to go to medical appointments, but normally struggles to walk around the block. He can work at his computer for an hour a day. “It’s hell, but I have no choice,” he said. Like many long-haulers, McCone is duct-taping himself together to live a life—and few see the tape. McCone knows 12 people in his pre-pandemic circles who now also have long COVID, most of whom confided in him only because “I’ve posted about this for three years, multiple times a week, on Instagram, and they’ve seen me as a resource,” he said. Some are unwilling to go public, because they fear the stigma and disbelief that have dogged long COVID. “People see very little benefit in talking about this condition publicly,” he told me. “They’ll try to hide it for as long as possible.” I’ve heard similar sentiments from many of the dozens of long-haulers I’ve talked with, and the hundreds more I’ve heard from, since first reporting on long COVID in June 2020. Almost every aspect of long COVID serves to mask its reality from public view. Its bewilderingly diverse symptoms are hard to see and measure. At its worst, it can leave people bed- or housebound, disconnected from the world. And although milder cases allow patients to appear normal on some days, they extract their price later, in private. For these reasons, many people don’t realize just how sick millions of Americans are—and the invisibility created by long COVID’s symptoms is being quickly compounded by our attitude toward them. Most Americans simply aren’t thinking about COVID with the same acuity they once did; the White House long ago zeroed in on hospitalizations and deaths as the measures to worry most about. 
And what was once outright denial of long COVID’s existence has morphed into something subtler: a creeping conviction, seeded by academics and journalists and now common on social media, that long COVID is less common and severe than it has been portrayed—a tragedy for a small group of very sick people, but not a cause for societal concern. This line of thinking points to the absence of disability claims, the inconsistency of biochemical signatures, and the relatively small proportion of severe cases as evidence that long COVID has been overblown. “There’s a shift from ‘Is it real?’ to ‘It is real, but …,’” Lekshmi Santhosh, the medical director of a long-COVID clinic at UC San Francisco, told me. As it stands, 11 percent of adults who’ve had COVID are currently experiencing symptoms that have lasted for at least three months, according to data collected by the Census Bureau and the CDC through the national Household Pulse Survey. That equates to more than 15 million long-haulers, or 6 percent of the U.S. adult population. And yet, “I run into people daily who say, ‘I don’t know anyone with long COVID,’” says Priya Duggal, an epidemiologist and a co-lead of the Johns Hopkins COVID Long Study. The implication is that the large survey numbers cannot be correct; given how many people have had COVID, we’d surely know if one in 10 of our contacts was persistently unwell. But many factors make that unlikely. Information about COVID’s acute symptoms was plastered across our public spaces, but there was never an equivalent emphasis that even mild infections can lead to lasting and mercurial symptoms; as such, some people who have long COVID don’t even know what they have. This may be especially true for the low-income, rural, and minority groups that have borne the greatest risks of infection. 
Lisa McCorkell, a long-hauler who is part of the Patient-Led Research Collaborative, recently attended a virtual meeting of Bay Area community leaders, and “when I described what it is, some people in the chat said, ‘I just realized I might have it.’” Admitting that you could have a life-altering and long-lasting condition, even to yourself, involves a seismic shift in identity, which some people are understandably loath to make. “Everyone I know got Omicron and got over it, so I really didn’t want to concede that I didn’t survive this successfully,” Jennifer Senior, a friend and fellow staff writer at The Atlantic, who has written about her experience with long COVID, told me. Duggal mentioned an acquaintance who, after a COVID reinfection, can no longer walk the quarter mile to pick her kids up from school, or cook them dinner. But she has turned down Duggal’s offer of an appointment; instead, she is moving across the country for a fresh start. “That is common: I won’t call it ‘long COVID’; I’ll just change everything in my life,” Duggal told me. People who accept the condition privately may still be silent about it publicly. “Disability is often a secret we keep,” Laura Mauldin, a sociologist who studies disability, told me. One in four Americans has a disability; one in 10 has diabetes; two in five have at least two chronic diseases. In a society where health issues are treated with intense privacy, these prevalence statistics, like the one-in-10 figure for long COVID, might also intuitively feel like overestimates. Some long-haulers are scared to disclose their condition. They might feel ashamed for still being sick, or wary about hearing from yet another loved one or medical professional that there’s nothing wrong with them. Many long-haulers worry that they’ll be perceived as weak or needy, that their friends will stop seeing them, or that employers will treat them unfairly. 
Such fears are well founded: A British survey of almost 1,000 long-haulers found that 63 percent experienced overt discrimination because of their illness at least “sometimes,” and 34 percent sometimes regretted telling people that they have long COVID. “So many people in my life have reached out and said, ‘I’m experiencing this,’ but they’re not telling the rest of our friends,” McCorkell said. Imagine that you interact with 50 people on a regular basis, all of whom got COVID. If 10 percent are long-haulers, that’s five people who are persistently sick. Some might not know what long COVID is or might be unwilling to confront it. The others might have every reason to hide their story. “Numbers like 10 percent are not going to naturally present themselves in front of you,” McCone told me. Instead, “you’ll hear from 45 people that they are completely fine.” The same factors that stop people from being public about their condition—ignorance, denial, or concerns about stigma—also make them less likely to file for disability benefits. And that process is, to put it mildly, not easy. Applicants need thorough medical documentation; many long-haulers struggle to find doctors who believe their symptoms are real. Even with the right documents, applicants must hack their way through bureaucratic overgrowth, likely while fighting fatigue or brain fog. For these reasons, attempting to measure long COVID through disability claims is a profoundly flawed exercise. Even if people manage to apply, they face an average wait time of seven months and a two-in-three denial rate. McCone took six weeks to put an application together, and, despite having a lawyer and extensive medical documentation, was denied after one day. 
McCorkell knows many first-wavers—people who’ve had long COVID since March 2020—“who are just getting their approvals now.” An alternative source of data comes from the Census Bureau’s Current Population Survey, which simply asks working-age Americans if they have any of six forms of disability. Using that data, Richard Deitz, an economics-research adviser at the Federal Reserve Bank of New York, calculated that about 1.7 million more people now say they do than in mid-2020, reversing a years-long decline. These numbers are lower than expected if one in 10 people who gets COVID really does become a long-hauler, but the survey doesn’t directly capture many of the condition’s most common symptoms, such as fatigue, neurological problems beyond brain fog, and post-exertional malaise, where a patient’s symptoms get dramatically worse after physical or mental exertion. About 900,000 of the newly disabled people are also still working. David Putrino, who leads a long-COVID rehabilitation clinic at Mount Sinai, told me that many of his patients are refused the accommodations required under the Americans With Disabilities Act. Their employers won’t allow them to work remotely or reduce their hours, because, he said, “you look at them and don’t see an obvious disability.” Long COVID can also seem bafflingly invisible when people look at it with the wrong tools. For example, a 2022 study by National Institutes of Health researchers compared 104 long-haulers with 85 short-term COVID patients and 120 healthy people and found no differences in measures of heart or lung capacities, cognitive tests, or levels of common biomarkers—bloodstream chemicals that might indicate health problems.
This study has been repeatedly used as evidence that long COVID might be fictitious or psychosomatic, but in an accompanying editorial, Aluko Hope, the medical director of Oregon Health and Science University’s long-COVID program, noted that the study exactly mirrors what long-haulers commonly experience: They undergo extensive testing that turns up little and are told, “Everything is normal and nothing is wrong.” The better explanation, Putrino told me, is that “cookie-cutter testing” doesn’t work—a problem that long COVID shares with other neglected complex illnesses, such as myalgic encephalomyelitis/chronic-fatigue syndrome and dysautonomia. For example, the NIH study didn’t consider post-exertional malaise, a cardinal symptom of both ME/CFS and long COVID; measuring it requires performing cardiopulmonary tests on two successive days. Most long-haulers also show spiking heart rates when asked to simply stand against a wall for 10 minutes—a sign of problems with their autonomic nervous system. “These things are there if you know where to look,” Putrino told me. “You need to listen to your patients, hear where the virus is affecting them, and test accordingly.” Contrary to popular belief, researchers have learned a huge amount about the biochemical basis of long COVID, and have identified several potential biomarkers for the disease. But because long COVID is likely a cluster of overlapping conditions, there might never be a singular blood test that “will tell you if you have long COVID 100 percent of the time,” Putrino said. The best way to grasp the scale of the condition, then, is still to ask people about their symptoms. Large attempts to do this have been relatively consistent in their findings: The U.S. Household Pulse Survey estimates that one in 10 people who’ve had COVID currently have long COVID; a large Dutch study put that figure at one in eight.
The former study also estimated that 6 percent of American adults are long-haulers; a similar British survey by the Office for National Statistics estimated that 3 percent of the general population is. These cases vary widely in severity, and about one in five long-haulers is barely affected by their symptoms—but the remaining majority very much is. Another one in four long-haulers (or 4 million Americans) has symptoms that severely limit their daily activities. The others might, at best, wake every day feeling as if they haven’t had any rest, or feel trapped in an endless hangover. They might work or socialize when their tidal symptoms ebb, but only by making big compromises: “If I work a full day, I can’t also then make dinner or parent without significant suffering,” JD Davids, who has both long COVID and ME/CFS, told me. Some people do recover. A widely cited Israeli study of 1.9 million people used electronic medical records to show that most lingering COVID symptoms “are resolved within a year from diagnosis,” but such data fail to capture the many long-haulers who give up on the medical system precisely because they aren’t getting better or are done with being disbelieved. Other studies that track groups of long-haulers over time have found less rosy results. A French one found that 85 percent of people who had symptoms two months after their infection were still symptomatic after a year. A Scottish team found that 42 percent of its patients had only partially recovered at 18 months, and 6 percent had not recovered at all. The United Kingdom’s national survey shows that 69 percent of people with long COVID have been dealing with symptoms for at least a year, and 41 percent for at least two. The most recent data from the U.S. and the U.K. show that the total number of long-haulers has decreased over the past six months, which certainly suggests that people recover in appreciable numbers. 
But there’s a catch: In the U.K., the number of people who have been sick for more than a year, or who are severely limited by their illness, has gone up. A persistent pool of people is still being pummeled by symptoms—and new long-haulers are still joining the pool. This influx should be slower than ever, because Omicron variants seem to carry a lower risk of triggering long COVID, while vaccines and the drug Paxlovid can lower that risk even further. But though the odds against getting long COVID are now better, more people are taking a gamble, because preventive precautions have been all but abandoned. Even if prevalence estimates were a tenth as big, that would still mean more than 1 million Americans are dealing with a chronic illness that they didn’t have three years ago. “When long COVID first came on the scene, everyone told us that once we have the prevalence numbers, we can do something about it,” McCorkell told me. “We got those numbers. Now people say, ‘Well, we don’t believe them. Try again.’” To a degree, I sympathize with some of the skepticism regarding long COVID, because the condition challenges our typical sense of what counts as solid evidence. Blood tests, electronic medical records, and disability claims all feel like rigorous lines of objective data. Their limitations become obvious only when you consider what the average long-hauler goes through—and those details are often cast aside because they are “anecdotal” and, by implication, unreliable. This attitude is backwards: The patients’ stories are the ground truth against which all other data must be understood. Gaps between the data and the stories don’t immediately invalidate the latter; they just as likely show the holes in the former. Laura Mauldin, the disability sociologist, argues that the U.S. is primed to discount those experiences because the country’s values—exceptionalism, strength, self-reliance—have created what she calls the myth of the able-bodied public. 
“We cannot accept that our bodies are fallible, or that disability is utterly ordinary and expected,” she told me. “We go to great pains to pretend as though that is not the case.” If we believe that a disabling illness like long COVID is rare or mild, “we protect ourselves from having to look at it.” And looking away is that much easier because chronic illnesses like long COVID are more likely to affect women—“who are more likely to have their symptoms attributed to psychological problems,” Mauldin said—and because the American emphasis on work ethic devalues people who can’t work as much or as hard as their peers. Other aspects of long COVID make it hard to grasp. Like other similar, neglected chronic illnesses, it defies a simplistic model of infectious disease in which a pathogen causes a predictable set of easily defined symptoms that abate when the bug is destroyed. It challenges our belief in our institutions, because truly contending with what long-haulers go through means acknowledging how poorly the health-care system treats chronically ill patients, how inaccessible social support is to them, and how many callous indignities they suffer at the hands of even those closest to them. Long COVID is a mirror on our society, and the image it reflects is deeply unflattering. Most of all, long COVID is a huge impediment to the normalization of COVID. It’s an insistent indicator that the pandemic is not actually over; that policies allowing the coronavirus to spread freely still carry a cost; that improvements such as better indoor ventilation are still wanting; that the public emergency may have been lifted but an emergency still exists; and that millions cannot return to pre-pandemic life. “Everyone wants to say goodbye to COVID,” Duggal told me, “and if long COVID keeps existing and people keep talking about it, COVID doesn’t go away.” The people who still live with COVID are being ignored so that everyone else can live with ignoring it.
When his 18-year-old daughter Francine first started losing weight, in the fall of 2018, Kenneth initially thought it was a good thing. Francine had always been artistic but never particularly athletic, which puzzled her father. Kenneth, now 47, is a runner with dozens of half-marathons and even one ultramarathon under his belt. When Francine started to express an interest in exercising and joining Kenneth’s wife, Tracy, for workouts, Kenneth and Tracy thought it was a positive sign. When Francine announced that she was vegan, they rolled with it. Then Francine’s hair started to fall out. It took more than a year of trying different therapists, while Francine got progressively worse, for Kenneth and Tracy to grasp just how sick their daughter was. (I’ve changed the family members’ names to protect their privacy.) Kenneth started to add up exactly what his daughter was eating in a day and realized it wasn’t nearly enough. He also suspected that Francine had learned some of her new eating habits—such as replacing breakfast with bulletproof coffee—from watching him. Around the same time that Francine began struggling, Kenneth was following his own intense diet while on a quest to improve his running time. When Francine asked about his eating, he explained what he was doing and why. “I think I was probably malnourished myself, and in that place where you can’t help but obsess about food and talk about it constantly,” he says. Kenneth thought that he was modeling healthy eating and exercise habits to his daughter. “I just had no idea that the stuff she was asking me was really her disease asking,” he says. For decades, researchers trying to understand the role of a child’s family in eating-disorder development looked almost exclusively to mothers. “The literature on fathers’ child feeding practices is scant,” observed the authors of a scientific-review paper on the topic published in 2014.
They could find only 20 studies that included fathers in a meaningful way. “The research that has included fathers has focused on fathers who are part of a family in which the mother has an [eating disorder], rather than examining fathers’ unique contributions,” wrote two Yale researchers in their analysis for a 2016 study. More studies on parents and eating habits have been published since that 2014 review, but the gap between research on mothers and research on fathers remains wide. Scientists and the public alike have long ignored the idea that a father might also struggle with dieting or disordered eating, despite the fact that, according to a 2008 estimate from the National Institute of Mental Health (NIMH), roughly 1 million American men live with eating disorders. Men aren’t supposed to obsess over their weight. Men—especially straight, cisgender, white, thin men—aren’t defined by their appearance to the same degree that women and other marginalized people tend to be. And that might explain why Americans talk much less about how fathers’ eating habits and beliefs around health and weight can influence their children than, say, the motivations of almond moms. One 2018 study of 658 parents by Yale researchers found that although nearly everyone (93 percent) demonstrated some sort of weight bias, fathers, as well as parents of any gender with the perceived privilege of “healthy weight,” were more likely than mothers to agree with negative statements such as “Severely obese children are unusually untidy” (findings on the differences between mothers’ and fathers’ food parenting vary). Other research concluded that fathers with more education and a higher family income were more likely than other fathers to endorse fat stereotypes.
And kids absorb this stigma: Adolescents were more likely to diet and binge eat if their parents talked about weight, according to a 2013 survey of 2,793 kids published in JAMA Pediatrics. Many experts say that the NIMH’s figure on the number of men living with eating disorders is likely an underestimate: Men don’t tend to disclose their disordered-eating behaviors, and health-care providers don’t think to screen men for symptoms. “Men tell me they don’t have a script for how to talk about diet culture,” Jaclyn Siegel, a social psychologist at San Diego State University, told me in 2020 when I was reporting a story on the coronavirus pandemic’s effects on men’s dieting habits. “But there’s also no script for men to express their own concerns or to seek help, because it isn’t seen as normative for men to develop eating disorders or body-image dissatisfaction.” What happens instead is a normalization and even a glamorization of men’s relationship with food and exercise. This rests on a common cultural misconception that men not only don’t get eating disorders; they don’t get emotional about food or bodies, period. Many dads go on diets, but far fewer actually call it that. Instead, like Kenneth, dieting dads might get super into long-distance running, or CrossFit, or bodybuilding, or Ironman training. They may become passionate about vegetable gardening; Kenneth and his family used to run an organic farm, and he says his passion for farming led him to preach about “good” and “bad” foods. All of these pursuits can be motivated by an interest in health and wellness—even science, the environment, social justice. But they can also be motivated by a fear of becoming or being perceived as fat, reflecting a broader bias against heavy people. “I used to do a lot of banter about ‘Look at that person; she’s fat,’” Kenneth says.
“Or I’d say to the kids, ‘Hey, don’t eat that pizza,’ or ‘Don’t eat too many desserts; that will make you fat.’” Some research suggests that dads can affect their kids’ relationship with food as much as or perhaps even more than moms do. Findings vary, underscoring the need for more research, but in a 2014 study of more than 2,700 kids, girls whose fathers reported binge eating were 3.38 times more likely to report binge eating themselves, although there was no correlation with mothers’ eating. (The researchers found no relationship between boys’ binging and that of parents.) Even if they don’t actively model disordered-eating habits, fathers may withdraw from family meals altogether—something many men can do more easily than women because of societal gender norms around who prepares food. The little research we have on how dads influence their kids’ relationship with food and their body suggests that dads might be slightly more prone than moms to engage in what researchers call “pressure-to-eat behaviors”—pushing children to eat (or not eat) certain foods in certain amounts. Kyle Ganson, a clinical social worker at the University of Toronto who studies eating disorders in boys and men, speculates that fathers’ pressures on sons may relate to a desire for them to perform athletically in specific ways. “If the dad is pushing the kid in a certain direction with sports, or if the dad is their coach and heavily influencing their exercise plans, that can lead to disordered eating,” he says. On the flip side, fathers of kids with eating disorders may resort to exerting pressure because they are confused by a child’s inability to comply. “Anecdotally, the phrase I often hear from male caregivers is ‘Why can’t they just eat?’ They may also be more likely to think their child needs to ‘grow up’ or ‘deal with it,’” Ganson says.
“Female caregivers tend to be doing a lot of the emotional processing around the eating disorder, while fathers are much more driven by logistics: ‘How do we move to the next phase of treatment? When do we see results?’” And when progress isn’t evident—as it often isn’t in the circular recovery process of eating disorders—dads are more likely to disconnect. “This isn’t really my territory” is another comment Ganson and his colleagues often hear from dads. This is not to say that men can’t engage emotionally with a sick child, or that managing treatment logistics isn’t valuable. But helping and connecting with a child in eating-disorder recovery requires dads to be vulnerable and humble, skills they aren’t always asked to employ or that haven’t been modeled for them. And the conditioning to push away feelings and move toward action mirrors the “No pain, no gain” messaging of much of male-diet culture. Kenneth is now striving for acceptance as he and Tracy support Francine through her eating-disorder recovery. Soon after Kenneth’s realization of just how sick Francine had become, she was admitted to an inpatient recovery program for nine days. When she came home, Kenneth and Tracy began following a common eating-disorder-treatment protocol known as family-based treatment, where parents take full responsibility for feeding a child who can no longer hear hunger cues or make decisions around food, planning and preparing every single meal and snack and monitoring every bite. There were many nights when Francine cried at the table. Tracy bore the brunt of making the food and talking Francine through the process of eating. “A lot of nights, I could see, she just could not eat unless Mom was there to support her,” Kenneth says. Those were the days he felt most helpless, just as the eating-disorder literature has so often painted fathers. 
But he began to look for ways to contribute, getting out board games for the family to play after dinner, when Francine had finished eating but still needed some help and distraction from the eating-disorder voice in her head. Just being there—without judgment, without trying to fix it—made him reconsider what it meant to be Francine’s dad. “I still believe it’s my job to be the protector of my family,” Kenneth says, “but I’ve had to sort of rethink what that looks like.” This article has been adapted from Virginia Sole-Smith’s forthcoming book, Fat Talk: Parenting in the Age of Diet Culture.
In October, when the FDA first announced a shortage of Adderall in America, the agency expected it to resolve quickly. But five months in, the effects of the shortage are still making life tough for people with attention-deficit hyperactivity disorder who rely on the drug. Stories abound of frustrated people going to dozens of pharmacies in search of medication each month, only to come up short every time. Without treatment, students have had a hard time in school, and adults have struggled to keep up at work and maintain relationships. The shortage of brand-name Adderall has ended, but the widely used generic versions of the drug, known as amphetamine mixed salts, are still scarce. A “perfect storm” of factors—manufacturing delays, labor shortages, tight regulations—is to blame for the shortage, David Goodman, an ADHD expert and a psychiatry professor at the Johns Hopkins University School of Medicine, told me. And they have all been compounded by the fact that the pandemic produced a surge in Americans who want Adderall. The most dramatic changes occurred among adults, according to a recent CDC report on stimulant prescriptions, with increases in some age groups of more than 10 percent in just a single year, from 2020 to 2021.
It’s the nature of the spike in demand for Adderall—among adults—that has some ADHD experts worried about “whether the demand is legitimate,” Goodman said. It’s possible that at least some of these new Adderall patients, he said, are getting prescriptions they do not need. The problem is that America has no standard clinical guidelines for how doctors should diagnose and treat adults with ADHD—a gap the CDC has called a “public health concern.” When people come in wanting help for ADHD, providers have “a lot of choices about what to use and when to use it, and those parameters have implications for good care or bad care,” Craig Surman, a psychiatry professor and an ADHD expert at Harvard and the scientific coordinator of adult-ADHD research at Massachusetts General Hospital, told me. The stimulant shortage will end, but even then, adults with ADHD may not get the care they need. For more than 200 years, symptoms related to ADHD—such as difficulty focusing, inability to sit still, and fidgeting—have largely been associated with children and teenagers. Doctors widely assumed that kids would grow out of it eventually. Although symptoms become “evident at a very early period of life,” one Scottish physician wrote in 1798, “what is very fortunate [is that] it is generally diminished with age.” For some people, ADHD symptoms really do get better as they enter adulthood, but for most, symptoms continue. The focus on children persists today in part because of parental pressure. Pediatricians have had to build a child-focused ADHD model, Surman said, because parents come in and say, “What are we going to do with our kid?” As a result, treating children ages 4 to 18 for ADHD is relatively straightforward: Clear-cut clinical guidelines from the American Academy of Pediatrics specify the need for rigorous psychiatric testing that rules out other causes and includes reports about the patient from parents and teachers. 
Treatment usually involves behavior management and, if necessary, medication. But there is no equivalent playbook for adults with ADHD in the U.S.—unlike in other developed nations, including the U.K. and Canada. In fact, the disorder was only recently acknowledged within the field of adult psychiatry. One reason it went overlooked for so long is that ADHD can sometimes look different in kids than in adults: Physical hyperactivity tends to decrease with age, whereas, say, emotional or organizational problems do not. “The recognition that ADHD is a life-span disorder that persists into adulthood in most people has really only happened in the last 20 years,” Margaret Sibley, a psychiatry professor at the University of Washington School of Medicine, told me. And the field of adult psychiatry has been slow to catch up. Adult ADHD was directly addressed for the first time in DSM-5—the American Psychiatric Association’s diagnostic bible—in 2013, but the criteria described there still haven’t been translated into practical instructions for clinicians. Addressing adult ADHD isn’t as simple as adapting children’s standards for grown-ups. A key distinction is that the disorder impairs different aspects of an adult’s life: Whereas a pediatrician would investigate ADHD’s impact at school or at home, a provider evaluating an adult might delve into its effects at work or in romantic relationships. Sources of information differ too: Parents and teachers can shed light on a child’s situation, but “you wouldn’t call the parent of a 40-year-old to get their take on whether the person has ADHD,” Sibley said. Providers usually rely instead on self-reporting—which isn’t always accurate.
Complicating matters, the symptoms of ADHD tend to be masked by other cognitive issues that arise in adulthood, such as those caused by depression, drug use, thyroid problems, or hormonal shifts, Sibley said: “It’s a tough disorder to diagnose, because there’s no objective test.” The best option is to perform a lengthy psychiatric evaluation, which usually involves reviewing symptoms, performing a medical exam, taking the patient’s history, and assessing the patient using rating scales or checklists, according to the APA. Without clinical guidelines or an organizational body to enforce them, there is no pressure to uphold that standard. Virtual forms of ADHD care that proliferated during the pandemic, for example, were rarely conducive to lengthy evaluations. A major telehealth platform that dispensed ADHD prescriptions, Cerebral, has been investigated for sacrificing medical rigor for speedy treatment and customer satisfaction, potentially letting people without ADHD get Adderall for recreational use. In one survey, 97 percent of Cerebral users said they’d received a prescription of some kind. Initial consultations with providers lasted just half an hour, reported The Wall Street Journal; former employees feared that the company’s rampant stimulant-prescribing was fueling an addiction crisis. “It’s impossible to do a comprehensive psychiatric evaluation in 30 minutes,” Goodman said. (Cerebral previously denied wrongdoing and no longer prescribes Adderall or other stimulants.) The bigger problem is that too few providers are equipped to do those evaluations in the first place. Because adult ADHD was only recently recognized, most psychiatrists working today received no formal training in treating the disorder. “There’s a shortage of expertise,” Surman said. 
“It’s a confusing space where, at this point, consumers often are educating providers.” The dearth of trained professionals means that many adults seeking help for ADHD are seen by providers, including primary-care doctors, social workers, and nurse practitioners, who lack the experience to offer it. “It’s a systemic issue,” Sibley said, “not that they’re being negligent.” The lack of trained providers opens up the potential for inadequate or even dangerous care. Adderall is just one of many stimulants used to treat ADHD, and choosing the right one for a patient can be challenging—and not all people with ADHD need or want to take them. But even the most well-intentioned health-care professionals may be unprepared to evaluate patients properly. The federal government considers Adderall a highly addictive Schedule II drug, like oxycodone and fentanyl, and the risks of prescribing it unnecessarily are high: Apart from dependency, it can also cause issues such as heart problems, mood changes, anxiety, and depression. Some people with ADHD might be better off with behavioral therapy or drugs that aren’t stimulants. Unfortunately, it can be all too easy for inexperienced providers to start a patient on these drugs and continue treatment. “If I give stimulants to the average person, they’ll say their mood, their thinking, and their energy are better,” Goodman said. “It’s very important not to make a diagnosis based on the response to stimulant medication.” But the uptick in adults receiving prescriptions for those drugs since at least 2016 is a sign that this might be happening. The fact that adult ADHD is surging may soon lead to change. Last year, the American Professional Society of ADHD and Related Disorders began drafting the long-needed guidelines. The organization’s goal is to standardize care and treatment for adult ADHD across the country, said Goodman, who is APSARD’s treasurer. 
Establishing standards could have “broad, sweeping implications” beyond patient care, he added: Their existence could compel more medical schools to teach about adult ADHD, persuade insurance companies to cover treatment, and pressure lawmakers to include it in workplace policies. A way out of this mess, however long overdue, is only going to become even more necessary. Nearly 5 percent of adults are thought to have the disorder, but less than 20 percent of them have been diagnosed or have received treatment (compared with about 77 percent of children). “You have a much larger market of unrecognized and untreated adults, and that will continue to increase,” Goodman said. Women—who, like girls, are historically underdiagnosed—will likely make up a substantial share. Adults with ADHD may have suffered in silence in the past, but a growing awareness of the disorder, made possible by ongoing destigmatization, will continue to boost the ranks of people who want help. On social media, ADHD influencers abound, as do dedicated podcasts on Spotify. Until guidelines are published—and embedded into medical practice—the adult-ADHD landscape will remain chaotic. Some people will continue to get Adderall prescriptions they don’t need, and others may be unable to get an Adderall prescription they do need. Rules alone couldn’t have prevented the shortage, and they won’t stop it now. But in more ways than one, their absence means that many people who need help for ADHD are unable to receive it.

I do not like carbonated beverages, plain and simple. I won’t drink soda, and you’ll never catch me with a beer. Gin and tonics are a no. Sparkling water? A beast in disguise. Oh, the cocktail is not that fizzy, you say? I’ve heard that one before. And get your slushie out of my face. As I said, I do not like carbonated beverages. I do not like them at all.
I don’t just mean that they taste bad to me, the way soap or penicillin does. I mean that they hurt me. They inflict actual, physical pain on my mouth. The sensation is prickly, like having my tongue poked with hundreds of needles. On the handful of foolhardy occasions when I’ve dared take a sip of Coke, it’s felt like what I imagine sipping static electricity would feel like, at least until the pain subsides and I’m left with nothing but the hyper-saturated sweetness of a melted freezer pop. Even after I swallow, my mouth feels raw. When I try to explain this aversion, people sometimes struggle to wrap their mind around it. “Even sparkling cider?” they ask incredulously. “Even cream soda?” Yes, even sparkling cider. Yes, even cream soda. Occasionally, people try to relate: “Oh, I hate carbonation too … except in champagne.” Whatever these people mean by “hate” is clearly not the same thing I mean. The specifics of the drink make no difference to me. The carbonation itself is the problem. Part of me wonders whether this all traces back to an incident from my childhood. When I was 6 or 7 years old, I accidentally ate a piece of sushi covered in more wasabi than I’d bargained for and, in a panic, took a big gulp of water—except the water wasn’t water; it was seltzer, and I spit it all over the table. A couple of years later, I tried root beer at day camp and spat that out too. By that point, I’d pretty much learned my lesson. [Read: The medical origins of seltzer] So why am I like this? It’s not as though my mouth is hypersensitive to all tastes and sensations. I pop Sour Skittles at the movies and have a pretty high spice tolerance. My issue is more specific and, given that Americans consume more than 40 gallons of soda a person each year, very rare. But apparently I’m not the only one: On Reddit’s r/unpopularopinion forum and others like it, never-fizzers find common cause. 
Drinking carbonated beverages is “kinda masochist.” It’s “pure agony.” It’s like “swallowing battery acid.” “I feel like I’m drinking flesh eating bacteria,” one Redditor writes. “I swear I thought I was the only one who thinks they hurt,” another replies. You can find dozens of posts like these online—so many, in fact, that you may begin to wonder: How many times can an unpopular opinion be posted before it ceases to qualify as an unpopular opinion? Scientists, for their part, have documented at least one instance of an anaphylactic reaction to sparkling water. That reaction was not caused by the bubbles themselves, but neither is carbonation’s distinctive mouthfeel. For a long time, people assumed that the fizzy sensation was just the tactile experience of having bubbles pop inside your mouth. Early suspicions to the contrary came from mountaineers, who reported that when they raised a toast at the summit, their bubbly champagne tasted flat. In 2013, researchers confirmed that the “bite” of carbonation is not dependent on bubbles: Even after drinking sparkling water in a pressure chamber, where bubbles cannot form, test subjects still reported feeling the slight “sting, burn, or pungency” associated with fizzy drinks, both on the tip of their tongue and at the back of their throat. [Read: The sad truth about seltzer] The source of that bite, scientists determined, is the carbonic acid that forms when an enzyme in the mouth, carbonic anhydrase, reacts with carbon dioxide. (That enzyme happens to be inhibited by a medication commonly taken by mountaineers to stave off altitude sickness.) The acid activates pain receptors, Earl Carstens, a neurobiologist at UC Davis, told me, so the experience of drinking a carbonated beverage should be sharp and irritating for everyone. In that sense, the weird thing is not that some people hate carbonation; it’s that anyone likes it at all. Social conditioning may play a role: We accept the pain of drinking soda because we’re taught that it’s okay.
Or perhaps the mild pain is associated with a pleasurable release of endorphins, as can occur when people eat a spicy food. Both of those factors are likely in play, Carstens said. But as my experience shows, not everyone experiences carbonic-acid pain the same way. Some people feel a refreshing tickle, others a chemical assault. No one knows why. Scientists have traced other aversions—to cilantro, for example, or tannic wines—to natural variations in human taste and smell receptors. “We are not at the same place in our knowledge of carbonation,” Emily Liman, a neurobiologist at the University of Southern California, told me. The problem faced by sodaphobes may yet turn out to have a genetic explanation, but for the moment, scientists don’t even understand exactly which cells are involved in the sensation. Pain receptors (such as the ones that detect spiciness) and taste cells (such as the ones that detect sourness) both seem to play a part in feeling carbonation, Liman said, but their precise contributions are unclear. In short, there’s no way to know whether I’m the victim of busted mouth biology, or of some long-repressed experience that bubbles up as oral pain, or of something else entirely. In any case, hating carbonation only means that I have to do a lot of polite declining. It’s not a huge deal, yet I sometimes find myself perturbed to be cut off from a whole sector of human experience, to dislike something that almost everyone else seems to like, and to dislike it not because of some contrarian impulse or principled objection but because of my physiology or my psychology. Best not to indulge such musings, though—they can easily give way to temptation. Last summer, after years of strict avoidance, I ordered a cider at a bar, thinking that maybe, after all these years, something had changed. Nope!

Thirty years ago, antidepressant research seemed on the verge of a major breakthrough.
Years of experiments with laboratory rats and mice—animals long considered “classic” models for the condition—had repeatedly shown that a new drug called rolipram could boost a molecule in the rodent brain that people with depression seemed to have lower levels of. Even guinea pigs and chipmunks seemed susceptible to the chemical’s effects. Experts hailed rolipram as a potential game changer—a treatment that might work at doses 10 to 100 times lower than conventional antidepressants, and act faster to boot. But not long after rolipram entered clinical trials in humans, researchers received a nasty surprise. The volunteers taking rolipram just kept throwing up. Terrible bouts of nausea were leading some participants to quit taking the meds. No one could take rolipram at doses high enough to be effective without experiencing serious gastrointestinal distress. Years of hard work was literally getting flushed down the tubes. Rolipram wasn’t alone: Over the years, millions of dollars have been lost on treatments that failed after vomiting cropped up as a side effect, says Nissar Darmani, the associate dean for basic sciences and research at Western University of Health Sciences. The problem in many of these cases was the rodents, or, maybe more accurately, that researchers had pinned their hopes on them. Mice and rats, the world’s most commonly used laboratory animals—creatures whose many biological similarities to us have enabled massive leaps in the treatment of HIV, cardiovascular disease, cancer, and more—are rather useless in one very specific context: They simply can’t throw up. Vomiting, for all its grossness, is an evolutionary perk: It’s one of the two primary ways to purge the gastrointestinal tract of the toxins and poisons that lurk in various foodstuffs, says Lindsey Schier, a behavioral neuroscientist at the University of Southern California. But rodent bodies aren’t built for the act of throwing up. 
Their diaphragm is a bit wimpy; their stomach is too bulbous, their esophagus too long and spindly. And the animals seem to lack the neural circuits they’d need to trigger the vomiting reflex. And yet, rodents make up nearly 40 percent of mammal species and have colonized habitats on every continent on Earth except Antarctica—including homes laced with delicious, rodenticide-laden baits. Part of their secret might be pure prevention. Rodents have exquisite senses of smell and taste, which work as “gatekeepers of the gastrointestinal tract,” says Linda Parker, a behavioral neuroscientist at the University of Guelph. They’re also extremely wary of new foods, and their memory for a sickening substance is strong. “They’ll avoid it for months, years, maybe even their whole life,” Parker told me. “It’s probably the strongest form of animal learning we know.” Any noxious stuff that does enter rodent bellies can also be waylaid. The animals may get diarrhea, or delay their absorption of the harmful substances by slowing digestion, or swallow materials such as clay. These tactics aren’t perfect—but neither, to be fair, is vomiting, which is “very violent,” says Bart De Jonghe, a nutritional-science researcher at the University of Pennsylvania. The act requires the diaphragm and abdominal muscles to clench around the gut, and can leave animals physically drained and dehydrated. Maybe rodents are spared quite a few costs, says Gareth Sanger, a pharmacologist at Queen Mary University of London. [Read: Will smoking pot make me vomit forever?] It’s still a bit unclear just how much of an anomaly rodents are. Only so many mammals—among them cats, dogs, ferrets, primates, and pigs—have thrown up in human sight. Researchers can’t always tell if the creatures that haven’t are unable, shy, or just wise about what they consume, making it difficult for biologists to trace vomiting’s evolutionary roots.
Yates is one of several experts who suspect that throwing up is a relatively recent development, manifesting mainly among carnivores and primates, creatures that perhaps could not afford to snack slowly and warily as rodents do. But others disagree, hypothesizing instead that ancestral mammals had an emergency brake in their gut. Maybe rodents (and, apparently, rabbits) lost the gift, while the rest of us kept it around, Sanger told me. The act’s origins could be more ancient still: Some evidence suggests that even creatures from the Jurassic era may have occasionally lost their lunch. Labs interested in studying vomiting directly have long relied on creatures outside of the rodent family, among them dogs, cats, and ferrets—though high costs of upkeep and intermittent pushback on companion-animal testing from the public have made that work tough, Darmani told me. Nowadays, some of the most promising research takes place in shrews: small mammals that resemble rodents in size and ease of care, but can throw up. The animals have helped researchers such as Darmani and Parker make big advances in figuring out, for instance, how cannabinoids might help curb the urge to vomit—findings that could provide major relief for people undergoing chemotherapy, radiation treatment, and more. [Read: Can ginger ale really soothe nausea?] Still, rodents haven’t been written out of digestive research just yet. Parker and others have found that rats and their relatives are great models for nausea, which has historically been far harder to define and treat. Give a shrew a drug to induce vomiting, and it will promptly throw up—making the brief window in which its equivalent of nausea might manifest quite tough to study. A rodent, meanwhile, must stew in its digestive distress, potentially giving researchers crucial insight with every gape of the mouth, or wrinkle of the nose. The work isn’t without its challenges. Nausea is, by definition, subjective.
“You can ask a room full of 30 people what nausea is, and I guarantee you’ll get 30 different responses,” De Jonghe told me. Among nonhuman creatures, the problem is worse: “You cannot ask an animal if they feel this way or that,” Schier said. Many researchers are adamant that no animal models for nausea exist at all. But nausea-esque behaviors, even if not totally equivalent to ours, can offer important clues. Rodents, like us, get majorly turned off by gross foods; they, like us, get woozy, trembly, and sluggish after they’ve been swirled around. And when researchers spot such reactions in their lab animals, they can check what hormones spike in their blood, and what microscopic switches get flipped in the circuits of their brain—observations that could help us map nausea’s precise pathways, and perhaps block them with drugs. Understanding that topography is urgent. “Twenty years ago,” Sanger said, “vomiting was the most feared side effect” in many of the patients he saw. But with the advent of several generations of vomit-curbing drugs, “now it’s nausea.” Our current approaches for addressing motion sickness aren’t up to snuff either: Many of them are hit or miss; others are so broad-acting that they drug people into sleepy stupors—muting not only their digestive discomfort but a bunch of other basic functions as well. The medications are “sledgehammers,” Yates told me, when a “tiny little hammer” will likely do. [Read: The mysterious science of motion sickness] All of that means that rodents’ big gastrointestinal shortcoming could end up being far more valuable than once thought. The weirdness of their guts and respiratory tracts might end up being key to making future train rides and boat trips less sickening, and migraines and morning sickness more bearable—even cancer treatments less brutal. With enough understanding, maybe we’ll be able to mimic rodents’ best responses to bad foods, and none of their worst. 
Last summer, I got a tip about a curious scientific finding. “I’m sorry, it cracks me up every time I think about this,” my tipster said. Back in 2018, a Harvard doctoral student named Andres Ardisson Korat was presenting his research on the relationship between dairy foods and chronic disease to his thesis committee. One of his studies had led him to an unusual conclusion: Among diabetics, eating half a cup of ice cream a day was associated with a lower risk of heart problems. Needless to say, the idea that a dessert loaded with saturated fat and sugar might actually be good for you raised some eyebrows at the nation’s most influential department of nutrition. Earlier, the department chair, Frank Hu, had instructed Ardisson Korat to do some further digging: Could his research have been led astray by an artifact of chance, or a hidden source of bias, or a computational error? As Ardisson Korat spelled out on the day of his defense, his debunking efforts had been largely futile. The ice-cream signal was robust. It was robust, and kind of hilarious. “I do sort of remember the vibe being like, Hahaha, this ice-cream thing won’t go away; that’s pretty funny,” recalled my tipster, who’d attended the presentation. This was obviously not what a budding nutrition expert or his super-credentialed committee members were hoping to discover. “He and his committee had done, like, every type of analysis—they had thrown every possible test at this finding to try to make it go away. And there was nothing they could do to make it go away.” Spurious effects pop up all the time in science, especially in fields like nutritional epidemiology, where the health concerns and dietary habits of hundreds of thousands of people are tracked over years and years. Still, the abject silliness of “healthy ice cream” intrigued me.
As a public-health historian, I’ve studied how teams of researchers process data, mingle them with theory, and then package the results as “what the science says.” I wanted to know what happens when consensus makers are confronted with a finding that seems to contradict everything they’ve ever said before. (Harvard’s Nutrition Source website calls ice cream an “indulgent” dairy food that is considered an “every-so-often” treat.) “There are few plausible biological explanations for these results,” Ardisson Korat wrote in the brief discussion of his “unexpected” finding in his thesis. Something else grabbed my attention, though: The dissertation explained that he’d hardly been the first to observe the shimmer of a health halo around ice cream. Several prior studies, he suggested, had come across a similar effect. Eager to learn more, I reached out to Ardisson Korat for an interview—I emailed him four times—but never heard back. When I contacted Tufts University, where he now works as a scientist, a press aide told me he was “not available for this.” Inevitably, my curiosity took on a different shade: Why wouldn’t a young scientist want to talk with me about his research? Just how much deeper could this bizarre ice-cream thing go? “I still to this day don’t have an answer for it,” Mark A. Pereira, an epidemiologist at the University of Minnesota, told me, speaking of the association he’d stumbled upon more than 20 years earlier. “We analyzed the hell out of the data.” Just that morning, I’d been reading one of Pereira’s early papers, on the health effects of eating dairy, because it seemed to have inspired other research that was cited in Ardisson Korat’s dissertation. But when I scrolled to the bottom of Pereira’s article, down past the headline-making conclusions, I saw in Table 5 a set of numbers that made me gasp. Back then, Pereira was a young assistant professor at Harvard Medical School. 
Hoping to address the newly labeled epidemics of obesity and diabetes, he initially focused his research on physical activity, but soon turned to the unsettled science of healthy eating. The status of dairy, in particular, was bogged down in simplistic and competing assumptions. “We just thought, Oh, you know, calcium and bones: It’s good for kids. But, oh, the saturated fat! Don’t eat too much dairy! ” [From the July/August 2013 issue: How junk food can end obesity] Pereira and his co-authors tested these old ideas using data from a study, begun in 1985, that tracked the emergence of heart-disease risk factors in more than 5,000 young adults. After seeing the results, “we knew it was going to be very high-profile and controversial,” Pereira recalled. Pretty much across the board—low-fat, high-fat, milk, cheese—dairy foods appeared to help prevent overweight people from developing insulin-resistance syndrome, a precursor to diabetes. “I’ll tell you, this study surprised the heck out of me,” said one CNN correspondent, as Pereira’s study spiraled through the press. But the international media coverage didn’t mention what I’d seen in Table 5. According to the numbers, tucking into a “dairy-based dessert”—a category that included foods such as pudding but consisted, according to Pereira, mainly of ice cream—was associated for overweight people with dramatically reduced odds of developing insulin-resistance syndrome. It was by far the biggest effect seen in the study, 2.5 times the size of what they’d found for milk. “It was pretty astounding,” Pereira told me. “We thought a lot about it, because we thought, Could this actually be the case? ” There were reasons to be wary: The data set wasn’t huge, in epidemiological terms, and participants hadn’t reported eating that many dairy-based desserts, so the margin of error was wide. 
And given that the study’s overall message was sure to attract criticism—Pereira recalled getting “skewered” by antidairy activists—he had little desire to make a fuss about ice cream. Pretty soon, Pereira’s peers found themselves in the same predicament. Building on the 2002 study and the growing interest in dairy, researchers at the Harvard School of Public Health decided to break out some of their most powerful tools. Since the 1980s, Harvard’s scientists have been collecting “food-frequency questionnaires” and medical data from many thousands of nurses, dentists, and other health-care workers. These world-famous studies have fueled a stream of influential findings, including some of the data that sparked the removal of trans fats from the food supply. The results of Harvard’s first observational study of dairy and type 2 diabetes came out in 2005. Based on data collected from just one of their three cohorts, following men between 1986 and 1998, the authors reported that higher dairy intake, and higher low-fat-dairy intake in particular, was associated with a lower risk of diabetes. “The risk reduction was almost exclusively associated with low-fat or non-fat dairy foods,” a Harvard news bulletin explained. An article on Fox News’s website underscored the low-fat message: “There was no decrease in men who drank whole milk,” the story said. Perhaps not whole milk, but what about butter pecan? Near the end of the Harvard paper, where the authors had arrayed the diabetes risks associated with various dairy foods, was a finding that was barely mentioned in the “almost exclusively” low-fat narrative given to reporters. Yes, according to that table, men who consumed two or more servings of skim or low-fat milk a day had a 22 percent lower risk of diabetes. But so did men who ate two or more servings of ice cream every week. Once again, the data suggested that ice cream might be the strongest diabetes prophylactic in the dairy aisle. 
Yet no one seemed to want to talk about it. In the years that followed, research summaries generally agreed that high dairy intake overall was associated with a slightly reduced risk of diabetes, but called for more investigation of which specific dairy foods might have the greatest benefits. In 2014, Harvard’s nutrition team brought another dozen years of diet-tracking data to bear on this question. In this new study, total dairy consumption now seemed to have no effect, but the ice-cream signal was impossible to miss. Visible across hundreds of thousands of subjects, it all but screamed for more attention. Following a pattern of incredulousness that was by then more than a decade old, Frank Hu, the study’s senior author and the future chair of Harvard’s nutrition department, asked the graduate student who’d led the project, Mu Chen, to double-check the data. “We were very skeptical,” Hu told me. Chen, who is no longer in academia, did not respond to interview requests, but Hu recalled that no errors in the data could be found. The Harvard researchers didn’t like the ice-cream finding: It seemed wrong. But the same paper had given them another result that they liked much better. The team was going all in on yogurt. With a growing reputation as a boon for microbiomes, yogurt was the anti-ice-cream—the healthy person’s dairy treat. “Higher intake of yogurt is associated with a reduced risk” of type 2 diabetes, “whereas other dairy foods and consumption of total dairy are not,” the 2014 paper said. “The conclusions weren’t exactly accurately written,” acknowledged Dariush Mozaffarian, the dean of policy at Tufts’s nutrition school and a co-author of the paper, when he revisited the data with me in an interview. “Saying no foods were associated—ice cream was associated.” But yogurt made so much more sense. In a way, it was confirmation of something that everyone already knew. 
From the start of yogurt’s entrée into the American diet, it had been perceived as an exotic food from a faraway land, quivering with vague health-giving properties. Even after being spiked with sugar in the ’70s and ’80s to better suit the U.S. market, yogurt still retained its image as an elixir. Furthermore, a growing body of literature suggested that yogurt’s health benefits might be real. Harvard had found, a few years earlier, that eating yogurt was associated with reduced weight gain; researchers at the university were interested in its possible effects on gut bacteria as well. Other studies—including those that first revealed the ice-cream signal—had also sketched the slender outlines of a yogurt effect. When Chen and Hu pooled together findings from this research, added in their latest data, and performed a meta-analysis, they concluded that yogurt was indeed associated with a reduced risk of diabetes—a potential benefit, they wrote, that warranted further study. Regarding ice cream’s potential benefits, they had much less to say. I asked other experts to compare the 2014 yogurt and ice-cream findings. Kevin Klatt, a nutrition scientist at UC Berkeley, said the ice-cream effect was “more consistent” than yogurt’s across the studied cohorts. Deirdre Tobias, an epidemiologist at Harvard, the academic editor of The American Journal of Clinical Nutrition, and a member of the advisory committee for the 2025 update to the U.S. dietary guidelines, agreed with that assessment. Even Dagfinn Aune, an epidemiologist at Imperial College London and a peer reviewer of the Chen and Hu paper, said that the ice-cream effect was “similar” in magnitude to, or “slightly stronger” than, the one for yogurt. So how did the Harvard team explain away the ice-cream finding? 
The theory went like this: Maybe some of the people in the study had developed health problems, such as high blood pressure or elevated cholesterol, and began avoiding ice cream on doctors’ orders (or of their own volition). Meanwhile, people who didn’t have those health problems would have had less reason to give up their cookies and cream. In that scenario, it wouldn’t be that ice cream prevented diabetes, but that being at risk of developing diabetes caused people to not eat ice cream. Epidemiologists call that “reverse causation.” To test this idea, Hu and his co-authors set aside dietary data collected after people received these sorts of diagnoses, and then redid their calculations. The ice-cream effect shrank by half, though it was still statistically significant, and still bigger than the low-fat-dairy effect that Harvard had publicized in 2005. In any event, if people who received adverse diagnoses cut back on their ice cream, you might expect that they’d also cut back on, say, cake and doughnuts. So shouldn’t there be mysterious protective “effects” for cake and doughnuts too? “There should be,” Mozaffarian said. “That’s why the finding for ice cream is intriguing.” [Read: How ice cream helped America at war] The new analysis was hardly a slam dunk. On paper, the yogurt and ice-cream effects still looked pretty similar. “Within the realm of statistical uncertainty, they’re identical,” Mozaffarian told me. But in the 2014 paper, he and the other authors had argued that “reverse causation may explain the findings” for ice cream. And as academia’s public-relations machinery came to life, nuance went out the window. “Does a yogurt a day keep diabetes away?” asked the press release that went out on publication day. “Other dairy foods and consumption of total dairy did not show this association,” said Hu, the senior author, in an ice-cream-free appraisal included in the release and echoed in Harvard’s own press bulletin. 
“Yogurt has approached wonder-food status in recent years,” a Forbes article on the paper noted. “In the new study, other forms of dairy like milk and cheese, did not offer the same kind of protection as yogurt.” Hu says today that the Harvard researchers felt confident in their conclusions about yogurt largely on account of their meta-analysis, and the fact that prior clinical studies and basic science research supported the idea that probiotics improve metabolic outcomes. “For ice cream, of course, there is no prior literature,” he said. Given that the ice-cream effect was diminished when they tested their reverse-causation theory, he called it “much more plausible” that yogurt would help prevent diabetes than ice cream. After his paper was published, it didn’t take long for the Harvard group’s good news about yogurt to take hold as a dominant scientific narrative. Two years later, when a team of researchers based in the Netherlands and at Harvard analyzed all the evidence it could find on dairy and diabetes, the yogurt effect popped out. A featured graph from the team’s 2016 paper in The American Journal of Clinical Nutrition summarizes data from about a dozen studies: As someone’s yogurt intake mounts to roughly one-third of a cup a day, their risk of getting diabetes shrinks by 14 percent. The authors also found the ice-cream effect: Consuming as little as a half a cup per week was associated with a 19 percent reduced diabetes risk. But that finding’s epitaph was already written. The researchers concluded that consuming “dairy foods, particularly yogurt,” might help curb the diabetes epidemic, and noted that the benefits of ice cream had elsewhere been written off as a product of reverse causation. The evidence in yogurt’s favor was much better established, Sabita Soedamah-Muthu, an epidemiologist at Tilburg University and the paper’s senior author, told me. The ice-cream effect had fewer studies in its corner. “We didn’t believe in it,” she said. 
There’s a thing that happens when you start writing a story about how maybe, possibly, believe it or not, ice cream might be sort of good for you, and how some of the world’s top nutritionists gathered evidence supporting that hypothesis but found reasons to look past it. You begin to ask yourself: Am I high on my own ice-cream supply? I asked the experts for a gut check. Pereira, the first to hit upon the ice-cream effect, told me that it just wasn’t the kind of result that goes down well in the “closed-minded” world of elite nutrition. “They don’t want to see it. They might ponder it for a second and kind of chuckle and not believe it,” he said. “I think that’s related to how much the field of nutritional epidemiology in the modern era is steeped in dogma.” Tobias, the journal editor and member of the 2025 U.S. dietary-guidelines advisory committee, called it “totally fair criticism” to ask why yogurt was played up while ice cream was played down. She expressed support for the Harvard team’s handling of the data, while acknowledging the tensions involved: “You don’t want to overstate stuff that you know probably has a high likelihood of bias, but you also don’t want to do the opposite and seem to be burying it, either.” Hu, the Harvard nutritionist, said that deciding what a study means requires looking beyond the numbers to what is already known about dietary science: “You need to interpret the data in the context of the rest of the literature.” Mozaffarian, Hu’s co-author, echoed this view. Still, he noted, “you’re raising a really, really important point, which is that when, as scientists, we find things that don’t fit our hypotheses, we shouldn’t just dismiss them. We should step back and say, ‘You know, could this actually be true?’ ” Could the idea that ice cream is metabolically protective be true? It would be pretty bonkers. Still, there are at least a few points in its favor. 
For one, ice cream’s glycemic index, a measure of how rapidly a food boosts blood sugar, is lower than that of brown rice. “There’s this perception that ice cream is unhealthy, but it’s got fat, it’s got protein, it’s got vitamins. It’s better for you than bread,” Mozaffarian said. “Given how horrible the American diet is, it’s very possible that if somebody eats ice cream and eats less starch … it could actually protect against diabetes.” The “Got Milk?” crowd also loves to talk about the “milk-fat-globule membrane,” a triple-layered biological envelope that encases the fat in mammalian milk. Some evidence suggests that dairy products in which the membrane is intact, such as ice cream, are more metabolically neutral than foods like butter, where it’s lost during the churn. (That said, regular cream has an intact membrane, and it hasn’t been consistently associated with a reduced diabetes risk.) Then there is what might charitably be termed the “real-world evidence.” In 2017, the YouTuber Anthony Howard-Crow launched what Men’s Health called “a diet that would make the American Dietetic Association shit bricks”: 2,000 calories a day of ice cream plus 500 calories of protein supplements plus booze. After 100 days on the ice-cream diet, he’d lost 32 pounds and had better blood work than before he’d started pounding Irish-whiskey milkshakes. Still, the method is unlikely to take the slimming world by storm: Howard-Crow called his ice-cream bender “the most miserable dieting adventure I have ever embarked upon.” But overall, I found more receptiveness to the ice-cream signal than I was expecting. “It’s been more or less replicated,” Pereira noted. “Whether it’s causal or not still remains an open question.” Mozaffarian agreed: “I think probably the ice cream is still reverse causation,” he said. 
“But I’m not sure, and I’m kind of annoyed by that.” If this had been a patented drug, he continued, “you can bet that the company would have done a $30 million randomized controlled trial to see if ice cream prevents diabetes.” To be clear, none of the experts interviewed for this article is inclined to believe that the ice-cream effect is real, although sometimes for reasons that differ from Hu’s. Pereira, for example, pointed out that people aren’t always truthful when they’re quizzed on what they eat. His 2002 study found that overweight and obese people reported eating fewer dairy-based desserts than other people. “I don’t believe that the heavier people consume less desserts,” he said. “I believe they underreport more.” If that’s true, then admitting to eating ice cream might correlate with metabolic health—and the ice-cream effect would be, in its way, a marker of fat stigma in America. [From the June 2000 issue: Ice-cream making for beginners] The problem with this line of thinking is that once you start contemplating all the ways that cultural biases can seep into the science, it doesn’t stop at dairy-based desserts. If the ice-cream effect can be set aside, how should we think about other signals produced by the same research tools? “I don’t know what I believe about yogurt,” Tobias told me. It’s widely known that yogurt eaters on average are healthier, leaner, wealthier, better educated, more physically active, more likely to read labels, more likely to be female, and less likely to smoke or drink or eat Big Macs than never-yogurters. “You can’t confidently adjust away all of that kind of stuff,” said Klatt, the UC Berkeley nutritionist. In 2004, the English epidemiologist Michael Marmot wrote, “Scientific findings do not fall on blank minds that get made up as a result. 
Science engages with busy minds that have strong views about how things are and ought to be.” Marmot was writing about how politicians deal with scientific evidence—always concluding that the latest data supported their existing views—but he acknowledged that scientists weren’t so different. The ice-cream saga shows how this plays out in practice. Many stories can be told about any given scientific inquiry, and choosing one is a messy, value-laden process. A scientist may worry over how their story fits with common sense, and whether they have sufficient evidence to back it up. They may also worry that it poses a threat to public health, or to their credibility. If there’s a lesson to be drawn from the parable of the diet world’s most inconvenient truth, it’s that scientific knowledge is itself a packaged good. The data, whatever they show, are just ingredients. This article appears in the May 2023 print edition with the headline “The Ice-Cream Conspiracy.”

While her wife was pregnant with their son, Aimee MacDonald took the unusual step of preparing her own body for the baby’s arrival. First she began taking hormones, and then for six weeks straight, she pumped her breasts day and night every two to three hours. This process tricked her body into a pregnant and then postpartum state so she could make breast milk. By the time the couple’s son arrived, she was pumping 27 ounces a day—enough to feed a baby—all without actually getting pregnant or giving birth. And so, after a 38-hour labor and emergency C-section, MacDonald’s wife could do what many mothers who just gave birth might desperately want to but cannot: rest, sleep, and recover from surgery. Meanwhile, MacDonald tried nursing their baby. She held him to her breast, and he latched right away. Over the next 15 months, the two mothers co-nursed their son, switching back and forth, trading feedings in the middle of the night.
MacDonald had breastfed her older daughter the usual way—as in, by herself—a decade earlier, and she remembered the bone-deep exhaustion. She did not want that for her wife. Inducing lactation meant they could share in the ups and the downs of breastfeeding together. MacDonald, who lives in a small town in Nova Scotia, had never met anyone who had tried this before. People she told were routinely shocked to learn that induced lactation—making milk without pregnancy—is biologically possible. They had so many questions: Was it safe? Did she have side effects? How did it even work? But when she described how she and her wife shared nursing duties, many women told her, “I wish I had had that.” Induced lactation wasn’t initially developed for co-nursing. Mothers who wanted to breastfeed their adoptive babies were the first to experiment with hormones and pumping. But over time, the few experts who specialize in induced lactation told me, that has given way to more queer couples who want to share or swap nursing duties. Early in her career, Alyssa Schnell, a lactation consultant in St. Louis who herself breastfed her adopted daughter 17 years ago, found that when she suggested to same-sex couples that the non-birthing partner might try nursing, “they would be horrified.” The idea that a woman would nurse a baby she did not give birth to—common in the era of wet nurses—had become strange in our era of off-the-shelf formula. Now parents are coming to her asking to induce lactation, and more of them are interested in co-nursing. About a quarter of all babies in the U.S. are breastfed exclusively for six months; more than half are breastfed at least some of the time. The statistics don’t say by whom, but that’s because they don’t need to. We can assume it’s virtually always their birthing mother. Even with the help of formula, the pressure around or preference for breastfeeding means that, in many families, the work of feeding falls disproportionately on one parent. 
But induced lactation decouples breastfeeding from birth. By manipulating biology, parents who co-nurse are testing the limits of just how equal a relationship can truly be. Breastfeeding is hard work, even when it’s “natural.” Adding induced lactation is harder work still. MacDonald was putting herself on a newborn schedule weeks before her baby was even born. She pumped at home. She pumped at work. She even pumped while her wife was in labor, because skipping sessions can cause milk supply to drop. As Diane Spatz, a lactation expert at the University of Pennsylvania and Children’s Hospital of Philadelphia, puts it, “You have to start pumping like a wild person.” MacDonald followed a version of the Newman-Goldfarb protocol, named after a pediatrician and an adoptive mother who documented and shared the process in 2000. In addition to pumping, the protocol includes birth control, which causes a surge of progesterone and estrogen akin to pregnancy hormones, and a drug called domperidone, which boosts the milk hormone prolactin. Together they biochemically prime the body for milk production. It’s unusual, Schnell told me, for a woman inducing lactation to make enough milk to feed a baby all on her own—unless she’s breastfed before, like MacDonald had—but it’s also unusual to make no milk at all. [Read: The problems with breastfeeding go way beyond pumps] In the U.S., getting domperidone can be a challenge. Though the drug is widely available in Canada, Australia, and Europe, the FDA has banned it in the United States, citing the risk of abnormal heart rhythms and even death. But these heart problems have shown up only in the elderly, foreign experts have noted, and Australian scientists concluded in a 2019 review that domperidone is safe for lactation, as long as women are screened for heart conditions. But in the U.S., parents usually aren’t taking it under the supervision of a doctor. 
They might buy pills with a prescription at a Canadian pharmacy or surreptitiously order the drug online through overseas pharmacies. “There was a brief moment when you could only buy it in Bitcoin,” says Lauren Vallone, whose partner, Robin Berryman, induced lactation so that they could co-nurse their daughter, who was born in 2020. Inducing lactation felt like a DIY project to Vallone and Berryman. As a queer couple trying to start a family, though, they were also used to doing things a different way. They eventually reached out to Schnell for guidance, but they also swapped tips in a Facebook support group that had a wealth of anecdotal advice. Not that most doctors would have been helpful. Even the idea that one can breastfeed without having been pregnant isn’t widely known, Spatz told me. “Nurses are surprised about that,” she said. “Physicians don’t know that.” Vallone and Berryman planned to divide nursing duties 50/50, but they didn’t know exactly what that would look like. Would they trade off every other feeding? Would one nurse while the other pumped? What about when one parent went back to work? “There’s stories of people who have induced lactation, but then there’s no, like, ‘Well, what does your day look like?’” Vallone told me. They had no script to follow, so they could write their own. They envisioned giving themselves equal roles from the start, much like how many same-sex couples share a more equal division of labor, because they do not come in with the gender baggage of a heterosexual relationship. [Read: The gay guide to wedded bliss] What Vallone and Berryman did not want was to lapse into the roles that they watched their friends fall into, where the birthing parent becomes the breastfeeding parent becomes the default parent. The arrival of a new baby is a delicate time in any relationship—for many reasons, but in no small part because it disrupts whatever division of labor was previously agreed upon.
Here is a tiny helpless human, along with a mountain of new tasks necessary to keep them alive. If the baby is breastfed, now a large share of that labor can be done by only one parent. In her case against breastfeeding in The Atlantic in 2009, Hanna Rosin described how that initial inequality persists and festers over the years: “She alone fed the child, so she naturally knows better how to comfort the child, so she is the better judge to pick a school for the child and the better nurse when the child is sick, and so on.” But what if—under very specific circumstances at least—breastfeeding did not fall solely on one parent? What if instead of parenthood starting off on unequal footing, it could be perfectly equal from the very beginning? For a while, Vallone and Berryman did trade off feedings, and both continued to pump, because they worried that their milk supplies would drop. They tracked every ounce in a shared spreadsheet. (This careful data logging actually allowed Schnell to write a case study about the couple.) The pumping eventually became too much—they couldn’t sleep if they were pumping!—but they have kept co-nursing for two years now. From the early days, they saw that nursing not only nourished their baby but also soothed her when she cried, made her sleepy when she was tired but fussy. So the work of not just feeding but all-round caregiving fell on them more equally. In the morning, they could alternate one person waking up early with the baby, the other sleeping in. At night, one parent could go out with friends without racing home for bedtime or pumping a bottle of breast milk for the other to feed. Because they could each provide everything their baby wanted, they were also each freer. Breastfeeding simultaneously deepened their relationships with their baby and allowed them a life outside of that. “You really get a sense of how radical it is to have caretaking split so evenly,” Vallone said. 
The couple is now trying for their second child, which Berryman plans to carry. They plan to co-nurse again. Vallone and Berryman did, however, run into an unexpected obstacle to their co-nursing: their baby. She at one point refused to nurse on Vallone, the birthing parent, and wanted to nurse only on Berryman. Any parent is probably familiar with how babies can develop seemingly arbitrary preferences: breast over bottle, left breast over right breast, even. As they get older, toddlers, too, go through periods of wanting only one parent or another to feed, clothe, bathe, or comfort them. In this case—as in many cases—Vallone and Berryman had to be deliberate about returning to a more even state. At its most intense, Berryman would sleep away from the baby in another room; it got better over time, but it also sometimes got worse. Equality did not come easily even with two nursing parents, which perhaps isn’t surprising. The advent of formula did not magically render all marriages equal. Vallone and Berryman still had to work toward keeping their co-nursing relationship as balanced as possible. Dividing work is also, well, work. [Read: Lessons from 40 men in egalitarian relationships] Not all couples who induce lactation end up splitting breastfeeding evenly. Some are not able to, and some don’t even want to. For example, one parent might choose to carry the baby while the other takes on breastfeeding. Some of the women I spoke with were primarily motivated to induce lactation to pass along their antibodies in breast milk, or to physically bond with a baby they did not carry. Even for those who never made more than a few of the roughly 25 ounces a baby typically needs every day, being able to comfort nurse—when a baby sucks more for soothing than for nourishment—was meaningful. They could nurse their baby to sleep or calm them when upset. 
It brought the parents closer together too: Although inducing lactation is not equivalent to pregnancy, both parents felt like their bodies were preparing for a baby together. And later, they could troubleshoot a bad latch or clogged duct together. Breastfeeding can be an isolating experience when one parent is attached to a baby eight times a day and the other looks on a bit helplessly; co-nursing made it less so. Because induced lactation has flown under the radar of mainstream science for so long, a lot remains unknown. A couple of small studies suggest that the protein and sugar content of induced breast milk is in the normal range, but detailed experiments into, for example, the mix of antibodies have never been done. And why are some women who induce lactation able to produce more milk than others? Schnell has noticed that those who have struggled with infertility or hormonal imbalances usually make less milk. She has worked with trans women, too, who are able to make milk, though usually not in large amounts. Men, theoretically, could lactate as well; early studies into domperidone actually noted this as a side effect. There are anecdotal reports of men breastfeeding infants, but there’s virtually no research into the phenomenon. One mother I interviewed, Morgan Lage, told me that her experience inducing lactation to breastfeed her daughter inspired her to train as a lactation consultant, and she hopes now to fill in some of the many unknowns. The Newman-Goldfarb protocol is widely used as the template for anyone attempting induced lactation, but no one has rigorously studied the optimal time to initiate pumping or birth control. Lage started pumping earlier than the protocol suggested, and she wonders if that’s why she was able to have a full milk supply despite never having breastfed before. She loved nursing her daughter. She loved feeling “just as important and needed” in the fleeting, precious period of infancy.
[Read: Breastfeeding at any cost] I know what Lage means about feeling needed, though perhaps because I breastfed solo—as most mothers do—I did not always love it. Still, I remember staring at my baby’s eyelashes and toes, marveling at how nearly every molecule in her body came from mine. We did supplement with formula, too, in part because we wanted my husband to be involved in her feeding. Although the bottle satisfied her hunger, it did not always satisfy some primal need for comfort. During her most inconsolable nights, my husband would spend hours trying to soothe her with every trick in the book, only for her to fall quiet and asleep the minute I nursed her. This frustrated us both. To be needed this way was a burden and a joy. I was sorry, for both of us, that we could not share it.