Essential workers who tugged the United States through the pandemic have not gotten much compensation for what they’ve had to endure, but hey, they did get some perks. Fifteen percent off mattresses for teachers! Allbirds at $35 off with the discount code HEALTHCAREHERO. A free Snickers bar (redeemable only at Walmart with an e-gift card)! Yes, some major retail chains issued hazard bonuses and goosed employees’ wages with “hero pay” increases, at least for a few months; L.A., Seattle, and other cities compelled grocery stores to do the same. But lots of people who put their lives at risk for their employment went without any hazard pay at all, no matter how many Americans stuck a thank you frontline workers! sign in their front yard or apartment window. For restaurant servers, though, the amount of extra money they could earn for taking on a fair amount of extra risk wasn’t subject only to their bosses’ whims or to local regulations. It was also up to their customers. With every order of takeout sushi or oversauced pasta Alfredo, Americans could add some hero pay when they got their bill. And they did: Waiters from all across the country told me about the Great Pandemic Tipping Boom of 2020: “I think I got a $20 tip from someone at least once a week,” said Lori Pearson, a waitress at Bob Evans in Ann Arbor, Michigan. “That basically never happened before the pandemic.” At Zoetropolis, a restaurant and distillery in Lancaster, Pennsylvania, the average tip shot up to about 25 percent last spring, allowing waiters to bring home close to what they were making before the pandemic even with a fraction of the tables filled. [Read: An extinction event for America’s restaurants] That was then. Now squint at America’s dive-iest dive bars and stodgiest steakhouses and you might just forget that the pandemic ever happened. Even in the bluest of states, spaced-out seating and useless-anyway plexiglass dividers are going away. 
Forget about customers; even some workers aren’t wearing masks. At the start of June, the number of people eating out was already back to 2019 levels, according to data from OpenTable. The hazard Americans felt is fading. Will their hazard tipping fade as well? What Americans actually tip, both now and before the pandemic, is an enduring mystery. The Department of Labor fastidiously tracks the smallest movements in wages, but no government agency even tries to monitor all the extra bills that get strewn on restaurant tables. To understand how the size and frequency of tips might have changed since the start of the pandemic, and whether they are changing back, I reached out to Square, the payment company that processes credit-card transactions for millions of small businesses. Square isn’t just how you use your credit card to pay for those heirloom tomatoes at the farmers’ market; a large number of American restaurants are also on the platform. That means all the tips that get tacked on to credit-card transactions using Square are also counted up and stored. The company provided me with data going back months before the pandemic, on how often restaurant customers were giving tips, and how big those tips were. The numbers show that the Great Pandemic Tipping Boom was real, if perhaps a bit less great than I’d expected. In the innocent times before March 2020, the average tip when cards were swiped at sit-down restaurants never strayed outside a very narrow range of 19.9 to 20.1 percent—corresponding to the tipping norm that, coincidentally or not, is also incredibly easy to calculate. Then, on March 24, as stay-at-home orders began to pile up and Americans clapped and howled and clanged pots in appreciation of essential workers, the average tip did something weird: It started drifting upward. Within a few weeks, the average hit a peak of 21.0 percent. 
When the first pandemic wave receded, tips fell off a bit, to roughly 20.4 percent over the summer; they came up again, to 20.8 percent, during January’s massive spike in cases. Even now, as fully vaccinated Americans return to their normal lives, tips remain higher than they were in 2019. In the past few months, the average appears to have settled at 20.6 or 20.7 percent, well above the pre-pandemic norm. A focus on average tips may hide the full extent of the tipping boom. When shutdowns first went into effect, would-be diners turned to the only options available to them: takeout and delivery. Most people, before the pandemic, tipped 20 percent only for the traditional dine-in experience. They gave less—or nothing at all—when their entrées came plastic-bagged and not plated. Given that pattern, you would expect the huge rise in takeout meals to pull down the average tip. But that doesn’t appear to have happened. “The simple explanation is that there is a greater willingness among some people to tip now as opposed to before the pandemic,” says Michael Lynn, a marketing professor at Cornell University and an expert on tipping. [Read: Don’t blame Econ 101 for the plight of essential workers] The restaurant workers I talked with all had an unbelievably hard year. Nearly every one of them spent time on unemployment benefits before returning to contend with pandemic restrictions and empty tables and the constant threat of infection from maskless munchers. For a base pay of $3.60 an hour, Pearson, the Bob Evans waitress, has had to sanitize every single ketchup bottle and salt and pepper shaker after each diner walks out. All the extra trips between the kitchen and her tables have taken a toll on her body. A boost in tips hasn’t made up for any of that, but the displays of sheer altruism have been nice. 
Alex Boenelli, a waitress at Texas Roadhouse in Bensalem, Pennsylvania, told me that her tips averaged 15 percent before the pandemic, but, early in the pandemic, they were often more than 25 percent. “People were tipping more because they were excited to be out,” she said. “They were so happy to be somewhere, and they felt terrible for us.” Okay, but now what? For America’s overworked, underpaid, and perpetually harassed servers, booked-up restaurants could augur the return of impatient customers who aren’t prone to spurts of generosity. Although the Square data suggest that tips are still above the norm, on average, some waiters told me that stinginess and other pre-pandemic behaviors have returned. Another server told me the exact opposite, that tips started increasing when Americans got their tax refunds and were relishing how good it felt to be back in restaurants. I asked Sara Hanson, a University of Richmond marketing professor, whether she thought anything about the past 15 months might have led to a metamorphosis in how the country treats waitstaff. “I don’t think the pandemic is going to lead to any long-term changes in tipping,” she told me. So much for that! At some point, maybe this summer, maybe a little later, eating out won’t feel so novel or even downright fun, and we’ll all just revert to our old behaviors without really thinking about it. The still-darker scenario is that Americans may soon become even less likely to tip than before the pandemic. If you’ve walked past a restaurant lately, you may have seen a help wanted sign in the window, or even one with promises of signing bonuses. These businesses cannot find enough workers, as some combination of low pay, increased government benefits, and dangerous conditions spurs waiters to leave the industry. Restaurants are, at long last, doling out raises to entice people back, but if customers notice what’s going on, they could conceivably start tipping less, Hanson told me. 
Even if they don’t, the shortage is making everything about eating in a restaurant a little slower and less fun. A server in Lancaster, who didn’t want to be named for fear of retaliation by her employer, told me that she’s been so busy lately, she has to bus even bigger stacks of plates than normal, and the endeavor sometimes saps all feeling out of her hands. It’s not her fault that she doesn’t have time to stop by to see if her customers want another cosmo, but they might not be so understanding when the time comes to tip. [Read: Workers should have the power to say ‘no’] The other possibility is that someway, somehow, Americans actually have relearned how to treat restaurant workers. The biggest change in tipping doesn’t seem to be happening inside restaurants, but rather at people’s front doors. The Square data include restaurant transactions in which a customer orders food but doesn’t swipe their card in person, which Lynn, the Cornell professor, told me generally corresponds to online delivery orders. Going by these numbers, people bothered to give a tip for only about half of all such orders before the pandemic started. By May 2020, though, that proportion had risen to more than 75 percent—and it hasn’t stopped going up. Last month, at least 84 percent of these transactions included a tip. The fact that so many more people seem to be tipping on delivery and takeout orders, so late into the pandemic, suggests that something really has changed. It’s not as crazy as it sounds. Tipping breaks the rules of Newtonian physics, Lynn told me: What goes up stays up (usually). “Some people tip to show off and get good service,” he said. “If enough people are doing that, then everyone else has to at least tip average to avoid losing the server’s esteem. It’s this continuous upward pressure.” If a critical mass of Americans really did develop higher tipping habits, the rest of us stragglers may not have much choice but to follow along. 
Even if they go away completely, hazard tips will have been a success story. Larger gratuities “weren’t an economic calculation,” Les Boden, a public-health economist at Boston University, told me. “People understood that a restaurant worker was taking more risk than they had before, and were thinking, Your job is important to me, and I want to show that to you.” But the need for hazard tips also reflects a fundamental failure of the restaurant industry, and of its oversight: How much a server is compensated for the risks of working through a pandemic should not be up to split-second decisions from the rowdy diners at Table 3.
The coronavirus is on a serious self-improvement kick. Since infiltrating the human population, SARS-CoV-2 has splintered into hundreds of lineages, with some seeding new, fast-spreading variants. A more infectious version first overtook the OG coronavirus last spring, before giving way to the ultra-transmissible Alpha (B.1.1.7) variant. Now Delta (B.1.617.2), potentially the most contagious contender to date, is poised to usurp the global throne. Alphabetically, chronologically, the virus is getting better and better at its primary objective: infecting us. And experts suspect that it may be a while yet before the pathogen’s contagious potential truly maxes out. “A virus is always going to try and increase its transmissibility if it can,” Jemma Geoghegan, an evolutionary virologist at the University of Otago, told me. Other aspects of the virus’s unfolding bildungsroman, however, are much harder to forecast, or even get an initial read on. Researchers still don’t have a good handle on which variants might cause more cases of severe disease or death, a metric called virulence. And while a virus’s potential to transmit can sometimes heighten its propensity to kill, the two are by no means inextricably linked: Future coronavirus strains could trend more lethal, or less, or neither. We keep trying to pigeonhole specific variants as “more dangerous,” “more deadly,” or “more problematic,” but viral evolution is a humbling, haphazard mess—a plot-twisting story we have to watch play out in real time. “We cannot be complacent about ‘Oh, this is the end of the mutations,’” Akiko Iwasaki, a virologist and immunologist at Yale, told me. As long as the virus has hosts to infect, it will keep shape-shifting in ways we can’t fully predict. That biological caprice makes it harder to anticipate the next pandemic hurdles we’ll need to clear, and assess the dangers still ahead. 
But our role in this relationship matters too: What the virus can accomplish also depends a great deal on us, which means its evolution does as well. As desperately as we want to purge it, the coronavirus’s main objective is to get closer to us. Its biological imperative is to enmesh itself into a suitable host, reproduce, and disperse, then begin the process anew. In the past year and a half, SARS-CoV-2 has found its way into at least 180 million human hosts, and still the virus wants more. “The evolutionary pressure for a virus is transmissibility,” Iwasaki told me. Any changes that help it make more of itself, sooner, will help it flourish, like a fast-growing weed settling into a new garden. Most mutations that occur in the SARS-CoV-2 genome are inconsequential, even detrimental, to the virus’s propagation campaign. Occasionally, though, one virus will hit upon a smidgen of advantage. All else held equal, this variant will have a leg up on its kin, and may outcompete them. Epidemiologists sampling the sick will see a sharp upswing in the proportion of people infected by a specific version of the virus—one too large and too sudden to be explained by chance. Such a spike tipped off public-health officials to the presence of Alpha shortly before it erupted across the globe. “It went from nothing to everything really quick,” Joseph Fauver, a genomic epidemiologist at Yale University, told me. Delta now appears to be following in its predecessor’s footsteps; it swept first through India and the U.K., overtaking more sluggish variants, then spilled over international borders. Exactly how Alpha and Delta executed their meteoric rise is less clear: SARS-CoV-2 has likely hit upon multiple ways to spread more efficiently between hosts. Certain mutations might have helped Alpha more easily glom on to the outsides of cells; others might increase Delta’s ability to accumulate in the airway, the virus’s natural point of egress. 
Still other genetic changes could make specific variants hardier, perhaps allowing them to linger in the nose, so hosts stay contagious for longer. These different possibilities can be teased apart in experiments in laboratory cells and animals, but they all converge on a single principle, Angela Rasmussen, a virologist at the Vaccine and Infectious Disease Organization in Saskatchewan, Canada, told me: “What we’re seeing is a virus that’s becoming more efficient at making more viruses.” Given sufficient time with a new host, most viruses can be expected to trend more transmissible; the coronavirus is probably no exception. A more contagious virus might, at first pass, seem like a deadlier virus: Its enhanced invasion capabilities might allow it to grip more tightly onto its host, building up to levels high enough to overwhelm the body. “In that case, you could have transmissibility and virulence increasing in lockstep,” Paul Turner, an evolutionary biologist and virologist at Yale, told me—a neat, simple story. Some researchers have hypothesized that this could be the narrative behind the Alpha and Delta variants, both of which have been linked to bumps in hospitalization. But those patterns haven’t yet been conclusively nailed down, Turner said, and no evidence so far suggests that the coronavirus is systematically evolving to become more malicious. Viruses are microscopic entities hungry for spread, not carnage; the suffering of their host is not an imperative for them to persist. If a surge in virulence happens, it’s often incidental—collateral damage from an increase in contagiousness. The march toward transmissibility doesn’t always drag virulence along. Many people have been found to silently carry tons of SARS-CoV-2 in their airways to no ill effect. On occasion, the two traits can even butt heads, forcing viruses to become tamer over time in service of speedier spread. 
The hypervirulent myxoma virus, a pathogen deliberately introduced into Australian rabbits in the 1950s as a form of biocontrol, for instance, appears to have become less lethal over time. Instead of killing rabbits instantly, it began to prolong its hosts’ sickness—and by extension, its own infectious window. But myxoma is more exception than rule. Super-deadly or debilitating viruses such as Ebola and dengue, Fauver pointed out, don’t seem to be getting gentler; they already spread just fine. SARS-CoV-2 may have especially little reason to domesticate itself, since so much of its transmission happens before serious symptoms appear: “It’s not killing people before they can pass it on to someone else,” Rasmussen said. If the fates of SARS-CoV-2’s virulence and transmission aren’t tightly coupled, “there’s no responsible way to make any predictions about how virulence is going to change right now,” says Brandon Ogbunu, an evolutionary and computational biologist at Yale. Alpha and Delta may still be, particle for particle, more formidable foes than other variants; if they’re consistently driving more disease, hospitalization, and death, those trends are certainly worth paying attention to. But definitively tying them to specific viral traits or mutations is difficult, in part because virulence itself is a murky concept. “It’s kind of a disastrous word,” Ogbunu told me. It’s meant to convey the damage caused to a host by a pathogen. But damage is subjective, and depends at least as much on the host as it does on the virus. While measuring transmissibility can mean simply asking whether a variant is present and to what extent, sussing out virulence is a more qualitative interrogation, of how virus and body interact, across a bevy of different environments. If variants are weeds, virulence asks how pernicious they are, and the answer can be heavily influenced by the delicacy of the garden plants they’re throttling. 
Hospitalizations and deaths, some of the best real-world readouts for virulence, by themselves can be fraught metrics to use, says Müge Çevik, a virologist and infectious-disease expert at the University of St. Andrews, in the U.K. Not all places have the same standards of care, or the same access to treatments. Sick people might be admitted to a hospital because of a nastier form of the virus—or because of risk factors that made them more vulnerable to begin with. Immunity to SARS-CoV-2 has also been building over time, muddling susceptibility further. And much of the hardship caused by the coronavirus remains outside hospital walls. The difficulty of comparing populations may be part of the reason why different studies looking into variant severity have sometimes turned up discordant results. Ballooning case rates also have a way of reinforcing themselves: When many people suddenly get sick—perhaps because a more transmissible variant has emerged—medical infrastructure gets overwhelmed, and more people might die, even if the virus itself is no more harmful. “The epidemiology is so noisy, it’s so hard to say,” Vineet Menachery, a coronavirologist at the University of Texas Medical Branch, told me. (Researchers now generally agree that Alpha is deadlier than other variants; the news on Delta is less certain.) That puts the onus on researchers to meticulously catalog not only the variants infecting us, but the characteristics of the people they most strongly affect, says Rebekah Honce, a virologist at St. Jude Children’s Research Hospital. “It’s a trifecta of host, agent, and environment—you can’t ignore any branch.” COVID-19 will, inevitably, look different in the future. But our relationship with the virus won’t hinge solely on its genetic hijinks: We can expect the immune defenses we raise against SARS-CoV-2 to shape its evolutionary path. 
With vaccines on the rise in many parts of the world, and fewer hosts to infect, the virus is starting to hit roadblocks and slowly sputter out. “By vaccinating, we’re making it less likely that new variants will emerge,” Çevik told me. Eventually, as our collective defenses build, SARS-CoV-2 might become no more a nuisance than a common-cold coronavirus, causing only fleeting and inconsequential symptoms in most people, whose bodies have seen some version of the pathogen before, Jennie Lavine, an epidemiologist and virologist at Emory University, told me. That, of course, makes equitable access to vaccines all the more important, so mutational hot spots don’t arise in unprotected places. Left to its own devices, the virus could hypothetically bridle itself. But it may have no incentive to. “Counting on the virus to become less virulent on its own is a bad bet,” like waiting for an enemy to slacken its offense, Yale’s Iwasaki told me. The better move is to double down on our defense, the tools we already know best. There is a curious caveat to the deployment of vaccines. While inoculations aren’t themselves the cause of SARS-CoV-2 mutations, the immunity they provide can nudge the virus onto new trajectories that we’ll need to keep monitoring. A less-than-stellar vaccine developed to block Marek’s disease in chickens goaded one virus into higher transmissibility and virulence, making the pathogen more dangerous to unvaccinated birds. (There’s no evidence that’s happening with SARS-CoV-2 and our current lineup of excellent vaccines, but the virus will continue to pose an especially big threat to those who aren’t immune.) Pressure from the vaccines could also drive the spread of variants that are better at eluding our defenses and, perhaps, stumping some of our shots. A handful of variants, including Delta, have already demonstrated the ability to dodge certain antibodies—another trait, Çevik said, that enables the virus to enter its host more easily. 
In years to come, we’ll probably have to tinker with our vaccine recipes to keep pace with the fast-changing virus. But every vaccine we debut has the potential to block a route the virus might have otherwise taken. Viral genomes aren’t infinitely mutable—they can edit only the starting material they’ve been given, and they can’t make certain changes without hamstringing their precious capacity to spread. With time, we might be able to use shots strategically, to force SARS-CoV-2 onto more predictable evolutionary paths, Turner told me: “That’s the way we gain control.” If we’re going to live with this virus long-term—as we absolutely must—then vaccines are our key to building a sustainable relationship, one in which we turn the tables. We can make the virus’s evolution react to us, and not the other way around.

Ominous pathogens seem to arrive every few years: SARS in 2003, swine flu in 2009, Ebola in 2014, Zika in 2016, COVID-19 in 2019. The World Health Organization calls these viral threats “Disease X,” both to encourage policy makers to think broadly about what the next pandemic might be, and because it could be anything. At this rate, 2025 is not looking good. After an inept coronavirus response, will the United States do better when the next pandemic strikes? Experts generally agree that America learned from the past year, and that the next public-health crisis won’t be quite as bewildering. But America’s pandemic preparedness still has major gaps, some of which are too big for any one administration to fix. In recent weeks, I’ve called back many of the experts I interviewed over the past 18 months about masks, testing, contact tracing, quarantine, and more. I asked them, “Are we ready for another one?” The short answer is “Not quite.” The long answer is that being truly “ready” will be harder than anyone realizes.

Public-health Capacity

The U.S.
is notorious for spending oodles on health care, but health care has little to do with stopping the spread of infectious diseases. When a person has strep throat, they go to the doctor; when a nation faces an epidemic, it turns to public-health workers. But one major reason the U.S. struggled to contact trace was that budget cuts following the 2008 recession had eviscerated the nation’s public-health departments. Spending on state and local public-health departments has declined by 16 and 18 percent, respectively, since 2010, according to an analysis by Kaiser Health News and the Associated Press. Public-health departments’ data systems are especially outdated, which means that public-health workers have trouble tracking people’s vaccine status, counting COVID-19 deaths, or sharing data across state lines. The American Rescue Plan, which was signed by President Joe Biden in March, dedicates $7.7 billion to hiring and training more public-health workers to perform tasks such as contact tracing and vaccination. Several experts commended this cash infusion, but they said what’s really needed is a larger annual public-health budget. Public-health departments can’t hire people based on a onetime surge of money. Just like businesses, they need annual revenue in order to make payroll. “A lot of states are not going to hire people unless they know that there’s a secure, ongoing level of funding,” Marcus Plescia, the chief medical officer at the Association of State and Territorial Health Officials, told me. That would need to come from Congress, a body that is not known for acting swiftly and boldly.

Testing

In March of last year, I explained that the U.S. was behind on coronavirus testing because the FDA’s authorization process for new types of lab tests—called an emergency use authorization, or EUA—was too slow. 
“The speed of this virus versus the speed of the FDA and the EUA process is mismatched,” Alex Greninger, the assistant director of the virology division at the University of Washington Medical Center, told me at the time. [Read: The only way we’ll know when we need COVID-19 boosters]. After these early testing bungles, the FDA changed its authorization process so that labs could spin up tests more quickly. But testing for Disease X is not guaranteed to go more smoothly. The FDA is answerable to whichever administration is in charge at the moment, and the next pandemic might happen under the watch of President Donald Trump Jr., not President Biden. A president might be incentivized to slow testing so that the overall rates of infection look better—and indeed, President Donald Trump reportedly did this. Another challenge that labs faced this time was getting a sample of the coronavirus out of China, where it originated and where controls on viral-sample shipping are strict. When I called Greninger back recently, he said he hopes that whoever is at the helm of the FDA during the next crisis will allow labs to use the virus’s genetic sequence, which is easier to obtain than a live sample, as the initial way of proving that their test works. (In response to a request for comment, an FDA spokesperson said that in the future, “if there are no available clinical specimens, FDA will consider the best approach to allow for validation with the most appropriate means available, for a limited time until clinical specimens become available.”) Other testing reforms would be helpful too. The Health and Human Services Department needs to do a better job of coordinating testing among public-health labs, academic labs, and commercial labs, all of which were working on different kinds of tests at the beginning of this pandemic, Scott Becker, the CEO of the Association of Public Health Laboratories, told me. 
The federal government should also be proactively monitoring wastewater for signs of an emerging virus, not relying on people to volunteer for testing, says Ralph Catalano, a public-health professor at UC Berkeley. These steps would be wise, but they hinge on the wisdom of the people in power when Disease X hits.

The Mask Shortage

As Americans were learning about the coronavirus pandemic, they also learned of something called “the national stockpile,” which held a strategic reserve of N95 masks. Or at least it was supposed to. It turned out that the federal government had distributed 85 million N95s during the 2009 swine-flu pandemic, and that supply was never replenished. That led to a shortage of masks in 2020 just as health-care workers needed them most. For now, that shortage has been alleviated. Last year, the federal government bought 325 million more N95s, said Dan Glucksman, the senior director for policy at the International Safety Equipment Association, which develops standards for personal protective equipment. In general, the Biden administration has shown “a commitment to a very data-driven, scientific approach to planning,” Charles Johnson, the president of the ISEA, said. But Glucksman and Johnson told me the administration could improve the stockpile further by having mask manufacturers rotate out the mask supply regularly so that it never expires. (N95 masks expire after about five years.) And to combat the hordes of N95 counterfeiters, Biden would do well to establish a White House–level office to fight fakes, they said.

Quarantine

Americans were supposed to stay home for two weeks if they tested positive for COVID-19 or were exposed to it, but months into the pandemic, it became clear that they weren’t actually quarantining. The reason many people didn’t quarantine was sad and banal: They didn’t have paid time off from work. 
“We hear people say, ‘I have to work; I have to have my income,’” Ray Przybelski, the director of the Portage County Health and Human Services Department, in Wisconsin, told me in December. Throughout the pandemic, the federal government did pass several laws that allowed Americans to stay home from work if they were sick with COVID-19 or had to take care of children who were home from school. The concept of paid time off was so new to Americans that many didn’t realize they could take it. But those provisions have now expired, and that leaves America as, once again, the only industrialized country without mandatory, national paid leave. If paid leave isn’t established through legislation before the next pandemic, Americans will find themselves in the same situation, dragging themselves into work and spreading pathogens behind them.

The States

Americans’ experience of the pandemic was largely determined by the state they lived in. Texans were allowed to stop wearing masks on March 10, 2021, when less than 10 percent of the U.S. population had been fully vaccinated. Hawaiians, meanwhile, were required to keep wearing masks indoors as of May 26, when 40 percent of Americans had been fully vaccinated. Last April, a New Yorker might have huddled alone in her tiny apartment while her relatives in South Dakota, which never issued a stay-at-home order, sat in a casino as though it were a normal spring day. The entire pandemic was a bizarro choose-your-own-adventure story in which governors did most of the choosing. [Read: America’s entire understanding of the pandemic was shaped by messy data] The Trump administration’s unwillingness to have the federal government take the lead made local public-health officials’ job harder. Contact tracing became a brand-new, massive undertaking thrust on each state overnight. “States were left to figure out contact tracing themselves,” said Steve Waters, the head of Contrace, which helps connect contact tracers with health departments. 
The Biden administration believes that the federal government is a necessary leader in pandemic response, and will therefore be better positioned to coordinate state actions if Disease X arrives on its watch. But the ability of the government—any government—to handle a pandemic will be limited in a country where federalism and individualism are prized. Other countries have a minister of health; the U.S. has a weak CDC that makes suggestions states can follow or not. “The public-health response has to be unified across the country, must be guided by national leadership and national direction,” Wafaa El-Sadr, a professor of epidemiology and medicine at Columbia University, told me. “This is almost impossible in the face of limited authority of the CDC over states and the autonomy of states in making their own decisions, often due to political imperatives.” El-Sadr suggested that, in emergencies such as pandemics, perhaps the CDC could temporarily take on a more “directive” role, telling state leaders exactly what to do. But given the politicization of even cloth face masks and free vaccines, that’s highly unlikely to happen. She also brought up something that will take more than one presidential administration to fix: A lot of Americans died because “we don’t have a healthy population overall,” El-Sadr said. America has a high rate of obesity, a high rate of poverty, a high rate of uninsurance, and now, a high rate of anti-vaccine conspiracism. Pandemics exploit the vulnerabilities that we’ve never bothered to shore up. We may not know what Disease X will be, but it knows exactly where to hit us.

If your wanderlust is coming on extra strong this summer, you may be wondering what to do with it. Being vaccinated may feel like a superpower, but what exactly is safe—or not? The CDC suggests, for example, that this may be the summer for road-tripping by RV. 
“If traveling in a RV, you may have to stop less often for food or bathroom breaks, but you could still be in close contact with others while staying at RV parks overnight and while getting gas and supplies,” the agency advises in its travel tips for families with unvaccinated children. For long distances, RVs are more to the CDC’s liking than trains, buses, cruise ships, or river boats. But if your wanderlust is coming on extra strong and you don’t have $40,000 to drop on a Winnebago, you might have some questions about those trains and buses (if not the river boats). Fewer state and federal rules govern travel within the United States this summer than last, but that freedom—along with the differing recommendations for people with differing vaccination statuses, and scary new coronavirus variants that spread more quickly—can still make assessing the risk involved in driving to a wedding or flying to see Grandpa quite tricky. To help you decide on the best travel plan for you and your family, here are answers to five summer-vacation questions that go beyond the CDC’s tips. They are not exhaustive, but I hope that they will provide you with a framework for nuanced conversations about how dangerous a particular itinerary really is, and how much of that danger you’re willing to tolerate. 1. Are states with low vaccination rates off-limits? About 45 percent of all Americans are fully vaccinated, but that sweeping number belies a lot of local differences. The percentage of adult Vermonters who are fully vaccinated (75 percent) is almost exactly double that of adult Mississippians (38 percent). Things are even more disparate on the county level: In McKinley County, New Mexico, which includes part of the Pueblo of Zuni reservation, more than 99 percent of eligible residents are vaccinated. In Union County, on the other side of the state, only 17 percent are. 
If you’re at least two weeks past your final dose (and you’re not immunosuppressed or immunocompromised), visiting areas of the U.S. with low vaccine rates isn’t necessarily dangerous for you. (You will probably eventually need a booster shot, but we still don’t know when.) Saskia Popescu, an infectious-disease epidemiologist at George Mason University, told me that the bottom line is “we have very efficacious vaccines” in the U.S. Still, she said, getting infected with the coronavirus post-vaccination is “quite rare, but it is possible,” and it’s mathematically more likely in an area where the virus is circulating at high levels and more people are vulnerable to it. [Read: The rural pandemic isn’t ending] Those who are worried about breakthrough infections—especially those who have vulnerable people in their household—might want to take more precautions in a low-vaccination area than they would in a high-vaccination one. If you’ve returned to indoor dining or CrossFit at home in a highly vaccinated region, consider skipping those activities while traveling to a place that has given out fewer shots. Visiting less-vaccinated areas can also be an opportunity to model good behavior. “If I’m visiting friends or family there, and maybe they haven’t been vaccinated, I would take that opportunity to talk to them about getting vaccinated,” Popescu said. If you’re traveling because you’ve been invited to an indoor, mask-free, no-shots-required event, attending with a mask “encourages those people who aren’t vaccinated to wear a mask.” Better yet, you can have a conversation with your hosts about how to make the event safer, perhaps by hosting it outside. 2. Plane, train, or automobile? “If you’re vaccinated, you really don’t need to worry about your exposure on an airplane, on a bus, in the subway, or at the office, or anywhere else you go,” Joseph Allen, an associate professor at the Harvard T.H. Chan School of Public Health, told me. 
Planes especially tend to get a bad rap when it comes to infectious disease (see: the classic Airborne packaging), but flying is actually rather COVID-safe once you’re off the ground. “When the airplane is running, the ventilation and filtration are better than you find in a hospital,” said Allen, who also directs Harvard’s Healthy Buildings Program. That doesn’t mean air travel is 100 percent risk-free. Security lines, baggage claims, and gates usually don’t have the same ventilation standards as a plane does at 30,000 feet. They can also put you in proximity with lots of people for a long time. Air travel isn’t quite back to pre-pandemic levels, but it’s close: 2.1 million people flew this past Sunday, and about 2.7 million people flew around this same time in 2019. As Popescu put it, “We went from a place where not a lot of people are traveling, and it just snowballed overnight.” [James Fallows: Air travel is going to be very bad, for a very long time] If you or someone you’re traveling with is still vulnerable to the virus—whether it’s because of an immune condition or because they’re still too young to get the shot—Allen recommended paying special attention to the boarding period, when planes might not run their ventilation systems to save power. Similarly, if you’re considering traveling by bus, make sure that operators are replenishing the cabin with fresh air from the outdoors, and not just recirculating unfiltered air from inside. No matter what form of public transit you choose, you will be required to wear a mask. If you’re not already accustomed to doing so for hours and hours at a time, you could be in for a rude awakening. Before you depart, make sure that your mask is comfortable and fits well, mentally prepare for how long you’ll be wearing it, and make a plan for mask breaks—preferably outside, away from other people, or while the plane’s ventilation system is running at full blast. 3. 
Are kids invited? While the long-term effects of COVID-19 on anyone won’t be clear for a while, it’s pretty clear that young people are at much lower risk of severe illness and death if they do get infected. For a lot of the past year, one of the scariest possible outcomes of a second grader being exposed to the virus was her passing it to her parents; now everyone in the family might be protected but her. Sean O’Leary, a pediatrician and a professor at the University of Colorado, told me that families that include both kids under 12 and people who can’t be vaccinated or are at high risk for severe COVID-19 might want to be extra mindful of their kids’ exposure, because the kids could pass the virus to someone who’s not protected. He also cautioned that “we don’t really have good data yet” on how severe the Delta variant of the coronavirus, which is on track to quickly become dominant in the U.S., is in children, though it does seem to be more transmissible among people of all ages. [Read: We are turning COVID-19 into a young person’s disease] Still, the kids don’t necessarily need to stay home all summer. “Flying appears to be a relatively low-risk activity, including with kids,” O’Leary said. If you’re bringing your unvaccinated children to an area with low vaccination and high case rates, make sure you’re taking appropriate precautions. “If you’re unvaccinated or you have a young child who’s not yet vaccinated or if you’re just feeling extra cautious,” Allen said, “the best thing to do is wear a high-quality mask,” such as an N95, a KN95, or a KF94. That’s especially true if you want to bring the kids to any kind of large event where adults might be unvaccinated too. 4. What if I get stuck next to an anti-masker? If you’re fully vaccinated and not immunosuppressed or immunocompromised, you shouldn’t be in any significant danger from a seatmate with their nose out. 
On any long trip, your fellow passengers are going to need to eat and drink, and you should be prepared for that eventuality before buying a ticket. If the guy in seat 27B, say, takes a phone call mask-free and you do want to intervene, Popescu recommends making eye contact with the offender, then making the motion of pulling up your own mask. If you’re particularly worried about mask compliance, you might want to opt for travel by train or plane, where conductors and flight attendants are more likely to be patrolling the rows. But that doesn’t eliminate the possibility of a free-for-all at the station or airport before you board. 5. When’s the right time to go? Things are, for the most part, looking up in the U.S. right now. But the country’s—and the world’s—recovery from the pandemic likely won’t be linear. A fall or winter surge is a distinct possibility, and though vaccinated people will likely still be protected, travel is always riskier when more virus is circulating. The longer vaccination rates lag worldwide, the more opportunity the virus will have to mutate into forms that can outwit existing vaccines. [Read: Expect the unexpected from the Delta variant] As long as case rates stay low this summer, it might be a good idea to take the mental-health break you need while you can, so you’ll be better prepared to hunker down in the cold weather, if needed. The warmer months also offer the advantage of allowing for more outdoor visits and sightseeing. One caveat: Pfizer announced earlier this month that it will likely seek an emergency-use authorization for its vaccine in children under 12 in September. If you’re particularly worried about your young children’s exposure, keep an eye on that target for your vacation planning. After so long without regular travel, don’t be surprised if you forget some of the basics. Yes, Amtrak will scan your ticket off your phone. (My friends keep forgetting this one.) 
No, you don’t need to take off all your jewelry in the TSA line. (I forgot that one.) One of the more important things you might have forgotten is how often you come home from a trip with a cold or worse. Travel forces lots of people together into small spaces, regardless of whether they have the sniffles. It’s hard to keep away from others or wash your hands on a bus. Before the pandemic, Popescu said, travel was a common way people got infected with viruses like the flu, parainfluenza (which can lead to croup or pneumonia), and a cold-like sickness called RSV that can be dangerous to the very young and the very old. [Read: The only way we’ll know when we need COVID-19 boosters] While you’re busy thinking about how to keep yourself safe from the coronavirus, don’t forget that your shiny new vaccine still leaves you vulnerable to the same old pathogens that plagued us before. Cloroxing your seat-back tray table might not protect you from COVID-19. But maybe we should’ve been doing it all along.

At the end of January, reports that yet another COVID-19 vaccine had succeeded in its clinical trials—this one offering about 70 percent protection—were front-page news in the United States, and occasioned push alerts on millions of phones. But when the Maryland-based biotech firm Novavax announced its latest stunning trial results last week, and an efficacy rate of more than 90 percent even against coronavirus variants, the response from the same media outlets was muted in comparison. The difference, of course, was the timing: With three vaccines already authorized for emergency use by the U.S. Food and Drug Administration, the nation is “awash in other shots” already, as The New York Times put it. Practically speaking, this is true. If the FDA sees no urgency, the Novavax vaccine might not be available in the U.S. for months, and in the meantime the national supply of other doses exceeds demand. 
But the asymmetry in coverage also hints at how the hype around the early-bird vaccines from Pfizer and Moderna has distorted perception. Their rapid arrival has been described in this magazine as “the triumph of mRNA”—a brand-new vaccine technology whose “potential stretches far beyond this pandemic.” Other outlets gushed about “a turning point in the long history of vaccines,” one that “changed biotech forever.” It was easy to assume, based on all this reporting, that mRNA vaccines had already proved to be the most effective ones you could get—that they were better, sleeker, even cooler than any other vaccines could ever be. But the fascination with the newest, shiniest options obscured some basic facts. These two particular mRNA vaccines may have been the first to get results from Phase 3 clinical trials, but that’s because of superior trial management, not secret vaccine sauce. For now, they are harder and more expensive to manufacture and distribute than traditional types of vaccines, and their side effects are more common and more severe. The latest Novavax data confirm that it’s possible to achieve the same efficacy against COVID-19 with a more familiar technology that more people may be inclined to trust. (The mRNA vaccines delivered efficacy rates of 95 and 94 percent against the original coronavirus strain in Phase 3 trials, as compared with 96 percent for Novavax in its first trial, and now 90 percent against a mixture of variants.) [Read: The differences between the vaccines matter] Pandemic-vaccine success, as I wrote last year, was never just about the technology. You needed a good vaccine, sure—but to get it out the door quickly, you also had to have a massive clinical-trial operation going, and it had to be situated in places where the virus would be spreading widely at just the right time. 
Even if your candidate worked amazingly well, if you weren’t testing it in the middle of a huge outbreak, you’d have to wait a very long time for the evidence to build. The precise timing of these studies mattered a great deal in practice. The Phase 3 clinical trials for Pfizer and Moderna, for example, were up and running in the U.S. by late summer 2020, and so they caught the nation’s giant wave of infections in the fall. By the time Novavax had finished recruiting in the U.S. and Mexico, in February, case rates had been dropping precipitously. This fact alone, independent of any aspect of vaccine technology, did a lot to shape the outcome. Corporate strategy was another crucial factor. To “win” the vaccine race, a company would need to be able to produce high-quality vaccine doses reliably and quickly, and in vast numbers. It would also need to field the challenges of working with multiple regulatory agencies around the world. And it would need to do all of this at the same time. BioNTech, the German company that developed the Pfizer mRNA vaccine, could not have accomplished so much, so quickly by itself. Last October, the company’s CEO, Uğur Şahin, told German interviewers that BioNTech had sought out Pfizer for help because of the scale of the clinical-trial program necessary for drug approvals. That strategic partnership, and not simply the “triumph of mRNA,” was what propelled them past the post. (Moderna had the advantage of its partnership with the National Institutes of Health.) Consider this: The BioNTech-Pfizer first-in-human vaccine study appeared on the U.S. government’s registry of clinical trials on April 30, 2020—the same day as the first-in-human vaccine study for Novavax, which would be going it alone. In a parallel universe where Novavax had paired up with, say, Merck, this story could have come out very differently. 
In the meantime, the early success of two mRNA vaccines pulled attention away from the slower progress of other candidates based on the same technology. Just two days after last week’s Novavax announcement came the news that an mRNA vaccine developed by the German company CureVac had delivered a weak early efficacy rate in a Phase 3 trial, landing below even the 50 percent minimum level set by the World Health Organization and the FDA. “The results caught scientists by surprise,” The New York Times reported. CureVac is the company that President Donald Trump reportedly tried to lure to the U.S. early in the pandemic, and the one that Elon Musk said he would supply with automated “RNA microfactories” for vaccine production. In the end, none of this mattered. CureVac’s mRNA vaccine just doesn’t seem to be good enough. The “sobering” struggles of CureVac perfectly illustrate what epidemiologists call “survivor bias”—a tendency to look only at positive examples and draw sweeping conclusions on their basis. When the Pfizer and Moderna vaccines triumphed, The Washington Post suggested that a bet on “speedy but risky” mRNA technology had paid off with a paradigm-shifting breakthrough. Anthony Fauci called the gamble “a spectacular success.” Such analyses usually had less to say about the non-mRNA vaccines that had gotten into clinical trials just as quickly—and about the other mRNA vaccines that were hitting snags along the way. Now we’ve seen what happened to CureVac, and that some mRNA formulations clearly work much better than others. By one count, nine groups were testing mRNA COVID-19 vaccines in animal studies as of May 2020, and six were expected to be in clinical trials a few months later. By the end of the year, only BioNTech-Pfizer, Moderna, and CureVac had reached Phase 3 testing, compared with 13 non-mRNA vaccines. 
Of the nine mRNA-vaccine candidates that were already testing in animals in mid-2020, just two have proved efficacy at this point, while no fewer than nine vaccines based on more traditional technologies have reached the same mark. These other, non-mRNA vaccines have been widely used throughout the world—and some could still make an important difference in the U.S. Although the U.S. has plenty of doses of the Pfizer and Moderna vaccines available right now, demand for them has cratered. The Washington Post reports that in 10 states, fewer than 35 percent of adults have been vaccinated. An international study of COVID-19 vaccine misinformation, published in May, found that among the most common online rumors were those alleging particular dangers of mRNA technology—that it leads, for example, to the creation of “genetically modified human beings.” The CDC has also made a point of debunking the circulating falsehood that COVID-19 vaccines can change your DNA. For a time, it looked as though the Johnson & Johnson vaccine would help address this worry. It’s based on a fairly new technology, but not as new as mRNA. However, concerns about tainted doses made at a Baltimore factory and the emergence of a very rare but serious side effect have pretty much dashed that hope. The Johnson & Johnson single-dose vaccine has reportedly accounted for fewer than 4 percent of doses administered in the country. [Read: Microchipped vaccines, a 15-minute investigation] In this context, the success of the Novavax vaccine should be A1 news. The recent results confirm that it has roughly the same efficacy as the two authorized mRNA vaccines, with the added benefit of being based on an older, more familiar science. The protein-subunit approach used by Novavax was first implemented for the hepatitis B vaccine, which has been used in the U.S. since 1986. The pertussis vaccine, which is required for almost all children in U.S. public schools, is also made this way. 
Some of the people who have been wary of getting the mRNA vaccines may find Novavax more appealing. The Novavax vaccine also has a substantially lower rate of side effects than the authorized mRNA vaccines. Last week’s data showed that about 40 percent of people who receive Novavax report fatigue after the second dose, as compared with 65 percent for Moderna and more than 55 percent for Pfizer. Based on the results of Novavax’s first efficacy trial in the U.K., side effects (including but not limited to fatigue) aren’t just less frequent; they’re milder too. That’s a very big deal for people on hourly wages, who already bear a disproportionate risk of getting COVID-19, and who have been less likely to get vaccinated in part because of the risk of losing days of work to post-vaccine fever, pain, or malaise. Side effects are a big barrier for COVID-vaccine acceptance. The CDC reported on Monday that, according to a survey conducted in the spring, only about half of adults under the age of 40 have gotten the vaccine or definitely intend to do so, and that, among the rest, 56 percent say they are concerned about side effects. Lower rates of adverse events are likely to be an even bigger consideration for parents weighing vaccination for their children. Don’t get me wrong—the Pfizer and Moderna vaccines have been extraordinary lifesavers in this pandemic, and we may well be heading into a new golden age of vaccine development. (This week, BioNTech started injections in an early trial for an mRNA vaccine for melanoma.) But even the best experts at predicting which drugs are going to be important get things wrong quite a bit, overestimating some treatments and underestimating others. Pharmaceuticals are generally a gamble. But here’s what we know today, based on information that we have right now: Among several wonderful options, the more old-school vaccine from Novavax combines ease of manufacture with high efficacy and lower side effects. 
For the moment, it’s the best COVID-19 vaccine we have.

Midway through America’s first mass-immunization campaign against the coronavirus, experts are already girding themselves for the next. The speedy rollout of wildly effective shots in countries such as the United States, where more than half the population has received at least one dose of a COVID-19 vaccine, has shown remarkable progress—finally, slowly, steadily beating the coronavirus back. But as people inch toward something tantalizingly resembling pre-pandemic life, a cloud hangs over our transcendent summer of change: the specter of vaccine failure. We spent months building up shields against the virus, and we still don’t know how long we can expect that protection to last. To keep our bodies from slipping back toward our immunological square one, where the virus could pummel the population again, researchers are looking to vaccine boosters—another round of shots that will buoy our defenses. Around the world, scientists have already begun to dole out these jabs on an experimental basis, tinkering with their ingredients, packaging, and dosing in the hope that they’ll be ready long before they’re needed. When exactly that will be, however, is … well, complicated. Nearly all the experts I spoke with for this story said that the need for boosters is looking more and more likely, but no one knows for sure when they’ll arrive, what the best ones will look like, or how often they’ll be needed, assuming they’re part of our future at all. What underlies this uncertainty isn’t scientific ignorance: We know the signs that will portend an ebb in vaccine protection, and we’re actively looking for them. But their timing could still surprise us. The immunization process is much less akin to erecting an impenetrable fortress than it is to prepping forgetful students for an exam full of unpredictable questions. 
We can cram with flash cards for weeks, but to some degree we just have to cross our fingers and hope we’re still well studied when the pop quiz arrives. [Read: Expect the unexpected from the Delta variant] That same brand of bet-hedging is unfolding on a global scale. Around the world, researchers and vaccine manufacturers have been, for months, preparing for what seems to be an inevitable end to our immunological détente with the virus. But these experts are also playing a very hard and very necessary waiting game. The only way we’ll really know the best approach to boosters is to allow the vaccines to show their weak points, then patch them as soon as they arise. There are at least two major ways that COVID-19 vaccines could falter. The first might best be described as a memory lapse, and it’s a bit of a flub on the human side: Left to its own devices, the immune system slowly loses its intellectual grasp on the pathogen, and is much less prepared the next time it sees it. The second is a mismatch between what immune cells studied and what ended up on the final exam: a mutation in the coronavirus that alters its appearance so significantly that it becomes unrecognizable, even if immune memory of the vaccine remains intact. Designing and deploying boosters requires keeping tabs on these two fast-changing variables at once. Memory lapses can, in theory, be easier to detect and repair: Researchers take blood samples from vaccinated people and track the levels of different immune actors, such as antibodies and T cells. If those levels start to dip below a crucial protective threshold, it’s time to offer a booster. This approach works well in certain boosting regimens, such as the hepatitis B vaccine for health-care workers. But sussing out this so-called correlate of protection typically takes gobs and gobs of data. For many vaccines, even ones that have been in use for decades, such as the mumps vaccine, those numbers still aren’t clear-cut. 
SARS-CoV-2’s correlate remains elusive. [Read: Show your immune system some love] We do have, at least, hints about the longevity of vaccine protection. Antibodies that recognize SARS-CoV-2 are known to stick around in high numbers for at least six months after the first round of shots is administered. John Wherry, an immunologist at the University of Pennsylvania, told me that, based on the data he’s seen, he suspects that antibody levels will hold their own for at least a couple of years after vaccination, though antibodies represent just a sliver of the complex immune response to the coronavirus. There have also been encouragingly few breakthroughs, or infections in people who have been fully vaccinated. An unexpected uptick in these cases would serve as a “canary in the coal mine” for public-health experts, an indication that protection was ebbing, Sallie Permar, the chair of pediatrics at Weill Cornell Medicine and NewYork-Presbyterian Komansky Children’s Hospital, told me. (The chickenpox vaccine, originally conceived of as a one-and-done shot, became a two-doser in the U.S. in the 2000s to stamp out breakthroughs, including some potentially linked to waning antibody levels, in the years after kids got their first jab.) Virus mutations can be even tougher to pin down and predict than immunological memory lapses. No known variants have yet managed to fully flummox our current repertoire of vaccines, and none yet seems to be disproportionately causing breakthroughs. But certain versions of the virus do seem more resistant to vaccine-driven antibodies in the lab—a hint that the pathogen is becoming more and more unfamiliar to the immune cells that studied it. 
Some experts are worried that, if enough alterations occur, we may need another round of mass inoculations as early as this fall, possibly with an updated vaccine recipe that accommodates the virus’s shape-shifting form—a more labor-intensive approach than simply juicing people up with more of the OG inoculation. In a way, our vaccines’ stellar track record is an ironic hindrance to the process of improving them. Without more long-term data on their shortcomings, epidemiologists and vaccinologists are effectively trying to predict the weather in a climate they’ve only just discovered. No universal litmus test exists for making decisions about boosters—no single definition for what would constitute a “concerning” rise in cases, no flare that goes off when our immune cells are hit with microbial amnesia, no spoilers that warn of the coronavirus’s next metamorphosis. Instead, the experts are left to determine their own benchmarks for boosters, by evaluating the available information on antibody levels, breakthroughs, variant surveillance, and how different versions of the virus fare in labs and animal models, all while being mindful of the pandemic’s progress on scales both local and global. All of this intel then gets fed into a risk-benefit analysis, to determine whether the need for boosters outweighs any possible costs, which can span the medical to the economic, says Grace Lee, a pediatrician at Stanford University and a member of the CDC’s Advisory Committee on Immunization Practices. That’s all before public-health officials have to coordinate the logistics of getting another round of vaccines into people—a campaign that will inevitably reawaken the issues about trust, equity, and access still stymieing our current rollout. And even after boosters debut, agencies like the CDC might tinker with the playbooks for years or decades to get the scheduling just right. 
(The CDC did not answer questions about the nature of future boosting efforts, noting only that “the need for and timing of COVID-19 booster doses have not been established.”) Even amid all this uncertainty, the road to boosting won’t be a fumble in the dark. In the past year and a half, millions of SARS-CoV-2 genomes have been sequenced, helping researchers monitor the virus’s every genetic change; other scientists are monitoring the vaccinated, in the hope of catching or even predicting the inflection point, when our immune protection against the virus might start to drop. By the time our first round of shots starts to lose its oomph, contingency plans will have long ago been set in motion. Some companies and researchers have already started experimentally doling out additional jabs. Johnson & Johnson representatives told me that their single-dose vaccine is being tested as a two-doser, while Moderna and Pfizer have confirmed that they’re checking whether third shots, some of which have been specially reformulated to fight worrisome variants, can better equip immune systems to tussle with new versions of the virus. The National Institutes of Health recently announced a clinical trial that will offer a Moderna booster to participants who were vaccinated three to five months prior. And researchers at Johns Hopkins are exploring whether certain immunocompromised people—a group at higher risk of not responding to standard-issue vaccines—might benefit from a third injection. These individuals and others with less exuberant immune systems, such as older people, might need boosters sooner than the rest of us, says Ali Ellebedy, an immunologist at Washington University in St. Louis. [Read: COVID-19 vaccines are entering uncharted immune territory] Several boosting trials will take a mix-and-match approach, offering vaccines that differ in formulation from the first COVID-19 shot people took—a Moderna boost for people who initially got Pfizer, for instance. 
If so-called heterologous boosting is safe and effective, future rounds of shots will be much easier to give: People won’t have to scour their neighborhood for a company-specific vaccination clinic—or waste time struggling to remember which shot they got months or years ago. Hybrid inoculations could even improve on the original plan, potentially by marshaling different branches of the immune system, as they have with vaccines against HIV, Ebola, and tuberculosis. Delivered in succession, different types of COVID-19 shots could, in theory, build a punchier and more cohesive response because of their diverse packaging—and perhaps provide more comprehensive protection when it comes to variants, Srilatha Edupuganti, an infectious-disease physician and vaccinologist at the Emory Vaccine Center, one of the sites for the NIH trial, told me. New vaccine recipes, which haven’t yet been cleared, could also play a role in future vaccination efforts. Some researchers are looking outside the spike protein, to see whether they can build shots that contain more instructive bits of SARS-CoV-2 anatomy. A few are experimenting with delivering vaccines as oral drops or nasal sprays that might coax out an airway-specific immune response, to head off the coronavirus at its natural point of entry. This whole rigmarole will get easier if we eventually find SARS-CoV-2’s elusive correlate of protection, which will probably involve a specific kind of antibody: Instead of running long, expensive clinical trials to determine a vaccine’s efficacy, scientists can just check whether it marshals an immune response strong enough to match or exceed the threshold. “It’s what we dream about,” Permar told me. 
“Vaccines would be so much easier to develop and test.” There’s even talk of developing universal vaccines that could accommodate a wide range of potential variants, perhaps cutting down on the amount of mutant-specific tinkering we’ll need to do in the future, and the number of shots we’ll need to give. Boosting in perpetuity isn’t an ideal option, and it’s one we should avoid if we can. For some shots, the severity of side effects can ratchet up with each additional dose. (Some evidence exists that the mix-and-match approach might come with nastier side effects as well.) It’s also possible to vaccinate too often: At a certain point, cells will stop learning efficiently from the material vaccines provide, and essentially “burn out” from information overload, Wherry told me. Perhaps the heaviest immunization schedule we’ll end up with is one that’s already familiar: annual shots, like those we develop for the flu, each reformulated to tackle a slightly different set of strains. But many experts think that’s not terribly likely. Flu viruses mutate faster than coronaviruses do, and they hop between animals and humans much more frequently, giving them extra opportunities to change. The world is better served when we’re judicious with vaccines, after all, and inoculate as needed, no more, no less. A lot would feel wrong about lining people up for a second or third helping of a COVID-19 vaccine while billions around the world have yet to receive their first dose, Krutika Kuppalli, an infectious-disease physician at the Medical University of South Carolina, told me. Every unprotected person represents another potential depot for the virus to establish itself and mutate, and jump ahead of our vaccines once again. Getting more first shots into arms means slowing the virus’s spread, and limiting its costume changes. It means, perhaps, delaying our need for boosters a little while longer. 
In the spring of 2020, as a brand-new disease spread rapidly across the United States, millions of Americans arrived at the same conclusion: They wanted a beer. This was, to be fair, the same conclusion that many of us were coming to before the pandemic began, but the ways we could satisfy that thirst had changed dramatically. As beer spoiled in kegs inside idle bars and restaurants, Americans set out in search of six-packs. Liquor stores and grocery stores, which were both categorized as “essential businesses” and allowed to operate during even the tightest local lockdowns, saw their alcohol sales spike. Booze-delivery services such as Drizly more than tripled their sales. As with things like paper towels and flour, beer producers and distributors scrambled to divert their product into the right packaging and onto the right shelves. This swing has caused people to speculate that Americans might be drinking more overall, a theory that sounds plausible enough—life has been bad and also boring—but hasn’t really panned out, in the aggregate. Total alcohol consumption in the United States has been quite steady for years, including last year, says Lester Jones, the chief economist for the National Beer Wholesalers Association. What has changed, though, is virtually everything else about drinking. Swirling beneath the placid consumption rate was all the cultural and logistical chaos that has defined American life in the past 15 months: Supply chains broke down at the same moment that our lives changed in ways that had us scrounging around for sources of comfort. Now beer sales offer a glimpse of the lives we want for ourselves—and how disaster-borne limitations are still getting in the way. [Read: America has a drinking problem] For drinking to remain steady throughout the pandemic, Americans had to change both where they were looking for booze and which kinds of booze they sought. 
In 2019, a little more than half of America’s beer budget was spent “on premises,” in restaurants, bars, stadiums, and other places where you buy and drink in the same place. The rest was spent “off premises,” in grocery stores, liquor stores, gas stations—places that will call the cops if you start cracking Bud Lights while still inside. During the first wave of the pandemic, the bottom fell out of on-premises sales. Jones told me that for about four weeks last spring, keg sales in the United States actually went negative: More keg beer was spoiling at retailers than was being sold to them fresh. Cut off from bars and restaurants, people began buying up the beer in grocery and liquor stores, and brewers were forced to pivot as quickly as they could. Beer production “is not a speedboat; it’s like an aircraft carrier or cruise ship,” Jones said. “You start turning that rudder and three miles down the waterway, the boat starts to turn.” For many producers, planning the types and amounts of beer they’ll brew starts anywhere from six months to a year in advance, and that includes contracts for cans and glass bottles. Getting more of that packaging while virtually every beverage company in the country wants to send more product to grocery stores inundated with customers stuck at home has been all but impossible, contributing to an aluminum-can shortage that won’t abate any time soon. For much of the past 15 months, your first-choice beer might not have been consistently available at your local grocery, even if the brewer had plenty on hand. When buyers have to settle for their second or third choices, their tastes start to change. “People have tended to go toward the products they understand and know,” Joe Gold, a lead distributor at Chesapeake Beverage, in Baltimore, told me. 
“The experimental beers that were there, or that brewers were trying to come out with, they just got kind of pushed to the side.” Craft brewers finished 2020 with sales down 8 percent, while macrobrewers such as Anheuser-Busch and Molson Coors had a strong year. For the first time in his career, Gold said, he found expired Budweiser in the stockroom of a liquor store—not because people weren’t buying it, but because the cases had been misplaced behind a supply of craft brews that hadn’t been touched. [Read: Craft beer is the strangest, happiest economic story in America] In theory, a shift back toward pre-pandemic socializing now that more Americans are getting vaccinated could help reverse these trends, as people get to sample beers before buying a pint and ask questions of their bartenders. So far, though, Gold said that bars in his region aren’t keen to try new things while they deal with the ongoing financial fallout of pandemic restrictions. Bars that had 10 rotating taps to showcase new beers might be down to 5—or to none at all—because they can’t guarantee that they’ll sell through the unfamiliar beers before they go bad. Gold is still juggling demand with spotty supply: He might be able to get a bar only half its order of Bud Light one week, and the next week it’ll order more because its Coors distributor is tapped out too. At the same time, the demand for hard seltzer is expanding—it’s inexpensive, predictable, and, after a year of hanging out in backyards and parks instead of craft breweries or cocktail bars, a normal part of even more young adults’ drinking routines. Drinking patterns are now changing in other ways. NielsenIQ’s most recent sales data, for the last week of May, showed the country’s average bar and restaurant sales increasing almost 30 percent—not over 2020 levels, but over the same week in 2019. 
Part of that bump is because the pandemic required businesses to build out new delivery and takeout options that have remained popular, but anyone who’s tried to make a dinner reservation lately can tell you that it’s also because people have started to return to some of their old social habits. And once inside, they’re buying more. Matt Crompton, a director of client services for NielsenIQ’s alcohol-industry business, says that the average bar or restaurant is filling about 5 percent more orders than it did in 2019, but those orders are, on average, 24 percent more expensive. When appraising these trends, it’s important to account for the fact that many things about American life are intensely regional. Chris Larue, the president of Sunshine State Distributing, in Orlando, which distributes craft beer and spirits, says that although keg sales are still not totally back to normal, the disruption to his business, which also serves Tampa and Miami, hasn’t been as drastic as the swings that others described. “There are ways to much more easily socially distance here, and we have a lot of outdoor seating at bars and restaurants,” he told me. Florida also reopened much more quickly than many other states, giving its most eager residents a head start on returning to their pre-pandemic habits. With many office workers still staying home for all or part of the work week, sales—of beer and beyond—at businesses that serve city centers remain spotty. In neighborhoods where people live, things are a little brighter, and beer sales are more buoyant, especially for independently owned businesses that people are particularly excited to see bounce back, Larue said. That could portend a strong comeback for microbrewers in the months ahead, but it also poses a problem to beer and spirits purveyors. 
No one’s really sure if the work-from-home crowd wants to hit happy hour out in the suburbs the way people did when they were in the office, so everyone is forced to guess what (and how much) people might want to drink as their lives and choices once again change rapidly. [Read: Millennials are sick of drinking] Drink orders aren’t the only things changing inside bars. The hospitality sector has had a difficult time coaxing experienced servers and bartenders to return to the industry, where pay is often low, stability is rare even in the best of times, and working conditions during the pandemic have been extremely dangerous. That, too, will affect what these businesses can offer to drinkers who return, and what will sell well to them. Beer will again be a window into how Americans are thinking about their lives this summer—and not just for those buying it. Chesapeake Beverage’s Gold says that instead of going back to the esoteric craft-beer kegs and cocktail ingredients they were ordering before the pandemic, bars have shown growing interest in ready-to-serve products, which don’t require as much skill or time to prepare, and which help short-staffed businesses or those training new workers meet demand. That means that if you’re headed out to sit in a bar with friends for the first time in a long time, menus might look a little different. Get ready for canned cocktails, hard seltzers, bulk promotions served in buckets, and, yes, beer. But maybe not your first choice.

This much is clear: The coronavirus is becoming more transmissible. Ever since the virus emerged in China, it has been gaining mutations that help it spread more easily among humans. The Alpha variant, first detected in the United Kingdom last year, is 50 percent more transmissible than the original version, and now the Delta variant, first detected in India, is at least 40 percent more transmissible than Alpha. 
What’s less certain, however, is how the virus’s increased transmissibility will affect the pandemic in the United States. Alpha’s arrival prompted worries about a new surge in the spring, but one never came. The proportion of Alpha cases kept going up, but the total number of cases kept going down. People got vaccinated. Alpha became dominant in the U.S. Cases fell even further. The virus had become more biologically transmissible, but it wasn’t transmitting to more people. There was one notable and confusing exception: In April, Michigan experienced a spike in cases that experts believe was indeed fueled by Alpha. The fact that the variant had such different consequences for Michigan than it did for the rest of the country shows just how difficult it is to make predictions. Alpha has little effect on vaccines, but fears about the variants that slightly erode vaccine protection, Beta and Gamma, have also quieted; neither is causing significant case spikes among the vaccinated. “If there’s ever a time that we needed to be humble, it’s around this issue,” says Michael Osterholm, an infectious-disease epidemiologist at the University of Minnesota. Delta has gotten so much attention because it has the most troubling collection of traits yet: It is markedly more transmissible than Alpha, can sicken a large proportion of people who have had only one dose of a vaccine (though not those who have had two), and may even cause more severe disease. All of this is enough to be a warning, especially as Delta is now responsible for 10 percent of U.S. cases and rising. But as with Alpha, which was also suspected to be more severe, how the variant ends up behaving in the real world will depend on more than its biology. It will also depend on how we—the virus’s hosts—choose to behave, how many more people we vaccinate, and, to some extent, how lucky we get. 
[Read: Coronavirus variants have nowhere to hide] All of these factors are likely to have played a role in the Alpha-associated springtime spike in Michigan. According to cellphone mobility data from that period, people in the state had gone back to nearly pre-pandemic levels of movement, says Emily Martin, an epidemiologist at the University of Michigan. The Alpha variant also got to Michigan relatively early, and happened to find its way into groups of young people who were not yet eligible to be vaccinated. “It was sort of bad timing,” Martin told me. If Alpha had arrived a little later, or the vaccines a little earlier, then Michigan might have looked more like the rest of the country, where immunization was able to blunt Alpha’s impact. In the race between variants and vaccines elsewhere in the U.S., vaccines won. Two concepts about viral spread help explain why timing and chance make such a difference. First, the coronavirus spreads exponentially, which means that even a slight delay in mitigation efforts can lead to dramatically different outcomes. Second, the virus’s spread is what epidemiologists call “overdispersed,” which means that the majority of patients do not infect anyone else, but a small handful might infect dozens of people. In other words, most sparks of infection do not catch fire. But occasionally a single infection might cause an early super-spreader event, which ends up seeding a major outbreak. “Looking from state to state, it can be like, ‘Well, why is this state doing well versus that state?’ Sometimes it’s just luck,” says Adam Lauring, a virologist at the University of Michigan. In predicting how variants will behave, much of the world has looked to the U.K., where an excellent and comprehensive genomic-surveillance program has tracked the rise of Alpha and now Delta. Alpha made up 98 percent of all COVID-19 cases in the U.K. at that variant’s peak in March; Delta has since taken over, accounting for almost all new cases. 
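The role of luck in overdispersed spread is easy to see in a toy branching-process simulation. This is a minimal sketch, not any model used by the researchers quoted here; the reproduction number (1.2) and dispersion value (0.1) are illustrative assumptions, though small dispersion values in this range are commonly cited for SARS-CoV-2:

```python
import math
import random

def secondary_infections(r0, k, rng):
    """Number of people one case infects: a negative binomial with
    mean r0 and dispersion k (small k = a few cases infect many)."""
    # Gamma-Poisson mixture: draw this case's individual
    # infectiousness, then a Poisson count of secondary cases.
    rate = rng.gammavariate(k, r0 / k)
    n, p, floor = 0, 1.0, math.exp(-rate)  # Knuth's Poisson sampler
    while True:
        p *= rng.random()
        if p <= floor:
            return n
        n += 1

def chain_size(r0=1.2, k=0.1, cap=1000, rng=random):
    """Total cases seeded by one introduction, truncated at `cap`."""
    active, total = 1, 1
    while active and total < cap:
        new = sum(secondary_infections(r0, k, rng) for _ in range(active))
        active, total = new, total + new
    return total

random.seed(42)
sizes = [chain_size() for _ in range(2000)]
fizzled = sum(s < 10 for s in sizes) / len(sizes)
took_off = sum(s >= 1000 for s in sizes) / len(sizes)
print(f"fizzled (<10 cases): {fizzled:.0%}")
print(f"took off (1,000+ cases): {took_off:.0%}")
```

Under these assumptions, the vast majority of introductions die out on their own even though the average case infects more than one person, while a small minority hit the cap and keep growing: two places seeded with the same variant can end up with very different epidemics on chance alone.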
It’s too early to say whether the U.S. will follow the same trajectory. Alpha was responsible for anywhere from 38 to 86 percent of all new U.S. cases last month, depending on the state. Nathan Grubaugh, an epidemiologist at Yale, says this fact suggests the limits of comparing the two countries. “The U.S. is far more heterogeneous than the U.K.,” he told me, with more diversity in viruses and bigger geographic differences in vaccine uptake. When it comes to Delta, he said, “that means some places are going to be impacted harder.” And most likely, those places are going to be the ones where fewer people have been vaccinated. Experts agree that vaccines are the best way to stop Delta. Data from the U.K. suggest that one dose of the Pfizer vaccine offers only 34 percent protection against the variant, while two doses provide 88 percent. Large swaths of the U.S., however, are still struggling to get people to take any doses at all. A recent Washington Post analysis found more than 100 counties where less than 20 percent of the population has been vaccinated. “Whatever cracks that we have in our program for getting communities vaccinated, that’s what Delta is going to exploit,” Martin said.
The 89 people who work at Buffer, a company that makes social-media management tools, are used to having an unconventional employer. Everyone’s salary, including the CEO’s, is public. All employees work remotely; their only office closed down six years ago. And as a perk, Buffer pays for any books employees want to buy for themselves. So perhaps it is unsurprising that last year, when the pandemic obliterated countless workers’ work-life balance and mental health, Buffer responded in a way that few other companies did: It gave employees an extra day off each week, without reducing pay—an experiment that’s still running a year later. “It has been such a godsend,” Essence Muhammad, a customer-support agent at Buffer, told me. Miraculously—or predictably, if you ask proponents of the four-day workweek—the company seemed to be getting the same amount of work done in less time. It had scaled back on meetings and social events, and employees increased the pace of their day. Nicole Miller, who works in human resources at Buffer, also cited “the principle of work expanding to the time you give it”: When we have 40 hours of work a week, we find ways to work for 40 hours. Buffer might never go back to a five-day week. At a moment when the future of work is being decided—when businesses are questioning the value of physical office space and when lower-paid workers are agitating for better treatment as the economy reopens—what worked for this small, somewhat quirky tech company might be much less radical than the rest of the American workforce has been led to believe. People who work a four-day week generally report that they’re healthier, happier, and less crunched for time; their employers report that they’re more efficient and more focused. These companies’ success points to a tantalizing possibility: that the conventional approach to work and productivity is fundamentally misguided. 
“We live in a society in which overwork is treated as a badge of honor,” Alex Soojung-Kim Pang, an author and consultant who helps companies try out shorter workweeks, told me. “The idea that you can succeed as a company by working fewer hours sounds like you’re reading druidic runes or something.” But, he said, “we’ve had the productivity gains that make a four-day week possible. It’s just that they’re buried under the rubble of meetings that are too long and Slack threads that go on forever.” Regardless of any benefits to businesses, stripping away all of work’s extra scaffolding and paying people the same amount for fewer hours—whether they’re salaried or paid hourly—would genuinely nurture human flourishing. It would make caregiving, personal development, and the management of modern life easier for people across the economic spectrum. And it would reignite an essential but long-forgotten moral project: making American life less about work. Over the past couple of years, companies and governments around the world have become more open to the possibility that a four-day workweek could be better for businesses and the people who make them run. Before the pandemic, Microsoft Japan and the burger chain Shake Shack tried the schedule out with some employees, with positive results. The international conglomerate Unilever’s New Zealand offices are currently in the middle of a year-long four-day experiment, the results of which could inform the schedules of the company’s 155,000 employees worldwide. The governments of Spain and Scotland are planning trials that would subsidize employers that give workers an additional day off, and politicians in Japan and New Zealand have spoken favorably of the idea of a shorter workweek. Later this month, Jon Leland, an executive at Kickstarter, and Jon Steinman, who works in political advocacy, will launch, along with Pang, a nationwide campaign promoting the four-day workweek. 
Their plan is to spark interest among workers, and then use that interest to recruit companies to participate in a pilot program next year, which will be advised by academic researchers and which Leland and Steinman hope will generate a more robust body of data on four-day weeks. Still, four days’ work for five days’ pay is a rarity in the landscape of American business—Pang is aware of only a few dozen organizations in the U.S. with this arrangement. Many—though not all—of them match the profile of Buffer: They are relatively small, they do analytical, computer-based “knowledge” work, and they are still run by their founder, a setup that makes big changes easier to implement. But their experiences suggest that when done right, reducing workers’ hours doesn’t necessarily hurt profitability. [Read: Your professional decline is coming (much) sooner than you think] In 2018, Andrew Barnes approached the employees of his company, a New Zealand firm called Perpetual Guardian that manages wills, estates, and trusts, with an offer: If they could figure out how to get more done in a day, they could work one fewer day per week. In consultation with employees, the company installed lockers in which workers can voluntarily stash their phones for the day, and soundproofed meeting spaces to reduce the sound of ambient chatter. Meetings were shortened; employees started putting little flags in their pencil holders whenever they wanted to signal to coworkers that they didn’t want to be disturbed. It worked: Perpetual Guardian’s business didn’t suffer, and the four-day workweek is still in place three years later. When employees are given a good reason to work harder, they often focus more ruthlessly on their most important tasks. Barnes found that even though weekly working hours were cut by 20 percent, employees’ time spent on non-work websites fell by 35 percent. 
It also helped that employees had more time outside of work to manage the rest of their lives, so non-work responsibilities were less likely to intrude on the workday. “Because people have no time for home duties—trying to track down that plumber or sorting things out with the kids—all of that was eating into the day,” he told me. “So if I gave people more time outside of work to do those tasks, that would stop those things interfering in the business hours.” Natalie Nagele, the CEO of Wildbit, a small software company, introduced a four-day, 32-hour week in 2017, after reading about research indicating that the optimal amount of intense cognitive work is no more than four hours a day. (The four-day schedule even applies to Wildbit’s customer-support team; their days off are staggered so they can respond to inquiries all week.) “I have this dream that knowledge workers can get to a point where we can clearly define what enough means,” Nagele told me. “We don’t do a good job of saying, ‘This is done,’ or ‘I can put it away.’” She wonders if Wildbit’s next schedule could be four six-hour days. That may sound preposterous, but schedules like this intrigue productivity experts. Cal Newport, the author of Deep Work: Rules for Focused Success in a Distracted World, has written that the current version of office work, defined by long hours and “always-on electronic chatter,” seems poorly suited to cognitive labor. This mode of working has been around for only a decade or two, and we have found better ways to work before; it would be “arrogant and ahistoric,” he says, to assume that the current approach is best. This model doesn’t just work for computer programmers and other knowledge workers. In his book Shorter: Work Better, Smarter, and Less—Here’s How, Pang writes about a nursing home near Roanoke, Virginia, that was struggling to hire and retain nursing assistants, who do important but unglamorous, often low-paid work. 
To improve retention, the facility tried giving them 40 hours of pay for 30 hours of work, which necessitated hiring more nursing assistants to compensate for the reduced hours. That came at a price, but the change also yielded substantial savings on recruitment expenses and overtime pay, such that the overall cost worked out to only about $20,000 a year. Plus, call-bell response times, residents’ infection rates, and the number of falls and skin tears all declined. Last year, Diamondback Covers, a Pennsylvania-based manufacturer of metal truck-bed covers, shaved five hours off its factory team’s 40-hour week, but didn’t decrease pay, as it hired more workers to meet rising demand during the pandemic. The company expected that the 12.5 percent drop in working hours would lead to a rise of similar magnitude in the labor costs for each cover it made. But the cost increase was only 3 percent, due to increased efficiency. “It’s not by running a sweatshop … it’s more about working smarter,” Diamondback’s CEO, Ben Eltz, told me. During a 40-hour week, “very rarely does a person say, ‘I got my work done—now I’m going to go see how else I can help.’ It’s that teamwork idea of, everyone’s shooting for that common goal of ‘Let’s make this work.’” On top of that, with shorter days, the company is shedding its employees’ least-productive hours, when they’re worn out near the end of a shift. With the expected savings from reduced turnover and fewer safety incidents, Diamondback’s schedule change could end up saving the company money. Success stories like Diamondback’s—and Buffer’s, and Wildbit’s—point to a failure of imagination on the part of America’s bosses at a moment when they should be ready to reimagine corporate culture. 
Barnes thinks the same insight that is inspiring the post-pandemic spread of remote and hybrid work—that productivity is not a function of time spent in the office, under managers’ supervision—should also make business leaders more amenable to shorter workweeks. Pandemic aside, when he hears from people who doubt that a four-day week would work in their industry, “They’re saying nothing can be better than the way we work today,” he told me. “That’s a pretty closed-minded view.” [Read: Winners and losers of the work-from-home revolution] There is nothing sacred about a five-day, 40-hour workweek—which, in actuality, is more than 40 hours for about half of full-time U.S. workers—but it is certainly an improvement over what came before it. For most of the 19th century, the typical American worker was a male farmer who worked as many as 60 to 70 hours per week. The precipitous decline in working hours since then was made possible by productivity growth: The internal-combustion engine, electrification, and other advances meant that workers were able to get things done more quickly. The tempo of early factory work led to a push for a 10-hour day starting in the late 1820s; unions, which gained strength in the ensuing decades, fought for, as a popular slogan put it, “eight hours for work, eight hours for rest, eight hours for what we will” closer to the end of the century. The standard workweek in that era was still six days, and the shift to five occurred gradually, over the course of decades. According to Benjamin Hunnicutt, a historian at the University of Iowa and the author of Work Without End, the transition actually began in England, where in the 19th century it became normal for people to show up late to work, or skip it entirely, on Monday, basically because they would rather do other things. To discourage workers from “keeping Saint Monday,” as it was called, employers started agreeing to give them half of Saturday off. 
In the U.S., one of the earliest instances of a business implementing a five-day week was a mill in New England that in 1908 gave its Jewish workers a two-day weekend, to cover their Saturday sabbath. The practice caught on more widely in the following two decades, when unions pushed for it and business owners, applying the principles of “scientific management,” studied their production processes and concluded that a shorter week could increase productivity. In 1926, the Ford Motor Company adopted the five-day week, doubling the number of American workers with that schedule. Not all business leaders favored the change. “Any man demanding the forty hour week should be ashamed to claim citizenship in this great country,” the chairman of the board of the Philadelphia Gear Works wrote shortly after Ford rolled out its new hours. “The men of our country are becoming a race of softies and mollycoddles.” Less aggressive but just as resistant, the president of the National Association of Manufacturers, a trade group, wrote, “I am for everything that will make work happier but against everything that will further subordinate its importance.” It took a crisis to cement the five-day week as a standard. During the Great Depression, reducing hours was considered a way to spread the finite amount of work available among more people. The appetite for shorter schedules was so great that, in 1933, the U.S. Senate passed a bill that would have temporarily capped the workweek at 30 hours. President Franklin D. Roosevelt and his administration found it too extreme, however, and instead tried to provide economic relief to workers in the form of the New Deal—rather than limit work, they sought to create more of it. Five years after the 30-hour week fell apart, Roosevelt signed the Fair Labor Standards Act, which mandated higher pay beyond 40 hours in certain industries, effectively formalizing the five-day workweek. 
During this span of roughly 100 years, the notion that Americans could spend less and less time working didn’t elicit the same widespread sense of impossibility that it might today—it was in keeping with the common belief that expanding leisure time was a mark of moral progress. And for a time, it seemed that the workweek would continue to shrink. In 1930, the renowned British economist John Maynard Keynes made the prediction that in a hundred years, productivity growth would permit people to work as few as 15 hours per week. A quarter century later, Richard Nixon, as vice president, said he expected a four-day week soon enough. “These are not dreams or idle boasts,” he said. “They are simple projections of the [recent] gains we have made.” In the mid-1960s, a contributor to The New York Times Magazine surveyed the state of technological progress and concluded that it was “unlikely that the four-day week will be postponed indefinitely.” There isn’t one straightforward explanation for why it is still being postponed. One reason might be that working hours have fallen to the point that pushing them down further wouldn’t bring such a large payoff—it’s less vital to move from 40 hours to 30 hours than it was to move from 60 to 50. Another might be that, once salaried workers started receiving benefits such as pensions and health insurance through their jobs, hiring an additional worker became more expensive, so employers were incentivized to squeeze more hours out of their existing staff rather than bringing on someone else. And perhaps the workweek would have continued to shrink if unions’ influence hadn’t waned nationwide. A somewhat fuzzier explanation is that Americans’ fundamental aspirations changed. 
Hunnicutt argues that before the early 20th century, “work and wealth had a destination—that was a richer, fuller human life.” But after a cultural shift, he told me, “work was for more work … wealth was for more wealth, for ever and ever,” as a job became a religion-like source of meaning for many people. Hunnicutt also notes a blossoming of advertising and consumerism around this same time, which set people on a course of working more in order to buy more. [Read: Workism is making Americans miserable] Whatever the underlying causes, the standard American workweek is the same as it was when Roosevelt signed the Fair Labor Standards Act some 80 years ago, even as productivity has continued to shoot up. Some of those gains did get distributed to workers—Lichtenstein, the labor historian, told me that the working class’s buying power doubled between 1947 and 1973. But consider what happened to productivity growth after that. Starting in the mid-’70s, productivity continued to rise, but median pay stopped rising with it. Instead of going to the typical worker, much of the additional income flowed to highly paid workers—those with college degrees, particularly college grads in industries such as tech and finance. This is the familiar story of income inequality over the past half century. Less familiar is how this productivity growth could have translated to less time spent working. Today, the top 1 percent of earners bring in about 10 percentage points more of Americans’ total annual income than they did in 1980. Lawrence Katz, an economist at Harvard University, told me that if you could distribute that additional money among the bottom 90 percent of earners, their incomes would be roughly 20 percent higher than they are today. Alternatively, they could work 20 percent fewer hours, which happens to be the difference between a five-day week and a four-day week. 
Keynes was right: Productivity has grown enough to allow for expansive amounts of leisure—it’s just that, as a society, we’ve channeled these productivity gains toward other ends. Nowadays, working less is not front of mind. Because median wages are so low, many workers want higher pay or more hours, which means more money. “If the minimum wage had continued upward, linked to productivity, it would today be [close to] $25 an hour,” Lichtenstein told me. “If you were in a revolutionary moment, you could say, ‘Let’s double the wages.’” Indeed, about 50 percent of workers in a 2014 poll by HuffPost and YouGov said they would work one more day a week in exchange for 20 percent more pay. Part-time workers and those who made less than $40,000 a year were even more likely to make that trade. “If low-wage workers heard that their hours were going to be capped at 32, they would probably have a fit,” Rashawn Ray, a fellow at the Brookings Institution, told me. “They already don’t have enough money to make ends meet.” In an ideal world, the four-day workweek wouldn’t just mean lopping a day off salaried workers’ weeks—it would mean that hourly employees would work shifts that were 20 percent shorter for the same pay, and would have more predictable time off. If the four-day workweek spreads more widely, some people will, like others before them, argue that realizing this vision would diminish America’s economic vigor. But it is clearly possible for people to work less as the economy continues to grow—that has been the case for a great deal of the country’s history. In fact, the workweek that today’s business leaders defend as necessary is the one that yesterday’s business leaders argued was completely unreasonable. Some European countries stand as examples that hours could be lowered further without disastrous consequences. In 1975, Germans and Americans averaged the same number of annual working hours. 
More than 45 years later, Germany’s GDP per capita is on par with many other wealthy countries, yet Germans work roughly 400 fewer hours per year than Americans. (That works out to nearly eight hours per week, or one standard workday, though Germans also get more holidays and paid vacation.) Many proponents of the four-day week make a business case for it—that re-envisioning the workweek, and the tasks that fill it, can unclog the pipes of American efficiency. But pushing for shorter working hours should mean imagining a higher end than productivity, and picking up the dormant American project of moving work away from the center of life. If some productivity gains are incidental to that effort, fantastic. But the real case for the four-day workweek is not that it would benefit businesses. It’s that it would benefit people. When workers rhapsodize about the benefits of the four-day week, their statements can sound suspiciously like testimonials from an infomercial. Essence Muhammad, the customer-support agent at Buffer, said that having an extra day off each week allowed her to increase her course load in a bachelor’s-to-master’s program. She’s now on a “fast-track path” that will have her finishing the program possibly a year earlier than if she still worked five days a week. Since Monique Caraballo, a 37-year-old who works at a nonprofit in Ithaca, New York, started a four-day week last year, she has been able to pour herself into volunteering with another nonprofit and moderating online communities, such as a local mutual-aid effort and a networking group for women in marketing. She also picked up hula-hooping this past year, and she told me that none of these things would have been compatible with the unpredictable and inflexible hours at her previous job, at a hotel. “I used to try to do 10 minutes of yoga and couldn’t figure out how to fit that into my full and immovable schedule,” she said. 
Several people I spoke with said that transitioning to a four-day week cured them of the “Sunday scaries,” a somewhat silly term for the very real dread that many workers feel as the weekend comes to a close. [Read: How much leisure time do the happiest people have?] In 2020, I got my own taste of a diluted four-day workweek, when The Atlantic gave us a handful of Fridays off during the pandemic. In a punishing year, the additional time felt like being thrown a flotation device to cling to as I bobbed from week to week. If it helped me stay afloat during the worst crisis of my lifetime, I can only imagine where it might carry me in normal times. This is the best argument for the four-day week: For workers, it rocks. Anecdotally, it allows people to be less stressed, less strapped for time, more physically and mentally healthy, and more, as Hunnicutt, the historian, put it, “fully human.” It cannot, on its own, give everyone enough time and money, or fix miserable jobs. But it leads to a substantial improvement in quality of life. “One of the biggest factors in people’s level of work-family satisfaction is the pure number of work hours they have,” Melissa Milkie, a sociologist at the University of Toronto who studies time use, told me. “So cutting it is huge … It would re-balance things for working families.” Having an extra day off changes the texture of the weekend. “Before, Saturday felt like my recovery day, and then Sunday, we would try to jam two or three days of a weekend into one day, and I was exhausted on Monday,” said Nicole Miller, of Buffer. A shorter week “gives the rest of your life a little bit more of a chance.” For many people I spoke with, the extra day off became a “quiet day” to reflect and rest. “I like to take walks … just wander and let my brain breathe,” Natalie Nagele, the Wildbit CEO, told me. 
Others use the additional time to get ahead on laundry, grocery shopping, and other chores and errands, so their Saturday and Sunday can be more restful. Having more weekend time also means having more time to spend with people you care about: Vivienne Pearson, a 51-year-old in eastern Australia, said that another free day used to make it easier to visit her grandmother in a retirement home, when she was still alive. “When you talk to people about how they spend that extra day, they don’t spend it getting drunk,” Pang, the consultant, told me. “They spend it with their families, they spend it going to the doctor, taking up hobbies—incredibly vanilla, wholesome things.” Additional non-work time would also make being a working parent much less draining, particularly for working mothers and single parents. Before the pandemic, more than half of American parents who work full-time said they didn’t have enough time with their kids. “Sometimes I’m like, Man, I really haven’t seen my son in such a long time—like, you see him, but you’re just busy,” Brian Kerr, a customer-support agent at Wildbit and the father of a 2-year-old, told me. “Fridays are my day to slow down and just hang out with him.” Pang told me he sees more people start families at companies with a four-day week, because balancing work and parenting becomes easier. And just as the four-day week changes time outside of work, it changes work itself, too. In my conversations with more than a dozen people who work four days a week, some did note that the schedule could be more intense, but no one said workers at their company were secretly scrambling, on weeknights or over three-day weekends, to get everything done in four days. Instead, they talked about coming back to work better rested and more motivated at the beginning of each week. Of course, working fewer hours at an unfulfilling job doesn’t change its basic nature. According to Gallup, only 36 percent of workers in the U.S. 
“work with passion and feel a profound connection to their company.” A shorter schedule would not in and of itself give workers the sense of independence, purpose, and camaraderie that researchers have identified as traits of satisfying work. Still, working fewer hours at a job you loathe is better than working more. When Leland and Steinman’s four-day-workweek campaign surveyed about 1,000 American workers this spring, the responses were overwhelmingly positive: Only 4 percent of those polled felt negatively about a national push to move to a shorter week. The top argument against it was not about practicality—only one-fifth of all respondents said they wouldn’t be able to finish their work in that time. Instead, the most common concern was that a four-day week “won’t help some kinds of workers.” Indeed, at the moment, the shorter workweek seems unreachable for the people who need it most—low-wage shift workers, working single parents, hourly workers. Instead, it appears to be most attainable for a group of disproportionately white, highly paid, well-educated workers upon whom the labor market already showers enviable work perks. If a four-day week gains popularity, there is a real risk that it would widen existing inequalities. Juliet Schor, a sociologist at Boston College and the author of The Overworked American: The Unexpected Decline of Leisure, sketches out a more equitable path. “This is the way a lot of these advances in labor will come. Maybe the small firms [have it first], but then you also get the big, wealthier firms on board,” Schor told me. “Gig workers, hourly workers, lower-paid workers—one would hope that if this really started to take hold, then you get legislation that rolls it out for everybody.” There’s a question that comes up regularly in discussions of the four-day workweek. Proponents ask it enthusiastically, skeptics sarcastically: Is it possible to go even shorter? Why stop at four? 
When I spoke with workers who had a four-day week, I asked them how many days a week they would prefer to work if money were no object. It was an unscientific poll, but everyone said three or four. A survey by Kronos, a company that makes workforce-management software, yielded a similar finding: Four days a week was the most common answer. It’s hard to tell, though, whether people would feel differently if the five-day week weren’t already standard. Lonnie Golden, an economist at Penn State Abington, pointed out that in international surveys, some of the strongest preferences for reducing working hours are in European countries where weekly hours are already relatively low. There might be “a feedback loop,” he speculated. “They start working out, they start socializing—the things that make people happier. A lot of them take second jobs. It might not pay, but they find other pursuits in their non-work time, and they don’t want to go back.” From the perspective of human welfare, people don’t need to do much paid work in order to experience the benefits associated with it. Reviewing data from the United Kingdom, the authors of a 2019 study suggested that “working 8 [hours] a week is sufficient to gain the wellbeing benefits of employment”—that is, whether someone worked the equivalent of one day or five, they were just as eligible to receive the happiness bumps that work can bring. So if and when society comes around to working four days a week, let’s start talking about three.
The worst things can happen on the most beautiful days. My family’s worst day was a perfect one in the summer of 2019. We picked my daughter up from camp and talked about where to go for lunch: the diner or the burger place. I don’t remember which we chose.
What I do remember: being woken up, again and again, by doctors who insist on asking me the same questions—my name, where I am, what month it is—and telling me the same story, a story that I am sure is wrong. “You were in a car accident,” they say. But this cannot be. We’re having lunch and then going on a hike. I had promised the think tank where I work that I’d call in to a 4 p.m. meeting. “You are in Dartmouth-Hitchcock Hospital in New Hampshire.” Another ludicrous statement. I started the day in Vermont. Surely if I had crossed the river to New Hampshire I would know it. “What’s your name?” they ask me, and I tell them and tell them and tell them. “Where are you?” “New Hampshire,” I say, except for one time when I say “Vermont.” “New Hampshire,” they correct, and I want to say, “Really, we are so close to the border here, can’t you just give it to me this once?” “You were in a car accident,” they tell me again. “Your husband broke his leg and your son broke his collarbone.” These do not seem like horrible injuries, so I am waiting for the worse news, the news that my daughter is dead. She is the youngest and the smallest. She was born with albinism, and her existence has always felt improbable, and so now it must be over. But—thank God—it’s not. “Your daughter has fractures in her spine and damage to her lower intestine from the seat belt.” They tell me that my lower intestine was also injured, and that I’ve had surgery. I lift up my hospital gown and am surprised to see an angry red line and industrial-size staples. I remember an article I’d read about seat belts not being designed for women, and I ask the doctor if he sees more women with these injuries than men. I have yet to take in the reality of what has happened to me, to my family. Instead I am thinking about writing an exposé about the sexist seat-belt industry. They wake me up and ask me where I am and what my name is. A doctor asks me who the president is. “I don’t want to say,” I reply. He smiles. 
I am at Dartmouth for three days before I am transferred to the University of Vermont, where my husband and children are. The days pass like minutes, a loop of sleep interrupted by people asking me questions and telling me terrible things. [Joshua Sharpe: We should all be more afraid of driving] One of the things I am told is that I have a brain bleed and a traumatic brain injury. I wonder if this is why I am slurring my words, but am told that the slurring is from the anti-seizure medication I am on. This sounds good. The slurring will stop. A doctor tells me I “got my bell rung.” This is a bad analogy. Bell clappers are meant to slam against the side of the bell. The brain is not meant to slam against the side of the skull. Of all the injuries my family is suffering from, mine is the worst. This is my totally biased opinion. My husband’s leg will take almost a year to heal. My daughter would have died if not for the surgery to repair her flayed abdomen. She is 10, and one of her friends tells her that because of the scar she will never be able to wear a bikini. She spends many days attempting to suss out whether she cares. She doesn’t yet know if she is the bikini-wearing type. My 13-year-old son is the only one who remembers the accident. He remembers a woman in a ponytail calling 911, the smell of gasoline and burnt metal. He remembers his father yelling “Jesus Christ.” He will have to live with the memory of his sister looking at my body and asking, “Is Mama dead?” These are terrible injuries, and yet, the other members of my family don’t walk around thinking, Am I still me? My brain injury has shaken my confidence in my own personality, my own existence. This is the worst injury. When we leave the hospital and move into a hotel, I frequently get lost in the hallway. The first time I roll into occupational therapy with my walker, I am grateful for the obvious signage pointing me toward the check-in desk.
It’s almost as though the clinic is expecting people with brain damage. My therapist is a smiling, 40-something woman with dirty-blond hair. She reminds me of me before the accident. She asks if I am having any thinking problems or memory problems. I tell her about an incident with Parmesan cheese. “Can you get the Parmesan?” my husband asked. I opened the fridge and looked. I looked and looked. “I can’t find it,” I said with a shrug. My son opened the fridge and pulled out a block of Parmesan. It hadn’t occurred to me that this was a brain issue. Sometimes you just can’t find the Parmesan. Right? A test confirms that I have trouble scanning a visual field for objects. My brain is struggling to recognize what I see, but without a pre-accident baseline to judge from, there is no way to know how much worse I am at it now. Have I always been bad at finding things? Maybe? There are limits to how well an injured brain can scrutinize an injured brain. I have other visual-processing issues. At first I can’t watch television because my brain is unable to merge the images from my two eyes, so I see doubles of everything—two Phoebes, two Chandlers. I can watch with one eye closed, but I’m distracted, seething at my brain for failing to do such a simple task. In one session, the therapist tells me we are going to play a game. She pulls out a deck of cards and asks me to turn cards over while saying the number or the color or the suit. The game is so difficult, I want to physically remove my brain from my skull and hurl it against a wall. I will never play this game again as long as I live. Eventually I graduate from occupational therapy. But occupational therapy isn’t about getting people back on their feet so they can return to think tanks. It is about making sure they can run errands without getting lost. I am someone who has always taken pride in my intelligence, and now I am not so smart. I am just a functional human being, according to occupational therapy. 
When we go out in public as a family, we are a walking nightmare. “Wow,” a stranger says, marveling at the device that is bolted into my husband’s femur. And then my son appears with his arm in a sling, my daughter limps over in her back brace. An injured couple is potentially funny. There is nothing funny about an injured family. “What happened to you guys?” When we tell the story, we explain that we were in no way at fault, which feels important. We wore our seat belts and drove the speed limit and the weather wasn’t bad and yet this happened to us. Someone was driving a pickup truck in the opposite direction. He was late to a job interview or to get his kid, or maybe he was just antsy. In front of him was a motorcycle slowing him down. Maybe he’d been behind that motorcycle for miles. Maybe he liked to take risks. He pulled into our lane and passed the motorcycle while going up a hill at 70 miles per hour. I don’t know who makes this kind of decision. Did he think, I can’t believe I did something this stupid? Did he also yell “Jesus Christ”? Because we are not at fault, accident feels like the wrong word. Not just wrong, but unfair. My husband starts calling it the incident, but an incident is a small thing, not something that scars you for life. The smashing? The destruction? Newbury, after the town where it occurred? The only thing that comes close is the devastation. The devastated me is different. My brain used to race, making lists and plans, skipping from an article I was researching to whether my kids were in appropriate after-school programs to what vacation we should take in February. Now it does none of that. There are no plans to make. A few days after regaining consciousness, I check my Twitter feed. I have always been a news junkie. But I have missed nothing. The news seems to be not just familiar but actually repeating itself. Something bonkers happened in the White House. People are dying in a country I’ve never been to. 
A company did something possibly illegal. There was a house fire in the Bronx. Are these the things I used to care about? The most interesting piece of news is the one I am experiencing. In the hospital we are waiting to make sure my daughter can poop through her reconstructed colon. This article isn’t in The New York Times. When we return to New York I take the subway to doctor appointments. I don’t take out my phone, I just sit. My brain is quiet, which I find suspicious, but also soothing. Before the accident I went to yoga retreats and tried meditation. I said things like “I just need to unplug.” Apparently what I needed was to get hit by a truck. Perhaps I have discovered the secret to a peaceful mind, and it is traumatic brain injury. I fantasize about opening an expensive spa where busy people pay me money to whack them on the head with a baseball bat. The day of the accident I had been working on a project to improve how homeless people are placed into shelters. I say out loud, “I don’t care about homeless people” to see how it feels. It doesn’t ring true; I do care about homeless people. I just don’t feel like working. I have always been a regular exerciser. Now I can’t imagine wanting to do a burpee, let alone 10 of them. I always ate healthy things. But did you know that you can eat whole grains and still get hit by a truck? I have strange cravings. I think about apple cider all the time. Apple cider is not a normal part of my diet. I have a very detailed dream about eating chocolate cake. I eat the cake. That’s the entire dream. I find myself foraging in the fridge for flavors that don’t exist. I don’t know which symptoms are permanent and which are temporary. At first, the doctors say that after a year or two I’m likely to have a full return to my normal brain function. Or not. They don’t really know about the brain. It might be more like 95 percent. If I broke my elbow and someone told me I’d get 95 percent of my elbow function back, I’d be satisfied. 
But 95 percent of my brain function sounds terrifying. Which pieces will be missing? Some days I feel like myself. Other days all I can think about is the old life that is gone. Then, halfway through my recuperation, the coronavirus comes. The stores close, the schools close, the traffic on the avenue dwindles to a sporadic whoosh. And my busy friends who were always texting me about their crazy schedules are suddenly as quiet as I am. Together we wait for normal to return. The difference is that they know what normal looks like. In July it will be two years since the accident. The world is now coming back to life, my days slowly filling up with work and chores and exercise. Soon I will go back to in-person meetings and travel, and I wonder: Will I be up to the challenge? Or will I get lost in office buildings and airports? For now, in this liminal space between the old life and the new one, I often catch myself staring at my children. They have never been more beautiful. I chalk this up to the magic of braces––their teeth are finally coming into alignment––but I know this is ridiculous. They are beautiful because they are alive. I look at them, and I sit with the silence. Today, it is mine. Tomorrow, it may not be.