Cabbages with jet lag are less nutritious and more vulnerable to insect pests. Fruits and vegetables have an internal clock that can be reset by a daily cycle of light and dark, but storing produce in darkened refrigerators could disrupt this natural rhythm, researchers report June 20 in Current Biology.
Plants, even after being cropped from the stalk, are much more responsive to their external environment than we give them credit for, says Janet Braam, a plant biologist at Rice University. “When we harvest them they’re still metabolizing,” she says. “They’re still alive.” Braam normally studies circadian rhythms in plants that are growing, but an offhand comment by her son inspired her to turn to the grocery store for new research subjects.
She and her colleagues had previously found that the plant Arabidopsis thaliana schedules production of insect-repelling chemical defenses to match caterpillar feeding peaks. These defenses include compounds called glucosinolates, which are thought to have anticancer and antimicrobial properties in addition to their caterpillar-discouraging ones.
When Braam told her son about these experiments, he joked that now he knew the best time to eat his vegetables. She realized that cabbages — which also produce glucosinolates — might have similar daily cycles even after being picked, packed and shipped.
“So we went to the grocery store, bought some cabbage and put them under dark/light cycles that were either in phase or out of phase with our insects, and then asked whether the insects could tell the difference,” says Braam.
Like Arabidopsis, the cabbage leaves had daily glucosinolate cycles if the vegetables were exposed to alternating 12-hour periods of light and dark. Caterpillars on a cycle offset by 12 hours from the cabbages’ (so the cabbages’ dawn was the caterpillars’ dusk) ate about 20 times more than did caterpillars on a schedule synchronized to their food. Caterpillars also ate twice as much cabbage if the vegetable had been kept either in constant light or constant darkness.
It’s not just cabbages that adjust their daily rhythms to better fend off caterpillars; the team found similar results for spinach, zucchini, sweet potatoes, carrots and blueberries. These fruits and vegetables don’t produce glucosinolates, so they must make some other kind of defense on a daily cycle, says Braam.
The researchers suggest that we might improve the health benefits and pest resistance of fruits and vegetables by storing them under lighting conditions that mimic day and night. But Cathie Martin, a plant biologist at the John Innes Centre in England, is skeptical. She says most postharvest vegetable losses come from fungal infections, not the insects that eat vegetables in the field. And cabbages, which are sometimes cold-stored for months in the dark before being sold, lose the clock-regulated pest resistance about a week after harvesting, the new study shows.
“But maybe I’ll be proven completely wrong,” says Martin. “Maybe one day we’ll all have little LEDs in the fridge.”
A new 3-D map of the brain is the best thing since sliced cold cuts, at least to some neuroscientists. “It’s a remarkable tour-de-force to reconstruct an entire human brain with such accuracy,” says David Van Essen, a neuroscientist at Washington University in St. Louis.
Using a high-tech deli slicer and about 100,000 computer processors, researchers shaved a human brain into thousands of thin slivers and then digitally glued them together. The result is the most detailed brain atlas ever published. Dubbed BigBrain, the digital model has a resolution 50 times greater in each of the three spatial dimensions than currently available maps, researchers report in the June 21 Science. The difference is like zooming from a satellite view of a city down to the street level, says coauthor Alan Evans, a neuroimaging scientist at McGill University in Montreal.
BigBrain allows researchers to navigate the landscape of the human cortex, the rugged outer layer of the brain. And unlike previous maps, the tool also lets scientists burrow beneath the surface, tunnel through the brain’s hemispheres and step slice-by-slice through high-res structural data.
Around 100 years ago, neuroscientists relied on thick slabs of brain tissue to crudely chart out neural regions. More recently, imaging tools such as MRI have let researchers take a more detailed look. But even the very best MRI maps are still a little fuzzy, says Hanchuan Peng, a computational biologist at the Allen Institute for Brain Science in Seattle.
In 2010, a team of Chinese researchers constructed a digital map of the mouse brain using techniques similar to the ones that produced BigBrain. But until now, no one had done it in humans. Because the human brain is thousands of times bigger than the mouse brain, Evans and colleagues had to massively scale up slicing and computing methods. First, Katrin Amunts and colleagues at the Jülich Research Center in Germany carved the donated brain of a 65-year-old woman into 7,404 ultrathin sheets, each about the thickness of plastic wrap.
Next, researchers stained the sheets to boost contrast, took pictures of each sheet with a flatbed scanner, and then harnessed the processing power from seven supercomputing facilities across Canada to digitally stitch together the images. In all, the researchers analyzed about one terabyte, or 1,000 gigabytes, of image data. That’s about the same amount of data as 250,000 MP3 songs.
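That comparison checks out with simple arithmetic (a rough sketch; the roughly 4-megabyte MP3 file size is our assumption, not a figure from the study):

```python
# Back-of-the-envelope check on the data-volume comparison.
# Assumption: a typical MP3 is roughly 4 megabytes; the article gives only the totals.
total_bytes = 1_000 * 10**9      # about one terabyte of image data
mp3_bytes = 4 * 10**6            # ~4 MB per song (assumed)
print(total_bytes / mp3_bytes)   # 250000.0 -- matches the "250,000 MP3 songs" comparison
```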
“Your laptop would choke if it tried to run a typical image-processing program to look at this dataset,” Evans says.
His team designed a software program that lets researchers dig into BigBrain’s data. Users will be able to pick up the brain, rotate it in any direction and cut through any plane they want. “It’s like a video game,” he says.
Evans hopes BigBrain will provide a digital scaffold for other researchers to layer on different kinds of brain data. Scientists could stack on information about chemical concentrations or electrophysical signals, just as climate and traffic data can be layered onto a geographical map.
The 3-D map could also help researchers interpret data from lower-resolution brain-scanning techniques such as MRI and PET, study coauthor Karl Zilles of the Jülich Research Center said during a press briefing June 19. Overlaying images from these scans onto BigBrain might give neuroimagers a better idea of where exactly damaged tissue lies in diseased brains.
And neurosurgeons might use BigBrain to guide placement of electrodes during deep-brain stimulation for Alzheimer’s or Parkinson’s diseases, he said.
Though all human brains have largely similar architecture, Evans says, every person has subtle shape variations. As a result, he’d like to make maps of more brains for comparison.
Now that the teams have ironed out BigBrain’s technical kinks, the researchers think they can compile a map of a second brain in about a year. “The computational tools are all largely in place now,” Evans says.
Men who don’t produce sperm face nearly three times the risk of cancer compared with the male population average, researchers report June 20 in Fertility and Sterility.
About 4 million men in the United States are infertile, for a host of reasons. Of them, about 600,000 don’t deliver sperm from the testes to the semen at all.
Michael Eisenberg of Stanford University and his colleagues studied the medical records of Texas men who had visited a male health clinic between 1989 and 2009. The men averaged 36 years old when they were examined. Of 2,238 men found to be infertile, about one-fifth didn’t have any sperm in their semen.
Over an average follow-up time of 6.7 years, 10 of the 451 men who didn’t make sperm developed some kind of cancer, making them 2.9 times as likely as similar-aged men in the general Texas population to be diagnosed with cancer. The reason is unclear, the authors say. Men who were infertile for other reasons didn’t face an increased risk of cancer.
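The arithmetic behind that comparison looks roughly like this (a sketch under the assumption that the ratio compares observed cases with those expected from population rates; the study’s exact statistical method isn’t spelled out here):

```python
# Rough arithmetic behind the reported 2.9-fold risk (illustrative; the study's
# exact method isn't described in this article).
observed_cases = 10                   # cancers among the 451 men who made no sperm
risk_ratio = 2.9                      # reported, relative to similar-aged Texas men
expected_cases = observed_cases / risk_ratio
print(round(expected_cases, 1))       # ~3.4 cases expected at general-population rates

crude_rate = observed_cases / 451 / 6.7
print(round(crude_rate * 1000, 1))    # ~3.3 cancers per 1,000 person-years in this group
```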
An exotic subatomic particle could be the first amalgamation of more than three quarks — fundamental building blocks of atoms — to be produced experimentally. If it is what physicists think it is, the particle could provide clues about the force that holds nuclei together and perhaps about the earliest moments of the universe.
“We have very solid evidence of an unconventional particle,” says Ronald Poling, a physicist at the University of Minnesota in Minneapolis. “But it’s the interpretation — the possibility that it has four quarks — that makes it very exciting.” The details of the particle, inelegantly named Zc(3900), appear June 17 in Physical Review Letters.
Physicists have known since the 1960s that protons and neutrons are made up of quarks, as are hundreds of other particles. All of these particles can be divided into two categories: mesons, which contain two quarks, and baryons (including protons and neutrons), which contain three.
Over the last decade many physicists, including those at the Belle experiment in Japan and the BESIII experiment in China, have fruitlessly searched for particles with more than three quarks. Probing a particle’s insides is tough because physicists can’t see quarks directly. Instead they have to measure all the properties they can for a given particle, such as its mass, charge and decay products, looking for unusual characteristics that can be explained only by a peculiar combination of quarks.
The Belle and BESIII teams were both studying an odd particle called Y(4260) when they realized that it decayed to make another interesting particle, Zc(3900). Its mass, says Poling, who is part of the BESIII team, suggests that it is an electrically neutral meson made up of two quarks with opposite charges, called charm and anticharm. But surprisingly, both teams found that Zc(3900) has an electrical charge.
In fact, Poling says no two-quark or three-quark combinations can explain Zc(3900)’s charge and mass. That is leading physicists to the more exotic and exciting conclusion that the particle consists of four quarks: a charm and an anticharm along with an up and an antidown, which are extremely light and create a net positive charge. “The particle’s charge makes it a smoking gun for a four-quark state,” says Tomasz Skwarnicki, a physicist at Syracuse University in New York.
Assuming the evidence for a four-quark arrangement holds up, the big question will be how those quarks are arranged. Zc(3900) could be a single entity of four quarks, Skwarnicki says, but it could also be a coupling of two mesons, analogous to two atoms linking up to form a molecule.
Poling says that understanding the particle’s internal structure could improve physicists’ understanding of the strong nuclear force, which dictates how quarks bond together to create protons, neutrons and other composite forms of matter.
In addition, physicists believe that just after the Big Bang, matter existed in the form of a hot soup of individual quarks and gluons, particles that carry the strong force. Perhaps, as the universe cooled, that soup solidified into exotic multiquark combinations such as Zc(3900) before breaking up into the particles observed today. “The more complete our picture of all the elementary particles and their interactions,” Poling says, “the better we’ll understand where we started out and how we got to where we are.”
When a deer carcass appeared a few meters from a camera trap without obvious predator prints, scientists were a bit puzzled.
The mystery was solved only when the team reviewed 2-week-old footage from their camera and saw images of an adult golden eagle tearing into the back of a young sika deer. It was a rare sight, the first such attack reported in the far eastern regions of Russia, the scientists suggest.
They published the images and a paper on the predator-prey interaction in the September Journal of Raptor Research.
A new instrument onboard the NASA–NOAA Suomi satellite has been capturing exquisitely detailed views of seasonal and environmental shifts in plant cover. Light sensors on the satellite identify vegetation by detecting differences in reflected amounts of visible light, which plants absorb for photosynthesis, and near-infrared light, which plants don’t absorb. Subtle changes in greenness can give advance warning of drought or fire conditions. Meteorologists can also use data on vegetation dynamics to improve weather prediction.
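The article doesn’t name the specific index, but the standard way to quantify that visible-versus-near-infrared contrast is a normalized difference vegetation index (NDVI); here is a minimal sketch, with made-up reflectance values:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized difference vegetation index from near-infrared and red reflectance."""
    return (nir - red) / (nir + red)

# Healthy plants absorb red light for photosynthesis but reflect near-infrared strongly,
# so lush vegetation scores high. (These reflectance values are invented for illustration.)
print(round(ndvi(nir=0.50, red=0.08), 2))   # 0.72 -- dense, green cover
print(round(ndvi(nir=0.30, red=0.20), 2))   # 0.20 -- sparse or stressed vegetation
```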
The internet is rife with advice for keeping the brain sharp as we age, and much of it is focused on the foods we eat. Headlines promise that oatmeal will fight off dementia. Blueberries improve memory. Coffee can slash your risk of Alzheimer’s disease. Take fish oil. Eat more fiber. Drink red wine. Forgo alcohol. Snack on nuts. Don’t skip breakfast. But definitely don’t eat bacon.
One recent diet study got media attention, with a headline claiming, “Many people may be eating their way to dementia.” The study, published last December in Neurology, found that people who ate a diet rich in anti-inflammatory foods like fruits, vegetables, beans and tea or coffee had a lower risk of dementia than those who ate foods that boost inflammation, such as sugar, processed foods, unhealthy fats and red meat. But the study, like most research on diet and dementia, couldn’t prove a causal link. And that’s not good enough to make recommendations that people should follow. Why has it proved such a challenge to pin down whether the foods we eat can help stave off dementia?
First, dementia, like most chronic diseases, is the result of a complex interplay of genes, lifestyle and environment that researchers don’t fully understand. Diet is just one factor. Second, nutrition research is messy. People struggle to recall the foods they’ve eaten, their diets change over time, and modifying what people eat — even as part of a research study — is exceptionally difficult.
For decades, researchers devoted little effort to trying to prevent or delay Alzheimer’s disease and other types of dementia because they thought there was no way to change the trajectory of these diseases. Dementia seemed to be the result of aging and an unlucky roll of the genetic dice.
While scientists have identified genetic variants that boost risk for dementia, researchers now know that people can cut their risk by adopting a healthier lifestyle: avoiding smoking, keeping weight and blood sugar in check, exercising, managing blood pressure and avoiding too much alcohol — the same healthy behaviors that lower the risk of many chronic diseases.
Diet is wrapped up in several of those healthy behaviors, and many studies suggest that diet may also directly play a role. But what makes for a brain-healthy diet? That’s where the research gets muddled.
Despite loads of studies aimed at dissecting the influence of nutrition on dementia, researchers can’t say much with certainty. “I don’t think there’s any question that diet influences dementia risk or a variety of other age-related diseases,” says Matt Kaeberlein, who studies aging at the University of Washington in Seattle. But “are there specific components of diet or specific nutritional strategies that are causal in that connection?” He doubts it will be that simple.
Worth trying
In the United States, an estimated 6.5 million people, the vast majority of whom are over age 65, are living with Alzheimer’s disease and related dementias. Experts expect that by 2060, as the senior population grows, nearly 14 million residents over age 65 will have Alzheimer’s disease. Despite decades of research and more than 100 drug trials, scientists have yet to find a treatment for dementia that does more than curb symptoms temporarily (SN: 7/3/21 & 7/17/21, p. 8). “Really what we need to do is try and prevent it,” says Maria Fiatarone Singh, a geriatrician at the University of Sydney.
Forty percent of dementia cases could be prevented or delayed by modifying a dozen risk factors, according to a 2020 report commissioned by the Lancet. The report doesn’t explicitly call out diet, but some researchers think it plays an important role. After years of fixating on specific foods and dietary components — things like fish oil and vitamin E supplements — many researchers in the field have started looking at dietary patterns.
That shift makes sense. “We do not have vitamin E for breakfast, vitamin C for lunch. We eat foods in combination,” says Nikolaos Scarmeas, a neurologist at National and Kapodistrian University of Athens and Columbia University. He led the study on dementia and anti-inflammatory diets published in Neurology. But a shift from supplements to a whole diet of myriad foods complicates the research. A once-daily pill is easier to swallow than a new, healthier way of eating.
Earning points
Suspecting that inflammation plays a role in dementia, many researchers posit that an anti-inflammatory diet might benefit the brain. In Scarmeas’ study, more than 1,000 older adults in Greece completed a food frequency questionnaire and earned a score based on how “inflammatory” their diet was. The lower the score, the better. For example, fatty fish, which is rich in omega-3 fatty acids, was considered an anti-inflammatory food and earned negative points. Cheese and many other dairy products, high in saturated fat, earned positive points.
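A minimal sketch of how such a score works, assuming a simple weighted sum over reported servings (the foods and point values below are invented for illustration; the study used a published dietary inflammation index):

```python
# Illustrative dietary inflammation score: anti-inflammatory foods get negative
# points, inflammatory ones positive. These weights are invented for the sketch.
FOOD_POINTS = {
    "fatty_fish": -0.4,     # rich in omega-3s -> anti-inflammatory
    "leafy_greens": -0.3,
    "tea_or_coffee": -0.2,
    "cheese": +0.3,         # high in saturated fat -> inflammatory
    "red_meat": +0.4,
    "sugary_snacks": +0.4,
}

def inflammation_score(weekly_servings):
    """Sum points over reported servings; lower totals mean a less inflammatory diet."""
    return sum(FOOD_POINTS.get(food, 0.0) * n for food, n in weekly_servings.items())

print(inflammation_score({"fatty_fish": 3, "leafy_greens": 7, "cheese": 2}))   # negative total
print(inflammation_score({"red_meat": 5, "sugary_snacks": 6, "cheese": 4}))    # positive total
```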
During the next three years, 62 people, or 6 percent of the study participants, developed dementia. People with the highest dietary inflammation scores were three times as likely to develop dementia as those with the lowest. Scores ranged from –5.83 to 6.01. Each point increase was linked to a 21 percent rise in dementia risk.
Such epidemiological studies make connections, but they can’t prove cause and effect. Perhaps people who eat the most anti-inflammatory diets also are those least likely to develop dementia for some other reason. Maybe they have more social interactions. Or it could be, Scarmeas says, that people who eat more inflammatory diets do so because they’re already experiencing changes in their brain that lead them to consume these foods and “what we really see is the reverse causality.”
To sort all this out, researchers rely on randomized controlled trials, the gold standard for providing proof of a causal effect. But in the arena of diet and dementia, these studies have challenges.
Dementia is a disease of aging that takes decades to play out, Kaeberlein says. To show that a particular diet could reduce the risk of dementia, “it would take two-, three-, four-decade studies, which just aren’t feasible.” Many clinical trials last less than two years.
As a work-around, researchers often rely on some intermediate outcome, like changes in cognition. But even that can be hard to observe. “If you’re already relatively healthy and don’t have many risks, you might not show much difference, especially if the duration of the study is relatively short,” says Sue Radd-Vagenas, a nutrition scientist at the University of Sydney. “The thinking is if you’re older and you have more risk factors, it’s more likely we might see something in a short period of time.” Yet older adults might already have some cognitive decline, so it might be more difficult to see an effect.
Many researchers now suspect that intervening earlier will have a bigger impact. “We now know that the brain is stressed from midlife and there’s a tipping point at 65 when things go sour,” says Hussein Yassine, an Alzheimer’s researcher at the Keck School of Medicine of the University of Southern California in Los Angeles. But intervene too early, and a trial might not show any effect. Offering a healthier diet to a 50- or 60-year-old might pay off in the long run but fail to make a difference in cognition that can be measured during the relatively short length of a study.
And it’s not only the timing of the intervention that matters, but also the duration. Do you have to eat a particular diet for two decades for it to have an impact? “We’ve got a problem of timescale,” says Kaarin Anstey, a dementia researcher at the University of New South Wales in Sydney.
And then there are all the complexities that come with studying diet. “You can’t isolate it in the way you can isolate some of the other factors,” Anstey says. “It’s something that you’re exposed to all the time and over decades.”
Food as medicine?
In a clinical trial, researchers often test the effectiveness of a drug by offering half the study participants the medication and half a placebo pill. But when the treatment being tested is food, studies become much more difficult to control. First, food doesn’t come in a pill, so it’s tricky to hide whether participants are in the intervention group or the control group.
Imagine a trial designed to test whether the Mediterranean diet can help slow cognitive decline. The participants aren’t told which group they’re in, but the control group sees that they aren’t getting nuts or fish or olive oil. “What ends up happening is a lot of participants will start actively increasing the consumption of the Mediterranean diet despite being on the control arm, because that’s why they signed up,” Yassine says. “So at the end of the trial, the two groups are not very dissimilar.”
Second, we all need food to live, so a true placebo is out of the question. But what diet should the control group consume? Do you compare the diet intervention to people’s typical diets (which may differ from person to person and country to country)? Do you ask the comparison group to eat a healthy diet but avoid the food expected to provide brain benefits? (Offering them an unhealthy diet would be unethical.)
And tracking what people eat during a clinical trial can be a challenge. Many of these studies rely on food frequency questionnaires to tally up all the foods in an individual’s diet. An ongoing study is assessing the impact of the MIND diet (which combines elements of the Mediterranean diet with the low-salt DASH diet) on cognitive decline. Researchers track adherence to the diet by asking participants to fill out a food frequency questionnaire every six to 12 months. But many of us struggle to remember what we ate a day or two ago. So some researchers also rely on more objective measures to assess compliance. For the MIND diet assessment, researchers are also tracking biomarkers in the blood and urine — vitamins such as folate, B12 and vitamin E, plus levels of certain antioxidants. Another difficulty is that these surveys often don’t account for variables that could be really important, like how the food was prepared and where it came from. Was the fish grilled? Fried? Slathered in butter? “Those things can matter,” says dementia researcher Nathaniel Chin of the University of Wisconsin–Madison.
Plus there are the things researchers can’t control. For example, how does the food interact with an individual’s medications and microbiome? “We know all of those factors have an interplay,” Chin says.
The few clinical trials looking at dementia and diet seem to measure different things, so it’s hard to make comparisons. In 2018, Radd-Vagenas and her colleagues looked at all the trials that had studied the impact of the Mediterranean diet on cognition. There were five at the time. “What struck me even then was how variable the interventions were,” she says. “Some of the studies didn’t even mention olive oil in their intervention. Now, how can you run a Mediterranean diet study and not mention olive oil?”
Another tricky aspect is recruitment. The kind of people who sign up for clinical trials tend to be more educated, more motivated and have healthier lifestyles. That can make differences between the intervention group and the control group difficult to spot. And if the study shows an effect, whether it will apply to the broader, more diverse population comes into question. To sum up, these studies are difficult to design, difficult to conduct and often difficult to interpret.
Kaeberlein studies aging, not dementia specifically, but he follows the research closely and acknowledges that the lack of clear answers can be frustrating. “I get the feeling of wanting to throw up your hands,” he says. But he points out that there may not be a single answer. Many diets can help people maintain a healthy weight and avoid diabetes, and thus reduce the risk of dementia. Beyond that obvious fact, he says, “it’s hard to get definitive answers.”
A better way
In July 2021, Yassine gathered with more than 30 other dementia and nutrition experts for a virtual symposium to discuss the myriad challenges and map out a path forward. The speakers noted several changes that might improve the research.
One idea is to focus on populations at high risk. For example, one clinical trial is looking at the impact of low- and high-fat diets on short-term changes in the brain in people who carry the genetic variant APOE4, a risk factor for Alzheimer’s. One small study suggested that a high-fat Western diet actually improved cognition in some individuals. Researchers hope to get clarity on that surprising result.
Another possible fix is redefining how researchers measure success. Hypertension and diabetes are both well-known risk factors for dementia. So rather than running a clinical trial that looks at whether a particular diet can affect dementia, researchers could look at the impact of diet on one of these risk factors. Plenty of studies have assessed the impact of diet on hypertension and diabetes, but Yassine knows of none launched with dementia prevention as the ultimate goal.
Yassine envisions a study that recruits participants at risk of developing dementia because of genetics or cardiovascular disease and then looks at intermediate outcomes. “For example, a high-salt diet can be associated with hypertension, and hypertension can be associated with dementia,” he says. If the study shows that the diet lowers hypertension, “we achieved our aim.” Then the study could enter a legacy period during which researchers track these individuals for another decade to determine whether the intervention influences cognition and dementia.
One way to amplify the signal in a clinical trial is to combine diet with other interventions likely to reduce the risk of dementia. The Finnish Geriatric Intervention Study to Prevent Cognitive Impairment and Disability, or FINGER, trial, which began in 2009, did just that. Researchers enrolled more than 1,200 individuals ages 60 to 77 who were at an elevated risk of developing dementia and had average or slightly impaired performance on cognition tests. Half received nutritional guidance, worked out at a gym, engaged in online brain-training games and had routine visits with a nurse to talk about managing dementia risk factors like high blood pressure and diabetes. The other half received only general health advice.
After two years, the control group had a 25 percent greater cognitive decline than the intervention group. It was the first trial, reported in the Lancet in 2015, to show that targeting multiple risk factors could slow the pace of cognitive decline.
Now researchers are testing this approach in more than 30 countries. Christy Tangney, a nutrition researcher at Rush University in Chicago, is one of the investigators on the U.S. arm of the study, enrolling 2,000 people ages 60 to 79 who have at least one dementia risk factor. The study is called POINTER, or U.S. Study to Protect Brain Health Through Lifestyle Intervention to Reduce Risk. The COVID-19 pandemic has delayed the research — organizers had to pause the trial briefly — but Tangney expects to have results in the next few years.
This kind of multi-intervention study makes sense, Chin says. “One of the reasons why things are so slow in our field is we’re trying to address a heterogeneous disease with one intervention at a time. And that’s just not going to work.” A trial that tests multiple interventions “allows for people to not be perfect,” he adds. Maybe they can’t follow the diet exactly, but they can stick to the workout program, which might have an effect on its own. The drawback in these kinds of studies, however, is that it’s impossible to tease out the contribution of each individual intervention.
Preemptive guidelines
Two major reports came out in recent years addressing dementia prevention. The first, from the World Health Organization in 2019, recommends a healthy, balanced diet for all adults, and notes that the Mediterranean diet may help people who have normal to mildly impaired cognition.
The 2020 Lancet Commission report, however, does not include diet in its list of modifiable risk factors, at least not yet. “Nutrition and dietary components are challenging to research with controversies still raging around the role of many micronutrients and health outcomes in dementia,” the report notes. The authors point out that a Mediterranean diet or the similar Scandinavian diet might help prevent cognitive decline in people with intact cognition, but “how long the exposure has to be or during which ages is unclear.” Neither report recommends any supplements.
Plenty of people are waiting for some kind of advice to follow. Improving how these studies are done might enable scientists to finally sort out what kinds of diets can help hold back the heartbreaking damage that comes with Alzheimer’s disease. For some people, that knowledge might be enough to create change. “Inevitably, if you’ve had Alzheimer’s in your family, you want to know, ‘What can I do today to potentially reduce my risk?’ ” says molecular biologist Heather Snyder, vice president of medical and scientific relations at the Alzheimer’s Association.
But changing long-term dietary habits can be hard. The foods we eat aren’t just fuel; our diets represent culture and comfort and more. “Food means so much to us,” Chin says.
“Even if you found the perfect diet,” he adds, “how do you get people to agree to and actually change their habits to follow that diet?” The MIND diet, for example, suggests people eat less than one serving of cheese a week. In Wisconsin, where Chin is based, that’s a nonstarter, he says.
But it’s not just about changing individual behaviors. Radd-Vagenas and other researchers hope that if they can show the brain benefits of some of these diets in rigorous studies, policy changes might follow. For example, research shows that lifestyle changes can have a big impact on type 2 diabetes. As a result, many insurance providers now pay for coaching programs that help participants maintain healthy diet and exercise habits.
“You need to establish policies. You need to change cities, change urban design. You need to do a lot of things to enable healthier choices to become easier choices,” Radd-Vagenas says. But that takes meatier data than exist now.
During winter in India’s mountainous Ladakh region, some farmers use pipes and sprinklers to construct building-sized cones of ice. These towering, humanmade glaciers, called ice stupas, slowly release water as they melt during the dry spring months for communities to drink or irrigate crops. But the pipes often freeze when conditions get too cold, stifling construction.
Now, preliminary results show that an automated system can erect an ice stupa while avoiding frozen pipes, using local weather data to control when and how much water is spouted. What’s more, the new system uses roughly a tenth the amount of water that the conventional method uses, researchers reported June 23 at the Frontiers in Hydrology meeting in San Juan, Puerto Rico. “This is one of the technological steps forward that we need to get this innovative idea to the point where it’s realistic as a solution,” says glaciologist Duncan Quincey of the University of Leeds in England, who was not involved in the research. Automation could help communities build larger, longer-lasting ice stupas that provide more water during dry periods, he says.
Ice stupas emerged in 2014 as a means for communities to cope with shrinking alpine glaciers due to human-caused climate change (SN: 5/29/19). Typically, high-mountain communities in India, Kyrgyzstan and Chile pipe glacial meltwater into gravity-driven fountains that sprinkle continuously in the winter. Cold air freezes the drizzle, creating frozen cones that can store millions of liters of water.
The process is simple, though inefficient. More than 70 percent of the spouted water may flow away instead of freezing, says glaciologist Suryanarayanan Balasubramanian of the University of Fribourg in Switzerland.
So Balasubramanian and his team outfitted an ice stupa’s fountain with a computer that automatically adjusted the spout’s flow rate based on local temperatures, humidity and wind speed. Then the scientists tested the system by building two ice stupas in Guttannen, Switzerland — one using a continuously spraying fountain and one using the automated system.
After four months, the team found that the continuously sprinkling fountain had spouted about 1,100 cubic meters of water and amassed 53 cubic meters of ice, with pipes freezing once. The automated system sprayed only around 150 cubic meters of water but formed 61 cubic meters of ice, without any frozen pipes.
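Those figures translate directly into freezing efficiency (a quick check on the reported numbers, not an analysis from the study itself):

```python
# Freezing efficiency of each fountain, computed straight from the reported figures.
manual_water, manual_ice = 1100, 53      # cubic meters over four months
auto_water, auto_ice = 150, 61

print(round(manual_ice / manual_water * 100, 1))   # ~4.8% of sprayed water became ice
print(round(auto_ice / auto_water * 100, 1))       # ~40.7% with the automated system
print(round(auto_water / manual_water, 2))         # ~0.14: roughly a tenth of the water used
```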
The researchers are now trying to simplify their prototype to make it more affordable for high-mountain communities around the world. “We eventually want to reduce the cost so that it is within two months of salary of the farmers in Ladakh,” Balasubramanian says. “Around $200 to $400.”
“What does not kill me, makes me stronger,” 19th century German philosopher Friedrich Nietzsche famously wrote. Variations of that aphorism abound in literary, spiritual and, more recently, psychological texts.
Psychological research in that vein suggests that at least half of trauma survivors not only recover from traumatic experiences but also go on to develop more appreciation for life, stronger relationships and emotional strength — a phenomenon researchers call “posttraumatic growth.”
The idea that bad events can often lead to good outcomes is appealing, especially in this difficult moment. More than 6.3 million people worldwide have died from COVID-19, and deaths continue to mount (SN: 5/18/22). Russia’s invasion of Ukraine has surpassed a hundred days (SN: 4/12/22). And a recent string of mass shootings — including at a July 4 parade in Highland Park, Ill., an elementary school in Uvalde, Texas, and a grocery store in Buffalo, N.Y. — has left U.S. communities reeling (SN: 5/26/22).
But in a series of talks presented in May in Chicago at the Association for Psychological Science conference, some researchers called findings of posttraumatic growth “largely illusory.” Growth studies suffer from serious methodological flaws, these researchers say. That includes a reliance on surveys that require people to assess their personal growth over time, a task that most people struggle with.
Better research tools may not remedy the problem. That’s because these studies are fundamentally flawed, the researchers say. They argue that the impetus to study trauma in terms of growth stems from a Western mind-set that tends to value positive emotions and devalue or even avoid negative emotions (SN: 12/7/19). That can pressure survivors to deny or suppress their negative feelings, which could have harmful consequences.
That yearning for positive outcomes can create “toxic cultural narratives,” says personality psychologist Eranda Jayawickreme of Wake Forest University in Winston-Salem, N.C. Referring to parents who lost a child in the Uvalde mass shooting, he says: “There is something grotesque about this expectation that people could come back from something like this.”
Focusing on the good
A half-century ago, psychologists largely treated a person’s difficulty in rebounding from traumatic events as a personal failing. But research on returning Vietnam War veterans and other trauma survivors began shifting that way of thinking. In 1980, the American Psychiatric Association created a category for posttraumatic stress disorder, or PTSD, in its manual of mental disorders. Those struggling with the disorder might experience flashbacks, nightmares and severe anxiety.
“Thirty years ago, everyone was focused on the worst outcomes,” says trauma and resilience expert George Bonanno of Columbia University.
But only about one-fifth of people who experience trauma develop PTSD, psychologists realized. In 1996, psychologists Richard Tedeschi and Lawrence Calhoun, both then at the University of North Carolina at Charlotte, wrote in the Journal of Traumatic Stress that a focus on suffering obscured the good that can emerge from trauma.
The pair developed a now widely used “growth inventory” to assess positive outcomes that people reported after experiencing a traumatic event. In the 21-item survey, respondents rate statements such as “I have more compassion for others” and “I established a new path for my life.” Respondents can choose from a score of 0 for “I did not experience this change as a result of my crisis” to 5 for “I experienced this change to a very great degree as a result of my crisis.” The items reflect five broad categories: relating to others, personal strength, new possibilities, appreciation of life and spiritual change.
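A minimal sketch of the scoring mechanics, assuming a simple sum of item ratings (the 0-to-5 scale, 21 items and five categories come from the article; the item-to-category split below is an invented placeholder):

```python
# Illustrative scoring for a 21-item, 0-5 growth inventory grouped into five categories.
# The real inventory's wording and item assignments differ; this shows only the mechanics.
def score_inventory(ratings_by_category):
    """Sum each category's 0-5 item ratings and an overall total."""
    assert all(0 <= r <= 5 for rs in ratings_by_category.values() for r in rs)
    scores = {cat: sum(rs) for cat, rs in ratings_by_category.items()}
    scores["total"] = sum(scores.values())    # 21 items x 5 points = 105 maximum
    return scores

example = {                                   # item counts per category are invented
    "relating_to_others": [4, 3, 5, 2, 4, 3, 4],
    "personal_strength": [5, 4, 3, 4],
    "new_possibilities": [2, 3, 3, 4, 2],
    "appreciation_of_life": [5, 5, 4],
    "spiritual_change": [1, 2],
}
print(score_inventory(example))
```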
“Posttraumatic growth is you go down to the depths and then at some later point, rise above that baseline into some other realm,” says Tedeschi, now the chair of Boulder Crest Institute for Posttraumatic Growth, an organization in Bluemont, Va., that uses research on posttraumatic growth to help combat veterans, first responders and their families.
Cracks in the theory
But some researchers soon began to question people’s ability to accurately respond to the growth survey.
In a now-seminal 2009 study in Psychological Science, researchers recruited about 1,500 undergraduate students and tracked them for eight weeks. Once at the start of the study and once at the end, students responded to surveys for each of the five growth categories covered by the growth inventory, along with a modified version of the inventory.
Unlike Tedeschi and Calhoun’s inventory, which asks respondents to compare their present state of mind to the past, the five surveys and modified inventory asked respondents to reflect on the present. For instance, in the relationship quality category, respondents rated the statement: “I enjoy personal and mutual conversations with family members or friends.”
During that eight-week period, 122 students reported experiencing a traumatic event that caused high levels of distress. Those students also completed the standard growth inventory at the end of the eight weeks. The researchers found no correlation between high perceived growth scores on the standard inventory and actual growth scores on the modified inventory and five surveys related to well-being.
That mismatch occurs because people are terrible at remembering how they felt in the past, says study coauthor Crystal Park, a clinical health psychologist at the University of Connecticut in Storrs. “I want to throw out the concept of people being able to accurately report that they have grown.”
Tedeschi counters that trauma cleaves one’s life into a before and after, making it more feasible for many people to distinguish changes over time. What’s more, growth takes time, he says. “You can’t expect people to change their spiritual beliefs in eight weeks.”
But how much time? And if enough time passes, how does one know that the growth arises from the trauma and not some other major life experience, like moving across the country or having children? Research shows that people are not great at introspection, Jayawickreme says. “We tend to come up with all types of stories for change, but those stories are just stories. They don’t really reflect what actually caused that change.”
‘Pulling for growth’
Other problems come with the design of the original inventory, Jayawickreme and others argue. For example, the questions account for positive changes only, and the worst or lowest outcome a respondent can give is that they experienced no change as a result of their crisis. That framing puts pressure on survivors to report growth when they may feel worse, Jayawickreme says. “The items are pulling for growth.”
Tedeschi and colleagues have recently developed a new growth survey that also accounts for negative changes following trauma. But it’s not yet as widely used in research on posttraumatic growth as the original.
Meanwhile, in unpublished work, trauma researcher Adriel Boals of the University of North Texas in Denton sought to overcome what he sees as the original survey’s growth bias in a different way. He asked people whether they changed because of a traumatic event, a measure of growth, or despite the event.
“Half the people are picking, ‘I changed despite this event,’” Boals says. That suggests that people who may report growth on the original inventory do not actually attribute that growth to the trauma but to other life experiences.
What’s more, say Jayawickreme and others, unlike most psychological surveys, which require respondents to complete a single cognitive task along the lines of “how are you doing now” or “reflect on the present,” the original growth inventory requires four cognitive tasks. Respondents must consider how they are doing at the moment and how they were doing prior to the trauma. They then must compute that difference in well-being, as well as determine if the trauma, or some other life event such as aging or having a child, caused the change.
But people don’t actually go through those steps. That finding was reported in March in Anxiety, Stress and Coping and at the May conference by Boals, Park and Elizabeth Griffith, a psychologist also at the University of North Texas.
When the team compared response times among college students taking the standard growth inventory with students taking a simplified version involving a single cognitive task, the researchers found that the students completing the standard inventory took just 8 percent longer than the other students. That equates to just a half-second longer response time per item, Boals says. “If you can go through steps 2, 3 and 4 … in half a second, you’re drinking stronger coffee than I am.”
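Back-calculating from those two reported figures gives a sense of the implied per-item times (our arithmetic, not numbers given in the paper):

```python
# Implied per-item response times, assuming the 8% overall increase applies per item.
extra_per_item = 0.5                 # seconds: the reported extra time per item
relative_increase = 0.08             # the standard inventory took ~8% longer
baseline_per_item = extra_per_item / relative_increase
print(baseline_per_item)                    # 6.25 s per item on the simplified version
print(baseline_per_item + extra_per_item)   # 6.75 s per item on the standard version
```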
Avoiding negative emotions
Beyond the methodological particulars, some researchers studying trauma across cultures question some Western psychologists’ focus on personal growth and, to a lesser degree, resilience — a phenomenon marked by stability during hard times, rather than a steep increase in well-being after an initial decline as observed in posttraumatic growth (SN: 10/19/14).
“In Euro-American societies, ‘resilience’ and ‘posttraumatic growth’ are commonly used metaphorical terms for positive responses to extreme adversity,” write University of Zurich psychologists Iara Meili and Andreas Maercker in 2019 in Transcultural Psychiatry. They attribute the popularity of these Western “metaphors” to individualistic societies’ emphasis on self-determination and agency, even in the face of illness or death. These concepts exist elsewhere, Maercker says. But they rarely take on such outsize significance — what he calls “almost a religious approach.”
A societal expectation of growth can put tremendous pressure on survivors to hide their suffering, Jayawickreme says.
Take, for instance, a study where researchers compared outcomes among about 380 Norwegian soldiers who deployed to Iraq in 2004. Soldiers who reported the most personal growth five months after returning home also reported the most posttraumatic stress symptoms 10 months later, researchers noted in 2015 in Clinical Psychological Science.
“Self-reports of posttraumatic growth should be highly correlated with doing well on other measurements, like less depression, more satisfaction with life, those types of things,” says Boals, who was not involved with this research. “But if anything, [growth] is positively correlated with PTSD symptoms.”
Ultimately, the most compassionate response to suffering is to validate survivors’ feelings, Jayawickreme says. “How people respond to adversity is nuanced. People can change in positive ways. People can change in negative ways. People cannot change at all. And that’s fine.”
A new look at the starlet sea anemone’s stinger gets right to the point.
Live-animal images and 3-D computer reconstructions have revealed the complex architecture of the tiny creature’s needlelike weapons. Like a harpoon festooned with venomous barbs, the stinger rapidly transforms as it fires, biologists Matt Gibson, Ahmet Karabulut and colleagues report June 17 in Nature Communications.
Scientists can now see in exquisite detail “what this apparatus looks like before, during and after firing,” says Gibson, of the Stowers Institute for Medical Research in Kansas City, Mo. In the wild, the starlet sea anemone (Nematostella vectensis) can live in salty lagoons or shallow estuaries, where freshwater rivers meet the sea. Its tubular body burrows into the mud, and a crown of Medusa-like tentacles reaches up into the water, waiting for dinner to drift by (SN: 5/7/13). Each tentacle is packing heat: hundreds of stingers that can mean death for brine shrimp or free-floating plankton.
These stingers are among the fastest micromachines in nature. An anemone can jab a predator or nab some lunch in about a hundredth of a second, says Karabulut, also of the Stowers Institute. Scientists had an idea of how such stingers worked, but until now, had never gotten so up close and personal.
The researchers used fluorescent dye to see the stingers in action and scanning electron microscopy to reconstruct their three-dimensional structure. The work reveals the precise, step-by-step mechanics of the speedy shooters. Packed inside a stinger’s capsule, a venomous thread coils around a central shaft. When triggered, the shaft explodes out of the pressurized capsule and extends, turning itself inside out like a sock. Finally, the thread races up through the shaft, sending its barbs into an animal’s soft tissue.
Each stinger is good for just one shot. “It’s a one-hit wonder,” Karabulut says. “Once Nematostella uses it, it’s gone.”