Treatments for pain and other common health problems often fall short, leading to untold misery and frustration. So it’s not hard to understand the lure of a treatment that promises to be benign, natural and good for just about everything that ails you. Enter cannabidiol, or CBD.
So far, the U.S. Food and Drug Administration has approved only one drug containing the chemical: a treatment for rare and severe forms of epilepsy. But that hasn’t stopped people from trying CBD to relieve arthritis, morning sickness, pain, depression, anxiety, addiction, inflammation and acne. And it hasn’t kept companies from marketing the heck out of CBD-infused anything. It’s the sort of situation that gets us wondering: What’s the science here? The science is skimpy at best, neuroscience writer Laura Sanders reports in this issue. Clinical trials, some of which included children, were conducted to determine safety and efficacy before the FDA approved the first CBD-based epilepsy drug in 2018. But much less research has been done on CBD with regard to other ailments.
Adding to the intrigue, CBD can be extracted from marijuana, though CBD lacks the capacity to induce a buzzy high like its sister molecule THC. So government restrictions have been tight, and scientists have had a hard time getting access to CBD for studies. That makes it less likely that we’ll get clear answers anytime soon on whether CBD is indeed a panacea, or just another triumph of hype.
The surplus of unknowns hasn’t stopped companies from marketing hundreds of CBD products as treatments, attempting to avoid scrutiny by adding disclaimers that the products “are not intended to diagnose, treat, cure or prevent any disease.” But with such large gaps in the research, people trying these products in the hope of benefit become inadvertent guinea pigs.
The process of science may be frustratingly slow, but it can get the job done. In the last decade, clinical trials on vitamin D, for example, have found that despite much excitement surrounding the “sunshine vitamin,” there’s no definitive evidence of benefits in preventing heart disease or cancer. In our recent cover story “Vitamin D supplements aren’t living up to their hype,” contributing correspondent Laura Beil described the years of effort needed to develop that data (SN: 2/2/19, p. 16). As journalists, we see a big part of our mission as making sure that people have access to accurate, timely information about medical research, so they can make informed decisions for themselves and their families. That’s especially important when it involves products that people can self-prescribe. These two articles — by skilled journalists who put weeks of effort into reading studies, talking with researchers and investigating the business side — are great examples of how sophisticated and useful consumer science journalism can be. Most people look for health information online, but Googling a term like “CBD oil” serves up a muddle of marketing masquerading as impartial information.
CBD may end up being a worthwhile treatment for some problems beyond epilepsy; it’s too early to know. But while we wait for the evidence, it’s essential to know where the science stands right now.
A drug that treats a rare form of cystic fibrosis may have even better results if given before birth, a study in ferrets suggests.
The drug, known by the generic name ivacaftor, can restore the function of a faulty version of the CFTR protein, called CFTR-G551D. The normal CFTR protein controls the flow of charged atoms in cells that make mucus, sweat, saliva, tears and digestive enzymes. People who are missing the CFTR gene and its protein, or have two copies of a damaged version of the gene, develop the lung disease cystic fibrosis, as well as diabetes, digestive problems and male infertility. Ivacaftor can reduce lung problems in patients with the G551D protein defect, with treatment usually starting when a patient is a year old. But if the results of the new animal study carry over to humans, an even earlier start date could prove more effective in preventing damage to multiple organs.
Researchers used ferret embryos with two copies of the G551D version of the CFTR gene. Giving the drug to mothers while the ferrets were in the womb and then continuing treatment of the babies after birth prevented male infertility, pancreas problems and lung disease in the baby ferrets, called kits, researchers report March 27 in Science Translational Medicine. The drug has to be used continuously to prevent organ damage — when the drug was discontinued, the kits’ pancreases began to fail and lung disease set in.
Cystic fibrosis affects about 30,000 people in the United States and 70,000 worldwide. But only up to 5 percent of patients have the G551D defect.
Other researchers are testing combinations of three drugs, including ivacaftor, aimed at helping the roughly 90 percent of cystic fibrosis patients afflicted by another genetic mutation that causes the CFTR protein to lack an amino acid (SN: 11/24/18, p. 11). Those drug combos, if proven effective, might also work better if administered early, cystic fibrosis researcher Thomas Ferkol of Washington University School of Medicine in St. Louis writes in a commentary published with the study.
Black holes are extremely camera shy. Supermassive black holes, ensconced in the centers of galaxies, make themselves visible by spewing bright jets of charged particles or by flinging away or ripping up nearby stars. Up close, these behemoths are surrounded by glowing accretion disks of infalling material. But because a black hole’s extreme gravity prevents light from escaping, the dark hearts of these cosmic heavy hitters remain entirely invisible.
Luckily, there’s a way to “see” a black hole without peering into the abyss itself. Telescopes can look instead for the silhouette of a black hole’s event horizon — the perimeter inside which nothing can be seen and from which nothing can escape — against its accretion disk. That’s what the Event Horizon Telescope, or EHT, did in April 2017, collecting data that has now yielded the first image of a supermassive black hole, the one inside the galaxy M87.
“There is nothing better than having an image,” says Harvard University astrophysicist Avi Loeb. Though scientists have collected plenty of indirect evidence for black holes over the last half century, “seeing is believing.”
Creating that first-ever portrait of a black hole was tricky, though. Black holes take up a minuscule sliver of sky and, from Earth, appear very faint. The project of imaging M87’s black hole required observatories across the globe working in tandem as one virtual Earth-sized radio dish with sharper vision than any single observatory could achieve on its own.

Putting the ‘solution’ in resolution

Weighing in at around 6.5 billion times the mass of our sun, the supermassive black hole inside M87 is no small fry. But viewed from 55 million light-years away on Earth, the black hole is only about 42 microarcseconds across on the sky. That’s smaller than an orange on the moon would appear to someone on Earth. Still, besides the black hole at the center of our own galaxy, Sagittarius A* or Sgr A* — the EHT’s other imaging target — M87’s black hole is the largest black hole silhouette on the sky. Only a telescope with unprecedented resolution could pick out something so tiny. (For comparison, the Hubble Space Telescope can distinguish objects only about as small as 50,000 microarcseconds.)

A telescope’s resolution depends on its diameter: The bigger the dish, the clearer the view — and getting a crisp image of a supermassive black hole would require a planet-sized radio dish. Even for radio astronomers, who are no strangers to building big dishes (SN Online: 9/29/17), “this seems a little too ambitious,” says Loeb, who was not involved in the black hole imaging project. “The trick is that you don’t cover the entire Earth with an observatory.” Instead, a technique called very long baseline interferometry combines radio waves seen by many telescopes at once, so that the telescopes effectively work together like one giant dish. The diameter of that virtual dish is equal to the length of the longest distance, or baseline, between two telescopes in the network. For the EHT in 2017, that was the distance from the South Pole to Spain.
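For readers who like to check the arithmetic, the quoted figures are consistent with the standard diffraction estimate that a dish's resolution is roughly the observing wavelength divided by the dish diameter. Here is a minimal back-of-the-envelope sketch in Python; the millimeter wavelength and the Earth-diameter baseline are illustrative assumptions, not numbers supplied by the EHT team.

```python
# Rough diffraction-limit check for an Earth-sized virtual dish.
# Both input values are assumptions for illustration: the EHT observes
# near 1.3 millimeters, and the longest baseline is taken as roughly
# one Earth diameter.
import math

wavelength_m = 1.3e-3   # ~1.3 mm radio waves (assumed)
baseline_m = 1.27e7     # ~Earth's diameter in meters (assumed)

theta_rad = wavelength_m / baseline_m                 # resolution ~ lambda / D
microarcsec_per_rad = math.degrees(1) * 3600 * 1e6    # radians -> microarcseconds
theta_uas = theta_rad * microarcsec_per_rad

print(f"Angular resolution ~ {theta_uas:.0f} microarcseconds")
# Prints roughly 20 microarcseconds: fine enough to resolve a silhouette
# about 42 microarcseconds across, and far sharper than Hubble's
# ~50,000-microarcsecond limit quoted in the story.
```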
Telescopes, assemble!

The EHT was not always the hotshot array that it is today, though. In 2009, a network of just four observatories — in Arizona, California and Hawaii — got the first good look at the base of one of the plasma jets spewing from the black hole at M87’s center (SN: 11/3/12, p. 10). But the small telescope cohort didn’t yet have the magnifying power to reveal the black hole itself.
Over time, the EHT recruited new radio observatories. By 2017, there were eight observing stations in North America, Hawaii, Europe, South America and the South Pole. Among the newcomers was the Atacama Large Millimeter/submillimeter Array, or ALMA, located on a high plateau in northern Chile. With a combined dish area larger than an American football field, ALMA collects far more radio waves than other observatories.
“ALMA changed everything,” says Vincent Fish, an astronomer at MIT’s Haystack Observatory in Westford, Mass. “Anything that you were just barely struggling to detect before, you get really solid detections now.”

More than the sum of their parts

EHT observing campaigns are best run within about 10 days in late March or early April, when the weather at every observatory promises to be the most cooperative. Researchers’ biggest enemy is water in the atmosphere, like rain or snow, which can muddle the millimeter-wavelength radio waves that the EHT’s telescopes are tuned to.
But planning for weather on several continents can be a logistical headache.
“Every morning, there’s a frenetic set of phone calls and analyses of weather data and telescope readiness, and then we make a go/no-go decision for the night’s observing,” says astronomer Geoffrey Bower of the Academia Sinica Institute of Astronomy and Astrophysics in Hilo, Hawaii. Early in the campaign, researchers are picky about conditions. But toward the tail end of the run, they’ll take what they can get.
When the skies are clear enough to observe, researchers steer the telescopes at each EHT observatory toward the vicinity of a supermassive black hole and begin collecting radio waves. Since M87’s black hole and Sgr A* appear on the sky one at a time — each one about to rise just as the other sets — the EHT can switch back and forth between observing its two targets over the course of a single multi-day campaign. All eight observatories can track Sgr A*, but M87 is in the northern sky and beyond the South Pole station’s sight.
On their own, the data from each observing station look like nonsense. But taken together using the very long baseline interferometry technique, these data can reveal a black hole’s appearance.
Here’s how it works. Picture a pair of radio dishes aimed at a single target, in this case the ring-shaped silhouette of a black hole. The radio waves emanating from each bit of that ring must travel slightly different paths to reach each telescope. These radio waves can interfere with each other, sometimes reinforcing one another and sometimes canceling each other out. The interference pattern seen by each telescope depends on how the radio waves from different parts of the ring are interacting when they reach that telescope’s location. For simple targets, such as individual stars, the radio wave patterns picked up by a single pair of telescopes provide enough information for researchers to work backward and figure out what distribution of light must have produced those data. But for a source with complex structure, like a black hole, there are too many possible solutions for what the image could be. Researchers need more data to work out how a black hole’s radio waves are interacting with each other, offering more clues about what the black hole looks like.
The ideal array has as many baselines of different lengths and orientations as possible. Telescope pairs that are farther apart can see finer details, because there’s a bigger difference between the pathways that radio waves take from the black hole to each telescope. The EHT includes telescope pairs with both north-south and east-west orientations, which change relative to the black hole as Earth rotates.
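To see in miniature why extra telescope pairs matter, here is a toy numerical sketch in Python with NumPy. It is not EHT software: real baselines trace arcs through the data as Earth turns, whereas this example simply keeps a random subset of an image's Fourier components as a stand-in for the components a handful of baselines would measure.

```python
# Toy demo (not EHT code): each telescope pair measures roughly one Fourier
# component of the sky, and keeping more components makes the reconstructed
# picture look more like the true source. All numbers here are invented.
import numpy as np

n = 64
yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]

# A thin bright ring stands in for a black hole silhouette.
sky = ((np.hypot(xx, yy) > 18) & (np.hypot(xx, yy) < 22)).astype(float)
full_vis = np.fft.fft2(sky)   # every Fourier component the sky contains

rng = np.random.default_rng(0)

def reconstruct(num_components):
    """Invert only a random subset of Fourier components, standing in for
    the limited set of baselines in a telescope network."""
    mask = np.zeros(n * n, dtype=bool)
    mask[rng.choice(n * n, size=num_components, replace=False)] = True
    return np.abs(np.fft.ifft2(full_vis * mask.reshape(n, n)))

for num in (16, 256, 4096):
    image = reconstruct(num)
    match = np.corrcoef(image.ravel(), sky.ravel())[0, 1]
    print(f"{num:4d} Fourier components kept -> match with true ring: {match:.2f}")
# The match climbs toward 1 as more components are kept, which is the
# basic reason the EHT wants as many distinct baselines as possible.
```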
Pulling it all together

In order to braid together the observations from each observatory, researchers need to record times for their data with exquisite precision. For that, they use hydrogen maser atomic clocks, which lose about one second every 100 million years.
There are a lot of data to time stamp. “In our last experiment, we recorded data at a rate of 64 gigabits per second, which is about 1,000 times [faster than] your home internet connection,” Bower says.
These data are then transferred to MIT Haystack Observatory and the Max Planck Institute for Radio Astronomy in Bonn, Germany, for processing in a special kind of supercomputer called a correlator. But each telescope station amasses hundreds of terabytes of information during a single observing campaign — far too much to send over the internet. So the researchers use the next best option: snail mail. So far, there have been no major shipping mishaps, but Bower admits that mailing the disks is always a little nerve-wracking.
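A rough bit of arithmetic shows why shipping disks beats the internet here. Only the 64-gigabit-per-second recording rate comes from Bower's description above; the observing hours, number of nights and home-broadband speed in the sketch below are illustrative assumptions.

```python
# Back-of-the-envelope estimate of one station's data haul and how long it
# would take to upload. Only the recording rate comes from the story; the
# observing time and the broadband speed are assumed for illustration.
RECORD_RATE_BITS_PER_S = 64e9        # 64 gigabits per second (from the story)
HOURS_PER_NIGHT = 6                  # assumed
NIGHTS_PER_CAMPAIGN = 4              # assumed
HOME_BROADBAND_BITS_PER_S = 100e6    # assumed 100-megabit connection

total_bits = RECORD_RATE_BITS_PER_S * HOURS_PER_NIGHT * 3600 * NIGHTS_PER_CAMPAIGN
print(f"Recorded per station: about {total_bits / 8 / 1e12:.0f} terabytes")

upload_days = total_bits / HOME_BROADBAND_BITS_PER_S / 86400
print(f"Upload time over home broadband: about {upload_days:.0f} days")
# Roughly 700 terabytes and well over a year of uploading, which is why
# the hard drives travel by plane and mail instead.
```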
Though most of the EHT data reached Haystack and Max Planck within weeks of the 2017 observing campaign, there were no flights from the South Pole until November. “We didn’t get the data back from the South Pole until mid-December,” says Fish, the MIT Haystack astronomer.
Filling in the blanks

Combining the EHT data still isn’t enough to render a vivid picture of a supermassive black hole. If M87’s black hole were a song, then imaging it using only the combined EHT data would be like listening to the piece played on a piano with a bunch of broken keys. The more working keys — or telescope baseline pairs — the easier it is to get the gist of the melody. “Even if you have some broken keys, if you’re playing all the rest of them correctly, you can figure out the tune, and that’s partly because we know what music sounds like,” Fish says. “The reason we can reconstruct images, even though we don’t have 100 percent of the information, is because we know what images look like” in general.

There are mathematical rules about how much randomness any given picture can contain, how bright it should be and how likely it is that neighboring pixels will look similar. Those basic guidelines can inform how software decides which potential images, or data interpretations, make the most sense.
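Those "rules about what images look like" can be made concrete with a small hypothetical example. The sketch below, again in NumPy and not drawn from the EHT pipeline, builds two candidate images that fit the same incomplete set of Fourier measurements equally well and shows that a simple smoothness score prefers the one without spurious high-frequency static, standing in for the kinds of guidelines the story describes.

```python
# Toy illustration (not the EHT pipeline) of tie-breaking between images
# that fit incomplete Fourier data equally well. All sizes are invented.
import numpy as np

rng = np.random.default_rng(1)
n = 32
yy, xx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
truth = ((np.hypot(xx, yy) > 8) & (np.hypot(xx, yy) < 11)).astype(float)  # a ring

# Pretend only ~15 percent of Fourier components were measured. The mask is
# symmetrized so real-valued images can differ only at unmeasured frequencies.
raw = rng.random((n, n)) < 0.15
mask = raw | np.roll(raw[::-1, ::-1], (1, 1), axis=(0, 1))
data = np.fft.fft2(truth) * mask

def misfit(img):
    """How badly an image disagrees with the measured components."""
    return np.sum(np.abs(np.fft.fft2(img) * mask - data) ** 2)

def roughness(img):
    """How much neighboring pixels disagree with each other."""
    return np.sum(np.diff(img, axis=0) ** 2) + np.sum(np.diff(img, axis=1) ** 2)

# Candidate A: the plain inverse transform of the measured data.
# Candidate B: the same, plus "static" hidden entirely in unmeasured frequencies.
cand_a = np.real(np.fft.ifft2(data))
hidden = np.fft.fft2(rng.standard_normal((n, n))) * ~mask
cand_b = cand_a + 0.3 * np.real(np.fft.ifft2(hidden))

for name, img in [("candidate A", cand_a), ("candidate B", cand_b)]:
    print(f"{name}: misfit = {misfit(img):.3g}, roughness = {roughness(img):.1f}")
# Both candidates match the data essentially perfectly, but the roughness
# score singles out the smoother one -- the kind of prior that helps the
# imaging software choose among possible pictures.
```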
Before the 2017 observing campaign, the EHT researchers held a series of imaging challenges to make sure their computer algorithms weren’t biased toward creating images to match expectations of what black holes should look like. One person would use a secret image to generate faux data of what telescopes would see if they were peering at that source. Then other researchers would try to reconstruct the original image.
“Sometimes the true image was not actually a black hole image,” Fish says, “so if your algorithm was trying to find a black hole shadow … you wouldn’t do well.” The practice runs helped the researchers refine the data processing techniques used to render the M87 image.
Black holes and beyond

So, the black hole inside M87 finally got its closeup. Now what?
The EHT’s black hole observations are expected to help answer questions like how some supermassive black holes, including M87’s, launch such bright plasma jets (SN Online: 3/29/19). Understanding how gas falls into and feeds black holes could also help solve the mystery of how some black holes grew so quickly in the early universe, Loeb says (SN Online: 3/16/18).
The EHT could also be used, Loeb suggests, to find pairs of supermassive black holes orbiting one another — similar to the two stellar mass black holes whose collision created gravitational waves detected in 2015 by the Advanced Laser Interferometer Gravitational-Wave Observatory, or Advanced LIGO (SN: 3/5/16, p. 6). Getting a census of these binaries may help researchers identify targets for the Laser Interferometer Space Antenna, or LISA, which will search from space for gravitational waves kicked up by the movement of objects like black holes (SN Online: 6/20/17).

The EHT doesn’t have many viable targets other than supermassive black holes, says astrophysicist Daniel Marrone of the University of Arizona in Tucson. There are few other things in the universe that appear as tiny but luminous as the space surrounding a supermassive black hole. “You have to be able to get enough light out of the really tiny patches of sky that we can detect,” Marrone says. “In principle, we could be reading alien license plates or something,” but they’d need to be super bright.
Too bad for alien seekers. Still, even if the EHT is a one-trick pony, spying supermassive black holes is a pretty neat trick.
Ketamine banishes depression by slowly coaxing nerve cells to sprout new connections, a study of mice suggests. The finding, published in the April 12 Science, may help explain how the hallucinogenic anesthetic can ease some people’s severe depression.
The results are timely, coming on the heels of the U.S. Food and Drug Administration’s March 5 approval of a nasal spray containing a form of ketamine called esketamine for hard-to-treat depression (SN Online: 3/21/19). But lots of questions remain about the drug. “There is still a lot of mystery in terms of how ketamine works in the brain,” says neuroscientist Alex Kwan of Yale University. The new study adds strong evidence that newly created nerve cell connections are involved in ketamine’s antidepressant effects, he says.
While typical antidepressants can take weeks to begin working, ketamine can make people feel better in hours. Scientists led by neuroscientist Conor Liston suspected that ketamine might quickly be remodeling the brain by spurring new nerve cell connections called synapses. “As it turned out, that wasn’t true, not in the way we expected, anyway,” says Liston, of Cornell University.
Newly created synapses aren’t involved in ketamine’s immediate effects on behavior, the researchers found. But the nerve cell connections do appear to help sustain the drug’s antidepressant benefits over the longer term.
To approximate depression in people, researchers studied mice that had been stressed for weeks, either by being restrained daily in mesh tubes, or by receiving injections of the stress hormone corticosterone. These mice began showing signs of despair, such as losing their taste for sweet water and giving up a struggle when dangled by their tails. Three hours after a dose of ketamine, the mice’s behavior improved, as the researchers expected. But the team found no effects of the drug on nerve cells’ dendritic spines — tiny signal-receiving blebs that help make new neural connections. So the creation of new synapses couldn’t be responsible for ketamine’s immediate effects on behavior, “because the behavior came first,” Liston says.
When the researchers looked over a longer time span, though, they found that these new synapses were key. About 12 hours after ketamine treatment, new dendritic spines began to pop into existence on nerve cells in part of the mice’s prefrontal cortex, the brain area responsible for complex thinking. These dendritic spines seemed to be replacing those lost during the period of stress, often along the same stretch of neuron.
To test if these newly created spines were important for the mice’s improved behavior, the researchers destroyed the spines with a laser a day after the ketamine treatment. That effectively erased ketamine’s effects, and the mice again exhibited behavior resembling depression, including struggling less when held by their tails. (The mice kept their regained sugar preference.)
Research on humans has also suggested that depressed people have diminished synapses, says Ronald S. Duman, a neuroscientist at Yale University not involved in the study. The new work adds more support to those findings by showing that destroying new synapses can block ketamine’s behavioral effects. “That’s a huge contribution and advance,” Duman says.
The sun’s rhythm may have set the pace of each day, but when early humans needed a way to keep time beyond a single day and night, they looked to a second light in the sky. The moon was one of humankind’s first timepieces long before the first written language, before the earliest organized cities and well before structured religions. The moon’s face changes nightly and with the regularity of the seasons, making it a reliable marker of time.
“It’s an obvious timepiece,” Anthony Aveni says of the moon. Aveni is a professor emeritus of astronomy and anthropology at Colgate University in Hamilton, N.Y., and a founder of the field of archaeoastronomy. “There is good evidence that [lunar timekeeping] was around as early as 25,000, 30,000, 35,000 years before the present.”
When people began depicting what they saw in the natural world, two common motifs were animals and the night sky. One of the earliest known cave paintings, dated to at least 40,000 years ago in a cave on the island of Borneo, includes a wild bull with horns. European cave art dating to about 37,000 years ago depicts wild cattle too, as well as geometric shapes that some researchers interpret as star patterns and the moon.
For decades, prehistorians and other archaeologists believed that ancient humans were portraying what they saw in the natural world because of an innate creative streak. The modern idea that Paleolithic people were depicting nature for more than artistic reasons gained traction at the end of the 19th century and was further developed in the early 20th century by Abbé Henri Breuil, a French Catholic priest and archaeologist. He interpreted the stylistic bison and lions in the cave paintings and carvings of southern France as ritual art designed to bring luck to the hunt.
In the 1960s, a journalist–turned–amateur anthropologist proposed even more practical purposes for these drawings and other artifacts: They were created for telling time.
In the early days of the Apollo space missions, the journalist, Alexander Marshack, was writing a book about how the course of human history culminated in the moon shot. He delved into prehistory, trying to understand the earliest concepts of timekeeping and agriculture (SN: 4/14/79, p. 252).
“I had a profound sense of something missing,” Marshack wrote in his 1972 book, The Roots of Civilization. Formal science, including astronomy and math, apparently had begun “suddenly,” he noted. Same with writing, agriculture, art and the calendar. But surely these cognitive leaps took thousands of years of preparation, Marshack reasoned: “How many thousands was the question.”
To find out, he examined ancient bone carvings and wall art from locations including caves in Western Europe and fishing villages of equatorial Africa. He interpreted what was seen by some as simple dots and dashes or depictions of animals and people as sophisticated tools for keeping track of time — via the moon. Today, some experts support his thesis; others remain unconvinced. It’s easy enough to keep track of the seasons just by paying attention to the environment, of course. Throughout the world, animals like deer and cattle are pregnant through the winter’s dark privation; they give birth when the leaves appear on trees and when grasses grow tall.
Early humans of 30,000 years ago frequently connected the changes in these “phenophases,” the seasonal stages of flora and fauna, with the appearance of certain stars and the phases of the moon, says science historian and astronomer Michael Rappenglück of the Adult Education Center and Observatory in Gilching, Germany. He refers to early cave depictions as “paleo-almanacs” because they combined time-reckoning with information related to the cycles of life.
As Rappenglück puts it, simply noting the spinning of the seasons would not be enough to keep time. For one thing, flora and fauna change from place to place, and even 30,000 years ago, humans were traveling great distances in search of food. They needed something more constant to help them tell time.
“People carefully watched the course of the moon, noting its position over the natural horizon and the change of its phases,” Rappenglück wrote in the 2015 Handbook of Archaeoastronomy and Ethnoastronomy.
In the 1960s, Marshack, the first to argue that Paleolithic people were connecting the moon with time, sifted through dusty cabinets in French museums, retrieving bone and antler pieces that had been worked by humans. Others had interpreted the etchings on these objects as the by-product of point-sharpening, or maybe, as most before Breuil thought, abstract artworks made by idle hands.
But Marshack saw the earliest examples of sky almanacs. The etchings were numerical and notational, he argued. On a bone shard from a prehistoric settlement called Abri Blanchard in France, dating to 28,000 years ago, he found a pattern of pits, some with commalike curves and some round. He viewed it as a record of lunar cycles.
Deeply excited by the find, Marshack soon brought his conclusions to archaeologists and anthropologists throughout Europe and the United States. Some of these experts were impressed, according to accounts at the time.
Hunters who could figure out when the night would be illuminated by moonlight would have had an “adaptive advantage,” Aveni says. “That is so much what the cave paintings are about,” he says, referring to the tally marks near the animals on the walls of the Chauvet Cave in France and elsewhere.
Regarding Marshack’s speculations about the Blanchard bone shard, paleoanthropologist Ian Tattersall is still unsure. “We know Ice Age European art was highly symbolic, and there is no doubt that [ancient people] perceived symbols all around them in nature. And it is pretty certain that the moon played a huge role in their cosmology, and that they were fully aware of its cycle,” says Tattersall, curator emeritus of human origins at the American Museum of Natural History in New York City. “Beyond that, all bets are off.”
Thirteen notches

In the decades after Marshack published his findings, historians and anthropologists began noticing similar lunar motifs throughout the archaeological record of this time period and afterward, Aveni notes. “There are more than one of these items that have markings on them that might relate to the moon,” he says.
The Venus of Laussel is one extraordinary example. It is a carving of a voluptuous woman, one hand resting on her abdomen, the other raised and holding a bison horn etched with 13 notches. Her face is turned toward the horn. The figure was carved between 22,000 and 27,000 years ago, in a rock-shelter in the Dordogne region of southwestern France. Some archaeologists now think the 13 notches represent the number of lunar cycles in a solar year — and, approximately, the average number of menstrual cycles. Though modern scientists have debunked any direct connection between the cycles of the moon and human fertility, ancient people would have recognized the parallel timing; the lunar cycle repeats every 29.5 days, roughly the same schedule as the average woman’s menstrual cycle. People of 30,000 years ago could have used the moon and stars to plan their pregnancies, Rappenglück speculates.
Cave paintings in the Dordogne region may be depictions of the lunar and menstrual cycles. Specifically, the Lascaux cave paintings, dating to 17,000 years ago, are best known for their curvy, sweeping depictions of horses and bulls. Beyond the cave entrance, past what is called the Hall of Bulls, is a dead-end passage called the Axial Gallery. Red aurochs, an extinct form of cattle, stand in a group. A huge black bull stands apart from them. Across the gallery, a pregnant horse gallops above a row of 26 black dots. The mare is running toward a massive stag whose front legs are hidden behind 13 additional evenly spaced dots.
The animals may represent seasons, Rappenglück suggests. In Europe, bovines calve in the spring; horses both foal and mate in the late spring. The deer rut takes place in early autumn, and the wild goats known as ibex mate around the winter solstice.
To Rappenglück, the dots depict the 13 full moons of the lunar cycle. The 26 dots may roughly represent the days of a sidereal month, or the time it takes the moon to return to the same position in the sky relative to the stars. “The striking row of dots is a kind of a time-unit,” he wrote in 2004.
Critics have said Marshack’s work overinterprets many artifacts from Africa and Europe, some of which contain markings at the limit of naked-eye visibility (SN: 6/9/90, p. 357).
“By modern standards of evidence, he is playing with numerological coincidences,” art historian James Elkins wrote in 1996 in an article that is part critique and part celebration. Elkins noted that Marshack countered his doubters by throwing their uncertainty back at them, arguing that better explanations were lacking.
“Nights were real nights at that time, and Paleolithic people certainly had deep insights into what was going on in the sky,” says Harald Floss, an anthropologist at the University of Tübingen in Germany who studies the origin of art. “But I would not risk saying more.”
Astronomy lovers are not the only ones excited about the 50th anniversary of the moon landing. Publishers are also taking note, serving up a pile of books to mark the occasion.
Are you looking for a general overview of the birth of the U.S. space program? Would you rather geek out on the technical details of the Apollo missions? How about flipping through a collection of photographs from the era? Science News staff took a look at the offerings and picked out a few favorites to help you decide. There’s something for everyone in the list below.

For history aficionados
James Donovan. Little, Brown and Co., $30
This retelling of the space race begins with the launch of the Soviet Union’s Sputnik satellite in 1957 and culminates in the historic Apollo 11 mission 12 years later. The book offers insights into the personalities of the astronauts, engineers and others who made the U.S. space program a success.

For detail-obsessed NASA fans
Charles Fishman. Simon & Schuster, $29.99
Getting to the moon demanded a million hours of work for each hour spent in space, this book argues. Accordingly, the story focuses on the engineers, coders, project managers and others who toiled to get the Apollo program off the ground.

For anyone who ever dreamed of being an astronaut
J.L. Pickering and John Bisney. Univ. of Florida, $45
Packed with hundreds of photos, some published for the first time, this coffee-table book reads like a photo album of the Apollo 11 mission. The images focus on candid moments from astronaut training, as well as the excitement of liftoff, the historic landing and the return home of the three men.
For readers ready for a sober view of Apollo
Roger D. Launius. Smithsonian Books, $27.95
A space historian takes the Apollo program off its pedestal to examine it from multiple angles: as a cog in the Cold War political machine, an engineering endeavor riddled with as many failures as feats of glory and an iconic cultural moment. The book explores both positive and negative viewpoints on the U.S. moonshot project from scientists, politicians, the media and the public during the space race and beyond.
For fans of graphic novels
Jonathan Fetter-Vorm. Hill and Wang, $35
Colorful and detailed, the comic-style illustrations in this book of graphic nonfiction bring the moon landing to life. Much of the astronauts’ dialog is based on real recordings, making the book feel particularly authentic.
For self-improvement buffs
Richard Wiseman. TarcherPerigee, $26
A psychologist takes practical lessons from the Apollo era and suggests ways to apply them to everyday problems, from changing careers to raising a family.
For space enthusiasts
David Baker. Arcturus Publishing Limited, $19.99
A former NASA engineer uses photographs, illustrations, blueprints and other documents to guide readers through a concise history of the space race and the Apollo program, from the beginnings of rocket science to the successful return home of the Apollo 11 crew.
For history wonks with a soft spot for psychology
Basil Hero. Grand Central Publishing, $22
The Apollo astronauts rarely gave personal interviews. But now that they’re getting older, the astronauts are starting to get introspective. This book distills conversations with the 12 lunar voyagers still alive into general wisdom on conquering fear and appreciating life.
For photography lovers
Deborah Ireland. Ammonite Press, $14.95
This slim book offers an offbeat take on the mission to the moon, telling the story of the Apollo program through the development of the Hasselblad cameras that Neil Armstrong and Buzz Aldrin used to document their time on the lunar surface.
“I remember carrying my little sister on my back because she’s too tired and walking through the huge sunflower fields … and me feeling so tired I didn’t think I could walk another step.”
“I remember being in a taxi with my mother, coming back to the man who had been violently abusive to all of us…. Her words to me were, ‘Just trust me, Trish. Just trust me.’ ”

“I’m waiting at a train station … to meet my mother who I haven’t seen in many years…. Hours pass and eventually I try to call her … and she says to me, ‘I’m sorry, Trish. My neighbor was upset, and I needed to stay back with them.’ And her voice was slurring quite a lot, so I knew she had been drinking.”
Tran, who lives in Perth, Australia, is dispassionate as she describes a difficult childhood. Her account lacks what are generally considered classic signs of trauma: She makes no mention of flashbacks, appears to have a generally positive outlook and speaks with relative ease about distressing events. Yet she narrates her life growing up and living in the Australian Outback as a series of disconnected events; her life story lacks connective glue.
[Photos: Trish Tran as a small child sitting on her father’s lap in a black-and-white family photo, and as an adult holding a microphone and smiling.]

That disjointed style is not how people, at least people in the West, tend to talk about themselves, says psychologist Christin Camia. Autobiographical accounts, like any good narrative, typically contain a curation of key past experiences, transitions linking those experiences and larger arcs about where life is headed. People use these stories to make sense of their lives, says Camia, of Zayed University’s Abu Dhabi campus in the United Arab Emirates.
But a growing body of evidence from fields as wide-ranging as psychology, neuroscience, linguistics, philosophy and literary studies suggests that, as with Tran, trauma can shatter the narrative coherence of one’s life. People lose the plot.
Life’s crises can trigger an existential crisis, Camia says. People think: “I don’t know who I am, and I don’t know where I go from here.” One therapy now in testing aims to re-tether traumatized individuals to their mental timelines, or their sense of themselves as connected across past, present and future. The therapy focuses on the future, which, once rife with possibilities, now appears as a void. It asks: What would it take for someone like Tran, or anyone traumatized by war, abuse, mass shootings, the ongoing pandemic and other calamities, to flip their life script, to say that they know who they are and where they go from here?
People maintain a sense of self across time

In a nod to an established research approach, I have asked Tran to tell me her story in two parts. First, she should narrate seven snapshots of key moments in her life. Second, Tran, who is a lecturer on mental health recovery at Curtin University in Perth, should stitch those snapshots together to tell me how she became who she is today.
The first task comes easy. The second task eludes her. She switches to generalities. “I’ve always been a highly reflective person,” she says. “I’ve had to rely on my brains to keep myself and my family alive.”
I try to nudge her toward specifics, but her timeline disintegrates. She repeatedly attempted suicide. Her mother brought home many violent men.
The developer of this two-question approach, psychologist Tilmann Habermas, wasn’t focused on people who had experienced trauma. Habermas, now at Goethe University Frankfurt, wanted to understand how adolescents develop a narrative identity and then sustain that sense of self over time.
In 2003, Habermas launched a study that would follow participants for up to 16 years. Participants came into the lab every four years and dictated their life story in roughly 20-minute increments, using the two-task format I tried with Tran. Habermas analyzed the resulting transcripts line by line, coding them for emotion, tense, transitions and other features.
With few psychologists at the time studying autobiography as a window into the mind, Habermas turned to theorists from other fields for guidance. “After I read psychology, I read narratology, literary theory, linguistics, social linguistics,” he says. “I had to steal … all these concepts from the other areas.”
One of Habermas’ questions was how people retain their sense of self in the face of life’s many disturbances, such as divorce, illness, job loss or moving to a new location.
Philosophers have been puzzling over this question for millennia. “Your body has changed. Your experiences have changed. Your knowledge has changed. And yet, people generally think of themselves as being the same person … in the past and future,” says psychologist Yosef Sokol of Touro University in New York City. “That’s a hard problem.”
This general belief in self-continuity appears universal, even though how it is constructed may differ across cultures. In the third wave of Habermas’ long-term study, when 150 participants were ages 16, 20, 28, 44 and 69, Habermas and Camia, who joined Habermas’ lab in 2009, also analyzed the transcripts for a type of thinking called autobiographical reasoning. This reasoning links the self across space and time.
“Autobiographical reasoning is this conscious reflection. How did my past impact me? How did I become the person I am today, and what does it mean for my future?” Camia says. Such reasoning tends to stem from change, she adds. “If there is perfect stability in life, you don’t do a lot of autobiographical reasoning … it’s the changes and the crises that compel meaning-making.”
The researchers divvied such reasoning into eight categories, such as turning points, lessons learned, generalized insights and using an event to explain a change in personality.
Participants also filled out two surveys. One survey summed up the number of big life changes experienced over the previous four years. The other gauged self-continuity, with participants rating the truth of statements such as, “When I look at pictures of myself four years back, it feels a little unfamiliar” and “I have the feeling that at the core I am the same person I was four years ago.”
Researchers then compared the three variables: autobiographical reasoning, levels of life change and sense of self-continuity. As expected, levels of autobiographical reasoning showed no discernible pattern among participants who experienced few changes in life, the team reported in 2015 in Memory.
But when the researchers zoomed in on the quarter of participants reporting the greatest level of change, more autobiographical reasoning came with higher levels of self-continuity. “Constructing continuity in the life story buffers against the effect of change in your life,” Habermas says. Other teams have made similar findings. Most disruptions, however, do not rise to the level of trauma — such as that experienced by Tran. Several years later, Camia would study how traumatic events, notably being forced to flee one’s home and the resulting isolation and bereavement, affect people’s sense of self.
Trauma messes with our sense of time

“What does war change first? One’s sense of time, one’s sense of space,” said Ukrainian writer Serhiy Zhadan in an October speech translated to English in the online magazine LitHub.
Zhadan speaks from experience. But the idea that trauma disrupts time perception is also borne out by research. Researchers have found that emotions frequently dictate whether we experience time as passing fast or slow. And traumatic events, which come with intense emotions, can cause people to experience time in slow motion, researchers reported in 2012 in Frontiers in Psychology.
During a car accident, for instance, a person’s whole body is ready to act, says Marc Wittmann, a psychologist with the Institute for Frontier Areas of Psychology and Mental Health in Freiburg, Germany. “Your inner workings, your processing, is speeded up. Relative to that, your outside slows down.”
What’s more, says health psychologist Alison Holman of the University of California, Irvine, in that moment or moments of crisis, you do not think about the past or future. All that matters is survival.
Zhadan speaks directly to this idea in his speech: “People in a war-torn space try not to plan for the future or think too much about what the world will be like tomorrow. What’s happening to you here and now is all that matters, just the people and things that will be with you tomorrow morning — tops. That’s if you survive and wake up.”
That narrow focus can wreak havoc on mental health. “[When] that present moment is so intense that it sears into your mind … it may set up the likelihood that you will have a hard time moving past it,” Holman says. “The past never passes.”
Such breakdowns in time can show up in language, particularly among those most severely affected by trauma. For instance, Habermas and his team compared the speech patterns of 14 women diagnosed with post-traumatic stress disorder following a singular shocking event, such as physical or sexual abuse, and 14 women without such a diagnosis. The women with PTSD used more immersive language. They quoted people directly and spoke of the past as if it were ongoing, says Habermas, who reported the findings in 2014. “Instead of saying, ‘He hit me,’ they would say, ‘He hits me.’ ”
This immersive language dominates Tran’s narration. She is “carrying” her little sister. Her mother is “coming” back to the violent man. She is “walking many kilometers to school in the rain and then opening up my newspaper-wrapped wet and warm tomato sandwiches. They’re so wet, but I’m so hungry that I know I have to eat them otherwise I’ll never make the walk back.”
And always there, her mother’s voice: “Just trust me, Trish. Just trust me.”
“I don’t think I will ever forget those words,” Tran says.
Traumatized people can lose their life story

Tran remembers her mother’s words exactly, but other details of the abuse she experienced as a child are fuzzier. That’s common among people who experience trauma. People with trauma “have both an excess and depletion of memory,” says cognitive neuroscientist Elisa Ciaramelli of the University of Bologna in Italy.
How memory changes among trauma survivors remains controversial, write the authors of a 2021 opinion piece in Frontiers in Psychology. But mounting evidence suggests that people tend to remember stressful memories in detail. As the mind fixates on those traumatic memories, memories unrelated to the trauma seem to fade, while new memories fail to register.
For example, when asked to describe memories associated with a specific word, such as “beach,” people who do not have PTSD offer detailed reports, describing what they were wearing, what they said and who they were with, Ciaramelli says. People who have PTSD, on the other hand, typically provide general memories with little color.
Other memories can’t find a foothold. In one study, researchers asked 52 participants — 26 people with PTSD and 26 people who had experienced trauma but not developed PTSD — to keep a diary recording their memories over the course of a week. Participants also responded to questions about the memory, such as whether or not it related to their trauma, how central it was to their current life and how far away in time the memory felt.
Participants without PTSD recorded an average of 21.4 memories across the week while participants with PTSD recorded an average of just 11 memories, the team reported in 2017 in Clinical Psychological Science. The PTSD participants had more trauma-related memories than the non-PTSD group.
Tran recognizes this paucity of detail in her own life story. “My memories are lightbulb memories,” she says. “They are always attached to significant events like trauma or happy times. I may have 57 years of life, but you could truncate them into a chapter.”
Everyone’s memory is imprecise, of course. That imprecision allows us to cut extraneous details and make sense of our story. But the traumatized person’s relative lack of memories, in both clarity and quantity, means they struggle to construct a cohesive narrative of their past and to envision themselves moving forward.
“Ten years ago, people have found that the same brain regions that are activated and are necessary for remembering the past are also necessary to imagine the future,” Ciaramelli says. “We need memories to imagine the future.”
Camia’s work with refugees shows what can happen to the sense of self as people struggle and fail to reconcile a traumatic experience with the larger story of their life. Her central aim, which built on work with Habermas, was to see if the same autobiographical arguments people used to buffer against life’s everyday changes could help those facing traumatic disruptions. She and Rida Zafar, a psychology student at New York University Abu Dhabi, recruited 31 refugees living in Germany and asked them to narrate their life stories, plus fill out the life change and self-continuity surveys used in the 2015 study.
Among the 16 refugees who experienced relatively less change since arriving in Germany, such as fewer upheavals in relationships and fewer moves, more autobiographical reasoning did correlate with higher self-continuity, the team reported in 2021 in Frontiers in Psychology. Refugees who experienced high change also used autobiographical reasoning, but their sense of self-continuity remained low.
These individuals cannot settle their trauma, Camia explains, so their reckoning with the past leads not to resolution but rumination. They are stuck.
Therapy could restore the future self

For most of her adult life, Tran grappled with that sense of stagnation. “My identity was rooted in the past, and I couldn’t move forward,” she says. “Time was this eternal loop. Every time a problem came up, it felt like a replication of a past problem. I couldn’t see that I could change anything.”
Over and over again, unable to envision a viable escape, Tran tried to kill herself.
Suicide attempts serve as the clearest signal that a person’s future has gone blank, says Sokol, the psychologist at Touro University. The thinking here is intuitive. “If you think you have a meaningful life into the future, you’re not going to kill yourself,” he says.
Conventional therapies for treating people struggling with suicidal thinking often fail to meet their needs because the therapies do not directly address people’s future self, Sokol and his team wrote in 2021 in the Journal of Cognitive Psychotherapy. For instance, dialectical behavior therapy emphasizes focusing on the present to cope with stress and manage emotions. Narrative therapy likewise aims to help patients incorporate traumatic and other events into a continuous timeline, but focuses on linking past to present, not present to future.
So Sokol developed a therapy that incorporates elements of past- and present-oriented treatments but prioritizes future thinking. It’s known as continuous identity cognitive therapy. His goal is to help military veterans struggling with mental illness re-create the plot in the mental timeline of their lives, to answer those foundational questions: Who am I? Where do I go from here?
Sokol tested an initial version of the therapy in a four-week pilot study with 17 veterans. The program contains many work-arounds for participants struggling to access or make sense of their memories. The specific memory is less important than the larger story, or the broader values contained within that memory, Sokol says. “I have all sorts of techniques to help people tap into something that they find important, meaningful.”
In the first week, participants are asked to define their core values. The hope is that those values, rather than specific past events, will form the core of a person’s life story. To get to that core, participants review negative and positive experiences from their past and identify choices they made.
Many veterans struggle with what are called moral injuries — choices they made that don’t seem to align with who they wish to be, Sokol says. So veterans push those memories away. With the values approach, he hopes participants can start to see that they made the best choices they could under challenging circumstances. One way to access those values is to have participants identify people they admire, and the values those people embody. Participants can then use those people’s experiences to identify their own core values.
The focus of the second week shifts to the future. Participants assemble possible futures by reflecting on how life might play out if they work with, or against, their stated values. Participants also actively construct self-continuity. For instance, they write letters to themselves across different time points, such as from their present self to their future self or vice versa.
In week three, participants learn to differentiate between external life stories, the series of events outside their control, and internal life stories made up of choices in line with their stated values. By week four, participants should be able to visualize their future self overcoming an issue that their present self faces.

Tran came across Sokol’s research while embarking on her own journey to healing. That process began when Tran realized how her trauma was hurting the people she loved most. “I’m just causing my children and everybody near and dear trauma. I’m going to take [suicide] off the table,” she eventually realized. “This is not my pathway anymore. If it’s not my pathway, what am I going to do with the next 50 years of my life?”
Tran felt lost. So she dug into research on trauma survivors, eventually stumbling upon Sokol’s project. She was moved by the idea that participants did not have to reconstruct the past to build a new future. “This is true. My soul knows this to be true,” she remembers thinking.
Tran, who is also a trainer with DISCHARGED, a nonprofit organization that provides peer group support for people experiencing suicidal thoughts, and an occasional adviser to researchers writing about suicide, reached out to Sokol and offered to help him make the language used in his program more sensitive to people who have experienced trauma. For instance, she suggested changing references to “you” to “we” to give people a greater sense of belonging and agency. The two still work together.
Research on the therapy remains limited to Sokol’s lab, but initial results are promising. The pilot study showed that the program decreased previously reported levels of suicidal ideation and depression. Those levels stayed low one month after completion. Now Sokol has received a five-year, $1.1 million grant from the U.S. Department of Veterans Affairs to scale up the program and eventually roll out a randomized controlled trial. In its newer iteration, the program will run for three months instead of one.
With input from Tran and veterans in the program, Sokol made another substantial modification to the pilot program. Participants will now identify how their own story intersects with the stories of other people in their lives. That addition makes sense to Tran, who has become engrossed in research showing the intergenerational nature of trauma. She now sees her life as part of a larger story with many characters, each on their own often troubled journey.
She says her story will always be truncated. But even without a clean narrative arc, she has managed to sever time’s eternal loop. “You can change your relationship with your past experiences in a way that makes living a future possible,” Tran says.
Setting sail into a plastic sea — Science News, February 10, 1973
Scientists on an oceanographic voyage in the Central North Pacific last August were startled by the number of manmade objects littering the ocean surface. [Far from civilization and shipping lanes], they recorded 53 manmade objects in 8.2 hours of viewing. More than half were plastic. They go on to compute that there are between 5 million and 35 million plastic bottles adrift in the North Pacific.
Update

The Great Pacific Garbage Patch is larger now than it was in 1973, containing an estimated 1.8 trillion pieces of plastic within an area twice the size of Texas (SN Online: 3/22/18). In recent years, marine biologists have started seeing evidence that garbage is disrupting ocean ecosystems. For instance, large pieces of trash have helped species cross into new territories (SN: 10/28/17, p. 32). But an even greater threat may lurk beneath the waves. Tiny bits of plastic concentrate hundreds of meters deep where they can be eaten by filter feeders and potentially make their way into the guts of larger predators (SN: 7/6/19 & 7/20/19, p. 5).
If you’ve noticed more lush medians and plant-covered roofs in cities, it’s not your imagination.
Incorporating more natural elements in urban landscapes is a growing management solution for the planet’s increasing climate hazards (SN: 3/10/22). Rain gardens, green roofs and landscaped drainage ditches are all examples of what’s known as green infrastructure, and are used to manage stormwater and mitigate risks like flooding and extreme heat. These strategies sometimes double as a community resource, such as a recreational space. But a major problem with green infrastructure is that the planning processes for the projects often fail to consider equity and inclusion, says Timon McPhearson, an urban ecologist and director of the Urban Systems Lab in New York City, which researches how to build more equitable, resilient and sustainable cities. Without an eye on equity, plans might exclude those most vulnerable to climate disasters, which typically include low-income communities or minority groups (SN: 2/28/22).
There has been talk of fostering equity and inclusion in urban planning for some time, McPhearson says, but he wanted to know if there had been any follow-through. After analyzing 122 formal plans from 20 major U.S. cities, including Atlanta, Detroit and Sacramento, he and colleagues found that most government-affiliated green infrastructure plans are falling short. The researchers focused on plans produced or directly supervised by city governments, as non-profit organizations tend to be more inclusive, the study says.
Over 90 percent of plans didn’t use inclusive processes to design or implement green infrastructure projects, meaning communities targeted for upgrades often didn’t have a chance to weigh in with their needs throughout the process. What’s more, only 10 percent of plans identified causes of inequality and vulnerability in their communities. That matters because without acknowledging the roots of injustices, planners are unable to potentially address them in future projects. And only around 13 percent of plans even defined equity or justice, the researchers report in the January Landscape and Urban Planning.
Such inadequate plans can perpetuate existing inequalities that are part of an “ongoing legacy of historically racist policies,” McPhearson says, including limited access to heat- and pollution-relieving green spaces or proper stormwater management.
“We have an opportunity with green infrastructure to invest in a way that can help solve multiple urban problems,” McPhearson says. “But only if we focus it in the places where there is the most need.”
One reason behind poor urban planning practices is a lack of recognition that infrastructure can be harmful, says Yvette Lopez-Ledesma, the senior director for community-led conservation at The Wilderness Society in Los Angeles, who wasn’t involved in the study. For instance, when cities build stormwater channels but not bridges, locals are left without a way to safely cross. City planners also often lack the training and education to implement more inclusive methods.
But there’s hope. The researchers identified three areas that need more work. First, city planners need to clearly define equity and justice in planning documents to help guide their work. They also need to change planning practices to focus on inclusion by keeping communities informed and supporting their participation throughout the planning, decision-making and implementation processes. And plans need to address current and potential causes of inequality: for example, acknowledging sources of gentrification and identifying how green infrastructure could contribute further to gentrification if officials aren’t careful (SN: 4/18/19).
“If equity isn’t centered in your plans, then inequity is,” Lopez-Ledesma says. “You could be doing more harm.”
CHICAGO – In January 2022, a cyclone blitzed a large expanse of ice-covered ocean between Greenland and Russia. Frenzied gusts whipped up 8-meter-tall waves that pounded the region’s hapless flotillas of sea ice, while a bombardment of warm rain and a surge of southerly heat laid siege from the air.
Six days after the assault began, about a quarter, or roughly 400,000 square kilometers, of the vast area’s sea ice had disappeared, leading to a record weekly loss for the region. The storm is the strongest Arctic cyclone ever documented. But it may not hold that title for long. Cyclones in the Arctic have become more frequent and intense in recent decades, posing risks to both sea ice and people, researchers reported December 13 at the American Geophysical Union’s fall meeting. “This trend is expected to persist as the region continues to warm rapidly in the future,” says climate scientist Stephen Vavrus of the University of Wisconsin–Madison.
Rapid Arctic warming and more destructive storms

The Arctic Circle is warming about four times as fast as the rest of Earth (SN: 8/11/22). A major driver is the loss of sea ice due to human-caused climate change. The floating ice reflects far more solar radiation back into space than naked seas do, influencing the global climate (SN: 10/14/21). During August, the heart of the sea ice melting season, cyclones have been observed to amplify sea ice losses on average, exacerbating warming.
There’s more: Just as hurricanes can ravage regions farther south, boreal vortices can threaten people living and traveling in the Arctic (SN: 12/11/19). As the storms intensify, “stronger winds pose a risk for marine navigation by generating higher waves,” Vavrus says, “and for coastal erosion, which has already become a serious problem throughout much of the Arctic and forced some communities to consider relocating inland.”
Climate change is intensifying storms farther south (SN: 11/11/20). But it’s unclear how Arctic cyclones might be changing as the world warms. Some previous research suggested that pressures, on average, in Arctic cyclones’ cores have dropped in recent decades. That would be problematic, as lower pressures generally mean more intense storms, with “stronger winds, larger temperature variations and heavier rainfall [and] snowfall,” says atmospheric scientist Xiangdong Zhang of the University of Alaska Fairbanks.
But inconsistencies between analyses had prevented a clear trend from emerging, Zhang said at the meeting. So he and his colleagues aggregated a comprehensive record, spanning 1950 to 2021, of Arctic cyclone timing, intensity and duration.
Arctic cyclone activity has intensified in strength and frequency over recent decades, Zhang reported. Pressures in the hearts of today’s boreal vortices are on average about 9 millibars lower than in the 1950s. For context, such a pressure shift would be roughly equivalent to bumping a strong category 1 hurricane well into category 2 territory. And vortices became more frequent during winters in the North Atlantic Arctic and during summers in the Arctic north of Eurasia. What’s more, August cyclones appear to be damaging sea ice more than in the past, said meteorologist Peter Finocchio of the U.S. Naval Research Laboratory in Monterey, Calif. He and his colleagues compared the response of northern sea ice to summer cyclones during the 1990s and the 2010s.
August vortices in the latter decade were followed by a 10 percent loss of sea ice area on average, up from the earlier decade’s 3 percent loss on average. This may be due, in part, to warmer water upwelling from below, which can melt the ice pack’s underbelly, and from winds pushing the thinner, easier-to-move ice around, Finocchio said.
Stronger spring storms spell trouble too

With climate change, cyclones may continue intensifying in the spring too, climate scientist Chelsea Parker said at the meeting. That’s a problem because spring vortices can prime sea ice for later summer melting.
Parker, of NASA’s Goddard Space Flight Center in Greenbelt, Md., and her colleagues ran computer simulations of spring cyclone behavior in the Arctic under past, present and projected climate conditions. By the end of the century, the maximum near-surface wind speeds of spring cyclones — around 11 kilometers per hour today — could reach 60 km/h, the researchers found. And future spring cyclones may keep swirling at peak intensity for up to a quarter of their life spans, up from around 1 percent today. The storms will probably travel farther too, the team says.
“The diminishing sea ice cover will enable the warmer Arctic seas to fuel these storms and probably allow them to penetrate farther into the Arctic,” says Vavrus, who was not involved in the research.
Parker and her team plan to investigate the future evolution of Arctic cyclones in other seasons, to capture a broader picture of how climate change is affecting the storms.
For now, it seems certain that Arctic cyclones aren’t going anywhere. What’s less clear is how humankind will contend with the storms’ growing fury.