Limestone world gobbled by planet-eating white dwarf

SAN DIEGO — A remote planet — the first with hints of a limestone shell — has been shredded by its dead sun, a new study suggests.

A generous heaping of carbon is raining down on a white dwarf, the exposed core of a dead star, astrophysicist Carl Melis of the University of California, San Diego, said June 13 at a meeting of the American Astronomical Society. The carbon — along with a dash of other elements such as calcium, silicon and iron — is probably all that remains of a rocky planet, torn apart by its dying sun’s gravity. Many other white dwarfs show similar signs of planetary cannibalism (SN Online: 10/21/15), but none are as flooded with carbon atoms as this one.

A planet slathered in calcium carbonate, a mineral found in limestone, could explain the shower of carbon as well as the relative amounts of other elements, said Melis. He and Patrick Dufour, an astrophysicist at the University of Montreal, estimate that calcium carbonate could have made up as much as 9 percent of the doomed world’s mass.

While a limestone-encrusted world is a first, it’s not shocking, says Melis. The recipe for calcium carbonate is just carbon and calcium in the presence of water. “If you have those conditions, it’s going to form,” he says.

“The real interesting thing is the carbon,” Melis adds. Carbon needs to be frozen — most likely as carbon dioxide — to be incorporated into a forming planet. But CO2 freezes far from a star, beyond where researchers suspect rocky planets are assembled. A limestone planet could have formed in an unexpected place and later wandered in while somehow retaining its carbon stores in the warm environs closer to its sun. Or the carbon might have been delivered to the world after it formed. But, Melis says, it’s not clear how either would happen.

Courts’ use of statistics should be put on trial

The Rev. Thomas Bayes was, as the honorific the Rev. suggests, a clergyman. Too bad he wasn’t a lawyer. Maybe if he had been, lawyers today wouldn’t be so reluctant to enlist his mathematical insights in the pursuit of justice.

In many sorts of court cases, from whether talcum powder causes ovarian cancer to The People v. O.J. Simpson, statistics play (or ought to play) a vital role in evaluating the evidence. Sometimes the evidence itself is statistical, as with the odds of a DNA match or the strength of a scientific research finding. Even more often the key question is how evidence should be added up to assess the probability of guilt. In either circumstance, the statistical methods devised by Bayes are often the only reasonable way of drawing an intelligent conclusion.

Yet the courts today seem suspicious of statistics of any sort, and not without reason. In several famous cases, flawed statistical reasoning has sent innocent people to prison. But in most such instances the statistics applied in court have been primarily the standard type that scientists use to test hypotheses (producing numbers for gauging “statistical significance”). These are the same approaches that have been so widely criticized for rendering many scientific results irreproducible. Many experts believe Bayesian statistics, the legacy of a paper by Bayes published posthumously in 1763, offers a better option.

“The Bayesian approach is especially well suited for a broad range of legal reasoning,” write mathematician Norman Fenton and colleagues in a recent paper in the Annual Review of Statistics and Its Application.

But Bayes has for the most part been neglected by the legal system. “Outside of paternity cases its impact on legal practice has been minimal,” say Fenton, Martin Neil and Daniel Berger, all of the School of Electronic Engineering and Computer Science at Queen Mary University of London.

That’s unfortunate, they contend, because non-Bayesian statistical methods have severe shortcomings when applied in legal contexts. Most famously, the standard approach is typically misinterpreted in a way known as the “prosecutor’s fallacy.”

In formal logical terms, the prosecutor’s fallacy is known as “the error of the transposed conditional,” as British pharmacologist David Colquhoun explains in a recent blog post. Consider a murder on a hypothetical island, populated by 1,000 people. Police find a DNA fragment at the crime scene, a fragment that would be found in only 0.4 percent of the population. For no particular reason, the police arrest Jack and give him a DNA test. Jack’s DNA matches the crime scene fragment, so he is charged and sent to trial. The prosecutor proclaims that since only 0.4 percent of innocent people have this DNA fragment, it is 99.6 percent certain that Jack is the killer — evidence beyond reasonable doubt.

But that reasoning is fatally (for Jack) flawed. Unless there was some good reason to suspect Jack in the first place, he is just one of 1,000 possible suspects. Among those 1,000, four people (0.4 percent) should have the same DNA fragment found at the crime scene. Jack is therefore just one of four possibilities to be the murderer — so the probability that he’s the killer is merely 25 percent, not 99.6 percent.
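The island arithmetic can be checked in a few lines — a sketch in Python, using only the numbers given in the example:

```python
# Prosecutor's fallacy check: the hypothetical island murder case.
population = 1000          # possible suspects on the island
match_rate = 0.004         # 0.4 percent of people carry the DNA fragment

# Expected number of islanders whose DNA matches the crime scene fragment.
matching_people = population * match_rate   # 4 people

# Absent any other evidence, Jack is just one of those matching people,
# so the probability he is the killer is 1 in 4, not 99.6 percent.
prob_jack_guilty = 1 / matching_people
print(prob_jack_guilty)    # 0.25
```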

Bayesian reasoning averts this potential miscarriage of justice by including the “prior probability” of guilt when calculating the probability of guilt after the evidence is in.

Suppose, for instance, that the crime in question is not murder, but theft of cupcakes from a bakery employing 100 people. Security cameras reveal 10 employees sneaking off with the cupcakes but without a good view of their identities. So the prior probability of any given employee’s guilt is 10 percent. Police sent to investigate choose an employee at random and conduct a frosting residue test known to be accurate 90 percent of the time. If the employee tests positive, the police might conclude there is therefore a 90 percent probability of guilt. But that’s another example of the prosecutor’s fallacy — it neglects the prior probability. Well-trained Bayesian police would use the formula known as Bayes’ theorem to calculate that given a 10 percent prior probability, 90 percent reliable evidence yields an actual probability of guilt of only 50 percent.

You don’t even need to know Bayes’ formula to reason out that result. If the test is 90 percent accurate, it will erroneously flag nine of the 90 innocent employees as guilty, and it will correctly identify only nine of the 10 truly guilty employees. If the police tested all 100 people, then, 18 would appear guilty, but nine of those 18 (half of them) would actually be innocent. So a positive frosting test means only a 50 percent chance of guilt. Bayesian math would in this case (and in many real-life cases) prevent a rush to injustice.
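That counting argument is just Bayes’ theorem in disguise. A minimal Python sketch of the cupcake calculation, using the article’s numbers:

```python
# Bayes' theorem applied to the cupcake-theft example.
prior = 0.10          # 10 of 100 employees are guilty
accuracy = 0.90       # frosting test is right 90 percent of the time

# Total probability of a positive test:
# P(pos) = P(pos | guilty) * P(guilty) + P(pos | innocent) * P(innocent)
p_positive = accuracy * prior + (1 - accuracy) * (1 - prior)

# Bayes' theorem: P(guilty | pos) = P(pos | guilty) * P(guilty) / P(pos)
posterior = accuracy * prior / p_positive
print(round(posterior, 3))    # 0.5
```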

“Unfortunately, people without statistical training — and this includes most highly respected legal professionals — find Bayes’ theorem both difficult to understand and counterintuitive,” Fenton and colleagues lament.

One major problem is that real criminal cases are rarely as simple as the cupcake example. “Practical legal arguments normally involve multiple hypotheses and pieces of evidence with complex causal dependencies,” Fenton and colleagues note. Adapting Bayes’ formula to complex situations is not always straightforward. Combining testimony and various other sorts of evidence requires mapping out a network of interrelated probabilities; the math can quickly become much too complicated for pencil and paper — and, until relatively recently, even for computers.
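As a toy illustration — with invented likelihood numbers, not Fenton’s actual networks — updating a single guilt hypothesis with two pieces of evidence, assumed conditionally independent given guilt, already shows how the bookkeeping grows with each added item:

```python
# Toy Bayesian update over one hypothesis (guilt) and two evidence items.
# Real legal networks involve many hypotheses with causal dependencies;
# the numbers below are purely illustrative.
prior_guilt = 0.10

# For each evidence item: (P(evidence | guilty), P(evidence | innocent)).
evidence = [
    (0.90, 0.10),   # e.g. a forensic test result
    (0.60, 0.20),   # e.g. a witness statement
]

# Multiply likelihoods along each branch, then normalize.
num = prior_guilt
den = 1 - prior_guilt
for p_given_guilty, p_given_innocent in evidence:
    num *= p_given_guilty
    den *= p_given_innocent

posterior = num / (num + den)
print(round(posterior, 3))    # 0.75
```

With dependencies between evidence items, the factorization above no longer holds, which is why efficient network algorithms were needed.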

“Until the late 1980s there were no known efficient computer algorithms for doing the calculations,” Fenton and colleagues point out.

But nowadays, better computers — and more crucially, better algorithms — are available to compute the probabilities in just the sorts of complicated Bayesian networks that legal cases present. So Bayesian math now provides the ideal method for weighing competing evidence in order to reach a sound legal judgment. Yet the legal system seems unimpressed.

“Although Bayes is the perfect formalism for this type of reasoning, it is difficult to find any well-reported examples of the successful use of Bayes in combining diverse evidence in a real case,” Fenton and coauthors note. “There is a persistent attitude among some members of the legal profession that probability theory has no role in the courtroom.”

In one case in England, in fact, an appeals court denounced the use of Bayesian calculations, asserting that members of the jury should apply “their individual common sense and knowledge of the world” to the evidence presented.

Apart from the obvious idiocy of using common sense to resolve complex issues, the court’s call to apply “knowledge of the world” to the evidence describes exactly what Bayesian math does. Bayesian reasoning provides guidance for applying prior knowledge properly in assessing new knowledge (or evidence) to reach a sound conclusion. Which is what the judicial system is supposed to do.

Bayesian statistics offers a technical tool for avoiding fallacious reasoning. Lawyers should learn to use it. So should scientists. And then maybe someday justice will be done, and science and the law can work more seamlessly together. But as Fenton and colleagues point out, there remain “massive cultural barriers between the fields of science and law” that “will only be broken down by achieving a critical mass of relevant experts and stakeholders, united in their objectives.”

Sounds from gunshots may help solve crimes

The surveillance video shows a peaceful city streetscape: People walking, cars driving, birds chirping.

“Then, abruptly, there’s the sound of gunfire,” said electrical engineer Robert Maher. “A big bang followed by another bang.”

Witnesses saw two shooters facing off, a few meters apart — one aiming north, the other south. But no one knew who shot first. That’s where Maher comes in. His specialty is gunshot acoustics, and he’s helping shore up the science behind a relatively new forensics field.

In the case of the two shooters, surveillance cameras missed the action, but the sounds told a story that was loud and clear.

A distinctive echo followed the first gunshot but not the second. The first gunshot’s sound probably bounced off a big building to the north, causing the echo, Maher concluded. So the first person to shoot was the person facing north, he reported May 24 in Salt Lake City at a meeting of the Acoustical Society of America.
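Maher’s actual analysis isn’t detailed in the talk, but the underlying echo arithmetic is straightforward — a sketch with invented distances (the 170-meter figure is hypothetical, chosen only for illustration):

```python
# Hypothetical echo-delay arithmetic: an echo off a building arrives after
# the direct bang by the extra round-trip path length divided by the speed
# of sound. All numbers here are invented for illustration.
speed_of_sound = 343.0        # m/s in air at roughly 20 degrees C
building_distance = 170.0     # m from shooter to reflecting wall (assumed)

extra_path = 2 * building_distance          # sound travels out and back
echo_delay = extra_path / speed_of_sound    # seconds after the direct bang
print(round(echo_delay, 2))   # ~0.99 s
```

A shot fired toward the building produces a strong, delayed reflection like this; a shot fired away from it does not — which is the asymmetry that let Maher identify who fired first.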

Maher has analyzed the booming echoes of gunshots in dozens of cases, but he’s also studying the millisecond-long sound of a bullet blasting out of the barrel — and finding differences from one type of gun to the next.

He and colleagues at Montana State University in Bozeman erected a semicircular aluminum frame studded with 12 microphones, evenly spaced and raised 3 meters off the ground. When someone standing on a raised platform in the center of the contraption shoots a gun — a 12-gauge shotgun, for example, or a .38 Special handgun — the microphones pick up the sound.

“Each of the different firearms has a distinctive signal,” he says. His team is building a database of sounds made by 20 different guns. To the ear, the gunshots seem alike, but Maher can chart out differences in the sound waves.

One day, investigators might be able to use the information to figure out what kind of guns were fired at a crime scene. Of course, Maher says, most crime scene recordings aren’t high quality — they often come from cellphones or surveillance systems. But his team will compare those recordings with ones made in his outdoor “lab” and try to figure out which aspects of crime scene audio they can analyze.

Maher, a music lover who plays the cello and sings in a choir, didn’t intend this career. “If I were really talented at music, that’s what I’d be doing full time,” he says. Instead, he has applied his skills in math and science to problems involving sound: studying humans’ contribution to noise in national parks, for example, and now, gunshot acoustics.

For him, it’s “a nice way to bridge the gap between the science and the sound.”

Post-stroke shifts in gut bacteria could cause additional brain injury

When mice have a stroke, their gut reaction can amp up brain damage.

A series of new experiments reveals a surprising back-and-forth between the brain and the gut in the aftermath of a stroke. In mice, this dickering includes changes to the gut microbial population that ultimately lead to even more inflammation in the brain.

There is much work to be done to determine whether the results apply to humans. But the research, published in the July 13 Journal of Neuroscience, hints that poop pills laden with healthy microbes could one day be part of post-stroke therapy.

The work also highlights a connection between gut microbes and brain function that scientists are only just beginning to understand, says Ted Dinan of the Microbiome Institute at University College Cork in Ireland. There’s growing evidence that gut microbes can influence how people experience stress or depression, for example (SN: 4/2/16, p. 23).

“It’s a fascinating study,” says Dinan, who was not involved with the work. “It raises almost as many questions as it answers, which is what good studies do.”

Following a stroke, the mouse gut becomes temporarily paralyzed, leading to a shift in the microbial community, neurologist Arthur Liesz of the Institute for Stroke and Dementia Research in Munich and colleagues found. This altered, less diverse microbial ecosystem appears to interact with immune system cells called T cells that reside in the gut. These T cells can either dampen inflammation or dial it up, leading to more damage, says Liesz. Whether the T cells further damage the brain after a stroke rather than soothe it seems to be determined by the immune system cells’ interaction with the gut microbes.

Transplanting microbe-laden fecal matter from healthy mice into mice that had had strokes curbed brain damage, the researchers found. But transplanting fecal matter from mice that had had strokes into stroke-free mice spurred a fourfold increase in immune cells that exacerbate inflammation in the brain.

Learning more about this interaction between the gut’s immune cell and microbial populations will be key to developing therapies, says Liesz. “We basically have no clue what’s going on there.”

Anesthesia steals consciousness in stages

The brain doesn’t really go out like a light when anesthesia kicks in. Nor does neural activity gradually dim, a new study in monkeys reveals. Rather, intermittent flickers of brain activity appear as the effects of an anesthetic take hold.

Some synchronized networks of brain activity fall out of step as the monkeys gradually drift from wakefulness, the study showed. But those networks resynchronized when deep unconsciousness set in, researchers reported in the July 20 Journal of Neuroscience.

That the two networks behave so differently during the drifting-off stage is surprising, says study coauthor Yumiko Ishizawa of Harvard Medical School and Massachusetts General Hospital. It isn’t clear what exactly is going on, she says, except that the anesthetic’s effects are a lot more complex than previously thought.

Most studies examining how anesthesia works use electroencephalograms, or EEGs, which record brain activity using electrodes on the scalp. The new study offers unprecedented surveillance by eavesdropping via electrodes implanted inside macaque monkeys’ brains. This new view provides clues to how the brain loses and gains consciousness.

“It’s a very detailed description of something we know very little about,” says cognitive neuroscientist Tristan Bekinschtein of the University of Cambridge, who was not involved with the work. Although the study is elegant, it isn’t clear what to make of the findings, he says. “These are early days.”

Researchers from Massachusetts General, Harvard and MIT recorded the activity of small populations of nerve cells in two interconnected brain networks: one that deals with incoming sensory information and one involved with some kinds of movement, and with merging different kinds of information. Before the anesthetic propofol kicked in, brain activity in the two regions was similar and synchronized. But as the monkeys drifted off, the networks dropped out of sync, even though each network’s own nerve cells kept working together.

Around the moment when the monkeys went unconscious, there was a surge in a particular kind of nerve cell activity in the movement network, followed by a different surge in the sensory network about two minutes later. The two networks then began to synchronize again, becoming more in lockstep as the anesthetic state deepened.

How Houdini tadpoles escape certain death

Tree frog tadpoles are the ultimate escape artists. To avoid becoming breakfast, the embryos of red-eyed tree frogs (Agalychnis callidryas) prematurely hatch and wriggle away from a snake’s jaws in mere seconds, as seen above. Embryos also use this maneuver to flee from flooding, deadly fungi, egg-eating wasps and other threats. Adding to the drama, red-eyed tree frogs lay their eggs on the undersides of leaves that hang a few inches to several feet above ponds. So the swimmers perform this feat suspended on a leaf, breaking free in midair and cannonballing into the water below.

High-speed video of unhatched eggs collected from Panamanian ponds, captured by Kristina Cohen of Boston University and her colleagues, shows that the embryos’ trick plays out in three stages. First, upon sensing a threat, an embryo starts shaking and, in some cases, gaping its mouth. Next, a hole forms. (The movement helps tear open the hole, but an embryo’s snout probably secretes a chemical that actually does the breaking.) Finally, the embryo thrashes its body about as if swimming and slips out of the egg.

Orientation is key to a hasty escape, the team reports in the June 15 Journal of Experimental Biology. An embryo must keep its snout aligned with the hole for a speedy exit. In observations of 62 embryos, the getaway took between 6 and 50 seconds — 20.6 seconds on average.

Some tadpoles may be leaping out of a cauldron into a fire. “There’s a trade-off,” Cohen says. “They may have escaped the threat of a snake, but earlier hatchlings fare worse against some aquatic predators.”

Cooling stars hint at dark matter particles

CHICAGO — Cooling stars could shine some light on the nature of dark matter.

Certain types of stars are cooling faster than scientists expect. New research suggests that the oddity could hint at the presence of hypothetical particles known as axions. Such particles have also been proposed as a candidate for dark matter, the unknown substance that makes up most of the matter in the universe.

Researchers analyzed previous measurements of white dwarf variable stars, which periodically grow dimmer and brighter at a rate that indicates how fast the star is cooling. For all five stars measured, the cooling was faster than predicted. Likewise, red giant stars have also shown excess cooling.

Considering each result on its own, “each one is not that interesting,” says physicist Maurizio Giannotti of Barry University in Miami Shores, Fla., who presented the result at the International Conference on High Energy Physics on August 4. But taken together, the consistent pattern could indicate something funny is going on.

After evaluating several possible explanations for the cooling of the stars, the researchers concluded that the axion explanation was most likely — barring some more mundane explanation like measurement error. Axions produced within the star stream outward, carrying energy away as they go, and cooling the star.

Although it may be more likely that the phenomenon will eventually be chalked up to measurement errors, it’s important to take note when something doesn’t add up, Giannotti says. “We can’t ignore the small hints.”

Female fish have a fail-safe for surprise sperm attacks

Some guys really know how to kill a moment. Among Mediterranean fish called ocellated wrasse (Symphodus ocellatus), single males sneak up on mating pairs in their nest and release a flood of sperm in an effort to fertilize some of the female’s eggs. But female fish may safeguard against such skullduggery through their ovarian fluid, a gooey film that covers fish eggs.

Suzanne Alonzo, a biologist at Yale University, and her colleagues exposed sperm from both types of males to ovarian fluid from female ocellated wrasse in the lab. Nesting males release speedier sperm in lower numbers (about a million per spawn), while sneaking males release a lot of slower sperm (about four million per spawn). Experiments showed that ovarian fluid enhanced sperm velocity and motility and favored speed over volume. Thus, the fluid gives a female’s chosen mate an edge in the race to the egg, the researchers report August 16 in Nature Communications.

While methods to thwart unwanted sperm are common in species that fertilize within the body, evidence from Chinook salmon previously hinted that external fertilizers don’t have that luxury. However, these new results suggest otherwise: Some female fish retain a level of control over who fathers their offspring even after laying their eggs.

Bird nest riddle: Which shape came first?

WASHINGTON — To human thinking, songbird nests now seem to have evolved backwards: The most distant ancestor probably built complex, roofed structures. Simple open-top cup nests came later.

More than 70 percent of songbird species today build some form of that iconic open cup, evolutionary biologist Jordan Price said August 18 at the North American Ornithological Conference. Yet looking at patterns of nest style across recent bird family trees convinced him that the widespread cup style probably isn’t just a leftover from deepest bird origins.

Old bird lineages thought to have branched out near the base of the avian family tree tend to have plentiful roof-builders. Price, of St. Mary’s College of Maryland, and coauthor Simon Griffith of Macquarie University in Sydney reconstructed probable nest styles for various branching points in the tree. That reconstruction suggests that open cups showed up independently four times among songbirds, such as in bowerbirds and honeyeaters, the scientists conclude. Also, here and there, some of the earlier cup builders reverted to roofs.

Price said he began musing about nest history while reveling in Australia’s birds during a sabbatical with Griffith. Evolutionary biologists have proposed that the broader Australasia region was probably the starting point for the rise of songbirds. Price said that it isn’t clear what drove a switch from protective roofs to what looks like the quick and dirty alternative of open cups.

Cool nerve cells help mice beat heat

Scientists have identified the “refrigerator” nerve cells that hum along in the brains of mice and keep the body cool. These cells kick on to drastically cool mice’s bodies and may prevent high fevers, scientists report online August 25 in Science.

The results “are totally new and very important,” says physiologist Andrej Romanovsky of the Barrow Neurological Institute in Phoenix. “The implications are far-reaching.” By illuminating how bodies stay at the right temperature, the discovery may offer insights into the relationship between body temperature and metabolism.

Scientists had good reasons to think that nerve cells controlling body temperature are tucked into the hypothalamus, a small patch of neural tissue in the middle of the brain. Temperature fluctuations in a part of the hypothalamus called the preoptic area prompt the body to get back to baseline by conserving or throwing off heat. But the actual identity of the heat sensors remained mysterious. The new study reveals the cells to be those that possess a protein called TRPM2.

“Overall, this is a major discovery in the field of thermoregulation,” says Shaun Morrison of Oregon Health & Science University in Portland.

Jan Siemens, a neurobiologist at the University of Heidelberg in Germany, and colleagues tested an array of molecules called TRP channels, proteins that sit on cell membranes and help sense a variety of stimuli, including painful tear gas and cool menthol. In tests of nerve cells in lab dishes, one candidate, the protein TRPM2, seemed to respond to heat.

The researchers gave mice artificial fevers by injecting “heat up” molecules into the hypothalamus. Mice that lacked TRPM2 grew about 1 degree Celsius warmer than mice with the protein, results that suggest that TRPM2 helps counter high temperatures. “We like to think of it as an emergency brake” that prevents a fever from getting too hot, Siemens says.

Romanovsky cautions that the fever results are not easy to interpret. In some experiments, mice without TRPM2 didn’t run hotter fevers than mice with the protein. More experiments are needed to clarify how these nerve cells affect fever, he says.

Siemens and colleagues then used a genetic trick to take more direct control of preoptic-area nerve cells that have TRPM2. When these cells were prevented from firing off signals, the mice heated up slightly. And when these cells were prompted to fire off lots of signals, the mice grew downright frigid. A mouse’s normal body temperature hovers around 37° Celsius (98.6° Fahrenheit). After a burst of activity from TRPM2 neurons, mice’s temperatures dropped by about 10 degrees C and stayed cool for about 12 hours, the team found. “That was really a ‘wow’ experience when we saw this,” Siemens says.

The cold mice grew less active, but didn’t seem to suffer any ill effects. It’s not clear how similar this chilly state is to torpor, a hibernation-like state that mice enter when the temperature is cold or food is scarce.

When these nerve cells sent their cool-down signals, mice started dumping body heat by shunting warm blood to the surface of their bodies, warming up the paws and tails — body parts from which heat easily escapes. Infrared cameras revealed hot tails soon after the nerve cells were activated. The mice’s sleeping areas also heated up as warmth transferred from bodies to beds, the cameras revealed. “They were actually warming up their surroundings,” Siemens says.

More work is needed to say whether similar cells help cool people, and scientists don’t have good drugs that affect TRPM2 specifically. Yet the results might one day lead to ways to induce hypothermia from inside the body. Doctors sometimes use ice packs and cooling blankets to chill people after cardiac arrest. But an internal cooldown might be more effective.

What’s more, the chilly mice may also offer scientists ways to study how body temperature and metabolism are connected. The results could have important implications for obesity and longevity, both of which are related to metabolism, Morrison says.