Futurity
Research news from top universities.

Here’s what people mean by terms like ‘privilege’

Tue, 2019-10-15 15:16

The terms intersectionality, privilege, and positionality are increasingly common. This podcast episode offers definitions and examples.

“We want to pay attention to that particular intersection where two forms of oppression come together because there’s a story there, and it’s a story that is almost never told,” says Mark Brimhall-Vargas, vice president for diversity, equity, and inclusion at Brandeis University, of intersectionality.

And positionality “gives us insight into what other people experience that we may not,” says Brimhall-Vargas.

As for privilege, “Most people, when they think about privilege, actually think about something that is earned,” but that’s just one definition. Listen in for the explanation:


A full transcript of the episode is available here.

Source: Brandeis University

The post Here’s what people mean by terms like ‘privilege’ appeared first on Futurity.

To protect city kids from lead, focus on the dirt

Tue, 2019-10-15 13:22

Tracking lead levels in soil over time is critical for cities to determine the risks of lead to their youngest and most vulnerable residents, according to new research.

The study, which focuses on New Orleans, could serve as a model for cities around the world, researchers say. It is the first to show how long-term changes in soil lead levels have a corresponding effect on blood lead levels in children.

“Lead dust is invisible and it’s tragic that lead-contaminated outdoor areas are unwittingly provided for children as places to play,” says Howard Mielke, a pharmacology research professor at Tulane University School of Medicine and lead author of the paper in PNAS.

“Young children are extremely vulnerable to lead poisoning because of their normal crawling, hand-to-mouth, exploratory behavior.”

The effects of lead exposure are often irreversible, particularly for children, and include behavioral or learning problems, decreased IQ, hyperactivity, delayed growth, hearing problems, anemia, kidney disease, and cancer. In rare cases, exposure can lead to seizures, coma, or death.

Soil lead after Katrina

In metropolitan New Orleans, children living in communities with more lead in the soil and higher blood lead levels have the lowest school performance scores. Experts recently cited lead as a top risk factor for premature death in the United States, particularly from cardiovascular disease; lead is responsible for an estimated 412,000 premature deaths each year.

The research team began tracking the amount of lead in New Orleans soil in 2001, collecting about 5,500 samples in neighborhoods, along busy streets, close to houses, and in open spaces including parks.

The team from Mielke’s Lead Lab conducted another round of soil sampling 16 years later. Those samples showed a 44% decrease in soil lead, both in communities flooded during Hurricane Katrina in 2005 and in communities not affected by the levee failures and storm surge.

Researchers then compared the soil lead data with children’s blood lead data from the Louisiana Healthy Homes and Childhood Lead Poisoning Prevention Program for 2000-2005 and 2011-2016. Lead in blood samples decreased by 64% between the two periods, and declining lead in topsoil was a key factor in the drop in children’s blood lead levels.

Environmental justice

Lead exposure is a critical environmental justice issue, according to researchers. The team found black children were three times more likely than white children to have elevated blood lead levels, a disparity that socioeconomic status and education, the type and age of housing, and proximity to major roads and industry may explain.

“While the metabolism of the city could theoretically affect all residents equally, in reality social formations produce inequitable outcomes in which vulnerable populations tend to bear greater burdens of contaminant exposure,” Mielke says.

Further study is needed to determine if demographic changes in New Orleans since 2001 contributed to the decline in children’s blood lead levels, and if decreases are occurring equitably for all populations.

Additional coauthors are from Australia, Colorado State University, and City University of New York.

Source: Tulane University

The post To protect city kids from lead, focus on the dirt appeared first on Futurity.

Good old days? Index charts national happiness since 1820

Tue, 2019-10-15 10:36

By analyzing 8 million books and 65 million newspaper articles, researchers have created the first-ever “index of national happiness” going back to 1820.

Key findings from the index include:

Money helps: Increases in national income do generate increases in national happiness, but it takes a huge rise to have a noticeable effect at the national level.

Longer lives matter: An increase in longevity of 1 year had the same effect on happiness as a 4.3% increase in GDP.

War is costly: One less year of war had the equivalent effect on happiness of a 30% rise in GDP.

There are low moments: In post-war UK, the worst period for national happiness occurred around the appropriately named “Winter of Discontent.” In post-war USA the lowest point of the index coincides with the Vietnam War and the evacuation of Saigon.

“What’s remarkable is that national subjective well-being is incredibly resilient to wars,” says coauthor Thomas Hills, professor at the University of Warwick and the Alan Turing Institute.

“Even temporary economic booms and busts have little long-term effect. We can see the American Civil War in our data, the revolutions of ’48 across Europe, the roaring 20’s, and the Great Depression. But people quickly returned to their previous levels of subjective well-being after these events were over.

“Our national happiness is like an adjustable spanner [wrench] that we open and close to calibrate our experiences against our recent past, with little lasting memory for the triumphs and tragedies of our age.”

“Aspirations seem to matter a lot: after the end of rationing in the 1950s national happiness was very high as were expectations for the future, but unfortunately things did not pan out as people might have hoped and national happiness fell for many years until the low-point of the Winter of Discontent,” says coauthor Daniel Sgroi, professor at the University of Warwick.

Past national happiness and policy today

Governments the world over are making increasing use of “national happiness” data from surveys to help them consider the impact of policy on national wellbeing.

Unfortunately, data for most countries are only available from 2011 onward, and for a select few, from the mid-1970s. This makes it hard to establish long-term trends, or to say anything about the main historical causes of happiness.

In order to tackle this problem, the researchers took a key insight from psychology—that more often than not what people say or write reveals much about their underlying happiness level—and developed a method to apply it to online texts from millions of books and newspapers published over the past 200 years.

Their findings appear in Nature Human Behaviour.

Words with stable meanings

The method uses psychological valence norms—values of happiness that can be derived from text—for thousands of words in different languages to compute the relative proportion of positive and negative language for four different nations (the USA, UK, Germany, and Italy). The research team also controlled for the evolution of language, to take into account the fact that some words change their meaning over time.
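The scoring step described above can be sketched in a few lines: given a valence lexicon mapping words to happiness scores, a body of text is reduced to the relative share of positive versus negative valence-bearing words. The tiny lexicon and neutral threshold below are illustrative placeholders, not the study's actual norms, which cover thousands of words in several languages.

```python
# Minimal sketch of valence-based text scoring. The lexicon here is a
# hypothetical stand-in for published affective-norm datasets, which
# typically rate words on a 1-9 valence scale (5 = neutral).
from collections import Counter

VALENCE = {"joy": 8.2, "peace": 7.7, "war": 2.1, "loss": 2.6, "trade": 5.4}

def happiness_index(text: str, neutral: float = 5.0) -> float:
    """Share of valence-bearing words that are positive (valence > neutral)."""
    words = [w.strip(".,;").lower() for w in text.split()]
    counts = Counter(w for w in words if w in VALENCE)
    pos = sum(n for w, n in counts.items() if VALENCE[w] > neutral)
    total = sum(counts.values())
    return pos / total if total else neutral / 10  # no known words: neutral

print(happiness_index("Joy and peace followed the war."))  # 2 of 3 positive
```

Computing this score year by year over digitized books and newspapers yields a time series of relative positivity, which is the kind of signal the index is built from.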

“For example, the word ‘gay’ had a completely different meaning in the 1800s than it does today,” explains coauthor Chanuki Seresinhe of the Alan Turing Institute. “We processed terabytes of word co-occurrence data from Google Books to understand how the meaning of words has changed over time, and we validated our findings using only words with the most stable historical meanings.”

They validated the new index against existing survey-based measures. One theory as to why books and newspaper articles are such a good source of data is that editors prefer to publish pieces that match the mood of their readers.

“Our index is an important first step in understanding people’s satisfaction in the past. Looking at the Italian data, it is interesting to note a slow but constant decline in the years of fascism and a dramatic decline in the years after the last crisis,” says coauthor Eugenio Proto of the University of Glasgow.

Source: University of Warwick

The post Good old days? Index charts national happiness since 1820 appeared first on Futurity.

Mars used to have salty lakes

Tue, 2019-10-15 10:25

Mars once had salt lakes similar to those on Earth and went through wet and dry periods, according to a new study.

As reported in Nature Geoscience, researchers examined geological terrains on Mars from Gale Crater, an immense 95-mile-wide rocky basin that the NASA Curiosity rover has explored since 2012 as part of the MSL (Mars Science Laboratory) mission.

The results show that the lake that was present in Gale Crater over 3 billion years ago underwent a drying episode, potentially linked to the global drying of Mars. Gale Crater formed about 3.6 billion years ago when a meteor hit Mars and created its large impact crater.

“Since then, its geological terrains have recorded the history of Mars, and studies have shown Gale Crater reveals signs that liquid water was present over its history, which is a key ingredient of microbial life as we know it,” says Marion Nachon, a postdoctoral research associate in the geology and geophysics department at Texas A&M University.

“During these drying periods, salt ponds eventually formed. It is difficult to say exactly how large these ponds were, but the lake in Gale Crater was present for long periods of time—from at least hundreds of years to perhaps tens of thousands of years.”

Where did they go?

Nachon says Mars probably became drier over time, and the planet lost its planetary magnetic field, which left the atmosphere exposed and vulnerable to stripping from solar wind and radiation over millions of years.

“With an atmosphere becoming thinner, the pressure at the surface became lesser, and the conditions for liquid water to be stable at the surface were not fulfilled anymore,” Nachon says. “So liquid water became unsustainable and evaporated.”

Researchers believe salt ponds on Mars are similar to those found on Earth, especially those in a region called the Altiplano, near the Bolivia-Peru border.

The Altiplano is an arid, high-altitude plateau where rivers and streams from mountain ranges “do not flow to the sea but lead to closed basins, similar to what used to happen at Gale Crater on Mars,” Nachon says.

“This hydrology creates lakes with water levels heavily influenced by climate. During the arid periods Altiplano lakes become shallow due to evaporation, and some even dry up entirely. The fact that the Altiplano is mostly vegetation-free makes the region look even more like Mars.”

Dried out climate

The study also shows that the ancient lake in Gale Crater underwent at least one episode of drying before “recovering.” It’s also possible that the lake was segmented into separate ponds, where some could have undergone more evaporation.

Because up to now only one location along the rover’s path shows such a drying history, Nachon says it might give clues about how many drying episodes the lake underwent before Mars’s climate became as dry as it is today.

“It could indicate that Mars’s climate ‘dried out’ over the long term, in a way that still allowed for the cyclical presence of a lake,” Nachon says.

“These results indicate a past Mars climate that fluctuated between wetter and drier periods. They also tell us about the types of chemical elements (in this case sulphur, a key ingredient for life) that were available in the liquid water present at the surface at the time, and about the type of environmental fluctuations Mars life would have had to cope with, if it ever existed.”

Source: Texas A&M University

The post Mars used to have salty lakes appeared first on Futurity.

Some radiation-drug combos just make side effects worse

Tue, 2019-10-15 10:18

Taking certain cancer-fighting drugs while undergoing radiation therapy may not increase survival, but could instead increase side effects, according to a new study.

The drugs may still benefit patients who aren’t undergoing radiation therapy, say the researchers.

In a meta-analysis, researchers find that treatments that include both radiotherapy and receptor tyrosine kinase inhibitor—or RTKI—drugs did not significantly improve survival rates of patients, but appeared to worsen negative side effects, such as fatigue, nausea, and diarrhea, says Nicholas G. Zaorsky, assistant professor of radiation oncology and public health sciences at Penn State College of Medicine.

“In the 1990s and 2000s there was a push in oncology to study these receptor tyrosine kinase inhibitor drugs, which target receptors that are either expressed on the cancer cells, or expressed on cells that surround cancer cells,” Zaorsky says.

“The receptors are thought to help cancer cells grow, essentially pressing the gas pedals for the cancer cells. Thus, blocking the gas pedal with RTKIs has been thought to slow down cancer cells.”

Understanding how RTKI drugs, which include names like Avastin, Erbitux, Iressa, and Tarceva, interact with radiation therapy is important because of radiotherapy’s widespread use, he adds.

“Radiation therapy is prescribed to about two-thirds of cancer patients and a lot of these patients are also receiving receptor tyrosine kinase inhibitor (RTKI) drugs. What hasn’t been known is if these drugs added to patients receiving radiation therapy help or hurt patients.”

Researchers statistically analyzed the results of 11 large clinical trials that featured both RTKI and radiation therapy. The trials, which focused on solid forms of cancer and included 5,284 patients, evaluated both survival rates and side effects. The results reveal that adding RTKI drugs to radiation therapy does not significantly improve survival, but may increase side effects for patients undergoing both treatments.

“Because it’s such a broad question and because there are so many drugs available for so many different types of cancers, we decided to do a meta-analysis using all of the published data from around the world,” Zaorsky says.

Just because the combination of radiation therapy and RTKI drugs doesn’t appear to improve survival doesn’t mean the effectiveness of the drugs alone is in question, Zaorsky says. For many cancer patients who have not undergone simultaneous radiation therapy, these drugs have been immensely helpful.

The study may help guide future clinical trials looking into RTKI and radiation therapies, as well as doctors prescribing these therapies to current cancer patients, the researchers say. For example, the findings suggest that further clinical trials combining radiation therapy and RTKIs may not be worthwhile. Further, in some cases, it may be beneficial to hold RTKI drugs while the patient is receiving radiation therapy.

The researchers presented their findings at the 2019 American Society for Radiation Oncology annual meeting and published them in the International Journal of Radiation Oncology.

Additional researchers from Penn State, Mount Sinai School of Medicine, and the Mayo Clinic in Jacksonville, Florida, contributed to the study. The Penn State Cancer Institute and the National Institutes of Health supported this work.

Source: Penn State

The post Some radiation-drug combos just make side effects worse appeared first on Futurity.

Different factors rearranged our brains and braincases

Tue, 2019-10-15 10:13

Evolutionary changes to the braincase reflect our shift to walking upright, whereas changes to the brain reflect the path toward complex cognitive tasks, research finds.

The human brain is like a fish in an aquarium, floating inside the liquid-filled braincase—but filling it out almost completely. The relationship between the brain and the braincase, and how they interacted during human evolution, has been occupying the minds of researchers for almost a century.

José Luis Alatorre Warren, researcher in the University of Zürich anthropology department, tackled this question using computed tomography (CT) and magnetic resonance imaging (MRI) data from humans and chimpanzees.

By combining CT/MRI data, he was able to quantify the spatial relationships between brain structures such as gyri (convolutions) and sulci (furrows) on the one hand, and cranial structures such as bony sutures on the other.

CT/MRI datasets of a human (left), chimpanzee (center), and gorilla (right). Surface reconstructions of bony structures derived from CT data, while volume renderings of brain segmentations came from postprocessed MRI data. (Credit: J.L. Alatorre Warren/U. Zurich)

The results, published in PNAS, show that the characteristic spatial relationships between brain and bone structures in humans are clearly distinct from those in chimpanzees. While the brain and its case continued to evolve side by side, they did so along largely independent evolutionary paths.

For example, brain structures related to complex cognitive tasks such as language, social cognition, and manual dexterity expanded significantly in the course of human evolution. This becomes visible as a shift of the neuroanatomical boundaries of the frontal lobe of the brain.

This shift, however, did not affect the bony structures of the braincase. Instead, changes in the braincase largely reflect adaptations to walking upright on two legs, or bipedalism. For example, the opening at the skull base for the spinal cord moved forward during human evolution in order to optimize balance of the head atop the vertebral column. However, these evolutionary changes to the braincase did not have an effect on our cerebral structures.

“The brain followed its own evolutionary path of neural innovation while freely floating in the braincase,” says Alatorre Warren. “The position and size of braincase bones thus don’t enable us to draw conclusions about evolutionary changes in the size or rearrangement of adjacent brain regions.”

Coauthors Marcia Ponce de León and Christoph Zollikofer believe their study’s data provide an important point of reference for future research: “Having answered the brain-braincase question for humans and great apes, we can now take a fresh look at the braincases of fossil hominids.”

Source: University of Zürich

The post Different factors rearranged our brains and braincases appeared first on Futurity.

Survey finds bipartisan support for sex ed

Tue, 2019-10-15 08:22

A new study finds that Democrats and Republicans agree on something: sex ed for teens.

The study, published in the journal Sex Education, surveyed close to 1,000 likely voters who identified as Democrats or Republicans. The findings show a strong majority of them support sex education in schools and continued government funding for teenage pregnancy prevention programs that include information about both abstinence and contraception.

“Sex education remains a vital component to reducing unintended teenage pregnancies and sexually transmitted diseases among young people as well as providing young people with the information and skills they need to build healthy relationships,” says professor Leslie M. Kantor, chair of the department of urban-global public health at the Rutgers School of Public Health.

“Recent attempts by the government to shift funding away from evidence-based pregnancy prevention programs and back to abstinence-only-until-marriage-approaches are out of alignment with what likely voters want.”

The study found that Democrats and Republicans express similar support for including puberty and sexually transmitted diseases in school sex education programs, while Republicans were more likely also to want abstinence included as a topic.

Democrats are more likely than Republicans to want the topics of healthy relationships, birth control, consent, and sexual orientation included in school sex education programs. However, strong support exists for including all of the topics.

“Planned Parenthood’s mission includes providing sex education programs and resources that teach teens to make healthy, informed choices,” says Nicole Levitz, director of digital products at Planned Parenthood Federation of America and a coauthor of the study. “This study validates that most likely voters want comprehensive sex education for middle and high school students.”

“This study reconfirms broad and deep support among people in the United States for the importance of teaching high quality sex education that includes information about a myriad of topics, including birth control, healthy relationships, and consent,” says Ginny Ehrlich, CEO of Power to Decide.

“In addition, it finds that a majority of likely voters across party affiliations support continued funding for the evidence-based Teen Pregnancy Prevention Program (TPP Program) and Personal Responsibility Education Program (PREP)—consistent with a previous Power to Decide survey of adults in the United States.”

As attitudes and perceptions shift on key topics such as sexual orientation, the researchers say additional studies should take place among likely voters as well as other groups to assess current attitudes toward sex education practice and policy.

Source: Rutgers University

The post Survey finds bipartisan support for sex ed appeared first on Futurity.

Mindfulness may benefit people on methadone

Mon, 2019-10-14 16:01

Mindfulness techniques and methadone may reduce cravings and pain among people experiencing opioid addiction and chronic pain, research finds.

The study, published in the journal Drug and Alcohol Dependence, involved 30 patients.

The findings showed that those who received methadone and a mindfulness training-based intervention were 1.3 times better at controlling their cravings and had significantly greater improvements in pain, stress, and positive emotions, even though they were aware of more cravings than those who only received standard methadone treatment and counseling.

Mindfulness is the meditative practice of focusing on the present moment and accepting one’s thoughts, feelings, and bodily sensations without judgment.

“Methadone maintenance therapy (MMT) has been an effective form of medication treatment for opioid use disorder,” says Nina Cooperman, associate professor and clinical psychologist in the Division of Addiction Psychiatry at Rutgers Robert Wood Johnson Medical School. “However, nearly half of individuals on MMT continue to use opioids during treatment or relapse within six months.”

Cooperman says many of those with opioid addictions experience chronic pain, anxiety, and depression while on methadone maintenance, which is why mindfulness-based, non-drug interventions seem promising.

The researchers say mindfulness-based interventions could help people dependent on opioids increase their self-awareness and self-control over cravings and be less reactive to emotional and physical pain. Individuals with an opioid addiction could also learn to change their negative thoughts and savor pleasant events, which may help them to regulate their emotions and experience more enjoyment.

Coauthors are from Rutgers and the University of Utah.

Source: Rutgers University

The post Mindfulness may benefit people on methadone appeared first on Futurity.

Walking speed at 45 may indicate accelerated aging

Mon, 2019-10-14 15:14

The walking speed of 45-year-olds, particularly their fastest walking speed without running, may indicate the aging of their brains and bodies, according to a new study in New Zealand.

The findings show that slower walkers have “accelerated aging” on a 19-measure scale—and their lungs, teeth, and immune systems tend to be in worse shape than the people who walked faster.

“The thing that’s really striking is that this is in 45-year-old people, not the geriatric patients who are usually assessed with such measures,” says lead researcher Line J.H. Rasmussen, a postdoctoral researcher in Duke University’s psychology and neuroscience department.

Equally striking, neurocognitive testing that these individuals took as children could predict who would become the slower walkers. At age 3, their scores on IQ, understanding language, frustration tolerance, motor skills, and emotional control predicted walking speed at age 45.

“Doctors know that slow walkers in their seventies and eighties tend to die sooner than fast walkers their same age,” says senior author Terrie E. Moffitt, professor of psychology at Duke and professor of social development at King’s College London. “But this study covered the period from the preschool years to midlife, and found that a slow walk is a problem sign decades before old age.”

The data come from a long-term study of nearly 1,000 people born during a single year in Dunedin, New Zealand. Researchers tested, quizzed, and measured 904 participants over their entire lives, most recently from April 2017 to April 2019 at age 45.

MRI exams during the last assessment showed the slower walkers tend to have lower total brain volume, lower mean cortical thickness, less brain surface area, and higher incidence of white matter “hyperintensities,” small lesions associated with small vessel disease of the brain. In short, their brains appear somewhat older.

Adding insult to injury perhaps, the slower walkers also looked older to a panel of eight screeners who assessed each participant’s ‘facial age’ from a photograph.

Researchers have long used gait speed as a measure of health and aging in geriatric patients, but what’s new in this study is the relative youth of these study subjects and the ability to see how walking speed matches up with health measures the study collected during their lives.

“It’s a shame we don’t have gait speed and brain imaging for them as children,” Rasmussen says. (MRI was invented when participants were five years old, but was not given to children for many years after.)

Some of the differences in health and cognition may link to lifestyle choices these individuals have made. But the study also suggests that there are already signs in early life of who would become the slowest walkers, Rasmussen says. “We may have a chance here to see who’s going to do better health-wise in later life.”

The study appears in JAMA Network Open.

Funding came from the US National Institute on Aging, the UK Medical Research Council, the Jacobs Foundation, the New Zealand Health Research Council, the New Zealand Ministry of Business, Innovation and Employment, the Lundbeck Foundation, the US National Science Foundation, and the US National Institute of Child Health and Human Development.

Source: Duke University

The post Walking speed at 45 may indicate accelerated aging appeared first on Futurity.

Some cases of SIDS may have this genetic cause

Mon, 2019-10-14 13:42

New research links a genetic anomaly and some forms of SIDS, or sudden infant death syndrome, which claims the lives of more than 3,000 infants a year.

The research, published in Nature Communications, focuses on mitochondrial trifunctional protein (MTP) deficiency, a potentially fatal cardiac metabolic disorder caused by a mutation in the gene HADHA.

Newborns with this genetic anomaly can’t metabolize the lipids found in milk, and die suddenly of cardiac arrest when they are a couple months old. Lipids are a category of molecules that include fats, cholesterol, and fatty acids.

“There are multiple causes for sudden infant death syndrome,” says Hannele Ruohola-Baker, professor of biochemistry at the University of Washington School of Medicine and associate director of the UW Medicine Institute for Stem Cell and Regenerative Medicine.

“There are some causes which are environmental. But what we’re studying here is really a genetic cause of SIDS. In this particular case, it involves a defect in the enzyme that breaks down fat.”

Unexplained cardiac disease

Lead author Jason Miklas, who earned his PhD at the University of Washington and is now a postdoctoral fellow at Stanford University, says he first came up with the idea while researching heart disease and noticed a small research study that had examined children who couldn’t process fats and who had cardiac disease that was not readily explained.

So he and Ruohola-Baker started looking into why heart cells, grown to mimic infant cells, died in the petri dish where they were growing.

“If a child has a mutation, depending on the mutation the first few months of life can be very scary as the child may die suddenly,” Miklas says. “An autopsy wouldn’t necessarily pick up why the child passed but we think it might be due to the infant’s heart stopping to beat.”

“We’re no longer just trying to treat the symptoms of the disease,” Miklas says. “We’re trying to find ways to treat the root problem. It’s very gratifying to see that we can make real progress in the lab toward interventions that could one day make their way to the clinic.”

In MTP deficiency, the heart cells of affected infants don’t convert fats into nutrients properly, resulting in a build-up of unprocessed fatty material that can disrupt heart functions. More technically, the breakdown occurs when enzymes fail to complete a process known as fatty acid oxidation. It is possible to screen for the genetic markers of MTP deficiency; but effective treatments remain a ways off.

No cure, but hope

Ruohola-Baker says the latest laboratory discovery is a big step towards finding ways to overcome SIDS.

“There is no cure for this,” she says. “But there is now hope, because we’ve found a new aspect of this disease that will innovate generations of novel small molecules and designed proteins, which might help these patients in the future.”

One drug the group is focusing on is Elamipretide, used to stimulate hearts and organs that have oxygen deficiency, but barely considered for helping infant hearts, until now. In addition, prospective parents can undergo screening to see if there is a chance that they could have a child who might carry the mutation.

Ruohola-Baker has a personal interest in the research: one of her friends in Finland, her home country, had a baby who died of SIDS.

“It was absolutely devastating for that couple,” she says. “Since then, I’ve been very interested in the causes for sudden infant death syndrome. It’s very exciting to think that our work may contribute to future treatments, and help for the heartbreak for the parents who find their children have these mutations.”

The National Institutes of Health, the Academy of Finland, the Finnish Foundation for Cardiovascular Research, the Wellstone Muscular Dystrophy Cooperative Research Center, the Natural Sciences and Engineering Research Council of Canada, an Alexander Graham Bell Graduate Scholarship, and the National Science Foundation funded the work.

Source: University of Washington

The post Some cases of SIDS may have this genetic cause appeared first on Futurity.

Funding for child gun death research is 30X too low

Mon, 2019-10-14 11:17

Firearm injuries kill 2,500 American children each year and send another 12,000 to the emergency department. Research funding isn’t keeping pace, research shows.

A new study finds that the nation spends far less on studying what leads to these injuries and what might prevent and treat them than it spends on other, less common causes of death in children between the ages of 1 and 18 years.

In fact, on a per-death basis, funding for pediatric firearm research is 30 times lower than it would have to be to keep pace with research on other child health threats.

That mismatch between deaths and research funding may help explain why firearm deaths among young people have climbed, when deaths from other causes have dropped, according to a new study published in Health Affairs.

Just 3.3% of the necessary funding

Researchers analyzed records from a wide range of federal research funding sources, and catalogued grants given over a 10-year period to teams studying the major causes of death in children and teens. Using data on the causes of death of children and teens during this same time, they then compiled a dollars-per-death amount for each area of research.

Child-specific research on motor vehicle crashes—the top cause of death in US young people—received an average of $88 million per year from 2008 to 2017. That comes out to about $26,000 in research funding for each one of the 33,577 young people killed in a vehicle crash in that decade.

Meanwhile, research on pediatric cancer—the third leading cause of death in this age group—received $335 million per year. That’s $195,500 for each of the 17,111 child cancer deaths in the 10-year window.

During this same time, the federal government provided $1 million a year to fund research on firearm-related injuries—the second-leading cause of death among children and teens.

That works out to $597 per death for the 20,719 young people who died from intentional and accidental firearm injuries in the years of the study. In all, the researchers say, pediatric firearm research receives 3.3% of the $37 million per year it would need to keep pace with research on other causes of death among American children.
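The dollars-per-death comparison above is simple arithmetic: total funding over the ten-year window divided by deaths in that window. A quick sketch using the article’s own figures (the small gap from the article’s $195,500 cancer figure likely reflects rounding in the annual amounts):

```python
# Reproduce the article's dollars-per-death arithmetic from its own figures.
# Annual funding (USD) and decade death counts are taken from the text above.
def dollars_per_death(annual_funding, deaths_in_decade, years=10):
    """Total funding over the study window divided by deaths in that window."""
    return annual_funding * years / deaths_in_decade

crashes = dollars_per_death(88_000_000, 33_577)   # vehicle crashes: ~$26,000
cancer = dollars_per_death(335_000_000, 17_111)   # pediatric cancer: ~$196,000

print(f"vehicle crashes: ${crashes:,.0f} per death")
print(f"pediatric cancer: ${cancer:,.0f} per death")
```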

Less funding, less knowledge

Less funding means less new knowledge generated through studies and evaluations, the researchers say.

“This lack of knowledge does not result from the scientific questions or data being more difficult to research than they were for research on the molecular basis of cancer, polio prevention, or motor vehicle crash prevention. Instead, it is because federal agencies have not invested in scientists seeking to discover answers to the key research questions about firearm injuries,” they write.

“We know that when researchers study a health issue, and evaluate efforts to reduce its impact, the toll on individuals and society can drop,” says first author Rebecca Cunningham, interim vice president of research at the University of Michigan and an emergency medicine physician at Michigan Medicine.

“This is a stark demonstration of the lack of support for research that could help reduce the chances that children will be hurt or killed by firearms.”

“Our goal with this study,” says senior author Patrick Carter, “is to illuminate the vast opportunity we have as a nation to study firearm-related issues in young people, and apply new knowledge to the problem, if more funding were made available.”

The US should create a national institute focused on firearm-related research, the authors argue.

Wide funding disparities

In addition to disparities in funding for the most common causes of child death, the team finds that research on relatively rare causes of childhood death received even more dollars proportional to their toll on children and teens.

Meningitis, which killed 400 young people in 10 years, received $33.1 million in funding per year in that time. Researchers studying pediatric AIDS shared about $25 million in annual funding, against just 91 AIDS-related deaths of children and teens. And diabetes, which led to 697 deaths of children and teens in the study decade, received $20 million per year in research funding.

The authors acknowledge that deaths are only one way to measure the impact of a disease or cause of injury on children, teens, their families, and society. But researchers have not compiled data on the other impacts of firearm injuries, a fact that the FACTS Consortium’s members laid out in a group of recent papers published in the Journal of Behavioral Medicine. They also laid out the most urgent firearm-related pediatric research questions that need answers in a recent piece in the journal Pediatrics.

The new study goes beyond past efforts to quantify the scope of research on different causes of death among all Americans. Cunningham and her colleagues included only research grants from federal agencies that were specific to children and teens.

They included grants from a wide range of federal agencies, and an estimate of pediatric-related vehicle crash research funding from the National Highway Traffic Safety Administration. The study did not include private foundation or industry funding or other public funding not available in federal databases, such as state funding.

In all, 32 research grants (called awards in the paper) went to pediatric firearm research in the decade studied, compared with 5,168 grants for pediatric cancer research.

The research team also compiled a total number of research papers published with findings about each of the causes of pediatric death. Cancer had the most, with 50,235 papers in one decade. By contrast, pediatric firearm research accounted for just 540 research papers in that same decade, and pediatric vehicle crash research results were reported in 2,223 papers.

Additional coauthors are from Brown University/Rhode Island Hospital.

Source: University of Michigan

The post Funding for child gun death research is 30X too low appeared first on Futurity.

Ocean info fills gaps in Earth’s ‘methane budget’

Mon, 2019-10-14 10:50

New research uses data science to determine how much methane goes from the ocean into the atmosphere each year.

To predict the impacts of human emissions, researchers need a complete picture of the atmosphere’s methane cycle. They need to know the size of the inputs—both natural and human—as well as the outputs. They also need to know how long methane resides in the atmosphere.

The results, published in Nature Communications, fill a longstanding gap in methane cycle research and will help climate scientists better assess the extent of human perturbations.

Every three years, an international group of climate scientists called the Global Carbon Project updates what is known as the methane budget. The methane budget reflects the current state of understanding of the inputs and outputs in the global methane cycle. Its last update was in 2016.

“The methane budget helps us place human methane emissions in context and provides a baseline against which to assess future changes,” says Tom Weber, assistant professor of Earth and environmental sciences at the University of Rochester. “In past methane budgets, the ocean has been a very uncertain term. We know the ocean naturally releases methane to the atmosphere, but we don’t necessarily know how much.”

Adding up the methane budget

In the methane budget, if one term is uncertain, it adds uncertainty to all the other terms, and limits researchers’ ability to predict how the global methane system might change. For that reason, coming up with a more accurate estimate of oceanic methane emissions has been an important goal of methane cycle research for many years.

But, Weber says, “it’s not easy.” Because the ocean is so vast, only small portions of it have been sampled for methane, meaning data is scarce.

To overcome this limitation, Weber and Nicola Wiseman, graduate student at the University of California, Irvine, compiled all the available methane data from the ocean and fed it into machine learning models—computer algorithms designed for pattern recognition. These models were able to recognize systematic patterns in the methane data, allowing the researchers to predict what the emissions are likely to be, even in regions where no direct observations have been made.
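The paper’s actual models and predictor variables aren’t detailed here; as a rough illustration of the gap-filling idea, the sketch below trains a random-forest regressor on synthetic “measurements” at sparse sampled locations, using hypothetical predictors (sea-surface temperature, chlorophyll, depth), and then predicts flux where no observations exist:

```python
# Sketch of the gap-filling approach: learn the systematic relationship
# between environmental predictors and methane flux at sampled locations,
# then predict flux in unsampled regions. All data here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical predictors: sea-surface temp (C), chlorophyll, depth (m).
n_sampled = 500
X = rng.uniform([0, 0, 10], [30, 5, 5000], size=(n_sampled, 3))
# Synthetic "flux": higher in warm, productive, shallow water, plus noise.
y = (2.0 * X[:, 0] / 30 + 1.5 * X[:, 1] / 5
     + 3.0 * (1000 / (X[:, 2] + 1000))
     + rng.normal(0, 0.1, n_sampled))

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Predict at "unsampled" points: shallow productive coast vs. open ocean.
coastal = model.predict([[28.0, 4.0, 50.0]])[0]
open_ocean = model.predict([[15.0, 0.5, 4000.0]])[0]
print(coastal > open_ocean)  # shallow productive water: higher predicted flux
```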

“Our approach allowed us to pin down the global ocean emission rate much more accurately than ever before,” Weber says.

The newest version of the methane budget will be released later this year and incorporates the results from Weber’s paper, giving researchers a better understanding of how methane cycles throughout the Earth’s system.

Shallow waters and phytoplankton

In addition to contributing to a better understanding of the global methane budget, the research yielded two other interesting findings:

First, very shallow coastal waters contribute around 50% of the total methane emissions from the ocean, despite making up only 5% of the ocean area. That’s because methane can seep out of natural gas reservoirs along continental margins and can be produced biologically in anoxic (oxygen-depleted) sediments at the seafloor.

In deep waters, methane is likely to be oxidized as it travels its long route from the seafloor to the atmosphere. But in shallow waters, there’s a rapid route to the atmosphere and methane escapes before it is oxidized. Weber is collaborating with John Kessler, a professor of Earth and environmental sciences at the University of Rochester, to resolve the remaining uncertainties in coastal methane emissions by conducting research cruises and further developing machine learning models.

Second, methane exhibits a spatial pattern very similar to that of phytoplankton abundance, which supports a controversial recent hypothesis that plankton produces methane in the surface ocean. Previously, scientists believed methane could only be produced in the anoxic conditions found at the bottom of the ocean. “Evidence is gradually accumulating to overturn that paradigm, and our paper adds an important piece,” Weber says.

Each natural source of methane is likely sensitive to climate change as well, and it is important for researchers to have an accurate baseline.

“There are a number of reasons to believe that the ocean might become a larger source of methane in the future, but unless we have a good estimate of how much it emits right now, we’ll never be able to identify those future changes,” Weber says.

Wiseman is a former undergraduate researcher at the University of Rochester. She and Weber worked with Annette Kock at the GEOMAR Helmholtz Centre for Ocean Research in Germany.

Source: University of Rochester

The post Ocean info fills gaps in Earth’s ‘methane budget’ appeared first on Futurity.

The trouble with psych tests that say ‘answer without thinking’

Mon, 2019-10-14 09:20

Asking people to answer a question quickly and without thinking doesn’t get honest responses, especially if the quick response isn’t the most socially desirable, research finds.

There’s a longstanding belief in the field of psychology that limiting the time subjects have to respond to questions will result in more honest answers. Certainly, many of us who have participated in personality tests have heard the directive to “say the first thing that comes to mind.”

“One of the oldest methods we have in psychology—literally more than a hundred years old—is the method of asking people to answer quickly and without thinking,” says John Protzko, a cognitive scientist in the psychological and brain sciences department at the University of California, Santa Barbara and the lead author of a paper in Psychological Science. “You could see this in the early 1900s with people like Carl Jung advocating this method for therapeutic insight.”

The concept behind the method, Protzko explains, is that by asking for a quick response, people—psychologists in particular—might be able to bypass the part of the mind that could intervene and alter that response.

“The idea has always been that we have a divided mind—an intuitive, animalistic type and a more rational type,” he says. “And the more rational type is assumed to always be constraining the lower order mind. If you ask people to answer quickly and without thinking, it’s supposed to give you sort of a secret access to that lower order mind.”

To test this assumption, Protzko and fellow psychologists Jonathan Schooler and Claire Zedelius devised a test of 10 simple yes-or-no questions—a Social Desirability questionnaire. They then asked respondents to take fewer than 11 seconds, or alternatively, more than 11 seconds to answer each question, to gauge whether their answers would differ with the time spent answering them.

Try it yourself

Curious about the test? You can take the short version, below. Answer quickly and without thinking.

True or False:
  1. I have never intensely disliked anyone
  2. I sometimes feel resentful when I don’t get my way
  3. No matter who I’m talking to, I’m always a good listener
  4. There have been occasions when I took advantage of someone
  5. I’m always willing to admit it when I make a mistake
  6. I sometimes try to get even, rather than forgive and forget
  7. There have been occasions when I felt like smashing things
  8. There have been times when I was quite jealous of the good fortune of others
  9. I have never felt that I was punished without cause
  10. I have never deliberately said something that hurt someone’s feelings

If you answered “true” to questions 1, 3, 5, 9, or 10, you’re probably lying. If you answered “false” to questions 2, 4, 6, 7, or 8, you’re probably lying.

That’s because the researchers designed the questions—presented one by one in random order to participants—to force respondents to consider how socially desirable their answers would make them look. The honest answers—and who among us has never intensely disliked anyone or has always been a good listener?—tend to portray respondents in a more negative light.
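Scoring such a keyed questionnaire amounts to counting answers that match the socially desirable key. This sketch is inferred from the lie-key described above, not from the published instrument’s official scoring, which may differ:

```python
# Keyed scoring for the 10-item short form above: "socially desirable"
# answers are True for items 1, 3, 5, 9, 10 and False for items 2, 4, 6, 7, 8.
# A higher score means more impression management, i.e. less candid answering.
DESIRABLE_ANSWERS = {1: True, 3: True, 5: True, 9: True, 10: True,
                     2: False, 4: False, 6: False, 7: False, 8: False}

def social_desirability_score(responses):
    """responses: dict mapping item number (1-10) to a True/False answer."""
    return sum(responses[item] == keyed
               for item, keyed in DESIRABLE_ANSWERS.items())

# A maximally candid respondent endorses none of the flattering statements:
candid = {1: False, 2: True, 3: False, 4: True, 5: False,
          6: True, 7: True, 8: True, 9: False, 10: False}
print(social_desirability_score(candid))  # 0
```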

If you lied, well, you’re in good company.

“What we found is that people just lie,” Protzko says. According to the study, the fast-answering group was more likely to lie, while the slow answerers and those given no time constraint were less likely to do so. Asking people to answer quickly, the study concludes, pushes them toward more socially desirable responses, evidence that the method does not reliably yield the most honest answer.

‘Good-true-self bias’

Are people giving socially desirable responses under time pressure because they think they’re good people, deep down inside? That was the subject of the next experiment Protzko and colleagues conducted.

“People have what’s called a ‘good-true-self’ bias,” he says. To varying degrees, people generally believe that others have “true selves,” and that these selves are essentially good, he explains.

The team measured the degree of respondents’ good-true-self bias with a social judgment task: participants read about fictional individuals behaving uncharacteristically and judged how true those actions were to “the deepest, most essential aspects” of each person’s being. Higher positive true-self judgments indicated a stronger good-true-self bias.

If time pressure truly caused people to align with their good true selves, the study reasoned, then pressure to respond in a socially desirable manner should have less effect on those who scored lower on the good-true-self bias scale (i.e., those who saw people as more of a mix of good and bad qualities).

The scientists found, however, that when they asked participants to respond to the Social Desirability questionnaire under time pressure, those who saw the true self as bad were more likely to respond in a socially desirable manner. People on the high end of the good-true-self scale, by contrast, were more likely to give socially desirable answers when they had more time to deliberate.

“When you demand an answer very quickly, people—even if they don’t think that people are good at heart—will still lie to you,” Protzko says. “They’ll still give you the answer they think you want to hear.”

It could be that under time pressure, people default not to their core goodness but to their desire to appear virtuous, even if that means misrepresenting themselves. That habit may be learned and internalized, reinforced by the fact that appearing virtuous tends to be socially advantageous in the long run.

The results of this study indicate that the seemingly tried-and-true method of demanding quick answers may not always be the way for psychologists to access their patients’ inner selves or a suppressed mind, Protzko says.

“It doesn’t call into question what else has been shown using this method of ‘answer quickly’,” he says. The study is, rather, a test of the assumptions of methods used in psychological thought.

“A lot of the time we have these assumptions, and you can cite Sigmund Freud or Wilhelm Wundt and hundred-year-old research to back you up, and it seems that there’s this authority behind it,” Protzko says. “But sometimes we’re not entirely sure what is actually happening inside the mind when we use these methods.”

Source: UC Santa Barbara

The post The trouble with psych tests that say ‘answer without thinking’ appeared first on Futurity.

Test for mild cognitive impairment according to sex?

Mon, 2019-10-14 09:04

Scoring memory tests according to sex could result in more women and fewer men receiving a diagnosis of mild cognitive impairment, research finds.

Using sex-specific scores on the tests could also change the diagnosis for 20% of those currently diagnosed with mild cognitive impairment (MCI), according to the study in Neurology.

Coauthor Anat Biegon, director of the Center on Gender, Hormones, and Health at the Renaissance School of Medicine at Stony Brook University, says future confirmation of these results could ultimately change the testing of men and women for dementia.

MCI, considered a precursor to dementia, is when people have memory and thinking skill problems. Because women typically score higher than men on tests of verbal memory, they may not be diagnosed with MCI as early as men are when they have the same levels of Alzheimer’s disease-related brain changes, such as the amount of amyloid plaque deposits in the brain or amount of shrinkage in the hippocampus area of the brain.

In the study, researchers used memory test scores based on sex instead of averages for both men and women. Using the sex-specific scores, they found that 10% more women received an MCI diagnosis and 10% fewer men did than when using the averages.

“There are numerous implications to our findings if they are confirmed,” says Biegon, also a professor of radiology and neurology. They are:

If women are inaccurately identified as having no problems with memory and thinking skills when they actually have mild cognitive impairment, then treatments are not being started early enough, and they and their families are not planning ahead for their care or financial or legal situations.

If men are inaccurately diagnosed with mild cognitive impairment, they can be exposed to unneeded medications along with undue stress for them and their families, she explains.

The study involved 985 people from the Alzheimer’s Disease Neuroimaging Initiative. All of the participants took a verbal memory test that involves learning a list of 15 unrelated words and recalling as many as possible across five immediate trials (scores range from 0 to 75), and then again after learning another list and a 30-minute delay (scores range from 0 to 15).

Overall, using typical scores based on averages across men and women, 26% of women and 45% of men were diagnosed with MCI. With sex-specific scores, 36% of women and 35% of men were diagnosed with MCI.

The National Institutes of Health supported the work.

Source: Stony Brook University

The post Test for mild cognitive impairment according to sex? appeared first on Futurity.

Sorry, Darwin, but bacteria don’t compete to survive

Fri, 2019-10-11 15:17

“Survival of the friendliest” outweighs “survival of the fittest” for groups of bacteria, according to new research.

The research reveals that bacteria unite against external threats, such as antibiotics, rather than fight each other. The discovery is a major step toward understanding complex bacterial interactions and developing new treatment models for a wide range of human diseases, as well as new green technologies.

For a number of years, the researchers have studied how combinations of bacteria behave when sharing a confined area. After investigating many thousands of combinations, it has become clear that bacteria cooperate to survive, and that this finding contradicts what Darwin’s theory of evolution would predict.

“In the classic Darwinian mindset, competition is the name of the game. The best suited survive and outcompete those less well suited. However, when it comes to microorganisms like bacteria, our findings reveal the most cooperative ones survive,” explains Søren Johannes Sørensen, a professor of microbiology at the University of Copenhagen.

Bacteria survival as team sport

By isolating bacteria from a small corn husk (where they were forced to “fight” for space) the scientists were able to investigate the degree to which bacteria compete or cooperate to survive. They selected the bacterial strains based on their ability to grow together. Researchers measured bacterial biofilm, a slimy protective layer that shields bacteria against external threats such as antibiotics or predators. When bacteria are healthy, they produce more biofilm and become stronger and more resilient.

Time after time, the researchers observed the same result: Instead of the strongest outcompeting the others in biofilm production, bacteria allowed space for the weakest, so they could grow better than they would have on their own. At the same time, the researchers could see that the bacteria split up laborious tasks by shutting down unnecessary mechanisms and sharing them with their neighbors.

“It may well be that Henry Ford thought that he had found something brilliant when he introduced the assembly line and worker specialization, but bacteria have been taking advantage of this strategy for a billion years,” says Sørensen, referring to the oldest known bacterial fossils with biofilm.

“Our new study demonstrates that bacteria organize themselves in a structured way, distribute work, and even help each other. This means that we can find out which bacteria cooperate, and possibly which ones depend on one another, by looking at who sits next to whom,” he says.

All alone vs. part of the team

The researchers also investigated which properties bacteria had when they were alone versus when they were with other bacteria. People often talk about workplace or group synergy and how colleagues inspire each other. Bacteria take this one step further when they survive in small communities.

“Bacteria take our understanding of group synergy and inspiration to a completely different level. They induce attributes in their neighbors that would otherwise remain dormant. In this way groups of bacteria can express properties that aren’t possible when they are alone. When they are together totally new features can suddenly emerge,” Sørensen explains.

Understanding how bacteria interact in groups has the potential to create a whole new area in biotechnology that traditionally strives to exploit single, isolated strains, one at a time.

“A bio-based society is currently touted as a solution to many of the challenges that our societies face. However, the vast majority of today’s biotech is based on single organisms. This is in stark contrast to what happens in nature, where all processes are managed by cooperative consortia of organisms. We must learn from nature and introduce solutions to tap the huge potential of biotechnology in the future,” says Sørensen.

The research appears in the ISME Journal.

Source: University of Copenhagen

The post Sorry, Darwin, but bacteria don’t compete to survive appeared first on Futurity.

How old is the ice on the moon?

Fri, 2019-10-11 14:12

While a majority of the icy deposits on the moon are likely billions of years old, some may be much more recent, according to new research.

The discovery of ice deposits in craters scattered across the moon’s south pole has helped to renew interest in exploring the lunar surface, but no one is sure exactly when or how that ice got there.

Lead author Ariel Deutsch, a graduate student in the earth, environmental, and planetary sciences department at Brown University, says that constraining the ages of the deposits is important both for basic science and for future lunar explorers who might make use of that ice for fuel and other purposes.

“The ages of these deposits can potentially tell us something about the origin of the ice, which helps us understand the sources and distribution of water in the inner solar system,” Deutsch says. “For exploration purposes, we need to understand the lateral and vertical distributions of these deposits to figure out how best to access them. These distributions evolve with time, so having an idea of the age is important.”

Shackleton Crater, the floor of which is permanently shadowed from the sun, appears to be home to deposits of water ice. (Credit: NASA/GSFC/Arizona State)

Using data from NASA’s Lunar Reconnaissance Orbiter, which has been orbiting the moon since 2009, the researchers looked at the ages of the large craters in which scientists found evidence for south pole ice deposits. To date the craters, researchers count the number of smaller craters that have accrued inside the larger ones. Scientists have an approximate idea of the pace of impacts over time, so counting craters can help establish the ages of terrains.
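In its simplest form, crater counting converts a crater density into an age via an assumed impact rate. The sketch below uses a constant, hypothetical rate; real lunar chronology functions are nonlinear, because the impact rate was far higher before roughly 3 billion years ago.

```python
# Crude crater-count dating: under an assumed constant rate of small
# impacts, the density of accumulated craters is proportional to the
# surface's age. The counts, areas, and rate below are hypothetical.
def age_from_crater_density(n_craters, area_km2, rate_per_km2_per_gyr):
    """Age in billions of years from counted craters per unit area."""
    density = n_craters / area_km2
    return density / rate_per_km2_per_gyr

# Hypothetical counts on two crater floors, with an assumed impact rate:
old_floor = age_from_crater_density(310, 100.0, 1.0)    # 3.1 Gyr
young_floor = age_from_crater_density(10, 100.0, 1.0)   # 0.1 Gyr
print(old_floor, young_floor)
```

The older floor has accumulated far more small craters per square kilometer, which is exactly the patchy, battered appearance the researchers interpret as evidence of ancient ice.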

The majority of the reported ice deposits are within large craters formed about 3.1 billion or more years ago, the researchers found. Since the ice can’t be any older than the crater, that puts an upper bound on the age of the ice. Just because a crater is old doesn’t mean the ice within it is equally old, the researchers say, but in this case there’s reason to believe the ice is indeed ancient. The deposits have a patchy distribution across crater floors, which suggests that micrometeorite impacts and other debris have battered the ice over a long period of time.

If those reported ice deposits are indeed ancient, that could have significant implications in terms of exploration and potential resource utilization, the researchers say.

“There have been models of bombardment through time showing that ice starts to concentrate with depth,” Deutsch says. “So if you have a surface layer that’s old, you’d expect more underneath.”

Craters in craters

While the majority of ice was in the ancient craters, the researchers also found evidence for ice in smaller craters that, judging by their sharp, well-defined features, appear to be quite fresh. That suggests that some of the deposits on the south pole got there relatively recently.

“That was a surprise,” Deutsch says. “There hadn’t really been any observations of ice in younger cold traps before.”

If there are indeed deposits of different ages, the researchers say, they may also have different sources. Older ice could have come from water-bearing comets and asteroids impacting the surface, or from volcanic activity that drew water from deep within the moon. But large water-bearing impactors have been scarce in recent times, and volcanism is thought to have ceased on the moon over a billion years ago. So more recent ice deposits would require different sources—perhaps bombardment by pea-sized micrometeorites or implantation by the solar wind.

The best way to find out for sure, the researchers say, is to send spacecraft there to get some samples. And that appears to be on the horizon. NASA’s Artemis program aims to put humans on the moon by 2024, and plans to fly numerous precursor missions with robotic spacecraft in the meantime. Coauthor Jim Head, Deutsch’s PhD advisor, says studies like this one will help to shape those future missions.

“When we think about sending humans back to the moon for long-term exploration, we need to know what resources are there that we can count on, and we currently don’t know,” Head says. “Studies like this one help us make predictions about where we need to go to answer those questions.”

The study appears in the journal Icarus. Gregory Neumann from the NASA Goddard Space Flight Center also contributed to the work.

Source: Brown University

The post How old is the ice on the moon? appeared first on Futurity.

Our brains walk a fine line for maximum performance

Fri, 2019-10-11 13:30

To maximize information processing, the brain tunes itself to be as excitable as possible without tipping into disorder, new research confirms.

Researchers have long wondered how the billions of independent neurons in the brain come together to reliably build a biological machine that easily beats the most advanced computers. All of those tiny interactions appear to be tied to something that guarantees an impressive computational capacity.

Over the past 20 years, evidence mounted in support of the criticality hypothesis, which asserts that the brain is poised on the fine line between dormancy and chaos. At exactly this line, information processing is maximized.

However, researchers had never tested one of the key predictions of this theory—that criticality is truly a set point, and not a mere inevitability—until now. The new research directly confirms this long-standing prediction in the brains of freely behaving animals.

“When neurons combine, they actively seek out a critical regime,” says lead author Keith Hengen, assistant professor of biology at Washington University in St. Louis. “Our new work validates much of the theoretical interest in criticality and demonstrates that criticality is a hallmark of normally functioning networks.”

Criticality is actively regulated, the researchers determined. But the mechanisms underlying this optimized state are not straightforward.

“We were surprised to find that, in our models, it was largely accounted for by a population of inhibitory neurons that, in retrospect, are well poised to regulate the organization of the larger network,” Hengen says.

Seeing criticality close-up

Criticality is the only known computational regime that, by its very definition, optimizes information processing—such as memory, dynamic range, and the ability to encode and transmit complex patterns.

Theoretical physicists originally proposed that the brain may be critical. Neuroscientists had a mixed reaction.

“There’s a long history of solid theoretical work on criticality and some fun controversy that adds spice,” Hengen says. “I think that this controversy comes from two places. First, much of the work in vivo has been largely descriptive, I think because these datasets are hard to collect and challenging to analyze. Either way, direct demonstration that criticality is something that the brain attends to has been absent.

“Second, there’s been quite a bit of argument about the math people use to measure criticality,” Hengen says. “Recently, people moved away from measuring simple power laws, which can pop out of random noise, and have started looking at something called the exponent relation. So far, that’s the only true signature of criticality, and it’s the basis of all of our measurements.”
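In crackling-noise terms, the exponent relation Hengen describes ties three independently fitted exponents together: avalanche sizes follow P(S) ~ S^-tau, durations follow P(T) ~ T^-alpha, and mean size scales with duration as <S>(T) ~ T^gamma, with criticality predicting gamma = (alpha - 1)/(tau - 1). A schematic check, using hypothetical fitted values rather than the study’s own:

```python
# Schematic check of the crackling-noise exponent relation: a single power
# law can pop out of random noise, but agreement among three independently
# fitted exponents is a far stricter signature of criticality.
def exponent_relation_holds(tau, alpha, gamma_fit, tol=0.1):
    """True if gamma_fit matches (alpha - 1)/(tau - 1) within tolerance."""
    gamma_pred = (alpha - 1.0) / (tau - 1.0)
    return abs(gamma_pred - gamma_fit) < tol

# Mean-field values tau = 1.5, alpha = 2.0 predict gamma = 2.0:
print(exponent_relation_holds(1.5, 2.0, 2.0))  # consistent with criticality
print(exponent_relation_holds(1.5, 2.0, 1.3))  # power laws, but not critical
```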

“Our lab brings a very high caliber contribution to the discussion of criticality in the brain—because of the resolution [single neuron], and because of the total time we’re looking across,” he says. “We can watch critical dynamics as a function of time across an incredibly long period.”

The research draws on data from neuronal recordings of freely behaving mice that Hengen collected at Brandeis University. Hengen has since built his own laboratory, and he is collecting his own neuronal recordings—recordings that span months and hundreds of neurons.

Such recordings are extraordinarily data-intensive and technically challenging.

“The temporal resolution is very high—that’s an advantage,” says first author Zhengyu Ma, a recent PhD graduate in physics. “Also, they can record nine days. I’m still very surprised by this. There are not many labs that can reach nine days of recording.” With few exceptions, the previous state of practice for neuronal recordings was 30 minutes to a few hours, tops—a maximum that used to limit experimental tests of criticality.

With Ma contributing to the computational heavy lifting, Hengen and his coauthors combined and processed the data from Hengen’s many single-neuron recordings over time to model activity across entire neural networks.

When things fall apart

Taking advantage of their ability to continuously track the activity of neurons for more than a week, the researchers first confirmed that network dynamics in the visual cortex are robustly tuned to criticality, even across light and dark cycles.

Next, by blocking vision in one eye, the researchers revealed that criticality was severely disrupted, more than a day before the manipulation affected the firing rates of individual neurons.

Twenty-four hours later, criticality re-emerged in the recordings—at which point the visual deprivation began to suppress the firing of individual neurons.

“It seems that as soon as there’s a mismatch between what the animal expects and what it’s getting through that eye, the computational dynamic falls apart,” Hengen says.

“This is consistent with the theoretical physics that the critical regime is firing-rate independent,” he says. “It’s not about just the total number of spikes in the network, because the firing rate hasn’t changed at all at the very early part of deprivation—and yet the regime falls apart.”

The researchers now believe that criticality in the brain is likely connected to inhibitory neurons imposing and organizing the computational dynamics.

Implications for disease

The findings could have important implications for motor learning and for disease. The brain’s self-organization around criticality is an active process, Hengen notes, and impaired homeostatic regulation is increasingly implicated in severe human pathologies such as Alzheimer’s, epilepsy, Rett Syndrome, autism, and schizophrenia.

“One interpretation of this work is that criticality is a homeostatic end goal for networks in the brain,” Hengen says. “It’s an elegant idea: that the brain can tune an emergent property to a point neatly predicted by the physicists. And it makes intuitive sense, that evolution selected for the bits and pieces that give rise to an optimal solution. But time will tell. There’s a lot of work to be done.”

The paper appears in the journal Neuron.

Source: Washington University in St. Louis

The post Our brains walk a fine line for maximum performance appeared first on Futurity.

Nazi scientists created an alternative to DDT pesticide

Fri, 2019-10-11 13:25

DFDT, a fast-acting insecticide, has an alarming history, researchers report.

“We set out to study the growth of crystals in a little-known insecticide and uncovered its surprising history, including the impact of World War II on the choice of DDT—and not DFDT—as a primary insecticide in the 20th century,” says Bart Kahr, professor of chemistry at New York University and one of the study’s senior authors.

Kahr and fellow chemistry professor Michael Ward study the growth of crystals, which two years ago led them to discover a new crystal form of the notorious insecticide DDT. DDT is known for its detrimental effect on the environment and wildlife, but the new form Kahr and Ward developed was more effective against insects—and in smaller amounts, potentially less harmful to the environment.

The origins of DFDT in the Third Reich

Alongside chemical analysis, the researchers uncovered a rich and unsettling backstory for DFDT. Through historical documents, they learned that German scientists created DFDT during World War II as an insecticide and that the German military used it for insect control in the Soviet Union and North Africa, in parallel with the use of DDT by American armed forces in Europe and the South Pacific.

In the post-war chaos, however, DFDT manufacturing came to an abrupt end. Allied military officials who interviewed Third Reich scientists dismissed the Germans’ claims that DFDT (also known as “Gix” or “Fluorgesarol”) was faster-acting and less toxic to mammals than DDT, calling their studies “meager” and “inadequate” in military intelligence reports. (Credit: Combined Intelligence Objectives Sub-Committee Report on Insecticides, Insect Repellents, Rodenticides, and Fungicides of I.G. Farbenindustrie A.G., 1945 (declassified))

In his 1948 Nobel Prize address for the discovery of the insect-killing capability of DDT, Paul Müller noted that DFDT should be the insecticide of the future, given that it works more quickly than DDT. Despite this, DFDT has largely been forgotten and was unknown to the contemporary entomologists whom the researchers consulted.

“We were surprised to discover that at the outset DDT had a competitor which lost the race because of geopolitical and economic circumstances, not to mention its connection to the German military, and not necessarily because of scientific considerations. A faster, less persistent insecticide, as is DFDT, might have changed the course of the 20th century; it forces us to imagine counterfactual science histories,” says Kahr.

Swapping out atoms

In continuing to explore the crystal structure of insecticides, the research team began studying fluorinated forms of DDT, swapping out chlorine atoms for fluorine. They prepared two solid forms of the compound—a monofluoro and a difluoro analog—and tested them on fruit flies and mosquitoes, including mosquito species that carry malaria, yellow fever, dengue, and Zika. The solid forms of fluorinated DDT killed insects more quickly than did DDT; the difluoro analog, known as DFDT, killed mosquitoes two to four times faster.

“Speed thwarts the development of resistance,” says Ward, a senior author on the study. “Insecticide crystals kill mosquitoes when they are absorbed through the pads of their feet. Effective compounds kill insects quickly, possibly before they are able to reproduce.”

The researchers also made a detailed analysis of the relative activities of the solid-state forms of fluorinated DDT, noting that less thermodynamically stable forms—in which the crystals liberate molecules more easily—were more effective at quickly killing insects.

A monofluoro analog of DDT, as seen through an optical microscope. Solid fluorinated forms of DDT killed insects more quickly than did DDT. (Credit: Xiaolong Zhu and Jingxiang Yang/NYU)


Searching for better pesticides

Mosquito-borne diseases such as malaria—which kills a child every two minutes—are major public health concerns, resulting in 200 million illnesses annually. Newer diseases like Zika may pose growing threats to health in the face of a changing climate.

Mosquitoes are increasingly resistant to the pyrethroid insecticides used in bed nets. Public health officials are concerned and have reconsidered the use of DDT—which has been banned for decades in much of the world, with the exception of selective use for malaria control—but its controversial history and environmental impact underscore the need for new insecticides.

“While more research is needed to better understand the safety and environmental impact of DFDT, we, along with the World Health Organization, recognize the urgent need for new, fast insecticides. Not only are fast-acting insecticides critical for fighting the development of resistance, but less insecticide can be used, potentially reducing its environmental impact,” says Ward.

The study appears in the Journal of the American Chemical Society.

Additional researchers from NYU and Arrowhead Pharma contributed to the work. Support for the work came from the NYU Materials Research Science and Engineering Center program of the National Science Foundation. The NSF partially supports the NYU X-ray facility.

Source: New York University

The post Nazi scientists created an alternative to DDT pesticide appeared first on Futurity.

Online games could limit screen time while making money

Fri, 2019-10-11 13:25

Game creators could change their products to cut players’ screen time while making more money, according to new research.

Yulia Nevskaya’s first foray into the World of Warcraft started one evening at 7PM. She created an avatar to represent her in the online video game and set off to explore another land.

“It’s like another Earth. It looked like paradise,” says Nevskaya, an assistant professor of marketing in the Olin Business School at Washington University in St. Louis who studies, among other things, how consumers form habits. “I was completely immersed.”

The next thing she knew, it was 4AM.

“It’s a win-win outcome for both the firm and consumers.”

Nevskaya’s recent research used data that a bot gleaned from World of Warcraft, a massively popular, multiplayer role-playing computer game set in a fantasy universe.

Blizzard Entertainment launched the game in 2004, and within seven years it had more than 10 million subscribers worldwide. The average World of Warcraft player spends 12.5 hours per week in the game, and more than 53 million people in the US played online games at least once a month in 2016.

Curbing screen time

Nevskaya and coauthor Paulo Albuquerque of INSEAD in France focused their investigation on three main actions that the game developer has at its disposal to manage consumers’ use of the game: redesigning content and in-game reward schedules, sending notifications to gamers, and imposing time limits on gameplay. In all, they analyzed a random sample of 402 gamers and nearly 15,000 gaming sessions.

They discovered this: When a firm changes its game’s rewards schedule and also limits how long gamers can play in a sitting, the firm can actually make more money—and players cut their screen time.

“It’s a win-win outcome for both the firm and consumers,” Nevskaya says. “Those actions led to higher revenues and a smaller share of people’s time devoted to gaming, curbing potentially excessive use of the product.”

“What’s good for the consumer is not necessarily bad for the company.”

The researchers found gamers’ slower consumption of content led to an increase in their long-term engagement with the product, which is based on subscriptions. At the time of the research, subscription fees were about 50 cents a day on a weekly or monthly automated payment plan.

“What’s good for the consumer is not necessarily bad for the company,” Nevskaya says.

Nevskaya and Albuquerque built an empirical model that mimics how consumers make choices so they could learn about gamers’ decisions—such as when to start and stop playing. Their approach allowed them to study consumers’ responses to product design, notifications, and rewards over time, as well as to identify people who display signs of habitual gaming. According to the study, more than two-thirds of gamers exhibit signs of habitual gaming, playing on average 100.8 minutes in every 24-hour period.

The data came from a software program that logged onto the game server every 5 to 10 minutes. It recorded which gamers’ avatars were present on the server at that moment, as well as each avatar’s current experience level and the content area in which it was playing.
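A data-collection loop of this kind can be sketched in a few lines. This is only an illustration of the polling pattern described above, not the actual bot; the paper does not document the real server interface, so the `fetch_online_avatars` callable and its record fields are hypothetical stand-ins.

```python
import time

def poll_game_server(fetch_online_avatars, interval_seconds=300, max_polls=10):
    """Periodically query a game server and log every avatar currently online.

    fetch_online_avatars is a stand-in callable returning a list of dicts such as
    {"avatar": "...", "level": 12, "zone": "..."} -- a hypothetical interface,
    since the real server API is not described in the study.
    """
    records = []
    for poll in range(max_polls):
        timestamp = time.time()
        for avatar in fetch_online_avatars():
            # One row per avatar per poll: who was online, at what level, and where.
            records.append({"poll": poll, "time": timestamp, **avatar})
        if poll < max_polls - 1:
            time.sleep(interval_seconds)
    return records

# Example with a fake server and no delay between polls:
fake_server = lambda: [{"avatar": "Thrall", "level": 60, "zone": "Orgrimmar"}]
log = poll_game_server(fake_server, interval_seconds=0, max_polls=3)
print(len(log))  # 3 rows: one avatar observed on each of three polls
```

Repeated snapshots like these let researchers reconstruct session start and stop times for each player, which is the raw material the authors' choice model works from.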

Yes, they found that altering in-game reward schedules and imposing time limits leads to shorter gaming sessions and longer subscriptions. But they also learned that notifications saying players should take a break don’t help.

Here’s the rub: Because a suggestion to take a break may arrive at a time when a gamer is not yet satiated with a gaming session and is in a “hot habit state,” in Nevskaya’s terms, it also may motivate the gamer to return quickly to the game—and reinforce the gaming habit. Notifications lead to a pattern of shorter but more frequent sessions, resulting in a significant increase in active gaming time for a large group of gamers, the authors discovered.

“Our paper addresses the important question of how to curb excessive screen usage, which has been a frequent concern among public policymakers,” Nevskaya says.

Why it matters

Since 2014, the researchers note, the World Health Organization (WHO) has been evaluating the public health implications of excessive use of the internet, computers, smartphones, and other devices. Last year, the WHO included “gaming disorder” in the 11th edition of the International Classification of Diseases as a clinically recognizable and significant syndrome when “the pattern of gaming behavior is of such a nature and intensity that it results in marked distress or significant impairment in personal, family, social, educational, or occupational functioning.”

With about $19.9 billion in sales in 2016 worldwide, the online video gaming industry especially benefits from new technologies that allow almost-constant online connectivity. Online and mobile games and social media platforms have spent significant resources to increase product use through customized content, frequent promotions, and virtual rewards, the authors note.

“‘Gamification’ of products is a common practice, which makes understanding of how consumers react to game-like product features increasingly important,” they write.

“We’re not claiming that gaming is harmful. It can be a wonderful pastime,” Nevskaya says. “But it’s potentially harmful when enjoyed in excess.”

As a marketing expert, she says she feels a responsibility to consumers.

“We can agree that marketing has become very sophisticated” in large part because of the massive troves of data now available to companies, she says. “Academics as well as responsible businesses should help consumers navigate the field safely.”

The research appears in the Journal of Marketing Research.

Source: Washington University in St. Louis

The post Online games could limit screen time while making money appeared first on Futurity.

Certain gut bacteria may prevent rotavirus infection

Fri, 2019-10-11 13:15

The presence of certain gut bacteria in the digestive tract can prevent and cure rotavirus infection in mice, research finds.

Rotavirus is the leading cause of severe, life-threatening diarrhea in children worldwide.

The findings, published in the journal Cell, may explain why rotavirus causes severe, life-threatening disease in some people and only mild disease in others. The work could lead to possible treatments and preventive measures for rotavirus infection. There are no existing treatments for rotavirus besides administering fluids to avoid dehydration.

Rotavirus is a highly contagious virus that can cause severe diarrhea, vomiting, fever, abdominal pain, and death. Rotavirus infection occurs as a result of direct contact with an infected person or exposure to their fecal matter. Infants and young children are the most susceptible to this disease, and the illness can lead to severe dehydration, hospitalization, or even death. Rotavirus leads to an estimated 215,000 deaths worldwide in children younger than 5 years old, according to the Centers for Disease Control and Prevention.

Susceptibility to rotavirus is not well understood and can vary vastly among individuals and regions. Rotavirus causes relatively mild disease in both developed countries and poor regions of Central America, but it kills many thousands of children each year in poor regions of sub-Saharan Africa and India. Even within particular societies, rotavirus causes mild disease in some individuals and severe, life-threatening disease in others.

“This study shows that one big determinant of proneness to rotavirus infection is microbiota composition,” says Andrew Gewirtz, senior author of the study and a professor in the Institute for Biomedical Sciences at Georgia State University.

Gut microbiota is the collective term for microorganisms living in the intestinal tract. Microbiota composition was known to influence bacterial infection, but a role for microbiota in influencing viral infection was totally unknown.

“This discovery was serendipitous,” Gewirtz says. “We were breeding mice and realized that some of them were completely resistant to rotavirus whereas others were highly susceptible. We investigated why and found that the resistant mice carried distinct microbiota. Fecal microbiota transplant transferred rotavirus resistance to new hosts.”

Further investigation revealed that a predominant determinant of resistance to rotavirus was the presence of a single bacterial species called Segmented Filamentous Bacteria, or SFB. The researchers found SFB reduces rotavirus infectivity and protects against rotavirus by causing epithelial cells to shed and be replaced with new, uninfected cells.

“It’s a new basic discovery that should help understand proneness to rotavirus infection,” Gewirtz says. “It does not yield an immediate treatment for humans, but provides a potential mechanism to explain the differential susceptibility of different populations and different people to enteric viral infection. Furthermore, it may lead to new strategies to prevent and treat viral infections.”

First author Zhenda Shi, who now works in the Centers for Disease Control and Prevention’s rotavirus branch, is investigating whether gut microbiota can explain differences in sensitivity to rotavirus infection in humans.

Coauthors of the study are from Georgia State; Fudan University in Shanghai, China; Children’s Hospital of Philadelphia; Washington University School of Medicine; Vanderbilt University School of Medicine; Oregon Health Sciences University; and the University of Pittsburgh School of Medicine.

Funding came from the National Institutes of Health’s National Institute of Diabetes and Digestive and Kidney Diseases, and National Institute of Allergy and Infectious Diseases.

Source: Georgia State University

The post Certain gut bacteria may prevent rotavirus infection appeared first on Futurity.