Futurity.org

Research news from top universities.

Why we tend to go big when we undermine our goals

Mon, 2018-08-13 19:30

New research clarifies why we tend to really go for it when violating a personal goal, such as saving money or sticking to a diet.

When consumers contemplate violating a personal goal (e.g., cheating on a diet or overspending a budget), they often seek to make the most of that violation by choosing the most extreme option, according to the research.

“This implies that one will not ‘blow one’s diet on Twinkies’…”

In the experiments, Kelly Goldsmith, associate professor of marketing at the Vanderbilt University Owen Graduate School of Management, and her coauthors manipulated whether or not participants adopted a goal, such as losing weight or saving money. Next, the researchers presented participants with a choice between options that conflicted with that goal, such as indulgent desserts or luxury hotel stays.

The researchers found that participants who adopted the goal (vs. those who did not) tended to choose the most indulgent option: People trying to save money chose the more expensive of two resorts for a hypothetical vacation, and those trying to lose weight chose the higher calorie doughnut. At first, these results might seem counter-intuitive, but the authors conducted further experiments to elucidate the thought process behind these seemingly contradictory decisions.

The authors found that individuals who have a goal, such as weight loss, feel conflicted when choosing among options that violate that goal, such as two doughnuts. This conflict causes the individuals to seek out the option that justifies a violation of their goal, which is often the more indulgent choice—i.e. the doughnut with the chocolate icing, rather than the plain treat.

“This implies that one will not ‘blow one’s diet on Twinkies’ (i.e., a commonplace, low-cost, low-quality indulgence) but instead will be more likely to do so for an outcome that maximizes indulgence and is hence ‘worth it,’” note Goldsmith and coauthors, professor Ravi Dhar and doctoral candidate Elizabeth Friedman of Yale University.


In other words, if either choice causes you to violate the goal, you might as well make the most of the violation and choose the most indulgent option. These findings have practical implications for both retailers and consumers. Retailers that fall at extreme ends of the market—either very indulgent or very healthy, for example—might benefit from offering even more extreme options to take advantage of consumers experiencing conflicting goals, since these individuals are more likely to choose the most extreme option available to them.

On the other hand, consumers can take steps to avoid a situation where they might be forced to choose between two options that violate one of their goals: A dieter can bring a healthy lunch to work instead of debating between unhealthy options at the company cafeteria. If the situation is unavoidable, remaining aware of this counter-intuitive decision-making tendency can prompt people to choose the “lesser of the two evils” instead of the most justifiable (and therefore most extreme) option.


“For example, dieters can be trained to view choice options strictly through the lens of calorie content, as opposed to other attributes,” Goldsmith says. “If you strip your choices down to a comparison between two numbers [i.e. calories]—and your goal offers a rule for which number is better—the choice is a lot easier to make, and you are a lot less susceptible to these biases in decision making.”

The study will appear in the Journal of the Association for Consumer Research.

Source: Kara Sherrer for Vanderbilt University

The post Why we tend to go big when we undermine our goals appeared first on Futurity.

Experiment nails down new properties of water

Mon, 2018-08-13 19:18

Researchers have uncovered new molecular properties of water.

Liquid water is an excellent transporter of its own autoionization products; that is, the charged species obtained when a water molecule (H2O) is split into protons (H+) and hydroxide ions (OH−). This remarkable property of water makes it a critical component in emerging electrochemical energy production and storage technologies such as fuel cells; life itself would not be possible if water did not possess this characteristic.

Water consists of an intricate network of weak, directional interactions known as hydrogen bonds. For nearly a century, scientists thought that the mechanisms by which water transports the H+ and OH− ions were mirror images of each other—identical in all ways except for the directions of the hydrogen bonds involved in the process.

Current state-of-the-art theoretical models and computer simulations, however, predict a fundamental asymmetry in these mechanisms. If correct, this asymmetry could be exploited in different applications by tailoring a system to favor one ion over the other.

Experimental proof of the theoretical prediction has remained elusive because of the difficulty in directly observing the two ionic species. Different experiments have only provided glimpses of the predicted asymmetry.

In the new work, researchers devised a novel experiment to nail down this asymmetry. The approach involved cooling water down to its so-called temperature of maximum density, where the asymmetry was expected to manifest most strongly, allowing the researchers to detect it carefully.

Chilled, not frozen

It is common knowledge that ice floats on water and that lakes freeze from the top. This happens because water molecules in ice pack into a structure that is less dense than liquid water, a manifestation of water’s unusual properties: the density of liquid water increases just above the freezing point and reaches a maximum at four degrees Celsius (39 degrees Fahrenheit), the so-called temperature of maximum density. This difference in density dictates that liquid water always sits below the ice.

By cooling water down to this temperature, the team used nuclear magnetic resonance methods (the same type of approach used medically in magnetic resonance imaging) to show that the difference in the lifetimes of the two ions reaches a maximum value (the longer the lifetime, the slower the transport). Accentuating the difference in lifetimes made the asymmetry glaringly clear.

Each water molecule consists of one oxygen atom and two hydrogen atoms, but the hydrogen atoms are relatively mobile and can hop from one molecule to another; it is this hopping that renders the two ionic species so mobile in water.

In seeking explanations for the temperature-dependent characteristics, the researchers focused on the speed with which such hops can occur.

Prior research had indicated that two main geometrical arrangements of hydrogen bonds (one associated with each ion) facilitate the hops. The researchers found that one of the arrangements led to significantly slower hops for OH− than for H+ at four degrees Celsius.


Because this is also the temperature of maximum density, the researchers reasoned that the two phenomena had to be linked. In addition, their results showed that the molecules’ hopping behavior changed abruptly at this temperature.

‘Intense interest’

“The study of water’s molecular properties is of intense interest due to its central role in enabling physiological processes and its ubiquitous nature,” says Alexej Jerschow, a professor of chemistry at New York University and the corresponding author of the study.

“The new finding is quite surprising and may enable deeper understanding of water’s properties as well as its role as a fluid in many of nature’s phenomena,” Jerschow says.

“It is gratifying to have this clear piece of experimental evidence confirm our earlier predictions,” says Mark Tuckerman, a professor of chemistry and mathematics, who was one of the first researchers to predict the asymmetry in the transport mechanisms and the difference in the hydrogen bond arrangements.

“We are currently seeking new ways to exploit the asymmetry between H+ and OH− transport to design new materials for clean energy applications, and knowing that we are starting with a correct model is central to our continued progress.”


The team’s findings will also affect a large swath of other research, ranging from the study of enzyme function in the body to understanding how living organisms can thrive in harsh conditions, including sub-freezing temperatures and highly acidic environments.

The National Science Foundation and the MRSEC Program of the National Science Foundation funded the research.

Source: NYU


Music lifts well-being for people in palliative care

Mon, 2018-08-13 19:10

Hospice and palliative care patients who listen to live music in their rooms as part of their treatment report feeling better both emotionally and physically, a new study reports. They also request fewer opioid-based medications, according to the study.

Doctors working with seriously ill patients at Kent Hospital and Women and Infants Hospital in Rhode Island gave them the option of having a flutist play music in their rooms as part of their palliative care, which focuses on improving quality of life and relieving symptoms for people with serious illnesses.

The idea was that music might help these patients contend with symptoms like pain and stress and improve their moods. Studies show that patients who engage with visual arts, creative writing, and other expressive activities report improved emotional and psychological well-being, according to the study.

A whole person

“The field of palliative care is very mindful of the patient as a whole person, looking out for their spiritual and emotional well-being in addition to their physical health,” says Cynthia Peng, a third-year medical student at Brown University’s Warren Alpert Medical School and lead author of the study, which appears in the American Journal of Hospice and Palliative Medicine.

The researchers conducted the study in 2017 with 46 patients. During the study, palliative care physicians integrated music as supplementary treatment into routine visits.

“…that in this high-symptom burden population that something non-pharmacological could influence their own [opioid] usage is pretty remarkable.”

Peng, who is trained as a flutist, played the music. Often, the physician introduced Peng to patients during consultation and she typically played for the patient and any family or friends present shortly after that interaction.

Before coming to Brown, Peng was a musician with the Georgetown Lombardi Arts and Humanities Program, which uses music, writing, dance, and visual arts as part of therapeutic patient care at the MedStar Georgetown University Hospital.

Patient-centered intervention

Patients could request particular songs or styles of music, or leave the choice up to Peng. She had a wide variety of music on hand for the patients’ various needs and preferences, including classical music, folk songs, oldies, hymnals, and jazz. Having that choice ensured that the intervention was patient-centered, Peng says.

Even the option to decline or accept the intervention was a way of putting the patients, who relinquish so much control when they’re in the hospital, in charge, she adds.

“I want to spend as much time as possible with my kids and grandkids… I am now getting discharged in a good mood.”

“A lot of these patients are inpatient for long periods of time,” Peng says. “People—family, friends—may visit, but for the majority of the time they’re kind of either passing time or watching TV.

“Having an intimate, enjoyable experience for the patients is really valuable, especially when they’re facing a lot of difficult decisions, symptom-management issues, maybe facing the end of life.”

Researchers tracked both patients’ opioid use and their self-reported states before and after Peng treated them to a mini concert in their rooms.

Patients who opted for the music intervention filled out a six-question version of the Edmonton Symptom Assessment Scale, which is designed to get a patient’s perspective on their symptoms. They answered questions about pain, anxiety, depression, nausea, shortness of breath, and overall feelings of well-being before and after the music intervention.

Patients or their surrogates also answered four open-ended questions about their experience with the music after hearing it.

What patients said

The researchers say the responses could be grouped into five general categories: spirituality, comfort, connection, escape, and reflections.

“The music made me think of God, granting me peace, strength and hope,” one patient wrote, while another said of the music, “It put me in a quiet pasture.”

Other patients said the music reminded them of playing music for their children years ago or choosing music to accompany their painting practice. One wrote, “I want to go home in a happy mood. I want to spend as much time as possible with my kids and grandkids as possible. I am now getting discharged in a good mood.”


Of the 46 patients in the study, 33 used opioids, and the researchers tracked their levels of use before and after the music intervention.

Unlike in the broader patient population, opioid use is not generally considered problematic for palliative care patients, who must cope with many symptoms from their illnesses, or for hospice patients, who are typically in the end stages of their lives, Peng says.

These patients often require high doses, and although one might expect opioid use to increase after the physician visit, the study’s findings suggest a trend toward decreased use.

While the study was performed with a limited timeframe and patient census, Peng says, “To demonstrate that in this high-symptom burden population that something non-pharmacological could influence their own usage is pretty remarkable.”

Peng says she hopes that hospital and clinic administrators will consider incorporating music and other interventions in patient care.


“Classical music shouldn’t just be for concert halls. It should be something that everyday people can participate in, take part in. I hope more hospitals and healthcare settings can make music accessible as a source of comfort for patients and their families.”

Additional coauthors are from Care New England. The George A. and Marilyn M. Bray fund for Medical Humanities through the Warren Alpert Medical School of Brown University funded the work.

Source: Brown University


How Asian-American firstborns see their family role

Mon, 2018-08-13 19:05

When compared to European Americans, Asian-American firstborns feel the additional burden of being cultural brokers and having to take care of their immigrant parents and young siblings at the same time, research suggests.

The study explores how both groups—ages 18 to 25—viewed sibling relationships, their birth order, and family relations.

Several positive themes of siblingship emerged from the interviews: feeling supported, appreciated, and comforted during interactions with their siblings. Some participants disclosed that siblings alleviate pressure from parents that might otherwise cause conflict.

Along birth-order themes, firstborns from both groups felt motivated to become role models for their younger siblings through high achievement, confidence, and good behavior. However, for some Asian-American later-borns, the pressure to measure up also stemmed in part from parents’ tendency to compare their children, according to the study.

For firstborn Asian Americans, the sibling caregiving and cultural brokering responsibility—regardless of gender—created dual pressure, the study shows. In Asian cultures, the oldest son traditionally has greater obligations in the family, but more firstborn females are taking on these roles—even when there are young male siblings in the household, says lead author Kaidi Wu, a doctoral candidate in social psychology at the University of Michigan.

Asian-American families may rely more heavily on the firstborn than their counterparts for various reasons. But the increased family obligations may have an adverse impact on the older Asian-American siblings, such as greater depression and anxiety, the study cautions.


Nevertheless, Wu says having siblings can benefit Asian-American firstborns when they struggle with their parents’ more traditional cultural perspectives (such as the expectation to marry a Chinese person because one is Chinese) and have younger siblings to relate to. This finding contrasts with previous research in which older siblings closely resemble parents’ stance on Asian values and differ from later-borns, who acculturate more easily into mainstream American culture.

The findings appear in the Journal of Family Issues. The study’s other authors are from UCLA, the University of Michigan, and the Toronto District School Board.

Source: University of Michigan


Killing of black suspects is more than a ‘white police problem’

Mon, 2018-08-13 19:00

Police officers of all races—not just white ones—disproportionately kill African American suspects, according to a new study that points to a need for policing and legal reforms.

“…white officers are no more likely to use lethal force against minorities than nonwhite officers.”

Researchers looked at every use of deadly force by officers across the United States and discovered that the killing of black suspects is a police problem, not a white police problem. Further, the killing of unarmed suspects of any race is extremely rare.

“There might be some bad apples in the police department, but white officers are no more likely to use lethal force against minorities than nonwhite officers,” says Charles Menifield, dean of the School of Public Affairs and Administration (SPAA) at Rutgers University-Newark, and lead author of the paper, which appears in Public Administration Review.

“Still, the killings are no less racist but will require a very different set of remedies if we are to change the culture and stop this from happening.”

For the study, Menifield and colleagues created a database of all confirmed uses of deadly force by police in the United States in 2014 and 2015, the most recent years for which sufficient data were available.

The findings show that police kill African Americans at more than twice the rate their share of the population would predict: while only about 12 percent of the American population is black, 28 percent of the people killed during this two-year period were black, according to the research, which also found that Latinos were killed slightly more often than would be expected and white citizens less often.
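That “more than twice” figure follows directly from the two percentages reported above; a minimal sketch of the arithmetic, using only the article’s numbers:

```python
# Representation ratio: a group's share of police-killing victims divided by
# its share of the population. A ratio above 1 means the group is killed
# more often than its population share alone would predict.
# The percentages are those reported in the study (2014-2015 data).
def representation_ratio(share_of_victims: float, share_of_population: float) -> float:
    return share_of_victims / share_of_population

print(round(representation_ratio(28.0, 12.0), 2))  # prints 2.33 -- "more than twice"
```

The same ratio applied to the Latino and white figures in the study would show the "slightly more" and "less often" patterns the researchers describe.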

The study also found that less than 1 percent of victims of police killings were unarmed. Across all racial groups, 65.3 percent of those killed possessed a firearm at the time of their death.

“The gun could be in their car, or on them, but it was there at the time they were killed,” says Menifield. “This shouldn’t be surprising because of the availability and ease of getting a gun in the United States.”

High-profile killings of unarmed black men in the last few years—like that of 18-year-old Michael Brown in Ferguson, Missouri, in 2014, which gave rise to the Black Lives Matter movement—have led many to speculate that white officers may target nonwhite suspects with lethal force, Menifield says.

The new research shows, however, that white police officers actually kill black and other minority suspects at lower rates than would be expected if killings were randomly distributed among officers of all races.

The disproportionate killing of black men occurs, according to the researchers, because of institutional and organizational racism in police departments and the criminal justice system’s targeting of minority communities with policies—like stop and frisk and the war on drugs—that have disproportionately destructive effects on those communities.

“The question of the basic causes of racial disparities in police killings has profound real-world implications for policing a diverse society,” Menifield says, suggesting that appropriate reforms for a fundamentally institutional problem would target racism in police department practices and in criminal policies that result in the over-policing of minority populations.


“Today, we have politicians who are arguing for tougher stances on immigration,” Menifield says. “These things have a way of trickling down to other things like tougher sentences on crime and policies that have a disparate impact on minority communities.”

Addressing the problem will not be easy. Menifield says the US Department of Justice needs to enforce the Death in Custody Reporting Act of 2013 that requires police organizations to report data on police killings. This data will allow researchers to thoroughly investigate each case and determine if other variables are driving police behavior.


In addition, he says, police departments need to bring in external reviewers to examine all of their institutional practices including hiring, promotions, and training. The long‐running racial discrepancies in the way that police officers apply force to suspects have significantly eroded trust between law enforcement and the public whom they serve, the researchers argue.

“There is definitely a problem when one race of people are being killed by police at much higher rates than other populations,” Menifield says. “This unfortunate state of affairs is unlikely to improve until fundamental changes in public policy and policing are undertaken.”

Source: Rutgers University


Past space weather may boost prep for threats to Earth

Mon, 2018-08-13 18:54

Looking back at historic space weather may help us understand what’s coming next.

Space weather can disrupt electronics, aviation and satellite systems, and communications. It depends on solar activity, but because activity differs from one solar cycle to the next, the overall likelihood of space weather events is difficult to forecast.

“…there is an underlying pattern to [the likelihood of extreme space weather events], which does not change.”

In their new work, researchers charted the space weather in previous solar cycles across the last half century, and discovered an underlying repeatable pattern in how space weather activity changes with the solar cycle. The findings will allow better understanding of and planning for space weather.

The sun goes through solar cycles around every 11 years, during which time the number of sunspots increases to the maximum point (the “solar maximum”). More solar activity means more solar flares, which in turn can mean more extreme space weather at Earth.

Each solar cycle has a different duration and peak activity level, and, as a consequence, the climate of Earth’s space weather has been different at each solar maximum. The most extreme events are also the least frequent, so it is harder to build up a statistical picture of how likely they are to occur.

Over the last five solar cycles (more than fifty years), ground- and space-based observations have almost continuously monitored the drivers of space weather—the sun and the solar wind—and the response seen at Earth. Looking at these data, researchers found that space weather and the activity of the sun are not entirely random, a regularity that may constrain how likely large space weather events are in future cycles.


“We analyzed the last five solar maxima and found that although the overall likelihood of more extreme events varied from one solar maximum to another, there is an underlying pattern to their likelihood, which does not change,” says project leader Sandra Chapman, professor from the Centre for Fusion, Space, and Astrophysics at the University of Warwick.

“If this pattern persists into the next solar maximum, our research, which constrains how likely large events are, will allow better preparation for potential space weather threats to Earth.”

Source: University of Warwick


Community-based conservation ups wildlife populations

Mon, 2018-08-13 15:50

Putting land management in the hands of local communities helps the wildlife within, according to new research.

A new paper in the Journal of Wildlife Management demonstrates the positive ecological impacts of a community-based wildlife conservation area in Tanzania.

“Community-based natural resource management has become one of the dominant paradigms of natural resource conservation worldwide,” says study author Derek E. Lee, an associate research professor at Penn State and principal scientist at the Wild Nature Institute.

“This type of strategy transfers the resource management and user rights from central government agencies to local communities. The impact of these projects on wildlife is rarely rigorously assessed, so we compared wildlife densities inside and outside the community conservation area,” Lee explains.

“My data demonstrate that one of the first areas of this type in Tanzania has had positive ecological outcomes in the form of higher wildlife densities and higher giraffe population growth,” says Lee.

Community conservation

In Tanzania, efforts to decentralize wildlife management to local communities occur through the creation of Wildlife Management Areas, whereby several villages set aside land for wildlife conservation in return for a share of tourism revenues from these areas.

Currently, there are 19 Wildlife Management Areas open in Tanzania, encompassing 7 percent (6.2 million hectares) of the country’s land area, with 19 more planned. Tourism in Tanzania generates around $6 billion annually, about 13 percent of the country’s total gross domestic product, so villages have a strong incentive to participate in these management areas.

“For six years, I studied the Burunge Wildlife Management Area in Tanzania, which was formally established in 2006 and added increased wildlife protections in 2015,” says Lee. He observed higher numbers of wildlife inside the protected area compared to the village lands just outside the area as well as lower densities of livestock, including cattle, sheep, and goats.

Lee also observed higher numbers of wild ungulates—hooved mammals—and lower numbers of livestock in the management area after the increased wildlife protections began.

“This suggests that the specific management activities implemented in 2015 have a positive effect on wildlife within the Burunge Wildlife Management Area,” says Lee. “These include performing anti-poaching activities to protect wildlife, reducing wood cutting, preventing livestock encroachment, and providing training and equipment to village rangers so that they can perform these activities.”


The change to management activities also improved survival and population growth of giraffes within the management area. Lee did not observe any change in giraffe demographics outside of the management area in the adjacent Tarangire National Park over the same time period.

Tailoring solutions

This study highlights the usefulness of monitoring wildlife to evaluate specific management strategies as well as the general concept of community-based natural resource management. In particular, locally based monitoring schemes could lead to more sustainable community-based conservation.

“We know from this and previous studies that these management areas can have positive effects on wildlife,” says Lee. “But there have been some social and economic critiques of wildlife management areas.

“For example, there are higher incidences of poverty around protected areas compared to other rural areas. Although it can be challenging for community-based natural resource management to achieve both conservation and human development goals, the concept appears to be the best opportunity for Tanzania to retain its place as one of the most famous and profitable wildlife tourism destinations while also sustainably developing local communities.”


The PAMS Foundation, African Wildlife Foundation, Rufford Foundation, Columbus Zoo, Sacramento Zoo, Tierpark Berlin, Living Desert Zoo, Cincinnati Zoo, Tulsa Zoo, Greater Sacramento Chapter of American Association of Zoo Keepers, World Giraffe Foundation, and Save the Giraffes funded the work.

Source: Penn State


There’s zero chance you’ll be eaten by a megalodon

Mon, 2018-08-13 12:38

Scientists are officially debunking the myth that megalodon sharks still exist. The whale-eating monsters became extinct about 2.6 million years ago.

“I was drawn to the study of Carcharocles megalodon’s extinction because it is fundamental to know when species became extinct to then begin to understand the causes and consequences of such an event,” says Catalina Pimiento, a doctoral candidate at the Florida Museum of Natural History at the University of Florida.

“I also think people who are interested in this animal deserve to know what the scientific evidence shows, especially following Discovery Channel specials that implied megalodon may still be alive.”

Published in PLOS ONE, the study represents the first phase of Pimiento’s ongoing reconstruction of megalodon’s extinction. Because modern top predators, especially large sharks, are declining significantly worldwide amid the current biodiversity crisis, the study serves as a basis for better understanding the consequences of these changes.

“When you remove large sharks, then small sharks are very abundant and they consume more of the invertebrates that we humans eat,” Pimiento says. “Recent estimations show that large-bodied, shallow-water species of sharks are at greatest risk among marine animals, and the overall risk of shark extinction is substantially higher than for most other vertebrates.”

Megalodon mania

Pimiento plans to further investigate possible correlations between changes in megalodon’s distribution and the evolutionary trends of marine mammals, such as whales and other sharks.

“When we calculated the time of megalodon’s extinction, we noticed that the modern function and gigantic sizes of filter feeder whales became established around that time,” she says. “Future research will investigate if megalodon’s extinction played a part in the evolution of these new classes of whales.”

The slowly unraveling details of megalodon’s extinction and various aspects of its natural history have consumed Pimiento’s research for the past six years, including ongoing analysis of megalodon’s body size and a 2010 PLOS ONE study that suggested that Panama served as a nursery habitat for the species.

For the new study, researchers used databases and scientific literature of the most recent megalodon records and calculated the extinction date using a novel mathematical model proven reliable in recent experimental testing by study coauthor Christopher F. Clements of the Institute of Evolutionary Biology and Environmental Studies at the University of Zurich.

The study will not only serve as a key reference for debunking the myth that megalodon still exists; its new methods will also influence future scientific research on extinct animals and plants, says Jorge Velez-Juarbe, a vertebrate paleontologist with the Natural History Museum of Los Angeles County.

“The methodology that the authors used had only been previously employed to determine extinction dates in historical times, such as to estimate the extinction date of the dodo bird,” he says.

“In this work, scientists applied that same methodology to determine the extinction of an organism millions of years ago, instead of hundreds. It’s a new tool that paleobiologists didn’t have, or rather had not thought of using before.”

Source: University of Florida (Originally published October 23, 2014)

The post There’s zero chance you’ll be eaten by a megalodon appeared first on Futurity.

For songbirds, 1 way to learn is best for generalizing

Mon, 2018-08-13 11:03

While zebra finches may learn faster through observing other birds, they’re more able to generalize that knowledge when they acquire it through trial and error, according to new research.

Children are constantly learning new things, but whether they find it easy or hard to generalize what they have learned and apply it to new situations can depend on how they learned it. It is much the same for songbirds. In their first few months of life, songbirds, too, must learn a great deal, such as the characteristic song of their species. And like people, birds also learn in different ways.

“When observing, the birds may focus on a large number of song details, many of which are irrelevant for solving the problem at hand…”

In their experiments, the researchers showed that zebra finches can learn by observing fellow members of their species. The birds had to learn through trial and error to discriminate between two classes of birdsong, one long and one short.

Without any special preparation, the median number of repetitions it took for the birds to master the task was 4,700. But if the finches observed other finches as they learned this task, then it took them just 900 repetitions.

In this experimental setup, 800 repetitions are required for statistical reasons simply to evaluate the animals’ performance. This means that the observing birds mastered the task almost from the very beginning.

The experiment

For the experiment, the scientists used two adjacent birdcages with a partition between them and a zebra finch in each cage. One of the finches had to use trial and error to learn to discriminate between two classes of birdsong. The other bird observed the learning process.

Each of the birds could see the other only by sitting on a particular perch in the cage next to a window in the partition. Because zebra finches are social animals, they were naturally drawn to this particular perch.

If the “experimenter” finch flew to that perch, it would hear one of ten variations of a zebra finch song. The samples had minimal differences in length, which was the defining property for splitting the song samples into two classes: Class A contained the five shorter song samples (lasting 0.9 to 1.0 seconds), and Class B had the five longer ones (1.03 to 1.13 seconds). One second after researchers played a sample from Class B, the team administered an air-puff to the bird.

If the bird was able to differentiate between the two classes, then it could avoid the slightly unpleasant puff of air. In this way, the scientists were able to determine whether a bird had learned the task or not.
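The discrimination the birds had to learn amounts to a simple duration threshold. A toy sketch of an ideally trained bird’s rule (the 1.015-second cutoff is a hypothetical midpoint between the longest Class A song, 1.0 s, and the shortest Class B song, 1.03 s; the function names are illustrative):

```python
THRESHOLD_S = 1.015  # hypothetical midpoint between the two classes

def classify_song(duration_s):
    """Class A songs last 0.9-1.0 s; Class B songs last 1.03-1.13 s."""
    return "B" if duration_s > THRESHOLD_S else "A"

def avoids_air_puff(duration_s):
    """A fully trained bird dodges the puff exactly when it hears a Class B song."""
    return classify_song(duration_s) == "B"
```

Because the two classes differ by as little as 0.03 seconds at the boundary, a bird cannot solve the task by attending to coarse song features; it has to extract duration specifically, which is the detail Hahnloser later argues trial-and-error learners focus on.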

Listen to audio from Class A: https://www.ethz.ch/content/dam/ethz/news/eth-news/2018/08/180813-wie-voegel-lernen/short_example_3.mp3

Listen to audio from Class B: https://www.ethz.ch/content/dam/ethz/news/eth-news/2018/08/180813-wie-voegel-lernen/long_example_8.mp3

Generalizing what they learned

In the next phase of the experiment, the researchers tested how well the zebra finches could solve a second, similar task, in which the birds had to distinguish between varying lengths of a different sample of birdsongs.

“Birds that learned a perceptual skill through trial and error were better able to generalize and adapt that skill to new situations…”

This revealed that birds that learned the first task using trial and error from the outset could solve the second task practically right away: It took them a median of just 800 attempts. By contrast, birds that learned the first task primarily through observation needed a median value of 3,600 attempts.

“These results indicate that in zebra finches, learning by trial and error is the more robust method,” summarizes Richard Hahnloser, professor at ETH Zurich and the University of Zurich. “Birds that learned a perceptual skill through trial and error were better able to generalize and adapt that skill to new situations than those that learned it through observation.”

Different benefits

Gagan Narula, a postdoc in Hahnloser’s group and lead author of the study, points to parallels with how children learn: “Active learning, which focuses on experimentation and trial and error, is becoming more and more prevalent in schools. In secondary schools, even maths is now being taught with the help of experiments.”

Still, “both methods have their advantages,” Hahnloser says, “but learning through observation is faster.” He notes that the Swiss education system deliberately incorporates both learning methods: lectures and observation on the one hand, and experiments, exercises, and homework on the other.

Neural computer models assisted the scientists in interpreting their findings. From these model calculations, the researchers surmise that although the act of observation involves many synapses between neurons in a finch brain, these are relatively weak. In contrast, trial-and-error learning involves a smaller number of synapses, but they are much stronger, leading to an enhanced ability to generalize.

“When observing, the birds may focus on a large number of song details, many of which are irrelevant for solving the problem at hand,” Hahnloser explains. “In the trial-and-error case, they remember fewer details but focus on the most prominent aspects of the song, such as its duration.”

Researchers still need to investigate whether different learning methods affect the brains of children and teenagers in the same way.

“In the past, research on zebra finches has repeatedly provided important clues and hypotheses for investigating neurobiological processes, in particular in relation to vocal learning,” says Hahnloser. “Our latest findings in finches also lead to hypotheses that could be studied in humans to better understand social learning processes.”

Source: ETH Zurich

The post For songbirds, 1 way to learn is best for generalizing appeared first on Futurity.

‘Catch-up rule’ could shorten lengthy baseball games

Mon, 2018-08-13 10:05

A new rule could shorten major league baseball games, make them more competitive, and, perhaps, boost fan interest at the same time, according to researchers.

If MLB’s intent is to truly speed up contests and increase the league’s competitiveness, a radical change is in order.

Recent rule changes, such as a limited number of mound visits, have done little to shorten baseball games—contests currently average three hours and five minutes. It’s likely these marathons will continue to have an impact on attendance, which was down 9 percent in the first half of the MLB season.

Moreover, the 2018 season suffers from a lack of competitiveness—at least in the American League, where only six teams are vying for five playoff spots. If MLB’s intent is to truly speed up contests and increase the league’s competitiveness, some contend, a more radical change is in order.

One proposal comes from two researchers who outline a rule change that would shorten major-league baseball games by almost half an hour while making contests closer and, perhaps, re-igniting interest in the game.

The proposal calls for reducing the number of outs the leading team gets during its at-bats.

Steven Brams, a professor of politics, and Aaron Isaksen, a researcher at the Tandon School of Engineering’s Game Innovation Lab at New York University, label their proposal the catch-up rule. It works like this: If a team is ahead or goes ahead during its turn at bat in an inning, it would have only two, rather than three, outs before its side retires.

To test the efficacy of their proposal, the researchers re-ran, under the catch-up rule, all MLB regular-season and post-season games (more than 100,000) from the past 50 years (1967-2017). Specifically, to estimate the length of nine-inning games under the catch-up rule, there are three cases:

  • Neither team is or goes ahead in its at-bat half inning: 3 + 3 = 6 outs required.
  • One team is or goes ahead in its at-bat half inning: 2 + 3 = 5 outs required.
  • Each team is or goes ahead in its at-bat half inning: 2 + 2 = 4 outs required.
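The three cases above reduce to a one-line per-inning rule, sketched here (the function name is hypothetical, not from the paper):

```python
def inning_outs(away_ahead, home_ahead):
    """Outs needed to complete one inning under the proposed catch-up rule.

    A batting team that is ahead, or goes ahead, during its half inning
    retires after 2 outs instead of the usual 3.
    """
    return (2 if away_ahead else 3) + (2 if home_ahead else 3)

print(inning_outs(False, False))  # 6: neither team is or goes ahead
print(inning_outs(True, False))   # 5: one team is or goes ahead
print(inning_outs(True, True))    # 4: both teams are or go ahead
```

Summing this quantity over nine innings is what yields the paper’s estimated drop from an average of 52.5 outs per game toward 45.9.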

If this provision had been in place for all of these games, excepting those that ended in extra innings or concluded in a tie, the following would have occurred:

  • A reduction in the number of outs over nine innings from an average of 52.5 to an average of 45.9—a 13-percent decrease, resulting in an estimated 24-minute reduction in the length of games;
  • A reduction in the winning team’s average margin of victory from 3.21 runs to 2.15 runs—a 33-percent decrease—making games more competitive;
  • An increase in the number of tied games (which would, in reality, move to extra frames) from 10,053 to 15,493—a 54-percent uptick. However, because only 14 percent of all analyzed games using the catch-up rule would go into extra innings, this increase would have only a minor effect on the average length of games, increasing it from 45.9 to 47.2 outs, or 1.3 outs—about 3 percent.

The researchers acknowledge that if MLB adopted the catch-up rule, teams would undoubtedly make adjustments to try to exploit it. Because being ahead with only two outs is a disadvantage for the at-bat team, that team will want to jump ahead by as much as possible before it incurs its second out.

For example, it would not make sense under the catch-up rule for an at-bat team, ahead and with one out in an inning, to use a sacrifice bunt to advance on-base runners, because then it would have to retire immediately after the sacrifice.

The catch-up rule may lead to other strategic adjustments, such as in batting order, the use of pinch hitters and runners, and conditions under which to steal bases.

“But there is not a great deal that teams can do to capitalize on the catch-up rule because success at hitting and stealing is highly individualistic,” the researchers write.

“Unlike other sports, a team’s performance is much less a function of team effort than, for example, scoring in football, basketball, or hockey. Accordingly, we would expect the adjustments that teams might make in, say, batting order would not have much effect. In short, the basic features of baseball are likely to stay the same.”

Source: NYU

The post ‘Catch-up rule’ could shorten lengthy baseball games appeared first on Futurity.

Sports stats show why lefties are rare

Mon, 2018-08-13 09:32

Left-handed people are relatively rare because of the balance between cooperation and competition in human evolution, according to a new study of sports data.

Representing only 10 percent of the general human population, left-handers have been viewed with suspicion and persecuted across history. The word “sinister” even derives from the Latin word for “left.”

Researchers at Northwestern University now report that a high degree of cooperation, not something odd or sinister, plays a key role in the rarity of left-handedness.

They developed a mathematical model that shows the low percentage of lefties is a result of the balance between cooperation and competition in human evolution.

In a fight, a left-hander would have the advantage in a right-handed world.

Professor Daniel M. Abrams and graduate student Mark J. Panaggio—both right-handers—are the first to use real-world data (from competitive sports) to test and confirm the hypothesis that social behavior is related to population-level handedness.

“The more social the animal—where cooperation is highly valued—the more the general population will trend toward one side,” says Abrams, an assistant professor of engineering sciences and applied mathematics at the McCormick School of Engineering and Applied Science.

“The most important factor for an efficient society is a high degree of cooperation. In humans, this has resulted in a right-handed majority.”

If societies were entirely cooperative, everyone would be same-handed, Abrams says. But if competition were more important, one could expect the population to be 50-50. The new model can accurately predict the percentage of left-handers in a group—humans, parrots, baseball players, golfers—based on the degrees of cooperation and competition in the social interaction.
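As a back-of-the-envelope illustration (not the authors’ actual model, which is a dynamical model fit to sports data), one can linearly interpolate between the two extremes Abrams describes: a fully cooperative society drives the minority share to zero, while a fully competitive one balances at 50-50. The function and the weight `c` below are hypothetical:

```python
def minority_fraction(cooperation_weight):
    """Toy equilibrium share of the minority-handed population.

    c = 1.0 means a fully cooperative society (0% minority);
    c = 0.0 means a fully competitive one (a 50-50 split).
    This linear interpolation only illustrates the qualitative claim.
    """
    c = min(max(cooperation_weight, 0.0), 1.0)
    return (1.0 - c) / 2.0

# A society weighted heavily toward cooperation lands near the observed 10%:
print(round(minority_fraction(0.8), 2))  # 0.1
```

The real model’s strength, per the article, is that the same cooperation-competition balance also reproduces the inflated lefty fractions seen in combat-like sports and the deflated one in golf.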

The model helps to explain our right-handed world now and historically: the 90-10 right-handed to left-handed ratio has remained the same for more than 5,000 years. It also explains the dominance of left-handed athletes in many sports where competition can drive the number of lefties up to a disproportionate level.

Cooperation favors same-handedness—for sharing the same tools, for example. Physical competition, on the other hand, favors the unusual. In a fight, a left-hander would have the advantage in a right-handed world.

Abrams and Panaggio turned to the world of sports for data to support their balance of cooperation and competition theory. Their model accurately predicted the number of elite left-handed athletes in baseball, boxing, hockey, fencing, and table tennis—more than 50 percent among top baseball players and well above 10 percent (the general population rate) for the other sports.

Identical twins, who share exactly the same genes, don’t always share the same handedness.

On the other hand, the number of successful left-handed PGA golfers is very low, only 4 percent. The model also accurately predicted this.

“The accuracy of our model’s predictions when applied to sports data supports the idea that we are seeing the same effect in human society,” Abrams says.

Handedness, the preference for using one hand over the other, is partially genetic and partially environmental. Identical twins, who share exactly the same genes, don’t always share the same handedness.

“As computers and simulation become more widespread in science, it remains important to create understandable mathematical models of the phenomena that interest us, such as the left-handed minority,” Abrams says.

“By discarding unnecessary elements, these simple models can give us insight into the most important aspects of a problem, sometimes even shedding light on things seemingly outside the domain of math.”

The James S. McDonnell Foundation supported this research, which is available in the Journal of the Royal Society Interface.

Source: Northwestern University (Originally published April 26, 2012)

The post Sports stats show why lefties are rare appeared first on Futurity.

Prototype could offer fresh water where wells run salty

Mon, 2018-08-13 09:19

A solar-powered distillation unit could desalinate water in arid coastal areas where wells are so depleted that seawater leaches into the freshwater supply.

The prototype can distill 150 liters (40 gallons) of water per day and can scale up to 3,000 liters (793 gallons). That’s equal to five truckloads of fresh water and a much more eco-friendly solution to the problem of insufficient access to fresh water, says Jose Alfaro, an assistant professor at the University of Michigan’s School for Environment and Sustainability.

“We developed this product with a particular community in mind, but we realized that it would be good for a number of communities,” he says.

Circular economy

The researchers designed the system for Tastiota, a small village in the Sonoran desert, which had been trucking in its water from a source 100 kilometers (62 miles) away.

After distillation, what’s left is brine that can be converted to salt and sold to other businesses nearby, creating a circular economy.

Other markets for the desalination unit include the global sunbelt located several degrees above and below the equator and hotels in coastal communities, Alfaro says.

“Hotels could use this to reduce their impact on the areas they are serving. A lot of the locations of these hotels are in fragile basins at risk of getting saline intrusion.”

Sustainable solution

Alfaro and Iulia Mogosanu, who graduated in the spring with an MBA, and Pablo Taddei, who graduated with a master’s degree in sustainable systems in 2017, wanted to create a sustainable solution to water scarcity issues in coastal communities where arid conditions, rising temperatures, and decreased precipitation due to climate change exacerbate the problem.

Over the past year, the team developed a proprietary process to remove salt from local water sources by leveraging solar radiation to power an innovative desalination technology. Early analysis indicates that the combination of concentrated solar power and single-stage distillation will provide a cost-effective and easy solution to water scarcity issues.

What makes this solution truly sustainable is the business component, the researchers say. This technology results in both a sellable byproduct, by processing brine into salt, and an improved capacity for coastal fishers to bring their catch to larger markets. This significantly improves the technology’s financial viability and provides a true market solution.

Taddei is a native of Hermosillo, Mexico, a municipality along the Sonora coast. Like many coastal communities, Hermosillo has been experiencing severe water scarcity due to saline contamination of its wells, a problem that low precipitation makes worse. Taddei was interested in finding a sustainable solution and began probing ideas that would desalinate the abundant supply of ocean water.

“I realized that the potential of such a solution had far-reaching implications globally. It was clear to me that the commercial potential of this idea was scalable to different conditions in different regions of the world,” Mogosanu says.

Alfaro traveled to Costa Rica last month to determine if there were communities that might benefit from the distillation unit. Working with a United Nations official, he plans to run a pilot program on a small island there where water arrives by boat.

To work, an area needs direct sunlight, a need for potable water, and a good governance system around the water that would run the desalination units after initial setup. The team also plans to market to communities in West Africa; in Lima, Peru; and along the coast of Chile.

Source: University of Michigan

The post Prototype could offer fresh water where wells run salty appeared first on Futurity.

Where do rare spiders live? Ask citizen scientists

Mon, 2018-08-13 08:09

Online data from citizen scientists may be key to mapping the distribution of rare species in the wild, a new study reports.

Species distribution maps are essential for understanding an organism’s ecology, forecasting how climate change and other activities will affect it, and planning management strategies.

But detailed knowledge of most species’ ranges is lacking, since the number of professional field biologists contributing information and specimens to museum collections—the principal source of range information—is small. Further, as the climate changes, ranges shift, and professional observations to track those shifts are quite rare.

Northern black widow. (Credit: Sean McCann via McGill)

“People who are excited about discovering where species live can contribute in meaningful ways to scientific progress…”

To test the potential of online citizen scientist observations to contribute to detailed species mapping, researchers combined information from online databases that include observations from citizen scientists and museum collections for two spider species, the northern black widow (Latrodectus variolus) and the black purse-web spider (Sphodros niger).

The researchers then modeled distribution for each, using a variety of statistical tests and modeling tools to remove questionable observations and increase the validity of the final range prediction model. They also compared current predicted range to historical range to test for the occurrence of range shift over time.

They predict that the purse-web spider’s range may have shifted since 1960, contracting in the southwest corner (including Arkansas, Missouri, and Tennessee), and expanding along the northern edge in Canada. The northern edge of the black widow’s range may also have increased over time.

The most important environmental factor determining predicted current range for the purse-web spider was the mean temperature of the coldest three months of the year, while for the black widow it was the mean temperature of the warmest three months.

“In our project, the citizen science data was essential in modeling distributions of spiders,” explains Christopher Buddle, professor at McGill University.

“People who are excited about discovering where species live can contribute in meaningful ways to scientific progress and this is exciting, important, and is changing how we do research.”

“Our models show the first reliable distribution maps of these two species,” says Yifu Wang of McGill University, though she notes that both species have potential distribution ranges beyond currently documented regions.

“The logical next step is to conduct sampling efforts in typical habitats associated with these species in our predicted range to further validate the models.

“We propose to call on citizen scientists by launching a monitoring project through a platform such as BugGuide and iNaturalist to produce a large-scale sampling effort. This would represent a rapid, low-cost, highly efficient, and innovative way to test these large scale predictive models.”

The findings appear in PLOS ONE.

Source: McGill University

The post Where do rare spiders live? Ask citizen scientists appeared first on Futurity.

Diverse data upend history of language’s evolution

Sun, 2018-08-12 19:47

New research could revise the history of how we think humans acquired language.

Scientists have held up a gene that may affect speech and language, FOXP2, as a “textbook” example of positive selection on a human-specific trait. In a new paper in the journal Cell, however, researchers challenge this finding.

In their analysis of genetic data from a diverse sample of modern people and Neanderthals, researchers saw no evidence for recent, human-specific selection of FOXP2.

“We’re interested in figuring out, on a genetic level, what makes us human…”

A paper from 2002 claimed there was a selective sweep relatively recently in human evolutionary history that could largely account for our linguistic abilities and even help explain how modern humans were able to flourish so rapidly in Africa within the last 50,000 to 100,000 years, says senior author Brenna Henn, a population geneticist at Stony Brook University and the University of California, Davis.

“…emphasizing diversity and inclusivity in data collection…clearly yields more accurate results.”

Henn was immediately interested in the dating of these mutations and the selective sweep. She wanted to re-analyze FOXP2 with larger and more diverse data sets, especially in more African populations.

Henn says that when researchers did the original 2002 work, they didn’t have access to the modern sequencing technology that now provides data on whole genomes, so they only analyzed a small fraction of the FOXP2 gene in about 20 individuals, mostly of Eurasian descent.

“We wanted to test whether their hypothesis stood up against a larger, more diverse data set that more explicitly controlled for human demography,” she says.

FOXP2 is highly expressed during brain development and regulates some muscle movements aiding in language production. When the gene isn’t expressed, it causes a condition called specific language impairment in which people may perform normally on cognitive tests but cannot produce spoken language. FOXP2 has also been shown to regulate language-like behaviors in mice and songbirds.

“In the past five years, several archaic hominin genomes have been sequenced, and FOXP2 was among the first genes examined because it was so important and supposedly human-specific,” says first author Elizabeth Atkinson of Stony Brook University and the Broad Institute of Harvard and MIT. “But this new data threw a wrench in the 2002 paper’s timeline, and it turns out that the FOXP2 mutations we thought to be human-specific, aren’t.”

Atkinson and her colleagues assembled mostly publicly available data from diverse human genomes—both modern and archaic—and analyzed the entire FOXP2 gene while comparing it to the surrounding genetic information to better understand the context for its evolution. Despite running a series of different statistical tests, they found no evidence that positive selection had occurred at FOXP2.

The researchers hope that this paper will serve as a template for other population geneticists to conduct similar projects on human evolutionary history in the future.

“We’re interested in figuring out, on a genetic level, what makes us human,” Henn says. “This paper shows how important it is to use a diverse set of humans in studying the evolution of all of us as a species.

“There’s a severe Eurocentric bias in a lot of medical and other scientific studies, but we’ve found a scientific impetus for emphasizing diversity and inclusivity in data collection because it clearly yields more accurate results.”

The National Institutes of Health and a Terman Fellowship funded the research.

Source: UC Davis

The post Diverse data upend history of language’s evolution appeared first on Futurity.

Older adults on dialysis face higher risk for dementia

Sun, 2018-08-12 19:00

Older kidney disease patients who are sick enough to require blood-filtering dialysis have a substantially higher risk of dementia, including Alzheimer’s disease, a new study suggests.

“The dementia risk in this population seems to be much higher than what we see among healthy community-dwelling older adults,” says lead author Mara McAdams-DeMarco, assistant professor of epidemiology at Johns Hopkins University’s Bloomberg School of Public Health.

The findings suggest that doctors should be doing more to monitor cognitive decline among older dialysis patients, researchers say.

“The high incidence of dementia seems to be overlooked in this population,” McAdams-DeMarco says.

Blood flow to the brain

Cognitive decline and dementia, including Alzheimer’s disease, are largely age-related and relatively common in the elderly. Research suggests, however, that kidney disease appears to worsen the problem.

Studies over the past two decades have found evidence that as kidney function declines, cognitive functions are apt to decline as well. One recent study in dialysis patients found that this kidney-related cognitive decline was particularly noticeable for executive functions such as attention, impulse control, and working memory.

The precise biological mechanism linking kidney disease to brain problems is not clear, but kidney disease itself has been linked to poor blood flow in the brain, so researchers suspect that as a key factor.

To get a better understanding of the dementia problem among elderly patients with advanced kidney disease, researchers examined a large national kidney disease registry, focusing on 356,668 Medicare patients older than 66 who initiated dialysis due to end-stage kidney disease from 2001 to 2013.

Their analysis aimed mainly at estimating the risk of a dementia diagnosis within a given period after initiating dialysis. For female patients, the estimated risk was 4.6 percent for a dementia diagnosis within a year, 16 percent within 5 years, and 22 percent—a more than one in five chance—within 10 years. For males, the corresponding figures were slightly lower at 3.7, 13, and 19 percent.

Alzheimer’s disease represented a significant proportion of dementia diagnoses: The one-year risk of this form of dementia was 0.6 percent for women and 0.4 percent for men.

Higher than normal

The study didn’t compare dialysis patients directly to healthy people of the same age; even so, the dementia risk among these patients was considerably higher than what would be expected in this age group.

For example, a well-known prior study following residents of a Massachusetts town found that community-dwelling 65-year-olds had only a 1 to 1.5 percent incidence of dementia within 10 years, while for 75-year-olds the incidence was only about 7.5 percent.

By contrast, in this study the researchers determined that the 10-year risk of dementia after starting dialysis was 19 percent for patients in the sample aged 66 to 70, and 28 percent among 76- to 80-year-olds.

Even the Alzheimer’s disease risk among the dialysis patients seemed higher than normal—for example, 4.3 percent of the 66- to 70-year-olds received a diagnosis for the disease within 10 years of starting dialysis, compared to a 10-year incidence of less than 1 percent among 65-year-olds in the Massachusetts study. That suggests that older patients with end-stage kidney disease may even be vulnerable to Alzheimer’s disease.

Researchers also found that older dialysis patients with a dementia diagnosis were about twice as likely to die at any time in the study period, compared to older dialysis patients without a dementia diagnosis.

As stark as these findings are, they may understate the problem. “We know from other studies that only about half of patients with dementia receive a diagnosis, so the figures in this study could be seen as a lower limit,” McAdams-DeMarco says.

She and her colleagues suggest that more in-depth studies need to be done to gauge the true extent of the dementia problem among older end-stage kidney disease patients.

“Patients starting dialysis generally meet with health care providers a few times per week, so in principle there is ample opportunity to do at least brief cognitive screening,” she says.

She also recommends more studies of potential measures to prevent dementia among these vulnerable patients. “We’re currently setting up a large clinical trial to identify appropriate interventions to preserve cognitive function in these patients,” McAdams-DeMarco says.

The National Institutes of Health funded the work.

Source: Johns Hopkins University

The post Older adults on dialysis face higher risk for dementia appeared first on Futurity.

Most teens keep risk-taking impulses in check

Sun, 2018-08-12 18:55

Most teens have behavioral brakes, and use them, to keep risk-taking experiments and impulsive behavior in check, a new study reports.

The study, which appears in the Journal of Youth and Adolescence, finds that only a subset of teens—those with weak cognitive control—engage in excessive levels of impulsiveness, such as acting without thinking, and end up struggling with addictions or other behavioral problems as young adults.

The research challenges traditional thinking that adolescence is a time of universal imbalance.

Cognitive control is the ability to exert top-down control over behavior, thoughts, and emotions. This ability, tied to executive functions, rests in the brain’s prefrontal cortex.

“People have heard so much about the teenage brain being all gas and no brakes, stemming from an imbalance between the reward and control regions of the brain,” says Atika Khurana, a professor and director of graduate programs in the prevention science program at the University of Oregon. “This study shows that this is not true. There is an imbalance for some youth, but it is not universal.”

The findings, she says, counter the long-held assumption that adolescents across the board lack cognitive control and take risks to reap instant rewards.

Khurana and colleagues analyzed six waves of data collected from 387 adolescents, ages 11 to 18, in the Philadelphia area. They looked at changes in sensation-seeking and impulsivity during teenage years in relation to cognitive control and as predictors of substance use disorders in late adolescence.

Only those teens with weakness in cognitive control were at risk for impulsive behaviors, putting them at higher risk for substance abuse. While sensation-seeking did increase during teenage years, it was not associated with weakness in cognitive control or later substance abuse.

“Previous studies modeling changes in impulsivity and sensation seeking during adolescence drew conclusions based on age differences without looking at the same adolescents over time as they developed,” Khurana says. “This study looked at individual trajectories and captured distinct patterns of change that were not otherwise observable when looking at youth at different ages.”

The study supported predictions of the Lifespan Wisdom Model developed by coauthor Daniel Romer of the University of Pennsylvania’s Annenberg Public Policy Center. It also was in line with a series of published findings that have emerged from Khurana’s work with the same data, which began while she was a postdoctoral fellow at the Annenberg Center.

In 2012, her group reported a positive association of working memory with sensation-seeking and a negative association with impulsivity. Children high in sensation-seeking engaged in exploratory forms of risk-taking but did not get stuck in unhealthy patterns.

Subsequently, the group has shown that weak working memory in combination with impulsivity can be used to predict trajectories of early alcohol use and risky sexual behavior in adolescents, and that adolescents with strong working memory are better equipped to escape early progression in drug use and avoid substance abuse issues.

The research, Khurana says, speaks to the need for greater emphasis on early interventions that can strengthen cognitive control.

The National Institutes of Health funded the research. Additional coauthors are from the Children’s Hospital of Philadelphia.

Source: University of Oregon

Frequent skin cancer may be a huge warning sign

Sun, 2018-08-12 18:45

People who develop abnormally frequent cases of a skin cancer known as basal cell carcinoma appear to be at significantly increased risk for the development of other cancers, including blood, breast, colon, and prostate cancers, according to a new, preliminary study.

“[Skin is] the best organ to detect genetic problems that could lead to cancers.”

Mutations in a panel of proteins responsible for repairing DNA damage likely cause the increased susceptibility, researchers say.

“We discovered that people who develop six or more basal cell carcinomas during a 10-year period are about three times more likely than the general population to develop other, unrelated cancers,” says senior author Kavita Sarin, assistant professor of dermatology at Stanford University.

“We’re hopeful that this finding could be a way to identify people at an increased risk for a life-threatening malignancy before those cancers develop.”

The research appears in JCI Insight.

Canary in the coal mine

The skin is the largest organ of the body and the most vulnerable to DNA damage caused by the sun’s ultraviolet rays. Try as one might, it’s just not possible to completely avoid sun exposure, which is why proteins that repair DNA damage are important to prevent skin cancers like basal cell carcinoma.

Most of the time this system works well. But sometimes the repair team can’t keep up. Basal cell carcinomas are common—more than 3 million cases a year are diagnosed in the United States alone—and usually highly treatable.

“About 1 in 3 Caucasians will develop basal cell carcinoma at some point in their lifetime…”

Sarin and lead author Hyunje Cho, a medical student, wondered whether the skin could serve as a kind of canary in the coal mine to reveal an individual’s overall cancer susceptibility. “The skin is basically a walking mutagenesis experiment,” Sarin says. “It’s the best organ to detect genetic problems that could lead to cancers.”

Sarin and Cho studied 61 people treated for unusually frequent basal cell carcinomas—an average of 11 per patient over a 10-year period. They investigated whether these people may have mutations in 29 genes that code for DNA-damage-repair proteins.

“We found that about 20 percent of the people with frequent basal cell carcinomas have a mutation in one of the genes responsible for repairing DNA damage, versus about 3 percent of the general population. That’s shockingly high,” Sarin says.

Furthermore, 21 of the 61 people reported a history of additional cancers, including blood cancer, melanoma, prostate cancer, breast cancer, and colon cancer—a prevalence that suggests the frequent basal cell carcinoma patients are three times more likely than the general population to develop cancers.
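The headline figures above can be checked with simple arithmetic. The sketch below merely recomputes the reported ratios from the raw counts—it is not the paper’s actual statistical analysis, and the implied general-population prevalence is inferred from the stated threefold risk rather than given in the source.

```python
# Back-of-the-envelope check of the ratios reported in the article.

# DNA-repair gene mutations: ~20% of frequent-BCC patients vs ~3% generally
mutation_ratio = 0.20 / 0.03
print(f"Mutation prevalence ratio: ~{mutation_ratio:.1f}x")  # ~6.7x

# Other cancers: 21 of the 61 frequent-BCC patients reported one
cohort_prevalence = 21 / 61
print(f"Cancer prevalence in cohort: {cohort_prevalence:.1%}")  # 34.4%

# A threefold risk implies a baseline of roughly a third of that (inferred)
implied_baseline = cohort_prevalence / 3
print(f"Implied general-population prevalence: ~{implied_baseline:.0%}")  # ~11%
```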

‘A strong correlation’

To confirm the findings, the researchers applied a similar analysis to a large medical insurance claims database. Over 13,000 people in the database had six or more basal cell carcinomas; these people also were over three times more likely to have developed other cancers, including colon, melanoma, and blood cancers.

Finally, the researchers identified an upward trend: the more basal cell carcinomas an individual reported, the more likely that person was to have had other cancers as well.

“I was surprised to see such a strong correlation,” Sarin says. “But it’s also very gratifying. Now we can ask patients with repeated basal cell carcinomas whether they have family members with other types of cancers, and perhaps suggest that they consider genetic testing and increased screening.”

The researchers are continuing to enroll patients in the ongoing study to learn whether particular mutations in genes responsible for repairing DNA damage are linked to the development of specific malignancies. They’d also like to conduct a similar study in patients with frequent melanomas. But they emphasize that there’s no reason for people with occasional basal cell carcinomas to worry.

“About 1 in 3 Caucasians will develop basal cell carcinoma at some point in their lifetime,” Sarin says. “That doesn’t mean that you have an increased risk of other cancers. If, however, you’ve been diagnosed with several basal cell carcinomas within a few years, you may want to speak with your doctor about whether you should undergo increased or more intensive cancer screening.”

The Dermatology Foundation, the National Institutes of Health, the Stanford Society of Physician Scholars, the American Skin Association, and Pellepharm Inc. supported the research. Stanford’s dermatology department also supported the work.

Two of the coauthors are cofounders, directors, and officers of Pellepharm, a biotechnology company focused on rare dermatological conditions.

Source: Stanford University

C-sections less likely after inducing new moms at 39 weeks

Fri, 2018-08-10 10:53

Inducing labor in healthy first-time mothers in the 39th week of pregnancy results in lower rates of cesarean sections compared with waiting for labor to begin naturally at full term, according to new research.

Additionally, births to women who had inductions at 39 weeks were not more likely to result in stillbirths, newborn deaths, or other major health complications for the baby.

“This study is a potential game changer and will have a significant impact on the practice of obstetrics,” says senior author George Macones, head of the obstetrics and gynecology department at Washington University School of Medicine in St. Louis.

“The concern has been that inducing labor—even at 39 weeks—would increase the cesarean section rate and health problems in newborns,” says Macones. “We found inductions at 39 weeks lowered, not raised, the number of deliveries by cesarean section.”

Inductions at 39 weeks resulted in an 18.6 percent cesarean section rate, whereas waiting for labor to start naturally resulted in a 22.2 percent rate.

The findings appear in the New England Journal of Medicine.

Delivering by cesarean section generally is considered safe for mother and baby. However, the procedure involves major surgery and, therefore, poses increased complication risks and longer recovery times for mothers compared to delivering vaginally.

Previous studies have shown that inducing labor without medical reason before pregnancies are full-term at 39 weeks poses health risks for newborns, primarily because the lungs, brain, and other organs haven’t fully developed. But inductions at 39 weeks—one week before a woman’s due date—have become more common in recent years, and the researchers wanted a better understanding of the risks and benefits to mother and baby.

“Our department already is recommending induction at 39 weeks for healthy pregnant women,” says Macones, who treats patients at Barnes-Jewish Hospital. “Some women prefer to schedule an induction because it allows them to plan ahead. Of course, women without pregnancy complications can choose how they want to experience labor and delivery, and we respect their wishes.”

The study enrolled about 6,100 healthy, first-time mothers-to-be at 41 hospitals belonging to the Maternal-Fetal Medicine Units Network. The researchers assigned about half of the pregnant women to labor induction at 39 weeks, while the other half waited for labor to begin naturally. Some women in the latter group had inductions after 39 weeks for medical reasons.

“Our findings offer healthy, pregnant women options for labor and delivery.”

Of those who were induced at 39 weeks, 569 (18.6 percent) had cesarean sections compared with 674 women (22.2 percent) who delivered by cesarean after waiting for labor to occur naturally—a difference that is statistically significant.

Other health benefits experienced by women in the induced labor group included reduced rates of pregnancy-related hypertension and postpartum infections. Specifically, 277 (9.1 percent) women induced at 39 weeks experienced blood pressure problems and 50 (1.6 percent) contracted infections after delivery, compared with 427 (14.1 percent) and 65 (2.1 percent) in the spontaneous labor group.

Infants born to both groups of mothers had the same risks for complications such as newborn death, seizure, infection, injury, and the need for infant respiratory support. Of the women in the induced labor group and the spontaneous labor group, 132 (4.3 percent) and 164 (5.4 percent), respectively, experienced birth complications that affected the babies’ health. The difference between the two groups is not significant.
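The cesarean rates reported above translate into a few standard effect-size figures. The derivation below is an illustrative sketch from the published rates; the number-needed-to-treat figure is our own derived quantity, not a number stated in the study.

```python
# Derived effect sizes from the rates reported in the article.
# NNT here is an illustrative derivation, not a figure from the study.

induced_rate = 0.186      # c-section rate with induction at 39 weeks
spontaneous_rate = 0.222  # c-section rate waiting for natural labor

relative_risk = induced_rate / spontaneous_rate
abs_reduction = spontaneous_rate - induced_rate
nnt = 1 / abs_reduction   # inductions per cesarean avoided

print(f"Relative risk: {relative_risk:.2f}")       # 0.84
print(f"Absolute reduction: {abs_reduction:.1%}")  # 3.6 percentage points
print(f"Number needed to treat: ~{nnt:.0f}")       # ~28
```

In other words, inducing roughly 28 women at 39 weeks would, on these numbers, avoid about one cesarean delivery.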

“Our findings offer healthy, pregnant women options for labor and delivery,” Macones says. “However, the choice always remains theirs.”

Macones chairs the Maternal-Fetal Medicine Units Network, which has support from the NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development.

Source: Washington University in St. Louis

We’re at risk of toppling into a ‘hothouse Earth’ scenario

Fri, 2018-08-10 10:27

Even if we achieve the carbon emissions reductions that the Paris Agreement calls for, there is the risk of the planet entering “hothouse Earth” conditions.

A “hothouse Earth” climate will in the long term stabilize at a global average of 4-5°C higher than pre-industrial temperatures with sea level 10-60 m higher than today.

This warning appears in the Proceedings of the National Academy of Sciences. Accelerating the transition toward an emission-free world economy has become more urgent than ever, the paper’s authors conclude.

“Climate and other global changes show us that we humans are impacting the Earth system at the global level,” says coauthor Katherine Richardson, professor at the Center for Macroecology, Evolution and Climate at the University of Copenhagen. “This means that we as a global community can also manage our relationship with the system to influence future planetary conditions. This study identifies some of the levers that can be used to do so.”

“Places on Earth will become uninhabitable if ‘hothouse Earth’ becomes the reality.”

“Human emissions of greenhouse gas are not the sole determinant of temperature on Earth. Our study suggests that human-induced global warming of 2°C may trigger other Earth system processes, often called ‘feedbacks,’ that can drive further warming—even if we stop emitting greenhouse gases,” says lead author Will Steffen from the Australian National University and Stockholm Resilience Centre. “Avoiding this scenario requires a redirection of human actions from exploitation to stewardship of the Earth system.”

Currently, global average temperatures are just over 1°C above pre-industrial and rising at 0.17°C per decade.
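Taken at face value, those two figures imply a rough timeline for reaching the 2°C threshold. The extrapolation below is only a sketch under the assumption that the current warming rate stays constant—an assumption the feedback processes the study describes could invalidate.

```python
# Rough linear extrapolation from the figures above (illustrative only:
# assumes the current warming rate holds steady, which the study's
# feedback processes could break).

current_warming = 1.0   # °C above pre-industrial ("just over 1°C")
rate_per_decade = 0.17  # °C per decade
threshold = 2.0         # °C, the level the study flags as a possible trigger

decades_left = (threshold - current_warming) / rate_per_decade
print(f"~{decades_left * 10:.0f} years to reach {threshold}°C")  # ~59 years
```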

Types of ‘feedback’

The authors of the study consider a number of natural feedback processes, some of which are “tipping elements” that lead to abrupt change if we cross a critical threshold. These feedbacks could turn from being a “friend” that stores carbon into a “foe” that emits it uncontrollably in a warmer world.

These feedbacks are:

  • permafrost thaw,
  • loss of methane hydrates from the ocean floor,
  • weakening land and ocean carbon sinks,
  • increasing bacterial respiration in the oceans,
  • Amazon rainforest dieback,
  • boreal forest dieback,
  • reduction of northern hemisphere snow cover,
  • loss of Arctic summer sea ice,
  • and reduction of Antarctic sea ice and polar ice sheets.

“These tipping elements can potentially act like a row of dominoes. Once one is pushed over, it pushes Earth towards another,” adds coauthor Johan Rockström, executive director of the Stockholm Resilience Centre and incoming co-director of the Potsdam Institute for Climate Impact Research.

“It may be very difficult or impossible to stop the whole row of dominoes from tumbling over. Places on Earth will become uninhabitable if ‘hothouse Earth’ becomes the reality.”

Can we ‘park’ at +2°C?

Coauthor Hans Joachim Schellnhuber, director of the Potsdam Institute for Climate Impact Research, says: “What we do not know yet is whether the climate system can be safely ‘parked’ near 2°C above pre-industrial levels, as the Paris Agreement envisages. Or if it will, once pushed so far, slip down the slope towards a hothouse planet. Research must assess this risk as soon as possible.”

The paper says that maximizing the chances of avoiding a “hothouse Earth” requires not only reduction of carbon dioxide and other greenhouse gas emissions but also enhancement and/or creation of new biological carbon stores via, for example:

  • improved forest, agricultural, and soil management;
  • biodiversity conservation;
  • and technologies that remove carbon dioxide from the atmosphere and store it underground.

Critically, the study emphasizes that these measures must be supported by the fundamental societal changes required to maintain a “stabilized Earth” where temperatures are ~2°C warmer than pre-industrial levels.

Source: University of Copenhagen

Moral outrage online can backfire big time

Fri, 2018-08-10 10:21

When outcry against offensive behavior on social media goes viral, people may see those challenging the behavior less as noble heroes doing the right thing and more as bullies doling out excessive punishment, according to a new study.

Through a series of laboratory studies, Benoît Monin, a professor of ethics, psychology, and leadership at the Graduate School of Business and professor of psychology at Stanford University, and PhD candidate Takuya Sawaoka found that while comments against offensive behavior are seen as legitimate and even admirable as individual remarks, they may lead to greater sympathy for the offender when they multiply.

Viral anger

“One of the features of the digital age is that anyone’s words or actions can go viral, whether they intend to or not,” says Sawaoka.

“In many cases, the social media posts that are met with viral outrage were never intended to be seen by people outside of the poster’s social circle. Someone doesn’t even need to be on social media in order for their actions to go viral.”

“We’ve all either been in one of those maelstroms of outrage or just one step away from one as bystanders on our social media news feeds…”

Because of social media, responses to questionable behavior reach further than ever before.

“We’ve all either been in one of those maelstroms of outrage or just one step away from one as bystanders on our social media news feeds,” says Monin, noting how frequent these public outcries have become on social media.

For example, in 2013 there was public outcry over a young woman who tweeted that she couldn’t get AIDS while traveling to Africa because she was white. Her post, which she says she intended as a joke, went viral across social media and quickly made its way into the news. It led to her losing her job.

“On the one hand, speaking out against injustice is vital for social progress, and it’s admirable that people feel empowered to call out words and actions they believe are wrong,” says Sawaoka. “On the other hand, it’s hard not to feel somewhat sympathetic for people who are belittled by thousands of strangers online, and who even lose friends and careers as a result of a poorly thought-out joke.”

‘Outrage at the outrage’

Sawaoka and Monin put their observations to the test. They conducted six experiments with a total of 3,377 participants to examine how people perceived public outcry to an offensive or controversial post on social media. The researchers set up a variety of scenarios, including asking people how they felt when there were only one or two comments versus a mass of replies.

In one study, the researchers showed participants a post taken from a real story of a charity worker who posted a photograph of herself making an obscene gesture and pretending to shout next to a sign that read “Silence and Respect” at Arlington National Cemetery.

“There is a balance between sympathy and outrage…”

They asked participants how offensive they found the photograph, as well as what they thought about the responses to the post.

The researchers found that when participants saw the post with just a single condemning comment, they viewed the reaction as commendable.

When they saw that reply echoed by many others, they viewed the original reply—which had been praiseworthy in isolation—more negatively. Early commenters were de facto penalized for later, independent responses, they say.

“There is a balance between sympathy and outrage,” says Monin about their findings. “The outrage goes up and up but at some point sympathy kicks in. Once a comment becomes part of a group, it can appear problematic. People start to think, ‘This is too much—that’s enough.’ We see outrage at the outrage.”

What about a white supremacist?

The researchers were curious to know whether people would feel less sympathetic depending on the status of the offender. Would they feel differently if something offensive was said by a well-known person, or by someone many people regard as abhorrent, like a white supremacist?

“Obviously, the implication is not that people should simply stay silent about others’ wrongdoing.”

In one study, participants were shown a social media post taken from a real story where a comedian ridiculed overweight women. The researchers set up two conditions: one where they referred to him as an average social media user, and another where they said he was an up-and-coming comedy actor.

Mirroring their earlier findings, the researchers found that a high-profile persona did not elicit any less sympathy than the average person—despite the fact that people believed they could cause more harm from their post. And like their previous results, the researchers found that people viewed individual commenters less favorably after outrage went viral.

When Sawaoka and Monin tested a scenario in which the offender belonged to a white supremacist organization, they found similar results. Although participants were less sympathetic toward a white supremacist making a racist comment, they did not view the individuals who participated in the outrage any differently. They still perceived the display of viral outrage as bullying.

“These results suggest that our findings are even more broadly applicable than we had originally anticipated, with viral outrage leading to more negative impressions of individual commenters even when the outrage is directed toward someone as widely despised as a white supremacist,” Sawaoka and Monin write.

No quick fix

The question about how to respond to injustice in the digital age is complex, Sawaoka and Monin conclude in the paper.

“Our findings illustrate a challenging moral dilemma: A collection of individually praiseworthy actions may cumulatively result in an unjust outcome,” Sawaoka says.

“Obviously, the implication is not that people should simply stay silent about others’ wrongdoing,” he clarifies. “But I think it is worth reconsidering whether the mass shaming of specific individuals is really the best way to achieve social progress.”

Source: Stanford University
