Futurity
Research news from top universities.

Moody or depressed teen? 40% of parents aren’t sure

Mon, 2019-11-18 10:02

In a recent poll, 40% of parents said they struggle to differentiate between normal mood swings and signs of depression.

At the same time, 30% said their child is good at hiding feelings.

Though the majority of parents say they are confident they would recognize depression in their middle or high school-aged child, two-thirds acknowledge barriers to spotting specific signs and symptoms, according to the C.S. Mott Children’s Hospital National Poll on Children’s Health at the University of Michigan. The report is available online.

“In many families, the preteen and teen years bring dramatic changes both in youth behavior and in the dynamic between parents and children,” says poll co-director Sarah Clark. “These transitions can make it particularly challenging to get a read on children’s emotional state and whether there is possible depression.”

Still, a third of parents polled said nothing would interfere with their ability to recognize signs of depression in their child.

“Some parents may be overestimating their ability to recognize depression in the mood and behavior of their own child,” Clark says. “An overconfident parent may fail to pick up on the subtle signals that something is amiss.”

Depression on the mind

The poll also suggests that the topic of depression is all too familiar for middle and high school students. One in four parents say their child knows a peer or classmate with depression, and 1 in 10 say their child knows a peer or classmate who has died by suicide.

Indeed, rates of youth suicide continue to rise. Among people ages 10 to 24 years old, the suicide rate climbed 56% between 2007 and 2017, according to the Centers for Disease Control and Prevention.

“Our report reinforces that depression is not an abstract concept for today’s teens and preteens, or their parents,” Clark says.

“This level of familiarity with depression and suicide is consistent with recent statistics showing a dramatic increase in suicide among US youth over the past decade. Rising rates of suicide highlight the importance of recognizing depression in youth.”

Moody or depressed?

Parents polled were also less confident that their preteens or teens would recognize depression in themselves than they were in their own ability to spot it.

Clark says parents should stay vigilant about spotting any signs of potential depression in kids, which may vary from sadness and isolation to anger, irritability, and acting out. Parents might also talk with their preteen or teen about identifying a trustworthy go-to adult they can turn to if they are feeling blue, Clark says.

Most parents also believe schools should play a role in identifying potential depression, with 7 in 10 supporting depression screening starting in middle school.

“The good news is that parents view schools as a valuable partner in recognizing youth depression,” Clark says. “The bad news is that too few schools have adequate resources to screen students for depression, and to offer counseling to students who need it.”

Clark encourages parents to learn whether depression screening is taking place at their child’s school and whether counseling is available for students who screen positive. Given the limited resources in many school districts, parents can be advocates of such efforts by talking to school administrators and school board members about the importance of offering mental health services in schools.

The nationally representative Mott Poll report is based on responses from 819 parents with at least one child in middle school, junior high, or high school.

Source: University of Michigan

The post Moody or depressed teen? 40% of parents aren’t sure appeared first on Futurity.

Superfast star launched from black hole at Milky Way’s center

Mon, 2019-11-18 09:52

Astronomers have spotted a high velocity star, traveling at a blistering six million kilometers per hour (3,728,227 miles per hour), that the supermassive black hole at the heart of the Milky Way ejected five million years ago.

The researchers observed that the star, known as S5-HVS1 and located in the constellation of Grus (the Crane), was moving 10 times faster than most stars in the Milky Way.

“The velocity of the discovered star is so high that it will inevitably leave the galaxy and never return,” says coauthor Douglas Boubert from the University of Oxford.

Astronomers have wondered about high velocity stars since their discovery only two decades ago. S5-HVS1 is unprecedented due to its high speed and close passage to the Earth, “only” 29,000 light years away. With this information, astronomers could track its journey back into the center of the Milky Way, where a four million solar mass black hole, known as Sagittarius A*, lurks.

“This is super exciting, as we have long suspected that black holes can eject stars with very high velocities. However, we never had an unambiguous association of such a fast star with the galactic center,” says lead author Sergey Koposov, an assistant professor of physics and member of the McWilliams Center for Cosmology at Carnegie Mellon University.

“We think the black hole ejected the star with a speed of thousands of kilometers per second about five million years ago. This ejection happened at the time when humanity’s ancestors were just learning to walk on two feet.”

Black holes can eject superfast stars via the Hills Mechanism, which astronomer Jack Hills proposed thirty years ago. Originally, S5-HVS1 lived with a companion in a binary system, but the pair strayed too close to Sagittarius A*. In the gravitational tussle, the black hole captured the companion star and threw out S5-HVS1 at extremely high speed.

“This is the first clear demonstration of the Hills Mechanism in action,” says Ting Li from Carnegie Observatories and Princeton University, and leader of the S5 Collaboration. “Seeing this star is really amazing as we know it must have formed in the galactic center, a place very different to our local environment. It is a visitor from a strange land.”

The location of the star on the sky and the direction of its motion. The star is flying away from the Galactic center, from which it was ejected 5 million years ago. (Credit: Sergey Koposov)

Researchers discovered S5-HVS1 with the 3.9-meter Anglo-Australian Telescope (AAT) near Coonabarabran, New South Wales, Australia, coupled with superb observations from the European Space Agency’s Gaia satellite, which allowed the astronomers to reveal the full speed of the star and its journey from the center of the Milky Way.

“The observations would not be possible without the unique capabilities of the 2dF instrument on the AAT,” says Daniel Zucker, an astronomer at Macquarie University in Sydney, Australia, and a member of the S5 executive committee. “It’s been conducting cutting-edge research for over two decades and still is the best facility in the world for our project.”

“I am so excited this fast-moving star was discovered by S5,” says Kyler Kuehn of Lowell Observatory and a member of the S5 executive committee. “While the main science goal of S5 is to probe the stellar streams—disrupting dwarf galaxies and globular clusters—we dedicated spare resources of the instrument to searching for interesting targets in the Milky Way, and voila, we found something amazing for ‘free.’ With our future observations, hopefully we will find even more!”

The results appear in the Monthly Notices of the Royal Astronomical Society.

Source: Carnegie Mellon University

Keto diet lets mice better fight the flu

Mon, 2019-11-18 08:50

Mice that ate a “keto” diet were better able to combat the flu virus than those that ate food high in carbohydrates, a study shows.

The ketogenic, or keto, diet—which for people includes meat, fish, poultry, and non-starchy vegetables—activates a subset of T cells in the lungs not previously associated with the immune system’s response to influenza, enhancing mucus production from airway cells that can effectively trap the virus, the researchers report.

“This was a totally unexpected finding,” says co-senior author Akiko Iwasaki, professor of immunobiology and molecular, cellular and developmental biology at Yale University, and an investigator of the Howard Hughes Medical Institute.

The research project was the brainchild of two trainees—one working in Iwasaki’s lab and the other with co-senior author Vishwa Deep Dixit, professor of comparative medicine and of immunobiology.

Ryan Molony worked in Iwasaki’s lab, which had found that immune system activators called inflammasomes can cause harmful immune system responses in their host. Emily Goldberg worked in Dixit’s lab, which had shown that the ketogenic diet blocked formation of inflammasomes.

The two wondered if diet could affect immune system response to pathogens such as the flu virus.

They showed that mice fed a ketogenic diet and infected with the influenza virus had a higher survival rate than mice on a high-carb normal diet. Specifically, the researchers found that the ketogenic diet triggered the release of gamma delta T cells, immune system cells that produce mucus in the cell linings of the lung—while the high-carbohydrate diet did not.

When mice were bred without the gene that codes for gamma delta T cells, the ketogenic diet provided no protection against the influenza virus.

“This study shows that the way the body burns fat to produce ketone bodies from the food we eat can fuel the immune system to fight flu infection,” Dixit says.

The study appears in Science Immunology.

Source: Yale University

Too little sleep can be bad for women’s bone density

Fri, 2019-11-15 16:23

Getting five or fewer hours of sleep a night is associated with low bone mineral density and higher odds of osteoporosis, researchers report.

“Our study suggests that sleep may negatively impact bone health, adding to the list of the negative health impacts of poor sleep,” says lead author Heather Ochs-Balcom, associate professor of epidemiology and environmental health at the University at Buffalo School of Public Health and Health Professions.

“I hope that it can also serve as a reminder to strive for the recommended seven or more hours of sleep per night for our physical and mental health.”

The study, published in the Journal of Bone and Mineral Research, focused on 11,084 postmenopausal US women from the Women’s Health Initiative. Women who self-reported sleeping five hours or less per night had significantly lower bone mineral density at four sites—whole body, hip, femoral neck, and spine—compared to women who slept seven hours a night, a difference equivalent to one year of aging.

Researchers note there was no statistical difference among women who slept more than seven hours.

The body undergoes an array of healthy processes during sleep, including bone remodeling, during which old tissue is removed and new bone tissue forms.

“There’s a rhythm throughout the day. If you are sleeping less, one possible explanation is that bone remodeling isn’t happening properly,” Ochs-Balcom says.

The current study follows research the team published last year, which found that women with short sleep had a higher likelihood of sustaining a fracture.

“The question was, is it because they’re up and walking around more, or because they really have lower bone mineral density?” Ochs-Balcom says. “I said why don’t we take a look at it because we have a sample of BMD scans from 11,000 women. This helps tell more of the story.”

While the findings might be the stuff of nightmares for older adults, the silver lining is that sleep is something people can control, along with adding in a few extra healthy behaviors.

Poor sleep is linked to a number of adverse health conditions, including obesity, diabetes, hypertension, and cardiovascular disease.

“It’s really important to eat healthy, and physical activity is important for bone health,” Ochs-Balcom says. “That’s the exciting part of this story—most of us have control over when we turn off the lights, when we put the phone down.”

Additional coauthors are from the University of Michigan; the University of Pittsburgh; Stony Brook University; the University of Massachusetts Medical School; the University of Arizona Cancer Center; the University of Wisconsin-Madison; Stanford University; California Pacific Medical Center; University of California, San Francisco; University of Washington; and Mercy Health Osteoporosis and Bone Health Services, Cincinnati.

Source: University at Buffalo

Supercharged trash gas could produce more green energy

Fri, 2019-11-15 15:56

Synthetic compounds called “siloxanes” from everyday products like shampoo and motor oil are finding their way into landfills and supercharging the biogas those landfills produce, researchers report.

While it’s a problem today, the researchers say it could be an opportunity to get more energy out of landfill gas.

The compounds conduct heat efficiently and interact readily with water, properties that have made them increasingly popular in a variety of consumer products. That means more and more siloxanes are headed to your local landfill.

Biogas refers to fuel gases synthesized from biological or organic feedstocks, such as landfill waste and wastewater treatment sludge. In recent years, it has become clear that siloxanes have been damaging the power-generating equipment that’s fueled with landfill gas. But the researchers say it may be possible to harness the siloxanes to produce more energy.

‘Like rocket fuel’

The researchers conducted the first chemical analysis of how siloxanes affect biogas. They found that siloxanes increase the reactivity of biogas, leading to faster ignition in engines and the release of more energy. But the siloxanes in the biogas can damage those engines—typically power-generating gas turbines and reciprocating piston engines.

“Siloxanes are highly ignitable,” says Margaret Wooldridge, professor of mechanical engineering and director of the Dow Sustainability Fellows Program at the University of Michigan. “They change the chemistry of biogas like crazy. The stuff is like rocket fuel, literally—crazy-reactive.”

The siloxanes essentially change the biogas’s “flame speed,” which is a measure of how quickly a fuel combusts and drives a turbine or piston.

Biogas is composed mainly of methane. There’s methane gas in nature but it’s also produced when organic material decomposes in landfills, along with hydrogen, carbon monoxide, and other hydrocarbons. Methane is the main component of natural gas and biogas, making both valuable sources of fuel and energy that are cleaner than coal.

In the atmosphere, however, methane is particularly good at trapping heat, adding to our global warming problem. In particular, methane is 30 times more effective a greenhouse gas than CO2. And according to the EPA, municipal solid waste landfills account for 14% of all human-related methane emissions in the US each year—the third-largest source behind the gas and petroleum industry and agriculture.

That property has spurred efforts to capture methane from landfills and use it as a fuel, instead of allowing it to escape unchecked.

Measuring ‘ignitability’

In this study, the researchers separately tested hydrogen and carbon monoxide mixtures containing two siloxanes—trimethylsilanol (TMSO) and hexamethyldisiloxane (HMDSO)—against hydrogen and carbon monoxide mixtures with no siloxanes.

Specifically, the researchers clocked how long it took for each mixture to ignite. Scientists consider fuels that have a shorter ignition delay more ignitable or reactive—and hydrogen is one of the most reactive fuels we use.

Hydrogen and carbon monoxide mixtures with TMSO produced ignition delay times 37% shorter than the reference case, and the HMDSO-infused mixtures produced delay times 50% shorter.

Researchers hope their work sheds light on how siloxanes alter engine performance when biogas is used as a fuel.

“Trace concentrations of siloxanes have been a known problem in biogas applications—leading to the formation of abrasive silica deposits on engine components,” says study coauthor Rachel Schwind, a doctoral student. “For this reason, most prior research in this area has focused on how to remove them from the captured gas.”

The potential of siloxanes

Along with the problem siloxanes pose, there is also potential. Wooldridge says siloxanes could be key to boosting energy production from biogas.

“We would love to be able to harness them as an energy source,” she says.

Analyzing the combustion chemistry is a step in that direction.

“That would potentially negate the need for scrubbing or removal during biogas processing and reduce costs,” Schwind says. “If we can reduce those costs, it moves biogas closer to being a truly carbon neutral fuel. And if we can make landfill gas a more economically attractive option, landfill operators will have more incentive to capture and utilize this harmful greenhouse gas.”

The research appears in the journal Combustion and Flame.

The US Department of Energy’s Basic Energy Sciences program supported the research.

Source: University of Michigan

Blood test could detect melanoma of the eye

Fri, 2019-11-15 14:38

A simple blood test could one day offer early detection of melanoma in the eye.

Researchers have discovered markers in the blood that can differentiate between a benign mole and a melanoma—and identify whether the cancer has spread to other areas of the body.

The blood test could monitor very early signs of the disease, says Mitchell Stark, an early career fellow at the University of Queensland Diamantina Institute.

“This blood test was able to detect the difference between a benign mole located at the back of the eye and a melanoma in the eye,” Stark says. “The test also has the potential to show if the melanoma has metastasized and spread to other areas of the body.

“Moles or naevi in the eye are common, but can be difficult to monitor because changes to their shape or coloring can’t always be seen as easily as on the skin.

“Outcomes are poor for people with melanoma in their eye if their cancer spreads to the liver. Given that having naevi in the eye is fairly common, this test may allow us to better screen these patients for early signs of melanoma formation.”

The study adds to research Stark conducted at QIMR Berghofer Medical Research Institute, where researchers first developed the panel of biomarkers and used it to detect melanoma on the skin.

In the new study, researchers collected blood samples from people with either benign naevi or melanoma in the back of the eye, as well as from a small number of people whose cancer had metastasized.

Researchers then tested the samples against the panel of microRNA biomarkers to distinguish the stage of disease.

Stark says that after further development, the blood test has potential as a monitoring tool used in conjunction with optometrists, GPs, and specialists.

“If someone went to their optometrist for a regular check-up and a mole was found, you could have this blood test at each routine visit to help monitor mole changes,” he says. “If the biomarker in the blood had increased, it might be an early warning sign of melanoma.

“Knowing this patient was high-risk means they could be monitored more closely for the potential spread of cancer and be progressed more rapidly through the healthcare system.”

The test would be extremely helpful in clinical practice, says Bill Glasson, an ophthalmologist and service director at Queensland Ocular Oncology.

“These research findings are exciting for our patients with ocular tumors. It will allow for earlier diagnosis as well as giving doctors an earlier indication of the development of metastatic disease and importantly, a better outcome for our patients.”

The National Health and Medical Research Council and the Merchant Charitable Foundation funded the work.

The research appears in Translational Vision Science & Technology.

Source: University of Queensland

By 7 years old, kids get that hypocrisy is wrong

Fri, 2019-11-15 14:20

Kids seem to learn about the idea of hypocrisy early in elementary school, new research suggests.

The researchers discovered that children who were at least 7 years old began to predict future behavior based on a person’s statement about morals.

Unlike their younger peers, those children think that someone who says stealing is bad would be less likely to steal. Further, they think if those individuals did steal, they should receive harsher punishments.

“Our findings suggest that children of this age are thinking critically about people falsely representing themselves in some way,” says first author Hannah Hok, a doctoral student at the University of Chicago. “They’re thinking about reputation at a relatively early age.”

The research, which appears in the journal Child Development, relied on a series of experiments conducted with more than 400 children ranging from 4 to 9 years old.

“Children understand that when people’s words—when they talk about moral principles—are discordant with their actual behavior, they should be punished more harshly,” says senior author Alex Shaw, assistant professor of psychology and a leading expert on how concepts such as reputation and fairness develop in childhood.

In the first experiment, the researchers told participants about two children, one who condemned stealing (“Stealing is bad.”) and one who made a morally neutral statement (“Broccoli is gross.”). The researchers then asked them to predict who was more likely to steal, and which theft should be punished more severely.

Researchers asked participants in other experiments to compare someone who condemned stealing with someone who praised sharing (“Sharing is really, really good.”), as well as with someone who denied stealing (“I never steal.”).

In all cases, the 7- to 9-year-old participants were more likely than younger children (ages 4 to 6) to use condemnation as a predictor for future action.

A final experiment presented participants with someone who praised stealing and someone who condemned it. Both older and younger children predicted that the former would be more likely to steal—indicating that young children may have particular trouble using condemnation as a behavioral signal.

The researchers, who interviewed children at a Chicago science museum, collected no demographic information other than age and gender, and did not find significant gender-based differences in their results.

Shaw hopes to conduct more research into the behavior of younger children and whether they can better predict actions that are morally neutral, such as eating broccoli. He also hopes to examine how children’s judgments may change with social context, and how they treat hypocrisy that doesn’t benefit the speaker.

“It may not be inconsistency, per se, that kids are reacting to,” Shaw says. “We think it’s engaging in hypocrisy to benefit yourself that provokes the negative reaction.”

Additional coauthors are from New Zealand’s Victoria University of Wellington and the University of Chicago.

Source: University of Chicago

Neuron transplant may prevent epilepsy after brain injury

Fri, 2019-11-15 12:42

A new cell therapy improved memory and prevented seizures in mice following traumatic brain injury, researchers report.

Traumatic brain injuries affect 2 million Americans each year, causing cell death and inflammation in the brain. People who experience a head injury often suffer lifelong memory loss and can develop epilepsy.

In the study, the researchers transplanted embryonic progenitor cells capable of generating inhibitory interneurons, a specific type of nerve cell that controls the activity of brain circuits, into the brains of mice with traumatic brain injury. They targeted the hippocampus, a brain region responsible for learning and memory.

The researchers discovered that the transplanted neurons migrated into the injury site, where they formed new connections with the injured brain cells and thrived long term.

Within a month after treatment, the mice showed signs of memory improvement, such as being able to tell the difference between a box where they had an unpleasant experience and one where they did not. They were able to do this just as well as mice that never had a brain injury.

The cell transplants also prevented the mice from developing epilepsy, which affected more than half of the mice not treated with new interneurons.

“Inhibitory neurons are critically involved in many aspects of memory, and they are extremely vulnerable to dying after a brain injury,” says Robert Hunt, an assistant professor of anatomy and neurobiology at the School of Medicine at the University of California, Irvine, who led the study. “While we cannot stop interneurons from dying, it was exciting to find that we can replace them and rebuild their circuits.”

This is not the first time Hunt and his team have used interneuron transplantation therapy to restore memory in mice. In 2018, the team used a similar approach, delivered the same way but to newborn mice, to improve the memory of mice with a genetic disorder.

Still, this was an exciting advance for the researchers. “The idea to regrow neurons that die off after a brain injury is something that neuroscientists have been trying to do for a long time,” Hunt says. “But often, the transplanted cells don’t survive, or they aren’t able to migrate or develop into functional neurons.”

To further test their observations, Hunt and his team silenced the transplanted neurons with a drug, which caused the memory problems to return.

“It was exciting to see the animals’ memory problems come back after we silenced the transplanted cells, because it showed that the new neurons really were the reason for the memory improvement,” says first author Bingyao Zhu, a junior specialist.

Currently, there are no treatments for people who experience a head injury. If the researchers can replicate the results in humans, it could have a tremendous impact for patients. The next step is to create interneurons from human stem cells.

“So far, nobody has been able to convincingly create the same types of interneurons from human pluripotent stem cells,” Hunt says. “But I think we’re close to being able to do this.”

The study appears in Nature Communications.

The National Institutes of Health funded the work.

Source: UC Irvine

Older adults better handle stress of type 2 diabetes

Fri, 2019-11-15 12:40

Age plays a critical role in the well-being of people newly diagnosed with type 2 diabetes, researchers report.

A new study finds that younger patients are more susceptible to psychological distress, resulting in worse health outcomes.

“We found we can evaluate a patient’s initial stress and predict how they will be doing six months later,” says Vicki Helgeson, professor of psychology at Carnegie Mellon University. “If you can identify people who are facing diabetes distress earlier, you can intervene and prevent their health from declining.”

Currently about 27 million people in the United States live with type 2 diabetes. Past research shows that stress associated with diabetes management leads to poor blood sugar control.

For the study in the Journal of Behavioral Medicine, researchers evaluated 207 patients (55% male, 53% white, 47% black, 25-82 years of age), who were diagnosed with type 2 diabetes within the past two years.

They used several surveys to evaluate health, psychological distress, and health care, and studied the participants’ daily diaries to identify stressors. The researchers assessed patients at the start of the study to establish a baseline and then again six months later. They examined the results with regard to gender, race/ethnicity, age, education, employment, income, relationship status, and use of medication.

More stress in younger diabetes patients

Younger patients (42 years and younger) experienced higher diabetes-related and psychological distress, as did patients with higher education and income. Conversely, patients over 64 had less psychological stress and greater consistency in self-care, blood sugar control, and medication adherence. Patients in long-term relationships also reported less diabetes stress.

“This is a diverse sample with respect to age, education, and race, which makes the result even more provocative,” Helgeson says. “We do not know in an objective way if patients with a higher income have more stressors, but they perceive they have more stress.”

Patients identified diet as the greatest stressor (38%). Other significant stressors include checking blood sugar (8%) and experiencing high or low blood sugar events (7%). Patients who self-reported greater stress also reported greater depressed mood, less adherence to medication, and higher anxiety.

“Diabetes care is difficult, because it requires a lifestyle change that you have to do forever,” says Helgeson, the paper’s senior author. “Life gets in the way of sticking to a diabetes regimen.”

Support from peers

While the study was not designed to explore why patients handle stressors differently, Helgeson believes older adults may live more in the present than younger adults, whose focus on the future may magnify their stressors.

Diabetes is also more common as people age, and older patients may find more support from their peer group. Helgeson also suggests older adults may leverage past experiences to employ emotion regulation strategies to mitigate the stress associated with managing the disease.

After a diagnosis, many patients experience stress as they modify their lifestyle to accommodate diet, weight control, medication, and exercise routines, which can be time-consuming, complicated, and costly. Complications from diabetes include heart disease, stroke, kidney disease, and lower limb amputations.

Researchers did not design the study to interpret the cause of underlying stressors or identify emotion regulation strategies. In addition, the daily stress measure was not designed to capture the nature of each stressor. Future studies could evaluate how patients react to stressors to develop effective intervention and regulation strategies for different age, gender, and cultural groups.

Source: Carnegie Mellon University

Smartphone device sniffs out toxin-producing algae in 15 minutes

Fri, 2019-11-15 10:59

A new highly sensitive system uses a smartphone to rapidly detect the presence of toxin-producing algae in water within 15 minutes, researchers report.

The system uses the phone’s wireless communications capabilities to generate test results on-site and report findings in real-time.

The technology could play a big role in preventing the spread of harmful microorganisms in aquatic environments, which could threaten global public health and cause environmental problems.

Monitoring toxin-producing algae

A sudden surge in the amount of algae and associated toxins in lakes, ponds, rivers, and coastal waters can adversely affect water quality, and in turn, may have unfavorable effects on human health, aquatic ecosystems, and water supply.

For instance, in 2015, an algae bloom wiped out more than 500 tons of fish in Singapore, and caused some fish farmers to lose millions of dollars.

Conventional methods of algae detection and analysis are time-consuming and require specialized, costly equipment, as well as skilled operators to conduct water sampling and testing.

One approach tests for the presence of chlorophyll using complex instruments that cost more than $2,200. Another common method involves cytometric and image analysis to detect algal cells—but this method involves equipment that costs more than $73,000.

“Currently, it can take a day or more to collect water samples from a site, bring them back to the laboratory for testing, and analyze the results. This long lead time is impractical for monitoring of algae blooms, as the management of contamination sources and affected waters could be slowed down,” says Bae Sung Woo, assistant professor from the civil and environmental engineering department at the National University of Singapore.

Cheaper, smaller, and highly sensitive

As reported in Harmful Algae, the device comprises three sections—a microfluidic chip, a smartphone, and a customizable 3D-printed platform that houses optical and electrical components including a portable power source and an LED light.

Researchers coated the chip with titanium oxide phthalocyanine, a type of photoconductive polymer-based material. The photoconductive layer guides water droplets along the chip during the analysis process.

The researchers then place the coated chip on top of the screen of a smartphone, which projects a pattern of light and dark regions onto the chip. When droplets of the water sample touch the surface of the chip, a voltage drop difference, created by the light and dark areas illuminated on the photoconductive layer, modifies the surface tension of the water droplets.

This causes the water droplets to move towards the dark illuminated areas. At the same time, the movement induces the water droplets to mix with a chemical that stains any algae cells present in the water sample. Light patterns then guide the mixture toward the center of the phone’s screen.

Next, an LED light source and a green filter embedded in the 3D-printed platform, near the camera of the smartphone, create the conditions suitable for the camera to capture fluorescent images of the stained algae cells. The user then sends the images to an app to count the number of algae cells present in the sample. The user can also send the images to another location via the smartphone to quantify the number of algae cells. The entire analysis process only takes 15 minutes.
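
The paper’s image-analysis pipeline isn’t detailed here, but the core of counting stained cells in a fluorescent image (threshold the pixels, then count connected bright regions) can be sketched in plain Python; the tiny frame below stands in for a real camera image:

```python
from collections import deque

def count_cells(image, threshold):
    """Count connected bright regions (candidate cells) in a
    grayscale image, given as a 2D list of pixel intensities."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                count += 1  # new region found; flood-fill its pixels
                seen[r][c] = True
                queue = deque([(r, c)])
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

# Two bright blobs on a dark background:
frame = [
    [0, 9, 9, 0, 0],
    [0, 9, 0, 0, 8],
    [0, 0, 0, 8, 8],
]
print(count_cells(frame, threshold=5))  # 2
```

A real implementation would also filter regions by size and brightness to reject noise; this sketch only shows the counting step.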

The portable and easy-to-use device costs less than $220—excluding the phone—and is light enough to carry into the field while still generating reliable results.

Water quality tests: Anytime, anywhere

Researchers tested the system using water samples collected from the sea and reservoirs. They filtered the samples and spiked them with specific amounts of four different types of toxin-producing algae—two freshwater algae, C. reinhardtii and M. aeruginosa, and two marine water algae, Amphiprora sp. and C. closterium.

Experiments used the new device and a hemocytometer, a standard cell-counting device commonly used for water quality monitoring, to test for the presence of algae.

The new system detected the four types of algae with 90% accuracy, comparable to the hemocytometer results.

“The combination of on-chip sample preparation, data capture, and analysis makes our system unique. With this tool, water quality tests can be conducted anytime and anywhere,” Bae says.

“This new method is also very cost efficient as the microfluidic chip can be washed and re-used. This device will be particularly useful for fish farmers who need to monitor the water quality of their fish ponds on a daily basis.”

Funding for the work came from the National Research Foundation Singapore through its Marine Science Research and Development Programme and the Ministry of Education.

Source: National University of Singapore

More plants and less meat could cut brain risks later

Fri, 2019-11-15 09:51

Sticking to a healthy diet with more plants and less meat in midlife is associated with a reduced risk of cognitive impairment in old age, researchers report.

Researchers looked at the diet patterns of nearly 17,000 middle-aged participants of the Singapore Chinese Health Study over a period of 20 years. They scored the participants on how closely their diet patterns matched five high-quality diets:

  • the alternative Mediterranean diet;
  • the Alternate Healthy Eating Index 2010;
  • the Dietary Approaches to Stop Hypertension diet;
  • the plant-based diet index;
  • the healthful plant-based diet index.

Dietary patterns rich in plant-based foods—including whole grains, vegetables, fruits, nuts, and legumes—and low in red meat and sugar-sweetened beverages, have been shown to reduce the risk of cancer, diabetes, and cardiovascular diseases.

The results revealed that participants with the most similarity to these dietary patterns had a significant reduction in risk of cognitive impairment—of 18% to 33%—compared with those with the least similarity.

People in Singapore currently lead the world in life expectancy, with life spans averaging 85 years. This, along with an aging population, has increased the need to identify and take measures to prevent the development of common conditions associated with old age such as cognitive impairment and dementia.

“Our study suggests that maintaining a healthy dietary pattern is important for the prevention of onset and delay of cognitive impairment,” says Koh Woon Puay, principal investigator of the Singapore Chinese Health Study and a professor at Saw Swee Hock School of Public Health at the National University of Singapore and the Duke-NUS Medical School.

“Such a pattern is not about the restriction of a single food item but the composition of an overall pattern that recommends cutting back on red meats, especially if they are processed, and including lots of plant-based foods—vegetables, fruit, nuts, beans, whole grains—and fish.”

The Health Promotion Board in Singapore (HPB) recommends eating across all food groups for a balanced and varied diet.

“A simple guide is to fill half our plate with fruit and vegetables, a quarter with whole grains such as brown rice and whole meal bread, and the last quarter with protein foods such as bean products, seafood, and meat,” advises Annie Ling, group director of the HPB Policy, Research, and Surveillance Division.

The research appears in the American Journal of Clinical Nutrition.

Source: National University of Singapore

Doctors give electronic health records an ‘F’

Fri, 2019-11-15 09:51

Electronic health records may improve quality and efficiency for doctors and patients alike—but physicians give them an “F” for usability and they may contribute to burnout, according to new research.

By contrast, in similar but separate studies, Google’s search engine earned an “A” and ATMs a “B.” The spreadsheet software Excel got an “F.”

“A Google search is easy,” says Edward R. Melnick, assistant professor of emergency medicine and director of the Clinical Informatics Fellowship at Yale University. “There’s not a lot of learning or memorization; it’s not very error-prone. Excel, on the other hand, is a super-powerful platform, but you really have to study how to use it. EHRs mimic that.”

Usability ratings for everyday products measured with the System Usability Scale. Google: 93%; microwave: 87%; ATM: 82%; Amazon: 82%; Microsoft Word: 76%; digital video recorder: 74%; global positioning system: 71%; Microsoft Excel: 57%; electronic health records: 45%. (Credit: Michael S. Helfenbein)

There are various electronic health record systems that hospitals and other medical clinics use to digitally manage patient information. These systems replace hard-copy files, storing clinical data, such as medications, medical history, lab and radiology reports, and physician notes.

The systems were developed to improve patient care by making health information easy for healthcare providers to access and share, reducing medical error.

But the rapid rollout of EHRs following the Health Information Technology for Economic and Clinical Health Act of 2009, which pumped $27 billion of federal incentives into the adoption of EHRs in the US, forced doctors to adapt quickly to often complex systems, leading to increasing frustration.

Two hours of personal time

According to the study, physicians spend one to two hours on EHRs and other deskwork for every hour spent with patients, and an additional one to two hours daily of personal time on EHR-related activities.

“As recently as 10 years ago, physicians were still scribbling notes,” Melnick says. “Now, there’s a ton of structured data entry, which means that physicians have to check a lot of boxes.

“Often this structured data does very little to improve care; instead, it’s used for billing. And looking for communication from another doctor or a specific test result in a patient’s chart can be like trying to find a needle in a haystack. The boxes may have been checked, but the patient’s story and information have been lost in the process.”

For the current study, published in Mayo Clinic Proceedings, Melnick zeroed in on the effect of EHRs on physician burnout.

The American Medical Association (AMA), along with researchers at the Mayo Clinic and Stanford University, surveys over 5,000 physicians every three years on topics related to burnout. Most recently, the burnout rate was 43.9%—a drop from 54.4% in 2014, but still worryingly high, researchers say. The same survey found that the burnout rate for the general US population was 28.6%.

Electronic health records and burnout

Researchers also asked one quarter of the respondents to rate their EHR’s usability using the System Usability Scale (SUS), a measure previously applied in over 1,300 usability studies across various industries.

Users in other studies ranked Google’s search engine an “A.” Microwave ovens, ATMs, and Amazon got “Bs.” Microsoft Word, DVRs, and GPS got “Cs.” Microsoft Excel, with its steep learning curve, got an “F.”

In Melnick’s study, EHRs came in last, with a score of 45—an even lower “F” score than Excel’s 57.
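
The letter grades map raw SUS scores (0–100) onto a grading curve. The study’s exact cutoffs aren’t given here, so the boundaries below are illustrative, chosen only to reproduce the grades quoted in this article:

```python
def sus_grade(score):
    """Map a 0-100 System Usability Scale score to a letter grade.
    Cutoffs are illustrative, picked to match the scores quoted in
    this article (Google 93 -> A, ATM 82 -> B, Excel 57 -> F)."""
    if score >= 90:
        return "A"
    if score >= 80:
        return "B"
    if score >= 70:
        return "C"
    if score >= 60:
        return "D"
    return "F"

ratings = {"Google search": 93, "Microwave": 87, "ATM": 82,
           "Microsoft Word": 76, "Excel": 57, "EHR": 45}
for product, score in ratings.items():
    print(f"{product}: {score} -> {sus_grade(score)}")
```

Published SUS research uses slightly different curved boundaries, so treat these thresholds as a rough guide rather than the study’s own grading scheme.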

Further, EHR usability ratings correlated highly with burnout—the lower physicians rated their EHR, the higher the likelihood that they also reported symptoms of burnout.

The study found that certain physician specialties rated their EHRs especially poorly—among them, dermatology, orthopedic surgery, and general surgery.

Specialties with the highest SUS scores included anesthesiology, general pediatrics, and pediatric subspecialties.

Demographic factors like age and location matter, too. Older physicians found EHRs less usable, and doctors working in veterans’ hospitals rated their EHR higher than physicians in private practice or in academic medical centers.

Benchmarking physicians’ feelings about EHRs will make it possible to track the effect of technology improvements on usability and burnout, Melnick says.

“We’re trying to improve and standardize EHRs,” Melnick says. “The goal is that with future work, we won’t have to ask doctors how they feel about the EHR or even how burned out they are, but that we can see how doctors are interfacing with the EHR and, when it improves, we can see that improvement.”

Source: Yale University

Does learning music hinge on smarts, not mindset?

Fri, 2019-11-15 09:46

Intelligence could play a role in how quickly people learn music, according to new research on the early stages of learning to play piano.

The study may be the first to examine the relationship among intelligence, music aptitude, and growth mindset in beginner pianists.

Growth mindset refers to whether students believe they can improve basic abilities, like piano ability.

“The strongest predictor of skill acquisition was intelligence, followed by music aptitude,” says Alexander Burgoyne, a doctoral candidate in cognition and cognitive neuroscience at Michigan State University.

“By contrast, the correlation between growth mindset and piano performance was about as close to zero as possible,” he says.

In the study, 161 undergraduates were taught how to play “Happy Birthday” on the piano with the help of a video guide. After practice, the students performed the 25-note song multiple times. Three graduate students judged the performances based on their melodic and rhythmic accuracy.

There were striking differences in the students’ skill acquisition trajectories. Some learned quickly, earning perfect marks within six minutes of practice. Others performed poorly at first but improved substantially later. Still others seemed to fade, as if they had lost their motivation, and some never figured it out, performing poorly throughout the study.

So why did some students fail while others succeeded?

To find out, the researchers gave the students tests of cognitive ability that measured things like problem-solving skills and processing speed, and tests of music aptitude that measured, for example, the ability to differentiate between similar rhythms. They also surveyed the students’ growth mindsets.

“The results were surprising, because people have claimed that mindset plays an important role when students are confronted with challenges, like trying to learn a new musical instrument,” Burgoyne says. “And yet, it didn’t predict skill acquisition.”

That said, results will likely differ for those with greater skill.

“Our study examined one of the earliest stages of skill acquisition,” Burgoyne says. “Early experiences can be formative, but I would caution against drawing conclusions about skilled musicians based on our study of beginners.”

But applied generally, the study’s findings may be helpful in education.

It follows a recent review of mindset research that found a weak relationship between growth mindset and academic achievement.

Perhaps more concerning, that study found interventions designed to boost achievement by encouraging children to believe they can improve their basic abilities may be fruitless. That is, when those interventions successfully altered students’ mindsets, there wasn’t a significant effect on academic achievement.

The paper appears in the journal Intelligence.

Source: Michigan State University

Discovery is a key to deciphering this lost Minoan language

Fri, 2019-11-15 09:21

Determining the word order of Linear A, a precursor to the earliest form of written Greek, is a step toward finally deciphering the long-lost language.

Linear A is the yet-undeciphered language of the ancient Minoan civilization of Crete that flourished from roughly 1700 BCE to 1490 BCE.

The Minoans live on in myth as people of the land of King Minos who kept the half-bull, half-man Minotaur in a labyrinth below his palace at Knossos.

“If we can decipher these inscriptions, we will have the personal prayers of Minoan people.”

They are also possibly the oldest civilization of Western Europe, and their language could reveal more about a people and culture that formed the foundation on which ancient Greek and (ultimately) Roman culture were built.

Linguist and archaeologist Brent Davis, a lecturer at the University of Melbourne, is one of only a handful of people around the world to have made any significant headway on solving Linear A in the last 50 years.

He established for the first time that the word order of the language is verb-subject-object, like ancient Egyptian. So rather than “Minos has a minotaur,” a Minoan would write “has Minos a minotaur.”

Cracking an ancient language

The eccentric English architect Michael Ventris famously cracked Linear B, a slightly later but closely related script found in Crete and mainland Greece, in 1952.

He discovered that Linear B was actually a very early form of ancient Greek—Mycenaean—and his finding pushed the origin of ancient Greek civilization back 500 years earlier than first thought.

The Linear B tablets were preserved by chance when the dried clay that had been written on was fired as a result of palaces and other buildings burning down during natural and human-made calamities.

The information they revealed proved to be largely inventories of people, produce, accounts, offerings, and other goods, giving us glimpses of people and their occupations.

Linear A is likely to reveal similar information, but Davis says much Linear A occurs as religious script. “If we can decipher these inscriptions, we will have the personal prayers of Minoan people,” he says.

At the time he cracked Linear B, Ventris told the BBC it was like having to solve a crossword puzzle without knowing which spaces are blacked-out.

In fact, Ventris’ achievement was built on the crucial work of little-acknowledged US classicist Alice Kober, who died in 1950.

It was Kober who identified similar word endings in Linear B, allowing her to find some root words she thought were place names and which Ventris would later realize were akin to Greek.

She also devised a method for tabulating the relationships between signs that Ventris would build on—leaving behind more than 180,000 index cards.

Linear A is a tougher challenge

Deciphering Linear B was a monumental achievement, but the challenge of Linear A is even more difficult. That’s partly because the language behind the script doesn’t appear to be like any other language.

“It seems to be a wholly unknown indigenous language,” says Davis.

“Linear B took most of its signs from Linear A, and because we can read Linear B, we can actually pronounce Linear A inscriptions, but if you do pronounce them, it just sounds like complete gobbledygook.”

Like Ventris, Davis became fascinated with deciphering ancient languages as a boy, particularly the story of how Egyptian hieroglyphs were deciphered using the Rosetta Stone that Napoleon’s soldiers found in Egypt.

But he’s always known that solving Linear A was a tough task.

“Ventris vowed, when he was just 14, that one day he’d solve Linear B. At the same age I was saying I’d love to solve Linear A, but I’m not promising anything,” laughs Davis.

‘Yasumatu gives olives’

By establishing the word order of the language, linguists can identify the function of a word in a sentence just from its position. It’s like finding a key word in a massive crossword puzzle.

“The default word order in English is subject (S)-verb (V)-object (O), as in the phrase ‘John likes cats.’ And we know that about 97% of human languages are either in this form or S-O-V (‘John cats likes’) or V-S-O (‘Likes John cats’).”

“…what we really need to find is a palace archive, which is where we are likely to find enough Linear A to finally decipher it.”

But when Davis looked at other Bronze Age languages of this period in the region, none were like English.

They were either S-O-V (like early Greek and Sumerian), or V-S-O (like ancient Egyptian). He guessed Linear A was likely to have one of these two word orders.

He then applied this framework to a series of inscriptions that appear on Minoan offering bowls. To put it simply, he found that the words on the bowls tended to recur in what was obviously a formula, except for the second word in the inscription, which was always different from bowl to bowl.

His guess was that this word was probably the name of the person (the subject) making the offering. If correct then Linear A was likely a V-S-O language.

That was confirmed when he found the Linear B sign for “olives” (which Linear B borrowed from Linear A) occurring after the name as the object of the phrase.

The repeated start of the phrase was therefore a verb, like “gives”, yielding the phrase “gives Yasumatu olives,” or in English, “Yasumatu gives olives.”

Just such an offering of olives in a goblet has been found, preserved at the bottom of a sacred Minoan well. “It was a huge feeling of discovery, completely thrilling,” says Davis.

The rest may depend on dirty work

But he cautions that understanding the word order alone won’t be enough to solve Linear A. “Examining the word order provides something of a magic key, but if we are to crack it, what we need most is simply more material,” he says.

Material was another advantage that Ventris had in deciphering Linear B. There were 20,000 examples of Linear B signs occurring in inscriptions, compared to just 7,000 examples of Linear A signs. “That is about three to four A4 pages’ worth.

“Mathematicians tell us that if we are to crack Linear A, we’ll need something like 10,000 to 12,000 examples of signs, which means we aren’t that far away—but it all depends on archaeology.

“Discoveries are still being made, so I’m optimistic, but what we really need to find is a palace archive, which is where we are likely to find enough Linear A to finally decipher it.”

It’s an intellectual problem that still needs some serious dirt-digging.

Davis is the 2019 winner of the Michael Ventris Award at the University of London, which the Michael Ventris Memorial Fund supports.

Source: University of Melbourne

How HIV dodges our immune defenses

Fri, 2019-11-15 08:18

New research reveals how a protein that specializes in killing off invading viruses latches on to attackers, as well as how some viruses like HIV evade capture and death.

Humans have evolved dynamic defense mechanisms against the viruses that seek to infect our bodies—proteins that specialize in identifying, capturing, and destroying the genetic material that viruses try to sneak into our cells.

Revealing the precise mechanism that makes the protein, called ZAP (short for zinc-finger antiviral protein), an effective antiviral in some cases is a critical first step in the path toward better methods for attacking viruses that manage to dodge it.

Cells make ZAP to restrict a virus from replicating and spreading infection. When cells detect a virus, the ZAP gene turns on and produces more of the protein. ZAP then singles out the virus’s genetic material, RNA, from the cell’s native RNA and targets the viral RNA for destruction.

The researchers wanted to determine how ZAP recognizes the virus’s genome and how some viruses avoid it.


A previous study revealed that ZAP grabs onto only one specific sequence of neighboring nucleotides (the building blocks of DNA and RNA): a cytosine followed by a guanine, or a CG dinucleotide. Human RNAs have few CG dinucleotides, and HIV RNA has evolved to mimic this characteristic.
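
The CG-content signal ZAP exploits is straightforward to quantify: scan an RNA sequence and count each cytosine immediately followed by a guanine. A minimal sketch, using toy sequences invented for illustration rather than real viral genomes:

```python
def cg_dinucleotides(rna):
    """Count CG dinucleotides (a C immediately followed by a G)
    in an RNA sequence, scanning every adjacent pair of bases."""
    rna = rna.upper()
    return sum(1 for i in range(len(rna) - 1) if rna[i:i + 2] == "CG")

# Hypothetical toy sequences: a CG-rich stretch versus a
# CG-suppressed one mimicking HIV-like composition.
rich = "ACGUCGGCGA"        # CG appears at three positions
suppressed = "ACAUGGGCAA"  # same length, no CG pairs (GC doesn't count)
print(cg_dinucleotides(rich), cg_dinucleotides(suppressed))  # 3 0
```

Note that only the ordered pair C-then-G matters: the suppressed sequence still contains a GC pair, which ZAP’s binding site would ignore.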

“The main motivation for the study was, ‘How does HIV avoid this antiviral protein?'” says co-lead author Jennifer Meagher, a researcher at the Life Sciences Institute at the University of Michigan. “And because we’re structural biologists, we wanted to determine how ZAP ‘sees’ a CG dinucleotide—and how, structurally, it binds the RNA.”

Using a piece of viral RNA that researchers genetically altered to include extra CG sequences, Meagher and her colleagues determined the structure of the ZAP protein bound to RNA, exposing the mechanisms that enable the protein to be so selective.

The researchers discovered that ZAP binds to the viral RNA at only one of the four “zinc fingers” on the protein that they considered potential binding sites. They further demonstrated that even a tiny change to that one binding site—altering just a single atom—hampered ZAP’s binding ability.

A ‘molecular arms race’

Working in cells, researchers found similar results when they altered ZAP’s composition. They created mutant versions of ZAP and expressed them in cells infected with either normal HIV or a version of the virus enriched with CG sequences.

The mutant ZAP proteins had a harder time recognizing CG-enriched regions of the viral RNA in cells. They also exhibited increased binding to areas of the RNA that were not rich in CG dinucleotides, indicating that the alterations impair ZAP’s ability to distinguish viral RNA from human RNA.

“Natural selection appears to have shaped the ZAP protein structure in such a way to optimize the discrimination of nonself from self RNA, based on CG dinucleotide content,” says Paul Bieniasz, an investigator of the Howard Hughes Medical Institute and head of the Laboratory of Retrovirology at Rockefeller University. “However, successful viruses are often one step ahead in a molecular arms race.”

“This is the crucial first step in a complicated story of how the cell eventually degrades the virus’s RNA,” says Janet Smith, a research professor at LSI and a professor of biological chemistry at the University of Michigan Medical School. “And now we know how the step is executed, and why it is not effective on HIV and other viruses that lack this CG sequence.”

The paper appears in the Proceedings of the National Academy of Sciences.

The research was done through the Center for HIV RNA Studies and received support from the National Institutes of Health, Howard Hughes Medical Institute, Michigan Economic Development Corporation, and the Michigan Technology Tri-Corridor. X-ray crystallography data came from the US Department of Energy’s Advanced Photon Source at Argonne National Laboratory.

Source: University of Michigan

That sick feeling might actually be an emotion

Fri, 2019-11-15 07:25

That weary feeling that sets in with an illness is an emotion that helps you fight off infection, researchers say.

Slack facial muscles and drooping eyelids appear early. Exhaustion, loss of appetite, and increased sensitivity to cold and pain follow. Those signs are among a long list of features that researchers have linked to the emotion of being sick, which the authors label lassitude, a now little-used 16th-century term for weariness, borrowed from Latin.

In a paper in the journal Evolution and Human Behavior, researchers argue that the state of being sick qualifies as an emotion following a review of the literature on sickness behavior, most of which focused on behavioral and physiological changes in nonhuman animals.

Feeling sick to feel better

In the paper, the researchers merge the accrued knowledge from 130 published studies and propose that lassitude is a complex adaptation, like the immune system, that evolved to help people fight infectious disease.

“The immune system clearly helps us fight off infections, but activating the immune system costs a lot of energy,” says lead author Joshua Schrock, a doctoral student at the University of Oregon. “This cost creates a series of predicaments for the body’s regulatory systems.”

“Lassitude is the program that adjusts your body’s regulatory systems to set them up for fighting infection,” Schrock says. “These adjustments make you feel sadder, more fatigued, more easily nauseated, less hungry, and more sensitive to cold and pain.”

Lassitude, the researchers write, persists until the immune response subsides. During that response, the body calls upon various mechanisms to coordinate the fight against infection, which, they note, can trigger symptoms resembling psychological depression.

Changing your behavior

During the battle, lassitude coordinates adjustments to patterns of movement, risk avoidance, body temperature, appetite, and even how a person elicits caregiving behavior from social networks.

Lassitude, the researchers write, “modifies the cost-benefit structure of a wide range of decisions.” Those who are ill place lower value on food and sex, for example, and often prefer to avoid social and physical risks.

“When threat levels are high, the system sends a signal to various motivational systems, configuring them in ways that facilitate effective immunity and pathogen clearance,” the researchers write in their conclusion. “We believe that investigating the information-processing structure of lassitude will contribute to a more complete understanding of sickness behavior, much like the information-processing structure of hunger helps us understand feeding behavior.”

While the paper focused primarily on illnesses triggered by bacteria, viruses, parasitic worms, and protozoans, the researchers also theorized that other situations—such as injuries, poisoning, and chronic degenerative diseases—may present similar adaptive problems.

Source: University of Oregon

‘Bottlebrush’ polymers bring coatings under control

Thu, 2019-11-14 15:38

Microscopic “bottlebrush” polymers that look like the common kitchen implement could offer exquisite control over coatings, researchers say.

Rafael Verduzco, a chemical and biomolecular engineer at Rice University’s Brown School of Engineering, has long studied bottlebrush copolymers. Now, he and colleagues have developed models and methods to refine surface coatings to make them, for instance, more waterproof or more conductive.

The researchers discovered that bottlebrushes mixed with linear polymers tend to migrate to the top and bottom of a thin film as it dries. These films, as coatings, are ubiquitous in products—as waterproof layers to keep metals from rusting or fabrics from staining.

When the migration happens, the linear polymers hold the center while the bottlebrushes are drawn to the air above or the substrate below. This effectively decouples the properties of the bulk coating from its exposed surfaces, Verduzco says.

Computational models and experiments showed that variations in the bottlebrush itself could control surface characteristics.

Never-ending uses for bottlebrush polymers

Bottlebrush polymers remain challenging to make in bulk, but their potential uses are vast, Verduzco says. Applications could include drug delivery via functionalized bottlebrushes that form micelles, as well as lubricants, soft elastomers, anti-fouling filters, and surfaces that heal themselves.

The researchers characterized various bottlebrushes made of polystyrene and poly(methyl methacrylate) (aka PMMA) while studying what causes the polymers to migrate.

Resembling their macro kitchen cousins (as well as certain flowers), bottlebrushes consist of small polymer chains that radiate outward from a linear polymer rod. The bottlebrushes self-assemble in a solution, which researchers can manipulate to adjust their properties.

Coatings are everywhere

Coatings are ubiquitous, Verduzco says. “If we didn’t have the right coatings, our materials would degrade quickly,” he says. “They would react in ways we don’t want them to. So coating a surface is usually a separate process; you make something and then you have to find a way to deposit a coating on top of it.

“What we’re looking at is a kind of universal additive, a molecule you can blend with whatever you’re making that will spontaneously go to the surface or the interface,” he says. “That’s how we ended up using bottlebrushes.”

Researchers can tune bottlebrushes by varying the number of side chains, their length, or the length of the backbone polymer, Verduzco says. The side chains themselves can be of mixed type, and small molecules or proteins can be added to their end groups.

“The chemistry of these materials is advanced sufficiently that you can pretty much put just about any kind of polymer as one of these bristles on the side chain,” he says. “You can put them in different order.”

The study is published in the journal Macromolecules. Additional coauthors are from the University of Tennessee, Knoxville; Oak Ridge National Laboratory; and the University of Houston.

Source: Rice University

The post ‘Bottlebrush’ polymers bring coatings under control appeared first on Futurity.

Careful male allies can ease sexism at work

Thu, 2019-11-14 15:01

A new study on sex-based discrimination toward women in the workplace documents the pluses and minuses of male allies.

They can play a powerful role in combating chauvinistic behavior toward women, according to the study, but they can also unintentionally contribute to sexism.

An increase in the number of sex-based discrimination charges filed with the US Equal Employment Opportunity Commission in recent years prompted the research, says Eden King, an associate professor of psychology at Rice University and the study’s senior author.

“A lot of research has already been done about how women can fight sexism in the workplace,” King says. “What we were interested in studying was how men play a role in this.”

King and her fellow authors evaluated 100 women of varying ethnicities, ranging in age from 19 to 69, with total work experience ranging from 1 to 50 years. These women took an online survey about male ally behavior in the workplace and were asked to recall situations when they thought their male allies were effective or ineffective in helping them fight sexism.

The researchers found that men can effectively act as allies in a number of ways, including doing things to advance a woman’s career (such as offering special projects or promotions), putting a stop to bad behavior by peers, or simply lending support when asked.

The women surveyed described a number of positive side effects from having male allies, including feeling grateful, happy, confident, empowered, supported, and more comfortable in their workplace.

“The ally’s behavior made me feel valued and ‘heard,’” one participant wrote.

However, the women answering the survey also pointed out situations where male allies did more harm than good. Women most frequently described allyship as ineffective when it had no impact on sexist behavior or organizational culture, or when they or their ally experienced backlash over their actions.

Some women also described situations where male allies’ behavior hindered their careers. One woman described how a colleague with a negative reputation tried to promote her, but his support ultimately led to her contract not being renewed.

“When we did this study, we were concerned that not everything people do believing they are acting as an ally is actually construed that way,” King says. “And we discovered that this is very true.”

A less common experience the surveyed women reported was male allies exhibiting a “savior complex,” stepping in to help or intervene on behalf of a woman who didn’t want or need the help.

“The participants indicated that this type of behavior made them feel less confident in their ability to fulfill their job responsibilities,” King says.

Ultimately, the researchers say that male allies should take cues from their female colleagues about how to be an ally. Some common forms of allyship that participants described as helpful were listening and being a confidante behind the scenes, in addition to taking steps that ensure women get the same opportunities as men, including promotions and raises.

“While we found that allies can have a very positive impact, we encourage these individuals to confer with their female colleagues to see if help is wanted or needed,” King says. “If the answer is yes, then allies should keep doing what they are doing. If the answer is no, they should respect that.”

The study appears in Personnel Assessment and Decisions.

Source: Rice University

The post Careful male allies can ease sexism at work appeared first on Futurity.

Buckyballs in space may come from dying stars

Thu, 2019-11-14 13:59

New research may explain how “buckyballs”—complex carbon molecules with a soccer-ball-like structure—form in space.

Carbon 60, or C60 for short (its official name is buckminsterfullerene), is a spherical molecule consisting of 60 carbon atoms organized into five-membered and six-membered rings. The name “buckyball” derives from the molecules’ resemblance to the architectural work of Richard Buckminster Fuller, who designed many dome structures that look similar to C60. Their formation was thought to be possible only in lab settings until their detection in space challenged that assumption.
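The soccer-ball geometry is fully determined by counting: with 60 carbons, each forming three bonds, and only five- and six-membered rings, Euler’s polyhedron formula forces exactly 12 pentagons and 20 hexagons. A quick illustrative check (this arithmetic is ours, not from the article):

```python
# Euler's polyhedron formula: V - E + F = 2.
# C60 has 60 carbon atoms (vertices), each forming 3 bonds.
V = 60
E = 3 * V // 2        # 90 bonds; each bond is shared by two atoms
F = 2 - V + E         # 32 faces (rings) by Euler's formula

# Each edge borders two faces, so 5*pentagons + 6*hexagons = 2*E,
# while pentagons + hexagons = F. Solving the two equations:
pentagons = 6 * F - 2 * E
hexagons = F - pentagons

print(pentagons, hexagons)  # -> 12 20: the classic soccer-ball pattern
```

Notably, the 12-pentagon count holds for any closed carbon cage built only from pentagons and hexagons, whatever its size; larger fullerenes like C70 just add more hexagons.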

The existence of buckyballs in space has long puzzled scientists. For decades, people thought interstellar space was sprinkled only with lightweight molecules: mostly single atoms, two-atom molecules, and the occasional nine- or ten-atom molecule. That held until researchers detected massive C60 and C70 molecules a few years ago.

An artist’s conception showing spherical carbon molecules known as buckyballs coming out from a planetary nebula—material shed by a dying star. Researchers at the University of Arizona have now created these molecules under laboratory conditions thought to mimic those in their “natural” habitat in space. (Image: NASA/JPL-Caltech)

Researchers were also surprised to find that they were composed of pure carbon. In the lab, researchers blast together pure carbon sources, such as graphite, to make C60. In space, they detected C60 in planetary nebulae, which are the debris of dying stars. This environment has about 10,000 hydrogen molecules for every carbon molecule.

“Any hydrogen should destroy fullerene synthesis,” says lead author Jacob Bernal, a doctoral student in astrobiology and chemistry at the University of Arizona. “If you have a box of balls, and for every 10,000 hydrogen balls you have one carbon, and you keep shaking them, how likely is it that you get 60 carbons to stick together? It’s very unlikely.”

Spacey simulations

Bernal and his coauthors began investigating the C60 mechanism after realizing that the transmission electron microscope, or TEM, at the University of Arizona could simulate the planetary nebula environment fairly well.

The TEM has a serial number of “1” because it is the first of its kind in the world with its exact configuration. Its 200,000-volt electron beam can probe matter down to 78 picometers—scales too small for the human brain to comprehend—allowing it to image individual atoms. It operates under vacuum at extremely low pressures, very close to the pressures found in circumstellar environments.

“It’s not that we necessarily tailored the instrument to have these specific kinds of pressures,” says coauthor Tom Zega, associate professor in the Lunar and Planetary Lab. “These instruments operate at those kinds of very low pressures not because we want them to be like stars, but because molecules of the atmosphere get in the way when you’re trying to do high-resolution imaging with electron microscopes.”

Tom Zega at the control panel of the 12-foot tall transmission electron microscope at the Kuiper Materials Imaging and Characterization Facility at the Lunar and Planetary Lab. The instrument revealed that buckyballs formed in samples researchers exposed to conditions thought to reflect those in planetary nebulae. (Photo: Daniel Stolte/University Communications)

The team partnered with the US Department of Energy’s Argonne National Lab, near Chicago, which has a TEM capable of studying radiation responses of materials. They placed silicon carbide, a common form of dust made in stars, in the low-pressure environment of the TEM, subjected it to temperatures up to 1,830 degrees Fahrenheit, and irradiated it with high-energy xenon ions.

Then, they brought it back to Tucson for researchers to utilize the higher resolution and better analytical capabilities of the University of Arizona TEM. They knew they could validate their hypothesis if they observed the silicon shedding and exposing pure carbon.

The stellar origins of buckyballs

“Sure enough, the silicon came off, and you were left with layers of carbon in six-membered ring sets called graphite,” says coauthor Lucy Ziurys, professor of astronomy, chemistry, and biochemistry. “And then when the grains had an uneven surface, five-membered and six-membered rings formed and made spherical structures matching the diameter of C60. So, we think we’re seeing C60.”

This work suggests that C60 derives from the silicon carbide dust made by dying stars, which is then hit by high temperatures, shockwaves, and high-energy particles, leaching silicon from the surface and leaving carbon behind. These big molecules disperse as dying stars eject their material into the interstellar medium—the space in between stars—which accounts for buckyballs’ presence outside of planetary nebulae. Buckyballs are very stable against radiation, allowing them to survive for billions of years if shielded from the harsh environment of space.

“The conditions in the universe where we would expect complex things to be destroyed are actually the conditions that create them,” Bernal says, adding that the implications of the findings are endless.

“If this mechanism is forming C60, it’s probably forming all kinds of carbon nanostructures,” Ziurys says. “And if you read the chemical literature, these are all thought to be synthetic materials only made in the lab, and yet, interstellar space seems to be making them naturally.”

If these findings are any indication, the universe has more to tell us about how chemistry truly works.

The research appears in the Astrophysical Journal Letters.

Support for the work came from the National Science Foundation, the National Institutes of Health, the US Department of Energy, and the Sloan Foundation Baseline Scholars Program. The National Science Foundation and NASA fund the Arizona TEM.

Source: University of Arizona

The post Buckyballs in space may come from dying stars appeared first on Futurity.

Arctic sea ice loss opens marine mammals to deadly virus

Thu, 2019-11-14 13:34

Scientists have linked Arctic sea ice loss to a deadly virus that could threaten marine mammals in the North Pacific, according to a new study.

Researchers identified phocine distemper virus, or PDV, in northern sea otters in Alaska in 2004. The pathogen had killed thousands of European harbor seals in the North Atlantic in 2002, raising questions about when and how the virus reached the North Pacific.

The 15-year study, published in the journal Scientific Reports, highlights how the radical reshaping of historic sea ice may have opened pathways for contact between Arctic and sub-Arctic seals that was previously impossible, allowing the virus’s introduction into the North Pacific Ocean.

“The loss of sea ice is leading marine wildlife to seek and forage in new habitats and removing that physical barrier, allowing for new pathways for them to move,” says corresponding author Tracey Goldstein, associate director of the One Health Institute at the School of Veterinary Medicine at the University of California, Davis.

“As animals move and come in contact with other species, they carry opportunities to introduce and transmit new infectious disease, with potentially devastating impacts.”

Researchers sampled marine mammals for phocine distemper virus exposure and infection from 2001 to 2016. Sampled mammals included ice-associated seals, northern fur seals, Steller sea lions, and northern sea otters from Southeast Alaska to Russia along the Aleutian Islands and the Bering, Chukchi, and Beaufort seas.

They assessed Arctic Ocean sea ice and open-water routes from the North Atlantic to the North Pacific. Satellite telemetry data helped link animal movement and risk factor data, demonstrating that exposed animals have the potential to carry phocine distemper virus long distances.

The researchers identified widespread infection and exposure to the virus across the North Pacific Ocean beginning in 2003, with a second peak of exposure and infection in 2009. These peaks coincided with reductions in Arctic sea ice extent.

“As sea ice continues its melting trend, the opportunities for this virus and other pathogens to cross between North Atlantic and North Pacific marine mammals may become more common,” says first author Elizabeth VanWormer, a postdoctoral researcher at UC Davis during the study and currently an assistant professor at the University of Nebraska, Lincoln.

“This study highlights the need to understand PDV transmission and the potential for outbreaks in sensitive species within this rapidly changing environment.”

Additional coauthors are from the University of Saint Andrews, the US Fish and Wildlife Service, National Oceanic and Atmospheric Administration Fisheries, Alaska Fisheries Science Center Marine Mammal Center, the University of Glasgow, Alaska Department of Fish and Game, the University of Alaska-Fairbanks, Queens University Belfast, Pirbright Institute, and the Alaska Veterinary Pathology.

The Morris Animal Foundation, the NOAA Oceans and Human Health Graduate Traineeship Program, the Alaska Fisheries Science Center Marine Mammal Laboratory, and the US Fish and Wildlife Service funded the work.

Source: UC Davis

The post Arctic sea ice loss opens marine mammals to deadly virus appeared first on Futurity.