Being diagnosed with a malignant brain tumor is devastating news for patients and their loved ones. While some types of tumors respond well to treatment, others, such as glioblastomas—the most common and aggressive brain tumors—recur and progress within a short time of diagnosis.
Patients diagnosed with this type of cancer, and who undergo current standard treatment, have a median survival of 16 months.
Now, researchers have developed a new clinical approach to glioblastoma treatment that increased median survival to 22 months. The findings of the phase II clinical trial appear in the International Journal of Radiation Oncology.
“Glioblastomas are very difficult to treat,” says George Shenouda, radio-oncologist at the McGill University Health Centre and lead author of the study. “These tumors grow and spread quickly throughout the brain, making them very difficult to completely remove with surgery.”
The standard treatment for glioblastomas includes removing as much of the tumor as possible with surgery and then eliminating what is left with radiotherapy combined with chemotherapy. After surgery, patients need at least four to five weeks of recovery before starting radiotherapy.
Unfortunately, during recovery, any remaining cancer cells will continue to grow. To make matters more complicated, remaining cancer cells, mainly cancer stem cells, can be more resistant to radiotherapy and chemotherapy.
The new approach adds chemotherapy prior to radiotherapy—a process called neo-adjuvant chemotherapy. Giving neo-adjuvant chemotherapy prevented the tumor from progressing during recovery and increased survival. After the neo-adjuvant chemotherapy, patients then received accelerated radiotherapy.
“We had better control over the tumor by giving patients the same overall dose of radiotherapy in fewer sessions and a shorter period of time. By doing this, we increased the efficacy of the treatment and we believe that in turn the treatment targeted the stem cells, which are the basis of recurrence,” Shenouda says.
“Reducing the radiotherapy sessions by one-third also alleviates the burden for patients. In addition, this represents a considerable cost reduction of delivery of treatment.”
Although additional research is required, the initial results are very promising, Shenouda says. “Fifty percent of the patients in our study have survived two years since their diagnosis—this is very encouraging and we are very positive about the outcome.”
Source: McGill University
The post Treatment extends lives of patients with glioblastomas appeared first on Futurity.
Even students who perform extremely well on math exams can suffer from anxiety. And the better a student does at math, the more strongly anxiety will drag his or her performance down, new research shows.
And the relationship between anxiety and achievement holds true not just in the United States, but worldwide.
“Math anxiety is disrupting these students’ ability to fulfill their potential,” says Alana Foley, a postdoctoral fellow in psychology at the University of Chicago. “Even though they’re still doing better than kids who are overall performing lower, they’re not performing as well as they could because they have math anxiety.”
For a new study in Current Directions in Psychological Science, researchers looked at the findings of 40 different laboratory experiments combined with analysis of data from the Programme for International Student Assessment, which administers standardized math tests to 15-year-old students around the world. The lab studies provide insight into the test results, and the test results help contextualize the lab studies.
“The effects of anxiety are true, even in countries that we think of as being really high-performing in math—Singapore, Korea, Japan, China,” says coauthor Julianne Herts, a doctoral student in psychology. “Even students in those countries who perform very well in math and score very high on tests still show this relation. That’s something we didn’t know would be the case.”
Why does anxiety have such a hold? To do math, we need to be able to hold information in our minds and manipulate and remember it, behavioral and neuroimaging studies suggest.
“The students who normally do really well have a large capacity to hold information in their minds and use advanced strategies that require a lot of cognitive resources,” Foley says. “But when they’re math anxious, the anxiety and the emotion system of the brain interfere with their ability to hold onto information, so they end up performing much worse than they otherwise would if they weren’t anxious.”
Telling students that symptoms associated with anxiety, such as a rapid heartbeat, can actually help them do well may improve their performance, says coauthor Sian Beilock, professor of psychology.
“Research also shows anxious students’ performance improves when they write about their feelings before taking a test. Externalizing the anxiety seems to help alleviate its deleterious effects,” Beilock adds.
No intervention can be expected to work in every culture, Herts says. “We have to look at how math anxiety might operate differently in different countries, even though it has the same effect.”
Researchers from the Organization for Economic Cooperation and Development in Paris are coauthors of the study. The Overdeck Family Foundation, National Science Foundation, Heising-Simons Foundation, and the US Department of Education funded the work.
Source: University of Chicago
Consequences of global climate change—loss of Arctic sea ice and increased Eurasian snowfall—may make China’s severe winter air pollution problems worse.
Modeling and data analysis suggest that sea ice and snowfall changes have shifted China’s winter monsoon, helping create stagnant atmospheric conditions that trap pollution over the country’s major population and industrial centers. Those changes in regional atmospheric conditions are frustrating efforts to address pollution through emission controls.
“Emissions in China have been decreasing over the last four years, but the severe winter haze is not getting better,” says Yuhang Wang, a professor in Georgia Tech’s School of Earth and Atmospheric Sciences. “Mostly, that’s because of a very rapid change in the high polar regions where sea ice is decreasing and snowfall is increasing. This perturbation keeps cold air from getting into the eastern parts of China where it would flush out the air pollution.”
The paper in Science Advances presents a clear example of how large-scale perturbations resulting from global climate change can have significant regional impacts, and is believed to be the first to link sea ice and snowfall levels to regional air pollution.
Haze problems in the East China Plains—which include the capital Beijing—first gained worldwide attention during the winter of 2013 when an instrument at the US embassy recorded extremely high levels of PM2.5 particles. The haze prompted the Chinese government to institute strict targets for reducing emissions from industry and other sources.
Though these emission controls appear to be working, the haze during December and January continues. So Wang and colleagues Yufei Zou, Yuzhong Zhang, and Ja-Ho Koo wondered if other factors might play a role.
Long-term air quality measurements aren’t available in China, so the researchers had to piece together estimates based on visibility measures and satellite data. To analyze the historical records, they created a new Pollution Potential Index (PPI) that used air temperature gradient anomalies and surface wind speeds as a proxy for ventilation conditions over eastern China.
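The paper's exact PPI formula isn't reproduced here, but the idea of combining temperature-gradient anomalies and surface wind speeds into a single stagnation proxy can be sketched roughly as follows. The standardize-and-combine scheme, the sign convention, and all numbers below are illustrative assumptions, not the study's definition:

```python
# Hypothetical sketch of a Pollution Potential Index (PPI): combine
# standardized anomalies so that higher values mean poorer ventilation.
from statistics import mean, stdev

def standardize(series):
    """Convert a series to z-scores (zero mean, unit variance)."""
    m, s = mean(series), stdev(series)
    return [(x - m) / s for x in series]

def pollution_potential_index(temp_gradient_anomaly, surface_wind_speed):
    """Assumed convention: larger gradient anomalies (a more stable
    temperature structure) and weaker surface winds both indicate
    stagnation, so wind speed enters with a negative sign."""
    zt = standardize(temp_gradient_anomaly)
    zw = standardize(surface_wind_speed)
    return [t - w for t, w in zip(zt, zw)]

# Four toy winters: the last pairs a strong anomaly with weak winds.
ppi = pollution_potential_index([0.1, 0.0, -0.1, 1.2], [3.0, 3.2, 3.1, 1.5])
```

In a scheme like this, January 2013 would stand out as the period whose combined score towers over the rest of the multi-decade record.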
“Once we generated the PPI and combined it with the visibility data, it was obvious that January 2013 was well beyond anything that had ever been seen before going back at least three decades,” says Wang. “But in that month emissions had not changed, so we knew there had to be another factor.”
The East China Plains consist of interconnected basins surrounded by mountain ranges to the west and the ocean to the east, a mirror image of polluted Southern California. In winter, pollution generated by industry and vehicles can be removed effectively only by horizontal dispersion or vertical mixing, and when those processes fail to move out stagnant air, pollution builds up. It seemed likely that something was preventing the ventilation that would have kept the air cleaner.
The researchers next looked at climate features such as sea ice, snowfall, El Niños, and Pacific Oscillations. They conducted principal component and maximum covariance analyses and found correlations of stagnant air conditions over China to Arctic sea ice—which reached a record low in the fall of 2012—and snowfall in the upper latitudes of Siberia, which had reached a record high earlier in the winter. They then used atmospheric model simulations to study how those factors change large-scale atmospheric circulation patterns and pollution ventilation over eastern China.
“Despite the efforts to reduce emissions, we think that haze will probably continue for the future.”
“The reductions in sea ice and increase in snowfall have the effect of damping the climatological pressure ridge structure over China,” Wang says. “That flattens the temperature and pressure gradients and moves the East Asian Winter Monsoon to the east, decreasing wind speeds and creating an atmospheric circulation that makes the air in China more stagnant.”
The results of the model were consistent with observations that Korea and Japan had been unusually cold that winter, while eastern China had been unusually warm—both suggesting that the cold center had moved.
The winter of 2017 saw the same factors—low levels of Arctic sea ice in September 2016 and high snowfall—along with severe haze. Wang says those factors are likely to continue as global climate change disrupts the normal structure of the atmosphere.
“Despite the efforts to reduce emissions, we think that haze will probably continue for the future,” he says. “This is partly climate-driven now, so it probably won’t get much better in the winter. Emissions are no longer the only driver of these conditions.”
Wang hopes to continue the study using new data from China’s air quality monitoring network. The impact of global climate change, he says, may be unique to China because of its geography and sensitivity to changes in atmospheric circulation structure. Though the problem now manifests as air pollution, he says the results of the study should encourage the nation to continue addressing climate change.
“The very rapid change in polar warming is really having a large impact on China,” he says. “That gives China an incentive not only to follow through on air pollutant emission reductions, but also to look at the potential for reducing greenhouse gas emissions. Our research shows that cutting greenhouse gases would help with the winter haze problem.”
The National Science Foundation Atmospheric Chemistry Program and the US EPA Science To Achieve Results (STAR) Program funded the work. It has not been subjected to any EPA review and therefore does not necessarily reflect the views of the EPA, and no official endorsement should be inferred.
Source: Georgia Tech
A custom-built microscope is giving scientists the closest view yet of living nerve synapses.
The brain hosts an extraordinarily complex network of interconnected nerve cells that are constantly exchanging electrical and chemical signals at speeds difficult to comprehend.
Understanding the detailed workings of a synapse—the junction between neurons that governs how these cells communicate with each other—is vital for modeling brain networks and understanding how diseases such as depression, Alzheimer’s, or schizophrenia may affect brain function, according to the researchers. (An image accompanying the study shows multiple working nerve synapses. Credit: Dario Maschi)
Studying active rat neurons, even those growing in a dish, is a challenge because they are so small. Further, they move, making it difficult to keep them in focus at high magnifications under a light microscope.
“Synapses are little nanoscale machines that transmit information,” says senior author Vitaly A. Klyachko, associate professor of cell biology and physiology at Washington University in St. Louis School of Medicine. “They’re very difficult to study because their scale is below what conventional light microscopes can resolve. So what is happening in the active zone of a synapse looks like a blur.
“To remedy this, our custom-built microscope has a very sensitive camera and is extremely stable at body temperatures, but most of the novelty comes from the analysis of the images,” he says. “Our approach gives us the ability to resolve events in the synapse with high precision.”
Until now, close-up views of the active zone have been provided by electron microscopes. While offering resolutions of mere tens of nanometers—thousands of times thinner than a human hair—electron microscopes can’t view living cells. To withstand bombardment by electrons, samples must be fixed in an epoxy resin or flash-frozen, cut into extremely thin slices, and coated in a layer of metal atoms.
“Most of what we know about the active zone is from indirect studies, including beautiful electron microscopy images,” says Klyachko, also an associate professor of biomedical engineering at the School of Engineering and Applied Science. “But these are static pictures. We wanted to develop a way to see the synapse function.”
A synapse consists of a tiny gap between two nerve cells, with one cell serving as the transmitter and the other as the receiver. When sending signals, the transmitting side of the synapse releases little packages of neurotransmitters, which traverse the gap and bind to receptors on the receiving side, completing the information relay. On the transmitting side, the neurotransmitters are packaged into synaptic vesicles that release their contents at a region called the active zone.
“One of the most fundamental questions is: Are there many places at the active zone where a vesicle can release its neurotransmitters into the gap, or is there only one?” Klyachko says. “A lot of indirect measurements suggested there might be only one, or maybe two to three, at most.”
In other words, if the active zone could be compared to a shower head, the question would be whether it functions more as a single jet or as a rain shower.
The findings of the new study, published in the journal Neuron, show the active zone is more of a rain shower. But it’s not a random shower; there are about 10 locations dotted across the active zone that are reused too often to be left to chance. There is also a limit to how quickly these sites can be reused—about 100 milliseconds must pass before an individual site can be used again. And at higher rates of vesicle release, the site usage tends to move from the center to the periphery of the active zone.
“Neurons often fire at 50 to 100 times per second, so it makes sense to have multiple sites,” Klyachko says. “If one site has just been used, the active zone can still be transmitting signals through its other sites.
“We’re studying the most basic machinery of the brain,” he says. “Our data suggest these machines are extremely fine-tuned—even subtle modulations may lead to disease. But before we can study disease, we need to understand how healthy synapses work.”
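The arithmetic in Klyachko's point can be illustrated with a toy simulation. This is not the study's model; the site count, the roughly 100-millisecond per-site reuse limit, and the regular-spiking assumption are simply taken from the numbers quoted above:

```python
# Toy illustration: why ~10 release sites, each with a ~100 ms
# refractory period, can sustain transmission at 100 Hz, while a
# single site would miss most spikes.
import random

def fraction_transmitted(n_sites, rate_hz, refractory_ms=100.0,
                         duration_s=50.0, seed=1):
    random.seed(seed)
    ready_at = [0.0] * n_sites          # time each site becomes usable (ms)
    t, dt = 0.0, 1000.0 / rate_hz       # regularly spaced spikes, in ms
    sent = total = 0
    while t < duration_s * 1000.0:
        total += 1
        free = [i for i, r in enumerate(ready_at) if r <= t]
        if free:
            i = random.choice(free)     # any recovered site can release
            ready_at[i] = t + refractory_ms
            sent += 1
        t += dt
    return sent / total

# At 100 Hz, one site can serve only ~1 in 10 spikes; ten sites cycle
# fast enough that every spike finds a free site.
```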
The Esther A. & Joseph Klingenstein Fund, the Whitehall Foundation, and the McDonnell Center at Washington University supported the work.
The ocean isn’t the sole source of life-sustaining fog and dew for the Namib Desert’s numerous plants and animals, report researchers.
“Knowing exactly where the fog and dew come from will help us predict the availability of non-rainfall water in the future, both in the Namib and elsewhere,” says Lixin Wang, an ecohydrologist and assistant professor of Earth sciences at Indiana University-Purdue University Indianapolis, who led the new study. “With this knowledge, we may be able to determine ways to harvest novel water sources for potential use in water-scarcity situations.”
Surprisingly, non-ocean-derived fog accounted for more than half of total fog events in the Namib over the one-year period of the study. Groundwater-derived fog was the most significant locally generated fog, serving as the source of more than a quarter of the desert’s fog. Soil water, which derives from rainfall and sits below the surface but above the groundwater, also turns out to be an unexpected source of moisture.
Drylands, which in addition to deserts include parched but nondesert areas of the Great Plains and southwestern United States, cover approximately 40 percent of Earth’s land surface and are home to an estimated 2.5 billion people. With global warming, more areas in the United States and around the world are becoming drier and more desert-like.
“Dryland ecosystems have some of the lowest annual rainfall amounts recorded on Earth,” says Tom Torgersen, program officer in the division of Earth sciences of the National Science Foundation, which funded the work. “To survive, these ecosystems recycle water in the form of fog and dew. In the driest places on the planet, even seemingly minor components of the water cycle, such as fog and dew, become major and are critical to keeping the environment alive and functioning.”
Like other dryland ecosystems worldwide, the Namib is likely to experience changes in its hydrological cycle in response to global climate change. Given the abundance and importance of fog and dew there, the Namib provides an ideal location to study non-rainfall water.
The Namib, which borders the Atlantic Ocean for 1,243 miles with temperatures ranging from below 32°F (0°C) to 140°F (60°C), is almost completely devoid of surface water. Many parts of the Namib receive virtually no rain. Some years are rainless; in other years, there may be only an inch or two of rain, although some areas may receive as much as four inches. But the Namib does support a wide variety of specially adapted organisms, such as a fog-harvesting beetle. Most of the Namib’s plants and animals are believed to rely on moisture from fog or dew to survive rainless periods.
Fog consists of tiny droplets of water suspended in air, and dew consists of tiny droplets that form on the surface of plants, soil, and other objects on the ground.
Wang used analysis of stable isotopes in water—atoms of the same element, such as hydrogen or oxygen, with different numbers of neutrons in their nuclei—to trace the origins of non-rainfall water. In future research, he plans to explore the mechanisms by which groundwater and soil water become fog and dew. The long-term goal is to expand this ecohydrology research beyond the Namib to a global scale.
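Stable-isotope source tracing rests on delta notation, which expresses a sample's heavy-to-light isotope ratio relative to an international standard; for water, the reference is Vienna Standard Mean Ocean Water (VSMOW). A minimal sketch of the calculation (the sample ratio below is made up for illustration):

```python
# Delta notation: delta = (R_sample / R_standard - 1) * 1000, in per mil.
# Fog condensed directly from ocean water and fog formed from evaporated
# groundwater or soil water carry distinct isotopic signatures, which is
# what lets this kind of analysis separate the sources.
VSMOW_18O = 0.0020052  # accepted 18O/16O ratio of the VSMOW standard

def delta_permil(r_sample: float, r_standard: float = VSMOW_18O) -> float:
    """Return the delta value of a sample ratio, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A hypothetical isotopically depleted sample:
d18o = delta_permil(0.0019952)
```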
The study appears in Science Advances.
Random, unpredictable DNA copying mistakes account for nearly two-thirds of the genetic changes that cause cancer—far more mutations than those triggered by heredity or by environmental factors like smoking or pollution, a study finds.
The study used a new mathematical model based on DNA sequencing and epidemiologic data from around the world.
The findings do not in any way suggest that we give up on healthy lifestyles and other strategies for minimizing the likelihood of cancer, say the researchers.
“We need to continue to encourage people to avoid environmental agents and lifestyles that increase their risk of developing cancer mutations,” says Bert Vogelstein, co-director of the Ludwig Center at Johns Hopkins University’s Kimmel Cancer Center.
The findings do suggest, however, that medical research pay more attention to those “mistake” mutations that occur randomly as cells copy their genetic information—their DNA—when preparing to divide and create new cells.
“Many people will still develop cancers due to these random DNA copying errors,” Vogelstein says, “and better methods to detect all cancers earlier, while they are still curable, are urgently needed.”
Vogelstein and Cristian Tomasetti, assistant professor of biostatistics at the Johns Hopkins Bloomberg School of Public Health, report the findings in the journal Science.
They say their conclusions do not conflict with epidemiologic studies showing that avoiding unhealthy environments and lifestyles can prevent about 40 percent of cancers. But cancer often strikes people who follow all the rules—not smoking, maintaining a healthy diet and weight, avoiding carcinogens—and who have no family history of the disease. That prompts the pained question, “Why me?”
In a previous study, Tomasetti and Vogelstein reported that DNA copying errors could explain why certain cancers in the United States, such as those of the colon, occur more than others, such as brain cancer. In the new research, they addressed a different question: What fraction of mutations in cancer are due to these DNA copying errors?
The scientists took a close look at the mutations that drive abnormal cell growth among 32 cancer types. They developed their new model using DNA sequencing data from the Cancer Genome Atlas and epidemiologic data from the Cancer Research UK database.
It generally takes two or more critical gene mutations for cancer to start. Tomasetti and Vogelstein used their model to show, for example, that when critical mutations in pancreatic cancers are added together, 77 percent are due to random DNA copying errors, 18 percent to environmental factors, such as smoking, and 5 percent to heredity.
In other cancer types, such as those of the prostate, brain, or bone, more than 95 percent of the mutations are due to random copying errors.
Lung cancer, they note, is different: 65 percent of mutations are due to environmental factors, mostly smoking, and 35 percent are due to DNA copying errors. Inherited factors are not known to play a role.
Looking across all 32 cancer types studied, the researchers estimate that 66 percent of cancer mutations result from copying errors, 29 percent can be attributed to lifestyle or environment, and the remaining 5 percent are inherited.
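The headline 66/29/5 split is a weighted combination of per-cancer-type breakdowns like those above. A sketch of that bookkeeping (the pancreas and lung fractions are from the article; the prostate split and all the weights are hypothetical stand-ins for the study's actual incidence-based weighting):

```python
# Each cancer type's driver mutations are apportioned among three
# sources; the fractions for any one type must sum to 1, and an
# overall figure is a weighted average across types.
per_type = {
    #            (replicative, environmental, hereditary)
    "pancreas": (0.77, 0.18, 0.05),  # reported in the article
    "lung":     (0.35, 0.65, 0.00),  # reported in the article
    "prostate": (0.95, 0.05, 0.00),  # ">95% replicative"; split assumed
}
weights = {"pancreas": 0.2, "lung": 0.3, "prostate": 0.5}  # hypothetical

overall = [0.0, 0.0, 0.0]
for cancer, fracs in per_type.items():
    for i, f in enumerate(fracs):
        overall[i] += weights[cancer] * f
# overall now holds the combined replicative, environmental, and
# hereditary shares; the shares still sum to 1.
```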
The scientists say their approach is akin to sorting out why “typos” occur in a 20-volume book: being tired while typing, which corresponds to environmental exposures; a stuck or missing key on the keyboard, which represents inherited factors; and other typographical errors that randomly occur, which represent DNA copying errors.
“You can reduce your chance of typographical errors by making sure you’re not drowsy while typing and that your keyboard isn’t missing some keys,” Vogelstein says. “But typos will still occur, because no one can type perfectly. Similarly, mutations will occur, no matter what your environment is, but you can take steps to minimize those mutations by limiting your exposure to hazardous substances and unhealthy lifestyles.”
Tomasetti says random DNA errors will only become more important as populations age, prolonging the opportunity for cells to make more and more mistakes.
Funding for the study came from the John Templeton Foundation, the Lustgarten Foundation for Pancreatic Cancer Research, the Virginia and D.K. Ludwig Fund for Cancer Research, the Sol Goldman Center for Pancreatic Cancer Research, and the National Cancer Institute.
Source: Johns Hopkins University
Although sleepy people had trouble interpreting happiness and sadness in a recent study, they had no problem doing so with other emotions—anger, fear, surprise, and disgust.
That’s likely because we’re wired to recognize those more primitive emotions in order to survive acute dangers, says lead researcher William D.S. Killgore, a professor of psychiatry, psychology, and medical imaging at the University of Arizona.
While emotions such as fear and anger could indicate a threat, social emotions such as happiness and sadness are less necessary for us to recognize for immediate survival. When we’re tired, it seems we’re more likely to dedicate our resources to recognizing those emotions that could impact our short-term safety and well-being.
“If someone is going to hurt you, even when you’re sleep deprived you should still be able to pick up on that,” Killgore says. “Reading whether somebody is sad or not is really not that important in that acute danger situation, so if anything is going to start to degrade with lack of sleep it might be the ability to recognize those social emotions.”
Killgore used data from a larger research effort on sleep deprivation’s effects on social, emotional, and moral judgment that he began while working as a research psychologist for the US Army.
For the current study, published in Neurobiology of Sleep and Circadian Rhythms, 54 participants saw photographs of the same male face expressing varying degrees of fear, happiness, sadness, anger, surprise, and disgust and indicated which of those six emotions they thought each face expressed the most.
In order to assess participants’ ability to interpret more subtle emotional expressions, the images presented were composite photos of commonly confused facial expressions morphed together by a computer program. For example, a face might show 70 percent sadness and 30 percent disgust or vice versa. Participants saw a total of 180 blended facial expressions at each testing session.
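A composite such as "70 percent sadness and 30 percent disgust" can be thought of as a weighted average of two source images. The sketch below shows only the per-pixel intensity blend; the actual morphing software also warps facial geometry, and the pixel values here are toy stand-ins:

```python
def blend(img_a, img_b, weight_a=0.7):
    """Per-pixel linear blend of two equal-sized grayscale images,
    represented here as flat lists of intensity values."""
    assert len(img_a) == len(img_b)
    return [weight_a * a + (1.0 - weight_a) * b
            for a, b in zip(img_a, img_b)]

sad = [10, 200, 30]       # toy pixels standing in for a "sad" face
disgust = [110, 100, 30]  # toy pixels standing in for a "disgust" face
composite = blend(sad, disgust, weight_a=0.7)  # 70% sad, 30% disgust
```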
Participants’ baseline responses to the images were compared to their responses after they were deprived of sleep for one night.
Researchers found that blatant facial expressions—such as an obvious grin or frown (90 percent happy or 90 percent sad)—were easily identifiable regardless of how much sleep a participant got. Sleep-deprived participants had a harder time, however, correctly identifying more subtle expressions of happiness and sadness, although their performance on the other emotions was unchanged.
When participants were tested again after one night of recovery sleep, their performance on happiness and sadness improved, returning to its baseline level.
While the difference in performance was not overwhelming, it’s enough that it could have a significant impact in critical social interactions, Killgore says.
“As a society, we don’t get the full seven to eight hours of sleep that people probably need to be getting. The average American gets a little less than six hours of sleep, and it could affect how you’re reading people in everyday interactions.
“You may be responding inappropriately to somebody that you just don’t read correctly, especially those social emotions that make us human. Or you may not be as empathic. Your spouse or significant other may need something from you and you’re less able to read that. It’s possible that this could lead to problems in your relationships or problems at work. To me, that is one of the biggest problems—how this affects our relationships.”
The research builds on existing work on the effects of sleep deprivation on the brain’s ventromedial prefrontal cortex—an area that helps people make judgments and decisions using their emotions.
A prior study, published by Harvard University’s Seung-Schik Yoo and colleagues, showed that when people are sleep deprived, a disconnect occurs between the prefrontal cortex and the amygdala—one of the key emotionally responsive areas of the brain.
“So, in simplistic terms, the part of the brain that controls your emotions and the part that sees faces and responds to the emotional content basically start to lose their ability to communicate. We wanted to test that out and see if it plays out in terms of how people read facial expressions—and, in fact, it looks like it does.”
Source: University of Arizona
Food scarcity and poor oral health are the major factors leading older adults to arrive at the emergency department suffering from malnutrition—a population already at high risk of functional decline, decreased quality of life, and increased mortality.
“For patients who don’t have enough food at home, the solution is pretty obvious and likely much less expensive than paying for the medical care that results from malnutrition,” says Tim Platts-Mills, assistant professor of emergency medicine at the University of North Carolina at Chapel Hill.
“There is an existing national system of food assistance programs, such as Meals on Wheels, and we believe we can use the emergency department to link patients in need to those programs.”
“Even though such programs are relatively inexpensive—about $6 per individual per day—many programs are underutilized and underfunded. We need to link patients to these programs and fund these programs,” says Platts-Mills, who is also co-director of the Division of Geriatric Emergency Medicine at the UNC School of Medicine.
A new study, published in the Journal of the American Geriatrics Society, included 252 patients age 65 and older seeking treatment in emergency departments in North Carolina, Michigan, and New Jersey. Participants were screened for malnutrition and then asked about the presence of risk factors.
The overall prevalence of malnutrition in the study sample was 12 percent, which is consistent with previous estimates from US emergency departments and about double the prevalence in community-dwelling adults (those who are not hospitalized and do not live in an assisted-living facility).
Of the three sites, patients receiving care in the North Carolina emergency department had the highest rate of malnutrition, 15 percent. North Carolina also has one of the highest rates of older adults living below the poverty line (ranked third out of 50 states).
Of the risk factors studied, poor oral health had the largest impact on malnutrition. More than half of the patients in the study had some dental problems, and patients with dental problems were three times as likely to suffer from malnutrition as those without.
Ten percent of patients experienced food insecurity—as evidenced by their responses to questions about not having enough food, eating fewer meals, and going to bed hungry. Further, food insecurity was strongly associated with malnutrition. Other factors that may contribute to malnutrition problems include social isolation, depression, medication side effects, and limited mobility.
“Improving oral health in older adults will be more challenging but is also important, since Medicare does not cover dental care,” says Collin Burks, a UNC medical student and the study’s lead author.
“Fixing dental problems not only makes it easier for these individuals to eat but also can improve their self-esteem, quality of life, and overall health. We need affordable methods of providing dental care for older adults.”
A research training grant from the National Institutes of Health supported the work.
Source: UNC-Chapel Hill
The post Dental trouble tied to malnutrition among some seniors appeared first on Futurity.
In the decades following the work of physiologist Ivan Pavlov and his famous salivating dogs, scientists have discovered how molecules and cells in the brain learn to associate two stimuli, such as Pavlov’s bell and the resulting food.
What they haven’t been able to study is how whole groups of neurons work together to form that link. Now, researchers have observed how large groups of neurons in the brain both learn and unlearn a new association.
“It’s been over 100 years since Pavlov did his amazing work but we still haven’t had a glimpse of how neural ensembles encode a long-term memory,” says Mark Schnitzer, associate professor of biology and applied physics at Stanford University. “This was an opportunity to examine that.”
For a new study published in Nature, researchers worked with mice and focused on the amygdala, a part of the brain known to be involved in learning that is extremely similar across species. They taught mice to associate a tone with a mild shock and found that, once the mice learned the association, the pattern of neurons that activated in response to tone alone resembled the pattern that activated in response to the shock.
Using Pavlov’s dogs as an analogy, this would mean that, as the dogs learned to associate the bell with the food, the neural network activation in their amygdalas would look similar whether they were presented with food or just heard the bell.
The findings also reveal that the neurons never returned to their original state, even after the training was undone. Although this was not the main focus of the study, the results could have wide-ranging implications for studying emotional memory disorders, such as post-traumatic stress disorder (PTSD).

Tone and shock
The researchers trained mice in the study to associate a tone with a light foot shock. At the beginning of the experiment, mice had no reaction to the tone, but would freeze in place in response to the light shock. After pairing the tone and the light shock a few times, the tone alone was enough to cause the mice to freeze in place.
“You can think of this type of learning as a survival strategy,” says Benjamin Grewe, lead author of the paper and former postdoctoral scholar in the Schnitzer lab. “We need that as humans, animals need that. When we associate certain stimuli with their possible dangerous outcomes, it helps us to avoid dangerous situations in the first place.”
During the training, researchers directly observed the activity of about 200 neurons in the amygdala. Using a miniature microscope—developed previously by the Schnitzer lab—to view neurons deep in the brain, they could observe activity of individual cells as well as of the entire ensemble. They found that, as the mice learned to associate the tone with the shock, the set of cells that responded to the tone began to resemble those that responded to the shock itself.
“The two stimuli are both eliciting fear responses,” Schnitzer says. “It’s almost as if this part of the brain is blurring the lines between the two, in the sense that it’s using the same cells to encode them.”
The amount of change in how the group of neurons responded to the tone also predicted how much the mouse behavior would change. Mice whose amygdalas activated similarly in response to the tone and to the shock froze most consistently in response to the tone, by itself, 24 hours later.
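The ensemble comparison described above can be illustrated with a toy calculation. This is not the study’s actual analysis, and the metric is an assumption: a minimal sketch in which ensemble similarity is taken as the cosine similarity between per-neuron mean response vectors, using made-up data for roughly 200 neurons.

```python
import numpy as np

def population_similarity(resp_a, resp_b):
    """Cosine similarity between two population response vectors.

    resp_a, resp_b: 1-D arrays with one (mean-centered) response per neuron.
    Returns 1.0 for identical patterns, ~0 for unrelated ones.
    """
    a = np.asarray(resp_a, dtype=float)
    b = np.asarray(resp_b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
n_neurons = 200  # roughly the number of amygdala neurons recorded

shock = rng.standard_normal(n_neurons)        # responses to the shock
tone_before = rng.standard_normal(n_neurons)  # tone responses before pairing
# after learning, tone responses drift toward the shock pattern
tone_after = 0.7 * shock + 0.3 * rng.standard_normal(n_neurons)

print(population_similarity(tone_before, shock))  # near zero
print(population_similarity(tone_after, shock))   # much higher
```

In this picture, the study’s behavioral result corresponds to the post-learning similarity value: the closer the tone ensemble comes to the shock ensemble, the more reliably the animal freezes to the tone alone.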
“We managed, for the first time, to record the activity of a large network of neurons in the amygdala and did that with single cell resolution,” Grewe says. “So we knew what every single cell was doing.”

Undoing the association
As part of the experiments, the team also undid the conditioning so that the mice stopped freezing in reaction to the tone. During this phase the neural response never completely returned to its original state.
The experiment to reverse the association was not designed to represent any human diseases or disorders, but this finding could potentially inform research into problems with emotional memory, such as generalized anxiety disorder or PTSD, where people may have difficulty dissociating neutral stimuli from negative ones. That kind of application, however, would likely be some years in the future.
“We’re just beginning this work,” Schnitzer says, “but these findings give us a window into how the external world may be annotated for us in this brain structure.”
Additional researchers from Stanford, the University of Basel, and the Friedrich Miescher Institute for Biomedical Research are coauthors of the work. The Swiss National Science Foundation, the US National Science Foundation, Stanford University, the Simons Foundation, the Helen Hay Whitney Foundation, the Novartis Research Foundation, Howard Hughes Medical Institute, and DARPA funded the research.
Source: Stanford University
Compounds called xanthones from mangosteens could provide a suitable new drug in the fight against tuberculosis, new research suggests.
Researchers report that xanthones were very effective at inhibiting and killing Mycobacterium tuberculosis (Mtb), the bacterium responsible for causing TB. The study also demonstrated that xanthones had a low propensity for developing drug resistance, making them promising candidates for developing anti-TB drugs.
Earlier studies in Singapore using mangosteen fruit extracts found that xanthones were effective against bacterial infections such as Staphylococcus. This prompted the researchers to investigate the potential for this class of compounds in tackling multi-drug resistant TB bacteria.
“We discovered that xanthones are effective in killing off persistent strains of bacteria, a property that could result in treatment shortening therapies,” says Nicholas Paton, professor in the Division of Infectious Diseases at NUS Medicine, a member of the National University Health System (NUHS). Paton is head of the Singapore Programme of Research Investigating New Approaches to Treatment of Tuberculosis (SPRINT-TB).
“The discovery of this new potential TB drug candidate is significant; aside from two new drugs approved in recent years for multidrug-resistant TB, the disease had not seen new drug developments in over 40 years. Using a proven antibacterial compound like xanthones means we do not have to re-invent the wheel by searching for and testing totally new compounds.”
TB affects an estimated 8.6 million people globally and causes some 1.3 million deaths annually. Developed nations in Europe and North America have TB rates of between 5 and 10 cases per 100,000 people each year, compared to Singapore, which sees over 40 cases per 100,000 people. Asia accounts for 59 percent of the world’s TB cases.
A rapidly aging population and the prevalence of diseases such as diabetes also contribute to the high incidence of TB cases, as compromised immune systems make older people more susceptible to infections. Many elderly TB patients could also have been latently infected when the disease was far more prevalent in the past, with the infection reactivating as they age.
The fight against TB is an uphill struggle. Common strains of TB have developed multi-drug resistance, rendering existing drugs ineffective against the disease. There is also a worryingly low number of potential new chemical entities in the TB drug pipeline at present.
“The average TB patient currently expects to undergo six to 24 months of tedious treatment,” says associate professor Thomas Dick, the study’s principal investigator, head of the Antibacterial Drug Discovery Laboratory, and director of the Biosafety Level 3 (BSL-3) Core Facility at NUS Medicine.
“Xanthones offer a realistic avenue towards developing new and more effective drugs for TB with potentially shortened treatment times as well. All these factors can help in reducing the disease burden faced by Singapore’s aging population, as well as treatment costs incurred by patients and their families.”
While laboratory and preclinical testing on xanthones will take at least several more years, the discovery of their efficacy against the disease is a step in the right direction for TB research. SPRINT-TB is also working with the BSL-3 Core Facility to speed up the investigation into xanthones and other potential TB treatments.
The researchers report their work in the European Journal of Medicinal Chemistry.
Source: National University of Singapore
A large hole at the base of the skull offers clues to the evolution of bipedalism—walking on two feet—in humans.
Compared with those of other primates, the human foramen magnum—the opening through which the spinal cord passes—is shifted forward.
While many scientists generally attribute this shift to the evolution of bipedalism and the need to balance the head directly atop the spine, others have been critical of the proposed link.
A new study validates the connection and provides another tool for researchers to determine whether a fossil hominid walked upright on two feet like humans or on four limbs like modern great apes.
Controversy has centered on the association between a forward-shifted foramen magnum and bipedalism since 1925, when Raymond Dart discussed it in his description of “Taung child,” a 2.8-million-year-old fossil skull of the extinct South African species Australopithecus africanus. Last year, Aidan Ruth and colleagues continued to stir up the controversy when they published a paper offering additional criticisms of the idea.
Gabrielle Russo, an assistant professor at Stony Brook University, and Chris Kirk, an anthropologist at the University of Texas at Austin, now build on their own prior research to show that a forward-shifted foramen magnum exists not just in humans and their bipedal fossil relatives, but also among bipedal mammals more generally.

Comparison of the positioning of the foramen magnum in a bipedal springhare (left) and its closest quadrupedal relative, the scaly-tailed squirrel. (Credit: Russo and Kirk, Journal of Human Evolution)
“This question of how bipedalism influences skull anatomy keeps coming up partly because it’s difficult to test the various hypotheses if you only focus on primates,” says Kirk. “However, when you look at the full range of diversity across mammals, the evidence is compelling that bipedalism and a forward-shifted foramen magnum go hand-in-hand.”
In the Journal of Human Evolution, Kirk and Russo expand on research published in the same journal in 2013 by using new methods to quantify aspects of foramen magnum anatomy and sampling the largest number of mammal species to date.
They compared the position and orientation of the foramen magnum in 77 mammal species including marsupials, rodents, and primates. Their findings indicate that bipedal mammals such as humans, kangaroos, springhares, and jerboas have a more forward-positioned foramen magnum than their quadrupedal close relatives.
“We’ve now shown that the foramen magnum is forward-shifted across multiple bipedal mammalian clades using multiple metrics from the skull, which I think is convincing evidence that we’re capturing a real phenomenon,” Russo says.
Further, the study identifies specific measurements that can be applied to future research to map out the evolution of bipedalism. “Other researchers should feel confident in making use of our data to interpret the human fossil record,” Russo adds.
Source: University of Texas at Austin
A drip, or even a stream, of wine from the bottle can ruin your tablecloth—unless you wrap a napkin around the bottle as a sommelier would.
Daniel Perlman—wine-lover, inventor, and biophysicist at Brandeis University—has figured out a solution to this age-old problem. Over the course of three years, he has been studying the flow of liquid across the wine bottle’s lip. By cutting a groove just below the lip, he’s created a drip-free wine bottle.
There are already products on the market designed to prevent wine spillage, but they require inserting a device into the bottle’s neck. Perlman didn’t want people to have to take an additional step after they made their purchase. “I wanted to change the wine bottle itself,” he says. “I didn’t want there to be the additional cost or inconvenience of buying an accessory.”
Figure out the physics, he thought, and you might be able to build a drip-free wine bottle. Perlman studied slow-motion videos of wine pouring. He observed first that dripping was most extreme when a bottle was full or close to it. He also saw that a stream of wine tends to curl backward over the lip and run down the side of the glass bottle because glass is hydrophilic, meaning it attracts water.
Using a diamond-studded tool, Perlman, with the assistance of engineer Greg Widberg, created a circular groove around the neck of the bottle just beneath the top. A droplet of wine that would otherwise run down the side of the bottle encounters the groove, but can’t traverse it. Instead, it immediately falls off the bottle into the glass along with the rest of the wine.
Remember that when you pour a full or nearly-full bottle of wine, you hold it at a slightly upward angle in relation to the glass. For a drop of wine to make it across Perlman’s groove, it would have to travel up inside the groove against the force of gravity or have enough momentum to jump from one side of the groove to the other.
After many tests, Perlman found the perfect width, roughly 2 millimeters, and depth, roughly 1 millimeter, for the groove so that the wine stream can’t get past it. Current wine bottle designs date to the early 1800s and haven’t changed much since. About 200 years of drips, drabs, stains, and spots may be coming to an end.
Perlman has 100 patents to his name for everything from specialized lab equipment to the first miniaturized home radon detector. Along with K.C. Hayes, professor emeritus of biology, he developed the “healthy fats” in Smart Balance margarine. Most recently, he devised coffee flour, a food ingredient and nutritional supplement derived from par-baked coffee beans.
Perlman is currently speaking with bottle manufacturers about adopting his design.
Source: Brandeis University
People who lose a partner to suicide are at increased risk for physical and mental problems including cancer, mood disorders like depression, and even herniated discs.
The findings underscore the need for support systems for bereaved partners and others who have lost loved ones to suicide, since interventions addressing complicated grief could help mitigate some of the effects, researchers say.
“Health care providers, friends, and neighbors often do not know how best to support those bereaved by suicide.”
More than 800,000 people around the world die by suicide each year—and the suicide rate in many countries, including the United States, is on the rise.
The study, published in the journal JAMA Psychiatry, followed 4,814 Danish men and 10,793 Danish women bereaved by partner suicide for up to 35 years, from 1980 to 2014, and compared them to the general population of Denmark.
“It is an exceedingly devastating experience when someone you love dearly dies suddenly by suicide,” says study leader Annette Erlangsen, adjunct professor of mental health at the Johns Hopkins Bloomberg School of Public Health.
“We were able to show that being exposed to such a stressful life event as the suicide of your partner holds higher risks for physical and mental disorders and is different from losing a partner from other causes of death, such as illness or sudden accident.”
Using Denmark’s Cause of Death Registry, researchers identified everyone in the country 18 or older who had died by suicide since 1970. Using other national records, the team then identified surviving partners, including spouses, registered partners, or those with whom the deceased had cohabited. The research team studied those individuals over the years after their losses.
Researchers compared this data to those from two other groups: Denmark’s general population age 18 or older between 1980 and 2014 and people in the general population who had suffered the death of partners through causes other than suicide.
People who lost partners to suicide were at more risk of cancer, cirrhosis of the liver, and spinal disc herniation than the general population. After long-term follow-up, there was an increased risk of sleep disorders and, for women only, chronic respiratory disease. The study found that suicide-bereaved partners also had an increased risk for mood disorders, PTSD, anxiety disorders, alcohol use disorder, and self-harm.
“The suicide rate in the United States is increasing, which makes this research even more relevant,” says Holly C. Wilcox, associate professor of mental health and psychiatry. “Health care providers, friends, and neighbors often do not know how best to support those bereaved by suicide.”
While the researchers were not surprised by the thrust of the findings, there were some unexpected results, such as the finding of an increased risk for a herniated disc. Also, they found that partners who lost a loved one to suicide and then remarried had a lower chance of divorcing than the general population. At approximately 44 percent, the divorce rate in Denmark is comparable to that of other developed countries, including the United States.
“Maybe people who have experienced such a traumatic loss might be more selective when they choose a new partner and, as such, are less likely to experience a divorce,” Erlangsen says.
The research highlights the need for both personal and professional interventions for people whose lives have been affected by the suicide of their spouse or partner.
“This is a population in need of support and outreach,” Wilcox says. “Surviving a family member’s suicide is often a very isolating experience. Often friends and family of the bereaved are afraid of saying the wrong thing so they don’t say anything at all. The stigma associated with suicide can lead survivors to suffer in silence alone.”
The researchers say they chose Denmark because it has such rich data sets on its citizens. The findings, the researchers believe, are applicable to other countries.
The American Foundation for Suicide Prevention and the Danish Health Insurance Foundation supported the work.
Source: Johns Hopkins University
Early Mars may have had rings like Saturn’s, and may have them again, according to a new model. The research suggests that debris blasted into space when an asteroid or other body slammed into the red planet around 4.3 billion years ago has since alternated between forming a planetary ring and clumping together into a moon.
One theory holds that Mars’ large North Polar Basin, or Borealis Basin—which covers about 40 percent of the planet’s northern hemisphere—is the result of that impact, which sent debris into space.
“That large impact would have blasted enough material off the surface of Mars to form a ring,” says Andrew Hesselbrock, a doctoral student in physics and astronomy at Purdue University.
The new model suggests that as the ring formed and the debris slowly moved away from the planet and spread out, it began to clump and eventually formed a moon. Over time, Mars’ gravitational pull would have pulled that moon toward the planet until it reached the Roche limit, the distance within which the planet’s tidal forces will break apart a celestial body that is held together only by gravity.
Phobos, one of Mars’ moons, is getting closer to the planet. According to the model, Phobos will break apart upon reaching the Roche limit and become a set of rings in roughly 70 million years.

The Martian moon Phobos, seen here in a photo taken by NASA’s Mars Reconnaissance Orbiter from 4,200 miles away. (Credit: NASA)
Depending on where the Roche limit is, this cycle may have repeated between three and seven times over billions of years. Each time a moon broke apart and reformed from the resulting ring, its successor moon would be five times smaller than the last, and debris would have rained down on the planet, possibly explaining enigmatic sedimentary deposits found near Mars’ equator.
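The scale of this cycle can be sketched numerically. The figures and formula below are not from the study: they use approximate published values for Mars and Phobos and the classical rigid-body Roche-limit estimate, d ≈ R·(2ρ_planet/ρ_moon)^(1/3); the study’s own model, which sets the 70-million-year timescale, is more detailed.

```python
# Approximate published values (not taken from the study itself).
MARS_RADIUS_KM = 3389.5
MARS_DENSITY = 3.93       # g/cm^3
PHOBOS_DENSITY = 1.88     # g/cm^3
PHOBOS_ORBIT_KM = 9376.0  # current semi-major axis

def rigid_roche_limit_km(planet_radius_km, planet_density, moon_density):
    """Classical rigid-body Roche limit: d = R * (2 * rho_p / rho_m)**(1/3)."""
    return planet_radius_km * (2.0 * planet_density / moon_density) ** (1.0 / 3.0)

limit = rigid_roche_limit_km(MARS_RADIUS_KM, MARS_DENSITY, PHOBOS_DENSITY)
print(f"Roche limit ~{limit:.0f} km; Phobos now orbits at {PHOBOS_ORBIT_KM:.0f} km")

# If each successor moon is ~5x smaller, seven ring-moon cycles
# leave only a tiny remnant of the original moon.
mass_fraction = 1.0
for cycle in range(1, 8):
    mass_fraction /= 5.0
    print(f"after cycle {cycle}: {mass_fraction:.2e} of the original moon")
```

With these numbers the rigid-body limit comes out near 5,500 km, comfortably inside Phobos’s current orbit; the true breakup distance also depends on the moon’s internal strength, which is one reason a full dynamical model is needed.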
“You could have had kilometer-thick piles of moon sediment raining down on Mars in the early parts of the planet’s history, and there are enigmatic sedimentary deposits on Mars with no explanation as to how they got there,” says David Minton, assistant professor of earth, atmospheric and planetary sciences. “And now it’s possible to study that material.”
Other theories suggest that the impact with Mars that created the North Polar Basin led to the formation of Phobos 4.3 billion years ago, but Minton says it’s unlikely the moon could have lasted all that time.
Also, Phobos would have had to form far from Mars and would have had to cross through the resonance of Deimos, the outer of Mars’ two moons. Resonance occurs when two moons exert gravitational influence on each other on a repeated, periodic basis, as the major moons of Jupiter do. By passing through its resonance, Phobos would have altered Deimos’ orbit. But Deimos’ orbit is within one degree of Mars’ equator, suggesting Phobos has had no effect on it.
“Not much has happened to Deimos’ orbit since it formed,” Minton says. “Phobos passing through these resonances would have changed that.”
Richard Zurek of NASA’s Jet Propulsion Laboratory in Pasadena, California, is the project scientist for NASA’s Mars Reconnaissance Orbiter, whose gravity mapping provided support for the hypothesis that the northern lowlands come from a massive impact.
“This research highlights even more ways that major impacts can affect a planetary body,” he says.
NASA funded the study that appears in the journal Nature Geoscience.
Source: Purdue University
Forty years ago, the Hyde Amendment began as a single-sentence prohibition on Medicaid funding for abortion. Since then, it has provided the blueprint for expanded prohibitions.
In an article in the Journal of the American Medical Association tracing the amendment’s history and impact, researchers say that the Hyde blueprint now has a renewed chance of becoming codified into law.
Passed by the US House of Representatives on January 24, the “No Taxpayer Funding for Abortion and Abortion Insurance Full Disclosure Act of 2017,” a bill also known as H.R. 7, would prohibit all federal funds from underwriting abortion or subsidizing insurance that provides coverage for abortion. That would essentially turn the Hyde blueprint of restricting public funds for abortion into permanent law.
The bill’s fate in the US Senate, where it likely lacks a filibuster-proof majority, is unclear, but for the first time since its introduction in 2011, it has a receptive president in the White House, says essay coauthor Eli Adashi, professor of medicine and former dean of medicine and biological sciences at Brown University.
Under former President Barack Obama, Republicans knew “there was no future for the bill,” Adashi says. “Now there is at least a theoretical possibility that something will happen.”

History of expansion
The Hyde Amendment got its start on September 30, 1976, as a one-sentence rider on an appropriation bill for fiscal 1977, write Adashi and Rachel Ochiogrosso, a student at Brown’s Warren Alpert Medical School. It was specific to Medicaid and allowed an exception in the case of danger to the life of the mother.
After the rider survived a four-year legal battle resolved in its favor by the Supreme Court, the blueprint of prohibiting federal monies from funding abortion continued to expand to new areas of the federal budget. It is now the policy of the Peace Corps, the federal employment health benefits plan, the federal prisons, Medicare, the Department of Veterans Affairs, the Department of Defense, the Children’s Health Insurance Program, and the city of Washington, DC.
“It is remarkable,” Adashi says. “You have initially a rider that is fairly circumscribed, and then if you were to use a medical analogy, it metastasized.”

Riders matter
Though most Democrats oppose the Hyde Amendment and all but a handful voted against H.R. 7, it was the passage of the Affordable Care Act—aka “Obamacare”—in 2010 that gave the Hyde blueprint relevance to private insurance, the essay notes. A compromise to ensure the law’s passage extended the funding prohibition to federally subsidized health insurance. Today, 32 states prohibit their funds from going to abortion care.
Throughout its history, Adashi and Ochiogrosso write, the original Hyde Amendment has endured under Congressional majorities and presidents of both parties. Yet it has also never been formally made permanent law. Instead it lives on as an annually renewed appropriations rider.
“There’s a lesson here not to dismiss riders in general or to assume they will stay where they are,” Adashi says.
So long as the blueprint persists, in whatever form it does, the net effect of the Hyde Amendment is that abortion remains difficult to attain for low-income women whose insurance or benefits prohibit it, Adashi and Ochiogrosso write.
Source: Brown University
The cerebellum—which means “little brain”—is thought to just sit there helping us balance and breathe, like some kind of tiny heating and ventilation system.
Now, however, scientists have discovered that neurons within the cerebellum respond to and learn to anticipate rewards, a first step toward a much more exciting future for the cerebrum’s largely overlooked little brother and one that could open up new avenues of research for neuroscientists interested in the roots of cognition. The findings appear in the journal Nature.
Scientists had assumed the cerebellum helped control muscles mostly because of what happened when it got injured.
“If you have disruption of the cerebellum, the first thing you see is a motor coordination defect,” says senior author Liqun Luo, professor of biology at Stanford University, an investigator at the Howard Hughes Medical Institute, and a member of Stanford Bio-X and the Stanford Neurosciences Institute.

Granule cells
Admittedly, there had been some hints of a larger role for the cerebellum, but scientists had a hard time following up on those hints in part because the neurons that make up most of the cerebellum are difficult to study.
Those neurons, known as granule cells, account for 80 percent of the neurons in the brain—all packed into the cerebellum—but only about 10 percent of its volume. At that density, conventional techniques for recording cell activity don’t work well, and without an effective way of studying granule cells in real time, scientists were left with an incomplete picture of what the cerebellum was really doing.
“It was actually a side observation, that, wow, they actually respond to reward.”
Enter Mark Wagner, a postdoctoral fellow in Luo’s lab who led the research with Tony Kim, a graduate student in the lab of Mark Schnitzer, associate professor of biology and of applied physics and an investigator at the Howard Hughes Medical Institute.
Wagner hadn’t set out to redeem the cerebellum. He simply wanted to study how it controls muscles in mice using a new technique that would allow him to record granule cells in real time.
Wagner had earned his PhD working with Schnitzer, who develops methods for imaging neuronal activity in fruit flies, mice, and other living animals. One method, called two-photon calcium imaging, had the resolution Wagner needed to study mouse granule cells in action.
In order to study motor control, the team had to get the mice to move. In this case, mice received sugar water about a second after pushing a little lever. While the mice pushed levers and received their rewards, Wagner recorded activity in each mouse’s granule cells, expecting to find that activity in those cells would be related to planning and executing arm movements.
And to some extent he was right—some granule cells did fire when the animals moved. But other granule cells fired when the mice were waiting for their sugary rewards. And when Wagner sneakily took away their rewards, still other granule cells fired.
“It was actually a side observation, that, wow, they actually respond to reward,” Luo says.

Beyond motor tasks
That discovery is something of a revelation. For 50 years, the assumption was that granule cells—and by extension the cerebellum—performed only the most basic functions. But because no one had the tools to look closely at granule cells in action, “we just didn’t know,” Wagner says.
Now that scientists have a better idea of what’s happening, they hope it could lead to something much bigger. “Given what a large fraction of neurons reside in the cerebellum, there’s been relatively little progress made in integrating the cerebellum into the bigger picture of how the brain is solving tasks, and a large part of that disconnect has been this assumption that the cerebellum can only be involved in motor tasks,” Wagner says.
“I hope that this allows us to unify it with studies of more popular brain regions like the cerebral cortex, and we can put them together,” to figure out what’s really going on inside our heads.
Source: Stanford University
A sperm’s tail creates a characteristic rhythm that pushes the sperm forward, but also pulls the head backwards and sideways in a coordinated way, report researchers.
They developed a mathematical formula based on these movements, which could make it easier to understand and predict how sperm make the difficult journey towards fertilizing an egg.
The team aims to use these new findings to understand how larger groups of sperm behave and interact, a task that would be impossible using modern observational techniques. The work could provide new insights into treating male infertility.

The whip-like tail of the sperm has a rhythm that pulls the head backwards and sideways to create a jerky fluid flow. (Credit: Kyoto University)
“In order to observe, at the microscale, how a sperm achieves forward propulsion through fluid, sophisticated microscopic high precision techniques are currently employed,” says Hermes Gadêlha of the University of York’s mathematics department.
“Measurements of the beat of the sperm’s tail are fed into a computer model, which then helps to understand the fluid flow patterns that result from this movement.
“Numerical simulations are used to identify the flow around the sperm, but as the structures of the fluid are so complex, the data is particularly challenging to understand and use. Around 55 million spermatozoa are found in a given sample, so it is understandably very difficult to model how they move simultaneously.
“We wanted to create a mathematical formula that would simplify how we address this problem and make it easier to predict how large numbers of sperm swim. This would help us understand why some sperm succeed and others fail.”
By analyzing the head and tail movements of the sperm, researchers have now shown that the sperm moves the fluid in a coordinated rhythmic way, which can be captured to form a relatively simple mathematical formula. This means complex and expensive computer simulations are no longer necessary to understand how the fluid moves as the sperm swim.
The research demonstrates that the sperm has to make multiple contradictory movements, such as moving backwards, in order to propel itself forward towards the egg.
The whip-like tail of the sperm has a particular rhythm that pulls the head backwards and sideways to create a jerky fluid flow, countering some of the intense friction that their small size creates.
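The study's own formula is not reproduced here, but the underlying physics can be sketched with classic resistive-force theory (Gray and Hancock): drag on each tail segment is larger perpendicular to the segment than along it, and that anisotropy is what turns a traveling wave into net thrust. The sketch below is illustrative only; the drag coefficients, wave parameters, and units are arbitrary demo values, not numbers from the paper, and head drag is neglected.

```python
import numpy as np

# Illustrative resistive-force-theory sketch (Gray & Hancock), not the
# formula from the York/Kyoto study. All parameter values are arbitrary.
zeta_t = 1.0              # tangential drag coefficient per unit length
zeta_n = 1.8              # normal drag coefficient; zeta_n > zeta_t drives thrust
A, k, omega = 0.1, 2 * np.pi, 2 * np.pi   # wave amplitude, wavenumber, frequency
L = 1.0                   # tail length (one full wavelength here)
x = np.linspace(0.0, L, 400)
dx = x[1] - x[0]

def swim_speed(t):
    """Instantaneous swimming speed U from the force-free condition F_x = 0,
    for a tail shape y(x, t) = A sin(kx - omega*t) swept through the fluid."""
    phase = k * x - omega * t
    y_t = -A * omega * np.cos(phase)   # vertical velocity of each tail segment
    y_x = A * k * np.cos(phase)        # local slope of the tail
    norm2 = 1.0 + y_x ** 2
    tx2 = 1.0 / norm2                  # (tangent x-component)^2
    txty = y_x / norm2                 # tangent_x * tangent_y
    # The x-drag on a segment moving at (-U, y_t) is linear in U; summing
    # over the tail and setting total F_x = 0 gives U = num / den:
    num = -(zeta_n - zeta_t) * np.sum(y_t * txty) * dx
    den = np.sum(zeta_t * tx2 + zeta_n * (1.0 - tx2)) * dx
    return num / den

period = 2 * np.pi / omega
U = float(np.mean([swim_speed(t) for t in np.linspace(0.0, period, 50)]))
print(f"mean swimming speed U = {U:.4f}, wave speed c = {omega / k:.2f}")
```

A positive U here means the cell swims opposite to the wave's direction of travel, the classic resistive-force-theory result, and U comes out well below the wave speed, consistent with the heavy drag the article describes.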
“It is true when scientists say how miraculous it is that a sperm ever reaches an egg, but the human body has a very sophisticated system of making sure the right cells come together,” says Gadêlha.
“You would assume that the jerky movements of the sperm would have a very random impact on the fluid flow around it, making it even more difficult for competing sperm cells to navigate through it, but in fact you see well defined patterns forming in the fluid around the sperm.
“This suggests that the sperm stirs the fluid around in a very coordinated way to achieve locomotion, not too dissimilar to the way in which magnetic fields are formed around magnets. So although the fluid drag makes it very difficult for the sperm to make forward motion, it does coordinate with its rhythmic movements to ensure that only a few selected ones achieve forward propulsion.”
Now that the team has a mathematical formula that can predict the fluid movement of one sperm, the next step is to use the model for predictions on larger numbers of cells.
The research appears in the journal Physical Review Letters. Coauthors are from the Universities of York, Birmingham, and Oxford, and Kyoto University.
Source: University of York
Researchers have an idea to simplify electronic waste recycling: Crush it into nanodust.
Specifically, they want to make the particles so small that separating different components is relatively simple compared with processes used to recycle electronic junk now.
Chandra Sekhar Tiwary, a postdoctoral researcher at Rice University and a researcher at the Indian Institute of Science in Bangalore, uses a low-temperature cryo-mill to pulverize electronic waste—primarily the chips, other electronic components, and polymers that make up printed circuit boards (PCBs)—into particles so small that they do not contaminate each other. Then they can be sorted and reused, he says.
Tiwary and his coauthors intend their idea to replace current processes that involve dumping outdated electronics into landfills, or burning, or treating them with chemicals to recover valuable metals and alloys. None is particularly friendly to the environment, Tiwary says.
“In every case, the cycle is one way, and burning or using chemicals takes a lot of energy while still leaving waste,” he says. “We propose a system that breaks all of the components—metals, oxides, and polymers—into homogenous powders and makes them easy to reuse.”

A billion tons by 2030
The researchers estimate that so-called e-waste will grow by 33 percent over the next four years and, by 2030, will weigh more than a billion tons. Some 80 to 85 percent of this often-toxic waste ends up in an incinerator or a landfill, Tiwary says, and e-waste is the fastest-growing waste stream in the United States, according to the Environmental Protection Agency.
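As a back-of-the-envelope check on those projections, 33 percent growth over four years corresponds to a compound annual rate of a little over 7 percent:

```python
# Implied compound annual growth rate from "33 percent over four years".
total_growth = 1.33          # 33% total growth expressed as a multiplier
years = 4
annual_rate = total_growth ** (1 / years) - 1
print(f"implied annual growth: {annual_rate:.1%}")   # implied annual growth: 7.4%
```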
The answer may be a scaled-up version of a cryo-mill designed by the Indian team, which keeps materials at ultra-low temperatures during crushing rather than heating them.
Cold materials are more brittle and easier to pulverize, Tiwary says. “We take advantage of the physics. When you heat things, they are more likely to combine: You can put metals into polymer, oxides into polymers. That’s what high-temperature processing is for, and it makes mixing really easy.
“But in low temperatures, they don’t like to mix. The materials’ basic properties—their elastic modulus, thermal conductivity, and coefficient of thermal expansion—all change. They allow everything to separate really well,” he says.

Very cold crushing
As reported in Materials Today, the test subjects in this case were computer mice—or at least their PCB innards. The cryo-mill contained argon gas and a single tool-grade steel ball. A steady stream of liquid nitrogen kept the container at 154 kelvins (minus 182 degrees Fahrenheit).
Shaking makes the ball smash the polymer first, then the metals, and then the oxides just long enough to separate the materials into a powder, with particles between 20 and 100 nanometers wide. That can take up to three hours, after which a water bath separates the particles.
“Then they can be reused,” Tiwary says. “Nothing is wasted.”
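The reported operating temperature is easy to cross-check with a unit conversion; the kelvin figure does come out at about minus 182 degrees Fahrenheit, as stated:

```python
def kelvin_to_fahrenheit(k: float) -> float:
    """Convert a temperature from kelvins to degrees Fahrenheit."""
    return (k - 273.15) * 9 / 5 + 32

# The cryo-mill container's reported operating temperature:
t_f = kelvin_to_fahrenheit(154)
print(f"154 K = {t_f:.0f} °F")   # 154 K = -182 °F
```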
Source: Rice University
A new nanofiber solution creates thin, see-through air filters that offer airflow 2.5 times better than that of conventional air filters.
The researchers engineered organic molecules from phthalocyanine, a chemical compound commonly used in dyeing, which can self-organize to form nanoparticles and then nanofibers.
The nanofibers are suspended in liquid and easily “cling” onto non-woven mesh. Spreading the concoction onto non-woven mesh and leaving it to dry naturally produces improved air filters.
Most of the current nanofibers used in air filters are energy intensive to produce and require specialized equipment, according to team leader Tan Swee Ching of the Department of Materials Science and Engineering at the National University of Singapore.
“Our team has developed a simple, quick and cost-effective way of producing high-quality air filters that effectively remove harmful particles and further improves indoor air quality by enhancing air ventilation and reducing harmful UV rays,” he says. He adds that it may be possible in the future to buy the nanofiber solution and create DIY air filters at home.
Air pollution poses serious health threats. Even limited exposure to air pollutants can trigger respiratory symptoms and aggravate existing heart or lung conditions. Even healthy people can suffer from irritation of the eyes, nose, and throat.
The new air filter offers a quality factor about two times higher than that of commercial respirators. It can filter up to 90 percent of hazardous particles smaller than 2.5 microns—also known as PM2.5 particles.
Tan says that high-efficiency air filters often require multiple layers of microfibers or nanofibers, thus limiting their transparency and as such, are not suitable for incorporation in doors and windows.
“The see-through air filter developed using our approach has promising applications in terms of improving indoor air quality, and could be especially useful for countries experiencing haze or with high pollution levels. While increasing filtration efficiency will lead to a trade-off in airflow, the overall performance of our air filter is still better than commercial respirators,” he explains.
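The “quality factor” mentioned above is a standard aerosol-filtration metric that balances capture efficiency against the pressure drop (airflow resistance) a filter imposes: QF = −ln(1 − efficiency) / pressure drop. The sketch below uses made-up pressure-drop numbers, not values from the paper, to show how a thinner, more breathable filter can beat a more efficient but more restrictive one on this metric:

```python
import math

def quality_factor(efficiency: float, pressure_drop_pa: float) -> float:
    """Standard filtration quality factor QF = -ln(penetration) / pressure drop.
    Higher is better: more particle capture per unit of airflow resistance."""
    penetration = 1.0 - efficiency
    return -math.log(penetration) / pressure_drop_pa

# Hypothetical numbers for illustration only (not from the study):
thin_filter = quality_factor(efficiency=0.90, pressure_drop_pa=30.0)   # breathable
respirator = quality_factor(efficiency=0.99, pressure_drop_pa=120.0)   # restrictive
print(f"thin filter QF = {thin_filter:.4f} 1/Pa, respirator QF = {respirator:.4f} 1/Pa")
```

With these assumed numbers the thin filter's quality factor is twice the respirator's despite its lower raw efficiency, mirroring the trade-off Tan describes.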
The research team has filed a patent for their invention, and plans to include further functionalities, such as antibacterial properties, into the air filter. They plan to work with industry partners to commercialize the technology as well. Their work appears in the journal Small.
Source: National University of Singapore
Expertise is clearly beneficial in the workplace, but workers who are highly trained may actually be at more risk for making errors if they are interrupted.
The reason: Since these workers are generally faster at performing procedural tasks, their actions are more closely spaced in time—which means it can be harder to recall where exactly a worker left off when the interruption happened.
“Suppose a nurse is interrupted while preparing to give a dose of medication and then must remember whether he or she administered the dose,” says Erik Altmann, a psychology professor at Michigan State University.
“The more experienced nurse will remember less accurately than a less-practiced nurse, other things being equal, if the more experienced nurse performs the steps involved in administering medication more quickly.”
That’s not to say skilled nurses should avoid giving medication, but only that high skill levels could be a risk factor for increased errors after interruptions. Experts who perform a task quickly and accurately have probably figured out strategies for keeping their place in a task, Altmann says.
For the study, published online in the Journal of Experimental Psychology: General, 224 people performed two sessions of a computer-based procedural task on separate days. Participants were interrupted randomly by a simple typing task, after which they had to remember the last step they performed to select the correct step to perform next.
In the second session, people became faster, and on most measures, more accurate. After interruptions, however, they became less accurate, making more errors by resuming the task at the wrong spot.
“The faster things happen, the worse we remember them,” Altmann says. The findings suggest that it may be beneficial to offer training and equipment designed to help employees remember where they were when they stopped working.
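Altmann's point can be caricatured with an activation-decay simulation in the spirit of ACT-R memory models (a toy illustration, not the study's actual model; the decay and noise parameters are arbitrary): each completed step leaves a trace whose strength decays with its age, and resuming means retrieving the most recent trace under noise. The closer together the steps were performed, the more similar the last two traces are, and the more often noisy retrieval picks the wrong one.

```python
import math
import random

random.seed(0)

def resume_accuracy(step_spacing: float, noise: float = 0.3,
                    decay: float = 0.5, trials: int = 10_000) -> float:
    """Fraction of trials where noisy retrieval correctly picks the most
    recent step over the one before it. Trace strength ~ -decay * ln(age)."""
    age_last = 1.0                    # time since the last completed step
    age_prev = 1.0 + step_spacing     # time since the step before it
    correct = 0
    for _ in range(trials):
        a_last = -decay * math.log(age_last) + random.gauss(0.0, noise)
        a_prev = -decay * math.log(age_prev) + random.gauss(0.0, noise)
        if a_last > a_prev:           # retrieved the true resumption point
            correct += 1
    return correct / trials

fast_worker = resume_accuracy(step_spacing=0.5)   # expert: steps tightly spaced
slow_worker = resume_accuracy(step_spacing=2.0)   # novice: steps spread out
print(f"resume accuracy, fast pace: {fast_worker:.2f}, slow pace: {slow_worker:.2f}")
```

In this toy model the tightly spaced (expert) condition resumes at the wrong step noticeably more often, matching the pattern the study reports.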
The Office of Naval Research funded the work.
Source: Michigan State University