-
Space Daily News
From
SpaceDaily@1337:3/111 to
All on Tue May 19 21:30:16 2020
Date:
May 19, 2020
Source:
University of Surrey
Summary:
Scientists have taken inspiration from the structural colour of
butterfly wings and peacock feathers to develop an innovative
opal-like material that could be the cornerstone of
next-generation smart sensors.
FULL STORY
__________________________________________________________________
Scientists have taken inspiration from the structural colour of
butterfly wings and peacock feathers to develop an innovative opal-like
material that could be the cornerstone of next-generation smart sensors.
An international team of scientists, led by the Universities of Surrey
and Sussex, has developed colour-changing, flexible photonic crystals
that could be used to develop sensors that warn when an earthquake
might strike next.
The wearable, robust and low-cost sensors can respond sensitively to
light, temperature, strain or other physical and chemical stimuli,
making them an extremely promising option for cost-effective smart
visual sensing applications in a range of sectors including healthcare
and food safety.
In a study published by the journal Advanced Functional Materials,
researchers outline a method to produce photonic crystals containing a
minuscule amount of graphene resulting in a wide range of desirable
qualities with outputs directly observable by the naked eye.
Intensely green under natural light, the extremely versatile sensors
change colour to blue when stretched or turn transparent after being
heated.
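The colour shifts follow from the physics of structural colour: the
reflected wavelength is set by the spacing of the crystal's lattice
planes, so stretching or heating retunes the colour. A minimal
back-of-envelope sketch in Python using the first-order Bragg-Snell
relation; the plane spacing and effective refractive index below are
round illustrative assumptions, not values from the paper:

    import math

    def reflected_wavelength_nm(spacing_nm, n_eff=1.5, theta_deg=0.0):
        """First-order Bragg-Snell reflection: 2*d*sqrt(n^2 - sin^2(theta))."""
        theta = math.radians(theta_deg)
        return 2.0 * spacing_nm * math.sqrt(n_eff**2 - math.sin(theta)**2)

    # At rest, ~180 nm plane spacing reflects green light (~540 nm).
    print(reflected_wavelength_nm(180))   # 540.0
    # Stretching thins the planes perpendicular to the pull, shifting
    # the reflected colour towards blue.
    print(reflected_wavelength_nm(155))   # 465.0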
Dr. Izabela Jurewicz, Lecturer in Soft Matter Physics at the University
of Surrey's Faculty of Engineering and Physical Sciences, said: "This
work provides the first experimental demonstration of mechanically
robust yet soft, free-standing and flexible, polymer-based opals
containing solution-exfoliated pristine graphene. While these crystals
are beautiful to look at, we're also very excited about the huge impact
they could make to people's lives."
Alan Dalton, Professor of Experimental Physics at the University of
Sussex's School of Mathematical and Physical Sciences, said: "Our
research here has taken inspiration from the amazing biomimicry
abilities in butterfly wings, peacock feathers and beetle shells where
the colour comes from structure and not from pigments. Whereas nature
has developed these materials over millions of years, we are slowly
catching up in a much shorter period."
Among their many potential applications are:
* Time-temperature indicators (TTI) for intelligent packaging -- The
sensors are able to give a visual indication if perishables, such
as food or pharmaceuticals, have experienced undesirable
time-temperature histories. The crystals are extremely sensitive to
even a small rise in temperature between 20 and 100 degrees C.
* Fingerprint analysis -- Their pressure-responsive shape-memory
characteristics are attractive for biometric and
anti-counterfeiting applications. Pressing the crystals with a bare
finger can reveal fingerprints with high precision showing
well-defined ridges from the skin.
* Bio-sensing -- The photonic crystals can be used as tissue
scaffolds for understanding human biology and disease. If
functionalised with biomolecules, they could act as highly sensitive
point-of-care testing devices for respiratory viruses, offering
inexpensive, reliable, user-friendly biosensing systems.
* Bio/health monitoring -- The sensors' mechanochromic response allows
for their application as body sensors which could help improve
technique in sports players.
* Healthcare safety -- Scientists suggest the sensors could be used
in a wrist band which changes colour to indicate to patients if
their healthcare practitioner has washed their hands before
entering an examination room.
The research draws on the Materials Physics Group's (University of
Sussex) expertise in the liquid processing of two-dimensional
nanomaterials and the Soft Matter Group's (University of Surrey)
experience in polymer colloids, combining them with the Advanced
Technology Institute's expertise in optical modelling of complex
materials. Both
universities are working with the Sussex-based company Advanced
Materials Development (AMD) Ltd to commercialise the technology.
Joseph Keddie, Professor of Soft Matter Physics at the University of
Surrey, said: "Polymer particles are used to manufacture everyday
objects such as inks and paints. In this research, we were able to
finely distribute graphene at distances comparable to the wavelengths of
visible light and showed how adding tiny amounts of the two-dimensional
wonder-material leads to emerging new capabilities."
John Lee, CEO of Advanced Materials Development (AMD) Ltd, said: "Given
the versatility of these crystals, this method represents a simple,
inexpensive and scalable approach to produce multi-functional graphene
infused synthetic opals and opens up exciting applications for novel
nanomaterial-based photonics. We are very excited to be able to bring
it to market in the near future."
__________________________________________________________________
Story Source:
Materials provided by University of Surrey. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Izabela Jurewicz, Alice A. K. King, Ravi Shanker, Matthew J. Large,
Ronan J. Smith, Ross Maspero, Sean P. Ogilvie, Jurgen Scheerder,
Jun Han, Claudia Backes, Joselito M. Razal, Marian Florescu, Joseph
L. Keddie, Jonathan N. Coleman, Alan B. Dalton. Mechanochromic and
Thermochromic Sensors Based on Graphene Infused Polymer Opals.
Advanced Functional Materials, 2020; 2002473 DOI:
10.1002/adfm.202002473
__________________________________________________________________
--- up 17 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 19 21:30:16 2020
Date:
May 19, 2020
Source:
Chalmers University of Technology
Summary:
Solid state batteries are of great interest to the electric
vehicle industry. Scientists now present a new way of bringing
this promising concept closer to application. An interlayer,
made of a spreadable, 'butter-like' material helps improve the
current density tenfold, while also increasing performance and
safety.
FULL STORY
__________________________________________________________________
Solid state batteries are of great interest to the electric vehicle
industry. Scientists at Chalmers University of Technology, Sweden, and
Xi'an Jiaotong University, China, now present a new way of taking this
promising concept closer to large-scale application. An interlayer,
made of a spreadable, 'butter-like' material helps improve the current
density tenfold, while also increasing performance and safety.
"This interlayer makes the battery cell significantly more stable, and
therefore able to withstand much higher current density. What is also
important is that it is very easy to apply the soft mass onto the
lithium metal anode in the battery -- like spreading butter on a
sandwich," says researcher Shizhao Xiong at the Department of Physics
at Chalmers.
Alongside Chalmers Professor Aleksandar Matic and Professor Song's
research group in Xi'an, Shizhao Xiong has been working for a long time
on crafting a suitable interlayer to stabilise the interface for solid
state batteries. The new results were recently presented in the
scientific journal Advanced Functional Materials.
Solid state batteries could revolutionise electric transport. Unlike
today's lithium-ion batteries, solid-state batteries have a solid
electrolyte and therefore contain no environmentally harmful or
flammable liquids.
Simply put, a solid-state battery can be likened to a dry sandwich. A
layer of the metal lithium acts as a slice of bread, and a ceramic
substance is laid on top like a filling. This hard substance is the
solid electrolyte of the battery, which transports lithium ions between
the electrodes of the battery. But the 'sandwich' is so dry, it is
difficult to keep it together -- and there are also problems caused by
the compatibility between the 'bread' and the 'topping'. Many
researchers around the world are working to develop suitable
solutions to this problem.
The material which the researchers in Gothenburg and Xi'an are now
working with is a soft, spreadable, 'butter-like' substance, made of
nanoparticles of the ceramic electrolyte, LAGP, mixed with an ionic
liquid. The liquid encapsulates the LAGP particles and makes the
interlayer soft and protective. The material, which has a similar
texture to butter from the fridge, serves several functions and can be
spread easily.
Although the potential of solid-state batteries is very well known,
there is as yet no established way of making them sufficiently stable,
especially at high current densities, when a lot of energy is extracted
from a battery cell very quickly -- that is, during fast charging or
discharging.
The Chalmers researchers see great potential in the development of this
new interlayer.
"This is an important step on the road to being able to manufacture
large-scale, cost-effective, safe and environmentally friendly
batteries that deliver high capacity and can be charged and discharged
at a high rate," says Aleksandar Matic, Professor at the Department of
Physics at Chalmers, who predicts that solid state batteries will be on
the market within five years.
__________________________________________________________________
Story Source:
Materials provided by Chalmers University of Technology. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Shizhao Xiong, Yangyang Liu, Piotr Jankowski, Qiao Liu, Florian
Nitze, Kai Xie, Jiangxuan Song, Aleksandar Matic. Design of a
Multifunctional Interlayer for NASICON-Based Solid-State Li Metal
Batteries. Advanced Functional Materials, 2020; 2001444 DOI:
10.1002/adfm.202001444
__________________________________________________________________
--- up 17 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 19 21:30:16 2020
Light used to accelerate supercurrents and access the quantum world
Date:
May 19, 2020
Source:
Iowa State University
Summary:
Scientists are using light waves to accelerate supercurrents to
access the unique and potentially useful properties of the
quantum world.
FULL STORY
__________________________________________________________________
Scientists are using light waves to accelerate supercurrents and access
the unique properties of the quantum world, including forbidden light
emissions that one day could be applied to high-speed, quantum
computers, communications and other technologies.
The scientists have seen unexpected things in supercurrents --
electricity that moves through materials without resistance, usually at
super cold temperatures -- that break symmetry and are supposed to be
forbidden by the conventional laws of physics, said Jigang Wang, a
professor of physics and astronomy at Iowa State University, a senior
scientist at the U.S. Department of Energy's Ames Laboratory and the
leader of the project.
Wang's lab has pioneered the use of light pulses at terahertz
frequencies -- trillions of cycles per second -- to accelerate electron
pairs, known
as Cooper pairs, within supercurrents. In this case, the researchers
tracked light emitted by the accelerated electron pairs. What they
found were "second harmonic light emissions," or light at twice the
frequency of the incoming light used to accelerate electrons.
That, Wang said, is analogous to color shifting from the red spectrum
to the deep blue.
"These second harmonic terahertz emissions are supposed to be forbidden
in superconductors," he said. "This is against the conventional
wisdom."
Wang and his collaborators -- including Ilias Perakis, professor and
chair of physics at the University of Alabama at Birmingham and
Chang-beom Eom, the Raymond R. Holton Chair for Engineering and
Theodore H. Geballe Professor at the University of Wisconsin-Madison --
report their discovery in a research paper just published online by the
scientific journal Physical Review Letters.
"The forbidden light gives us access to an exotic class of quantum
phenomena -- that's the energy and particles at the small scale of
atoms -- called forbidden Anderson pseudo-spin precessions," Perakis
said.
(The phenomena are named after the late Philip W. Anderson, co-winner
of the 1977 Nobel Prize in Physics, who conducted theoretical studies of
electron movements within disordered materials such as glass that lack
a regular structure.)
Wang's recent studies have been made possible by a tool called quantum
terahertz spectroscopy that can visualize and steer electrons. It uses
terahertz laser flashes as a control knob to accelerate supercurrents
and access new and potentially useful quantum states of matter. The
National Science Foundation has supported development of the instrument
as well as the current study of forbidden light.
The scientists say access to this and other quantum phenomena could
help drive major innovations:
* "Just like today's gigahertz transistors and 5G wireless routers
replaced megahertz vacuum tubes or thermionic valves over half a
century ago, scientists are searching for a leap forward in design
principles and novel devices in order to achieve quantum computing
and communication capabilities," said Perakis, with Alabama at
Birmingham. "Finding ways to control, access and manipulate the
special characteristics of the quantum world and connect them to
real-world problems is a major scientific push these days. The
National Science Foundation has included quantum studies in its '10
Big Ideas' for future research and development critical to our
nation."
* Wang said, "The determination and understanding of symmetry
breaking in superconducting states is a new frontier in both
fundamental quantum matter discovery and practical quantum
information science. Second harmonic generation is a fundamental
symmetry probe. This will be useful in the development of future
quantum computing strategies and electronics with high speeds and
low energy consumption."
Before they can get there, though, researchers need to do more
exploring of the quantum world. And this forbidden second harmonic
light emission in superconductors, Wang said, represents "a fundamental
discovery of quantum matter."
__________________________________________________________________
Story Source:
Materials provided by Iowa State University. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. C. Vaswani, M. Mootz, C. Sundahl, D. H. Mudiyanselage, J. H. Kang,
X. Yang, D. Cheng, C. Huang, R. H. J. Kim, Z. Liu, L. Luo, I. E.
Perakis, C. B. Eom, and J. Wang. Terahertz Second-Harmonic
Generation from Lightwave Acceleration of Symmetry-Breaking
Nonlinear Supercurrents. Physical Review Letters, 2020 DOI:
10.1103/PhysRevLett.124.207003
__________________________________________________________________
--- up 17 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 19 21:30:16 2020
Date:
May 19, 2020
Source:
Helmholtz-Zentrum Dresden-Rossendorf
Summary:
Higher frequencies mean faster data transfer and more powerful
processors. Technically, however, it is anything but easy to
keep increasing clock rates and radio frequencies. New materials
could solve the problem. Experiments have now produced a
promising result: Researchers were able to get a novel material
to increase the frequency of a terahertz radiation flash by a
factor of seven: a first step for potential IT applications.
FULL STORY
__________________________________________________________________
Higher frequencies mean faster data transfer and more powerful
processors -- the formula that has been driving the IT industry for
years. Technically, however, it is anything but easy to keep increasing
clock rates and radio frequencies. New materials could solve the
problem. Experiments at Helmholtz-Zentrum Dresden-Rossendorf (HZDR)
have now produced a promising result: An international team of
researchers was able to get a novel material to increase the frequency
of a terahertz radiation flash by a factor of seven: a first step for
potential IT applications, as the group reports in the journal Nature
Communications.
When smartphones receive data and computer chips perform calculations,
such processes always involve alternating electric fields that send
electrons on clearly defined paths. Higher field frequencies mean that
electrons can do their job faster, enabling higher data transfer rates
and greater processor speeds. The current ceiling is the terahertz
range, which is why researchers all over the world are keen to
understand how terahertz fields interact with novel materials. "Our
TELBE terahertz facility at HZDR is an outstanding source for studying
these interactions in detail and identifying promising materials," says
Jan-Christoph Deinert from HZDR's Institute of Radiation Physics. "A
possible candidate is cadmium arsenide, for example."
The physicist has studied this compound alongside researchers from
Dresden, Cologne, and Shanghai. Cadmium arsenide (Cd3As2) belongs to
the group of so-called three-dimensional Dirac materials, in which
electrons can interact very quickly and efficiently, both with each
other and with rapidly oscillating alternating electric fields. "We
were particularly interested in whether the cadmium arsenide also emits
terahertz radiation at new, higher frequencies," explains TELBE
beamline scientist Sergey Kovalev. "We have already observed this very
successfully in graphene, a two-dimensional Dirac material." The
researchers suspected that cadmium arsenide's three-dimensional
electronic structure would help attain high efficiency in this
conversion.
In order to test this, the experts used a special process to produce
ultra-thin high-purity platelets from cadmium arsenide, which they then
subjected to terahertz pulses from the TELBE facility. Detectors behind
the platelet recorded how the cadmium arsenide reacted to
the radiation pulses. The result: "We were able to show that cadmium
arsenide acts as a highly effective frequency multiplier and does not
lose its efficiency, not even under the very strong terahertz pulses
that can be generated at TELBE," reports former HZDR researcher Zhe
Wang, who now works at the University of Cologne. The experiment was
the first ever to demonstrate the phenomenon of terahertz frequency
multiplication up to the seventh harmonic in this still young class of
materials.
Electrons dance to their own beat
In addition to the experimental evidence, the team, together with
researchers from the Max Planck Institute for the Physics of Complex
Systems, also provided a detailed theoretical description of what
occurred: The terahertz pulses that hit the cadmium arsenide generate a
strong electric field. "This field accelerates the free electrons in
the material," Deinert describes. "Imagine a huge number of tiny steel
pellets rolling around on a plate that is being tipped from side to
side very fast."
The electrons in the cadmium arsenide respond to this acceleration by
emitting electromagnetic radiation. The crucial thing is that they do
not exactly follow the rhythm of the terahertz field, but oscillate on
rather more complicated paths, which is a consequence of the material's
unusual electronic structure. As a result, the electrons emit new
terahertz pulses at odd integer multiples of the original frequency --
a non-linear effect similar to a piano: When you hit the A key on the
keyboard, the instrument not only sounds the key you played, but also a
rich spectrum of overtones, the harmonics.
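That piano-style intuition can be checked numerically: an
odd-symmetric, saturating response to a sine drive re-emits only at
odd multiples of the drive frequency. A hedged Python sketch (the tanh
saturation is a generic stand-in, not the Dirac-electron dynamics in
the paper):

    import numpy as np

    fs, f0 = 1000.0, 5.0
    t = np.arange(0, 2.0, 1.0 / fs)
    field = np.cos(2 * np.pi * f0 * t)   # terahertz-like drive
    current = np.tanh(10.0 * field)      # odd-symmetric saturation

    spectrum = np.abs(np.fft.rfft(current))
    freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
    print(freqs[spectrum > 0.05 * spectrum.max()])
    # Only odd multiples of f0 appear (5, 15, 25, 35, ... Hz); the even
    # harmonics vanish by the odd symmetry of the response.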
For a post-5G world
The phenomenon holds promise for numerous future applications, for
example in wireless communication, which trends towards ever higher
radio frequencies that can transmit far more data than today's
conventional channels. The industry is currently rolling out the 5G
standard. Components made of Dirac materials could one day use even
higher frequencies -- and thus enable even greater bandwidth than 5G.
The new class of materials also seems to be of interest for future
computers as Dirac-based components could, in theory, facilitate higher
clock rates than today's silicon-based technologies.
But first, the basic science behind it requires further study. "Our
research result was only the first step," stresses Zhe Wang. "Before we
can envision concrete applications, we need to increase the efficiency
of the new materials." To this end, the experts want to find out how
well they can control frequency multiplication by applying an electric
current. And they want to dope their samples, i.e. enrich them with
foreign atoms, in the hope of optimizing nonlinear frequency
conversion.
__________________________________________________________________
Story Source:
Materials provided by Helmholtz-Zentrum Dresden-Rossendorf.
Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Sergey Kovalev, Renato M. A. Dantas, Semyon Germanskiy,
Jan-Christoph Deinert, Bertram Green, Igor Ilyakov, Nilesh Awari,
Min Chen, Mohammed Bawatna, Jiwei Ling, Faxian Xiu, Paul H. M. van
Loosdrecht, Piotr Surówka, Takashi Oka, Zhe Wang. Non-perturbative
terahertz high-harmonic generation in the three-dimensional Dirac
semimetal Cd3As2. Nature Communications, 2020; 11 (1) DOI:
10.1038/s41467-020-16133-8
__________________________________________________________________
--- up 17 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 19 21:30:16 2020
Date:
May 19, 2020
Source:
Institute of Industrial Science, The University of Tokyo
Summary:
Researchers have created a way for artificial neuronal networks
to communicate with biological neuronal networks. The new system
converts artificial electrical spiking signals to a visual
pattern that is then used to entrain the real neurons via
optogenetic stimulation of the network. This advance will be
important for future neuroprosthetic devices that replace
damaged neurons with artificial neuronal circuitry.
FULL STORY
__________________________________________________________________
Researchers have created a way for artificial neuronal networks to
communicate with biological neuronal networks. The new system converts
artificial electrical spiking signals to a visual pattern that is then
used to entrain the real neurons via optogenetic stimulation of the
network. This advance will be important for future neuroprosthetic
devices that replace damaged neurons with artificial neuronal
circuitry.
A prosthesis is an artificial device that replaces an injured or
missing part of the body. You can easily imagine a stereotypical pirate
with a wooden leg or Luke Skywalker's famous robotic hand. Less
dramatically, think of old-school prosthetics like glasses and contact
lenses that correct the focus of the natural lenses in our eyes. Now
try to imagine
a prosthesis that replaces part of a damaged brain. What could
artificial brain matter be like? How would it even work?
Creating neuroprosthetic technology is the goal of an international
team led by the Ikerbasque Researcher Paolo Bonifazi from Biocruces
Health Research Institute (Bilbao, Spain), and Timothée Levi from
Institute of Industrial Science, The University of Tokyo and from IMS
lab, University of Bordeaux. Although several types of artificial
neurons have been developed, none have been truly practical for
neuroprostheses. One of the biggest problems is that neurons in the
brain communicate very precisely, but electrical output from the
typical electrical neural network is unable to target specific neurons.
To overcome this problem, the team converted the electrical signals to
light. As Levi explains, "advances in optogenetic technology allowed us
to precisely target neurons in a very small area of our biological
neuronal network."
Optogenetics is a technology that takes advantage of several
light-sensitive proteins found in algae and other organisms. Inserting
these proteins into neurons is a kind of hack; once they are there,
shining light onto a neuron will make it active or inactive, depending
on the type of protein. In this case, the researchers used proteins
that were activated specifically by blue light. In their experiment,
they first converted the electrical output of the spiking neuronal
network into a checkered pattern of blue and black squares. Then,
they shined this pattern down onto a 0.8 by 0.8 mm square of the
biological neuronal network growing in the dish. Within this square,
only neurons hit by the light coming from the blue squares were
directly activated.
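The mapping itself is conceptually simple. Below is a minimal sketch
of how spiking output could drive such a stimulation pattern; the grid
size and neuron grouping are invented for illustration, and the
paper's actual dimensions and encoding differ:

    import numpy as np

    GRID = 8  # hypothetical 8x8 grid of projected squares

    def frame_from_spikes(spiking_groups):
        """Return a GRID x GRID mask: 1 = blue square (light on), 0 = black."""
        frame = np.zeros((GRID, GRID), dtype=int)
        for g in spiking_groups:
            frame[divmod(g, GRID)] = 1   # group index -> (row, column)
        return frame

    # One projected frame in which artificial groups 0, 9 and 27 fired:
    print(frame_from_spikes([0, 9, 27]))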
Spontaneous activity in cultured neurons produces synchronous activity
that follows a certain kind of rhythm. This rhythm is defined by the
way the neurons are connected together, the types of neurons, and their
ability to adapt and change.
"The key to our success," says Levi, "was understanding that the
rhythms of the artificial neurons had to match those of the real
neurons. Once we were able to do this, the biological network was able
to respond to the 'melodies' sent by the artificial one. Preliminary
results obtained during the European Brainbow project helped us to
design these biomimetic artificial neurons."
They tuned the artificial neuronal network to use several different
rhythms until they found the best match. Groups of neurons were
assigned to specific pixels in the image grid and the rhythmic activity
was then able to change the visual pattern that was shined onto the
cultured neurons. The light patterns were projected onto a very small
area
of the cultured neurons, and the researchers were able to verify local
reactions as well as changes in the global rhythms of the biological
network.
"Incorporating optogenetics into the system is an advance towards
practicality," says Levi. "It will allow future biomimetic devices to
communicate with specific types of neurons or within specific neuronal
circuits." The team is optimistic that future prosthetic devices using
their system will be able to replace damaged brain circuits and restore
communication between brain regions. "At University of Tokyo, in
collaboration with Prof. Kohno and Dr. Ikeuchi, we are focusing on the
design of bio-hybrid neuromorphic systems to create a new generation of
neuroprostheses," says Levi.
__________________________________________________________________
Story Source:
Materials provided by Institute of Industrial Science, The
University of Tokyo. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Yossi Mosbacher, Farad Khoyratee, Miri Goldin, Sivan Kanner,
Yenehaetra Malakai, Moises Silva, Filippo Grassia, Yoav Ben Simon,
Jesus Cortes, Ari Barzilai, Timothée Levi, Paolo Bonifazi. Toward
neuroprosthetic real-time communication from in silico to
biological neuronal network via patterned optogenetic stimulation.
Scientific Reports, 2020; 10 (1) DOI:
10.1038/s41598-020-63934-4
__________________________________________________________________
--- up 17 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 30 21:30:08 2020
Date:
April 30, 2020
Source:
North Carolina State University
Summary:
Engineering researchers have created ultrathin, stretchable
electronic material that is gas permeable, allowing the material
to 'breathe.' The material was designed specifically for use in
biomedical or wearable technologies, since the gas permeability
allows sweat and volatile organic compounds to evaporate away
from the skin, making it more comfortable for users --
especially for long-term wear.
FULL STORY
__________________________________________________________________
Engineering researchers have created ultrathin, stretchable electronic
material that is gas permeable, allowing the material to "breathe." The
material was designed specifically for use in biomedical or wearable
technologies, since the gas permeability allows sweat and volatile
organic compounds to evaporate away from the skin, making it more
comfortable for users -- especially for long-term wear.
"The gas permeability is the big advance over earlier stretchable
electronics," says Yong Zhu, co-corresponding author of a paper on the
work and a professor of mechanical and aerospace engineering at North
Carolina State University. "But the method we used for creating the
material is also important because it's a simple process that would be
easy to scale up."
Specifically, the researchers used a technique called the breath figure
method to create a stretchable polymer film featuring an even
distribution of holes. The film is coated by dipping it in a solution
that contains silver nanowires. The researchers then heat-press the
material to seal the nanowires in place.
"The resulting film shows an excellent combination of electric
conductivity, optical transmittance and water-vapor permeability," Zhu
says. "And because the silver nanowires are embedded just below the
surface of the polymer, the material also exhibits excellent stability
in the presence of sweat and after long-term wear."
"The end result is extremely thin -- only a few micrometers thick,"
says Shanshan Yao, co-author of the paper and a former postdoctoral
researcher at NC State who is now on faculty at Stony Brook University.
"This allows for better contact with the skin, giving the electronics a
better signal-to-noise ratio.
"And gas permeability of wearable electronics is important for more
than just comfort," Yao says. "If a wearable device is not gas
permeable, it can also cause skin irritation."
To demonstrate the material's potential for use in wearable
electronics, the researchers developed and tested prototypes for two
representative applications.
The first prototype consisted of skin-mountable, dry electrodes for use
as electrophysiologic sensors. These have multiple potential
applications, such as measuring electrocardiography (ECG) and
electromyography (EMG) signals.
"These sensors were able to record signals with excellent quality, on
par with commercially available electrodes," Zhu says.
The second prototype demonstrated textile-integrated touch sensing for
human-machine interfaces. The authors used a wearable textile sleeve
integrated with the porous electrodes to play computer games such as
Tetris.
"If we want to develop wearable sensors or user interfaces that can be
worn for a significant period of time, we need gas-permeable electronic
materials," Zhu says. "So this is a significant step forward."
__________________________________________________________________
Story Source:
Materials provided by North Carolina State University. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Weixin Zhou, Shanshan Yao, Hongyu Wang, Qingchuan Du, Yanwen Ma,
Yong Zhu. Gas-Permeable, Ultrathin, Stretchable Epidermal
Electronics with Porous Electrodes. ACS Nano, 2020; DOI:
10.1021/acsnano.0c00906
__________________________________________________________________
--- up 14 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 30 21:30:08 2020
Mathematical model can predict cumulative deaths in US
Date:
April 30, 2020
Source:
Rutgers University
Summary:
A new mathematical model has been created to estimate the death
toll linked to the COVID-19 pandemic in the United States and
could be used around the world.
FULL STORY
__________________________________________________________________
A Rutgers engineer has created a mathematical model that accurately
estimates the death toll linked to the COVID-19 pandemic in the United
States and could be used around the world.
"Based on data available on April 28, the model showed that the
COVID-19 pandemic might be over in the United States, meaning no more
American deaths, by around late June 2020," said Hoang Pham, a
distinguished professor in the Department of Industrial and Systems
Engineering in the School of Engineering at Rutgers University-New
Brunswick. "But if testing and contact tracing strategies,
social-distancing policies, reopening of community strategies or
stay-at-home policies change significantly in the coming days and
weeks, the predicted death toll will also change."
The model, detailed in a study published in the journal Mathematics,
predicted the death toll would eventually reach about 68,120 in the
United States as a result of the SARS-CoV-2 coronavirus that causes
COVID-19. That's based on data available on April 28, and there was
high confidence (99 percent) the expected death toll would be between
66,055 and 70,304.
The model's estimates and predictions closely match reported death
totals. As of April 29, more than 58,000 Americans had succumbed to
COVID-19, according to the Johns Hopkins University COVID-19 Tracking
Map.
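As a hedged stand-in for how such a projection works -- this is a
generic logistic-curve fit to synthetic data, not Pham's published
model -- one can fit an S-shaped cumulative-death curve and read off
its plateau:

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, K, r, t0):
        """Cumulative deaths: plateau K, growth rate r, midpoint day t0."""
        return K / (1.0 + np.exp(-r * (t - t0)))

    days = np.arange(60)
    truth = logistic(days, 68000, 0.15, 30)   # synthetic ground truth
    observed = truth + np.random.default_rng(0).normal(0, 500, days.size)

    (K, r, t0), _ = curve_fit(logistic, days, observed, p0=(50000, 0.1, 25))
    print(f"projected plateau: {K:.0f} deaths, epidemic midpoint: day {t0:.1f}")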
The next steps include applying the model to global COVID-19 death data
as well as to other nations such as Italy and Spain, both of which have
experienced thousands of deaths due to COVID-19. The model could also
be used to evaluate population mortality and the spread of other
diseases.
__________________________________________________________________
Story Source:
Materials provided by Rutgers University. Note: Content may be
edited for style and length.
__________________________________________________________________
Journal Reference:
1. Hoang Pham. On Estimating the Number of Deaths Related to Covid-19.
Mathematics, 2020; 8 (5): 655 DOI: 10.3390/math8050655
__________________________________________________________________
--- up 14 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 30 21:30:08 2020
Computer models of merging neutron stars predict how to tell when this
happens
Date:
April 30, 2020
Source:
Goethe University Frankfurt
Summary:
According to modern particle physics, matter produced when
neutron stars merge is so dense that it could exist in a state
of dissolved elementary particles. This state of matter, called
quark-gluon plasma, might produce a specific signature in
gravitational waves. Physicists have now calculated this process
using supercomputers.
FULL STORY
__________________________________________________________________
Neutron stars are among the densest objects in the universe. If our
Sun, with its radius of 700,000 kilometres, were a neutron star, its
mass would be condensed into an almost perfect sphere with a radius of
around 12 kilometres. When two neutron stars collide and merge into a
hyper-massive neutron star, the matter in the core of the new object
becomes incredibly hot and dense. According to physical calculations,
these conditions could result in hadrons such as neutrons and protons,
which are the particles normally found in our daily experience,
dissolving into their components of quarks and gluons and thus
producing a quark-gluon plasma.
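The density implied by that opening comparison is worth making
concrete. A back-of-envelope Python calculation for one solar mass
packed into a 12-kilometre sphere:

    import math

    M_SUN = 1.989e30                     # kg
    r = 12.0e3                           # m

    volume = 4.0 / 3.0 * math.pi * r**3
    print(f"{M_SUN / volume:.1e} kg/m^3")
    # ~2.7e17 kg/m^3 -- roughly fourteen orders of magnitude denser
    # than ordinary rock (~3e3 kg/m^3).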
In 2017 it was discovered for the first time that merging neutron stars
send out a gravitational wave signal that can be detected on Earth. The
signal not only provides information on the nature of gravity, but also
on the behaviour of matter under extreme conditions. When these
gravitational waves were first discovered in 2017, however, they were
not recorded beyond the merging point.
This is where the work of the Frankfurt physicists begins. They
simulated merging neutron stars and the product of the merger to
explore the conditions under which a transition from hadrons to a
quark-gluon plasma would take place and how this would affect the
corresponding gravitational wave. The result: in a specific, late phase
of the life of the merged object a phase transition to the quark-gluon
plasma took place and left a clear and characteristic signature on the
gravitational-wave signal.
Professor Luciano Rezzolla from Goethe University is convinced:
"Compared to previous simulations, we have discovered a new signature
in the gravitational waves that is significantly clearer to detect. If
this signature occurs in the gravitational waves that we will receive
from future neutron-star mergers, we would have clear evidence for
the creation of quark-gluon plasma in the present universe."
__________________________________________________________________
Story Source:
Materials provided by Goethe University Frankfurt. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Lukas R. Weih, Matthias Hanauske, Luciano Rezzolla. Post-merger
gravitational wave signatures of phase transitions in binary
mergers. Physical Review Letters, 2020 DOI:
10.1103/PhysRevLett.124.171103
__________________________________________________________________
--- up 14 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 30 21:30:08 2020
Date:
April 30, 2020
Source:
Tohoku University
Summary:
A multinational team of researchers has revealed the magnetic
states of nanoscale gyroids, 3D chiral network-like
nanostructures. The findings add a new candidate system for
research into unconventional information processing and emergent
phenomena relevant to spintronics.
FULL STORY
__________________________________________________________________
A multinational team of researchers from Tohoku University and
institutions in the UK, Germany and Switzerland has revealed the
magnetic states of nanoscale gyroids, 3D chiral network-like
nanostructures. The findings add a new candidate system for research
into unconventional information processing and emergent phenomena
relevant to spintronics.
Arrays of interacting nanostructures offer the ability to realize
unprecedented material properties, as interactions can give rise to
new, "emergent" phenomena. In magnetism, such emergent phenomena have
so far only been demonstrated in 2D, in artificial spin ices and
magnonic crystals. However, progress towards realizing magnetic
"metamaterials", which could form the basis of advanced spintronic
devices by displaying emergent effects in 3D, has been hampered by two
obstacles. The first is the need to fabricate complex 3D building
blocks at dimensions smaller than 100 nm (comparable to intrinsic
magnetic lengthscales) and the second is the challenge of visualizing
their magnetic configurations.
The research team therefore decided to study nanoscale magnetic
gyroids, 3D networks composed of 3-connected vertices defined by triads
of curved nanowire-like struts (Figure 1). Gyroids have attracted much
interest, as despite their complexity they can self-assemble from a
carefully formulated combination of polymers, which can be used as a 3D
mold or template to form free-standing nanostructures (Figure 2). As
the struts connect to form spirals, gyroids have a "handedness" or
chirality, and their shape makes magnetic gyroids ideal systems to test
predictions of new magnetic properties emerging from curvature.
Measurements of the optical properties of gyroids even showed that
they can have topological properties, which along with chiral
effects are currently the subject of intense study to develop new
classes of spintronic devices. However, the magnetic states which might
exist in gyroids had not yet been established, leading to the present
study.
The researchers produced Ni75Fe25 single-gyroid and double-gyroid
(formed from a mirror-image pair of single-gyroids) nanostructures with
11 nm diameter struts and a 42 nm unit cell, via block co-polymer
templating and electrodeposition. These dimensions are comparable to
domain wall widths and spin wave wavelengths in Ni-Fe. They then imaged
the gyroid nanoparticles with off-axis electron holography, which could
map the magnetization and stray magnetic field patterns in and around
the gyroids' struts with nanometer spatial resolution. Analysis of the
patterns with the aid of finite-element micromagnetic simulations
revealed a very intricate magnetic state which is overall ferromagnetic
but without a unique equilibrium configuration (Figure 3), implying
that a magnetic gyroid can adopt a large number of stable states.
"These findings establish magnetic gyroids as a candidate of interest
for applications such as reservoir computing and spin-wave logic," said
lead author Justin Llandro. "The research takes an exciting first step
towards 3D nanoscale magnetic metamaterials which can be used to
uncover new emergent effects and advance both fundamental and applied
spintronics research."
__________________________________________________________________
Story Source:
Materials provided by Tohoku University. Note: Content may be
edited for style and length.
__________________________________________________________________
Journal Reference:
1. Justin Llandro, David M. Love, András Kovács, Jan Caron, Kunal N.
Vyas, Attila Kákay, Ruslan Salikhov, Kilian Lenz, Jürgen
Fassbender, Maik R. J. Scherer, Christian Cimorra, Ullrich Steiner,
Crispin H. W. Barnes, Rafal E. Dunin-Borkowski, Shunsuke Fukami,
Hideo Ohno. Visualizing Magnetic Structure in 3D Nanoscale Ni–Fe
Gyroid Networks. Nano Letters, 2020; DOI:
10.1021/acs.nanolett.0c00578
__________________________________________________________________
--- up 14 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 30 21:30:12 2020
Working from home may not be as green as many hope
Date:
April 30, 2020
Source:
University of Sussex
Summary:
A mass move to working-from-home accelerated by the coronavirus
pandemic might not be as beneficial to the planet as many hope,
according to a new study.
FULL STORY
__________________________________________________________________
A mass move to working-from-home accelerated by the coronavirus
pandemic might not be as beneficial to the planet as many hope,
according to a new study by the Centre for Research into Energy Demand
Solutions (CREDS).
The majority of studies on the subject analysed by University of Sussex
academics agree that working-from-home reduced commuter travel and
energy use -- by as much as 80% in some cases.
But a small number of studies found that telecommuting increased energy
use or had a negligible impact, since the energy savings were offset by
increased travel for recreation or other purposes, together with
additional energy use in the home.
The authors found that more methodologically rigorous studies were less
likely to estimate energy savings -- all six of the studies analysed
that found negligible energy reductions or increases were judged to be
methodologically good.
Dr Andrew Hook, Lecturer in Human Geography at the University of
Sussex, said:
"While most studies conclude that teleworking can contribute energy
savings, the more rigorous studies and those with a broader scope
present more ambiguous findings. Where studies include additional
impacts, such as non-work travel or office and home energy use, the
potential energy savings appear more limited -- with some studies
suggesting that, in the context of growing distances between the
workplace and home, part-week teleworking could lead to a net increase
in energy consumption."
Dr Victor Court, Lecturer at the Center for Energy Economics and
Management, IFP School, said:
"It is our belief from examining the relevant literature that
teleworking has some potential to reduce energy consumption and
associated emissions -- both through reducing commuter travel and
displacing office-related energy consumption. But if it encourages
people to live further away from work or to take additional trips, the
savings could be limited or even negative."
Studies indicate it would be better for workers to continue working
from home for all of the working week rather than splitting time
between office and home once lockdown rules are relaxed. Similarly,
companies will need to encourage the majority of staff to switch to
home working and to downsize office space to ensure significant energy
savings.
Even the mass migration of workers to home working might have only a
small impact on overall energy usage. One study noted that even if all
US information workers teleworked for four days a week, the drop in
national energy consumption would be significantly less effective than
a 20% improvement in car fuel efficiency.
The study also warns that technological advances could erode some of
the energy savings due to the short lifetime and rapid replacement of
ICTs, their increasingly complex supply chains, their dependence on
rare earth elements and the development of energy-intensive processes
such as cloud storage and video streaming.
The authors add that modern-day work patterns are becoming increasingly
complex, diversified and personalised, making it harder to track
whether teleworking is definitively contributing energy savings.
Steven Sorrell, Professor of Energy Policy at the Science Policy
Research Unit, University of Sussex, said:
"While the lockdown has clearly reduced energy consumption, only some
of those savings will be achieved in more normal patterns of
teleworking. To assess whether teleworking is really sustainable, we
need to look beyond the direct impact on commuting and investigate how
it changes a whole range of daily activities."
The paper, published in Environmental Research Letters, provides a
systematic review of current knowledge of the energy impacts of
teleworking, synthesising the results of 39 empirical studies from the
US, Europe, Thailand, Malaysia and Iran published between 1995 and
2019.
The potential energy increases from working-from-home practices
identified by the study include the following (a toy sketch of this
accounting follows the list):
* Teleworkers living further away from their place of work, so making
  longer commutes on days they worked in the office -- one study found
  UK teleworkers have a 10.7 mile longer commute than those who
  travelled into work every day.
* The time gained from not making the daily commute being used by
  teleworkers to make additional journeys for leisure and social
  purposes.
* Teleworking households spending the money saved from the daily
  commute on goods, activities and services that also require energy
  and produce emissions.
* Isolated and sedentary teleworkers taking on more journeys to combat
  negative feelings.
* Other household members making trips in cars freed up from the daily
  commute.
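A toy Python sketch of that accounting, using invented round numbers
(the review reports ranges across 39 studies, not these figures),
shows how the rebound terms can flip the sign of a teleworking day's
saving:

    def net_saving_kwh(commute_km, extra_leisure_km, extra_home_kwh,
                       kwh_per_car_km=0.7):
        """Net energy saved on one teleworking day; negative = net increase."""
        saved = commute_km * kwh_per_car_km            # avoided round trip
        rebound = extra_leisure_km * kwh_per_car_km + extra_home_kwh
        return saved - rebound

    print(net_saving_kwh(30, 5, 4))    # long commute avoided: +13.5 kWh
    print(net_saving_kwh(10, 12, 6))   # short commute, big rebound: -7.4 kWh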
Benjamin K Sovacool, Professor of Energy Policy at the Science Policy
Research Unit, University of Sussex, said:
"The body of research on the subject shows that it is too simple to
assume that teleworking is inevitably a more sustainable option. Unless
workers and employers fully commit to the working from home model, many
of the potential energy savings could be lost. A scenario after the
threat of coronavirus has cleared in which workers want the best of
both worlds -- retaining the freedom and flexibility they found in
working from home, but also the social aspects of the office that
they've missed during lockdown -- will not deliver the energy savings
the world needs."
__________________________________________________________________
Story Source:
Materials provided by University of Sussex. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Andrew Hook, Victor Court, Benjamin Sovacool, Steven Sorrell. A
systematic review of the energy and climate impacts of teleworking.
Environmental Research Letters, 2020; DOI:
10.1088/1748-9326/ab8a84
__________________________________________________________________
--- up 14 weeks, 2 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:06 2020
Date:
May 11, 2020
Source:
Royal Ontario Museum
Summary:
New research on a rock collected by the Apollo 17 astronauts has
revealed evidence for a mineral phase that can only form above
2300 °C. Such temperatures are uniquely achieved in melt sheets
following a meteorite impact, allowing the researchers to link
the 4.33-billion-year-old crystal to an ancient collision on the
Moon. The study opens the door for many of the more complex
rocks on the Moon to have formed in these destructive
environments.
FULL STORY
__________________________________________________________________
New research published today in the journal Nature Astronomy reveals a
type of destructive event most often associated with disaster movies
and dinosaur extinction may have also contributed to the formation of
the Moon's surface.
A group of international scientists led by the Royal Ontario Museum has
discovered that the formation of ancient rocks on the Moon may be
directly linked to large-scale meteorite impacts.
The scientists conducted new research of a unique rock collected by
NASA astronauts during the 1972 Apollo 17 mission to the Moon. They
found it contains mineralogical evidence that it formed at incredibly
high temperatures (in excess of 2300 °C / about 4170 °F) that can only
be
achieved by the melting of the outer layer of a planet in a large
impact event.
In the rock, the researchers discovered the former presence of cubic
zirconia, a mineral phase often used as a substitute for diamond in
jewellery. The phase would only form in rocks heated to above 2300 °C,
and though it has since reverted to a more stable phase (the mineral
known as baddeleyite), the crystal retains distinctive evidence of a
high-temperature structure. An interactive image of the complex crystal
used in the study can be seen here using the Virtual Microscope.
While looking at the structure of the crystal, the researchers also
measured the age of the grain, which reveals the baddeleyite formed
over 4.3 billion years ago. It was concluded that the high-temperature
cubic zirconia phase must have formed before this time, suggesting that
large impacts were critically important to forming new rocks on the
early Moon.
Fifty years ago, when the first samples were brought back from the
surface of the Moon, lunar scientists raised questions about how lunar
crustal rocks formed. Even today, a key question remains unanswered:
how did the outer and inner layers of the Moon mix after the Moon
formed? This new research suggests that large impacts over 4 billion
years ago could have driven this mixing, producing the complex range of
rocks seen on the surface of the Moon today.
"Rocks on Earth are constantly being recycled, but the Moon doesn't
exhibit plate tectonics or volcanism, allowing older rocks to be
preserved," explains Dr. Lee White, Hatch Postdoctoral Fellow at the
ROM. "By studying the Moon, we can better understand the earliest
history of our planet. If large, super-heated impacts were creating
rocks on the Moon, the same process was probably happening here on
Earth."
"By first looking at this rock, I was amazed by how differently the
minerals look compared to other Apollo 17 samples," says Dr. Ana
Cernok, Hatch Postdoctoral Fellow at the ROM and co-author of the
study. "Although smaller than a millimetre, the baddeleyite grain that
caught our attention was the largest one I have ever seen in Apollo
samples. This small grain is still holding the evidence for formation
of an impact basin that was hundreds of kilometres in diameter. This is
significant, because we do not see any evidence of these old impacts on
Earth."
Dr. James Darling, a reader at the University of Portsmouth and
co-author of the study, says the findings completely change scientists'
understanding of the samples collected during the Apollo missions, and,
in effect, the geology of the Moon. "These unimaginably violent
meteorite impacts helped to build the lunar crust, not only destroy
it," he says.
__________________________________________________________________
Story Source:
Materials provided by Royal Ontario Museum. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. L. F. White, A. Černok, J. R. Darling, M. J. Whitehouse, K. H. Joy,
C. Cayron, J. Dunlop, K. T. Tait, M. Anand. Evidence of extensive
lunar crust formation in impact melt sheets 4,330 Myr ago. Nature
Astronomy, 2020; DOI: 10.1038/s41550-020-1092-5
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:06 2020
Oldest solar system fluids may have been hospitable to early life
The oldest molecular fluids in the solar system could have supported the
rapid formation and evolution of the building blocks of life
Date:
May 11, 2020
Source:
Royal Ontario Museum
Summary:
Scientists have analyzed a meteorite atom by atom to reveal the
chemistry and acidity of the earliest fluids in the solar
system. By finding evidence of sodium-rich alkaline water in the
Tagish Lake meteorite, this new study suggests amino acids could
have formed rapidly on the parent asteroid, opening the door for
the early evolution of microbial life.
FULL STORY
__________________________________________________________________
The oldest molecular fluids in the solar system could have supported
the rapid formation and evolution of the building blocks of life, new
research in the journal Proceedings of the National Academy of Sciences
reveals.
An international group of scientists, led by researchers from the Royal
Ontario Museum (ROM) and co-authors from McMaster University and York
University, used state-of-the-art techniques to map individual atoms in
minerals formed in fluids on an asteroid over 4.5 billion years ago.
Studying the ROM's iconic Tagish Lake meteorite, scientists used
atom-probe tomography, a technique capable of imaging atoms in 3D, to
target molecules along boundaries and pores between magnetite grains
that likely formed on the asteroid's crust. There, they discovered
precipitates left by water in the grain boundaries, on which they
conducted their ground-breaking research.
"We know water was abundant in the early solar system," explains lead
author Dr. Lee White, Hatch postdoctoral fellow at the ROM, "but there
is very little direct evidence of the chemistry or acidity of these
liquids, even though they would have been critical to the early
formation and evolution of amino acids and, eventually, microbial
life."
This new atomic-scale research provides the first evidence of the
sodium-rich (and alkaline) fluids in which the magnetite framboids
formed. These fluid conditions are preferential for the synthesis of
amino acids, opening the door for microbial life to form as early as
4.5 billion years ago.
"Amino acids are essential building blocks of life on Earth, yet we
still have a lot to learn about how they first formed in our solar
system," says Beth Lymer, a PhD student at York University and
co-author of the study. "The more variables that we can constrain, such
as temperature and pH, allows us to better understand the synthesis and
evolution of these very important molecules into what we now know as
biotic life on Earth."
The Tagish Lake carbonaceous chondrite was retrieved from an ice sheet
in B.C.'s Tagish Lake in 2000, and later acquired by the ROM, where it
is now considered to be one of the museum's iconic objects. This history
means that the sample used by the team has never been above room
temperature or exposed to liquid water, allowing the scientists to
confidently link the measured fluids to the parent asteroid.
By using new techniques, such as atom probe tomography, the scientists
hope to develop analytical methods for planetary materials returned to
Earth by spacecraft, such as NASA's OSIRIS-REx mission or a planned
sample-return mission to Mars in the near future.
"Atom probe tomography gives us an opportunity to make fantastic
discoveries on bits of material a thousand times thinner than a human
hair," says White. "Space missions are limited to bringing back tiny
amounts of material, meaning these techniques will be critical to
allowing us to understand more about the solar system while also
preserving material for future generations."
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Royal Ontario Museum. Note: Content may
be edited for style and length.
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:06 2020
Date:
May 11, 2020
Source:
Southwest Research Institute
Summary:
Scientists have modeled the atmosphere of Mars to help determine
that salty pockets of water present on the Red Planet are likely
not habitable by life as we know it on Earth. The team helped
allay planetary protection concerns about contaminating
potential Martian ecosystems.
FULL STORY
__________________________________________________________________
A Southwest Research Institute scientist modeled the atmosphere of Mars
to help determine that salty pockets of water present on the Red Planet
are likely not habitable by life as we know it on Earth. A team that
also included scientists from Universities Space Research Association
(USRA) and the University of Arkansas helped allay planetary protection
concerns about contaminating potential Martian ecosystems. These
results were published this month in Nature Astronomy.
Due to Mars' low temperatures and extremely dry conditions, a droplet
of liquid water on its surface would instantly freeze, boil or
evaporate, unless the droplet had dissolved salts in it. This brine
would have a lower freezing temperature and would evaporate more slowly
than pure liquid water. Salts are found across Mars, so brines could
form there.
"Our team looked at specific regions on Mars -- areas where liquid
water temperature and accessibility limits could possibly allow known
terrestrial organisms to replicate -- to understand if they could be
habitable," said SwRI's Dr. Alejandro Soto, a senior research scientist
and co-author of the study. "We used Martian climate information from
both atmospheric models and spacecraft measurements. We developed a
model to predict where, when and for how long brines are stable on the
surface and shallow subsurface of Mars."
Mars' hyper-arid conditions require lower temperatures to reach high
relative humidities and tolerable water activities, which are measures
of how easily the water content may be utilized for hydration. The
maximum brine temperature expected is -55 F (about -48 C) -- at the
boundary of the theoretical low-temperature limit for life.
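As a rough illustration of the kind of screening such a model supports
-- this is not the authors' code, and the thresholds are assumptions
drawn from the figures quoted above -- a habitability check might look
like:

    # Hypothetical screening sketch; thresholds are illustrative.
    T_LIMIT_C = -48.0   # approx. theoretical low-temperature limit for life
    AW_LIMIT = 0.6      # commonly cited minimum water activity for life

    def potentially_habitable(temp_c: float, water_activity: float) -> bool:
        """True only if both limits for known terrestrial life are met."""
        return temp_c >= T_LIMIT_C and water_activity >= AW_LIMIT

    # A stable Martian brine at -55 F (-48.3 C) falls just outside:
    print(potentially_habitable(temp_c=-48.3, water_activity=0.55))  # False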
"Even extreme life on Earth has its limits, and we found that brine
formation from some salts can lead to liquid water over 40% of the
Martian surface but only seasonally, during 2% of the Martian year,"
Soto continued. "This would preclude life as we know it."
While pure liquid water is unstable on the Martian surface, models
showed that stable brines can form and persist from the equator to high
latitudes on the surface of Mars for a few percent of the year for up
to six consecutive hours, a broader range than previously thought.
However, the temperatures are well below the lowest temperatures to
support life.
"These new results reduce some of the risk of exploring the Red Planet
while also contributing to future work on the potential for habitable
conditions on Mars," Soto said.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Southwest Research Institute. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Edgard G. Rivera-Valentín, Vincent F. Chevrier, Alejandro Soto,
Germán Martínez. Distribution and habitability of (meta)stable
brines on present-day Mars. Nature Astronomy, 2020; DOI:
[19]10.1038/s41550-020-1080-9
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:06 2020
Ryugu's interaction with the sun changes what we know about asteroid history
Date:
May 11, 2020
Source:
University of Tokyo
Summary:
In February and July of 2019, the Hayabusa2 spacecraft briefly
touched down on the surface of near-Earth asteroid Ryugu. The
readings it took with various instruments at those times have
given researchers insight into the physical and chemical
properties of the 1-kilometer-wide asteroid. These findings
could help explain the history of Ryugu and other asteroids, as
well as the solar system at large.
FULL STORY
__________________________________________________________________
In February and July of 2019, the Hayabusa2 spacecraft briefly touched
down on the surface of near-Earth asteroid Ryugu. The readings it took
with various instruments at those times have given researchers insight
into the physical and chemical properties of the 1-kilometer-wide
asteroid. These findings could help explain the history of Ryugu and
other asteroids, as well as the solar system at large.
When our solar system formed around 4.6 billion years ago, most of the
material it formed from became the sun, and a fraction of a percent
became the planets and solid bodies, including asteroids. Planets have
changed a lot since the early days of the solar system due to
geological processes, chemical changes, bombardments and more. But
asteroids have remained more or less the same as they are too small to
experience those things, and are therefore useful for researchers who
investigate the early solar system and our origins.
"I believe knowledge of the evolutionary processes of asteroids and
planets are essential to understand the origins of the Earth and life
itself," said Associate Professor Tomokatsu Morota from the Department
of Earth and Planetary Science at the University of Tokyo. "Asteroid
Ryugu presents an amazing opportunity to learn more about this as it is
relatively close to home, so Hayabusa2 could make a return journey
relatively easily."
Hayabusa2 launched in December 2014 and reached Ryugu in June 2018. At
the time of writing, Hayabusa2 is on its way back to Earth and is
scheduled to deliver a payload in December 2020. This payload consists
of small samples of surface material from Ryugu collected during two
touchdowns in February and July of 2019. Researchers will learn much
from the direct study of this material, but even before it reaches us,
Hayabusa2 helped researchers to investigate the physical and chemical
makeup of Ryugu.
"We used Hayabusa2's ONC-W1 and ONC-T imaging instruments to look at
dusty matter kicked up by the spacecraft's engines during the
touchdowns," said Morota. "We discovered large amounts of very fine
grains of dark-red colored minerals. These were produced by solar
heating, suggesting at some point Ryugu must have passed close by the
sun."
Morota and his team investigated the spatial distribution of the
dark-red matter around Ryugu as well as its spectrum, or light signature.
The strong presence of the material around specific latitudes
corresponded to the areas that would have received the most solar
radiation in the asteroid's past; hence, the belief that Ryugu must
have passed by the sun.
"From previous studies we know Ryugu is carbon-rich and contains
hydrated minerals and organic molecules. We wanted to know how solar
heating chemically changed these molecules," said Morota. "Our theories
about solar heating could change what we know of orbital dynamics of
asteroids in the solar system. This in turn alters our knowledge of
broader solar system history, including factors that may have affected
the early Earth."
When Hayabusa2 delivers material it collected during both touchdowns,
researchers will unlock even more secrets of our solar history. Based
on spectral readings and albedo, or reflectivity, from within the
touchdown sites, researchers are confident that both dark-red
solar-heated material and gray unheated material were collected by
Hayabusa2. Morota and his team hope to study larger properties of
Ryugu, such as its many craters and boulders.
"I wish to study the statistics of Ryugu's surface craters to better
understand the strength characteristics of its rocks, and history of
small impacts it may have received," said Morota. "The craters and
boulders on Ryugu meant there were limited safe landing locations for
Hayabusa2. Finding a suitable location was hard work and the eventual
first successful touchdown was one of the most exciting events of my
life."
__________________________________________________________________
Story Source:
[17]Materials provided by [18]University of Tokyo. Note: Content may be
edited for style and length.
__________________________________________________________________
Journal Reference:
1. T. Morota, S. Sugita, Y. Cho, M. Kanamaru, E. Tatsumi, N. Sakatani,
R. Honda, N. Hirata, H. Kikuchi, M. Yamada, Y. Yokota, S. Kameda,
M. Matsuoka, H. Sawada, C. Honda, T. Kouyama, K. Ogawa, H. Suzuki,
K. Yoshioka, M. Hayakawa, N. Hirata, M. Hirabayashi, H. Miyamoto,
T. Michikami, T. Hiroi, R. Hemmi, O. S. Barnouin, C. M. Ernst, K.
Kitazato, T. Nakamura, L. Riu, H. Senshu, H. Kobayashi, S. Sasaki,
G. Komatsu, N. Tanabe, Y. Fujii, T. Irie, M. Suemitsu, N. Takaki,
C. Sugimoto, K. Yumoto, M. Ishida, H. Kato, K. Moroi, D. Domingue,
P. Michel, C. Pilorget, T. Iwata, M. Abe, M. Ohtake, Y. Nakauchi,
K. Tsumura, H. Yabuta, Y. Ishihara, R. Noguchi, K. Matsumoto, A.
Miura, N. Namiki, S. Tachibana, M. Arakawa, H. Ikeda, K. Wada, T.
Mizuno, C. Hirose, S. Hosoda, O. Mori, T. Shimada, S. Soldini, R.
Tsukizaki, H. Yano, M. Ozaki, H. Takeuchi, Y. Yamamoto, T. Okada,
Y. Shimaki, K. Shirai, Y. Iijima, H. Noda, S. Kikuchi, T.
Yamaguchi, N. Ogawa, G. Ono, Y. Mimasu, K. Yoshikawa, T. Takahashi,
Y. Takei, A. Fujii, S. Nakazawa, F. Terui, S. Tanaka, M. Yoshikawa,
T. Saiki, S. Watanabe, Y. Tsuda. Sample collection from asteroid
(162173) Ryugu by Hayabusa2: Implications for surface evolution.
Science, 2020; 368 (6491): 654 DOI: [19]10.1126/science.aaz6306
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:08 2020
Date:
May 11, 2020
Source:
King's College London
Summary:
Researchers have developed an artificial intelligence diagnostic
that can predict whether someone is likely to have COVID-19
based on their symptoms.
FULL STORY
__________________________________________________________________
Researchers at King's College London, Massachusetts General Hospital
and health science company ZOE have developed an artificial
intelligence diagnostic that can predict whether someone is likely to
have COVID-19 based on their symptoms. Their findings are published
today in Nature Medicine.
The AI model uses data from the COVID Symptom Study app to predict
COVID-19 infection, by comparing people's symptoms and the results of
traditional COVID tests. Researchers say this may provide help for
populations where access to testing is limited. Two clinical trials in
the UK and the US are due to start shortly.
More than 3.3 million people globally have downloaded the app and are
using it to report daily on their health status, whether they feel well
or have any new symptoms such as persistent cough, fever, fatigue and
loss of taste or smell (anosmia).
In this study, the researchers analysed data gathered from just under
2.5 million people in the UK and US who had been regularly logging
their health status in the app, around a third of whom had logged
symptoms associated with COVID-19. Of these, 18,374 reported having had
a test for coronavirus, with 7,178 people testing positive.
The research team investigated which symptoms known to be associated
with COVID-19 were most likely to be associated with a positive test.
They found a wide range of symptoms compared to cold and flu, and warn
against focusing only on fever and cough. Indeed, they found loss of
taste and smell (anosmia) was particularly striking, with two thirds of
users testing positive for coronavirus infection reporting this symptom
compared with just over a fifth of the participants who tested
negative. The findings suggest that anosmia is a stronger predictor of
COVID-19 than fever, supporting anecdotal reports of loss of smell and
taste as a common symptom of the disease.
The researchers then created a mathematical model that predicted with
nearly 80% accuracy whether an individual is likely to have COVID-19
based on their age, sex and a combination of four key symptoms: loss of
smell or taste, severe or persistent cough, fatigue and skipping meals.
Applying this model to the entire group of over 800,000 app users
experiencing symptoms predicted that just under a fifth of those who
were unwell (17.42%) were likely to have COVID-19 at that time.
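The release does not reproduce the model itself, but a predictor of
this general shape -- a logistic combination of age, sex and the four
key symptoms -- can be sketched as follows. All weights below are
invented placeholders, not the coefficients published in Nature
Medicine:

    import math

    # Hypothetical weights for illustration only.
    WEIGHTS = {
        "intercept": -2.0, "age_decades": -0.1, "male": 0.2,
        "loss_of_smell": 1.8, "persistent_cough": 0.6,
        "fatigue": 0.4, "skipped_meals": 0.5,
    }

    def covid_probability(age_decades, male, smell, cough, fatigue, meals):
        """Logistic model: weighted sum of predictors through a sigmoid."""
        z = (WEIGHTS["intercept"]
             + WEIGHTS["age_decades"] * age_decades
             + WEIGHTS["male"] * male
             + WEIGHTS["loss_of_smell"] * smell
             + WEIGHTS["persistent_cough"] * cough
             + WEIGHTS["fatigue"] * fatigue
             + WEIGHTS["skipped_meals"] * meals)
        return 1.0 / (1.0 + math.exp(-z))

    # A 40-year-old woman reporting anosmia and fatigue:
    print(round(covid_probability(4, 0, 1, 0, 1, 0), 2))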
Researchers suggest that combining this AI prediction with widespread
adoption of the app could help to identify those who are likely to be
infectious as soon as the earliest symptoms start to appear, focusing
tracking and testing efforts where they are most needed.
Professor Tim Spector from King's College London said: "Our results
suggest that loss of taste or smell is a key early warning sign of
COVID-19 infection and should be included in routine screening for the
disease. We strongly urge governments and health authorities everywhere
to make this information more widely known, and advise anyone
experiencing sudden loss of smell or taste to assume that they are
infected and follow local self-isolation guidelines."
__________________________________________________________________
Story Source:
[17]Materials provided by [18]King's College London. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Cristina Menni, Ana M. Valdes, Maxim B. Freidin, Carole H. Sudre,
Long H. Nguyen, David A. Drew, Sajaysurya Ganesh, Thomas Varsavsky,
M. Jorge Cardoso, Julia S. El-Sayed Moustafa, Alessia Visconti,
Pirro Hysi, Ruth C. E. Bowyer, Massimo Mangino, Mario Falchi,
Jonathan Wolf, Sebastien Ourselin, Andrew T. Chan, Claire J.
Steves, Tim D. Spector. Real-time tracking of self-reported
symptoms to predict potential COVID-19. Nature Medicine, 2020; DOI:
[19]10.1038/s41591-020-0916-2
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:08 2020
Tiny material differences are crucial for the functional behavior of
memristive devices
Date:
May 11, 2020
Source:
Forschungszentrum Juelich
Summary:
Memristive devices behave similarly to neurons in the brain.
Researchers have now discovered how to systematically control
the functional behaviour of these elements. The smallest
differences in material composition are found crucial:
differences so small that until now experts had failed to notice
them.
FULL STORY
__________________________________________________________________
Scientists around the world are intensively working on memristive
devices, which are capable of extremely low-power operation and behave
similarly to neurons in the brain. Researchers from the Jülich Aachen
Research Alliance (JARA) and the German technology group Heraeus have
now discovered how to systematically control the functional behaviour
of these elements. The smallest differences in material composition are
found crucial: differences so small that until now experts had failed
to notice them. The researchers' design directions could help to
increase variety, efficiency, selectivity and reliability for
memristive technology-based applications, for example for
energy-efficient, non-volatile storage devices or neuro-inspired
computers.
Memristors are considered a highly promising alternative to
conventional nanoelectronic elements in computer chips. Because of
their advantageous functionalities, their development is being eagerly
pursued by many companies and research institutions around the world.
The Japanese corporation NEC already installed the first prototypes in
space satellites back in 2017. Many other leading companies such as
Hewlett Packard, Intel, IBM, and Samsung are working to bring
innovative types of computer and storage devices based on memristive
elements to market.
Fundamentally, memristors are simply "resistors with memory," in which
high resistance can be switched to low resistance and back again. This
means in principle that the devices are adaptive, similar to a synapse
in a biological nervous system. "Memristive elements are considered
ideal candidates for neuro-inspired computers modelled on the brain,
which are attracting a great deal of interest in connection with deep
learning and artificial intelligence," says Dr. Ilia Valov of the Peter
Grünberg Institute (PGI-7) at Forschungszentrum Jülich.
In the latest issue of the open access journal Science Advances, he and
his team describe how the switching and neuromorphic behaviour of
memristive elements can be selectively controlled. According to their
findings, the crucial factor is the purity of the switching oxide
layer. "Depending on whether you use a material that is 99.999999 %
pure, and whether you introduce one foreign atom into ten million atoms
of pure material or into one hundred atoms, the properties of the
memristive elements vary substantially," says Valov.
This effect had so far been overlooked by experts. It can be used very
specifically for designing memristive systems, in a similar way to
doping semiconductors in information technology. "The introduction of
foreign atoms allows us to control the solubility and transport
properties of the thin oxide layers," explains Dr. Christian Neumann of
the technology group Heraeus. He has been contributing his materials
expertise to the project ever since the initial idea was conceived in
2015.
"In recent years there has been remarkable progress in the development
and use of memristive devices, however that progress has often been
achieved on a purely empirical basis," according to Valov. Using the
insights that his team has gained, manufacturers could now methodically
develop memristive elements, selecting the functions they need. The
higher the doping concentration, the more slowly the resistance of the
elements changes as the number of incoming voltage pulses rises or
falls, and the more stable the resistance remains. "This means that
we have found a way for designing types of artificial synapses with
differing excitability," explains Valov.
Design specification for artificial synapses
The brain's ability to learn and retain information can largely be
attributed to the fact that the connections between neurons are
strengthened when they are frequently used. Memristive devices, of
which there are different types such as electrochemical metallization
cells (ECMs) or valence change memory cells (VCMs), behave similarly.
When these components are used, the conductivity increases as the
number of incoming voltage pulses increases. The changes can also be
reversed by applying voltage pulses of the opposite polarity.
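A toy sketch of that synapse-like behaviour, with the reported doping
dependence put in by hand (the scaling law below is an assumption for
illustration, not the paper's measured relation):

    # Toy memristive synapse: each pulse nudges a normalized conductance.
    def apply_pulses(conductance, n_pulses, doping_ppm,
                     base_step=0.05, polarity=+1):
        """Higher doping -> smaller step -> slower, more stable response.
        Negative polarity reverses the change, as in ECM/VCM cells."""
        step = base_step / (1.0 + doping_ppm / 1000.0)  # assumed scaling
        target = 1.0 if polarity > 0 else 0.0
        for _ in range(n_pulses):
            conductance += step * (target - conductance)
        return conductance

    for ppm in (0, 100, 10_000):
        print(ppm, round(apply_pulses(0.1, 20, ppm), 3))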
The JARA researchers conducted their systematic experiments on ECMs,
which consist of a copper electrode, a platinum electrode, and a layer
of silicon dioxide between them. Thanks to the cooperation with Heraeus
researchers, the JARA scientists had access to different types of
silicon dioxide: one with a purity of 99.999999 % -- also called 8N
silicon dioxide -- and others containing 100 to 10,000 ppm (parts per
million) of foreign atoms. The precisely doped glass used in their
experiments was specially developed and manufactured by quartz glass
specialist Heraeus Conamic, which also holds the patent for the
procedure. Copper and protons acted as mobile doping agents, while
aluminium and gallium were used as non-volatile dopants.
Record switching time confirms theory
Based on their series of experiments, the researchers were able to show
that the ECMs' switching times change as the amount of doping atoms
changes. If the switching layer is made of 8N silicon dioxide, the
memristive component switches in only 1.4 nanoseconds. To date, the
fastest value ever measured for ECMs had been around 10 nanoseconds. By
doping the oxide layer of the components with up to 10,000 ppm of
foreign atoms, the switching time was prolonged into the range of
milliseconds. "We can also theoretically explain our results. This is
helping us to understand the physico-chemical processes on the
nanoscale and apply this knowledge in practice," says Valov. Based
on generally applicable theoretical considerations, supported by
experimental results, some also documented in the literature, he is
convinced that the doping/impurity effect occurs and can be employed in
all types of memristive elements.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Forschungszentrum Juelich. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. M. Lübben, F. Cüppers, J. Mohr, M. von Witzleben, U. Breuer, R.
Waser, C. Neumann, I. Valov. Design of defect-chemical properties
and device performance in memristive systems. Science Advances,
2020; 6 (19): eaaz9079 DOI: [19]10.1126/sciadv.aaz9079
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:10 2020
Date:
May 11, 2020
Source:
Helmholtz-Zentrum Berlin für Materialien und Energie
Summary:
Quantum spin liquids are candidates for potential use in future
information technologies. So far, quantum spin liquids have
usually been found only in one- or two-dimensional magnetic
systems. Now an international team has investigated
crystals of PbCuTe2O6 with neutron experiments.
FULL STORY
__________________________________________________________________
They found spin-liquid behaviour in 3D, due to a so-called
hyper-hyperkagome lattice. The experimental data fit the theoretical
simulations, also done at HZB, extremely well.
IT devices today are based on electronic processes in semiconductors.
The next real breakthrough could be to exploit other quantum phenomena,
for example interactions between tiny magnetic moments in the material,
the so-called spins. So-called quantum-spin liquid materials could be
candidates for such new technologies. They differ significantly from
conventional magnetic materials because quantum fluctuations dominate
the magnetic interactions: Due to geometric constraints in the crystal
lattice, spins cannot all "freeze" together in a ground state -- they
are forced to fluctuate, even at temperatures close to absolute zero.
Quantum spin liquids: a rare phenomenon
Quantum spin liquids are rare and have so far been found mainly in
two-dimensional magnetic systems. Three-dimensional isotropic spin
liquids are mostly sought in materials where the magnetic ions form
pyrochlore or hyperkagome lattices. An international team led by HZB
physicist Prof. Bella Lake has now investigated samples of PbCuTe2O6,
which has a three-dimensional lattice known as a hyper-hyperkagome lattice.
Magnetic interactions simulated
HZB physicist Prof. Johannes Reuther calculated the behaviour of such a
three-dimensional hyper-hyperkagome lattice with four magnetic
interactions and showed that the system exhibits quantum-spin liquid
behaviour with a specific magnetic energy spectrum.
Experiments at neutron sources find 3D quantum spin liquid
With neutron experiments at ISIS (UK), ILL (France) and NIST (USA),
the team was able to detect the very subtle signals of this predicted
behaviour. "We were surprised how well our data fit into the
calculations. This gives us hope that we can really understand what
happens in these systems," explains first author Dr. Shravani Chillal,
HZB.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Helmholtz-Zentrum Berlin für Materialien
und Energie. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Shravani Chillal, Yasir Iqbal, Harald O. Jeschke, Jose A.
Rodriguez-Rivera, Robert Bewley, Pascal Manuel, Dmitry Khalyavin,
Paul Steffens, Ronny Thomale, A. T. M. Nazmul Islam, Johannes
Reuther, Bella Lake. Evidence for a three-dimensional quantum spin
liquid in PbCuTe2O6. Nature Communications, 2020; 11 (1) DOI:
[19]10.1038/s41467-020-15594-1
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:10 2020
resources
Date:
May 11, 2020
Source:
CNRS
Summary:
In the past few weeks, oil prices have fallen to record lows.
This development was not predicted by the Hotelling rule, an
equation proposed in 1931 that remains central to the economics
of natural resources today. Economists present the results of a
groundbreaking historical survey of documents from Harold
Hotelling's archives. They show that in fact this 'rule' was not
designed to investigate energy markets.
FULL STORY
__________________________________________________________________
In an article written in 1931, the American economist and mathematician
Harold Hotelling published a model to describe the evolution of the
prices of non-renewable resources. Following the 1973 oil crisis, the
model aroused fresh interest: the growth theorist Robert Solow named
the initial equation in this article 'the Hotelling rule', establishing
it as a fundamental principle of the economics of non-renewable
resources. However, the prices observed over the past century have
never been in line with this equation*, something which has constantly
puzzled economists.
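In its textbook form -- spelled out in the first footnote below -- the
rule says the net price of an exhaustible resource grows at the
interest rate. In standard notation (not taken from the paper itself):

    \frac{\dot{p}(t)}{p(t)} = r
    \qquad\Longrightarrow\qquad
    p(t) = p(0)\, e^{rt}

where p(t) is the resource price and r the prevailing interest rate.
Observed prices have not followed this exponential path, which is the
empirical puzzle the authors revisit.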
Despite everything, the Hotelling rule still retains its central status
in the economics of mineral and energy resources: it is on this basis
that more sophisticated 'extensions' are constructed to account for
market realities. Roberto Ferreira da Cunha, from the Berkeley Research
Group (Brazil), and Antoine Missemer, a CNRS researcher attached to
CIRED, the International Centre for Research on Environment and
Development (CNRS/CIRAD/AgroParisTech/Ecole des Ponts ParisTech/EHESS),
undertook a detailed and unprecedented examination of Harold
Hotelling's archives**. By analysing the origins of the model, they
conclude that its scope of validity is more limited than commonly
established, and decisively clarify the reasons for its empirical
weaknesses.
Hotelling's drafts, as well as his correspondence with oil engineers,
for example, point to a reinterpretation of the 1931 article. It turns
out that the 'rule', which he had devised as early as 1924 for abstract
assets, was in no way intended to be applied to the concrete case of
mineral and energy resources. From 1925 to 1930, Hotelling himself
identified unavoidable geological constraints that changed his initial
result: increased production costs as extraction progresses, or the
cost resulting from ramped-up production. As he outlined, this
transformed his model, which was then potentially able to describe
bell-shaped production paths, such as those used in debates about peak
oil.
The two researchers thus show that, if the Hotelling rule has such
difficulty in passing the hurdle of empirical tests in the field of
energy and mineral resources, it is because it was not designed for
that! They propose to reconstruct the models used in this area, taking
as a starting point an alternative Hotelling rule that is more in line
with geological realities. More generally, their study questions the
theoretical instruments used to address energy and environmental issues
today. History, and in this case the history of economic thought, can
help to take a fresh look at tools that, although considered well
established, still deserve to be questioned.
This work was carried out as part of the project Bifurcations in
Natural Resource Economics (1920s-1930s), funded by the European
Society for the History of Economic Thought (ESHET).
*- The equation states that, in a competitive situation, the price of
such resources increases over time at the interest rate observed in the
economy.
**- Thousands of pages, contained in 58 archive boxes, stored at
Columbia University, New York. Twenty to thirty documents taken from various
files were identified and then used by the two researchers for their
analysis.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]CNRS. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. Roberto Ferreira da Cunha, Antoine Missemer. The Hotelling rule in
non‐renewable resource economics: A reassessment. Canadian Journal
of Economics/Revue canadienne d'économique, 2020; DOI:
[19]10.1111/caje.12444
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:10 2020
Date:
May 11, 2020
Source:
Springer
Summary:
Scientists have developed a mathematical model of the flow of
ultra-cold superfluids, showing how they deform when they
encounter impurities.
FULL STORY
__________________________________________________________________
Superfluids, which form only at temperatures close to absolute zero,
have unique and in some ways bizarre mechanical properties. Yvan Buggy
of the Institute of Photonics and Quantum Sciences at Heriot-Watt
University in Edinburgh, Scotland, and his co-workers have developed a
new quantum mechanical model of some of these properties, which
illustrates how these fluids will deform as they flow around
impurities. This work is published in the journal EPJ D.
Imagine that you start stirring a cup of tea, come back to it five
minutes later and find that the tea is still circulating. In itself,
this is clearly impossible, but if you could stir a cup of an
ultra-cold liquid this is exactly what would happen. Below about -270°C
-- that is, just a few degrees above the coldest possible temperature,
absolute zero -- the liquid becomes a superfluid: a weird substance
that has no viscosity and that therefore will flow without losing
kinetic energy, creep along surfaces and along vessel walls, and
continue to spin indefinitely around vortices.
Superfluids acquire these properties because so many of their atoms
fall into the lowest energy state that quantum mechanical properties
dominate over classical ones. They therefore provide a unique
opportunity for studying quantum phenomena on a macroscopic level, if
in extreme conditions. In this study, Buggy and his colleagues use the
essential equations of quantum mechanics to calculate the stresses and
flows in such an ultracold superfluid under changes in potential
energy. They show that the fluid flow will be steady and homogeneous in
the absence of impurities. If an impurity is present, however, the
fluid will become deformed in the vicinity of that impurity.
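The paper works with a nonlinear gauge-coupled quantum fluid, but the
conventional starting point for superfluid models of this kind is the
Gross-Pitaevskii (nonlinear Schrödinger) equation, shown here in its
standard form rather than the authors' generalisation:

    i\hbar\,\frac{\partial\psi}{\partial t}
      = \left( -\frac{\hbar^{2}}{2m}\nabla^{2}
               + V(\mathbf{r}) + g\,|\psi|^{2} \right)\psi

where \psi is the macroscopic wavefunction, V(\mathbf{r}) the external
potential (an impurity enters the model here), and g the interaction
strength.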
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Springer. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. Yvan Buggy, Lawrence G. Phillips, Patrik Öhberg. On the
hydrodynamics of nonlinear gauge-coupled quantum fluids. The
European Physical Journal D, 2020; 74 (5) DOI:
[19]10.1140/epjd/e2020-100524-3
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:14 2020
Date:
May 11, 2020
Source:
Arizona State University
Summary:
Researchers investigated how solar reflective coatings on select
Los Angeles city streets affected radiant heat and, in turn,
pedestrians' comfort on a typical summer day. The idea is that
coating a street in a lighter color than traditional black pavement
will lower the surrounding temperatures. But
researchers wanted to measure what effect reflective coating had
on pedestrians.
FULL STORY
__________________________________________________________________
One day last July, Ariane Middel and two other Arizona State University
researchers headed west on Interstate 10. Squeezed inside their van
were MaRTy 1 and MaRTy 2, mobile biometeorological instrument platforms
that can tell you exactly what you feel in the summer heat. All five
were destined for Los Angeles.
The researchers and their colleagues were headed to L.A. to start
investigating how solar reflective coatings on select city streets
affected radiant heat and, in turn, pedestrians' comfort on a typical
summer day.
The Los Angeles Bureau of Street Services has pioneered the use of
solar reflective coatings in a quest to cool city streets.
The idea is that coating a street in a lighter color than traditional
black pavement will lower the surrounding temperatures.
But Middel and her collaborators now wanted to see what effect
reflective coating had on pedestrians.
"If you're in a hot, dry and sunny climate like Phoenix or L.A., the
mean radiant temperature has the biggest impact on how a person
experiences the heat," explains Middel, assistant professor in the ASU
School of Arts, Media and Engineering and a senior sustainability
scientist in the Julie Ann Wrigley Global Institute of Sustainability.
"The mean radiant temperature is essentially the heat that hits the
human body. It includes the radiation from the sun, so if you are
standing in direct sunlight you will feel much hotter than in the
shade."
Thanks to remote-sensing satellites, decades of data exist on the
Earth's land surface temperature; that is, how hot a single point on
the Earth's surface would feel to the touch. But that data should not
be confused with near-surface ambient and radiant temperature, the heat
that humans and animals "experience," said Middel, lead author of the
study and director of ASU's SHaDE Lab, which stands for Sensable
Heatscapes and Digital Environments.
The researchers' study is the first to measure the thermal performance
of solar reflective coatings using instruments that sense
meteorological variables relevant to a pedestrian's experience: radiant
heat, ambient temperature, wind and humidity.
The researchers focused on two variables, surface temperature and
radiant temperature over highly reflective surfaces. They took MaRTy 1
and 2 on hourly strolls through a Los Angeles neighborhood to measure a
pedestrian's heat exposure over regular asphalt roads, reflective
coated roads and sidewalks next to the roads.
MaRTy, which stands for mean radiant temperature, looks like a weather
station in a wagon. The station measures the total radiation that hits
the body, including sunlight and the heat emitted from surfaces like
asphalt.
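Operationally, the standard estimate of mean radiant temperature
integrates six directional shortwave and longwave fluxes, weighted for
the geometry of a human body. The sketch below uses that textbook
six-directional formula with typical absorption coefficients; it is an
illustration, not MaRTy's actual processing code:

    SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def mean_radiant_temperature(shortwave, longwave, a_k=0.7, a_l=0.97):
        """Six directional fluxes (W/m^2), ordered up, down, then the
        four lateral directions; weights suit a standing person."""
        weights = [0.06, 0.06, 0.22, 0.22, 0.22, 0.22]
        s_str = sum(w * (a_k * k + a_l * l)
                    for w, k, l in zip(weights, shortwave, longwave))
        return (s_str / (a_l * SIGMA)) ** 0.25 - 273.15  # deg C

    # Midday over sunlit asphalt (illustrative numbers):
    print(round(mean_radiant_temperature(
        [900, 120, 80, 80, 80, 80],
        [380, 520, 450, 450, 450, 450]), 1))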
The study showed that the surface temperature of the coated asphalt
road was up to 6 degrees Celsius cooler than the regular road in the
afternoon. However, the radiant heat over coated asphalt was 4 degrees
Celsius higher than non-coated areas, basically negating any
heat-limiting factor.
"So, if you're a pedestrian walking over the surface, you get hit by
the shortwave radiation reflected back at you," Middel said.
The study also found that the coating didn't have a big impact on air
temperature, only half a degree in the afternoon and 0.1 degrees
Celsius at night.
The upshot, said V. Kelly Turner, assistant professor of urban planning
at UCLA and the study's co-author, is that to cool off cities, urban
climatologists and city planners need to focus on different solutions
or combinations of solutions depending on a desired goal.
"The solutions are context dependent and depend on what you want to
achieve," Turner explained.
A solution that addresses surface temperature is not necessarily suited
to the reduction of building energy use. For example, if you want
cooler surface temperatures on a playground because children are
running across its surface, a reflective coating would be best. But if
you want to reduce the thermal load on people, planting trees or
providing shade would be more effective.
But what happens if you combine trees with cool pavement? Does the cool
pavement lose its ability to reduce surface temperature? Or perhaps the
cool pavement is costly to maintain when the trees drop their leaves?
"So, reflective coating is not a panacea," Turner said. "It's one
tool."
It should also be noted that temperature is a multifaceted measurement
of heat. Surface temperature, ambient temperature and mean radiant
temperature are distinct from one another and require distinct
solutions when it comes to mitigating heat.
"We need more of these experiments," Middel said. "There have been a
lot of large-scale modeling studies on this. So, we don't know in real
life if we get the same effects. The urban environment is so complex,
and models have to always simplify. So, we don't know what really
happens on the ground unless we measure, and there haven't been these
types of measurements in the past."
The researchers report their findings of the Los Angeles study in
"Solar reflective pavements -- A policy panacea to heat mitigation?"
which was published on April 8, 2020 in the journal Environmental
Research Letters. Co-authors on the paper include Florian Schneider and
Yujia Zhang of ASU, and Matthew Stiller of Kent State University.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Arizona State University. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Ariane Middel, V. Kelly Turner, Florian Arwed Schneider, Yujia
Zhang, Matthew Stiller. Solar reflective pavements – a policy
panacea to heat mitigation? Environmental Research Letters, 2020;
DOI: [19]10.1088/1748-9326/ab87d4
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:14 2020
Date:
May 11, 2020
Source:
North Carolina State University
Summary:
A new study suggests that a polymer compound embedded with
bismuth trioxide particles holds tremendous potential for
replacing conventional radiation shielding materials, such as
lead.
FULL STORY
__________________________________________________________________
A new study from researchers at North Carolina State University
suggests that a material consisting of a polymer compound embedded with
bismuth trioxide particles holds tremendous potential for replacing
conventional radiation shielding materials, such as lead.
The bismuth trioxide compound is lightweight, effective at shielding
against ionizing radiation such as gamma rays, and can be manufactured
quickly -- making it a promising material for use in applications such
as space exploration, medical imaging and radiation therapy.
"Traditional radiation shielding materials, like lead, are often
expensive, heavy and toxic to human health and the environment," says
Ge Yang, an assistant professor of nuclear engineering at NC State and
corresponding author of a paper on the work. "This proof-of-concept
study shows that a bismuth trioxide compound could serve as effective
radiation shielding, while mitigating the drawbacks associated with
traditional shielding materials."
In the new study, researchers demonstrated that they could create the
compound using a curing method that relies on ultraviolet (UV) light --
rather than relying on time-consuming high-temperature techniques.
"Using the UV curing method, we were able to create the compound on the
order of minutes at room temperature -- which holds potential for the
rapid manufacturing of radiation shielding materials," Yang says. "This
is an important point because thermal polymerization, a frequently used
method for making polymer compounds, often relies on high temperatures
and can take hours or even days to complete. The UV curing method is
both faster and less expensive."
Using the UV curing method, the researchers created samples of the
polymer compound that include as much as 44% bismuth trioxide by
weight. The researchers then tested the samples to determine the
material's mechanical properties and whether it could effectively
shield against ionizing radiation.
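Shielding performance of this kind is conventionally summarised by
exponential (Beer-Lambert) attenuation. A minimal sketch, with assumed
attenuation coefficients rather than the paper's measured values:

    import math

    def transmitted_fraction(mu_per_cm: float, thickness_cm: float) -> float:
        """Fraction of gamma photons passing a shield: exp(-mu * x).
        mu depends on the material and on the photon energy."""
        return math.exp(-mu_per_cm * thickness_cm)

    # Illustrative comparison only -- mu values are assumptions:
    for name, mu in [("lead", 1.2), ("Bi2O3/PMMA composite", 0.4)]:
        print(name, round(transmitted_fraction(mu, 2.0), 3))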
"This is foundational work," Yang says. "We have determined that the
compound is effective at shielding gamma rays, is lightweight and is
strong. We are working to further optimize this technique to get the
best performance from the material.
"We are excited about finding a novel radiation shielding material that
works this well, is this light, and can be manufactured this quickly."
__________________________________________________________________
Story Source:
[17]Materials provided by [18]North Carolina State University. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Da Cao, Ge Yang, Mohamed Bourham, Dan Moneghan. Gamma radiation
shielding properties of poly (methyl methacrylate) / Bi2O3
composites. Nuclear Engineering and Technology, 2020; DOI:
[19]10.1016/j.net.2020.04.026
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:14 2020
Date:
May 11, 2020
Source:
University of Alaska Fairbanks
Summary:
A research team has developed a way to use satellite images to
determine the amount of methane being released from northern
lakes, a technique that could help climate change modelers
better account for this potent greenhouse gas. By using
synthetic aperture radar, or SAR, researchers were able to find
a correlation between 'brighter' satellite images of frozen
lakes and the amount of methane they produce.
FULL STORY
__________________________________________________________________
A University of Alaska Fairbanks-led research team has developed a way
to use satellite images to determine the amount of methane being
released from northern lakes, a technique that could help climate
change modelers better account for this potent greenhouse gas.
By using synthetic aperture radar, or SAR, researchers were able to
find a correlation between "brighter" satellite images of frozen lakes
and the amount of methane they produce. Comparing those SAR images with
ground-level methane measurements confirmed that the satellite readings
were consistent with on-site data.
SAR data, which were provided by UAF's Alaska Satellite Facility, are
well-suited to the Arctic. The technology can penetrate dry snow, and
doesn't require daylight or cloud-free conditions. SAR is also good at
imaging frozen lakes, particularly ones filled with bubbles that often
form in ice when methane is present.
"We found that backscatter is brighter when there are more bubbles
trapped in the lake ice," said Melanie Engram, the lead author of the
study and a researcher at UAF's Water and Environmental Research
Center. "Bubbles form an insulated blanket, so ice beneath them grows
more slowly, causing a warped surface which reflects the radar signal
back to the satellite."
The new technique could have significant implications for climate
change predictions. Methane is about 30 times more powerful than carbon
dioxide as a heat-trapping gas, so accurate estimates about its
prevalence are particularly important in scientific models.
Previous research had confirmed that vast amounts of methane are being
released from thermokarst lakes as the permafrost beneath them thaws.
But collecting on-site data from those lakes is often expensive and
logistically challenging. Because of that, information about methane
production is available from only a tiny percentage of Arctic lakes.
"This new technique is a major breakthrough for understanding the
Arctic methane budget," said UAF researcher Katey Walter Anthony, who
also contributed to the study. "It helps to resolve a longstanding
discrepancy between estimates of Arctic methane emissions from
atmospheric measurements and data upscaled from a small number of
individual lakes."
To confirm the SAR data, researchers compared satellite images with
field measurements from 48 lakes in five geographic areas in Alaska. By
extrapolating those results, researchers can now estimate the methane
production of more than 5,000 Alaska lakes.
"It's important to know how much methane comes out of these lakes and
whether the level is increasing," Engram said. "We can't get out to
every single lake and do field work, but we can extrapolate field
measurements using SAR remote sensing to get these regional estimates."
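The upscaling logic can be sketched in miniature: fit the
backscatter-to-flux relation on the lakes with field data, then apply
it to lakes seen only in imagery. The numbers below are invented; this
is the general idea, not the authors' pipeline:

    # Calibration lakes: SAR backscatter vs. measured methane flux.
    backscatter = [0.10, 0.15, 0.22, 0.30, 0.41]
    methane_flux = [4.0, 6.5, 9.8, 13.9, 19.0]  # g CH4 m^-2 yr^-1

    # Ordinary least-squares fit of flux on backscatter.
    n = len(backscatter)
    mean_x = sum(backscatter) / n
    mean_y = sum(methane_flux) / n
    slope = (sum((x - mean_x) * (y - mean_y)
                 for x, y in zip(backscatter, methane_flux))
             / sum((x - mean_x) ** 2 for x in backscatter))
    intercept = mean_y - slope * mean_x

    # Predict flux for an unvisited lake seen only in SAR imagery:
    print(round(intercept + slope * 0.25, 1))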
__________________________________________________________________
Story Source:
[17]Materials provided by [18]University of Alaska Fairbanks. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. M. Engram, K. M. Walter Anthony, T. Sachs, K. Kohnert, A.
Serafimovich, G. Grosse, F. J. Meyer. Remote sensing northern lake
methane ebullition. Nature Climate Change, 2020; DOI:
[19]10.1038/s41558-020-0762-8
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:14 2020
Date:
May 11, 2020
Source:
International Institute for Applied Systems Analysis
Summary:
A new study investigated the impacts of different levels of
global warming on hydropower potential and found that this type
of electricity generation benefits more from a 1.5°C than a 2°C
climate scenario.
FULL STORY
__________________________________________________________________
A new study by researchers from IIASA and China investigated the
impacts of different levels of global warming on hydropower potential
and found that this type of electricity generation benefits more from a
1.5°C than a 2°C climate scenario.
In a sustainable and less carbon-intensive future, hydropower will play
an increasingly crucial role as an important source of renewable and
clean energy in the world's overall energy supply. In fact, hydropower
generation has doubled over the last three decades and is projected to
double again from the present level by 2050. Global warming is however
threatening the world's water supplies, posing a significant threat to
hydropower generation, which is a problem in light of the continuous
increase in energy demand due to global population growth and
socioeconomic development.
The study, undertaken by researchers from IIASA in collaboration with
colleagues at several Chinese institutions and published in the journal
Water Resources Research, employed a coupled hydrological and
techno-economic model framework to identify optimal locations for
hydropower plants under global warming levels of 1.5°C and 2°C, while
also considering gross hydropower potential, power consumption, and
economic factors. According to the authors, while determining the
effects of different levels of global warming has become a hot topic in
water resources research, there are still relatively few studies on the
impacts of different global warming levels on hydropower potential.
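For context, the "gross hydropower potential" that feeds such a model
is conventionally estimated from river discharge and usable head. The
sketch below is that textbook relation, P = eta * rho * g * Q * H, not
the coupled framework itself:

    RHO_WATER = 1000.0  # kg/m^3
    G = 9.81            # m/s^2

    def hydropower_mw(discharge_m3s: float, head_m: float,
                      efficiency: float = 0.9) -> float:
        """Electrical output in megawatts for a given discharge and head."""
        return efficiency * RHO_WATER * G * discharge_m3s * head_m / 1e6

    # A 200 m^3/s river with 50 m of usable head:
    print(round(hydropower_mw(200, 50), 1))  # ~88.3 MW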
The researchers specifically looked at the potential for hydropower
production under the two different levels of warming in Sumatra, one of
the Sunda Islands of western Indonesia. Sumatra was chosen as it is
vulnerable to global warming because of sea level rise, and the
island's environmental conditions make it an ideal location for
developing and utilizing hydropower resources. They also modeled and
visualized optimal locations of hydropower plants using the IIASA
BeWhere model, and discussed hydropower production based on selected
hydropower plants and the reduction in carbon emissions that would
result from using hydropower instead of fossil fuels.
The results show that global warming levels of both 1.5°C and 2°C will
have a positive impact on the hydropower production of Sumatra relative
to the historical period. The ratio of hydropower production to power
demand provided by 1.5°C of global warming is however greater than that
provided by 2°C of global warming under a scenario that assumes
stabilization without overshooting the target after 2100. This is due
to a decrease in precipitation and the fact that the southeast of
Indonesia shows the largest decrease in discharge under this scenario.
In addition, the reduction in CO2 emissions under global warming of
1.5°C is greater than that achieved under global warming of 2°C, which
suggests that higher levels of warming erode the very
emissions-reduction benefits needed to limit further warming. The
findings also illustrate the tension between
greenhouse gas-related goals and ecosystem conservation-related goals
by considering the trade-off between the protected areas and hydropower
plant expansion.
"Our study could significantly contribute to establishing a basis for
decision making on energy security under 1.5°C and 2°C global warming
scenarios. Our findings can also potentially be an important basis for
a large range of follow-up studies to, for instance, investigate the
trade-off between forest conservancy and hydropower development, to
contribute to the achievement of countries' Nationally Determined
Contributions under the Paris Agreement," concludes study lead author
Ying Meng, who started work on this project as a participant of the
2018 IIASA Young Scientists Summer Program (YSSP). She is currently
affiliated with the School of Environment at the Harbin Institute of
Technology in China.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]International Institute for Applied
Systems Analysis. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Ying Meng, Junguo Liu, Sylvain Leduc, Sennai Mesfun, Florian
Kraxner, Ganquan Mao, Wei Qi, Zifeng Wang. Hydropower Production
Benefits More From 1.5 °C than 2 °C Climate Scenario. Water
Resources Research, 2020; 56 (5) DOI: [19]10.1029/2019WR025519
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 11 21:30:14 2020
Linking multiple copies of these devices may lay the foundation for quantum computing
Date:
May 11, 2020
Source:
National Institute of Standards and Technology (NIST)
Summary:
Researchers have developed a step-by-step recipe to produce
single-atom transistors.
FULL STORY
__________________________________________________________________
Once unimaginable, transistors consisting only of several-atom clusters
or even single atoms promise to become the building blocks of a new
generation of computers with unparalleled memory and processing power.
But to realize the full potential of these tiny transistors --
miniature electrical on-off switches -- researchers must find a way to
make many copies of these notoriously difficult-to-fabricate
components.
Now, researchers at the National Institute of Standards and Technology
(NIST) and their colleagues at the University of Maryland have
developed a step-by-step recipe to produce the atomic-scale devices.
Using these instructions, the NIST-led team has become only the second
in the world to construct a single-atom transistor and the first to
fabricate a series of single electron transistors with atom-scale
control over the devices' geometry.
The scientists demonstrated that they could precisely adjust the rate
at which individual electrons flow through a physical gap or electrical
barrier in their transistor -- even though classical physics would
forbid the electrons from doing so because they lack enough energy.
That strictly quantum phenomenon, known as quantum tunneling, only
becomes important when gaps are extremely tiny, such as in the
miniature transistors. Precise control over quantum tunneling is key
because it enables the transistors to become "entangled" or interlinked
in a way only possible through quantum mechanics and opens new
possibilities for creating quantum bits (qubits) that could be used in
quantum computing.
To fabricate single-atom and few-atom transistors, the team relied on a
known technique in which a silicon chip is covered with a layer of
hydrogen atoms, which readily bind to silicon. The fine tip of a
scanning tunneling microscope then removed hydrogen atoms at selected
sites. The remaining hydrogen acted as a barrier so that when the team
directed phosphine gas (PH3) at the silicon surface, individual PH3
molecules attached only to the locations where the hydrogen had been
removed. The researchers then heated the silicon
surface. The heat ejected hydrogen atoms from the PH3 and caused the
phosphorus atom that was left behind to embed itself in the surface.
With additional processing, bound phosphorus atoms created the
foundation of a series of highly stable single- or few-atom devices
that have the potential to serve as qubits.
Two of the steps in the method devised by the NIST team -- sealing the
phosphorus atoms with protective layers of silicon and then making
electrical contact with the embedded atoms -- appear to have been
essential to reliably fabricate many copies of atomically precise
devices, NIST researcher Richard Silver said.
In the past, researchers have typically applied heat as all the silicon
layers are grown, in order to remove defects and ensure that the
silicon has the pure crystalline structure required to integrate the
single-atom devices with conventional silicon-chip electrical
components. But the NIST scientists found that such heating could
dislodge the bound phosphorus atoms and potentially disrupt the
structure of the atomic-scale devices. Instead, the team deposited the
first several silicon layers at room temperature, allowing the
phosphorus atoms to stay put. Only when subsequent layers were
deposited did the team apply heat.
"We believe our method of applying the layers provides more stable and
precise atomic-scale devices," said Silver. Having even a single atom
out of place can alter the conductivity and other properties of
electrical components that feature single or small clusters of atoms.
The team also developed a novel technique for the crucial step of
making electrical contact with the buried atoms so that they can
operate as part of a circuit. The NIST scientists gently heated a layer
of palladium metal applied to specific regions on the silicon surface
that resided directly above selected components of the silicon-embedded
device. The heated palladium reacted with the silicon to form an
electrically conducting alloy called palladium silicide, which
naturally penetrated through the silicon and made contact with the
phosphorus atoms.
In a recent edition of Advanced Functional Materials, Silver and his
colleagues, who include Xiqiao Wang, Jonathan Wyrick, Michael Stewart
Jr. and Curt Richter, emphasized that their contact method has a nearly
100% success rate. That's a key achievement, noted Wyrick. "You can
have the best single-atom-transistor device in the world, but if you
can't make contact with it, it's useless," he said.
Fabricating single-atom transistors "is a difficult and complicated
process that maybe everyone has to cut their teeth on, but we've laid
out the steps so that other teams don't have to proceed by trial and
error," said Richter.
In related work published today in Communications Physics, Silver and
his colleagues demonstrated that they could precisely control the rate
at which individual electrons tunnel through atomically precise tunnel
barriers in single-electron transistors. The NIST researchers and their
colleagues fabricated a series of single-electron transistors identical
in every way except for differences in the size of the tunneling gap.
Measurements of current flow indicated that by increasing or decreasing
the gap between transistor components by less than a nanometer
(billionth of a meter), the team could precisely control the flow of a
single electron through the transistor in a predictable manner.
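That exponential sensitivity can be illustrated with the textbook
rectangular-barrier approximation, in which the probability of an
electron tunneling through a barrier of width d falls off roughly as
exp(-2*kappa*d). The Python sketch below is a generic model with an
arbitrary 1 eV barrier height, not the NIST team's analysis:

    import math

    HBAR = 1.054571817e-34   # reduced Planck constant, J*s
    M_E = 9.1093837015e-31   # electron mass, kg
    EV = 1.602176634e-19     # joules per electron volt

    def tunneling_probability(width_nm, barrier_ev):
        """WKB-style estimate T ~ exp(-2*kappa*d) for an electron
        below a rectangular barrier."""
        kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # 1/m
        return math.exp(-2 * kappa * width_nm * 1e-9)

    # Illustrative only: gap widths differing by half a nanometer.
    for d in (1.0, 1.5, 2.0):
        print(f"{d:.1f} nm gap -> T = {tunneling_probability(d, 1.0):.2e}")

In this toy model, widening the gap by just half a nanometer cuts the
tunneling probability by more than a factor of a hundred, which is
why sub-nanometer control over geometry translates into predictable
control over single-electron currents.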
"Because quantum tunneling is so fundamental to any quantum device,
including the construction of qubits, the ability to control the flow
of one electron at a time is a significant achievement," Wyrick said.
In addition, as engineers pack more and more circuitry on a tiny
computer chip and the gap between components continues to shrink,
understanding and controlling the effects of quantum tunneling will
become even more critical, Richter said.
__________________________________________________________________
Story Source:
Materials provided by National Institute of Standards and
Technology (NIST). Note: Content may be edited for style and length.
__________________________________________________________________
Journal References:
1. Xiqiao Wang, Jonathan Wyrick, Ranjit V. Kashid, Pradeep Namboodiri,
Scott W. Schmucker, Andrew Murphy, M. D. Stewart, Richard M.
Silver. Atomic-scale control of tunneling in donor-based devices.
Communications Physics, 2020; 3 (1) DOI:
10.1038/s42005-020-0343-1
2. Jonathan Wyrick, Xiqiao Wang, Ranjit V. Kashid, Pradeep Namboodiri,
Scott W. Schmucker, Joseph A. Hagmann, Keyi Liu, Michael D.
Stewart, Curt A. Richter, Garnett W. Bryant, Richard M. Silver.
Atom-by-Atom Fabrication of Single and Few Dopant Quantum Devices.
Advanced Functional Materials, 2019; 29 (52): 1903475 DOI:
10.1002/adfm.201903475
__________________________________________________________________
--- up 15 weeks, 6 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 12 21:30:00 2020
Date:
May 12, 2020
Source:
University of California - Santa Cruz
Summary:
Researchers have developed a powerful new computer program
called Morpheus that can analyze astronomical image data pixel
by pixel to identify and classify all of the galaxies and stars
in large data sets from astronomy surveys. Morpheus is a
deep-learning framework that incorporates a variety of
artificial intelligence technologies developed for applications
such as image and speech recognition.
FULL STORY
__________________________________________________________________
Researchers at UC Santa Cruz have developed a powerful new computer
program called Morpheus that can analyze astronomical image data pixel
by pixel to identify and classify all of the galaxies and stars in
large data sets from astronomy surveys.
Morpheus is a deep-learning framework that incorporates a variety of
artificial intelligence technologies developed for applications such as
image and speech recognition. Brant Robertson, a professor of astronomy
and astrophysics who leads the Computational Astrophysics Research
Group at UC Santa Cruz, said the rapidly increasing size of astronomy
data sets has made it essential to automate some of the tasks
traditionally done by astronomers.
"There are some things we simply cannot do as humans, so we have to
find ways to use computers to deal with the huge amount of data that
will be coming in over the next few years from large astronomical
survey projects," he said.
Robertson worked with Ryan Hausen, a computer science graduate student
in UCSC's Baskin School of Engineering, who developed and tested
Morpheus over the past two years. With the publication of their results
May 12 in the Astrophysical Journal Supplement Series, Hausen and
Robertson are also releasing the Morpheus code publicly and providing
online demonstrations.
The morphologies of galaxies, from rotating disk galaxies like our own
Milky Way to amorphous elliptical and spheroidal galaxies, can tell
astronomers about how galaxies form and evolve over time. Large-scale
surveys, such as the Legacy Survey of Space and Time (LSST) to be
conducted at the Vera Rubin Observatory now under construction in
Chile, will generate huge amounts of image data, and Robertson has been
involved in planning how to use that data to understand the formation
and evolution of galaxies. LSST will take more than 800 panoramic
images each night with a 3.2-billion-pixel camera, recording the entire
visible sky twice each week.
"Imagine if you went to astronomers and asked them to classify billions
of objects -- how could they possibly do that? Now we'll be able to
automatically classify those objects and use that information to learn
about galaxy evolution," Robertson said.
Other astronomers have used deep-learning technology to classify
galaxies, but previous efforts have typically involved adapting
existing image recognition algorithms, and researchers have fed the
algorithms curated images of galaxies to be classified. Hausen built
Morpheus from the ground up specifically for astronomical image data,
and the model uses as input the original image data in the standard
digital file format used by astronomers.
Pixel-level classification is another important advantage of Morpheus,
Robertson said. "With other models, you have to know something is there
and feed the model an image, and it classifies the entire galaxy at
once," he said. "Morpheus discovers the galaxies for you, and does it
pixel by pixel, so it can handle very complicated images, where you
might have a spheroidal right next to a disk. For a disk with a central
bulge, it classifies the bulge separately. So it's very powerful."
To train the deep-learning algorithm, the researchers used information
from a 2015 study in which dozens of astronomers classified about
10,000 galaxies in Hubble Space Telescope images from the CANDELS
survey. They then applied Morpheus to image data from the Hubble Legacy
Fields, which combines observations taken by several Hubble deep-field
surveys.
When Morpheus processes an image of an area of the sky, it generates a
new set of images of that part of the sky in which all objects are
color-coded based on their morphology, separating astronomical objects
from the background and identifying point sources (stars) and different
types of galaxies. The output includes a confidence level for each
classification. Running on UCSC's lux supercomputer, the program
rapidly generates a pixel-by-pixel analysis for the entire data set.
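In code, the bookkeeping behind that output can be pictured as a
per-pixel probability map: the network emits a score for every class
at every pixel, and the final product is the best class plus its
confidence. The minimal sketch below uses random stand-in scores in
place of a real network and is not the Morpheus code itself:

    import numpy as np

    CLASSES = ["background", "point source", "disk", "spheroidal"]

    def classify_pixels(scores):
        """Turn per-pixel class scores (H x W x C) into a class map
        and a confidence map, as a segmentation model's head would."""
        e = np.exp(scores - scores.max(axis=-1, keepdims=True))
        probs = e / e.sum(axis=-1, keepdims=True)  # softmax over C
        return probs.argmax(axis=-1), probs.max(axis=-1)

    # Stand-in scores for a 4x4 cutout; the real model derives them
    # from astronomical image data in FITS format.
    rng = np.random.default_rng(0)
    class_map, confidence = classify_pixels(rng.normal(size=(4, 4, 4)))
    print(class_map)
    print(confidence.round(2))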
"Morpheus provides detection and morphological classification of
astronomical objects at a level of granularity that doesn't currently
exist," Hausen said.
An interactive visualization of the Morpheus model results for GOODS
South, a deep-field survey that imaged millions of galaxies, has been
publicly released. This work was supported by NASA and the National
Science Foundation.
__________________________________________________________________
Story Source:
Materials provided by University of California - Santa Cruz.
Original written by Tim Stephens. Note: Content may be edited for style
and length.
__________________________________________________________________
Journal Reference:
1. Ryan Hausen, Brant E. Robertson. Morpheus: A Deep Learning
Framework for the Pixel-level Analysis of Astronomical Image Data.
The Astrophysical Journal Supplement Series, 2020; 248 (1): 20 DOI:
10.3847/1538-4365/ab8868
__________________________________________________________________
--- up 16 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 12 21:30:02 2020
Date:
May 12, 2020
Source:
University of Cambridge
Summary:
Machine learning and AI are highly unstable in medical image
reconstruction, and may lead to false positives and false
negatives, a new study suggests.
FULL STORY
__________________________________________________________________
Machine learning and AI are highly unstable in medical image
reconstruction, and may lead to false positives and false negatives, a
new study suggests.
A team of researchers, led by the University of Cambridge and Simon
Fraser University, designed a series of tests for medical image
reconstruction algorithms based on AI and deep learning, and found that
these techniques result in myriad artefacts, or unwanted alterations in
the data, among other major errors in the final images. The effects
were typically not present in non-AI based imaging techniques.
The phenomenon was widespread across different types of artificial
neural networks, suggesting that the problem will not be easily
remedied. The researchers caution that relying on AI-based image
reconstruction techniques to make diagnoses and determine treatment
could ultimately do harm to patients. Their results are reported in the
Proceedings of the National Academy of Sciences.
"There's been a lot of enthusiasm about AI in medical imaging, and it
may well have the potential to revolutionise modern medicine: however,
there are potential pitfalls that must not be ignored," said Dr Anders
Hansen from Cambridge's Department of Applied Mathematics and
Theoretical Physics, who led the research with Dr Ben Adcock from Simon
Fraser University. "We've found that AI techniques are highly unstable
in medical imaging, so that small changes in the input may result in
big changes in the output."
A typical MRI scan can take anywhere between 15 minutes and two hours,
depending on the size of the area being scanned and the number of
images being taken. The longer the patient spends inside the machine,
the higher the resolution of the final image. However, limiting the
amount of time patients spend inside the machine is desired, both to
reduce the risk to individual patients and to increase the overall
number of scans that can be performed.
Using AI techniques to improve the quality of images from MRI scans or
other types of medical imaging is an attractive possibility for solving
the problem of getting the highest quality image in the smallest amount
of time: in theory, AI could take a low-resolution image and make it
into a high-resolution version. AI algorithms 'learn' to reconstruct
images based on training from previous data, and through this training
procedure aim to optimise the quality of the reconstruction. This
represents a radical change compared to classical reconstruction
techniques that are solely based on mathematical theory without
dependency on previous data. In particular, classical techniques do not
learn.
Any AI algorithm needs two things to be reliable: accuracy and
stability. An AI will usually classify an image of a cat as a cat, but
tiny, almost invisible changes in the image might cause the algorithm
to instead classify the cat as a truck or a table, for instance. In
this example of image classification, the one thing that can go wrong
is that the image is incorrectly classified. However, when it comes to
image reconstruction, such as that used in medical imaging, there are
several things that can go wrong. For example, details like a tumour
may get lost or may falsely be added. Details can be obscured and
unwanted artefacts may occur in the image.
"When it comes to critical decisions around human health, we can't
afford to have algorithms making mistakes," said Hansen. "We found that
the tiniest corruption, such as may be caused by a patient moving, can
give a very different result if you're using AI and deep learning to
reconstruct medical images -- meaning that these algorithms lack the
stability they need."
Hansen and his colleagues from Norway, Portugal, Canada and the UK
designed a series of tests to find the flaws in AI-based medical
imaging systems, including MRI, CT and NMR. They considered three
crucial issues: instabilities associated with tiny perturbations, or
movements; instabilities with respect to small structural changes, such
as a brain image with or without a small tumour; and instabilities with
respect to changes in the number of samples.
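The first of those tests can be expressed as a simple ratio:
reconstruct an image from a measurement with and without a tiny
perturbation, then compare how much the output changes relative to
the input change. The sketch below is a generic stability probe
under that definition, not the authors' code; reconstruct() stands
in for any reconstruction algorithm:

    import numpy as np

    def instability_ratio(reconstruct, measurement, perturbation):
        """How much the output moves per unit of input change.
        Large ratios flag an unstable reconstruction method."""
        base = reconstruct(measurement)
        perturbed = reconstruct(measurement + perturbation)
        return (np.linalg.norm(perturbed - base)
                / np.linalg.norm(perturbation))

    # Toy check: a perfectly stable method (the identity) scores 1.
    measurement = np.ones(64)
    tiny = 1e-3 * np.random.default_rng(1).normal(size=64)
    print(instability_ratio(lambda y: y, measurement, tiny))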
They found that certain tiny movements led to myriad artefacts in the
final images, details were blurred or completely removed, and that the
quality of image reconstruction would deteriorate with repeated
subsampling. These errors were widespread across the different types of
neural networks.
According to the researchers, the most worrying errors are the ones
that radiologists might interpret as medical issues, as opposed to
those that can easily be dismissed due to a technical error.
"We developed the test to verify our thesis that deep learning
techniques would be universally unstable in medical imaging," said
Hansen. "The reasoning for our prediction was that there is a limit to
how good a reconstruction can be given restricted scan time. In some
sense, modern AI techniques break this barrier, and as a result become
unstable. We've shown mathematically that there is a price to pay for
these instabilities, or to put it simply: there is still no such thing
as a free lunch."
The researchers are now focusing on providing the fundamental limits to
what can be done with AI techniques. Only when these limits are known
will we be able to understand which problems can be solved. "Trial and
error-based research would never discover that the alchemists could not
make gold: we are in a similar situation with modern AI," said Hansen.
"These techniques will never discover their own limitations. Such
limitations can only be shown mathematically."
__________________________________________________________________
Story Source:
Materials provided by University of Cambridge. The original
story is licensed under a Creative Commons License. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Vegard Antun, Francesco Renna, Clarice Poon, Ben Adcock, Anders C.
Hansen. On instabilities of deep learning in image reconstruction
and the potential costs of AI. Proceedings of the National Academy
of Sciences, 2020; 201907377 DOI: 10.1073/pnas.1907377117
__________________________________________________________________
--- up 16 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 12 21:30:04 2020
Date:
May 12, 2020
Source:
eLife
Summary:
A new tool using cutting-edge technology is able to distinguish
different types of blood clots based on what caused them,
according to a new study.
FULL STORY
__________________________________________________________________
A new tool using cutting-edge technology is able to distinguish
different types of blood clots based on what caused them, according to
a study published in eLife.
The tool could help physicians diagnose what caused a blood clot and
select a treatment that targets that cause in order to break it up. For
example, it could help them determine if aspirin or another kind of
anti-clotting drug would be the best choice for a person who has just
had a heart attack or stroke.
Blood clots occur when small sticky blood cells called platelets
cluster together. This can help stop bleeding after a cut, but it can
also be harmful in causing a stroke or a heart attack by blocking a
blood vessel. "Different types of blood clots are caused by different
molecules, but they all look very similar," explains lead author Yuqi
Zhou, a PhD student at the Department of Chemistry, University of
Tokyo, Japan. "What's more, they are nearly impossible to tell apart
using existing tools such as microscopes."
To develop a more effective approach to identifying different types of
blood clots, Zhou and her colleagues took blood samples from a healthy
individual and then exposed them to different clotting agents. The team
captured thousands of images of the different types of clots using a
technique called high-throughput imaging flow cytometry.
They next used a type of machine-learning technology called a
convolutional neural network to train a computer to identify subtle
differences in the shape of different types of clots caused by
different molecules. They tested this tool on 25,000 clot images that
the computer had never seen before and found it was able to
distinguish most of the clot types in the images.
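As a rough illustration of the kind of model involved, a small
convolutional classifier for single-channel clot images could be
wired up as below. This is a minimal sketch under assumed parameters
(64x64 images, six agonist classes), not the published iPAC network:

    import torch
    from torch import nn

    N_AGONISTS = 6  # placeholder: one class per clotting agent

    # Small CNN for 1-channel 64x64 aggregate images (illustrative).
    model = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),                      # 64 -> 32
        nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),                      # 32 -> 16
        nn.Flatten(),
        nn.Linear(32 * 16 * 16, N_AGONISTS),  # class scores
    )

    # One training step on a dummy batch; real training would loop
    # over labeled images from the imaging flow cytometer.
    images = torch.randn(8, 1, 64, 64)
    labels = torch.randint(0, N_AGONISTS, (8,))
    loss = nn.functional.cross_entropy(model(images), labels)
    loss.backward()
    print(float(loss))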
Finally, they tested whether this new tool, which they named the
intelligent platelet aggregate classifier (iPAC), can diagnose
different clot types in human blood samples. They took blood samples
from four healthy people, exposed them to different clotting agents,
and showed that iPAC could tell the different types of clots apart.
"We showed that iPAC is a powerful tool for studying the underlying
mechanism of clot formation," Zhou says. She adds that, given recent
reports that COVID-19 causes blood clots, the technology could one day
be used to better understand the mechanism behind these clots too,
although much about the virus currently remains unknown.
"Using this new tool may uncover the characteristics of different types
of clots that were previously unrecognised by humans, and enable the
diagnosis of clots caused by combinations of clotting agents," says
senior author Keisuke Goda, Professor at the Department of Chemistry,
University of Tokyo. "Information about the causes of clots can help
researchers and medical doctors evaluate the effectiveness of
anti-clotting drugs and choose the right treatment, or combination of
treatments, for a particular patient."
__________________________________________________________________
Story Source:
Materials provided by eLife. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. Yuqi Zhou, Atsushi Yasumoto, Cheng Lei, Chun-Jung Huang, Hirofumi
Kobayashi, Yunzhao Wu, Sheng Yan, Chia-Wei Sun, Yutaka Yatomi,
Keisuke Goda. Intelligent classification of platelet aggregates by
agonist type. eLife, 2020; 9 DOI: 10.7554/eLife.52938
__________________________________________________________________
--- up 16 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 12 21:30:04 2020
Date:
May 12, 2020
Source:
University of Alberta
Summary:
Trained dogs can detect fire accelerants such as gasoline in
quantities as small as one billionth of a teaspoon, according to
new research by chemists. The study provides the lowest estimate
of the limit of sensitivity of dogs' noses and has implications
for arson investigations.
FULL STORY
__________________________________________________________________
Trained dogs can detect fire accelerants such as gasoline in quantities
as small as one billionth of a teaspoon, according to new research by
University of Alberta chemists. The study provides the lowest estimate
of the limit of sensitivity of dogs' noses and has implications for
arson investigations.
"During an arson investigation, a dog may be used to identify debris
that contains traces of ignitable liquids -- which could support a
hypothesis that a fire was the result of arson," explained Robin Abel,
graduate student in the Department of Chemistry and lead author of the
study. "Of course, a dog cannot give testimony in court, so debris from
where the dog indicated must be taken back to the laboratory and
analyzed. This estimate provides a target for forensic labs when
processing evidence flagged by detection dogs at sites of potential
arson."
The study involved two dog-and-handler teams. The first was trained to
detect a variety of ignitable liquids, while the other was trained
primarily with gasoline. Results show that the dog trained on a variety
of liquids performed well detecting all accelerants, while the dog
trained on gasoline was not able to generalize to other accelerants at
extremely low concentrations.
Another outcome of the study was the development of a protocol that can
be used to generate suitable ultra-clean substrates necessary for
assessing the performance of accelerant-detection dogs for trace-level
detection.
"In this field, it is well-known that dogs are more sensitive than
conventional laboratory tests," said James Harynuk, associate professor
of chemistry and Abel's supervisor. "There have been many cases where a
dog will flag debris that then tests negative in the lab. In order for
us to improve laboratory techniques so that they can match the
performance of the dogs, we must first assess the dogs. This work gives
us a very challenging target to meet for our laboratory methods."
So, just how small a volume of gasoline can a dog detect?
"The dogs in this study were able to detect down to one billionth of a
teaspoon -- or 5 pL -- of gasoline," added Harynuk. "Their noses are
incredibly sensitive."
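The "one billionth of a teaspoon" figure is straightforward unit
conversion -- a US teaspoon is about 4.93 milliliters -- and is easy
to verify:

    TEASPOON_L = 4.92892e-3      # one US teaspoon in liters
    detected = TEASPOON_L / 1e9  # one billionth of a teaspoon
    print(f"{detected:.2e} L = {detected * 1e12:.1f} pL")
    # -> 4.93e-12 L = 4.9 pL, i.e. roughly 5 picoliters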
This research was conducted in collaboration with Jeff Lunder, vice
president of the Canine Accelerant Detection Association (CADA) Fire
Dogs. Funding was provided by the Natural Sciences and Engineering
Research Council of Canada (NSERC).
__________________________________________________________________
Story Source:
Materials provided by University of Alberta. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Robin J. Abel, Jeffrey L. Lunder, James J. Harynuk. A novel
protocol for producing low-abundance targets to characterize the
sensitivity limits of ignitable liquid detection canines. Forensic
Chemistry, 2020; 18: 100230 DOI: 10.1016/j.forc.2020.100230
__________________________________________________________________
--- up 16 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 12 21:30:04 2020
Researchers use plasmonics to enhance fluorescent markers in lab-on-a-chip diagnostic devices
Date:
May 12, 2020
Source:
Duke University
Summary:
Engineers have shown that nanosized silver cubes can make
diagnostic tests that rely on fluorescence easier to read by
making them more than 150 times brighter. Combined with an
emerging point-of-care diagnostic platform already shown to be
able to detect small traces of viruses and other biomarkers, the
approach could allow such tests to become much cheaper and more
widespread.
FULL STORY
__________________________________________________________________
Engineers at Duke University have shown that nanosized silver cubes can
make diagnostic tests that rely on fluorescence easier to read by
making them more than 150 times brighter. Combined with an emerging
point-of-care diagnostic platform already shown capable of detecting
small traces of viruses and other biomarkers, the approach could allow
such tests to become much cheaper and more widespread.
The results appeared online on May 6 in the journal Nano Letters.
Plasmonics is the study of how light can be trapped as a collective
oscillation of electrons, called a plasmon, at the surface of metal
nanostructures such as silver nanocubes. When fluorescent
molecules are sandwiched between one of these nanocubes and a metal
surface, the interaction between their electromagnetic fields causes
the molecules to emit light much more vigorously. Maiken Mikkelsen, the
James N. and Elizabeth H. Barton Associate Professor of Electrical and
Computer Engineering at Duke, has been working with her laboratory at
Duke to create new types of hyperspectral cameras and superfast optical
signals using plasmonics for nearly a decade.
At the same time, researchers in the laboratory of Ashutosh Chilkoti,
the Alan L. Kaganov Distinguished Professor of Biomedical Engineering,
have been working on a self-contained, point-of-care diagnostic test
that can pick out trace amounts of specific biomarkers from biomedical
fluids such as blood. But because the tests rely on fluorescent markers
to indicate the presence of the biomarkers, seeing the faint light of a
barely positive test requires expensive and bulky equipment.
"Our research has already shown that plasmonics can enhance the
brightness of fluorescent molecules tens of thousands of times over,"
said Mikkelsen. "Using it to enhance diagnostic assays that are limited
by their fluorescence was clearly a very exciting idea."
"There are not a lot of examples of people using plasmon-enhanced
fluorescence for point-of-care diagnostics, and the few that exist have
not been yet implemented into clinical practice," added Daria Semeniak,
a graduate student in Chilkoti's laboratory. "It's taken us a couple of
years, but we think we've developed a system that can work."
In the new paper, researchers from the Chilkoti lab build their
super-sensitive diagnostic platform called the D4 Assay onto a thin
film of gold, the preferred yin to the plasmonic silver nanocube's
yang. The platform starts with a thin layer of polymer brush coating,
which stops anything from sticking to the gold surface that the
researchers don't want to stick there. The researchers then use an
ink-jet printer to attach two groups of molecules tailored to latch on
to the biomarker that the test is trying to detect. One set is attached
permanently to the gold surface and catches one part of the biomarker.
The other is washed off the surface once the test begins, attaches
itself to another piece of the biomarker, and flashes light to indicate
it's found its target.
After several minutes pass to allow the reactions to occur, the rest of
the sample is washed away, leaving behind only the molecules that have
managed to find their biomarker matches, floating like fluorescent
beacons tethered to a golden floor.
"The real significance of the assay is the polymer brush coating," said
Chilkoti. "The polymer brush allows us to store all of the tools we
need on the chip while maintaining a simple design."
While the D4 Assay is very good at grabbing small traces of specific
biomarkers, if there are only trace amounts, the fluorescent beacons
can be difficult to see. The challenge for Mikkelsen and her colleagues
was to place their plasmonic silver nanocubes above the beacons in such
a way that they supercharged the beacons' fluorescence.
But as is usually the case, this was easier said than done.
"The distance between the silver nanocubes and the gold film dictates
how much brighter the fluorescent molecule becomes," said Daniela Cruz,
a graduate student working in Mikkelsen's laboratory. "Our challenge
was to make the polymer brush coating thick enough to capture the
biomarkers -- and only the biomarkers of interest -- but thin enough to
still enhance the diagnostic lights."
The researchers attempted two approaches to solve this Goldilocks
riddle. They first added an electrostatic layer that binds to the
detector molecules that carry the fluorescent proteins, creating a sort
of "second floor" that the silver nanocubes could sit on top of. They
also tried functionalizing the silver nanocubes so that they would
stick directly to individual detector molecules on a one-on-one basis.
While both approaches succeeded in boosting the amount of light coming
from the beacons, the former showed the best improvement, increasing
its fluorescence by more than 150 times. However, this method also
requires the extra step of creating the "second floor," an added
hurdle to engineering a commercial point-of-care diagnostic rather
than a laboratory-only demonstration. And while
the fluorescence didn't improve as much in the second approach, the
test's accuracy did.
"Building microfluidic lab-on-a-chip devices through either approach
would take time and resources, but they're both doable in theory," said
Cassio Fontes, a graduate student in the Chilkoti laboratory. "That's
what the D4 Assay is moving toward."
And the project is moving forward. Earlier in the year, the researchers
used preliminary results from this research to secure a five-year, $3.4
million R01 research award from the National Heart, Lung, and Blood
Institute. The collaborators will be working to optimize these
fluorescence enhancements while integrating wells, microfluidic
channels and other low-cost solutions into a single-step diagnostic
device that can run through all of these steps automatically and be
read by a common smartphone camera in a low-cost device.
"One of the big challenges in point-of-care tests is the ability to
read out results, which usually requires very expensive detectors,"
said Mikkelsen. "That's a major roadblock to having disposable tests to
allow patients to monitor chronic diseases at home or for use in
low-resource settings. We see this technology not only as a way to get
around that bottleneck, but also as a way to enhance the accuracy and
threshold of these diagnostic devices."
__________________________________________________________________
Story Source:
Materials provided by Duke University. Original written by Ken
Kingery. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Daniela F. Cruz, Cassio M. Fontes, Daria Semeniak, Jiani Huang,
Angus Hucknall, Ashutosh Chilkoti, Maiken H. Mikkelsen. Ultrabright
Fluorescence Readout of an Ink-Jet Printed Immunoassay Using
Plasmonic Nanogap Cavities. Nano Letters, 2020; DOI:
10.1021/acs.nanolett.0c01051
__________________________________________________________________
--- up 16 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:08 2020
Date:
May 13, 2020
Source:
NASA/Goddard Space Flight Center
Summary:
In late May and early June, Earthlings may be able to glimpse
Comet SWAN. The comet is currently faintly visible to the
unaided eye in the Southern Hemisphere just before sunrise. The
new comet was first spotted in April 2020, by an amateur
astronomer named Michael Mattiazzo using data from the SOHO
satellite.
FULL STORY
__________________________________________________________________
In late May and early June, Earthlings may be able to glimpse Comet
SWAN. The comet is currently faintly visible to the unaided eye in the
Southern Hemisphere just before sunrise -- providing skywatchers with a
relatively rare glimpse of a comet bright enough to be seen without a
telescope. But Comet SWAN's initial discovery was made not from the
ground, but via an instrument on board ESA (the European Space Agency)
and NASA's Solar and Heliospheric Observatory, or SOHO, satellite.
The new comet was first spotted in April 2020, by an amateur astronomer
named Michael Mattiazzo using data from a SOHO instrument called Solar
Wind Anisotropies, or SWAN. The comet appears to
leave the left side of the image and reappear on the right side around
May 3, because of the way SWAN's 360-degree all-sky maps are shown,
much like a globe is represented by a 2D map.
SWAN maps the constantly outflowing solar wind in interplanetary space
by focusing on a particular wavelength of ultraviolet light emitted by
hydrogen atoms. The new comet -- officially classified C/2020 F8 (SWAN)
but nicknamed Comet SWAN -- was spotted in the images because it's
releasing huge amounts of water, about 1.3 tons per second. As water is
made of hydrogen and oxygen, this release made Comet SWAN visible to
SOHO's instruments.
Comet SWAN is the 3,932nd comet discovered using data from SOHO. Almost
all of the nearly 4,000 discoveries have been made using data from
SOHO's coronagraph, an instrument that blocks out the Sun's bright face
using a metal disk to reveal the comparatively faint outer atmosphere,
the corona. Comet SWAN is only the 12th comet discovered with the
SWAN instrument since SOHO's launch in 1995, and eight of those 12
were also spotted by Mattiazzo.
Comet SWAN makes its closest approach to Earth on May 13, at a distance
of about 53 million miles. Comet SWAN's closest approach to the Sun,
called perihelion, will happen on May 27.
Though it can be very difficult to predict the behavior of comets that
make such close approaches to the Sun, scientists are hopeful that
Comet SWAN will remain bright enough to be seen as it continues its
journey.
__________________________________________________________________
Story Source:
Materials provided by NASA/Goddard Space Flight Center.
Original written by Sarah Frazier. Note: Content may be edited for
style and length.
__________________________________________________________________
Related Multimedia:
* Animation showing SOHO observations of Comet SWAN
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:08 2020
Date:
May 13, 2020
Source:
NASA/Goddard Space Flight Center
Summary:
Astronomers have detected elusive pulsation patterns in dozens
of young, rapidly rotating stars thanks to data from NASA's
Transiting Exoplanet Survey Satellite (TESS).
FULL STORY
__________________________________________________________________
Astronomers have detected elusive pulsation patterns in dozens of
young, rapidly rotating stars thanks to data from NASA's Transiting
Exoplanet Survey Satellite (TESS). The discovery will revolutionize
scientists' ability to study details like the ages, sizes and
compositions of these stars -- all members of a class named for the
prototype, the bright star Delta Scuti.
"Delta Scuti stars clearly pulsate in interesting ways, but the
patterns of those pulsations have so far defied understanding," said
Tim Bedding, a professor of astronomy at the University of Sydney. "To
use a musical analogy, many stars pulsate along simple chords, but
Delta Scuti stars are complex, with notes that seem to be jumbled. TESS
has shown us that's not true for all of them."
A paper describing the findings, led by Bedding, appears in the May 14
issue of the journal Nature and is now available online.
Geologists studying seismic waves from earthquakes figured out Earth's
internal structure from the way the reverberations changed speed and
direction as they traveled through it. Astronomers apply the same
principle to study the interiors of stars through their pulsations, a
field called asteroseismology.
Sound waves travel through a star's interior at speeds that change with
depth, and they all combine into pulsation patterns at the star's
surface. Astronomers can detect these patterns as tiny fluctuations in
brightness and use them to determine the star's age, temperature,
composition, internal structure and other properties.
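In practice, those tiny brightness fluctuations are hunted for
periodicities with frequency-analysis tools such as a periodogram.
The sketch below runs astropy's Lomb-Scargle implementation on a
synthetic two-minute-cadence light curve with two injected modes at
made-up but Delta Scuti-like frequencies; it is illustrative, not
the study's pipeline:

    import numpy as np
    from astropy.timeseries import LombScargle

    # Synthetic light curve: 27 days at 2-minute cadence, with two
    # pulsation modes at 40 and 55 cycles per day.
    t = np.arange(0, 27, 2 / (60 * 24))            # time in days
    flux = (1.0
            + 1e-3 * np.sin(2 * np.pi * 40 * t)
            + 5e-4 * np.sin(2 * np.pi * 55 * t))

    freq, power = LombScargle(t, flux).autopower(
        minimum_frequency=5, maximum_frequency=90)  # cycles/day
    peaks = freq[np.argsort(power)[-5:]]
    print(np.sort(peaks))  # strongest peaks should cluster near 40, 55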
Delta Scuti stars are between 1.5 and 2.5 times the Sun's mass. They're
named after Delta Scuti, a star visible to the human eye in the
southern constellation Scutum that was first identified as variable in
1900. Since then, astronomers have identified thousands more like Delta
Scuti, many with NASA's Kepler space telescope, another planet-hunting
mission that operated from 2009 to 2018.
But scientists have had trouble interpreting Delta Scuti pulsations.
These stars generally rotate once or twice a day, at least a dozen
times faster than the Sun. The rapid rotation flattens the stars at
their poles and jumbles the pulsation patterns, making them more
complicated and difficult to decipher.
To determine if order exists in Delta Scuti stars' apparently chaotic
pulsations, astronomers needed to observe a large set of stars multiple
times with rapid sampling. TESS monitors large swaths of the sky for 27
days at a time, taking one full image every 30 minutes with each of its
four cameras. This observing strategy allows TESS to track changes in
stellar brightness caused by planets passing in front of their stars,
which is its primary mission, but half-hour exposures are too long to
catch the patterns of the more rapidly pulsating Delta Scuti stars.
Those changes can happen in minutes.
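The sampling argument is just the Nyquist criterion: a cadence of
delta-t can only resolve oscillation periods longer than twice
delta-t. A quick comparison of the two TESS cadences:

    def shortest_resolvable_period(cadence_minutes):
        """Nyquist limit: at least two samples per cycle."""
        return 2 * cadence_minutes

    for cadence in (30, 2):  # full-frame images vs. two-minute data
        print(f"{cadence}-minute cadence resolves periods of "
              f"{shortest_resolvable_period(cadence)} minutes or more")
    # Modes that cycle in tens of minutes need the 2-minute data.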
But TESS also captures snapshots of a few thousand pre-selected stars
-- including some Delta Scuti stars -- every two minutes. When Bedding
and his colleagues began sorting through the measurements, they found a
subset of Delta Scuti stars with regular pulsation patterns. Once they
knew what to look for, they searched for other examples in data from
Kepler, which used a similar observing strategy. They also conducted
follow-up observations with ground-based telescopes, including one at
the W.M. Keck Observatory in Hawaii and two in the global Las Cumbres
Observatory network. In total, they identified a batch of 60 Delta
Scuti stars with clear patterns.
"This really is a breakthrough. Now we have a regular series of
pulsations for these stars that we can understand and compare with
models," said co-author Simon Murphy, a postdoctoral researcher at the
University of Sydney. "It's going to allow us to measure these stars
using asteroseismology in a way that we've never been able to do. But
it's also shown us that this is just a stepping-stone in our
understanding of Delta Scuti stars."
Pulsations in the well-behaved Delta Scuti group fall into two major
categories, both caused by energy being stored and released in the
star. Some occur as the whole star expands and contracts symmetrically.
Others occur as opposite hemispheres alternately expand and contract.
Bedding's team inferred these alternations by studying each star's
fluctuations in brightness.
The data have already helped settle a debate over the age of one star,
called HD 31901, a member of a recently discovered stream of stars
orbiting within our galaxy. Scientists placed the age of the overall
stream at 1 billion years, based on the age of a red giant they
suspected belonged to the same group. A later estimate, based on the
rotation periods of other members of the stellar stream, suggested an
age of only about 120 million years. Bedding's team used the TESS
observations to create an asteroseismic model of HD 31901 that supports
the younger age.
"Delta Scuti stars have been frustrating targets because of their
complicated oscillations, so this is a very exciting discovery," said
Sarbani Basu, a professor of astronomy at Yale University in New Haven,
Connecticut, who studies asteroseismology but was not involved in the
study. "Being able to find simple patterns and identify the modes of
oscillation is game changing. Since this subset of stars allows normal
seismic analyses, we will finally be able to characterize them
properly."
The team thinks their set of 60 stars has clear patterns because
they're younger than other Delta Scuti stars, having only recently
settled into producing all of their energy through nuclear fusion in
their cores. The pulsations occur more rapidly in the fledgling stars.
As the stars age, the frequency of the pulsations slows, and they
become jumbled with other signals.
Another factor may be TESS's viewing angle. Theoretical calculations
predict that a spinning star's pulsation patterns should be simpler
when its rotational pole faces us instead of its equator. The team's
TESS data set included around 1,000 Delta Scuti stars, which means that
some of them, by chance, must be viewed close to pole-on.
Scientists will continue to develop their models as TESS begins taking
full images every 10 minutes instead of every half hour in July.
Bedding said the new observing strategy will help capture the
pulsations of even more Delta Scuti stars.
"We knew when we designed TESS that, in addition to finding many
exciting new exoplanets, the satellite would also advance the field of
asteroseismology," said TESS Principal Investigator George Ricker at
the Massachusetts Institute of Technology's Kavli Institute for
Astrophysics and Space Research in Cambridge. "The mission has already
found a new type of star that pulsates on one side only and has
unearthed new facts about well-known stars. As we complete the initial
two-year mission and commence the extended mission, we're looking
forward to a wealth of new stellar discoveries TESS will make."
__________________________________________________________________
Story Source:
Materials provided by NASA/Goddard Space Flight Center.
Original written by Jeanette Kazmierczak. Note: Content may be edited
for style and length.
__________________________________________________________________
Related Multimedia:
* Video illustrating pulsations of a Delta Scuti star; animation
showing sound waves bouncing around inside a star cause it to
expand and contract; the rapid beat of HD 31901, a Delta Scuti star
in the southern constellation Lepus
__________________________________________________________________
Journal Reference:
1. Timothy R. Bedding, Simon J. Murphy, Daniel R. Hey, Daniel Huber,
Tanda Li, Barry Smalley, Dennis Stello, Timothy R. White, Warrick
H. Ball, William J. Chaplin, Isabel L. Colman, Jim Fuller, Eric
Gaidos, Daniel R. Harbeck, J. J. Hermes, Daniel L. Holdsworth, Gang
Li, Yaguang Li, Andrew W. Mann, Daniel R. Reese, Sanjay Sekaran,
Jie Yu, Victoria Antoci, Christoph Bergmann, Timothy M. Brown,
Andrew W. Howard, Michael J. Ireland, Howard Isaacson, Jon M.
Jenkins, Hans Kjeldsen, Curtis McCully, Markus Rabus, Adam D.
Rains, George R. Ricker, Christopher G. Tinney, Roland K.
Vanderspek. Very regular high-frequency pulsation modes in young
intermediate-mass stars. Nature, 2020; 581 (7807): 147 DOI:
10.1038/s41586-020-2226-8
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:08 2020
Researchers simulate the core of Mars to investigate its composition and
origin
Date:
May 13, 2020
Source:
University of Tokyo
Summary:
Earth-based experiments on iron-sulfur alloys thought to
comprise the core of Mars reveal details about the planet's
seismic properties for the first time. This information will be
compared to observations made by Martian space probes in the
near future. Whether the results between experiment and
observation coincide or not will either confirm existing
theories about Mars' composition or call into question the story
of its origin.
FULL STORY
__________________________________________________________________
Earth-based experiments on iron-sulfur alloys thought to comprise the
core of Mars reveal details about the planet's seismic properties for
the first time. This information will be compared to observations made
by Martian space probes in the near future. Whether the results between
experiment and observation coincide or not will either confirm existing
theories about Mars' composition or call into question the story of its
origin.
Mars is one of our closest terrestrial neighbors, yet it's still very
far away -- between about 55 million and 400 million kilometers
depending on where Earth and Mars are relative to the sun. At the time
of writing, Mars is around 200 million kilometers away, and in any
case, it is extremely difficult, expensive and dangerous to get to. For
these reasons, it is sometimes more sensible to investigate the red
planet through simulations here on Earth than it is to send an
expensive space probe or, perhaps one day, people.
Keisuke Nishida, an Assistant Professor from the University of Tokyo's
Department of Earth and Planetary Science at the time of the study, and
his team are keen to investigate the inner workings of Mars. They look
at seismic data and composition which tell researchers not just about
the present state of the planet, but also about its past, including its
origins.
"The exploration of the deep interiors of Earth, Mars and other planets
is one of the great frontiers of science," said Nishida. "It's
fascinating partly because of the daunting scales involved, but also
because of how we investigate them safely from the surface of the
Earth."
For a long time it has been theorized that the core of Mars probably
consists of an iron-sulfur alloy. But given how inaccessible the
Earth's core is to us, direct observations of Mars' core will likely
have to wait some time. This is why seismic details are so important,
as seismic waves, akin to enormously powerful sound waves, can travel
through a planet and offer a glimpse inside, albeit with some caveats.
"NASA's Insight probe is already on Mars collecting seismic readings,"
said Nishida. "However, even with the seismic data there was an
important missing piece of information without which the data could not
be interpreted. We needed to know the seismic properties of the
iron-sulfur alloy thought to make up the core of Mars."
Nishida and team have now measured the velocity for what is known as
P-waves (one of two types of seismic wave, the other being S-waves) in
molten iron-sulfur alloys.
"Due to technical hurdles, it took more than three years before we
could collect the ultrasonic data we needed, so I am very pleased we
now have it," said Nishida. "The sample is extremely small, which might
surprise some people given the huge scale of the planet we are
effectively simulating. But microscale high-pressure experiments help
exploration of macroscale structures and long time-scale evolutionary
histories of planets."
A molten iron-sulfur alloy just above its melting point of 1,500
degrees Celsius and subject to 13 gigapascals of pressure has a P-wave
velocity of 4,680 meters per second; this is over 13 times faster than
the speed of sound in air, which is 343 meters per second. The
researchers used a device called a Kawai-type multianvil press to
compress the sample to such pressures. They used X-ray beams from two
synchrotron facilities, KEK-PF and SPring-8, to help them image the
samples in order to then calculate the P-wave values.
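The comparison with air is simple arithmetic and easy to verify:

    v_core = 4680.0  # measured P-wave speed in molten Fe-S, m/s
    v_air = 343.0    # speed of sound in air, m/s
    print(f"{v_core / v_air:.1f}x")  # -> 13.6x, "over 13 times faster"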
"Taking our results, researchers reading Martian seismic data will now
be able to tell whether the core is primarily iron-sulfur alloy or
not," said Nishida. "If it isn't, that will tell us something of Mars'
origins. For example, if Mars' core includes silicon and oxygen, it
suggests that, like the Earth, Mars suffered a huge impact event as it
formed. So, what is Mars made of and how was it formed? I think we are
about to find out."
__________________________________________________________________
Story Source:
Materials provided by University of Tokyo. Note: Content may be
edited for style and length.
__________________________________________________________________
Journal Reference:
1. Keisuke Nishida, Yuki Shibazaki, Hidenori Terasaki, Yuji Higo, Akio
Suzuki, Nobumasa Funamori, Kei Hirose. Effect of sulfur on sound
velocity of liquid iron under Martian core conditions. Nature
Communications, 2020; 11 (1) DOI: 10.1038/s41467-020-15755-2
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:10 2020
Date:
May 13, 2020
Source:
Brigham Young University
Summary:
A recent six-year study, the longest study ever done on video
game addiction, found that about 90% of gamers do not play in a
way that is harmful or causes negative long-term consequences. A
significant minority, though, can become truly addicted to video
games and as a result can suffer mentally, socially and
behaviorally.
FULL STORY
__________________________________________________________________
For most adolescents, playing video games is an enjoyable and often
social form of entertainment. While playing video games is a fun
pastime, there is a growing concern that spending too much time playing
video games is related to negative developmental outcomes and can
become an addiction.
A recent six-year study, the longest study ever done on video game
addiction, found that about 90% of gamers do not play in a way that is
harmful or causes negative long-term consequences. A significant
minority, though, can become truly addicted to video games and as a
result can suffer mentally, socially and behaviorally.
"The aim of this particular study is to look at the longer-term impact
of having a particular relationship with video games and what it does
to a person over time," said Sarah Coyne, a professor of family life at
BYU and lead author of the research. "To see the impact, we examined
the trajectories of pathological video gameplay across six years, from
early adolescence to emerging adulthood."
In addition to finding long-term consequences for addicted gamers, this
study, published in Developmental Psychology, also breaks down gamer
stereotypes and found that pathological gaming is not a one size fits
all disorder.
Pathological video gameplay is characterized by excessive time spent
playing video games, difficulty disengaging from them and disruption to
healthy functioning due to gaming.
Only about 10% of gamers fall into the pathological video gameplay
category. When compared to the non-pathological group, those in the
study displayed higher levels of depression, aggression, shyness,
problematic cell phone use and anxiety by emerging adulthood. This was
despite the groups being the same in all these variables at the initial
time point, suggesting that video games may have been important in
developing these negative outcomes.
To measure predictors and outcomes to video game addiction, Coyne
studied 385 adolescents as they transitioned into adulthood. Each
individual completed multiple questionnaires once a year over a
six-year period. These questionnaires measured depression, anxiety,
aggression, delinquency, empathy, prosocial behavior, shyness, sensory
reactivity, financial stress and problematic cell phone use.
Two main predictors for video game addiction were found: being male and
having low levels of prosocial behavior. Having higher levels of
prosocial behavior, or voluntary behavior meant to benefit another
person, tended to be a protective factor against the addiction
symptoms.
Aside from the predictors, Coyne also found three distinct trajectories
of video game use. Seventy-two percent of adolescents were relatively
low in addiction symptoms across the six years of data collection.
Another 18% of adolescents started with moderate symptoms that did not
change over time, and only 10% of adolescents showed increasing levels
of pathological gaming symptoms throughout the study.
The results suggest that while about 90% of gamers are not playing in a
way that is dysfunctional or detrimental to the individual's life,
there is still a sizable minority who are truly addicted to video games
and suffer addiction symptoms over time.
These findings also go against the stereotype of gamers living in their
parent's basement, unable to support themselves financially or get a
job because of their fixation on video games. At least in their early
twenties, pathological users of video games appear to be just as
financially stable and forward-moving as gamers who are not addicted.
"I really do think that there are some wonderful things about video
games," Coyne said. "The important thing is to use them in healthy ways
and to not get sucked into the pathological levels."
__________________________________________________________________
Story Source:
Materials provided by Brigham Young University. Original
written by Cami Buckley. Note: Content may be edited for style and
length.
__________________________________________________________________
Journal Reference:
1. Sarah M. Coyne, Laura A. Stockdale, Wayne Warburton, Douglas A.
Gentile, Chongming Yang, Brett M. Merrill. Pathological video game
symptoms from adolescence to emerging adulthood: A 6-year
longitudinal study of trajectories, predictors, and outcomes.
Developmental Psychology, 2020; DOI: 10.1037/dev0000939
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:10 2020
3D interface provides cellular-level, full-body blood flow modeling to study and treat cardiovascular disease
Date:
May 13, 2020
Source:
Duke University
Summary:
Biomedical engineers are developing a massive fluid dynamics
simulator that can model blood flow through the full human
arterial system at subcellular resolution. One of the goals of
the effort is to provide doctors with a virtual reality system
that can guide their treatment plans by allowing them to
simulate a patient's specific vasculature and accurately predict
how decisions such as stent placement, conduit insertions and
other geometric alterations will affect surgical outcomes.
FULL STORY
__________________________________________________________________
Biomedical engineers at Duke University are developing a massive fluid
dynamics simulator that can model blood flow through the full human
arterial system at subcellular resolution. One of the goals of the
effort is to provide doctors with guidance in their treatment plans by
allowing them to simulate a patient's specific vasculature and
accurately predict how decisions such as stent placement, conduit
insertions and other geometric alterations to blood flow will affect
surgical outcomes.
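HARVEY itself resolves flow at subcellular detail, but the clinical
intuition that small geometric changes matter can be previewed with
the far cruder Hagen-Poiseuille relation, in which steady laminar
flow through a vessel scales with the fourth power of its radius.
The back-of-the-envelope sketch below, with made-up vessel numbers,
is illustrative only and is not how HARVEY computes flow:

    import math

    def poiseuille_flow(radius_m, length_m, dp_pa, viscosity=3.5e-3):
        """Hagen-Poiseuille volumetric flow rate (m^3/s) for steady
        laminar flow; 3.5 mPa*s is a typical blood viscosity."""
        return (math.pi * radius_m**4 * dp_pa
                / (8 * viscosity * length_m))

    # Hypothetical narrowed vessel vs. the same vessel after a stent
    # restores its radius from 1.5 mm to 2.0 mm.
    before = poiseuille_flow(1.5e-3, 0.05, 100.0)
    after = poiseuille_flow(2.0e-3, 0.05, 100.0)
    print(f"flow increases {after / before:.1f}x")  # (2/1.5)^4 ~ 3.2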
One of the largest barriers to clinical adoption, however, is developing
a user interface that allows clinicians to easily explore their options
without needing any expertise in computer science. As any programmer
will tell you, designing a smooth, intuitive interface that people from
all types of backgrounds can quickly master is a tall task.
In a new study published on May 7 in the Journal of Computational
Science, the Duke researchers report on their initial foray into
creating a user interface for their blood flow simulation tool called
HARVEY. They explored various interfaces ranging from standard desktop
displays to immersive virtual reality experiences and found that, while
users might be comfortable using a standard mouse and keyboard, some
more futuristic interfaces might hold the key to widespread adoption.
"HARVEY currently requires knowledge of C coding and command line
interfaces, which really limits who can use the program," said Amanda
Randles, the Alfred Winborne and Victoria Stover Mordecai Assistant
Professor of Biomedical Sciences at Duke. "This paper introduces a
graphical user interface we've developed called Harvis, so that anybody
can use Harvey, whether they're surgeons trying to figure out the best
placement for a stent or biomedical researchers trying to design a new
type of stent altogether."
Randles has been developing the HARVEY code for nearly a decade, having
begun the work as a doctoral student in the research group of Efthimios
Kaxiras, the John Hasbrouck Van Vleck Professor of Pure and Applied
Physics at Harvard University. In that time, she has demonstrated that
HARVEY can accurately model blood flow through patient-specific aortas
and other vascular geometries on longer scales. She's also shown the
program can model 3D blood flows on the scale of the full human body.
Putting HARVEY to work, Randles has helped researchers understand stent
treatment of cerebral aneurysms and the growth of aneurysms. She has
created a quick, noninvasive way to check for peripheral arterial
disease, and to better understand how circulating cancer cells adhere
to different tissues. With steady progress on the computational
abilities of the code and demonstrated usefulness in real-world
applications, Randles is now working to make sure others can make the
best use of its abilities.
"As cardiovascular disease continues to be the number one cause of
death in the US, the ability to improve treatment planning and outcome
remains a significant challenge," said Randles. "With the maturity and
availability of VR/AR devices, we need to understand the role these
technologies can play in the interaction with such data. This research
is a much-needed step for developing future software to combat
cardiovascular disease."
In the new study, Randles and her biomedical engineering colleagues,
research associate Harvey Shi and graduate student Jeff Ames, put the
Harvis interface they've been developing to the test. They asked
medical students and biomedical researchers to simulate three different
situations -- placing a conduit between two blood vessels, expanding or
shrinking the size of a blood vessel, or placing a stent within a blood
vessel. The test users attempted these tasks using either a standard
mouse and computer screen, a "Z-space" semi-immersive virtual reality
device, or a fully immersive virtual reality experience with an HTC
Vive display device.
The results show that the students and researchers could use the
standard mouse and keyboard interface and the fully immersive VR
interface equally well in a majority of cases, both quantitatively
and qualitatively. The semi-immersive display, basically a special
pointing tool combined with a monitor and 3D glasses, however, ranked
behind the other two devices, as the users had some issues adjusting to
the unique hardware setup and controls.
The study also presents a generalizable design architecture for other
simulated workflows, laying out a detailed description of the rationale
for the design of Harvis, which can be extended to similar platforms.
While the study did not find any major differences between the most and
least immersive interfaces in terms of quality and efficiency, Randles
did notice a major difference between the users' reactions to the
equipment.
"People enjoyed the 3D interface more," said Randles. "And if they
enjoyed it more, they're more likely to actually use it. It could also
be a fun and exciting way to get students engaged in classes about the
vascular system and hemodynamics."
Randles says she plans on running experiments to see if her 3D blood
flow interface can help medical students retain important knowledge
better than current standards. In the future, tools like this could
assist with treatment planning such as placements of stents using a
more intuitive virtual reality interface. Randles also expects these
types of tools will facilitate biomedical research in the personalized
flow space.
__________________________________________________________________
Story Source:
Materials provided by Duke University. Original written by Ken
Kingery. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Harvey Shi, Jeff Ames, Amanda Randles. Harvis: an interactive
virtual reality tool for hemodynamic modification and simulation.
Journal of Computational Science, 2020; 101091 DOI:
10.1016/j.jocs.2020.101091
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:10 2020
Researchers warn scientists are fighting health misinformation in the wrong place
Date:
May 13, 2020
Source:
George Washington University
Summary:
Communities on Facebook that distrust establishment health
guidance are more effective than government health agencies and
other reliable health groups at reaching and engaging
'undecided' individuals, according to a new study.
FULL STORY
__________________________________________________________________
Communities on Facebook that distrust establishment health guidance are
more effective than government health agencies and other reliable
health groups at reaching and engaging "undecided" individuals,
according to a study published today in the journal Nature.
Researchers at the George Washington University developed a
first-of-its-kind map to track the vaccine conversation among 100
million Facebook users during the height of the 2019 measles outbreak.
The new study and its "battleground" map reveal how distrust in
establishment health guidance could spread and dominate online
conversations over the next decade, potentially jeopardizing public
health efforts to protect populations from COVID-19 and future
pandemics through vaccinations.
Professor Neil Johnson and his GW research team, including professor
Yonatan Lupu and researchers Nicolas Velasquez, Rhys Leahy and Nico
Restrepo, collaborated with researchers at the University of Miami,
Michigan State University and Los Alamos National Laboratory to better
understand how distrust in scientific expertise evolves online,
especially related to vaccines.
"There is a new world war online surrounding trust in health expertise
and science, particularly with misinformation about COVID-19, but also
distrust in big pharmaceuticals and governments," Dr. Johnson said.
"Nobody knew what the field of battle looked like, though, so we set to
find out."
During the 2019 measles outbreak, the research team examined Facebook
communities, totaling nearly 100 million users, which were active
around the vaccine topic and which formed a highly dynamic,
interconnected network across cities, countries, continents and
languages. The team identified three camps comprising pro-vaccination
communities, anti-vaccination communities and communities of undecided
individuals such as parenting groups. Starting with one community, the
researchers looked to find a second one that was strongly entangled
with the original, and so on, to better understand how they interacted
with each other.
They discovered that, while there are fewer individuals with
anti-vaccination sentiments on Facebook than with pro-vaccination
sentiments, there are nearly three times as many anti-vaccination
communities on Facebook as pro-vaccination communities. This allows
anti-vaccination communities to become highly entangled with undecided
communities, while pro-vaccination communities remain mostly
peripheral. In addition, pro-vaccination communities that focus on
countering the larger anti-vaccination communities may be missing
medium-sized ones growing under the radar.
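The mapping approach can be pictured as a graph problem: each community
is a node tagged with its camp, links represent strong mutual
engagement, and "entanglement" is the number of links crossing between
camps. The toy sketch below uses invented community names and links
purely to illustrate the idea; it is not the study's dataset or method.

    import networkx as nx

    G = nx.Graph()
    # Each node is a Facebook community (page or group), tagged by camp.
    communities = {
        "ProHealthOrg": "pro", "StateHealthDept": "pro",
        "NaturalParenting": "undecided", "SchoolParents": "undecided",
        "VaccineChoice": "anti", "SafetyFirst": "anti", "TruthSeekers": "anti",
    }
    for name, camp in communities.items():
        G.add_node(name, camp=camp)
    # Edges stand in for strong mutual engagement between communities.
    G.add_edges_from([
        ("VaccineChoice", "NaturalParenting"),
        ("SafetyFirst", "NaturalParenting"),
        ("TruthSeekers", "SchoolParents"),
        ("ProHealthOrg", "StateHealthDept"),
    ])

    def cross_links(camp_a, camp_b):
        # Count edges connecting a community in camp_a to one in camp_b.
        return sum(1 for u, v in G.edges()
                   if {G.nodes[u]["camp"], G.nodes[v]["camp"]} == {camp_a, camp_b})

    # Mirrors the study's central finding: anti-vaccination communities
    # are far more entangled with undecided communities than pro ones are.
    print("anti-undecided links:", cross_links("anti", "undecided"))
    print("pro-undecided links: ", cross_links("pro", "undecided"))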
The researchers also found anti-vaccination communities offer more
diverse narratives around vaccines and other established health
treatments -- promoting safety concerns, conspiracy theories or
individual choice, for example -- that can appeal to more of Facebook's
approximately 3 billion users, thus increasing the chances of
influencing individuals in undecided communities. Pro-vaccination
communities, on the other hand, mostly offered monothematic messaging
typically focused on the established public health benefits of
vaccinations. The GW researchers noted that individuals in these
undecided communities, far from being passive bystanders, were actively
engaging with vaccine content.
"We thought we would see major public health entities and state-run
health departments at the center of this online battle, but we found
the opposite. They were fighting off to one side, in the wrong place,"
Dr. Johnson said.
As scientists around the world scramble to develop an effective
COVID-19 vaccine, the spread of health disinformation and
misinformation has important public health implications, especially on
social media, which often serves as an amplifier and information
equalizer. In their study, the GW researchers proposed several
different strategies to fight against online disinformation, including
influencing the heterogeneity of individual communities to delay their
onset and decrease their growth, and manipulating the links between
communities in order to prevent the spread of negative views.
"Instead of playing whack-a-mole with a global network of communities
that consume and produce (mis)information, public health agencies,
social media platforms and governments can use a map like ours and an
entirely new set of strategies to identify where the largest theaters
of online activity are and engage and neutralize those communities
peddling in misinformation so harmful to the public," Dr. Johnson said.
__________________________________________________________________
Story Source:
Materials provided by George Washington University. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Neil F. Johnson, Nicolas Velásquez, Nicholas Johnson Restrepo, Rhys
Leahy, Nicholas Gabriel, Sara El Oud, Minzhang Zheng, Pedro
Manrique, Stefan Wuchty, Yonatan Lupu. The online competition
between pro- and anti-vaccination views. Nature, 2020; DOI:
10.1038/s41586-020-2281-1
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:14 2020
Date:
May 13, 2020
Source:
University of Massachusetts Amherst
Summary:
Scientists report that they have developed bioelectronic ammonia
gas sensors that are among the most sensitive ever made. The sensors
use electric-charge-conducting protein nanowires derived from the
bacterium Geobacter to provide biomaterials for electrical devices.
The microbes grow hair-like protein filaments that work as nanoscale
"wires" to transfer charges for their nourishment and to communicate
with other bacteria.
FULL STORY
__________________________________________________________________
Writing in the journal Nano Research, a team at the University of
Massachusetts Amherst reports this week that they have developed
bioelectronic ammonia gas sensors that are among the most sensitive
ever made.
The sensor uses electric-charge-conducting protein nanowires derived
from the bacterium Geobacter to provide biomaterials for electrical
devices. More than 30 years ago, senior author and microbiologist Derek
Lovley discovered Geobacter in river mud. The microbes grow hair-like
protein filaments that work as nanoscale "wires" to transfer charges
for their nourishment and to communicate with other bacteria.
First author and biomedical engineering doctoral student Alexander
Smith, with his advisor Jun Yao and Lovley, say they designed this
first sensor to measure ammonia because that gas is important to
agriculture, the environment and biomedicine. For example, in humans,
ammonia on the breath may signal disease, while in poultry farming, the
gas must be closely monitored and controlled for bird health and
comfort and to avoid feed imbalances and production losses.
Yao says, "This sensor allows you to do high-precision sensing; it's
much better than previous electronic sensors." Smith adds, "Every time
I do a new experiment, I'm pleasantly surprised. We didn't expect them
to work as well as they have. I really think they could have a real
positive impact on the world."
Smith says existing electronic sensors often have either limited or low
sensitivity, and they are prone to interference from other gases. In
addition to superior function and low cost, he adds, "our sensors are
biodegradable so they do not produce electronic waste, and they are
produced sustainably by bacteria using renewable feedstocks without the
need for toxic chemicals."
Smith conducted the experiments over the past 18 months as part of his
Ph.D. work. It was known from Lovley's earlier studies that the protein
nanowires' conductivity changed in response to pH -- the acid or base
level of the solution around the nanowires. This prompted the
researchers to test the idea that they could be highly responsive to
molecule binding for biosensing. "If you expose them to a chemical, the
properties change and you can measure the response," Smith notes.
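In sensors of this kind, the figure of merit is typically the relative
change in resistance on exposure to the target gas. A minimal sketch of
that calculation, using made-up resistance values rather than the
team's measurements:

    def relative_response(r_baseline_ohms, r_exposed_ohms):
        # Percent change in resistance upon exposure to the target gas.
        return 100.0 * (r_exposed_ohms - r_baseline_ohms) / r_baseline_ohms

    # Hypothetical readings before and during ammonia exposure:
    print(f"sensor response: {relative_response(1.00e6, 1.85e6):+.1f}%")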
When he exposed the nanowires to ammonia, "the response was really
noticeable and significant," Smith says. "Early on, we found we could
tune the sensors in a way that shows this significant response. They
are really sensitive to ammonia and much less to other compounds, so
the sensors can be very specific."
Lovley adds that the "very stable" nanowires last a long time; the
sensors function consistently and robustly even after months of use,
working so well that "it is remarkable."
Yao says, "These protein nanowires are always amazing me. This new use
is in a completely different area than we had worked in before."
Previously, the team reported using protein nanowires to harvest
energy from humidity and applying them as memristors for biological
computing.
Smith, who calls himself "entrepreneurial," won first place in UMass
Amherst's 2018 Innovation Challenge for the startup business plan for
the company he formed with Yao and Lovley, e-Biologics. The researchers
have followed up with a patent application, fundraising, business
development and research and development plans.
Lovley says, "This work is the first proof-of-concept for the nanowire
sensor. Once we get back in the lab, we'll develop sensors for other
compounds. We are working on tuning them for an array of other
compounds."
Support for the work came as a CAREER grant and Graduate Research
Fellowship from the National Science Foundation, UMass Amherst's Office
of Technology Commercialization and Ventures and the campus's Center
for Hierarchical Manufacturing, an NSF-funded Nanoscale Science and
Engineering Center.
__________________________________________________________________
Story Source:
Materials provided by University of Massachusetts Amherst.
Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Alexander F. Smith, Xiaomeng Liu, Trevor L. Woodard, Tianda Fu,
Todd Emrick, Juan M. Jiménez, Derek R. Lovley, Jun Yao.
Bioelectronic protein nanowire sensors for ammonia detection. Nano
Research, 2020; DOI: 10.1007/s12274-020-2825-6
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:14 2020
New research could lead to safely reusable PPE
Date:
May 13, 2020
Source:
University of Pittsburgh
Summary:
Researchers have created a textile coating that can not only
repel liquids like blood and saliva but can also prevent viruses
from adhering to the surface.
FULL STORY
__________________________________________________________________
Masks, gowns, and other personal protective equipment (PPE) are
essential for protecting healthcare workers. However, the textiles and
materials used in such items can absorb and carry viruses and bacteria,
inadvertently spreading the disease the wearer sought to contain.
When the coronavirus spread amongst healthcare professionals and left
PPE in short supply, finding a way to provide better protection while
allowing for the safe reuse of these items became paramount.
Research from the LAMP Lab at the University of Pittsburgh Swanson
School of Engineering may have a solution. The lab has created a
textile coating that can not only repel liquids like blood and saliva
but can also prevent viruses from adhering to the surface. The work was
recently published in the journal ACS Applied Materials and Interfaces.
"Recently there's been focus on blood-repellent surfaces, and we were
interested in achieving this with mechanical durability," said Anthony
Galante, PhD student in industrial engineering at Pitt and lead author
of the paper. "We want to push the boundary on what is possible with
these types of surfaces, and especially given the current pandemic, we
knew it'd be important to test against viruses."
What makes the coating unique is its ability to withstand ultrasonic
washing, scrubbing and scraping. With other similar coatings currently
in use, washing or rubbing the surface of the textile will reduce or
eliminate its repellent abilities.
"The durability is very important because there are other surface
treatments out there, but they're limited to disposable textiles. You
can only use a gown or mask once before disposing of it," said Paul
Leu, co-author and associate professor of industrial engineering, who
leads the LAMP Lab. "Given the PPE shortage, there is a need for
coatings that can be applied to reusable medical textiles that can be
properly washed and sanitized."
Galante put the new coating to the test, running it through tens of
ultrasonic washes, applying thousands of rotations with a scrubbing pad
(not unlike what might be used to scour pots and pans), and even
scraping it with a sharp razor blade. After each test, the coating
remained just as effective.
The researchers worked with the Charles T. Campbell Microbiology
Laboratory's Research Director Eric Romanowski and Director of Basic
Research Robert Shanks, in the Department of Ophthalmology at Pitt, to
test the coating against a strain of adenovirus.
"As this fabric was already shown to repel blood, protein and bacteria,
the logical next step was to determine whether it repels viruses. We
chose human adenovirus types 4 and 7, as these are causes of acute
respiratory disease as well as conjunctivitis (pink eye)," said
Romanowski. "It was hoped that the fabric would repel these viruses
similar to how it repels proteins, which these viruses essentially are:
proteins with nucleic acid inside. As it turned out, the adenoviruses
were repelled in a similar way as proteins."
The coating may have broad applications in healthcare: everything from
hospital gowns to waiting room chairs could benefit from the ability to
repel viruses, particularly ones as easily spread as adenoviruses.
"Adenovirus can be inadvertently picked up in hospital waiting rooms
and from contaminated surfaces in general. It is rapidly spread in
schools and homes and has an enormous impact on quality of life --
keeping kids out of school and parents out of work," said Shanks. "This
coating on waiting room furniture, for example, could be a major step
towards reducing this problem."
The next step for the researchers will be to test the effectiveness
against betacoronaviruses, like the one that causes COVID-19.
"If the treated fabric would repel betacornonaviruses, and in
particular SARS-CoV-2, this could have a huge impact for healthcare
workers and even the general public if PPE, scrubs, or even clothing
could be made from protein, blood-, bacteria-, and virus-repelling
fabrics," said Romanowski.
At the moment, the coating is applied using drop casting, a method that
saturates the material with a solution from a syringe and applies a
heat treatment to increase stability. But the researchers believe the
process can use a spraying or dipping method to accommodate larger
pieces of material, like gowns, and can eventually be scaled up for
production.
__________________________________________________________________
Story Source:
Materials provided by University of Pittsburgh. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Anthony J. Galante, Sajad Haghanifar, Eric G. Romanowski, Robert M.
Q. Shanks, Paul W. Leu. Superhemophobic and Antivirofouling Coating
for Mechanically Durable and Wash-Stable Medical Textiles. ACS
Applied Materials & Interfaces, 2020; 12 (19): 22120 DOI:
10.1021/acsami.9b23058
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:14 2020
Date:
May 13, 2020
Source:
Max Planck Institute for the Science of Human History
Summary:
Due to the improvement and increased use of geochemical
fingerprinting techniques during the last 25 years, the
archaeological compositional data of stone tools has grown
exponentially. The Pofatu Database is a large-scale
collaborative project that enables curation and data sharing.
The database also provides instrumental details, analytical
procedures and reference standards used for calibration purposes
or quality control. Thus, Pofatu ensures reproducibility and
comparability between provenance studies.
FULL STORY
__________________________________________________________________
Due to the improvement and increased use of geochemical fingerprinting
techniques during the last 25 years, the archaeological compositional
data of stone tools has grown exponentially. The Pofatu Database is a
large-scale collaborative project that enables curation and data
sharing. The database also provides instrumental details, analytical
procedures and reference standards used for calibration purposes or
quality control. Thus, Pofatu ensures reproducibility and comparability
between provenance studies.
Provenance studies (documenting where artefacts are found relative to
their sources or place of manufacture) help archaeologists understand
the "life-histories" of artefacts, in this case, stone tools. They show
where the raw material comes from and how artefacts were manufactured
and distributed between individuals and groups. Reliable data allows
scientists to reconstruct technological, economic, and social behaviors
of human societies over many thousands of years.
To facilitate access to this growing body of geochemical data, Aymeric
Hermann and Robert Forkel of the Department for Linguistic and Cultural
Evolution, Max Planck Institute for the Science of Human History,
conceived and designed Pofatu, the first open-access database of
geochemical compositions and contextual information for archaeological
sources and artefacts in a form readily accessible to the scientific
community.
Reconstructing ancient strategies of raw material and artefact
procurement
Geochemical "fingerprinting" of artefacts is the most effective way to
reconstruct how and where ancient peoples extracted, transformed, and
exchanged stone materials and artefacts. These fingerprints also serve
as clues to understand a number of phenomenon in past human societies,
such as technical and economic behaviors, as well as sociopolitical
organizations.
The Pofatu Database provides researchers with access to an
ever-expanding dataset and facilitates comparability and
reproducibility in provenance studies. Each sample is comprehensively
documented for elemental and isotopic compositions, and includes
detailed archaeological provenance, as well as supporting analytical
metadata, such as sampling processes, analytical procedures, and
quality control.
"By providing analytical data and comprehensive archaeological details
in a form that can be readily accessed by the scientific community,"
Hermann says, "the Pofatu Database will facilitate assigning
unambiguous provenance to artefacts in future studies and will lead to
more robust, large-scope modelling of long-distance voyaging and
traditional exchange systems."
Additionally, Marshall Weisler, a collaborator in the Pofatu project
from the University of Queensland in Australia, stated that "By tracing
the transport of artefacts carried across the wide expanse of the
Pacific Ocean, we will be able to reconstruct the ancient journeys
enabling the greatest maritime migration in human history."
Pofatu -- an operational framework for data sharing in archaeometry
Pofatu's structure was designed by Forkel and Hermann. Hermann compiled
and described the data with contributions and validations by colleagues
and co-authors from universities and research institutions in New
Zealand, Australia, and the USA. The database uses GitHub for
open-source storage and version control and common non-proprietary file
formats (CSV) to enable transparency and built-in reproducibility for
future studies of prehistoric exchange. The database currently contains
7759 individual samples from archaeological sites and geological
sources across the Pacific Islands, but Pofatu is designed to hold
much more, Hermann notes.
"With Pofatu we activated an operational framework for data sharing in
archaeometry. The database is currently focused on sites and
collections from the Pacific Islands, but we welcome all contributions
of geochemical data on archaeological material, regardless of
geographic or chrono-cultural boundaries. Our vision is an inclusive
and collaborative data resource that will hopefully continue to develop
with more datasets from the Pacific as well as from other regions. The
ultimate goal is a more global project contemporary to other existing
online repositories for geological materials."
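Because the database is distributed as version-controlled CSV files,
exploring it takes only a few lines of scripting. The sketch below is
a minimal illustration; the file name and column names are assumptions
for the example, not Pofatu's actual schema.

    import pandas as pd

    # Hypothetical local export of the sample table:
    samples = pd.read_csv("pofatu_samples.csv")
    # e.g., pull samples from one island group and summarise a trace
    # element often used to discriminate basalt sources:
    fiji = samples[samples["location"].str.contains("Fiji", na=False)]
    print(fiji.groupby("sample_category")["Nb_ppm"].describe())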
Although the Pofatu Database is meant to be used primarily by
archaeologists, analyses of geological samples and raw material
extracted from prehistoric quarries could also be used by geologists to
gather essential information on the smaller or more remote Pacific
islands, which are among the least studied places on the planet and
sometimes lack geochemical documentation. In that sense, Pofatu is a
tool that will facilitate interdisciplinary research.
__________________________________________________________________
Story Source:
Materials provided by Max Planck Institute for the Science of
Human History. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Aymeric Hermann, Robert Forkel, Andrew McAlister, Arden
Cruickshank, Mark Golitko, Brendan Kneebone, Mark McCoy, Christian
Reepmeyer, Peter Sheppard, John Sinton, Marshall Weisler. Pofatu, a
curated and open-access database for geochemical sourcing of
archaeological materials. Scientific Data, 2020; 7 (1) DOI:
10.1038/s41597-020-0485-8
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:14 2020
Multi-scale structured materials for electrocatalysis and
photoelectrocatalysis
Date:
May 13, 2020
Source:
Technische Universität Dresden
Summary:
Chemists developed a freeze-thaw method, capable of synthesizing
various noble metal aerogels (NMAs) with clean surfaces and
multiscale structure. By virtue of their hierarchical structures
and unique optical properties, they show outstanding performance
for the electro-oxidation of ethanol. The research provides new
ideas for designing various gel or foam materials for
high-performance electrocatalysis and photoelectrocatalysis.
FULL STORY
__________________________________________________________________
As a new class of porous materials, noble metal aerogels (NMAs) have
drawn tremendous attention because of their combined features including
self-supported architectures, high surface areas, and numerous
optically and catalytically active sites, enabling their impressive
performance in diverse fields. However, current fabrication methods
suffer from long fabrication periods, unavoidable impurities, and
uncontrolled multiscale structures, discouraging their fundamental and
application-orientated studies.
Dr. Ran Du from China has been an Alexander von Humboldt research
fellow at TU Dresden since 2017. In collaboration with the Dresden
chemists Dr. Jan-Ole Joswig and Professor Alexander Eychmüller, he
recently crafted a novel freeze-thaw method capable of producing
various multi-scale structured noble metal aerogels as superior
photoelectrocatalysts for electro-oxidation of ethanol, promoting the
application for fuel cells. Their work has now been published as cover
story in the journal Angewandte Chemie International Edition.
Ran Du and his team have found unusual self-healing properties of noble
metal gels in their previous works. Inspired by this fact, a
freeze-thaw method was developed as an additive-free approach to
directly destabilise various dilute metal nanoparticle solutions
(concentration of 0.2-0.5 mM). Upon freezing, large aggregates were
generated due to the intensified salting-out effects incurred by the
dramatically raised local solute concentration; meanwhile, they were
shaped at micrometer scale by in situ formed ice crystals. After
thawing, the aggregates settled and assembled into monolithic
hydrogels as a result of their self-healing properties. After
purification and drying, clean hydrogels and the corresponding
aerogels were obtained.
Due to the hierarchically porous structures, the cleanliness, and the
combined catalytic/optical properties, the resulting gold-palladium
(Au-Pd) aerogels were found to display impressive light-driven
photoelectrocatalytic performance, delivering a current density of up
to 6.5 times higher than that of commercial palladium-on-carbon (Pd/C)
for the ethanol oxidation reaction.
"The current work provides a new idea to create clean and
hierarchically structured gel materials directly from dilute precursor
solutions, and it should adapt to various material systems for enhanced
application performance for catalysis and beyond," says chemist Ran
Du.
__________________________________________________________________
Story Source:
Materials provided by Technische Universität Dresden. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Ran Du, Jan-Ole Joswig, René Hübner, Lin Zhou, Wei Wei, Yue Hu,
Alexander Eychmüller. Freeze-Thaw-Promoted Fabrication of Clean and
Hierarchically Structured Noble-Metal Aerogels for Electrocatalysis
and Photoelectrocatalysis. Angewandte Chemie International Edition,
2020; DOI: 10.1002/anie.201916484
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:14 2020
Date:
May 13, 2020
Source:
Cell Press
Summary:
Wind plants in the United States remain relatively efficient
over time, with only a 13% drop in performance over 17 years,
researchers report. Their study also suggests that a production
tax credit provides an effective incentive to maintain the
plants during the 10-year window in which they are eligible to
receive it. When this window closes, wind plant performance
drops.
FULL STORY
__________________________________________________________________
Wind plants in the United States -- especially the newest models --
remain relatively efficient over time, with only a 13% drop in the
plants' performance over 17 years, researchers at the Lawrence Berkeley
National Laboratory report in the May 13 issue of the journal Joule.
Their study also suggests that a production tax credit provides an
effective incentive to maintain the plants during the 10-year window in
which they are eligible to receive it. When this tax credit window
closes, wind plant performance drops.
"Since wind plant operators are now receiving less revenue after the
tax credit expires, the effective payback period to recoup the costs of
any maintenance expenditure is longer," says study author Dev
Millstein, a research scientist at Lawrence Berkeley National
Laboratory. "Due to this longer payback period, we hypothesize that
plants may choose to spend less on maintenance overall, and their
performance may therefore drop."
Wind power is on the rise, supplying 7.3% of electricity generation in
the United States in 2019 and continuing to grow around the world due
to its low cost and ability to help states and countries reach their
carbon emission reduction goals. But while the technology is highly
promising, it isn't infallible -- like any engineered system, wind
plant performance declines with age, although the rate of decline
varies based on the location of the plant. In order to understand the
potential growth of this technology and its ability to impact
electricity systems, accurate estimates of future wind plant
performance are essential.
Building from previous research with a European focus, Millstein and
colleagues assessed the US onshore wind fleet, evaluating the
performance of 917 US wind plants (including newer plants introduced in
2008 or later as well as older plants) over a 10-year period. Since
measurements of long-term wind speed are typically not available for a
given location, the researchers determined wind speed using global
reanalysis data, accounting for shifts in available wind from one year
to the next. They obtained data on the energy generated from each plant
from the US Energy Information Administration, which tracks electricity
generation from each plant on a monthly basis, and they performed a
statistical analysis to determine the average rate of age-related
performance decline across the entire fleet.
Millstein and colleagues found significant differences in performance
decline between the older and younger wind plant models, with older
vintages declining by 0.53% each year for the first 10 years while
their younger counterparts declined by only 0.17% per year during the
same decade.
But a notable change occurred as soon as the plants turned 10 years old
-- a trend not observed in Europe. Once the plants lost their
eligibility for a production tax credit of 2.3 cents per kilowatt-hour,
their performance began dropping at a yearly rate of 3.6%.
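Those annual rates compound over a plant's life, as the short sketch
below shows. The piecewise constant-rate model is only an illustration
built from the figures quoted above, not the paper's statistical
method.

    def relative_output(years, rate_first_decade, rate_after_decade):
        # Multiply out year-by-year performance declines.
        output = 1.0
        for year in range(years):
            rate = rate_first_decade if year < 10 else rate_after_decade
            output *= 1.0 - rate
        return output

    # Newer plants decline 0.17%/yr and older vintages 0.53%/yr for the
    # first decade; 3.6%/yr applies once the tax credit window closes.
    print(f"newer plants, 10 years: "
          f"{1 - relative_output(10, 0.0017, 0.036):.1%} total decline")
    print(f"older plants, 10 years: "
          f"{1 - relative_output(10, 0.0053, 0.036):.1%} total decline")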
Still, the researchers are optimistic about the ability of US wind
plants to weather the years.
"We found that performance decline with age for US plants was on the
lower end of the spectrum found from wind fleets in other countries,
specifically compared to European research studies," says Millstein.
"This is generally good news for the US wind fleet. This study will
help people account for a small amount of performance loss with age
while not exaggerating the magnitude of such losses."
As the wind energy sector continues to swell, the researchers note that
their findings can be used to inform investors, operators, and energy
modelers, enabling accurate long-term wind plant energy production
estimates and guiding the development of an evolving electrical grid.
"The hope is that, overall, the improved estimates of wind generation
and costs will lead to more effective decision making from industry,
academia, and policy makers," says Millstein.
__________________________________________________________________
Story Source:
Materials provided by Cell Press. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. Sofia D. Hamilton, Dev Millstein, Mark Bolinger, Ryan Wiser,
Seongeun Jeong. How Does Wind Project Performance Change with Age
in the United States? Joule, 2020; DOI:
10.1016/j.joule.2020.04.005
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:14 2020
Date:
May 13, 2020
Source:
Graz University of Technology
Summary:
Researchers have developed ultra-light tattoo electrodes that
are hardly noticeable on the skin and make long-term
measurements of brain activity cheaper and easier.
FULL STORY
__________________________________________________________________
Researchers have developed ultra-light tattoo electrodes that are
hardly noticeable on the skin and make long-term measurements of brain
activity cheaper and easier.
In 2015 Francesco Greco, head of the Laboratory of Applied Materials
for Printed and Soft electronics (LAMPSe) at the Institute of Solid
State Physics at Graz University of Technology, developed so-called
"tattoo electrodes" together with Italian scientists. These are
conductive polymers that are printed using an inkjet printer on
standard tattoo paper and then stuck to the skin like transfers to
measure heart or muscle activity.
This type of electrode, optimised in 2018, opened up completely new
paths in electrophysiological examinations, such as electrocardiography
(ECG) or electromyography (EMG). Thanks to a thickness of 700 to 800
nanometres -- that is about 100 times thinner than a human hair -- the
tattoos adapt to uneven skin and are hardly noticeable on the body.
Moreover, the "tattoos" are dry electrodes; in contrast to gel
electrodes, they work without a liquid interface and cannot dry out.
They are excellently suited for long-term measurements. Even hairs
growing through the tattoo do not interfere with the signal recording.
New generation of tattoo electrodes
Building on this pioneering achievement, Greco, together with Esma
Ismailova (Department of Bioelectronics, École Nationale Supérieure des
Mines de Saint-Étienne, France) and Laura Ferrari (The BioRobotics
Institute, Scuola Superiore Sant'Anna, Italy), has now achieved a
further milestone in the measurement of bioelectrical signals: the
group has modified the tattoo electrodes in such a way that they can
also be used in electroencephalography (EEG) -- i.e. to measure brain
activity.
To do this, the researchers used the same approach as in 2018, i.e.
inkjet printing of conductive polymer on tattoo paper. The composition
and thickness of the transfer paper and conductive polymer have been
optimized to achieve an even better connection between the tattoo
electrode and the skin and to record the EEG signals with maximum
quality. "Brain waves are in the low frequency range and EEG
signals have a very low amplitude. They are much more difficult to
capture in high quality than EMG or ECG signals," explains Laura
Ferrari, who worked on this project during her PhD and is now a postdoc
researcher in France.
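Capturing such signals typically means digitising at a few hundred
hertz and band-pass filtering down to the EEG band. The sketch below
recovers a synthetic microvolt-scale wave from noise; the sampling
rate and cutoffs are illustrative assumptions, not the authors'
processing pipeline.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 250.0                                  # sampling rate in Hz (assumed)
    t = np.arange(0, 5, 1 / fs)
    alpha = 20e-6 * np.sin(2 * np.pi * 10 * t)  # 10 Hz wave, ~20 microvolts
    recording = alpha + 100e-6 * np.random.randn(t.size)  # buried in noise

    # 4th-order Butterworth band-pass over a typical 1-40 Hz EEG band.
    b, a = butter(4, [1 / (fs / 2), 40 / (fs / 2)], btype="band")
    cleaned = filtfilt(b, a, recording)
    print(f"residual error: {np.std(cleaned - alpha) * 1e6:.1f} microvolts rms")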
Tests under real clinical conditions have shown that the EEG
measurement with the optimized tattoos is as successful as with
conventional EEG electrodes. "Due to inkjet printing and the
commercially available substrates, however, our tattoos are
significantly less expensive than current EEG electrodes and also offer
more advantages in terms of wearing comfort and long-term measurements
in direct comparison," says Greco.
First ever MEG-compatible dry electrodes
The new tattoo electrodes are the very first dry electrode type that is
suitable for long-term EEG measurements and at the same time compatible
with magneto-encephalography (MEG). MEG is a well-established method
for monitoring brain activity, for which so far only so-called "wet
electrodes" can be used. Such electrodes work on the basis of
electrolyte, gel or an electrode paste, and thus dry out quickly and
are unsuitable for long-term measurements. The new generation of tattoo
electrodes consists exclusively of conductive polymers, i.e. it does
not contain any metals which can be problematic for MEG examinations,
and is printed exclusively with inkjet. "With our method, we produce
the perfect MEG-compatible electrode while reducing costs and
production time," says Greco happily. The TU Graz researcher is
currently spinning ideas on how this technology can be used in clinics
and in neuroengineering as well as in the field of brain computer
interfaces.
__________________________________________________________________
Story Source:
Materials provided by Graz University of Technology. Original
written by Christoph Pelzl. Note: Content may be edited for style and
length.
__________________________________________________________________
Journal Reference:
1. Laura M. Ferrari, Usein Ismailov, Jean-Michel Badier, Francesco
Greco, Esma Ismailova. Conducting polymer tattoo electrodes in
clinical electro- and magneto-encephalography. npj Flexible
Electronics, 2020; 4 (1) DOI: 10.1038/s41528-020-0067-z
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:14 2020
Radioactive period following nuclear bomb tests changed rainfall patterns thousands of miles from the detonation sites
Date:
May 13, 2020
Source:
University of Reading
Summary:
Historic records from weather stations show that rainfall
patterns in Scotland were affected by charge in the atmosphere
released by radiation from nuclear bomb tests carried out in the
1950s and '60s.
FULL STORY
__________________________________________________________________
Nuclear bomb tests during the Cold War may have changed rainfall
patterns thousands of miles from the detonation sites, new research has
revealed.
Scientists at the University of Reading have researched how the
electric charge released by radiation from the test detonations,
carried out predominantly by the US and Soviet Union in the 1950s and
1960s, affected rainclouds at the time.
The study, published in Physical Review Letters, used historic records
between 1962-64 from a research station in Scotland. Scientists
compared days with high and low radioactively-generated charge, finding
that clouds were visibly thicker, and there was 24% more rain on
average on the days with more radioactivity.
Professor Giles Harrison, lead author and Professor of Atmospheric
Physics at the University of Reading, said: "By studying the
radioactivity released from Cold War weapons tests, scientists at the
time learnt about atmospheric circulation patterns. We have now reused
this data to examine the effect on rainfall.
"The politically charged atmosphere of the Cold War led to a nuclear
arms race and worldwide anxiety. Decades later, that global cloud has
yielded a silver lining, in giving us a unique way to study how
electric charge affects rain."
It has long been thought that electric charge modifies how water
droplets in clouds collide and combine, potentially affecting the size
of droplets and influencing rainfall, but this is difficult to observe
in the atmosphere. By combining the bomb test data with weather
records, the scientists were able to retrospectively investigate this.
Through learning more about how charge affects non-thunderstorm clouds,
it is thought that scientists will now have a better understanding of
important weather processes.
The race to develop nuclear weapons was a key feature of the Cold War,
as the world's superpowers sought to demonstrate their military
capabilities during heightened tensions following the Second World War.
Although detonations were carried out in remote parts of the world,
such as the Nevada Desert in the US, and on Pacific and Arctic islands,
radioactive pollution spread widely throughout the atmosphere.
Radioactivity ionises the air, releasing electric charge.
The researchers, from the Universities of Reading, Bath and Bristol,
studied records from well-equipped Met Office research weather stations
at Kew near London and Lerwick in the Shetland Isles.
Located 300 miles north-west of Scotland, the Shetland site was
relatively unaffected by other sources of anthropogenic pollution. This
made it well suited as a test site to observe rainfall effects which,
although likely to have occurred elsewhere too, would be much more
difficult to detect.
Atmospheric electricity is most easily measured on fine days, so the
Kew measurements were used to identify nearly 150 days where there was
high or low charge generation over the UK while it was cloudy in
Lerwick. The Shetland rainfall on these days showed differences which
vanished after the major radioactivity episode was over.
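The day-sorting comparison can be illustrated in a few lines: classify
days by the fair-weather charge measured at one site, then compare
average rainfall at the distant site. The arrays below are synthetic
stand-ins for the historic records, not the actual Kew and Lerwick
data.

    import numpy as np

    rng = np.random.default_rng(0)
    kew_charge = rng.normal(size=150)           # proxy for daily charge
    lerwick_rain_mm = rng.gamma(2.0, 2.0, 150)  # synthetic daily rainfall

    high = lerwick_rain_mm[kew_charge > np.median(kew_charge)]
    low = lerwick_rain_mm[kew_charge <= np.median(kew_charge)]
    print(f"high-charge days: {high.mean():.2f} mm, "
          f"low-charge days: {low.mean():.2f} mm "
          f"({100 * (high.mean() / low.mean() - 1):+.0f}%)")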
The findings may be helpful for cloud-related geoengineering research,
which is exploring how electric charge could influence rain, relieve
droughts or prevent floods, without the use of chemicals.
Professor Harrison is leading a project investigating electrical
effects on dusts and clouds in the United Arab Emirates, as part of
their national programme in Rain Enhancement Science. These new
findings will help to show the typical charges possible in natural
non-thunderstorm clouds.
__________________________________________________________________
Story Source:
Materials provided by University of Reading. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Harrison, G., Nicoll, K., Ambaum, M., Marlton, G., Aplin, K.,
Lockwood, M. Precipitation modification by ionisation. Physical
Review Letters, 2020; DOI: 10.1103/PhysRevLett.124.198701
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed May 13 21:30:14 2020
hydride complexes
Date:
May 13, 2020
Source:
Tohoku University
Summary:
There is currently a strong demand to replace organic liquid
electrolytes used in conventional rechargeable batteries, with
solid-state ionic conductors which will enable the batteries to
be safer and have higher energy density.
FULL STORY
__________________________________________________________________
There is currently a strong demand to replace organic liquid
electrolytes used in conventional rechargeable batteries, with
solid-state ionic conductors which will enable the batteries to be
safer and have higher energy density.
To that end, much effort has been devoted to finding materials with
superior ionic conductivities. Among the most promising, are
solid-state ionic conductors that contain polyanions such as
B12H12^2-. They constitute a particular class of materials due to
their unique transport behavior, which has the polyanions rotating at
an elevated temperature, thereby greatly promoting cation
conductivities.
However, a major drawback is the high temperature (and thus energy) required to
activate the rotation, which conversely means low conductivities at
room temperature.
To address that problem, a research group at Tohoku University, led by
Associate Professor Shigeyuki Takagi and Professor Shin-ichi Orimo, has
established a new principle for room-temperature superionic conduction.
Its findings were recently published in Applied Physics Letters.
The research group was able to reduce the activation temperature by
using transition metal hydride complexes as a new class of rotatable
polyanions, wherein hydrogen is the sole ligand species, covalently
binding to single transition metals. Unlike in B12H12^2-
polyanions, the rotation of transition metal hydride complexes only
requires displacements of highly mobile hydrogen and can therefore be
expected to occur with low activation energy.
The group then studied the dynamics of transition metal hydride
complexes in several existing hydrides, and found them reoriented -- as
if rotating by repeating small deformations -- even at room
temperature.
This kind of motion is known as "pseudorotation," and is rarely
observed in solid matter. Due to the small displacements of hydrogen
atoms, the activation energy of the pseudorotation is relatively low --
more than 40 times lower than what's reportedly needed for the rotation
of B12H12^2-.
Because pseudorotation promotes cation conduction even in the low
temperature region, the lithium ion conductivity in Li5MoH11 containing
MoH9^3-, for example, can reach 79 mS cm^-1
at room temperature. This is more than three times the world record of
room-temperature lithium ion conductivity reported so far. This
suggests that an all-solid-state lithium ion battery with shorter
charging time at room temperature can be realised.
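One quick way to see what 79 mS cm^-1 means for a cell is to convert
it into the area-specific resistance of an electrolyte layer, which is
simply thickness divided by conductivity. The layer thickness below is
an assumed, typical value, not a figure from the study.

    sigma_s_per_cm = 0.079  # 79 mS/cm at room temperature (from the study)
    thickness_cm = 0.01     # assumed 100-micrometre solid electrolyte layer
    asr_ohm_cm2 = thickness_cm / sigma_s_per_cm
    print(f"area-specific resistance: {asr_ohm_cm2:.2f} ohm*cm^2")  # ~0.13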
The discovered mechanism is quite general and would be useful in
lowering the temperature required to activate the rotation of
polyanions. This may positively contribute towards finding compositions
that are amenable to room-temperature superionic conductors.
__________________________________________________________________
Story Source:
Materials provided by Tohoku University. Note: Content may be
edited for style and length.
__________________________________________________________________
Journal Reference:
1. Shigeyuki Takagi, Tamio Ikeshoji, Toyoto Sato, Shin-ichi Orimo.
Pseudorotating hydride complexes with high hydrogen coordination: A
class of rotatable polyanions in solid matter. Applied Physics
Letters, 2020; 116 (17): 173901 DOI: 10.1063/5.0002992
__________________________________________________________________
--- up 16 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu May 14 21:30:06 2020
Date:
May 14, 2020
Source:
DOE/Lawrence Berkeley National Laboratory
Summary:
A new study revealed hundreds of new strong gravitational
lensing candidates based on a deep dive into data. The study
benefited from the winning machine-learning algorithm in an
international science competition.
FULL STORY
__________________________________________________________________
Like crystal balls for the universe's deeper mysteries, galaxies and
other massive space objects can serve as lenses to more distant objects
and phenomena along the same path, bending light in revelatory ways.
Gravitational lensing was first theorized by Albert Einstein more than
100 years ago to describe how light bends when it travels past massive
objects like galaxies and galaxy clusters.
These lensing effects are typically described as weak or strong, and
the strength of a lens relates to an object's position and mass and
distance from the light source that is lensed. Strong lenses can have
100 billion times more mass than our sun, causing light from more
distant objects in the same path to magnify and split, for example,
into multiple images, or to appear as dramatic arcs or rings.
The major limitation of strong gravitational lenses has been their
scarcity, with only several hundred confirmed since the first
observation in 1979, but that's changing ... and fast.
A new study by an international team of scientists revealed 335 new
strong lensing candidates based on a deep dive into data collected for
a U.S. Department of Energy-supported telescope project in Arizona
called the Dark Energy Spectroscopic Instrument (DESI). The study,
published May 7 in The Astrophysical Journal, benefited from the
winning machine-learning algorithm in an international science
competition.
"Finding these objects is like finding telescopes that are the size of
a galaxy," said David Schlegel, a senior scientist in Lawrence Berkeley
National Laboratory's (Berkeley Lab's) Physics Division who
participated in the study. "They're powerful probes of dark matter and
dark energy."
These newly discovered gravitational lens candidates could provide
specific markers for precisely measuring distances to galaxies in the
ancient universe if supernovae are observed and precisely tracked and
measured via these lenses, for example.
Strong lenses also provide a powerful window into the unseen universe
of dark matter, which makes up about 85 percent of the matter in the
universe, as most of the mass responsible for lensing effects is
thought to be dark matter. Dark matter and the accelerating expansion
of the universe, driven by dark energy, are among the biggest mysteries
that physicists are working to solve.
In the latest study, researchers enlisted Cori, a supercomputer at
Berkeley Lab's National Energy Research Scientific Computing Center
(NERSC), to automatically compare imaging data from the Dark Energy
Camera Legacy Survey (DECaLS) -- one of three surveys conducted in
preparation for DESI -- with a training sample of 423 known lenses and
9,451 non-lenses.
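Conceptually, such a classifier is a small convolutional network
trained on labelled image cutouts. The sketch below shows the general
shape of a lens/non-lens classifier; the architecture, the 3-band
64x64-pixel inputs and the single training step are illustrative
assumptions, not the competition-winning network used in the study.

    import torch
    import torch.nn as nn

    class LensFinder(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(), nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
                nn.Linear(64, 1),  # one logit: lens vs. non-lens
            )

        def forward(self, x):
            return self.classifier(self.features(x))

    model = LensFinder()
    # A hypothetical batch of 64x64-pixel, 3-band survey cutouts and labels:
    images = torch.randn(8, 3, 64, 64)
    labels = torch.randint(0, 2, (8, 1)).float()
    loss = nn.BCEWithLogitsLoss()(model(images), labels)
    loss.backward()  # one illustrative training step (optimizer omitted)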
The researchers grouped the candidate strong lenses into three
categories based on the likelihood that they are, in fact, lenses:
Grade A for the 60 candidates that are most likely to be lenses, Grade
B for the 105 candidates with less pronounced features, and Grade C for
the 176 candidate lenses that have fainter and smaller lensing features
than those in the other two categories.
Xiaosheng Huang, the study's lead author, noted that the team already
succeeded in winning time on the Hubble Space Telescope to confirm some
of the most promising lensing candidates revealed in the study, with
observations beginning in late 2019.
"The Hubble Space Telescope can see the fine details without the
blurring effects of Earth's atmosphere," Huang said.
The lens candidates were identified with the assistance of a neural
network, which is a form of artificial intelligence in which the
computer program is trained to gradually improve its image-matching
over time to provide an increasing success rate in identifying lenses.
Computerized neural networks are inspired by the biological network of
neurons in the human brain.
"It takes hours to train the neural network," Huang said. "There is a
very sophisticated fitting model of 'What is a lens?' and 'What is not
a lens?'"
There was some painstaking manual analysis of lensing images to help
pick the best images to train the network from tens of thousands of
images, Huang noted. He recalled one Saturday during which he sat down
with student researchers for the entire day to pore over tens of
thousands of images to develop sample lists of lenses and non-lenses.
"We didn't just select these at random," Huang said. "We had to augment
this set with hand-selected examples that look like lenses but are not
lenses," for example, "and we selected those that could be potentially
confusing."
Student involvement was key in the study, he added. "The students
worked diligently on this project and solved many tough problems, all
while taking a full load of classes," he said. One of the students who
worked on the study, Christopher Storfer, was later selected to
participate in the DOE Science Undergraduate Laboratory Internship
(SULI) program at Berkeley Lab.
Researchers have already improved upon the algorithm that was used in
the latest study to speed up the identification of possible lenses.
While an estimated 1 in 10,000 galaxies acts as a lens, the neural
network can eliminate most of the non-lenses. "Rather than going
through 10,000 images to find one, now we have just a few tens," he
said.
The neural network was originally developed for The Strong
Gravitational Lens Finding Challenge, a programming competition that
ran from November 2016 to February 2017 and motivated the development
of automated tools for finding strong lenses.
With a growing body of observational data, and new telescope projects
like DESI and the Large Synoptic Survey Telescope (LSST) that is now
scheduled to start up in 2023, there is heated competition to mine this
data using sophisticated artificial intelligence tools, Schlegel said.
"That competition is good," he said. A team based in Australia, for
example, also found many new lensing candidates using a different
approach. "About 40 percent of what they found we didn't," and likewise
the study that Schlegel participated in found many lensing candidates
that the other team hadn't.
Huang said the team has expanded its search for lenses in other sources
of sky-imaging data, and the team is also considering whether to plug
into a broader set of computing resources to expedite the hunt.
"The goal for us is to reach 1,000" new lensing candidates, Schlegel
said.
NERSC is a DOE Office of Science User Facility.
Study participants included researchers from the University of San
Francisco, Berkeley Lab, the National Optical Astronomy Observatory,
Siena College, the University of Wyoming, the University of Arizona,
the University of Toronto and the Perimeter Institute for Theoretical
Physics in Canada, and Université Paris-Saclay in France.
__________________________________________________________________
Story Source:
Materials provided by DOE/Lawrence Berkeley National
Laboratory. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. X. Huang, C. Storfer, V. Ravi, A. Pilon, M. Domingo, D. J.
Schlegel, S. Bailey, A. Dey, R. R. Gupta, D. Herrera, S. Juneau, M.
Landriau, D. Lang, A. Meisner, J. Moustakas, A. D. Myers, E. F.
Schlafly, F. Valdes, B. A. Weaver, J. Yang, C. Yèche. Finding
Strong Gravitational Lenses in the DESI DECam Legacy Survey. The
Astrophysical Journal, 2020; 894 (1): 78 DOI:
10.3847/1538-4357/ab7ffb
__________________________________________________________________
--- up 16 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu May 14 21:30:06 2020
First scientific result by the new spectrograph on the Subaru telescope
Date:
May 14, 2020
Source:
National Institutes of Natural Sciences
Summary:
Astronomers have determined that the Earth-like planets of the
TRAPPIST-1 system are not significantly misaligned with the
rotation of the star. This is an important result for
understanding the evolution of planetary systems around very
low-mass stars in general, and in particular the history of the
TRAPPIST-1 planets including the ones near the habitable zone.
FULL STORY
__________________________________________________________________
Astronomers using the Subaru Telescope have determined that the
Earth-like planets of the TRAPPIST-1 system are not significantly
misaligned with the rotation of the star. This is an important result
for understanding the evolution of planetary systems around very
low-mass stars in general, and in particular the history of the
TRAPPIST-1 planets including the ones near the habitable zone.
Stars like the Sun are not static, but rotate about an axis. This
rotation is most noticeable when there are features like sunspots on
the surface of the star. In the Solar System, the orbits of all of the
planets are aligned to within 6 degrees with the Sun's rotation. In the
past it was assumed that planetary orbits would be aligned with the
rotation of the star, but there are now many known examples of
exoplanet systems where the planetary orbits are strongly misaligned
with the central star's rotation. This raises the question: can
planetary systems form out of alignment, or did the observed misaligned
systems start out aligned and were later thrown out of alignment by
some perturbation? The TRAPPIST-1 system has attracted attention
because it has three small rocky planets located in or near the
habitable zone where liquid water can exist. The central star is a very
low-mass and cool star, called an M dwarf, and those planets are
situated very close to the central star. Therefore, this planetary
system is very different from our Solar System. Determining the history
of this system is important because it could help determine if any of
the potentially habitable planets are actually habitable. But it is
also an interesting system because it lacks any nearby objects which
could have perturbed the orbits of the planets, meaning that the orbits
should still be located close to where the planets first formed. This
gives astronomers a chance to investigate the primordial conditions of
the system.
Because stars rotate, the side rotating into view has a relative
velocity towards the viewer, while the side rotating out of view has a
relative velocity away from the viewer. If a planet transits -- passes
between the star and the Earth, blocking a small portion of the light
from the star -- it is possible to tell which edge of the star the
planet blocks first. This phenomenon is called the Rossiter-McLaughlin
effect.
Using this method, it is possible to measure the misalignment between
the planetary orbit and the star's rotation. However, until now those
observations have been limited to large planets such as Jupiter-like or
Neptune-like ones.
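A standard order-of-magnitude estimate shows why a small star makes
this measurement tractable: the peak velocity anomaly scales roughly
as the transit depth (Rp/Rs)^2 times the star's projected rotation
speed. The sketch below uses that textbook approximation with
illustrative numbers, not the values measured in this study:
    import math

    def rm_amplitude(rp_over_rs, vsini_ms, b=0.0):
        """Approximate peak Rossiter-McLaughlin anomaly in m/s:
        (Rp/Rs)^2 * v sin(i) * sqrt(1 - b^2), b = impact parameter."""
        return rp_over_rs**2 * vsini_ms * math.sqrt(1.0 - b**2)

    # Hot Jupiter around a Sun-like star: a large, easily measured signal.
    print(f"hot Jupiter  : {rm_amplitude(0.10, 2000):5.1f} m/s")
    # Earth-sized planet around an M dwarf: the tiny star makes Rp/Rs,
    # and hence the transit depth, large enough for the effect to show.
    print(f"M-dwarf Earth: {rm_amplitude(0.08, 1500):5.1f} m/s")
An Earth-sized planet barely dents a Sun-like star, but against a tiny
M dwarf the same planet carves a much deeper transit, pushing the
anomaly back up to a level a large telescope can detect.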
A team of researchers, including members from the Tokyo Institute of
Technology and the Astrobiology Center in Japan, observed TRAPPIST-1
with the Subaru Telescope to look for misalignment between the
planetary orbits and the star. The team took advantage of a chance on
August 31, 2018, when three of the exoplanets orbiting TRAPPIST-1
transited in front of the star in a single night. Two of the three were
rocky planets near the habitable zone. Since low-mass stars are
generally faint, it had been impossible to probe the stellar obliquity
(spin-orbit angle) for TRAPPIST-1. But thanks to the light gathering
power of the Subaru Telescope and high spectral resolution of the new
infrared spectrograph IRD, the team was able to measure the obliquity.
They found that the obliquity was low, close to zero. This is the first
measurement of the stellar obliquity for a very low-mass star like
TRAPPIST-1 and also the first Rossiter-McLaughlin measurement for
planets in the habitable zone.
However, the leader of the team, Teruyuki Hirano at the Tokyo Institute
of Technology, cautions, "The data suggest alignment of the stellar
spin with the planetary orbital axes, but the precision of the
measurements was not good enough to completely rule out a small
spin-orbit misalignment. Nonetheless, this is the first detection of
the effect with Earth-like planets and more work will better
characterize this remarkable exoplanet system."
__________________________________________________________________
Story Source:
Materials provided by National Institutes of Natural Sciences.
Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Teruyuki Hirano, Eric Gaidos, Joshua N. Winn, Fei Dai, Akihiko
Fukui, Masayuki Kuzuhara, Takayuki Kotani, Motohide Tamura, Maria
Hjorth, Simon Albrecht, Daniel Huber, Emeline Bolmont, Hiroki
Harakawa, Klaus Hodapp, Masato Ishizuka, Shane Jacobson, Mihoko
Konishi, Tomoyuki Kudo, Takashi Kurokawa, Jun Nishikawa, Masashi
Omiya, Takuma Serizawa, Akitoshi Ueda, Lauren M. Weiss. Evidence
for Spin–Orbit Alignment in the TRAPPIST-1 System. The
Astrophysical Journal, 2020; 890 (2): L27 DOI:
10.3847/2041-8213/ab74dc
__________________________________________________________________
--- up 16 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu May 14 21:30:18 2020
Date:
May 14, 2020
Source:
University of Hawaii at Manoa
Summary:
Researchers revealed the largest and hottest shield volcano on
Earth. A team of volcanologists and ocean explorers used several
lines of evidence to determine that Pūhāhonu, a volcano within the
Papahānaumokuākea Marine National Monument, now holds this
distinction.
FULL STORY
__________________________________________________________________
In a recently published study, researchers from the University of
Hawai'i at Mānoa School of Ocean and Earth Science and Technology
revealed the largest and hottest shield volcano on Earth. A team of
volcanologists and ocean explorers used several lines of evidence to
determine that Pūhāhonu, a volcano within the Papahānaumokuākea Marine
National Monument, now holds this distinction.
Geoscientists and the public have long thought Mauna Loa, a
culturally-significant and active shield volcano on the Big Island of
Hawai'i, was the largest volcano in the world. However, after surveying
the ocean floor along the mostly submarine Hawaiian leeward volcano
chain, chemically analyzing rocks in the UH Mānoa rock collection, and
modeling the results of these studies, the research team came to a new
conclusion. Pūhāhonu, meaning 'turtle rising for breath' in Hawaiian,
is nearly twice as big as Mauna Loa.
"It has been proposed that hotspots that produce volcano chains like
Hawai'i undergo progressive cooling over 1-2 million years and then
die," said Michael Garcia, lead author of the study and retired
professor of Earth Sciences at SOEST. "However, we have learned from
this study that hotspots can undergo pulses of melt production. A small
pulse created the Midway cluster of now extinct volcanoes and another,
much bigger one created Pūhāhonu. This will rewrite the textbooks on
how mantle plumes work."
In 1974, Pūhāhonu (then called Gardner Pinnacles) was suspected as the
largest Hawaiian volcano based on very limited survey data. Subsequent
studies of the Hawaiian Islands concluded that Mauna Loa was the
largest volcano, but they included the below-sea-level base of the
volcano, which the 1974 study had not considered. The new
comprehensive surveying and modeling, using methods similar to those
used for Mauna Loa, show that Pūhāhonu is the largest.
This study highlights that Hawaiian volcanoes have been erupting some
of the hottest magma on Earth, not only now but for millions of years.
work also draws attention to an infrequently visited part of the state
of Hawai'i that has ecological, historical and cultural importance.
"We are sharing with the science community and the public that we
should be calling this volcano by the name the Hawaiians have given to
it, rather than the western name for the two rocky small islands that
are the only above sea level remnants of this once majestic volcano,"
said Garcia.
This work was funded by the National Science Foundation, Schmidt Ocean
Institute and the University of Hawai'i.
__________________________________________________________________
Story Source:
Materials provided by University of Hawaii at Manoa. Original
written by Marcie Grabowski. Note: Content may be edited for style and
length.
__________________________________________________________________
Journal Reference:
1. Michael O. Garcia, Jonathan P. Tree, Paul Wessel, John R. Smith.
Pūhāhonu: Earth's biggest and hottest shield volcano. Earth and
Planetary Science Letters, 2020; 542: 116296 DOI:
10.1016/j.epsl.2020.116296
__________________________________________________________________
--- up 16 weeks, 2 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu May 14 21:30:18 2020
Study reveals how wavelike plasmons could power up a new class of sensing and photochemical technologies at the nanoscale
Date:
May 14, 2020
Source:
DOE/Lawrence Berkeley National Laboratory
Summary:
A team of researchers has observed unusually long-lived wavelike
electrons called 'plasmons' in a new class of electronically
conducting material. Plasmons are very important for determining
the optical and electronic properties of metals for the
development of new sensors and communication devices.
FULL STORY
__________________________________________________________________
Wavelike, collective oscillations of electrons known as "plasmons" are
very important for determining the optical and electronic properties of
metals.
In atomically thin 2D materials, plasmons have an energy that is more
useful for applications, including sensors and communication devices,
than plasmons found in bulk metals. But determining how long plasmons
live and whether their energy and other properties can be controlled at
the nanoscale (billionths of a meter) has long eluded researchers.
Now, as reported in the journal Nature Communications, a team of
researchers co-led by the Department of Energy's Lawrence Berkeley
National Laboratory (Berkeley Lab) -- with support from the Department
of Energy's Center for Computational Study of Excited-State Phenomena
in Energy Materials (C2SEPEM) -- has observed long-lived plasmons in a
new class of conducting transition metal dichalcogenide (TMD) called
"quasi 2D crystals."
To understand how plasmons operate in quasi 2D crystals, the
researchers characterized the properties of both nonconductive
electrons as well as conductive electrons in a monolayer of the TMD
tantalum disulfide. Previous studies only looked at conducting
electrons. "We discovered that it was very important to carefully
include all the interactions between both types of electrons," said
C2SEPEM Director Steven Louie, who led the study. Louie also holds
titles as senior faculty scientist in the Materials Sciences Division
at Berkeley Lab and professor of physics at UC Berkeley.
The researchers developed sophisticated new algorithms to compute the
material's electronic properties, including plasmon oscillations with
long wavelengths, "as this was a bottleneck with previous computational
approaches," said lead author Felipe da Jornada, who was a postdoctoral
researcher in Berkeley Lab's Materials Sciences Division at the time of
the study. Jornada is currently an assistant professor in materials
science and engineering at Stanford University.
To the researchers' surprise, the results from calculations performed
by the Cori supercomputer at Berkeley Lab's National Energy Research
Scientific Computing Center (NERSC) revealed that plasmons in quasi 2D
TMDs are much more stable -- for as long as approximately 2
picoseconds, or 2 trillionths of a second -- than previously thought.
Their findings also suggest that plasmons generated by quasi 2D TMDs
could enhance the intensity of light by more than 10 million times,
opening the door for renewable chemistry (chemical reactions triggered
by light), or the engineering of electronic materials that can be
controlled by light.
In future studies, the researchers plan to investigate how to harness
the highly energetic electrons released by such plasmons upon decay,
and if they can be used to catalyze chemical reactions.
__________________________________________________________________
Story Source:
Materials provided by DOE/Lawrence Berkeley National
Laboratory. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Felipe H. da Jornada, Lede Xian, Angel Rubio, Steven G. Louie.
Universal slow plasmons and giant field enhancement in atomically
thin quasi-two-dimensional metals. Nature Communications, 2020; 11
(1) DOI: 10.1038/s41467-020-14826-8
__________________________________________________________________
--- up 16 weeks, 2 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu May 14 21:30:18 2020
Date:
May 14, 2020
Source:
Tohoku University
Summary:
Researchers have developed a new multi-beam method for
conducting CT scans that improves image quality while
drastically cutting the required scan time to one millisecond.
FULL STORY
__________________________________________________________________
Many will undergo a CT scan at some point in their lifetime -- being
slid in and out of a tunnel as a large machine rotates around. X-ray
computed tomography, better known by its acronym CT, is a widely used
method of obtaining cross-sectional images of objects.
Now a research team -- led by Tohoku University Professor Wataru
Yashiro -- has developed a new method using intense synchrotron
radiation that produces higher quality images within milliseconds.
High-speed, high-resolution X-ray CT is currently possible using
intense synchrotron radiation. However, this requires samples to be
rotated at high speed to obtain images from many directions. This would
make CT scans more akin to a rollercoaster ride!
Extreme rotation also makes controlling the temperature or atmosphere
of the sample impossible.
Nevertheless, the research team solved this conundrum by creating an
optical system that splits single synchrotron X-ray beams into many.
These beams then shine onto the sample from different directions at the
same time; thus, negating the need to rotate the sample.
This "multi-beam" method is no easy task since the direction of X-rays
cannot be easily changed. Unlike visible light, X-rays interact weakly
with matter, making it difficult to use mirrors and prisms to change
the path of the beams.
To overcome this, the research team used micro-fabrication techniques
to create uniquely shaped crystals. These crystals were then bent in
the shape of a hyperbola. By combining three rows of crystals, the
multi-beam optics were able to cover an angle of ±70°.
Carrying out their experiments at the SPring-8 synchrotron radiation
facility, the research team took advantage of a cutting-edge
compressed-sensing algorithm that needs only a few dozen projection
images for image reconstruction.
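Compressed sensing exploits the fact that a signal which is sparse in
some basis can be recovered from far fewer measurements than
conventional sampling would demand. The toy below reconstructs a
sparse one-dimensional "object" from a few dozen random projections
using plain iterative soft-thresholding (ISTA); it is a sketch of the
general principle, with invented sizes, not the team's reconstruction
code:
    import numpy as np

    rng = np.random.default_rng(0)
    n, k, m = 200, 8, 60                 # unknowns, non-zeros, measurements
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
    A = rng.normal(size=(m, n)) / np.sqrt(m)   # random projection matrix
    y = A @ x_true                              # "a few dozen projections"

    # ISTA: minimize ||Ax - y||^2 + lam * ||x||_1 by gradient + shrinkage.
    lam, step = 0.01, 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(3000):
        x = x - step * (A.T @ (A @ x - y))                     # gradient step
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0)  # soft threshold

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
The same logic, scaled up to 2-D images and the fixed set of beam
angles, is what lets a reconstruction succeed with only a few dozen
views instead of the hundreds a conventional CT rotation collects.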
"The invention makes 3-D observations of living beings and liquid
samples within milliseconds possible" exclaimed Professor Yashiro. "Its
possible application is wide-spread, from fundamental material science
to life sciences to industry," added Yashiro.
__________________________________________________________________
Story Source:
Materials provided by Tohoku University. Note: Content may be
edited for style and length.
__________________________________________________________________
Journal Reference:
1. Wolfgang Voegeli, Kentaro Kajiwara, Hiroyuki Kudo, Tetsuroh
Shirasawa, Xiaoyu Liang, Wataru Yashiro. Multibeam x-ray optical
system for high-speed tomography. Optica, 2020; 7 (5): 514 DOI:
10.1364/OPTICA.384804
__________________________________________________________________
--- up 16 weeks, 2 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu May 14 21:30:18 2020
Date:
May 14, 2020
Source:
University of Washington
Summary:
Prior to the COVID-19 pandemic, in cities where bike-share
systems have been introduced, bike commuting increased by 20%,
according to a new study.
FULL STORY
__________________________________________________________________
In the past couple of years, if you lived in a major, or even mid-sized
city, you were likely familiar with bike-share bikes.
Whether propped against a tree, strewn along the sidewalk or standing
"docked" at a station, the often brightly colored bikes with whimsical
company names promised a ready means to get from Point A to Point B.
But one person's spontaneous ride is another person's commute to work.
Prior to the COVID-19 pandemic, in cities where bike-share systems have
been introduced, bike commuting increased by 20%, said Dafeng Xu, an
assistant professor in the University of Washington's Evans School of
Public Policy & Governance. Xu studied U.S. cities with and without
bike-share systems, using Census and company data to analyze how
commuting patterns change when bike shares become available.
"This study shows that bike-share systems can drive a population to
commute by bike," said Xu, whose study was published May 11 in the
Journal of Policy Analysis and Management.
Bike-share systems, common in cities in Europe and Asia, were launched
in four U.S. cities in 2010 and as of 2016 had grown to more than 50.
Not all systems have been successful: Convenience -- how easy it is to
find and rent a bike -- is the key. In Seattle, for example, a
city-owned bike-share program failed in 2017 due largely to a limited
number of bikes and a lack of infrastructure, but private companies in
the same market thrived prior to the pandemic.
[Around the world, cities have enacted mobility restrictions during the
coronavirus outbreak. The responses of bike-share companies, and
bike-share usage, have varied by community.]
Among other interests in transportation and immigration policy, Xu
researches the effects of bicycling on the environment and human
health, and on the ways bike-share systems can play a role by expanding
access to cycling.
"In general, biking is good and healthy, and it means less pollution
and traffic, but it can be expensive, and people worry about their
bikes being stolen, things like that," Xu said. "Bike share solves some
of these problems, because people don't need to worry about the cost
and theft."
For this study, Xu sorted through nine years of demographic and commute
statistics from the American Community Survey, a detailed, annual
report by the Census Bureau. He then examined bike-share company data
(through the National Association of City Transportation Officials)
from 38 cities with systems, focusing on trips logged during morning
and afternoon rush hours. By comparing the number, location and time of
work-related bike commutes from Census data against bike-share company
records of trips logged, both before and after the launch of bike
shares, Xu was able to estimate the use of bike shares for commute
trips.
Xu found that in both bike-share and non-bike-share cities, the rate of
bike commuting increased, while car commuting decreased, from
2008-2016. However, the rate of bike commuting -- and the use of public
transportation -- was significantly greater in bike-share cities.
For example, in bike-share cities in 2008, roughly 66% of commuters
drove to work, about 1% biked, and 22% took transit. That compared to
non-bike-share cities, where about 88% of commuters drove, fewer than
1% biked, and 4% took transit.
By 2016 -- after many bike-share systems had launched -- car commuting
had fallen to 59% in bike-share cities, while bike commuting had
climbed to 1.7% and transit to 26%. Commuting by car in non-bike-share
cities had slipped to 83% in 2016, while bike commuting had grown to
1%, and transit to 6%.
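Those before-and-after shares allow a quick difference-in-differences
reading: the change in bike-share cities minus the change in
non-bike-share cities. The arithmetic below is only a back-of-the-
envelope check on the figures quoted above (0.9 stands in for "fewer
than 1%"), not Xu's actual estimation, which controls for many other
factors:
    # Commute mode shares from the article, in percent of commuters.
    share = {
        ("bike-share",     2008): {"car": 66.0, "bike": 1.0, "transit": 22.0},
        ("bike-share",     2016): {"car": 59.0, "bike": 1.7, "transit": 26.0},
        ("non-bike-share", 2008): {"car": 88.0, "bike": 0.9, "transit": 4.0},
        ("non-bike-share", 2016): {"car": 83.0, "bike": 1.0, "transit": 6.0},
    }

    def did(mode):
        """Change in bike-share cities minus change elsewhere, in points."""
        treated = share[("bike-share", 2016)][mode] - share[("bike-share", 2008)][mode]
        control = share[("non-bike-share", 2016)][mode] - share[("non-bike-share", 2008)][mode]
        return treated - control

    for mode in ("car", "bike", "transit"):
        print(f"{mode:>7}: {did(mode):+.1f} percentage points")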
Nationwide, 0.6% of commuters bike to work, according to an American
Community Survey report in 2017.
In general, cities with larger bike-share systems also experienced
sharper increases in bicycle commuting, Xu said.
"This is not surprising: A large bike-share system means a higher
density of public bicycles and is thus more accessible by commuters,"
he said. "In contrast, sadly, Seattle's Pronto struggled to attract
commuters and was finally doomed only after three years of operation
partially due to its relatively small size."
In his paper, Xu points to Chicago, which operates a municipally owned
bike-share system called Divvy. Prior to Divvy's launch in 2013, 1.5%
of commuters biked to work, Xu said, but afterward, that rate grew to
2%.
The trends held, he said, even when controlling for a city's expansion
of protected bike lanes -- another significant factor in whether people
choose to bike to work, according to other research.
Overall, the numbers before COVID-19 were promising, Xu said. The
numbers could grow, he said, if communities and bike-share companies
make changes that can boost the appeal of bike commuting: adding bike
lanes to city streets, expanding programs to outlying communities, or
increasing the allowable rental time. Many bike-share rentals, for
instance, last only a half-hour before a user has to pay for a new
trip.
Xu is also the author of a previous paper that analyzed the impact of
bike-share systems on obesity rates.
__________________________________________________________________
Story Source:
Materials provided by University of Washington. Original
written by Kim Eckart. Note: Content may be edited for style and
length.
__________________________________________________________________
Journal References:
1. Dafeng Xu. Free Wheel, Free Will! The Effects of Bikeshare Systems
on Urban Commuting Patterns in the U.S. Journal of Policy
Analysis and Management, 2020; DOI: 10.1002/pam.22216
2. Dafeng Xu. Burn Calories, Not Fuel! The effects of bikeshare
programs on obesity rates. Transportation Research Part D:
Transport and Environment, 2019; 67: 89 DOI:
10.1016/j.trd.2018.11.002
__________________________________________________________________
--- up 16 weeks, 2 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Fri May 15 21:30:12 2020
Date:
May 15, 2020
Source:
DOE/SLAC National Accelerator Laboratory
Summary:
Until now, electron spins and orbitals were thought to go hand
in hand in a class of materials that's the cornerstone of modern
information technology; you couldn't quickly change one without
changing the other. But a new study shows that a pulse of laser
light can dramatically change the spin state of one important
class of materials while leaving its orbital state intact.
FULL STORY
__________________________________________________________________
In designing electronic devices, scientists look for ways to manipulate
and control three basic properties of electrons: their charge; their
spin states, which give rise to magnetism; and the shapes of the fuzzy
clouds they form around the nuclei of atoms, which are known as
orbitals.
Until now, electron spins and orbitals were thought to go hand in hand
in a class of materials that's the cornerstone of modern information
technology; you couldn't quickly change one without changing the other.
But a study at the Department of Energy's SLAC National Accelerator
Laboratory shows that a pulse of laser light can dramatically change
the spin state of one important class of materials while leaving its
orbital state intact.
The results suggest a new path for making a future generation of logic
and memory devices based on "orbitronics," said Lingjia Shen, a SLAC
research associate and one of the lead researchers for the study.
"What we're seeing in this system is the complete opposite of what
people have seen in the past," Shen said. "It raises the possibility
that we could control a material's spin and orbital states separately,
and use variations in the shapes of orbitals as the 0s and 1s needed to
make computations and store information in computer memories."
The international research team, led by Joshua Turner, a SLAC staff
scientist and investigator with the Stanford Institute for Materials
and Energy Science (SIMES), reported their results this week in
Physical Review B Rapid Communications.
An intriguing, complex material
The material the team studied was a manganese oxide-based quantum
material known as NSMO, which comes in extremely thin crystalline
layers. It's been around for three decades and is used in devices where
information is stored by using a magnetic field to switch from one
electron spin state to another, a method known as spintronics. NSMO is
also considered a promising candidate for making future computers and
memory storage devices based on skyrmions, tiny particle-like vortexes
created by the magnetic fields of spinning electrons.
But this material is also very complex, said Yoshinori Tokura, director
of the RIKEN Center for Emergent Matter Science in Japan, who was also
involved in the study.
"Unlike semiconductors and other familiar materials, NSMO is a quantum
material whose electrons behave in a cooperative, or correlated,
manner, rather than independently as they usually do," he said. "This
makes it hard to control one aspect of the electrons' behavior without
affecting all the others."
One common way to investigate this type of material is to hit it with
laser light to see how its electronic states respond to an injection of
energy. That's what the research team did here. They observed the
material's response with X-ray laser pulses from SLAC's Linac Coherent
Light Source (LCLS).
One melts, the other doesn't
What they expected to see was that orderly patterns of electron spins
and orbitals in the material would be thrown into total disarray, or
"melted," as they absorbed pulses of near-infrared laser light.
But to their surprise, only the spin patterns melted, while the orbital
patterns stayed intact, Turner said. The normal coupling between the
spin and orbital states had been completely broken, he said, which is a
challenging thing to do in this type of correlated material and had not
been observed before.
Tokura said, "Usually only a tiny application of photoexcitation
destroys everything. Here, they were able to keep the electron state
that is most important for future devices -- the orbital state --
undamaged. This is a nice new addition to the science of orbitronics
and correlated electrons."
Much as electron spin states are switched in spintronics, electron
orbital states could be switched to provide a similar function. These
orbitronic devices could, in theory, operate 10,000 times faster than
spintronic devices, Shen said.
Switching between two orbital states could be made possible by using
short bursts of terahertz radiation, rather than the magnetic fields
used today, he said: "Combining the two could achieve much better
device performance for future applications." The team is working on
ways to do that.
__________________________________________________________________
Story Source:
Materials provided by DOE/SLAC National Accelerator Laboratory.
Original written by Glennda Chui. Note: Content may be edited for style
and length.
__________________________________________________________________
Journal Reference:
1. L. Shen, S. A. Mack, G. Dakovski, G. Coslovich, O. Krupin, M.
Hoffmann, S.-W. Huang, Y-D. Chuang, J. A. Johnson, S. Lieu, S.
Zohar, C. Ford, M. Kozina, W. Schlotter, M. P. Minitti, J. Fujioka,
R. Moore, W-S. Lee, Z. Hussain, Y. Tokura, P. Littlewood, J. J.
Turner. Decoupling spin-orbital correlations in a layered manganite
amidst ultrafast hybridized charge-transfer band excitation.
Physical Review B, 2020; 101 (20) DOI:
10.1103/PhysRevB.101.201103
__________________________________________________________________
--- up 16 weeks, 3 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Fri May 15 21:30:12 2020
Combined muscles and sensors made from soft materials allow for flexible
robots
Date:
May 15, 2020
Source:
University of Tokyo
Summary:
Robots can be made from soft materials, but the flexibility of
such robots is limited by the inclusion of rigid sensors
necessary for their control. Researchers created embedded
sensors, to replace rigid sensors, that offer the same
functionality but afford the robot greater flexibility. Soft
robots can be more adaptable and resilient than more traditional
rigid designs. The team used cutting-edge machine learning
techniques to create their design.
FULL STORY
__________________________________________________________________
Robots can be made from soft materials, but the flexibility of such
robots is limited by the inclusion of rigid sensors necessary for their
control. Researchers created embedded sensors, to replace rigid
sensors, that offer the same functionality but afford the robot greater
flexibility. Soft robots can be more adaptable and resilient than more
traditional rigid designs. The team used cutting-edge machine learning
techniques to create their design.
Automation is an increasingly important subject, and core to this
concept are the often paired fields of robotics and machine learning.
The relationship between machine learning and robotics is not just
limited to the behavioral control of robots, but is also important for
their design and core functions. A robot which operates in the real
world needs to understand its environment and itself in order to
navigate and perform tasks.
If the world were entirely predictable, then a robot would be fine
moving around without the need to learn anything new about its
environment. But reality is unpredictable and ever changing, so machine
learning helps robots adapt to unfamiliar situations. Although this is
theoretically true for all robots, it is especially important for
soft-bodied robots as the physical properties of these are
intrinsically less predictable than their rigid counterparts.
"Take for example a robot with pneumatic artificial muscles (PAM),
rubber and fiber-based fluid-driven systems which expand and contract
to move," said Associate Professor Kohei Nakajima from the Graduate
School of Information Science and Technology. "PAMs inherently suffer
random mechanical noise and hysteresis, which is essentially material
stress over time. Accurate laser-based monitors help maintain control
through feedback, but these rigid sensors restrict a robot's movement,
so we came up with something new."
Nakajima and his team thought if they could model a PAM in real time,
then they could maintain good control of it. However, given the
ever-changing nature of PAMs, this is not realistic with traditional
methods of mechanical modeling. So the team turned to a powerful and
established machine learning technique called reservoir computing. This
is where information about a system, in this case the PAM, is fed into
a special artificial neural network in real time, so the model is ever
changing and thus adapts to the environment.
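In its textbook software form -- an echo state network -- reservoir
computing drives the input through a fixed random recurrent network
and trains only a cheap linear readout, which is why the model can be
updated in real time. The sketch below is that conventional software
version, with made-up signals standing in for the PAM's measured
resistance and for the length reading a laser sensor would give; in
the team's setup the carbon-doped rubber itself plays the role of the
reservoir:
    import numpy as np

    rng = np.random.default_rng(1)
    t = np.linspace(0, 60, 3000)
    u = np.sin(0.7 * t) + 0.3 * np.sin(2.1 * t)   # stand-in resistance trace
    target = np.roll(u, 15) ** 2                   # stand-in length reading

    n_res = 300                                    # fixed random reservoir
    W_in = rng.uniform(-0.5, 0.5, size=n_res)
    W = rng.normal(size=(n_res, n_res))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # echo-state property

    states = np.zeros((len(t), n_res))
    x = np.zeros(n_res)
    for i, ui in enumerate(u):
        x = np.tanh(W @ x + W_in * ui)             # reservoir update
        states[i] = x

    warm = 200                                     # discard the transient
    S, y = states[warm:], target[warm:]
    # Only this ridge-regression readout is ever trained.
    W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ y)
    err = np.sqrt(np.mean((S @ W_out - y) ** 2)) / np.std(y)
    print(f"readout NRMSE: {err:.3f}")
Because the reservoir weights never change, "training" is a single
linear solve, cheap enough to repeat as conditions drift.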
"We found the electrical resistance of PAM material changes depending
on its shape during a contraction. So we pass this data to the network
so it can accurately report on the state of the PAM," said Nakajima.
"Ordinary rubber is an insulator, so we incorporated carbon into our
material to more easily read its varying resistance. We found the
system emulated the existing laser-displacement sensor with equally
high accuracy in a range of test conditions."
Thanks to this method, a new generation of soft robotic technology may
be possible. This could include robots that work with humans, for
example wearable rehabilitation devices or biomedical robots, as the
extra soft touch means interactions with them are gentle and safe.
"Our study suggests reservoir computing could be used in applications
besides robotics. Remote-sensing applications, which need real-time
information processed in a decentralized manner, could greatly
benefit," said Nakajima. "And other researchers who study neuromorphic
computing -- intelligent computer systems -- might also be able to
incorporate our ideas into their own work to improve the performance of
their systems."
__________________________________________________________________
Story Source:
Materials provided by University of Tokyo. Note: Content may be
edited for style and length.
__________________________________________________________________
--- up 16 weeks, 3 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Fri May 15 21:30:12 2020
Date:
May 15, 2020
Source:
Kansas State University
Summary:
Researchers developed a computer simulation that revealed beef
supply chain vulnerabilities that need safeguarding -- a
realistic concern during the COVID-19 pandemic.
FULL STORY
__________________________________________________________________
An interdisciplinary team of Kansas State University researchers
developed a computer simulation that revealed beef supply chain
vulnerabilities that need safeguarding -- a realistic concern during
the COVID-19 pandemic.
Caterina Scoglio, professor, and Qihui Yang, doctoral student, both in
electrical and computer engineering, recently published "Developing an
agent-based model to simulate the beef cattle production and
transportation in southwest Kansas" in Physica A, an Elsevier journal
publication.
The paper describes a model of the beef production system and the
transportation industry, which are interdependent critical
infrastructures -- similar to the electrical grid and computer
technology. According to the model, disruptions in the cattle industry
-- especially in the beef packing plants -- will affect the
transportation industry and together cause great economic harm. The
disruptions modeled in the simulation share similarities with how the
packing plants have been affected during the COVID-19 pandemic.
"When we first started working on this project, there was a lot of
emphasis on studying critical infrastructures; especially ones that are
interdependent, meaning that they need to work together with other
critical infrastructures," Scoglio said. "The idea is if there is a
failure in one of the systems, it can propagate to the other system,
increasing the catastrophic effects."
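A cartoon of that propagation: give every packing plant a little spare
capacity, knock one plant out, and reroute its load to the survivors;
if the margin is thin, the rerouted load overloads them in turn. All
numbers below are invented for illustration -- the study's agent-based
model of cattle production and trucking is far richer:
    def cascade(n_plants, first_failure, headroom=0.4):
        """Reroute a failed plant's load evenly; overloaded plants fail too."""
        capacity = [1.0 + headroom] * n_plants
        failed = {first_failure}
        changed = True
        while changed:
            changed = False
            alive = [i for i in range(n_plants) if i not in failed]
            if not alive:
                break
            per_plant_load = n_plants / len(alive)  # total demand is n_plants
            for i in alive:
                if per_plant_load > capacity[i]:
                    failed.add(i)
                    changed = True
        return failed

    for n in (3, 4, 6):
        print(f"{n} plants, 1 failure -> {len(cascade(n, 0))} down in total")
With only three plants and a 40% margin, one failure takes down the
whole layer; with more, smaller plants the same shock is absorbed.
That is the sense in which concentrating capacity in a few packers
makes the chain fragile.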
The study included a variety of viewpoints to create a realistic and
integrated model of both systems. Co-authors on the paper include Don
Gruenbacher, associate professor and department head of electrical and
computer engineering; Jessica Heier Stamm, associate professor of
industrial and manufacturing systems engineering; Gary Brase, professor
of psychological sciences; Scott DeLoach, professor and department head
of computer science; and David Amrine, research director of the Beef
Cattle Institute.
The researchers used the model to evaluate which supply chain
components were more robust and which were not. They determined that
packing plants are the most vulnerable. Scoglio said that recent events
in the middle of the COVID-19 pandemic raise important issues about how
to safeguard the system.
"An important message is that after understanding the critical role of
these packers, we need to decide how we could protect both them and the
people who work there," Scoglio said. "While the plants are a critical
infrastructure and need to be protected, taking care of the health of
the workers is very important. How can we design a production process
that can be flexible and adaptable in an epidemic?"
According to the paper, the beef cattle industry contributes
approximately $8.9 billion to the Kansas economy and employs more than
42,000 people in the state. Since trucks are needed to move cattle, any
disruption in either cattle production or transportation almost
certainly would harm the regional economy, Scoglio said.
"Packers need to be considered as a critical point of a much longer
supply chain, which needs specific attention to make sure it will not
fail and can continue working," Scoglio said. "Beef packers are a
critical infrastructure in the United States."
The project was supported by the National Science Foundation and
focused on southwest Kansas, but the researchers acknowledge that
cattle come from outside the region and interruptions may have larger
national effects.
__________________________________________________________________
Story Source:
Materials provided by Kansas State University. Original written
by Stephanie Jacques. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Qihui Yang, Don Gruenbacher, Jessica L. Heier Stamm, Gary L. Brase,
Scott A. DeLoach, David E. Amrine, Caterina Scoglio. Developing an
agent-based model to simulate the beef cattle production and
transportation in southwest Kansas. Physica A: Statistical
Mechanics and its Applications, 2019; 526: 120856 DOI:
10.1016/j.physa.2019.04.092
__________________________________________________________________
--- up 16 weeks, 3 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Fri May 15 21:30:14 2020
Date:
May 15, 2020
Source:
European Synchrotron Radiation Facility
Summary:
Moisture is the main environmental factor that triggers the
degradation of the masterpiece The Scream (1910) by Edvard
Munch, according to new findings using a combination of in situ
non-invasive spectroscopic methods and synchrotron X-ray
techniques.
FULL STORY
__________________________________________________________________
Moisture is the main environmental factor that triggers the degradation
of the masterpiece The Scream (1910?) by Edvard Munch, according to the
finding of an international team of scientists led by the CNR (Italy),
using a combination of in situ non-invasive spectroscopic methods and
synchrotron X-ray techniques. After exploiting the capability of the
European mobile platform MOLAB in situ and non-invasively at the Munch
Museum in Oslo, the researchers came to the ESRF, the European
Synchrotron (Grenoble, France), the world's brightest X-ray source, to
carry out non-destructive experiments on micro-flakes originating from
one of the most well-known versions of The Scream. The findings could
help better preserve this masterpiece, which is seldom exhibited due to
its degradation. The study is published in Science Advances.
The Scream is among the most famous paintings of the modern era. The
now familiar image is interpreted as the ultimate representation of
anxiety and mental anguish. There are a number of versions of The
Scream, namely two paintings, two pastels, several lithographic prints
and a few drawings and sketches. The two most well-known versions are
the paintings that Edvard Munch created in 1893 and 1910. Each version
of The Scream is unique. Munch clearly experimented to find the exact
colours to represent his personal experience, mixing diverse binding
media (tempera, oil and pastel) with brilliant and bold synthetic
pigments to make 'screaming colours'. Unfortunately, the extensive use
of these new coloured materials poses a challenge for the long-term
preservation of Munch's artworks.
The version of the Scream (1910?) that belongs to the Munch Museum
(Oslo, Norway) clearly exhibits signs of degradation in different areas
where cadmium-sulfide-based pigments have been used: cadmium yellow
brushstrokes have turned to an off-white colour in the sunset cloudy
sky and in the neck area of the central figure. In the lake, a thickly
applied opaque cadmium yellow paint is flaking. Throughout its
existence, several elements have played a role in the deterioration of
the masterpiece: the yellow pigments used, the environmental conditions
and a theft in 2004, when the painting disappeared for two years.
Since the recovery of the painting after the theft, the masterpiece has
rarely been shown to the public. Instead, it is preserved in a
protected storage area in the Munch Museum, in Norway, under controlled
conditions of lighting, temperature (about 18°C) and relative humidity
(about 50%).
An international collaboration, led by the CNR (Italy), with the
University of Perugia (Italy), the University of Antwerp (Belgium), the
Bard Graduate Center in New York City (USA), the European Synchrotron
(ESRF, France), the German Electron Synchrotron (DESY, Hamburg) and the
Munch Museum, has studied in detail the nature of the various
cadmium-sulfide pigments used by Munch, and how these have degraded
over the years.
The findings provide relevant hints about the deterioration mechanism
of cadmium-sulfide-based paints, with significant implication for the
preventive conservation of The Scream.
"The synchrotron micro-analyses allowed us to pinpoint the main reason
that made the painting decline, which is moisture. We also found that
the impact of light in the paint is minor. I am very pleased that our
study could contribute to preserve this famous masterpiece," explains
Letizia Monico, one of the corresponding authors of the study.
Hitting the right formula for preservation
Monico and her colleagues studied selected cadmium-sulfide-based areas
of The Scream (1910?), as well as a corresponding micro-sample, using a
series of non-invasive in-situ spectroscopic analyses with portable
equipment of the European MOLAB platform in combination with the
techniques of micro X-ray diffraction, X-ray micro fluorescence and
micro X-ray absorption near edge structure spectroscopy mainly at the
ESRF, the European Synchrotron, in France, the world's most powerful
synchrotron. The study of the painting was integrated with
investigations of artificially aged mock-ups. The latter were prepared
using a historical cadmium yellow pigment powder and a cadmium yellow
oil paint tube that belonged to Munch. Both mock-ups had a similar
composition to the lake in the painting. "Our goal was to compare the
data from all these different pigments, in order to extrapolate the
causes that can lead to deterioration," says Monico.
The study shows that the original cadmium sulfide turns into cadmium
sulfate in the presence of chloride-compounds in high-moisture
conditions (relative humidity, or RH, ≥95%). This happens even if there
is no light.
"The right formula to preserve and display the main version of The
Scream on a permanent basis should include the mitigation of the
degradation of the cadmium yellow pigment by minimising the exposure of
the painting to excessively high moisture levels (trying to reach 45%
RH or lower), while keeping the lighting at standard values foreseen
for lightfast painting materials. The results of this study provide new
knowledge, which may lead to practical adjustments to the Museum's
conservation strategy," explains Irina C. A. Sandu, conservation
scientist at the Munch Museum.
"Today the Munch Museum stores and exhibits Edvard Munch's artworks at
a relative humidity of about 50% and at a temperature of around 20 °C.
These environmental conditions will also apply to the new Munch Museum
to be opened in Spring 2020. That said, the Museum will now look into
how this study may affect the current regime. Part of such a review
will be to consider how other materials in the collection will respond
to possible adjustments," adds Eva Storevik Tveit, paintings
conservator at the Munch Museum.
Cadmium-sulfide-based yellows are not only present in Munch's artwork
but also in the work of other artists contemporary to him, such as
Henri Matisse, Vincent van Gogh and James Ensor.
"The integration of non-invasive in-situ investigations at the
macro-scale level with synchrotron micro-analyses proved its worth in
helping us to understand complex alteration processes. It can be
profitably exploited for interrogating masterpieces that could suffer
from the same weakness," reports Costanza Miliani, coordinator of the
mobile platform MOLAB (operating in Europe under the IPERION CH
project) and second corresponding author of this study.
Monico and colleagues, especially Koen Janssens (University of
Antwerp), have a long-standing collaboration with the ESRF, the
European Synchrotron, and in particular with the scientist Marine
Cotte, to investigate these pigments and the best way to preserve the
original masterpieces.
"At the ESRF, ID21 is one of the very few beamlines in the world where
we can perform imaging X-ray absorption and fluorescence spectroscopy
analysis of the entire sample, at low energy and with sub-micrometer
spatial resolution," explains Janssens.
"EBS, the new Extremely Brilliant Source, the first-of-a-kind
high-energy synchrotron, which is under commissioning at the moment at
the ESRF, will further improve the capabilities of our instruments for
the benefit of world heritage science. We will be able to perform
microanalyses with increased sensitivity, and a greater level of
detail. Considering the complexity of these artistic materials, such
instrumental developments will highly benefit the analysis of our
cultural heritage," adds Cotte, ESRF scientist and CNRS researcher
director.
"This kind of work shows that art and science are intrinsically linked
and that science can help preserve pieces of art so that the world can
continue admiring them for years to come," concludes Miliani,
coordinator of MOLAB.
__________________________________________________________________
Story Source:
Materials provided by European Synchrotron Radiation Facility.
Original written by Montserrat Capellas Espuny. Note: Content may be
edited for style and length.
__________________________________________________________________
Journal Reference:
1. Letizia Monico, Laura Cartechini, Francesca Rosi, Annalisa Chieli,
Chiara Grazia, Steven De Meyer, Gert Nuyts, Frederik Vanmeert, Koen
Janssens, Marine Cotte, Wout De Nolf, Gerald Falkenberg, Irina
Crina Anca Sandu, Eva Storevik Tveit, Jennifer Mass, Renato Pereira
De Freitas, Aldo Romani, and Costanza Miliani. Probing the
chemistry of CdS paints in The Scream by in situ noninvasive
spectroscopies and synchrotron radiation x-ray techniques. Science
Advances, 2020 DOI: 10.1126/sciadv.aay3514
__________________________________________________________________
--- up 16 weeks, 3 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Fri May 15 21:30:14 2020
Date:
May 15, 2020
Source:
University of Exeter
Summary:
Scientists have pioneered a new technique to expose hidden
biochemical pathways involving single molecules at the
nanoscale.
FULL STORY
__________________________________________________________________
Scientists have pioneered a new technique to expose hidden biochemical
pathways involving single molecules at the nanoscale.
A team of researchers from the University of Exeter's Living Systems
Institute used light to establish a means to monitor the structure and
properties of individual molecules in real time.
This innovative approach has allowed the team to temporarily bridge
molecules together to provide a crucial lens into their dynamics.
The study is published in the leading journal Nature Communications.
The structure of individual molecules and their properties, such as
chirality, are difficult to probe.
In the new study, led by Professor Frank Vollmer, the group was able to
observe reactions at the nanoscale which would otherwise be
inaccessible.
Thiol/disulfide exchange -- or the principal way disulfide bonds are
formed and rearranged in a protein -- has not yet been fully
scrutinised at equilibrium at the single-molecule level, in part
because this cannot be optically resolved in bulk samples.
However, light can circulate around micron-sized glass spheres to form
resonances. The trapped light can then repeatedly interact with its
surrounding environment. By attaching gold nanoparticles to the sphere,
light is enhanced and spatially confined down to the size of viruses
and amino acids.
The resulting optoplasmonic coupling allows for the detection of
biomolecules that approach the nanoparticles while they attach to the
gold, detach, and interact in a variety of ways.
Despite its sensitivity, the technique lacks specificity. Molecules as
simple as atomic ions can be detected and certain dynamics can be
discerned, yet the detected species cannot necessarily be told apart.
Serge Vincent remarks: "It took some time before we could narrow down
how to reliably sample individual molecules. Forward and backward
reaction rates at equilibrium are counterbalanced and, to certain
extent, we sought to lift the veil over these subtle dynamics."
Reaction pathways regulated by disulfide bonds can constrain
interactions to single thiol sensing sites on the nanoparticles. The
high fidelity of this approach establishes precise probing of the
characteristics of molecules undergoing the reaction.
By placing linkers on the gold surface, interactions with thiolated
species are isolated based on their charge and the cycling itself.
Sensor signals have clear patterns related to whether a reducing agent is
present. If it is, the signal oscillates in a controlled way, while if
it is not, the oscillations become stochastic.
For each reaction the monomer or dimer state of the leaving group can
be resolved.
Surprisingly, the optoplasmonic resonance shifts in frequency and/or
changes in linewidth when single molecules interact with it. In many
cases this result suggests a plasmon-vibrational coupling that could
help identify individual molecules, finally achieving characterisation.
__________________________________________________________________
Story Source:
Materials provided by University of Exeter. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Serge Vincent, Sivaraman Subramanian, Frank Vollmer. Optoplasmonic
characterisation of reversible disulfide interactions at single
thiol sites in the attomolar regime. Nature Communications, 2020;
11 (1) DOI: 10.1038/s41467-020-15822-8
__________________________________________________________________
--- up 16 weeks, 3 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 18 21:30:10 2020
Date:
May 18, 2020
Source:
Columbia University
Summary:
Humans have been wondering whether we are alone in the universe
since antiquity. We know from the geological record that life
started relatively quickly, as soon as our planet's environment was
stable enough to support it. We also know that the first
multicellular organism, which eventually produced today's
technological civilization, took far longer to evolve,
approximately 4 billion years.
FULL STORY
__________________________________________________________________
Humans have been wondering whether we are alone in the universe since
antiquity.
We know from the geological record that life started relatively
quickly, as soon as our planet's environment was stable enough to support
it. We also know that the first multicellular organism, which
eventually produced today's technological civilization, took far longer
to evolve, approximately 4 billion years.
But despite knowing when life first appeared on Earth, scientists still
do not understand how life occurred, which has important implications
for the likelihood of finding life elsewhere in the universe.
In a new paper published in the Proceedings of the National Academy of
Sciences today, David Kipping, an assistant professor in Columbia's
Department of Astronomy, shows how an analysis using a statistical
technique called Bayesian inference could shed light on how complex
extraterrestrial life might evolve in alien worlds.
"The rapid emergence of life and the late evolution of humanity, in the
context of the timeline of evolution, are certainly suggestive,"
Kipping said. "But in this study it's possible to actually quantify
what the facts tell us."
To conduct his analysis, Kipping used the chronology of the earliest
evidence for life and the evolution of humanity. He asked how often we
would expect life and intelligence to re-emerge if Earth's history were
to repeat, re-running the clock over and over again.
He framed the problem in terms of four possible answers: Life is common
and often develops intelligence, life is rare but often develops
intelligence, life is common and rarely develops intelligence and,
finally, life is rare and rarely develops intelligence.
This method of Bayesian statistical inference -- used to update the
probability for a hypothesis as evidence or information becomes
available -- states prior beliefs about the system being modeled, which
are then combined with data to cast probabilities of outcomes.
"The technique is akin to betting odds," Kipping said. "It encourages
the repeated testing of new evidence against your position, in essence
a positive feedback loop of refining your estimates of likelihood of an
event."
From these four hypotheses, Kipping used Bayesian mathematical formulas
to weigh the models against one another. "In Bayesian inference, prior
probability distributions always need to be selected," Kipping said.
"But a key result here is that when one compares the rare-life versus
common-life scenarios, the common-life scenario is always at least nine
times more likely than the rare one."
The analysis is based on evidence that life emerged within 300 million
years of the formation of the Earth's oceans as found in
carbon-13-depleted zircon deposits, a very fast start in the context of
Earth's lifetime. Kipping emphasizes that the ratio is at least 9:1 or
higher, depending on the true value of how often intelligence develops.
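A toy version of that comparison fits in a few lines: model
abiogenesis as a Poisson process and compare how probable such an
early start is under a "common-life" rate versus a "rare-life" rate;
the ratio of those probabilities is the Bayes factor. The two point
hypotheses below are invented for illustration -- Kipping's objective
Bayesian analysis integrates over prior distributions rather than
picking two rates -- but the qualitative lesson is the same:
    import math

    T_LIFE = 0.3   # Gyr: life appeared within ~300 Myr of the oceans forming

    def p_life_by(t, lam):
        """P(at least one abiogenesis event by time t | rate lam per Gyr)."""
        return 1.0 - math.exp(-lam * t)

    lam_common, lam_rare = 10.0, 0.01   # illustrative point hypotheses
    bf = p_life_by(T_LIFE, lam_common) / p_life_by(T_LIFE, lam_rare)
    print(f"common-life vs rare-life Bayes factor: {bf:.0f}")
An early start is nearly certain under a fast rate but very unlikely
under a slow one, which is why the data tilt the odds toward life
being common.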
Kipping's conclusion is that if planets with similar conditions and
evolutionary time lines to Earth are common, then the analysis suggests
that life should have little problem spontaneously emerging on other
planets. And what are the odds that these extraterrestrial lives could
be complex, differentiated and intelligent? Here, Kipping's inquiry is
less assured, finding just 3:2 odds in favor of intelligent life.
This result stems from humanity's relatively late appearance in Earth's
habitable window, suggesting that its development was neither an easy
nor ensured process. "If we played Earth's history again, the emergence
of intelligence is actually somewhat unlikely," he said.
Kipping points out that the odds in the study aren't overwhelming,
being quite close to 50:50, and the findings should be treated as no
more than a gentle nudge toward a hypothesis.
"The analysis can't provide certainties or guarantees, only statistical
probabilities based on what happened here on Earth," Kipping said. "Yet
encouragingly, the case for a universe teeming with life emerges as the
favored bet. The search for intelligent life in worlds beyond Earth
should be by no means discouraged."
__________________________________________________________________
Story Source:
Materials provided by Columbia University. Original written by
Carla Cantor. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. David Kipping. An objective Bayesian analysis of life’s early start
and our late arrival. PNAS, 2020 DOI: 10.1073/pnas.1921655117
__________________________________________________________________
--- up 16 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 18 21:30:10 2020
Date:
May 18, 2020
Source:
Cornell University
Summary:
After examining a dozen types of suns and a roster of planet
surfaces, astronomers have developed a practical model -- an
environmental color "decoder" -- to tease out climate clues for
potentially habitable exoplanets in distant solar systems.
FULL STORY
__________________________________________________________________
After examining a dozen types of suns and a roster of planet surfaces,
Cornell University astronomers have developed a practical model -- an
environmental color "decoder" -- to tease out climate clues for
potentially habitable exoplanets in distant solar systems.
"We looked at how different planetary surfaces in the habitable zones
of distant solar systems could affect the climate on exoplanets," said
Jack Madden, who works in the lab of Lisa Kaltenegger, associate
professor of astronomy and director of Cornell's Carl Sagan Institute.
"Reflected light on the surface of planets plays a significant role not
only on the overall climate," Madden said, "but also on the detectable
spectra of Earth-like planets."
Madden and Kaltenegger are co-authors of "How Surfaces Shape the
Climate of Habitable Exoplanets," released May 18 in the Monthly
Notices of the Royal Astronomical Society.
In their research, they combine details of a planet's surface color and
the light from its host star to calculate a climate. For instance, a
rocky, black basalt planet absorbs light well and would be very hot, but
add sand or clouds and the planet cools. A planet covered in vegetation
and circling a reddish K-star will likely have cool temperatures because
of how those surfaces reflect their sun's light.
"Think about wearing a dark shirt on a hot summer day. You're going to
heat up more, because the dark shirt is not reflecting light. It has a
low albedo (it absorbs light) and it retains heat," Madden said. "If
you wear a light color, such as white, its high albedo reflects the
light -- and your shirt keeps you cool.
It's the same with stars and planets, Kaltenegger said.
"Depending on the kind of star and the exoplanet's primary color -- or
the reflecting albedo -- the planet's color can mitigate some of the
energy given off by the star," Kaltenegger said. "What makes up the
surface of an exoplanet, how many clouds surround the planet, and the
color of the sun can change an exoplanet's climate significantly."
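The shirt analogy maps onto a standard radiative-balance formula. The
sketch below uses that textbook formula, not the authors' full climate
model; the stellar flux and the representative albedo values are
illustrative choices:

    # Equilibrium temperature from absorbed stellar flux and Bond
    # albedo: T_eq = [F (1 - A) / (4 sigma)]^(1/4). Textbook radiative
    # balance; albedo values below are representative, not measured.
    SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
    FLUX = 1361.0      # W m^-2, Earth's solar constant as a reference

    def equilibrium_temp(flux, albedo):
        return (flux * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25

    for surface, albedo in [("black basalt", 0.05), ("sand", 0.30),
                            ("fresh snow", 0.80)]:
        print(f"{surface:12s} A={albedo:.2f} -> "
              f"T_eq = {equilibrium_temp(FLUX, albedo):.0f} K")

At Earth-like flux, the dark basalt surface comes out roughly 90
kelvin warmer than the snow-covered one, which is the effect the
decoder folds into its climate estimates.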
Madden said forthcoming instruments like the Earth-bound Extremely
Large Telescope will allow scientists to gather data in order to test a
catalog of climate predictions.
"There's an important interaction between the color of a surface and
the light hitting it," he said. "The effects we found based on a
planet's surface properties can help in the search for life."
__________________________________________________________________
Story Source:
Materials provided by Cornell University. Original written by
Blaine Friedlander. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Lisa Kaltenegger, Jack Madden. How surfaces shape the climate of
   habitable exoplanets. Monthly Notices of the Royal Astronomical
   Society, 2020; 495 (1): 1; DOI: 10.1093/mnras/staa387
__________________________________________________________________
--- up 16 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 18 21:30:10 2020
Date:
May 18, 2020
Source:
Heinrich-Heine University Duesseldorf
Summary:
Although most of the matter in the universe is dark matter, very
little is known about it. Physicists have used a high-precision
experiment to look for interaction between dark matter and
normal matter.
FULL STORY
__________________________________________________________________
Although most of the matter in the universe is dark matter, very little
is known about it. Physicists have used a high-precision experiment to
look for interaction between dark matter and normal matter.
The universe mainly consists of a novel substance and an energy form
that are not yet understood. This 'dark matter' and 'dark energy' are
not directly visible to the naked eye or through telescopes.
Astronomers can only provide proof of their existence indirectly, based
on the shape of galaxies and the dynamics of the universe. Dark matter
interacts with normal matter via the gravitational force, which also
determines the cosmic structures of normal, visible matter.
It is not yet known whether dark matter also interacts with itself or
with normal matter via the other three fundamental forces -- the
electromagnetic force, the weak and the strong nuclear force -- or some
additional force. Even very sophisticated experiments have so far not
been able to detect any such interaction. This means that if it does
exist at all, it must be very weak.
In order to shed more light on this topic, scientists around the globe
are carrying out various new experiments in which the action of the
non-gravitational fundamental forces takes place with as little outside
interference as possible and the action is then precisely measured. Any
deviations from the expected effects may indicate the influence of dark
matter or dark energy. Some of these experiments are being carried out
using huge research machines such as those housed at CERN, the European
Organization for Nuclear Research in Geneva. But laboratory-scale
experiments, for example in Düsseldorf, are also feasible, if designed
for maximum precision.
The team working under guidance of Prof. Stephan Schiller from the
Institute of Experimental Physics at HHU has presented the findings of
a precision experiment to measure the electrical force between the
proton ("p") and the deuteron ("d") in the journal Nature. The proton
is the nucleus of the hydrogen atom (H), the heavier deuteron is the
nucleus of deuterium (D) and consists of a proton and a neutron bound
together.
The Düsseldorf physicists study an unusual object, HD+, the ion of the
partially deuterated hydrogen molecule. One of the two electrons
normally contained in the electron shell is missing in this ion. Thus,
HD+ consists of a proton and deuteron bound together by just one
electron, which compensates for the repulsive electrical force between
them.
This results in a particular distance between the proton and the
deuteron, referred to as the 'bond length'. In order to determine this
distance, the HHU physicists have measured the rotation rate of the
molecule to eleven digits of precision using a spectroscopy technique
they recently developed. The researchers used concepts that are also
relevant in the field of quantum technology, such as particle traps and
laser cooling.
It is extremely complicated to derive the bond length from the
spectroscopy results, and thus to deduce the strength of the force
exerted between the proton and the deuteron. This is because this force
has quantum properties. The theory of quantum electrodynamics (QED)
proposed in the 1940s must be used here. A member of the author team
spent two decades advancing the complex calculations and was recently
able to predict the bond length with sufficient precision.
This prediction corresponds to the measurement result. From the
agreement one can deduce the maximum strength of a modification of the
force between a proton and a deuteron caused by dark matter. Prof.
Schiller comments: "My team has now pushed down this upper limit more
than 20-fold. We have demonstrated that dark matter interacts much less
with normal matter than was previously considered possible. This
mysterious form of matter continues to remain undercover, at least in
the lab!"
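The logic of turning that agreement into an upper limit is simple to
sketch. All numbers below are hypothetical placeholders standing in
for the paper's measured and predicted values; only the structure of
the argument is shown:

    # Sketch: agreement between a measured and a QED-predicted
    # transition frequency caps the size of any extra proton-deuteron
    # force, since an unseen interaction could shift the line by at
    # most the discrepancy plus the combined uncertainty.
    # All values here are hypothetical placeholders.
    f_measured = 1.000000000000e12    # Hz (placeholder)
    f_predicted = 1.000000000050e12   # Hz (placeholder)
    sigma = 2.0e1                     # Hz, combined uncertainty (placeholder)

    max_shift = abs(f_measured - f_predicted) + sigma
    print(f"new-physics shift < {max_shift:.1e} Hz "
          f"= {max_shift / f_measured:.1e} of the frequency")

Tightening either the measurement or the theory shrinks that bound,
which is how the team pushed the previous limit down more than
20-fold.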
__________________________________________________________________
Story Source:
Materials provided by Heinrich-Heine University Duesseldorf.
Original written by Arne Claussen and editorial staff. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. S. Alighanbari, G. S. Giri, F. L. Constantin, V. I. Korobov, S.
   Schiller. Precise test of quantum electrodynamics and determination
   of fundamental constants with HD+ ions. Nature, 2020; 581 (7807):
   152; DOI: 10.1038/s41586-020-2261-5
__________________________________________________________________
--- up 16 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 18 21:30:12 2020
2D order protects several entangled states that could be used in quantum computing
Date:
May 18, 2020
Source:
Rice University
Summary:
Physicists have found surprising evidence of a link between the
2D quantum Hall effect and 3D topological materials that could
be used in quantum computing.
FULL STORY
__________________________________________________________________
U.S. and German physicists have found surprising evidence that one of
the most famous phenomena in modern physics -- the quantum Hall effect
-- is "reincarnated" in topological superconductors that could be used
to build fault-tolerant quantum computers.
The 1980 discovery of the quantum Hall effect kicked off the study of
topological orders, electronic states with "protected" patterns of
long-range quantum entanglement that are remarkably robust. The
stability of these protected states is extremely attractive for quantum
computing, which uses quantum entanglement to store and process
information.
In a study published online this month in Physical Review X (PRX),
theoretical physicists from Rice University, the University of
California, Berkeley (UC Berkeley), and the Karlsruhe Institute of
Technology (KIT) in Karlsruhe, Germany, presented strong numerical
evidence for a surprising link between 2D and 3D phases of topological
matter. The quantum Hall effect was discovered in 2D materials, and
laboratories worldwide are in a race to make 3D topological
superconductors for quantum computing.
"In this work we've shown that a particular class of 3D topological
superconductors should exhibit 'energy stacks' of 2D electronic states
at their surfaces," said Rice co-author Matthew Foster, an associate
professor of physics and astronomy and member of the Rice Center for
Quantum Materials (RCQM). "Each of these stacked states is a robust
'reincarnation' of a single, very special state that occurs in the 2D
quantum Hall effect."
The quantum Hall effect was first measured in two-dimensional
materials. Foster uses a "percolation" analogy to help visualize the
strange similarities between what occurs in 2D quantum Hall experiments
and the study's 3D computational models.
"Picture a sheet of paper with a map of rugged peaks and valleys, and
then imagine what happens as you fill that landscape with water," he
said. "The water is our electrons, and when the level of fluid is low,
you just have isolated lakes of electrons. The lakes are disconnected
from one another, and the electrons can't conduct across the bulk. If
water level is high, you have isolated islands, and in this case the
islands are like the electrons, and you also don't get bulk
conduction."
In Foster's analogy the rugged landscape is the electric potential of
the 2D material, and the level of ruggedness corresponds to the amount
of impurities in the system. The water level represents the "Fermi
energy," a concept in physics that refers to the filling level of
electrons in a system. The edges of the paper map are analogous to the
1D edges that surround the 2D material.
"If you add water and tune the fluid level precisely to the point where
you have little bridges of water connecting the lakes and little
bridges of land connecting the islands, then it's as easy to travel by
water or land," Foster said. "That is the percolation threshold, which
corresponds to the transition between topological states in quantum
Hall. This is the special 2D state in quantum Hall.
"If you increase the fluid level more, now the electrons are trapped in
isolated islands, and you'd think, 'Well, I have the same situation I
had before, with no conduction.' But, at the special transition, one of
the electronic states has peeled away to the edge. Adding more fluid
doesn't remove the edge state, which can go around the whole sample,
and nothing can stop it."
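Foster's lakes-and-islands picture is easy to play with numerically.
Below is a minimal sketch, assuming a random landscape and
side-to-side flooding as the stand-in for "conduction"; scipy's label
routine finds the connected puddles:

    # Minimal site-percolation sketch of the flooding analogy: raise
    # the "water level" on random terrain and find where the flooded
    # sites first connect the left edge to the right edge.
    # Purely illustrative; not the authors' calculation.
    import numpy as np
    from scipy.ndimage import label

    rng = np.random.default_rng(1)
    terrain = rng.random((200, 200))   # random "peak and valley" heights

    def spans(level):
        """True if flooded sites (terrain < level) connect left to right."""
        flooded, _ = label(terrain < level)
        return bool((set(flooded[:, 0]) & set(flooded[:, -1])) - {0})

    threshold = next(l for l in np.linspace(0, 1, 101) if spans(l))
    print(f"water first percolates near level {threshold:.2f}")  # ~0.59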
The analogy describes the relationship between robust edge conduction
and bulk fine-tuning through the special transition in the quantum Hall
effect. In the PRX study, Foster and co-authors Björn Sbierski of UC
Berkeley and Jonas Karcher of KIT studied 3D topological systems that
are similar to the 2D landscapes in the analogy.
"The interesting stuff in these 3D systems is also only happening at
the boundary," Foster said. "But now our boundaries aren't 1D edge
states, they are 2D surfaces."
Using "brute-force numerical calculations of the surface states,"
Sbierski, Karcher and Foster found a link between the critical 2D
quantum Hall state and the 3D systems. Like the 1D edge state that
persists above the transition energy in 2D quantum Hall materials, the
calculations revealed a persistent 2D boundary state in the 3D systems.
And not just any 2D state; it is exactly the same 2D percolation state
that gives rise to 1D quantum Hall edge states.
"What was a fine-tuned topological quantum phase transition in 2D has
been 'reincarnated' as the generic surface state for a higher
dimensional bulk," Foster said. "In 2018 study, my group identified an
analogous connection between a different, more exotic type of 2D
quantum Hall effect and the surface states of another class of 3D
topological superconductors. With this new evidence, we are now
confident there is a deep topological reason for these connections, but
at the moment the mathematics remain obscure."
Topological superconductors have yet to be realized experimentally, but
physicists are trying to create them by adding impurities to
topological insulators. This process, known as doping, has been widely
used to make other types of unconventional superconductors from bulk
insulators.
"We now have evidence that three of the five 3D topological phases are
tied to 2D phases that are versions of the quantum Hall effect, and all
three 3D phases could be realized in 'topological superconductors,'"
Foster said.
Foster said conventional wisdom in condensed matter physics has been
that topological superconductors would each host only one protected 2D
surface state and all other states would be adversely affected by
unavoidable imperfections in the solid-state materials used to make the
superconductors.
But Sbierski, Karcher and Foster's calculations suggest that isn't the
case.
"In quantum Hall, you can tune anywhere and still get this robust
plateau in conductance, due to the 1D edge states," Foster said. "Our
work suggests that is also the case in 3D. We see stacks of critical
states at different energy levels, and all of them are protected by
this strange reincarnation of the 2D quantum Hall transition state."
The authors also set the stage for experimental work to verify their
findings, working out details of how the surface states of the 3D
phases should appear in various experimental probes.
"We provide precise statistical 'fingerprints' for the surface states
of the topological phases," Foster said. "The actual wave functions are
random, due to disorder, but their distributions are universal and
match the quantum Hall transition."
The research was supported by a National Science Foundation CAREER
grant (1552327), the German National Academy of Sciences Leopoldina
(LPDS 2018-12), a KIT research travel grant, German state graduate
funding and the UC Berkeley Library's Berkeley Research Impact
Initiative.
__________________________________________________________________
Story Source:
Materials provided by Rice University. Note: Content may be
edited for style and length.
__________________________________________________________________
Journal Reference:
1. Björn Sbierski, Jonas F. Karcher, Matthew S. Foster. Spectrum-Wide
   Quantum Criticality at the Surface of Class AIII Topological
   Phases: An “Energy Stack” of Integer Quantum Hall Plateau
   Transitions. Physical Review X, 2020; 10 (2); DOI:
   10.1103/PhysRevX.10.021025
__________________________________________________________________
--- up 16 weeks, 6 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 18 21:30:12 2020
Novel training method could shrink carbon footprint for greener deep learning
Date:
May 18, 2020
Source:
Rice University
Summary:
Engineers have found a way to train deep neural networks for a
fraction of the energy required today. Their Early Bird method
finds key network connectivity patterns early in training,
reducing the computations and carbon footprint for training deep
learning.
FULL STORY
__________________________________________________________________
Rice University's Early Bird couldn't care less about the worm; it's
looking for megatons of greenhouse gas emissions.
Early Bird is an energy-efficient method for training deep neural
networks (DNNs), the form of artificial intelligence (AI) behind
self-driving cars, intelligent assistants, facial recognition and
dozens more high-tech applications.
Researchers from Rice and Texas A&M University unveiled Early Bird
April 29 in a spotlight paper at ICLR 2020, the International
Conference on Learning Representations. A study by lead authors Haoran
You and Chaojian Li of Rice's Efficient and Intelligent Computing (EIC)
Lab showed Early Bird could use 10.7 times less energy to train a DNN
to the same level of accuracy or better than typical training. EIC Lab
director Yingyan Lin led the research along with Rice's Richard
Baraniuk and Texas A&M's Zhangyang Wang.
"A major driving force in recent AI breakthroughs is the introduction
of bigger, more expensive DNNs," Lin said. "But training these DNNs
demands considerable energy. For more innovations to be unveiled, it is
imperative to find 'greener' training methods that both address
environmental concerns and reduce financial barriers of AI research."
Training cutting-edge DNNs is costly and getting costlier. A 2019 study
by the Allen Institute for AI in Seattle found the number of
computations needed to train a top-flight deep neural network increased
300,000-fold between 2012 and 2018, and a different 2019 study by
researchers at the University of Massachusetts Amherst found the carbon
footprint for training a single, elite DNN was roughly equivalent to
the lifetime carbon dioxide emissions of five U.S. automobiles.
DNNs contain millions or even billions of artificial neurons that learn
to perform specialized tasks. Without any explicit programming, deep
networks of artificial neurons can learn to make humanlike decisions --
and even outperform human experts -- by "studying" a large number of
previous examples. For instance, if a DNN studies photographs of cats
and dogs, it learns to recognize cats and dogs. AlphaGo, a deep network
trained to play the board game Go, beat a professional human player in
2015 after studying tens of thousands of previously played games.
"The state-of-art way to perform DNN training is called progressive
prune and train," said Lin, an assistant professor of electrical and
computer engineering in Rice's Brown School of Engineering. "First, you
train a dense, giant network, then remove parts that don't look
important -- like pruning a tree. Then you retrain the pruned network
to restore performance because performance degrades after pruning. And
in practice you need to prune and retrain many times to get good
performance."
Pruning is possible because only a fraction of the artificial neurons
in the network can potentially do the job for a specialized task.
Training strengthens connections between necessary neurons and reveals
which ones can be pruned away. Pruning reduces model size and
computational cost, making it more affordable to deploy fully trained
DNNs, especially on small devices with limited memory and processing
capability.
"The first step, training the dense, giant network, is the most
expensive," Lin said. "Our idea in this work is to identify the final,
fully functional pruned network, which we call the 'early-bird ticket,'
in the beginning stage of this costly first step."
By looking for key network connectivity patterns early in training, Lin
and colleagues were able to both discover the existence of early-bird
tickets and use them to streamline DNN training. In experiments on
various benchmarking data sets and DNN models, Lin and colleagues found
that early-bird tickets could emerge just one-tenth of the way, or
less, through the initial phase of training.
"Our method can automatically identify early-bird tickets within the
first 10% or less of the training of the dense, giant networks," Lin
said. "This means you can train a DNN to achieve the same or even
better accuracy for a given task in about 10% or less of the time
needed for traditional training, which can lead to more than one order
savings in both computation and energy."
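The "connectivity patterns" can be made concrete with a small sketch.
The code below illustrates the idea as described -- draw a
magnitude-based pruning mask each epoch and stop the expensive dense
training once the mask stops changing -- but it is a conceptual
sketch, not the authors' exact implementation; train_one_epoch and
model are hypothetical placeholders:

    # Conceptual early-bird sketch: keep the largest-magnitude weights
    # as a mask each epoch; when the mask barely changes between
    # epochs, the key connectivity pattern has emerged and dense
    # training can stop early.
    import numpy as np

    def pruning_mask(weights, keep_ratio=0.3):
        """Boolean mask keeping the largest-magnitude fraction of weights."""
        cutoff = np.quantile(np.abs(weights), 1.0 - keep_ratio)
        return np.abs(weights) >= cutoff

    def is_early_bird(prev_mask, mask, tolerance=0.01):
        """Ticket found when the fraction of changed entries is tiny."""
        return np.mean(prev_mask != mask) < tolerance

    # Usage inside a training loop (train_one_epoch and model are
    # hypothetical placeholders for a real training step):
    #
    # prev = None
    # for epoch in range(max_epochs):
    #     weights = train_one_epoch(model)
    #     mask = pruning_mask(weights)
    #     if prev is not None and is_early_bird(prev, mask):
    #         break   # prune now, then fine-tune the small network
    #     prev = mask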
Developing techniques to make AI greener is the main focus of Lin's
group. Environmental concerns are the primary motivation, but Lin said
there are multiple benefits.
"Our goal is to make AI both more environmentally friendly and more
inclusive," she said. "The sheer size of complex AI problems has kept
out smaller players. Green AI can open the door enabling researchers
with a laptop or limited computational resources to explore AI
innovations."
Additional co-authors include Pengfei Xu, Yonggan Fu and Yue Wang, all
of Rice, and Xiaohan Chen of Texas A&M.
The research was supported by the National Science Foundation (1937592,
1937588).
__________________________________________________________________
Story Source:
Materials provided by Rice University. Note: Content may be
edited for style and length.
__________________________________________________________________
--- up 16 weeks, 6 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 18 21:30:12 2020
Date:
May 18, 2020
Source:
University of Pennsylvania
Summary:
To break through a looming bandwidth bottleneck, engineers are
exploring some of light's harder-to-control properties. Now, two
new studies have shown a system that can manipulate and detect
one such property: orbital angular momentum. Critically, they
are the first to do so on small semiconductor chips and with
enough precision that it can be used as a medium for
transmitting information.
FULL STORY
__________________________________________________________________
As computers get more powerful and connected, the amount of data that
we send and receive is in a constant race with the technologies that we
use to transmit it. Electrons are now proving insufficiently fast and
are being replaced by photons as the demand for fiber optic internet
cabling and data centers grows.
Though light is much faster than electricity, in modern optical
systems, more information is transmitted by layering data into multiple
aspects of a light wave, such as its amplitude, wavelength and
polarization. Increasingly sophisticated "multiplexing" techniques like
these are the only way to stay ahead of the increasing demand for data,
but those too are approaching a bottleneck. We are simply running out
of room to store more data in the conventional properties of light.
To break through this barrier, engineers are exploring some of light's
harder-to-control properties. Now, two studies from the University of
Pennsylvania's School of Engineering and Applied Science have shown a
system that can manipulate and detect one such property known as the
orbital angular momentum, or OAM, of light. Critically, they are the
first to do so on small semiconductor chips and with enough precision
that it can be used as a medium for transmitting information.
The matched pair of studies, published in the journal Science, was done
in collaboration with researchers at Duke University, Northeastern
University, the Polytechnic University of Milan, Hunan University and
the U.S. National Institute of Standards and Technology.
One study, led by Liang Feng, assistant professor in the departments of
Materials Science and Engineering and Electrical and Systems
Engineering, demonstrates a microlaser which can be dynamically tuned
to multiple distinct OAM modes. The other, led by Ritesh Agarwal,
professor in the Department of Materials Science and Engineering, shows
how a laser's OAM mode can be measured by a chip-based detector. Both
studies involve collaborations between the Agarwal and Feng groups at
Penn.
Such "vortex" lasers, named for the way their light spirals around
their axis of travel, were first demonstrated by Feng with quantum
symmetry-driven designs in 2016. However, Feng and other researchers in
the field have thus far been limited to transmitting a single, pre-set
OAM mode, making them impractical for encoding more information. On the
receiving end, existing detectors have relied on complex filtering
techniques using bulky components that have prevented them from being
integrated directly onto a chip, and are thus incompatible with most
practical optical communications approaches.
Together, this new tunable vortex micro-transceiver and receiver
represent the two most critical components of a system that can enable
a way of multiplying the information density of optical communication,
potentially shattering that looming bandwidth bottleneck.
The ability to dynamically tune OAM values would also enable a photonic
update to a classic encryption technique: frequency hopping. By rapidly
switching between OAM modes in a pre-defined sequence known only to the
sender and receiver, optical communications could be made impossible to
intercept.
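A minimal sketch of that hopping scheme, assuming the five modes the
laser supports and a shared seed standing in for the secret sequence:

    # OAM "frequency hopping" sketch: sender and receiver share a
    # seed, so both generate the same pseudo-random sequence of OAM
    # modes. Purely conceptual; an eavesdropper without the seed
    # cannot follow the hops.
    import random

    MODES = [-2, -1, 0, 1, 2]   # five distinct OAM modes, as in the laser

    def mode_sequence(shared_seed, length):
        rng = random.Random(shared_seed)
        return [rng.choice(MODES) for _ in range(length)]

    sender = mode_sequence(shared_seed=42, length=8)
    receiver = mode_sequence(shared_seed=42, length=8)
    assert sender == receiver   # both hop through the same modes in lockstep
    print(sender)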
"Our findings mark a large step towards launching large-capacity
optical communication networks and confronting the upcoming information
crunch," says Feng.
In the most basic form of optical communication, transmitting a binary
message is as simple as representing 1s and 0s by whether the light is
on or off. This is effectively a measure of the light's amplitude --
how high the peak of the wave is -- which we experience as brightness.
As lasers and detectors become more precise, they can consistently emit
and distinguish between different levels of amplitude, allowing for
more bits of information to be contained in the same signal.
Even more sophisticated lasers and detectors can alter other properties
of light, such as its wavelength, which corresponds to color, and its
polarization, which is the orientation of the wave's oscillations
relative to its direction of travel. Many of these properties can be
set independently of each other, allowing for increasingly dense
multiplexing.
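Because the properties are independent, each one multiplies the symbol
alphabet. A quick back-of-the-envelope illustration, with hypothetical
counts per property:

    # Independent light properties multiply the number of
    # distinguishable symbols; the per-property counts here are
    # hypothetical examples, not measured system parameters.
    import math

    amplitude_levels, wavelengths, polarizations, oam_modes = 4, 8, 2, 5
    symbols = amplitude_levels * wavelengths * polarizations * oam_modes
    print(f"{symbols} distinct symbols -> {math.log2(symbols):.1f} "
          "bits per symbol")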
Orbital angular momentum is yet another property of light, though it is
considerably harder to manipulate, given the complexity of the
nanoscale features necessary to generate it from computer-chip-sized
lasers. Circularly polarized light carries an electric field that
rotates around its axis of travel, meaning its photons have a quality
known as spin angular momentum, or SAM. Under highly controlled
spin-orbit interactions, SAM can be locked or converted into another
property, orbital angular momentum, or OAM.
The research on a dynamically tunable OAM laser based on this concept
was led by Feng and graduate student Zhifeng Zhang.
In this new study, Feng, Zhang and their colleagues began with a
"microring" laser, which consists of a ring of semiconductor, only a
few microns wide, through which light can circulate indefinitely as
long as power is supplied. When additional light is "pumped" into the
ring from control arms on either side of the ring, the delicately
designed ring emits circularly polarized laser light. Critically,
asymmetry between the two control arms allows for the SAM of the
resulting laser to be coupled with OAM in a particular direction.
This means that rather than merely rotating around the axis of the
beam, as circularly polarized light does, the wavefront of such a laser
orbits that axis and thus travels in a helical pattern. A laser's OAM
"mode" corresponds to its chirality, the direction those helices twist,
and how close together its twists are.
"We demonstrated a microring laser that is capable of emitting five
distinct OAM modes," Feng says. "That may increase the data channel of
such lasers by up to five times."
Being able to multiplex the OAM, SAM and wavelength of laser light is
itself unprecedented, but not particularly useful without a detector
that can differentiate between those states and read them out.
In concert with Feng's work on the tunable vortex microlaser, the
research on the OAM detector was led by Agarwal and Zhurun Ji, a
graduate student in his lab.
"OAM modes are currently detected through bulk approaches such as mode
sorters, or by filtering techniques such as modal decomposition,"
Agarwal says, "but none of these methods are likely to work on a chip,
or interface seamlessly with electronic signals."
Agarwal and Ji built upon their previous work with Weyl semimetals, a
class of quantum materials that have bulk quantum states whose
electrical properties can be controlled using light. Their experiments
showed that they could control the direction of electrons in those
materials by shining light with different SAM onto them.
Along with their collaborators, Agarwal and Ji drew on this phenomenon
by designing a photodetector that is similarly responsive to different
OAM modes. In their new detector, the photocurrent generated by light
with different OAM modes produced unique current patterns, which
allowed the researchers to determine the OAM of light impinging on
their device.
"These results not only demonstrate a novel quantum phenomenon in the
light-matter interaction," Agarwal says, "but for the first time enable
the direct read-out of the phase information of light using an on-chip
photodetector. These studies hold great promise for designing highly
compact systems for future optical communication systems."
Next, Agarwal and Feng plan to collaborate on such systems. By
combining their unique expertise to fabricate on-chip vortex
microlasers and detectors that can uniquely detect light's OAM, they
will design integrated systems to demonstrate new concepts in optical
communications with enhanced data transmission capabilities -- for
classical light and, by increasing the sensitivity to single photons,
for quantum applications. This demonstration of a new dimension for
storing information based on OAM modes can help create richer
superposition quantum states to increase information capacity by a few
orders of magnitude.
These two closely tied studies were partially supported by the
National Science Foundation, the U.S. Army Research Office and the
Office of Naval Research. Research on the vortex microlaser was done in
collaboration with Josep M. Jornet, associate professor at Northeastern
University; Stefano Longhi, professor at the Polytechnic University of
Milan in Italy; and Natalia M. Litchinitser, professor at Duke
University. Penn's Xingdu Qiao, Bikashkali Midya, Kevin Liu, Tianwei
Wu, Wenjing Liu and Duke's Jingbo Sun also contributed to the work.
Research on the photodetector was done in collaboration with Albert
Davydov from the National Institute of Standards and Technology (NIST)
and Anlian Pan from Hunan University. Penn's Wenjing Liu, Xiaopeng Fan,
Zhifeng Zhang and NIST's Sergiy Krylyuk also contributed to the work.
__________________________________________________________________
Story Source:
Materials provided by University of Pennsylvania. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal References:
1. Zhifeng Zhang, Xingdu Qiao, Bikashkali Midya, Kevin Liu, Jingbo
   Sun, Tianwei Wu, Wenjing Liu, Ritesh Agarwal, Josep Miquel Jornet,
   Stefano Longhi, Natalia M. Litchinitser, Liang Feng. Tunable
   topological charge vortex microlaser. Science, 2020; 368 (6492):
   760; DOI: 10.1126/science.aba8996
2. Zhurun Ji, Wenjing Liu, Sergiy Krylyuk, Xiaopeng Fan, Zhifeng
   Zhang, Anlian Pan, Liang Feng, Albert Davydov, Ritesh Agarwal.
   Photocurrent detection of the orbital angular momentum of light.
   Science, 2020; 368 (6492): 763; DOI: 10.1126/science.aba9192
__________________________________________________________________
--- up 16 weeks, 6 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 18 21:30:18 2020
Researchers find aluminum in water could affect lead's solubility -- in
certain cases
Date:
May 18, 2020
Source:
Washington University in St. Louis
Summary:
Until recently, researchers had not examined the interplay
between three common chemicals found in drinking water. Research
has now found that they all affect each other and that a closer
look is needed.
FULL STORY
__________________________________________________________________
It is not uncommon to find aluminum in municipal water systems. It's
part of a chemical used in some water treatment processes.
Recently, however, it has been discovered in lead scale, deposits that
form on lead water pipes.
The aluminum presence in pipes is both unsurprising and, in the
quantities researchers saw in water pipes, not a health concern,
according to Daniel Giammar, the Walter E. Browne Professor of
Environmental Engineering in the McKelvey School of Engineering at
Washington University in St. Louis. But no one had looked at how it
might affect the larger municipal system.
In particular, Giammar wanted to find out, "What is that aluminum doing
to the behavior of the lead in the scale?" As long as the lead is bound
to the scale, it doesn't enter the water system.
Giammar and a team ran several experiments and found that, in a lab
setting, aluminum does have a small but important effect on lead's
solubility under certain conditions. Their results were published in
late April in Environmental Science & Technology.
The experiments were carried out in large part by visiting PhD student
Guiwei Li, who was able to complete the work during his brief stay at
Washington University before returning to the Chinese Academy of
Sciences.
In simplified models, the researchers took a look at how phosphate,
aluminum and a combination of the two affected a strip of lead in a
jar of water with a composition close to that of water found in many
water systems. The aim: to better understand lead's solubility, or the
amount that would dissolve and make its way into the water when
impacted by those chemicals.
In the jar in which only aluminum was added, there was no effect on the
solubility of the lead strip; lead had dissolved into the water at a
concentration of about 100 micrograms per liter.
In the jar in which only phosphate was added, the concentration of lead
in the water decreased from about 100 micrograms per liter to less than
one.
In the jar in which both aluminum and phosphate were added, the
concentration of lead in the water decreased from about 100 micrograms
per liter to about 10 micrograms per liter.
Ten micrograms of lead per liter of water is still below drinking water
standards, Giammar said, but it's still more lead in the water than was
seen in the jar without aluminum. "This tells us what our next
experiment should be," he said. His lab will do these experiments with
real lead pipes, as they have done in the past.
"This showed us things that were surprising," he said. "Some people
would have thought that aluminum wasn't doing anything because it's
inert. But then in our work, we saw that it actually affects lead
solubility."
__________________________________________________________________
Story Source:
Materials provided by Washington University in St. Louis.
Original written by Brandie Jefferson. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. Guiwei Li, Yeunook Bae, Anushka Mishrra, Baoyou Shi, Daniel E.
   Giammar. Effect of Aluminum on Lead Release to Drinking Water from
   Scales of Corrosion Products. Environmental Science & Technology,
   2020; DOI: 10.1021/acs.est.0c00738
__________________________________________________________________
--- up 16 weeks, 6 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon May 18 21:30:18 2020
Scanning tunneling microscopy reveals why tantalum disulfide goes from
conductor to insulator
Date:
May 18, 2020
Source:
RIKEN
Summary:
Tantalum disulfide is a mysterious material. According to
textbook theory, it should be a conducting metal, but in the
real world it acts like an insulator. Using a scanning tunneling
microscope, researchers have taken a high-resolution look at the
structure of the material, revealing why it demonstrates this
unintuitive behavior.
FULL STORY
__________________________________________________________________
Tantalum disulfide is a mysterious material. According to textbook
theory, it should be a conducting metal, but in the real world it acts
like an insulator. Using a scanning tunneling microscope, researchers
from the RIKEN Center for Emergent Matter Science have taken a
high-resolution look at the structure of the material, revealing why it
demonstrates this unintuitive behavior.
It has long been known that crystalline materials should be good
conductors when they have an odd number of electrons in each repeating
cell of the structure, but may be poor conductors when the number is
even. However, sometimes this formula does not work, with one case
being "Mottness," a property based on the work of Sir Nevill Mott.
According to that theory, when there is strong repulsion between
electrons in the structure, the electrons become "localized" --
paralyzed, in other words -- and are unable to move around freely to
create an electric current.
What makes the situation complicated is that there are also situations
where electrons in different layers of a 3D structure can interact,
pairing up to create a bilayer structure with an even number of
electrons. It has been previously suggested that this "pairing" of
electrons would restore the textbook understanding of the insulator,
making it unnecessary to invoke "Mottness" as an explanation.
For the current study, published in Nature Communications, the research
group decided to look at tantalum disulfide, a material with 13
electrons in each repeating structure, which should therefore be a
conductor. However, it is not, and there has been controversy over
whether this property is caused by its "Mottness" or by a pairing
structure.
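The counting rule at stake can be written down directly. Below is a
tiny sketch applying the textbook rule to tantalum disulfide's 13
electrons per repeating cell, and to a paired bilayer:

    # Textbook band-theory rule described above: an odd electron count
    # per repeating cell should give a metal; an even count may
    # insulate.
    def band_theory_guess(electrons_per_cell):
        return "metal" if electrons_per_cell % 2 == 1 else "possible insulator"

    print(band_theory_guess(13))       # one layer: "metal" -- yet it insulates
    print(band_theory_guess(13 + 13))  # paired bilayer: "possible insulator"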
To perform the research, the researchers created crystals of tantalum
disulfide and then cleaved the crystals in a vacuum to reveal
ultra-clean surfaces, which they then examined at a temperature close
to absolute zero using scanning tunneling microscopy, a method
involving a tiny and extremely sensitive metal tip that can sense where
electrons are in a material, and their degree of conducting behavior,
by means of the quantum tunneling effect.
Their results showed that there was indeed a stacking of layers which
effectively arranged them into pairs. Sometimes the crystals cleaved
between the pairs of layers, and sometimes through a pair, breaking it.
They performed spectroscopy on both the paired and unpaired layers and
found that even the unpaired ones are insulating, leaving Mottness as
the only explanation.
According to Christopher Butler, the first author of the study, "The
exact nature of the insulating state and of the phase transitions in
tantalum disulfide have been long-standing mysteries, and it was very
exciting to find that Mottness is a key player, aside from the pairing
of the layers. This is because theorists suspect that a Mott state
could set the stage for an interesting phase of matter known as a
quantum spin liquid."
Tetsuo Hanaguri, who led the research team, said, "The question of what
makes this material move between insulating and conducting phases has
long been a puzzle for physicists, and I am very satisfied we have been
able to put a new piece into the puzzle. Future work may help us to
find new interesting and useful phenomena emerging from Mottness, such
as high-temperature superconductivity."
__________________________________________________________________
Story Source:
Materials provided by RIKEN. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. C. J. Butler, M. Yoshida, T. Hanaguri, Y. Iwasa. Mottness versus
   unit-cell doubling as the driver of the insulating state in
   1T-TaS2. Nature Communications, 2020; 11 (1); DOI:
   10.1038/s41467-020-16132-9
__________________________________________________________________
--- up 16 weeks, 6 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 19 21:30:10 2020
Date:
May 19, 2020
Source:
NASA/Goddard Space Flight Center
Summary:
By studying the chemical elements on Mars today -- including
carbon and oxygen -- scientists can work backwards to piece
together the history of a planet that once had the conditions
necessary to support life.
FULL STORY
__________________________________________________________________
By studying the chemical elements on Mars today -- including carbon and
oxygen -- scientists can work backwards to piece together the history
of a planet that once had the conditions necessary to support life.
Weaving this story, element by element, from roughly 140 million miles
(225 million kilometers) away is a painstaking process. But scientists
aren't the type to be easily deterred. Orbiters and rovers at Mars have
confirmed that the planet once had liquid water, thanks to clues that
include dry riverbeds, ancient shorelines, and salty surface chemistry.
Using NASA's Curiosity Rover, scientists have found evidence for
long-lived lakes. They've also dug up organic compounds, or life's
chemical building blocks. The combination of liquid water and organic
compounds compels scientists to keep searching Mars for signs of past
-- or present -- life.
Despite the tantalizing evidence found so far, scientists'
understanding of Martian history is still unfolding, with several major
questions open for debate. For one, was the ancient Martian atmosphere
thick enough to keep the planet warm, and thus wet, for the amount of
time necessary to sprout and nurture life? And the organic compounds:
are they signs of life -- or of chemistry that happens when Martian
rocks interact with water and sunlight?
In a recent Nature Astronomy report on a multi-year experiment
conducted in the chemistry lab inside Curiosity's belly, called Sample
Analysis at Mars (SAM), a team of scientists offers some insights to
help answer these questions. The team found that certain minerals in
rocks at Gale Crater may have formed in an ice-covered lake. These
minerals may have formed during a cold stage sandwiched between warmer
periods, or after Mars lost most of its atmosphere and began to turn
permanently cold.
Gale is a crater the size of Connecticut and Rhode Island combined. It
was selected as Curiosity's 2012 landing site because it had signs of
past water, including clay minerals that might help trap and preserve
ancient organic molecules. Indeed, while exploring the base of a
mountain in the center of the crater, called Mount Sharp, Curiosity
found a layer of sediments 1,000 feet (304 meters) thick that was
deposited as mud in ancient lakes. To form that much sediment, an
incredible amount of water would have flowed down into those lakes for
millions to tens of millions of warm and humid years, some scientists
say. But some geological features in the crater also hint at a past
that included cold, icy conditions.
"At some point, Mars' surface environment must have experienced a
transition from being warm and humid to being cold and dry, as it is
now, but exactly when and how that occurred is still a mystery," says
Heather Franz, a NASA geochemist based at NASA's Goddard Space Flight
Center in Greenbelt, Maryland.
Franz, who led the SAM study, notes that factors such as changes in
Mars' obliquity and the amount of volcanic activity could have caused
the Martian climate to alternate between warm and cold over time. This
idea is supported by chemical and mineralogical changes in Martian
rocks showing that some layers formed in colder environments and others
formed in warmer ones.
In any case, says Franz, the array of data collected by Curiosity so
far suggests that the team is seeing evidence for Martian climate
change recorded in rocks.
Carbon and oxygen star in the Martian climate story
Franz's team found evidence for a cold ancient environment after the
SAM lab extracted the gases carbon dioxide, or CO[2], and oxygen from
13 dust and rock samples. Curiosity collected these samples over the
course of five Earth years (about two and a half Mars years).
CO[2] is a molecule of one carbon atom bonded with two oxygen atoms,
with carbon serving as a key witness in the case of the mysterious
Martian climate. In fact, this simple yet versatile element is as
critical as water in the search for life elsewhere. On Earth, carbon
flows continuously through the air, water, and surface in a
well-understood cycle that hinges on life. For example, plants absorb
carbon from the atmosphere in the form of CO[2]. In return, they
produce oxygen, which humans and most other life forms use for
respiration in a process that ends with the release of carbon back into
the air, again via CO[2], or into the Earth's crust as life forms die
and are buried.
Scientists are finding there's also a carbon cycle on Mars and they're
working to understand it. With little water or abundant surface life on
the Red Planet for at least the past 3 billion years, the carbon cycle
is much different than Earth's.
"Nevertheless, the carbon cycling is still happening and is still
important because it's not only helping reveal information about Mars'
ancient climate," says Paul Mahaffy, principal investigator on SAM and
director of the Solar System Exploration Division at NASA Goddard.
"It's also showing us that Mars is a dynamic planet that's circulating
elements that are the buildings blocks of life as we know it."
The gases build a case for a chilly period
After Curiosity fed rock and dust samples into SAM, the lab heated each
one to nearly 1,650 degrees Fahrenheit (900 degrees Celsius) to
liberate the gases inside. By looking at the oven temperatures that
released the CO[2] and oxygen, scientists could tell what kind of
minerals the gases were coming from. This type of information helps
them understand how carbon is cycling on Mars.
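The idea of reading mineralogy off release temperatures can be
sketched as a simple lookup. The temperature windows below are rough
illustrative placeholders, not SAM's calibrated values:

    # Evolved-gas sketch: the oven temperature at which CO[2] is
    # released hints at the mineral that held it. Window boundaries
    # here are rough illustrative placeholders, not SAM's calibration.
    CO2_WINDOWS = [
        (150, 450, "oxalates or other organic salts (lower-T release)"),
        (450, 900, "carbonates (higher-T release)"),
    ]

    def candidate_source(release_temp_c):
        for low, high, mineral in CO2_WINDOWS:
            if low <= release_temp_c < high:
                return mineral
        return "unidentified source"

    print(candidate_source(300))   # -> oxalates or other organic salts ...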
Various studies have suggested that Mars' ancient atmosphere,
containing mostly CO[2], may have been thicker than Earth's is today.
Most of it has been lost to space, but some may be stored in rocks at
the planet's surface, particularly in the form of carbonates, which are
minerals made of carbon and oxygen. On Earth, carbonates are produced
when CO[2] from the air is absorbed in the oceans and other bodies of
water and then mineralized into rocks. Scientists think the same
process happened on Mars and that it could help explain what happened
to some of the Martian atmosphere.
Yet, missions to Mars haven't found enough carbonates in the surface to
support a thick atmosphere.
Nonetheless, the few carbonates that SAM did detect revealed something
interesting about the Martian climate through the isotopes of carbon
and oxygen stored in them. Isotopes are versions of each element that
have different masses. Because different chemical processes, from rock
formation to biological activity, use these isotopes in different
proportions, the ratios of heavy to light isotopes in a rock provide
scientists with clues to how the rock formed.
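The heavy-to-light comparison is conventionally expressed in "delta"
notation, a standard geochemical formula. In the sketch below the
standard ratio is the accepted ocean-water value, while the sample
ratio is a hypothetical number chosen for illustration:

    # Delta notation for isotope ratios:
    # delta = (R_sample / R_standard - 1) * 1000, in parts per
    # thousand (per mil). Negative values mean the sample is
    # isotopically "light" relative to the standard.
    R_STANDARD = 0.0020052   # 18O/16O of the VSMOW ocean-water standard
    r_sample = 0.0019852     # hypothetical 18O/16O measured in a carbonate

    delta_18O = (r_sample / R_STANDARD - 1.0) * 1000.0
    print(f"delta-18O = {delta_18O:+.1f} per mil")   # about -10: "light"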
In some of the carbonates SAM found, scientists noticed that the oxygen
isotopes were lighter than those in the Martian atmosphere. This
suggests that the carbonates did not form long ago simply from
atmospheric CO[2] absorbed into a lake. If they had, the oxygen
isotopes in the rocks would have been slightly heavier than the ones in
the air.
While it's possible that the carbonates formed very early in Mars'
history, when the atmospheric composition was a bit different than it
is today, Franz and her colleagues suggest that the carbonates more
likely formed in a freezing lake. In this scenario, the ice could have
sucked up heavy oxygen isotopes and left the lightest ones to form
carbonates later. Other Curiosity scientists have also presented
evidence suggesting that ice-covered lakes could have existed in Gale
Crater.
So where is all the carbon?
The low abundance of carbonates on Mars is puzzling, scientists say. If
there aren't many of these minerals at Gale Crater, perhaps the early
atmosphere was thinner than predicted. Or maybe something else is
storing the missing atmospheric carbon.
Based on their analysis, Franz and her colleagues suggest that some
carbon could be sequestered in other minerals, such as oxalates, which
store carbon and oxygen in a different structure than carbonates. Their
hypothesis is based on the temperatures at which CO[2] was released
from some samples inside SAM -- too low for carbonates, but just right
for oxalates -- and on carbon and oxygen isotope ratios that differed
from those the scientists saw in the carbonates.
[Image: a model of a carbonate molecule next to an oxalate molecule]
Oxalates are the most common type of organic mineral produced by plants
on Earth. But oxalates also can be produced without biology. One way is
through the interaction of atmospheric CO[2] with surface minerals,
water, and sunlight, in a process known as abiotic photosynthesis. This
type of chemistry is hard to find on Earth because there's abundant
life here, but Franz's team hopes to create abiotic photosynthesis in
the lab to figure out if it actually could be responsible for the
carbon chemistry they're seeing in Gale Crater.
On Earth, abiotic photosynthesis may have paved the way for
photosynthesis among some of the first microscopic life forms, which is
why finding it on other planets interests astrobiologists.
Even if it turns out that abiotic photosynthesis locked some carbon
from the atmosphere into rocks at Gale Crater, Franz and her colleagues
would like to study soil and dust from different parts of Mars to
understand if their results from Gale Crater reflect a global picture.
They may one day get a chance to do so. NASA's Perseverance Mars rover,
due to launch to Mars between July and August 2020, plans to pack up
samples in Jezero Crater for possible return to labs on Earth.
__________________________________________________________________
Story Source:
Materials provided by NASA/Goddard Space Flight Center. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. H. B. Franz, P. R. Mahaffy, C. R. Webster, G. J. Flesch, E. Raaen,
   C. Freissinet, S. K. Atreya, C. H. House, A. C. McAdam, C. A.
   Knudson, P. D. Archer, J. C. Stern, A. Steele, B. Sutter, J. L.
   Eigenbrode, D. P. Glavin, J. M. T. Lewis, C. A. Malespin, M.
   Millan, D. W. Ming, R. Navarro-González, R. E. Summons. Indigenous
   and exogenous organics and surface–atmosphere cycling inferred from
   carbon and oxygen isotopes at Gale crater. Nature Astronomy, 2020;
   4 (5): 526; DOI: 10.1038/s41550-019-0990-x
__________________________________________________________________
--- up 17 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 19 21:30:14 2020
Date:
May 19, 2020
Source:
University of Delaware
Summary:
Connected and automated vehicles use technology such as sensors,
cameras and advanced control algorithms to adjust their
operation to changing conditions with little or no input from
drivers. A research group optimized vehicle dynamics and
powertrain operation using connectivity and automation, while
developing and testing a control framework that reduced travel
time and energy use in a connected and automated vehicle.
FULL STORY
__________________________________________________________________
Imagine merging into busy traffic without ever looking over your
shoulder, and without accelerating or braking so hard that you irritate
the driver in the next lane over. Connected and automated vehicles that
communicate to coordinate optimal traffic patterns could enable this
pleasant driving scenario sooner than you think.
At the University of Delaware, a research group of students is
developing algorithms for connected and automated vehicles that reduce
energy consumption and travel delays. The Information and Decision
Science Lab is led by Andreas Malikopoulos, Terri Connor Kelly and John
Kelly Career Development Associate Professor.
Connected and automated vehicles use technology such as sensors,
cameras and advanced control algorithms to adjust their operation to
changing conditions with little or no input from drivers.
For doctoral student A M Ishtiaque Mahbub, the project has offered
unprecedented opportunities. He is the first author of two new
technical papers published by SAE -- formerly known as the Society of
Automotive Engineers -- describing how UD engineers optimized vehicle
dynamics and powertrain operation using connectivity and automation as
well as how they developed and tested a control framework that reduced
travel time and energy use in a connected and automated vehicle.
The team is optimizing an Audi A3 e-tron, a plug-in hybrid electric
vehicle. First, the team members developed control architectures to
reduce stop-and-go driving and travel time while ensuring energy
efficiency. Next, the team tested the algorithms using driving
simulators in UD's Spencer Laboratory.
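A toy version of the kind of coordination such frameworks perform:
each vehicle is assigned an arrival time at the merge zone, then
follows the gentlest constant acceleration that gets it there on
schedule. This is a conceptual illustration, not the UD group's
actual controller:

    # Toy merge coordination: solve d = v*t + 0.5*a*t^2 for the
    # constant acceleration that covers the remaining distance in the
    # assigned time. Illustrative only; not the published framework.
    def merge_acceleration(distance_m, speed_mps, time_s):
        return 2.0 * (distance_m - speed_mps * time_s) / time_s ** 2

    # A car 200 m from the merge zone at 15 m/s, told to arrive in 11 s:
    a = merge_acceleration(200.0, 15.0, 11.0)
    print(f"commanded acceleration: {a:+.2f} m/s^2")  # gentle, no stop-and-go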
Then, in October 2019, they put their work to the test in the
University of Michigan's MCity, a testing ground for cutting-edge
vehicles. The software developed at UD went into the Audi A3 e-tron.
On test day, Mahbub stepped into the test car with two other engineers
from Bosch. Each was equipped with a laptop to take data as they drove
along a track that included a roundabout, merging zone, intersection
and other challenges. The connected and automated vehicle is designed
to take over and navigate these situations for you.
"This alleviates stress, and by eliminating stop-and-go driving
behavior where you're constantly braking and accelerating, braking and
accelerating, or even yielding, it also has a smooth margin in those
cases, which also as a byproduct increases the fuel efficiency," said
Mahbub.
Virtual reality was used to simulate challenges for the car to navigate
around, such as other cars and pedestrians.
With months of preparation behind him, Mahbub was excited for the test,
but nervous, too. "There is a certain level of uncertainty that plays
on your mind, that, OK: The theory and control algorithms worked in
simulation, but how about in the real world?" he said. "How might the
real-world uncertainties and unknown variables affect the system?"
The test was a success, with a 30 percent increase in energy
efficiency, more than the simulation even predicted.
The real-world scenario helped Mahbub put his analysis in context, gain
an even greater understanding of the vehicle's control architecture,
and collect data that could be used to realize and quantify even
greater gains in energy efficiency.
"At one point in the field test I was feeling a bit nauseous because
the centrifugal force was a little too much," he said. "I'm thinking
right now going forward if we plan to visit MCity, I will definitely
put that in my algorithm so that the passengers will have a more
comfortable drive."
__________________________________________________________________
Story Source:
Materials provided by University of Delaware. Original written
by Julie Stewart. Note: Content may be edited for style and length.
__________________________________________________________________
--- up 17 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 19 21:30:14 2020
Date:
May 19, 2020
Source:
Oregon State University
Summary:
A robot comic is funnier when it has good timing.
FULL STORY
__________________________________________________________________
Standup comedian Jon the Robot likes to tell his audiences that he does
lots of auditions but has a hard time getting bookings.
"They always think I'm too robotic," he deadpans.
If raucous laughter follows, he comes back with, "Please tell the
booking agents how funny that joke was."
If it doesn't, he follows up with, "Sorry about that. I think I got
caught in a loop. Please tell the booking agents that you like me ...
that you like me ... that you like me ... that you like me."
Jon the Robot, with assistance from Oregon State University researcher
Naomi Fitter, recently wrapped up a 32-show tour of comedy clubs in
greater Los Angeles and in Oregon, generating guffaws and, more
importantly, data that scientists and engineers can use to help robots
and people relate more effectively with one another via humor.
"Social robots and autonomous social agents are becoming more and more
ingrained in our everyday lives," said Fitter, assistant professor of
robotics in the OSU College of Engineering. "Lots of them tell jokes to
engage users -- most people understand that humor, especially nuanced
humor, is essential to relationship building. But it's challenging to
develop entertaining jokes for robots that are funny beyond the novelty
level."
Live comedy performances are a way for robots to learn "in the wild"
which jokes and which deliveries work and which ones don't, Fitter
said, just like human comedians do.
The comedy tour comprised two studies and included assistance from a
team of Southern California comedians, who helped come up with material
true to, and appropriate for, a robot comedian.
The first study, consisting of 22 performances in the Los Angeles area,
demonstrated that audiences found a robot comic with good timing --
giving the audience the right amount of time to react, etc. -- to be
significantly funnier than one without good timing.
The second study, based on 10 routines in Oregon, determined that an
"adaptive performance" -- delivering post-joke "tags" that acknowledge
an audience's reaction to the joke -- wasn't necessarily funnier
overall, but the adaptations almost always improved the audience's
perception of individual jokes. In the second study, all performances
featured appropriate timing.
"In bad-timing mode, the robot always waited a full five seconds after
each joke, regardless of audience response," Fitter said. "In
appropriate-timing mode, the robot used timing strategies to pause for
laughter and continue when it subsided, just like an effective human
comedian would. Overall, joke response ratings were higher when the
jokes were delivered with appropriate timing."
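A minimal sketch of how an appropriate-timing mode might be implemented
(the function and threshold names below are illustrative assumptions,
not the OSU system's actual code):

    import time

    # "Appropriate-timing" delivery: after each joke, keep listening
    # while the room is loud and move on once the laughter subsides (or
    # after a timeout). `room_loudness` is an assumed stand-in for an
    # audio sensor returning a normalized loudness in [0, 1].

    LAUGH_THRESHOLD = 0.4   # loudness above this counts as laughter
    MAX_WAIT = 10.0         # seconds; never stall longer than this
    POLL = 0.1              # seconds between loudness samples

    def wait_for_laughter_to_subside(room_loudness):
        start = time.time()
        while time.time() - start < MAX_WAIT:
            if room_loudness() < LAUGH_THRESHOLD:
                return True      # quiet again: deliver the next joke
            time.sleep(POLL)
        return False             # timed out while the room was still loud

    # "Bad-timing" mode, by contrast, is simply time.sleep(5.0) after
    # every joke, regardless of the audience's response.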
The performances, given to audiences of 10 to 20 people, provided
enough data to identify significant differences between distinct modes
of robot comedy performance, and the research helped to answer key
questions about comedic social interaction, Fitter said.
"Audience size, social context, cultural context, the
microphone-holding human presence and the novelty of a robot comedian
may have influenced crowd responses," Fitter said. "The current
software does not account for differences in laughter profiles, but
future work can account for these differences using a baseline response
measurement. The only sensing we used to evaluate joke success was
audio readings. Future work might benefit from incorporating additional
types of sensing."
Still, the studies have key implications for artificial intelligence
efforts to understand group responses to dynamic, entertaining social
robots in real-world environments, she said.
"Also, possible advances in comedy from this work could include
improved techniques for isolating and studying the effects of comedic
techniques and better strategies to help comedians assess the success
of a joke or routine," she said. "The findings will guide our next
steps toward giving autonomous social agents improved humor
capabilities."
The studies were published by the Association for Computing
Machinery/Institute of Electrical and Electronics Engineering's
International Conference on Human-Robot Interaction.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Oregon State University. Original written
by Steve Lundeberg. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. John Vilk, Naomi T. Fitter. Comedians in Cafes Getting Data. HRI
'20: Proceedings of the 2020 ACM/IEEE International Conference on
Human-Robot Interaction, 2020 DOI: [19]10.1145/3319502.3374780
__________________________________________________________________
--- up 17 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue May 19 21:30:16 2020
Date:
May 19, 2020
Source:
Institute for Basic Science
Summary:
A research team has reported a diagnostic 'fidget spinner'
(Dx-FS) that allows for highly sensitive and rapid diagnosis and
prescription using only hand power.
FULL STORY
__________________________________________________________________
About 60% of women will experience urinary tract infection (UTI) at
least once in their lifetime. With antibiotic-resistant organisms on
the rise, UTI is likely to impose a growing health and economic burden.
To turn things around, point-of-care testing (POCT) technology has been
identified as a breakthrough in diagnosing suspected UTI patients. POCT
enables staff to provide real-time, lab-quality patient care when and
where it is needed. Despite recent advances in POCT, millions of people
in developing parts of the world still die every year of treatable
illnesses such as UTI for lack of a diagnosis. There is a pressing need
for technologies to bridge this gap.
Researchers at the Center for Soft and Living Matter, within the
Institute for Basic Science (IBS, South Korea), reported a diagnostic
fidget spinner (Dx-FS) that allows for highly sensitive and rapid
diagnosis and prescription using only hand power. Fidget spinners are
boomerang-shaped toys whose ball bearings reduce friction and allow
them to rotate freely for a long time. One flick of a finger sets the
gadget in motion. By exploiting the centrifugal force inherent in the
fidget-spinner design and a novel mechanism they call fluid-assisted
separation technology (FAST), the research team optimized the fluid
dynamics of the Dx-FS. This mechanism enables the Dx-FS to work with
just one or two spins by hand and to enrich pathogens roughly 100-fold,
so that they can be seen with the naked eye without the need for
bacterial culture.
Conventional approaches to diagnosing infectious disease require
time-consuming cell culture as well as modern laboratory facilities.
Worse yet, typical bacterial cell enrichment requires large forces and
is prone to membrane fouling or clogging due to the
pressure imbalance in the filtration chamber. "Though the centrifugal
force serves as an 'engine' of the device, the force is felt more
strongly in the outer path because it acts outward, away from the
center of rotation. The imbalanced impact of the centrifugal force
leaves some of the sample behind in the membrane. We utilized
hydrodynamic forces that act perpendicular to the centrifugal force by
filling the filter membrane with liquid before the spinning process.
This minimized the pressure drop and brought uniform pressure balance
throughout the entire area of the membrane. It allowed for maximized
bacterial cell enrichment efficiency while minimizing the force needed
for filtration. Therefore, one or two spins were enough to filter 1 mL
of sample despite large variation in spin speed among operators with
different hand power," explains Professor CHO Yoon-Kyoung, the
corresponding author of the study.
In FAST-based particle separation, the fluid flow caused by centrifugal
force is in a direction perpendicular to the filtration flow through
the membrane. In addition, the drainage chamber underneath the membrane
remains fully filled with the liquid during the entire filtration
process. This is achieved by placing a buffer solution in the bottom
chamber of the membrane prior to the spinning process, which ensures
uniform filtration across the entire area of the membrane and
significantly reduces the hydrodynamic resistance.
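For a rough sense of the driving pressures involved, the standard
centrifugal-microfluidics relation dP = 0.5 * rho * omega^2 * (r2^2 -
r1^2) can be evaluated with assumed numbers (the spin rate and channel
radii below are illustrative, not the paper's):

    import math

    # Back-of-envelope estimate of the pressure a spinning liquid
    # column develops between radii r1 and r2:
    #   dP = 0.5 * rho * omega^2 * (r2**2 - r1**2)

    rho = 1000.0                          # kg/m^3, water-like sample
    rpm = 1200.0                          # assumed hand-spin rate
    omega = rpm * 2.0 * math.pi / 60.0    # rad/s
    r1, r2 = 0.01, 0.04                   # m, assumed channel radii

    dP = 0.5 * rho * omega**2 * (r2**2 - r1**2)
    print(f"driving pressure ~ {dP / 1000.0:.1f} kPa")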
The research team verified that Dx-FS can perform "sample-in-answer-out"
analyses by testing urine samples from 39 UTI suspects in
Tiruchirappalli, India. Compared to the gold-standard culture method,
which has a relatively long turnaround time, Dx-FS provided a
comparable answer on site in 50 minutes. The experiment showed that 59%
of the UTI suspects were being over- or under-treated with antibiotics,
errors that could be avoided by using Dx-FS. Further, they performed a
rapid antimicrobial
susceptibility test (AST) for two antimicrobial drugs on 30 UTI
patients using Dx-FS. The test produced 100% accurate results within
120 minutes.
Overall, this simple, hand-powered, portable device allows rapid
enrichment of pathogens from human urine samples, showing high
potential for future low-cost POCT diagnostic applications. A simple
tool like Dx-FS could support UTI management and help prevent
antibiotic resistance in low-resource settings.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Institute for Basic Science. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Issac Michael, Dongyoung Kim, Oleksandra Gulenko, Sumit Kumar,
Saravana Kumar, Jothi Clara, Dong Yeob Ki, Juhee Park, Hyun Yong
Jeong, Taek Soo Kim, Sunghoon Kwon, Yoon-Kyoung Cho. A fidget
spinner for the point-of-care diagnosis of urinary tract infection.
Nature Biomedical Engineering, 2020; DOI:
[19]10.1038/s41551-020-0557-2
__________________________________________________________________
--- up 17 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 21 21:30:00 2020
Date:
April 21, 2020
Source:
Arizona State University
Summary:
In seeking to learn more about Neptune-like exoplanets, an
international team of researchers has provided one of the first
mineralogy lab studies for water-rich exoplanets.
FULL STORY
__________________________________________________________________
Astrophysical observations have shown that Neptune-like water-rich
exoplanets are common in our galaxy. These "water worlds" are believed
to be covered with a thick layer of water, hundreds to thousands of
miles deep, above a rocky mantle.
While water-rich exoplanets are common, their composition is very
different from Earth, so there are many unknowns in terms of these
planets' structure, composition and geochemical cycles.
In seeking to learn more about these planets, an international team of
researchers, led by Arizona State University, has provided one of the
first mineralogy lab studies for water-rich exoplanets. The results of
their study have been recently published in the journal Proceedings of
the National Academy of Sciences.
"Studying the chemical reactions and processes is an essential step
toward developing an understanding of these common planet types," said
co-author Dan Shim, of ASU's School of Earth and Space Exploration.
The general scientific conjecture is that water and rock form separate
layers in the interiors of water worlds. Because water is lighter than
rock, the water layer should sit above a rocky layer. However, the
extreme pressure and temperature at the
boundary between water and rocky layers could fundamentally change the
behaviors of these materials.
To simulate this high pressure and temperature in the lab, lead author
and research scientist Carole Nisr conducted experiments at Shim's Lab
for Earth and Planetary Materials at ASU using high pressure
diamond-anvil cells.
For their experiment, the team immersed silica in water, compressed the
sample between diamonds to a very high pressure, then heated the sample
with laser beams to a few thousand degrees Fahrenheit.
The team also conducted laser heating at the Argonne National
Laboratory in Illinois. To monitor the reaction between silica and
water, X-ray measurements were taken while the laser heated the sample
at high pressures.
What they found was an unexpected new solid phase with silicon,
hydrogen and oxygen all together.
"Originally, it was thought that water and rock layers in water-rich
planets were well-separated," Nisr said. "But we discovered through our
experiments a previously unknown reaction between water and silica and
stability of a solid phase roughly in an intermediate composition. The
distinction between water and rock appeared to be surprisingly 'fuzzy'
at high pressure and high temperature."
The researchers hope that these findings will advance our knowledge on
the structure and composition of water-rich planets and their
geochemical cycles.
"Our study has important implications and raises new questions for the
chemical composition and structure of the interiors of water-rich
exoplanets," Nisr said. "The geochemical cycle for water-rich planets
could be very different from that of the rocky planets, such as Earth."
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Arizona State University. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Carole Nisr, Huawei Chen, Kurt Leinenweber, Andrew Chizmeshya,
Vitali B. Prakapenka, Clemens Prescher, Sergey N. Tkachev, Yue
Meng, Zhenxian Liu, Sang-Heon Shim. Large H2O solubility in dense
silica and its implications for the interiors of water-rich
planets. Proceedings of the National Academy of Sciences, 2020;
201917448 DOI: [19]10.1073/pnas.1917448117
__________________________________________________________________
--- up 13 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
MeaTLoTioN@1337:1/101 to
Black Panther on Wed Apr 22 08:48:45 2020
On 21 Apr 2020, SpaceDaily said the following...
Studying our galaxy's 'water worlds'
Date:
April 21, 2020
Source:
Arizona State University
Summary:
In seeking to learn more about Neptune-like exoplanets, an
international team of researchers has provided one of the first
mineralogy lab studies for water-rich exoplanets.
I love it!
This is superb, thank you BP!
---
Best regards,
Christian aka MeaTLoTioN
[eml] ml@erb.pw   [web] www.erb.pw
[fsx] 21:1/158   [tqw] 1337:1/101   [rtn] 80:774/81   [fdn] 2:250/5
[ark] 10:104/2
--- Mystic BBS v1.12 A43 2019/03/02 (Linux/64)
* Origin: thE qUAntUm wOrmhOlE, rAmsgAtE, Uk. bbs.erb.pw (1337:1/101)
-
From
SpaceDaily@1337:3/111 to
All on Wed Apr 22 21:30:04 2020
Date:
April 22, 2020
Source:
US Geological Survey
Summary:
For the first time, the entire lunar surface has been completely
mapped and uniformly classified. The lunar map, called the
'Unified Geologic Map of the Moon,' will serve as the definitive
blueprint of the moon's surface geology for future human
missions and will be invaluable for the international scientific
community, educators and the public-at-large.
FULL STORY
__________________________________________________________________
Have you ever wondered what kind of rocks make up those bright and dark
splotches on the moon? Well, the USGS has just released a new
authoritative map to help explain the 4.5-billion-year-old history of
our nearest neighbor in space.
For the first time, the entire lunar surface has been completely mapped
and uniformly classified by scientists from the USGS, in collaboration
with NASA and the Lunar Planetary Institute.
The lunar map, called the "Unified Geologic Map of the Moon," will
serve as the definitive blueprint of the moon's surface geology for
future human missions and will be invaluable for the international
scientific community, educators and the public-at-large. The digital
map is available online now and shows the moon's geology in incredible
detail (1:5,000,000 scale).
"People have always been fascinated by the moon and when we might
return," said current USGS Director and former NASA astronaut Jim
Reilly. "So, it's wonderful to see USGS create a resource that can help
NASA with their planning for future missions."
To create the new digital map, scientists used information from six
Apollo-era regional maps along with updated information from recent
satellite missions to the moon. The existing historical maps were
redrawn to align them with the modern data sets, thus preserving
previous observations and interpretations. Along with merging new and
old data, USGS researchers also developed a unified description of the
stratigraphy, or rock layers, of the moon. This resolved issues from
previous maps where rock names, descriptions and ages were sometimes
inconsistent.
"This map is a culmination of a decades-long project," said Corey
Fortezzo, USGS geologist and lead author. "It provides vital
information for new scientific studies by connecting the exploration of
specific sites on the moon with the rest of the lunar surface."
Elevation data for the moon's equatorial region came from stereo
observations collected by the Terrain Camera on the recent SELENE
(Selenological and Engineering Explorer) mission led by JAXA, the Japan
Aerospace Exploration Agency. Topography for the north and south poles
was supplemented with NASA's Lunar Orbiter Laser Altimeter data.
Further Information:
[17] https://astrogeology.usgs.gov/search/map/Moon/Geology/Unified_Geologic_Map_of_the_Moon_GIS_v2
__________________________________________________________________
Story Source:
[18]Materials provided by [19]US Geological Survey. Note: Content may
be edited for style and length.
__________________________________________________________________
--- up 13 weeks, 1 day, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
Black Panther@1337:3/111 to
MeaTLoTioN on Wed Apr 22 22:08:02 2020
On 22 Apr 2020, MeaTLoTioN said the following...
I love it!
This is superb, thank you BP!
I'm still doing some tweaking to the script, but I'm pretty happy with how
it's working.
It figures, when I had it running locally, there was like 6 stories being posted per day. Now, we're only getting 1... :)
Depending on how the volume of these categories is, I may add more to the script. Perhaps Computers & Math, or Matter & Energy next.
---
Black Panther(RCS)
Castle Rock BBS
--- Mystic BBS v1.12 A45 2020/02/18 (Linux/64)
* Origin: Castle Rock BBS - bbs.castlerockbbs.com - (1337:3/111)
-
From
MeaTLoTioN@1337:1/101 to
Black Panther on Thu Apr 23 11:42:44 2020
I'm still doing some tweaking to the script, but I'm pretty happy with
how it's working.
The results are pretty great if you ask me
It figures, when I had it running locally, there was like 6 stories being posted per day. Now, we're only getting 1... :)
Haha that's typical lol
Depending on how the volume of these categories is, I may add more to the script. Perhaps Computers & Math, or Matter & Energy next.
Yes this will be excellent, if the volume of these gets to become something with real bite, you know something to really chew on, perhaps I could set up
a couple of echomail bases just for those topics?
While on that subject, are there any bases that need updating/removing/adding? I am keen to make this something unique, and I already think we're on to a
good path with the science and secure transport of data through ZeroTier... what do y'all think?
Please anyone with suggestions whether small or large are very much welcomed.
---
Best regards,
Christian aka MeaTLoTioN
[eml] ml@erb.pw   [web] www.erb.pw
[fsx] 21:1/158   [tqw] 1337:1/101   [rtn] 80:774/81   [fdn] 2:250/5
[ark] 10:104/2
--- Mystic BBS v1.12 A43 2019/03/02 (Linux/64)
* Origin: thE qUAntUm wOrmhOlE, rAmsgAtE, Uk. bbs.erb.pw (1337:1/101)
-
From
Black Panther@1337:3/111 to
MeaTLoTioN on Thu Apr 23 18:27:00 2020
On 23 Apr 2020, MeaTLoTioN said the following...
Depending on how the volume of these categories is, I may add more to script. Perhaps Computers & Math, or Matter & Energy next.
Yes this will be excellent, if the volume of these gets to become something with real bite, you know something to really chew on, perhaps
I could set up a couple of echomail bases just for those topics?
Let me see about adding another category, and see what happens with the
volume. You know that as soon as I get more categories added, each one will boost the number of stories each day. ;)
I'm not sure about adding new bases yet. Let's see how things work like this for now.
Please anyone with suggestions whether small or large are very much welcomed.
I think the Network Coordinator should provide everyone with their drink of choice... ;)
---
Black Panther(RCS)
Castle Rock BBS
--- Mystic BBS v1.12 A45 2020/02/18 (Linux/64)
* Origin: Castle Rock BBS - bbs.castlerockbbs.com - (1337:3/111)
-
From
alterego@1337:2/101 to
Black Panther on Fri Apr 24 12:20:59 2020
Re: Re: Space Daily News
By: Black Panther to MeaTLoTioN on Thu Apr 23 2020 06:27 pm
I think the Network Coordinator should provide everyone with their drink of choice... ;)
I'll second this! :)
...
... Honeymoon - the morning after the knot before.
--- SBBSecho 3.10-Linux
* Origin: I'm playing with ANSI+videotex - wanna play too? (1337:2/101)
-
From
Black Panther@1337:3/111 to
alterego on Thu Apr 23 20:25:00 2020
On 24 Apr 2020, alterego said the following...
I think the Network Coordinator should provide everyone with their dr of choice... ;)
I'll second this! :)
I'll third this! Wait, I can't third it if I firsted it. (That really doesn't sound good if you read it aloud...) ;)
---
Black Panther(RCS)
Castle Rock BBS
--- Mystic BBS v1.12 A45 2020/02/18 (Linux/64)
* Origin: Castle Rock BBS - bbs.castlerockbbs.com - (1337:3/111)
-
From
Black Panther@1337:3/111 to
MeaTLoTioN on Thu Apr 23 20:27:32 2020
On 23 Apr 2020, MeaTLoTioN said the following...
Depending on how the volume of these categories is, I may add more to script. Perhaps Computers & Math, or Matter & Energy next.
Alright, I felt ambitious, and added both Computers & Math, and Matter & Energy. It doesn't look like there will be too many messages being posted.
In my test run a few minutes ago, it generated 8 messages. I'll see how that goes for a few days before considering any more additions.
---
Black Panther(RCS)
Castle Rock BBS
--- Mystic BBS v1.12 A45 2020/02/18 (Linux/64)
* Origin: Castle Rock BBS - bbs.castlerockbbs.com - (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 23 21:30:06 2020
Date:
April 23, 2020
Source:
Stanford's School of Earth, Energy & Environmental Sciences
Summary:
New research indicates river delta deposits within Mars' Jezero
crater -- the destination of NASA's Perseverance rover on the Red
Planet -- formed over time scales that promoted habitability and
enhanced preservation of evidence.
FULL STORY
__________________________________________________________________
New research indicates river delta deposits within Mars' Jezero crater
-- the destination of NASA's Perseverance rover on the Red Planet --
formed over time scales that promoted habitability and enhanced
preservation of evidence.
Undulating streaks of land visible from space reveal rivers once
coursed across the Martian surface -- but for how long did the water
flow? Enough time to record evidence of ancient life, according to a
new Stanford study.
Scientists have speculated that the Jezero crater on Mars -- the site
of the next NASA rover mission to the Red Planet -- could be a good
place to look for markers of life. A new analysis of satellite imagery
supports that hypothesis. By modeling the length of time it took to
form the layers of sediment in a delta deposited by an ancient river as
it poured into the crater, researchers have concluded that if life once
existed near the Martian surface, traces of it could have been captured
within the delta layers.
"There probably was water for a significant duration on Mars and that
environment was most certainly habitable, even if it may have been
arid," according to lead author Mathieu Lapôtre, an assistant professor
of geological sciences at Stanford's School of Earth, Energy &
Environmental Sciences (Stanford Earth). "We showed that sediments were
deposited rapidly and that if there were organics, they would have been
buried rapidly, which means that they would likely have been preserved
and protected."
Jezero crater was selected for NASA's next rover mission partly because
the site contains a river delta; deltas on Earth are known to
effectively preserve organic molecules associated with life. But
without an understanding of the rates and durations of delta-building
events, the analogy remained speculative. The new research, published
online on April 23 in AGU Advances, offers guidance on sample recovery
for NASA's Perseverance rover, which is expected to launch in July 2020
as part of the first Mars sample return mission, in order to better
understand the ancient Martian climate and the duration of delta
formation.
Extrapolating from Earth
The study incorporates a recent discovery the researchers made about
Earth: Single-threaded sinuous rivers that don't have plants growing
over their banks move sideways about ten times faster than those with
vegetation. Based on the strength of Mars' gravity, and assuming the
Red Planet did not have plants, the scientists estimate that the delta
in Jezero crater took at least 20 to 40 years to form, but that
formation was likely discontinuous and spread out across about 400,000
years.
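Taking only the figures quoted above, the implied intermittency of flow
works out as follows (the arithmetic is ours; the inputs are the
study's):

    # 20 to 40 years of cumulative active flow spread over ~400,000
    # years implies rivers were flowing only a tiny fraction of the time.

    elapsed_years = 400_000.0
    for active_years in (20.0, 40.0):
        duty = active_years / elapsed_years
        print(f"{active_years:.0f} active years -> flowing ~{duty:.1e} "
              f"of the time (about 1 year in {elapsed_years / active_years:,.0f})")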
"This is useful because one of the big unknowns on Mars is time,"
Lapôtre said. "By finding a way to calculate a rate for the process, we
can start gaining that dimension of time."
Because single-threaded, meandering rivers are most often found with
vegetation on Earth, their occurrence without plants remained largely
undetected until recently. It was thought that before the appearance of
plants, only braided rivers, made up of multiple interlaced channels,
existed. Now that researchers know to look for them, they have found
meandering rivers on Earth today where there are no plants, such as in
the McLeod Springs Wash in the Toiyabe basin of Nevada.
"This specifically hadn't been done before because single-threaded
rivers without plants were not really on anyone's radar," Lapôtre said.
"It also has cool implications for how rivers might have worked on
Earth before there were plants."
The researchers also estimated that wet spells conducive to significant
delta buildup were about 20 times less frequent on ancient Mars than
they are on Earth today.
"People have been thinking more and more about the fact that flows on
Mars probably were not continuous and that there have been times when
you had flows and other times when you had dry spells," Lapôtre said.
"This is a novel way of putting quantitative constraints on how
frequently flows probably happened on Mars."
Findings from Jezero crater could aid our understanding of how life
evolved on Earth. If life once existed there, it likely didn't evolve
beyond the single-cell stage, scientists say. That's because Jezero
crater formed over 3.5 billion years ago, long before organisms on
Earth became multicellular. If life once existed at the surface, its
evolution was stalled by some unknown event that sterilized the planet.
That means the Martian crater could serve as a kind of time capsule
preserving signs of life as it might once have existed on Earth.
"Being able to use another planet as a lab experiment for how life
could have started somewhere else or where there's a better record of
how life started in the first place -- that could actually teach us a
lot about what life is," Lapôtre said. "These will be the first samples
that we've seen as a rock on Mars and then brought back to Earth, so
it's pretty exciting."
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Stanford's School of Earth, Energy &
Environmental Sciences. Original written by Danielle Torrent Tucker.
Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Mathieu G. A. Lapôtre, Alessandro Ielpi. The Pace of Fluvial
Meanders on Mars and Implications for the Western Delta Deposits of
Jezero Crater. AGU Advances, 2020; 1 (2) DOI:
[19]10.1029/2019AV000141
__________________________________________________________________
--- up 13 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 23 21:30:06 2020
Date:
April 23, 2020
Source:
Cornell University
Summary:
After spotting a curious pattern in scientific papers -- they
described exoplanets as being cooler than expected --
astronomers have improved a mathematical model to accurately
gauge the temperatures of planets from solar systems hundreds of
light-years away.
FULL STORY
__________________________________________________________________
After spotting a curious pattern in scientific papers -- they described
exoplanets as being cooler than expected -- Cornell University
astronomers have improved a mathematical model to accurately gauge the
temperatures of planets from solar systems hundreds of light-years
away.
This new model allows scientists to gather data on an exoplanet's
molecular chemistry and gain insight into the cosmos' planetary
beginnings, according to research published April 23 in Astrophysical
Journal Letters.
Nikole Lewis, assistant professor of astronomy and the deputy director
of the Carl Sagan Institute (CSI), had noticed that over the past five
years, scientific papers described exoplanets as being much cooler than
predicted by theoretical models.
"It seemed to be a trend -- a new phenomenon," Lewis said. "The
exoplanets were consistently colder than scientists would expect."
To date, astronomers have detected more than 4,100 exoplanets. Among
them are "hot Jupiters," a common type of gaseous giant that always
orbits close to its host star. Thanks to the star's overwhelming
gravity, hot Jupiters always have one side facing their star, a
situation known as "tidal locking."
Therefore, as one side of the hot Jupiter broils, the planet's far side
features much cooler temperatures. In fact, the hot side of the tidally
locked exoplanet bulges like a balloon, shaping it like an egg.
From a distance of tens to hundreds of light-years away, astronomers
have traditionally seen the exoplanet's temperature as homogeneous --
averaging the temperature -- making it seem much colder than physics
would dictate.
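A toy calculation illustrates the bias (assuming simple blackbody
emission with flux proportional to T^4; this is not the authors'
retrieval model): a single temperature fitted to the combined light of
a hot and a cold hemisphere matches neither hemisphere, nor their
simple mean.

    # Toy illustration: fit ONE temperature to the disk-averaged
    # thermal emission of a two-sided planet. Hemisphere temperatures
    # below are assumed, not measured values.

    T_day, T_night = 2000.0, 1000.0             # K, assumed hemispheres
    mean_flux = (T_day**4 + T_night**4) / 2.0   # average emission, ~ T^4
    T_fit = mean_flux ** 0.25                   # single-temperature "fit"

    print(f"hot side {T_day:.0f} K, cold side {T_night:.0f} K")
    print(f"1D fit returns {T_fit:.0f} K "
          f"(vs. simple mean {(T_day + T_night) / 2:.0f} K)")

The single fitted value understates the hot side by hundreds of
degrees, the kind of one-dimensional mismatch the Cornell team set out
to correct.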
Temperatures on exoplanets -- particularly hot Jupiters -- can vary by
thousands of degrees, according to lead author Ryan MacDonald, a
researcher at CSI, who said wide-ranging temperatures can promote
radically different chemistry on different sides of the planets.
After poring over exoplanet scientific papers, Lewis, MacDonald and
research associate Jayesh Goyal solved the mystery of seemingly cooler
temperatures: Astronomers' math was wrong.
"When you treat a planet in only one dimension, you see a planet's
properties -- such as temperature -- incorrectly," Lewis said. "You end
up with biases. We knew the 1,000-degree differences were not correct,
but we didn't have a better tool. Now, we do."
Astronomers may now confidently size up exoplanets' molecules.
"We won't be able to travel to these exoplanets any time in the next
few centuries, so scientists must rely on models," MacDonald said,
explaining that when the next generation of space telescopes get
launched starting in 2021, the detail of exoplanet datasets will have
improved to the point where scientists can test the predictions of
these three-dimensional models.
"We thought we would have to wait for the new space telescopes to
launch," said MacDonald, "but our new models suggest the data we
already have -- from the Hubble Space Telescope -- can already provide
valuable clues."
With updated models that incorporate current exoplanet data,
astronomers can tease out the temperatures on all sides of an exoplanet
and better determine the planet's chemical composition.
Said MacDonald: "When these next-generation space telescopes go up, it
will be fascinating to know what these planets are really like."
Funding for this research was provided by Cornell University.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Cornell University. Original written by
Blaine Friedlander. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Ryan J. MacDonald, Jayesh M. Goyal, Nikole K. Lewis. Why Is it So
Cold in Here? Explaining the Cold Temperatures Retrieved from
Transmission Spectra of Exoplanet Atmospheres. The Astrophysical
Journal, 2020; 893 (2): L43 DOI: [19]10.3847/2041-8213/ab8238
__________________________________________________________________
--- up 13 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 23 21:30:08 2020
Date:
April 23, 2020
Source:
Cornell University
Summary:
Mathematicians are using game theory to model how this
competition could be leveraged, so cancer treatment -- which
also takes a toll on the patient's body -- might be administered
more sparingly, with maximized effect.
FULL STORY
__________________________________________________________________
Cancer cells not only ravage the body -- they also compete with each
other.
Cornell mathematicians are using game theory to model how this
competition could be leveraged, so cancer treatment -- which also takes
a toll on the patient's body -- might be administered more sparingly,
with maximized effect.
Their paper, "Optimizing Adaptive Cancer Therapy: Dynamic Programming
and Evolutionary Game Theory," was published April 22 in Proceedings of
the Royal Society B: Biological Sciences.
"There are many game theoretic approaches for modeling how humans
interact, how biological systems interact, how economic entities
interact," said the paper's senior author, Alex Vladimirsky, professor
of mathematics in the College of Arts and Sciences. "You could also
model interactions between different types of cancer cells, which are
competing to proliferate inside the tumor. If you know exactly how
they're competing, you can try to leverage this to fight cancer
better."
Vladimirsky and the paper's lead author, doctoral student Mark Gluzman,
collaborated with oncologist and co-author Jacob Scott of the Cleveland
Clinic. They used evolutionary game theory to model the interactions of
three subpopulations of lung cancer cells that are differentiated by
their relationship to oxygen: glycolytic cells (GLY), vascular
overproducers (VOP) and defectors (DEF).
In this model, previously co-developed by Scott, GLY cells are
anaerobic (i.e., they do not require oxygen); VOP and DEF cells both
use oxygen, but only VOP cells are willing to expend extra energy to
produce a protein that will improve the vasculature and bring more
oxygen to the cells.
Vladimirsky likens their competition to a game of rock, paper, scissors
in which a million people are vying against each other. If the majority
of participants choose to play rock, a greater number of players will
be tempted to switch to paper. As the number of people switching to
paper increases, fewer people will play rock and many more will shift
to playing scissors. As the popularity of scissors grows, rock will
become an attractive option again, and so on.
"So you have three populations, three competitive strategies,
undergoing these cyclic oscillations," said Vladimirsky, who directs
the Center for Applied Mathematics. "Without a drug therapy, the three
subtypes of cancer cells may follow similar oscillating trajectories.
Administering drugs can be viewed as temporarily changing the rules of
the game.
"A natural question is how and when to change the rules to achieve our
goals at a minimal cost -- both in terms of the time to recovery and
the total amount of drugs administered to the patient," he said. "Our
main contribution is in computing how to optimally time these periods
of drug treatment adaptively. We basically developed a map that shows
when to administer drugs based on the current ratio of different
subtypes of cancer."
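The oscillation Vladimirsky describes can be reproduced with standard
replicator dynamics for a rock-paper-scissors payoff matrix (a generic
sketch, not the paper's GLY/VOP/DEF model); in this picture,
administering drugs corresponds to temporarily changing the payoff
matrix A:

    import numpy as np

    # Replicator dynamics for three cyclically competing strategies.
    A = np.array([[ 0.0, -1.0,  1.0],   # rock: loses to paper, beats scissors
                  [ 1.0,  0.0, -1.0],   # paper: beats rock, loses to scissors
                  [-1.0,  1.0,  0.0]])  # scissors: beats paper, loses to rock

    x = np.array([0.5, 0.3, 0.2])       # initial mix of the three strategies
    dt, steps = 0.01, 5001

    for step in range(steps):
        fitness = A @ x                     # payoff of each strategy vs. the mix
        avg = x @ fitness                   # population-average payoff
        x = x + dt * x * (fitness - avg)    # replicator equation, Euler step
        x = np.clip(x, 0.0, None)
        x = x / x.sum()                     # guard against numeric drift
        if step % 1000 == 0:
            print(step, np.round(x, 3))     # fractions cycle rather than settle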
In current clinical practice, cancer patients typically receive
chemotherapy at the highest dosage their body can safely tolerate, and
the side effects can be harsh. In addition, such a continuous treatment
regimen often leads the surviving cancer cells to develop drug
resistance, making further therapy far more difficult. The team's paper
shows that a well-timed "adaptive" application could potentially lead
to a patient's recovery with a greatly reduced amount of drugs.
But Vladimirsky cautions that, as is often the case in mathematical
modeling, reality is much messier than theory. Biological interactions
are complicated, often random, and can vary from patient to patient.
"Our optimization approach and computational experiments were all based
on a particular simplified model of cancer evolution," he said. "In
principle, the same ideas should also be applicable to much more
detailed, and even patient-specific, models, but we are still a long
way from there. We view this paper as a necessary early step on the
road to practical use of adaptive, personalized drug-therapy. Our
results are a strong argument for incorporating timing optimization
into the protocol of future clinical trials."
The research was supported by the National Institutes of Health Case
Comprehensive Cancer Center; National Cancer Institute; the Simons
Foundation; the National Science Foundation; and the Chinese University
of Hong Kong, Shenzhen.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Cornell University. Original written by
David Nutt. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Mark Gluzman, Jacob G. Scott, Alexander Vladimirsky. Optimizing
adaptive cancer therapy: dynamic programming and evolutionary game
theory. Proceedings of the Royal Society B: Biological Sciences,
2020; 287 (1925): 20192454 DOI: [19]10.1098/rspb.2019.2454
__________________________________________________________________
--- up 13 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 23 21:30:08 2020
Date:
April 23, 2020
Source:
DOE/Argonne National Laboratory
Summary:
Scientists have identified a new class of X-ray detectors based
on layered perovskites, a semiconducting material.
FULL STORY
__________________________________________________________________
New perovskite-based detectors can sense X-rays over a broad energy
range.
Getting an X-ray at the dentist or the doctor is at best a little
inconvenient and at worst a little risky, as radiation exposure has
been linked to an increased risk of cancer. But researchers may have
discovered a new way to generate precise X-ray images with a lower
amount of exposure, thanks to an exciting set of materials that is
generating a lot of interest.
Scientists at the U.S. Department of Energy's (DOE) Argonne National
Laboratory and Los Alamos National Laboratory have identified a new
class of X-ray detectors based on layered perovskites, a semiconducting
material also used in some other types of applications such as solar
cells and light-emitting diodes. The detector with the new material is
100 times more sensitive than conventional, silicon-based X-ray
detectors.
"This new material for detecting X-rays could soon find its way into a
variety of different everyday environments, from the doctor's office to
airport security lines to research labs," said Argonne X-ray physicist
Joseph Strzalka, who helped to characterize the perovskite material at
Argonne's Advanced Photon Source (APS), a DOE Office of Science User
Facility.
The perovskite detectors are also cheaper to produce because the
material is deposited as a sprayed-on thin film, a production method
that avoids having to grow a large silicon single crystal.
The new perovskite detectors can also detect X-rays over a broad energy
range, especially at higher energies. This is because the perovskite
contains heavy elements, such as lead and iodine, which tend to absorb
these X-rays more readily than silicon. The potential even exists for
the perovskite technology to be used as a gamma-ray detector, provided
the films are made a little bit thicker and a small external voltage is
applied.
"The perovskite material at the heart of our detector prototype can be
produced with low-cost solution process fabrication techniques," said
Hsinhan (Dave) Tsai, an Oppenheimer postdoctoral fellow at Los Alamos
National Laboratory. "The result is a cost-effective, highly sensitive
and self-powered detector that could radically improve existing X-ray
detectors, and potentially lead to a host of unforeseen applications."
The development and analysis of the perovskite material was a close
collaboration between Argonne APS (Sector 8-ID-E) and a Los Alamos team
led by device physicist Wanyi Nie. The material and thin film were
created at Los Alamos and brought to Argonne to perform grazing
incidence wide-angle X-ray scattering, which gives information about
the crystallinity of the thin film. According to Strzalka, the
technique shows how the crystal is oriented in the thin film, which
relates to the performance of the detector.
Strzalka and Nie were also interested in how the charge transport
properties of the film related to the crystal structure and
temperature. By using a special stage that allowed the researchers to
change the temperature of the sample and make electrical contacts
during the measurement, they were able to understand the current
generation and transport processes induced in the sample by the X-ray
exposure.
"Our instrument at the beamline provides a versatile platform for
different kinds of in-situ measurements, including keeping the sample
in a vacuum environment while maintaining its temperature and also
performing charge transport measurements," Strzalka said.
According to Strzalka, perovskites may continue to offer important
breakthroughs. "The perovskite area is really hot right now, and users
come to us to say 'can we do this and can we do that,' and it's really
pushing us to develop our capabilities," he said.
The research was funded by Los Alamos National Laboratory's Laboratory
Directed Research and Development (LDRD) program and DOE's Office of
Science.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]DOE/Argonne National Laboratory. Original
written by Jared Sagoff. Note: Content may be edited for style and
length.
__________________________________________________________________
Journal Reference:
1. Hsinhan Tsai, Fangze Liu, Shreetu Shrestha, Kasun Fernando, Sergei
Tretiak, Brian Scott, Duc Ta Vo, Joseph Strzalka, Wanyi Nie. A
sensitive and robust thin-film x-ray detector using 2D layered
perovskite diodes. Science Advances, 2020; 6 (15): eaay0815 DOI:
[19]10.1126/sciadv.aay0815
__________________________________________________________________
--- up 13 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 23 21:30:08 2020
Date:
April 23, 2020
Source:
Cell Press
Summary:
Researchers have been able to restore sensation to the hand of a
research participant with a severe spinal cord injury using a
brain-computer interface (BCI) system. The technology harnesses
neural signals that are so minuscule they can't be perceived and
enhances them via artificial sensory feedback sent back to the
participant, resulting in greatly enriched motor function.
FULL STORY
__________________________________________________________________
While we might often take our sense of touch for granted, for
researchers developing technologies to restore limb function in people
paralyzed due to spinal cord injury or disease, re-establishing the
sense of touch is an essential part of the process. And on April 23 in
the journal Cell, a team of researchers at Battelle and the Ohio State
University Wexner Medical Center report that they have been able to
restore sensation to the hand of a research participant with a severe
spinal cord injury using a brain-computer interface (BCI) system. The
technology harnesses neural signals that are so minuscule they can't be
perceived and enhances them via artificial sensory feedback sent back
to the participant, resulting in greatly enriched motor function.
"We're taking subperceptual touch events and boosting them into
conscious perception," says first author Patrick Ganzer, a principal
research scientist at Battelle. "When we did this, we saw several
functional improvements. It was a big eureka moment when we first
restored the participant's sense of touch."
The participant in this study is Ian Burkhart, a 28-year-old man who
suffered a spinal cord injury during a diving accident in 2010. Since
2014, Burkhart has been working with investigators on a project called
NeuroLife that aims to restore function to his right arm. The device
they have developed works through a system of electrodes on his skin
and a small computer chip implanted in his motor cortex. This setup,
which uses wires to route movement signals from the brain to the
muscles, bypassing his spinal cord injury, gives Burkhart enough
control over his arm and hand to lift a coffee mug, swipe a credit
card, and play Guitar Hero.
"Until now, at times Ian has felt like his hand was foreign due to lack
of sensory feedback," Ganzer says. "He also has trouble with
controlling his hand unless he is watching his movements closely. This
requires a lot of concentration and makes simple multitasking like
drinking a soda while watching TV almost impossible."
The investigators found that although Burkhart had almost no sensation
in his hand, when they stimulated his skin, a neural signal -- so small
that his brain was unable to perceive it -- was still getting to his
brain. Ganzer explains that even in people like Burkhart who have what
is considered a "clinically complete" spinal cord injury, there are
almost always a few wisps of nerve fiber that remain intact. The Cell
paper explains how they were able to boost these signals to the level
where the brain would respond.
The subperceptual touch signals were artificially sent back to Burkhart
using haptic feedback. Common examples of haptic feedback are the
vibration from a mobile phone or game controller that lets the user
feel that something is working. The new system allows the subperceptual
touch signals coming from Burkhart's skin to travel back to his brain
through artificial haptic feedback that he can perceive.
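Conceptually, the boost maps a signal too weak to feel into the band a
user can perceive. A minimal sketch of that idea (the names, ranges and
thresholds below are our assumptions, not the NeuroLife system's):

    # Rescale a faint touch signal (normalized to 0..1) into a
    # vibration intensity strong enough to perceive.

    PERCEPTION_FLOOR = 0.25    # below this, the user feels nothing
    NOISE_FLOOR = 0.02         # below this, treat the signal as noise

    def haptic_drive(touch_signal: float) -> float:
        """Map a subperceptual touch signal onto the perceivable band."""
        if touch_signal <= NOISE_FLOOR:
            return 0.0         # nothing touched: no vibration
        # Linear remap of (NOISE_FLOOR, 1.0] onto (PERCEPTION_FLOOR, 1.0]
        span = (touch_signal - NOISE_FLOOR) / (1.0 - NOISE_FLOOR)
        return PERCEPTION_FLOOR + span * (1.0 - PERCEPTION_FLOOR)

    for s in (0.01, 0.05, 0.20, 0.60):
        print(f"touch {s:.2f} -> vibration {haptic_drive(s):.2f}")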
The advances in the BCI system led to three important improvements.
They enable Burkhart to reliably detect something by touch alone: in
the future, this may be used to find and pick up an object without
being able to see it. The system also is the first BCI that allows for
restoration of movement and touch at once, and this ability to
experience enhanced touch during movement gives him a greater sense of
control and lets him do things more quickly. Finally, these
improvements allow the BCI system to sense how much pressure to use
when handling an object or picking something up -- for example, using a
light touch when picking up a fragile object like a Styrofoam cup but a
firmer grip when picking up something heavy.
The investigators' long-term goal is to develop a BCI system that works
as well in the home as it does in the laboratory. They are working on
creating a next-generation sleeve containing the required electrodes
and sensors that could be easily put on and taken off. They also aim to
develop a system that can be controlled with a tablet rather than a
computer, making it smaller and more portable.
"It has been amazing to see the possibilities of sensory information
coming from a device that was originally created to only allow me to
control my hand in a one-way direction," Burkhart says.
__________________________________________________________________
Story Source:
Materials provided by [17]Cell Press. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. Patrick D. Ganzer, Samuel C. Colachis, Michael A. Schwemmer, David
A. Friedenberg, Collin F. Dunlap, Carly E. Swiftney, Adam F.
Jacobowitz, Doug J. Weber, Marcia A. Bockbrader, Gaurav Sharma.
Restoring the Sense of Touch Using a Sensorimotor Demultiplexing
Neural Interface. Cell, 2020; DOI: [18]10.1016/j.cell.2020.03.054
__________________________________________________________________
--- up 13 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 23 21:30:12 2020
Date:
April 23, 2020
Source:
National Institute of Standards and Technology (NIST)
Summary:
Scientists used UV light and glow powder to study the way small
amounts of drug residue get spread around a forensic chemistry
lab when analysts test seized drugs. Their study addresses
safety concerns in an age of super-potent synthetic drugs like
fentanyl, which can potentially be hazardous to chemists who
handle them frequently.
FULL STORY
__________________________________________________________________
When two scientists from the National Institute of Standards and
Technology (NIST) brought black lights and glow powder into the
Maryland State Police crime lab, they weren't setting up a laser tag
studio or nightclub.
Instead, their aim was to study the way drug particles get spread
around crime labs when analysts test suspected drug evidence. Their
study, recently published in Forensic Chemistry, addresses safety
concerns in an age of super-potent synthetic drugs like fentanyl, which
can potentially be hazardous to chemists who handle them frequently.
The spread of drug particles cannot be completely avoided -- it is an
inevitable result of the forensic analyses that crime labs must
perform. To see how it happens, the two NIST research scientists,
Edward Sisco and Matthew Staymates, fabricated a brick made of white
flour mixed with a small amount of fluorescent powder. Under everyday
lights the brick looked like evidence from a drug seizure, but under
ultraviolet light -- also called UV or black light -- it glowed a
bright orange.
Amber Burns, supervisor of the Maryland State Police forensic chemistry
lab and a co-author of the study, examined the brick and its contents
as she would real evidence. With a sheet of butcher paper covering her
workspace, she cut open the package with a scalpel, scooped out a
sample and transferred that scoop into a glass vial for analysis.
She also removed the powder to weigh it on a digital scale without the
packaging. When she was done, the black light revealed that some
particles had settled onto surfaces in her workspace. Some had also
adhered to her gloves and were transferred by touch onto a marker and
wash bottle.
All chemists clean their workspaces between cases to prevent evidence
from one case from contaminating the next. After Burns discarded the
butcher paper and cleaned her workspace, the black light showed that
her cleanup routine was effective.
Before the emergence of fentanyl and other super-potent drugs, such
small amounts of drug residue were not a major concern. But that has
changed, and not only for reasons of workplace safety. Drug dealers
often mix small amounts of fentanyl into heroin and cocaine, and some
labs are increasing the sensitivity of their instruments to detect
those small amounts. Highly sensitive instruments are more likely to
detect small amounts of drug residue in the environment, so those labs
have to be extra careful about limiting their spread.
This visualization experiment led the authors to suggest several steps
that might minimize spread. These include changing gloves frequently,
using vials and test tubes with large mouths to limit spillage when
transferring material into them, and having two sets of wash bottles,
one for casework and one for cleanup.
The researchers' paper is written in such a way that any laboratory can
reproduce the black-light experiment.
"This is a great way for labs to see which of their practices
contribute to the spread of drug residues, and to make sure that their
cleanup routines are effective," Sisco said.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]National Institute of Standards and
Technology (NIST). Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Edward Sisco, Matthew E. Staymates, Amber Burns. An easy to
implement approach for laboratories to visualize particle spread
during the handling and analysis of drug evidence. Forensic
Chemistry, 2020; 18: 100232 DOI: [19]10.1016/j.forc.2020.100232
__________________________________________________________________
--- up 13 weeks, 2 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 23 21:30:12 2020
Date:
April 23, 2020
Source:
University of California - San Diego
Summary:
The mammalian cell lines that are engineered to produce
high-value recombinant-protein drugs also produce unwanted
proteins that push up the overall cost to manufacture these
drugs. These same proteins can also lower drug quality.
Researchers have now shown that their genome-editing techniques
could eliminate up to 70 percent of the contaminating protein by
mass in recombinant-protein drugs produced by the workhorses of
mammalian cells -- Chinese Hamster Ovary (CHO) cells.
FULL STORY
__________________________________________________________________
The mammalian cell lines that are engineered to produce high-value
recombinant-protein drugs also produce unwanted proteins that push up
the overall cost to manufacture these drugs. These same proteins can
also lower drug quality. In a new paper in Nature Communications,
researchers from the University of California San Diego and the
Technical University of Denmark showed that their genome-editing
techniques could eliminate up to 70 percent of the contaminating
protein by mass in recombinant-protein drugs produced by the workhorses
of mammalian cells -- Chinese Hamster Ovary (CHO) cells.
With the team's CRISPR-Cas mediated gene editing approach, the
researchers demonstrate a significant decrease in purification demands
across the mammalian cell lines they investigated. This work could lead
to both lower production costs and higher quality drugs.
Recombinant proteins currently account for the majority of the top
drugs by sales, including drugs for treating complex diseases ranging
from arthritis to cancer, and even for combating infectious diseases
such as COVID-19 with neutralizing antibodies. However, the cost of
these drugs
puts them out of reach of much of the world population. The high cost
is due in part to the fact that they are produced in cultured cells in
the laboratory. One of the major costs is purification of these drugs,
which can account for up to 80 percent of the manufacturing costs.
In an international collaboration, researchers at the University of
California San Diego and the Technical University of Denmark recently
demonstrated the potential to protect the quality of recombinant
protein drugs while substantially increasing their purity prior to
purification, as reported in the study entitled "Multiplex secretome
engineering enhances recombinant protein production and purity"
published in April 2020 in the journal Nature Communications.
"Cells, such as Chinese hamster ovary (CHO) cells, are cultured and
used to produce many leading drugs," explained Nathan E. Lewis,
Associate Professor of Pediatrics and Bioengineering at the University
of California San Diego, and Co-Director of the CHO Systems Biology
Center at UC San Diego. "However, in addition to the medications we
want, the cells also produce and secrete at least hundreds of their own
proteins into the broth. The problem is that some of these proteins can
degrade the quality of the drugs or could elicit negative side effects
in a patient. That's why there are such strict rules for purification,
since we want the safest and most effective medications possible."
These secreted host cell proteins (HCPs) are carefully removed from
every batch of drug, but before they are removed they can degrade the
quality and potency of the drugs, and the various purification steps
can themselves remove some of the drug or damage it further.
"Already at an early stage of our research program, we wondered how
many of these secreted contaminating host cell proteins could be
removed," recounted Director Bjorn Voldborg, Head of the CHO Core
facility at the Center for Biosustainability at the Technical University
of Denmark.
In 2012 the Novo Nordisk Foundation awarded a large grant, which has
funded ground-breaking work in genomics, systems biology and large
scale genome editing for research and technology development of CHO
cells at the Center for Biosustainability at the Technical University
of Denmark (DTU) and the University of California San Diego. This
funded the first publicly accessible genome sequences for CHO cells,
and has provided a unique opportunity to combine synthetic and systems
biology to rationally engineer CHO cells for biopharmaceutical
production.
"Host cell proteins can be problematic if they pose a significant
metabolic demand, degrade product quality, or are maintained throughout
downstream purification," explained Stefan Kol, lead author on the
study who performed this research while at DTU. "We hypothesized that
with multiple rounds of CRISPR-Cas mediated gene editing, we could
decrease host cell protein levels in a stepwise fashion. At this point,
we did not expect to make a large impact on HCP secretion considering
that there are thousands of individual HCPs that have been previously
identified."
This work builds on a promising computational model of recombinant
protein production in CHO cells that researchers at UC San Diego
published earlier in 2020, also in Nature Communications. Jahir
Gutierrez, a former bioengineering Ph.D. student at UC San Diego, used
this model to quantify the metabolic
cost of producing each host cell protein in the CHO secretome, and with
the help of Austin Chiang, a project scientist in the Department of
Pediatrics at UC San Diego, showed that a relatively small number of
secreted proteins account for the majority of the cell energy and
resources. Thus the idea to eliminate the dominant contaminating
proteins had the potential to free up a non-negligible amount of
cellular resources and protect drug quality. The authors identified and
removed 14 contaminating host-cell proteins in CHO cells. In doing this
they eliminated up to 70 percent of the contaminating protein by mass
and demonstrated a significant decrease in purification demands.
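The logic is essentially a Pareto-style triage: rank the secreted
proteins by the share of mass (and metabolic cost) they account for,
then knock out the top offenders until the target share is covered. The
sketch below illustrates that greedy selection in Python; the protein
names and mass fractions are hypothetical placeholders, not data from
the study.

    # Greedy selection of the most abundant host cell proteins (HCPs)
    # until a target share of secreted mass is covered. All names and
    # fractions below are hypothetical, for illustration only.
    hcp_mass_fractions = {
        "HCP-A": 0.25, "HCP-B": 0.16, "HCP-C": 0.12, "HCP-D": 0.09,
        "HCP-E": 0.06, "HCP-F": 0.04, "HCP-G": 0.02, "other": 0.26,
    }

    def select_targets(fractions, target=0.70):
        """Pick the most abundant HCPs until their combined mass
        fraction reaches the target (e.g. 70 percent)."""
        chosen, covered = [], 0.0
        for name, frac in sorted(fractions.items(), key=lambda kv: -kv[1]):
            if name == "other":
                continue  # aggregate of the thousands of minor HCPs
            chosen.append(name)
            covered += frac
            if covered >= target:
                break
        return chosen, covered

    targets, covered = select_targets(hcp_mass_fractions)
    print(targets, f"cover {covered:.0%} of secreted HCP mass")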
These modifications can be combined with additional advantageous
genetic modifications being identified by the team in an effort to
obtain higher quality medications at lower costs.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]University of California - San Diego.
Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Stefan Kol, Daniel Ley, Tune Wulff, Marianne Decker, Johnny
Arnsdorf, Sanne Schoffelen, Anders Holmgaard Hansen, Tanja Lyholm
Jensen, Jahir M. Gutierrez, Austin W. T. Chiang, Helen O. Masson,
Bernhard O. Palsson, Bjørn G. Voldborg, Lasse Ebdrup Pedersen,
Helene Faustrup Kildegaard, Gyun Min Lee, Nathan E. Lewis.
Multiplex secretome engineering enhances recombinant protein
production and purity. Nature Communications, 2020; 11 (1) DOI:
[19]10.1038/s41467-020-15866-w
__________________________________________________________________
--- up 13 weeks, 2 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 23 21:30:12 2020
they're cool
Date:
April 23, 2020
Source:
University of California - Santa Barbara
Summary:
Have you ever noticed how a bite of warm cherry pie fills your
mouth with sweetness, but that same slice right out of the
refrigerator isn't nearly as tempting? Scientists know this
phenomenon to be true, but the mechanism behind it has been
poorly understood.
FULL STORY
__________________________________________________________________
Have you ever noticed how a bite of warm cherry pie fills your mouth
with sweetness, but that same slice right out of the refrigerator isn't
nearly as tempting? Scientists know this phenomenon to be true, but the
mechanism behind it has been poorly understood.
Now, using fruit flies as his subjects, UC Santa Barbara Distinguished
Professor Craig Montell has discovered one process responsible for this
occurrence. Montell's team, which includes Qiaoran Li, Nicolas
DeBeaubien and Takaaki Sokabe, found that cool temperatures suppress
the appeal of sweetness. However, these conditions did not affect the
sugar neurons themselves. Rather, they acted via other sensory cells by
way of a protein originally discovered to sense light in the eye.
Despite this, the perception of coolness in sugary food is not altered
by light. The results appear in the journal Current Biology.
"The appeal of food is influenced by more than just chemical
composition," said Montell, the Duggan professor in the Department of
Molecular, Cellular, and Developmental Biology. "We already know that
cool temperatures reduce the delectability of sweetness in humans." He
and his colleagues wondered whether this was also true in fruit flies,
and if so, what the underlying mechanisms might be.
The team found a significant difference in fruit flies' interest in
feeding between 23 degrees Celsius (73.4° Fahrenheit) and 19° C (66.2°
F). That said, they measured no difference in the activity of the
flies' sweet-sensing taste neurons, despite the change in behavior.
"Since the temperature is not directly affecting the sugar neurons, it
must be affecting some other types of cells, which then indirectly
affect the propensity to consume sugar," Montell noted.
Fruit flies detect sugar with one type of taste neuron. Bitter is
sensed by another type of neuron, and mechanosensory neurons detect the
texture of food, such as hardness. However, temperature sensation is
not quite as simple. Both bitter and mechanosensory neurons are also
involved in detecting coolness. Only if both are activated does the
brain interpret that as a cool signal.
All of these stimuli seem to reduce the animal's desire to feed,
explained Montell. Bitter compounds trigger bitter neurons, which tell
the fly to stop feeding. Hard foods trigger the mechanosensory neurons,
which also tell the fly to stop feeding. And cool temperatures trigger
both, to the same effect.
Critical to this response is a protein called rhodopsin 6. Rhodopsins
are most commonly associated with vision, but over the past few years
the Montell group has connected rhodopsins to a variety of other
senses. Indeed, just a couple weeks prior, Montell's lab published the
first study connecting different members of this class of protein to
chemical taste.
"The bitter neurons express this rhodopsin called Rh6, and if you get
rid of it, then cool temperatures no longer suppress the appeal of
sugar," he said.
Without Rh6, the bitter-and-cool-detecting neurons are no longer turned
on by low temperatures. And since cool-sensation requires activating
multiple, different types of neurons, loss of Rh6 prevents the fly from
recognizing the lower temperature, thereby eliminating the decreased
attraction to sugary food.
"The surprise was finding that it was really the other neurons, not the
sugar neurons, whose activity went up," Montell said, "and that the
cool activation of other neurons was indirectly suppressing the sugar
neurons."
The sweet-sensing neurons are still activated by sugars at low
temperatures; however, the activation of these other neurons by
decreased temperature suppresses the communication between the
sweet-detecting neurons and the animal's brain. This is likely achieved
by an inhibitory neurotransmitter released by the bitter/cool-activated
neurons.
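Read as a circuit, the mechanism reduces to a small piece of logic: the
cool percept requires coincident activation of two Rh6-dependent neuron
types, and that percept gates the sugar signal. The toy model below is
an illustration of that description, not the authors' analysis; the
20-degree threshold is an assumed round number between the two test
temperatures.

    def feeding_drive(sugar_present, temp_c, has_rh6, cool_threshold=20.0):
        """Toy model: returns True if the fly's brain receives the
        sugar signal and feeding is favoured."""
        sugar_neuron = sugar_present          # unaffected by temperature
        cold = temp_c < cool_threshold
        bitter_neuron = cold and has_rh6      # Rh6-dependent cool response
        mechano_neuron = cold and has_rh6
        cool_percept = bitter_neuron and mechano_neuron  # coincidence detection
        # the cool percept inhibits sugar-neuron-to-brain transmission
        return sugar_neuron and not cool_percept

    print(feeding_drive(True, 23, has_rh6=True))   # warm: True, fly feeds
    print(feeding_drive(True, 19, has_rh6=True))   # cool: False, suppressed
    print(feeding_drive(True, 19, has_rh6=False))  # Rh6 knockout: True again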
As for why fruit flies avoid food when it's chilly, Montell suspects
it's due to their metabolism. A fly's metabolism, and thus its food
requirements, depends on temperature. Lower temperatures mean
slower metabolisms, and less need for food. And generally, if the food
is cold, so is the fly.
In fact, the fly generation time -- the time it takes an egg to turn
into an adult fly -- doubles from 10 days to 20 when the temperature is
lowered from 25 to 18 degrees Celsius. "Everything is just slowed
down," Montell said, "and that's why feeding is reduced. You don't want
to eat the same amount when your metabolism is slowed down." This
explanation doesn't hold true for warm-blooded animals like humans,
even if we show a similar behavior.
In the future, Montell and first author Qiaoran Li plan to further
investigate the mechanosensory side of food appeal by looking at how
particle size influences feeding behavior. As an example, he offers the
distinct difference between fresh and refrozen ice cream. Despite
having the same chemical composition and temperature, most people
prefer ice cream that hasn't melted and refrozen into a block.
Reflecting on the surprising finding, Montell remarked, "It's great for
your expectations to be wrong, as long as you can then figure out
what's right."
__________________________________________________________________
Story Source:
[17]Materials provided by [18]University of California - Santa Barbara.
Original written by Harrison Tasoff. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. Qiaoran Li, Nicolas A. DeBeaubien, Takaaki Sokabe, Craig Montell.
Temperature and Sweet Taste Integration in Drosophila. Current
Biology, 2020; DOI: [19]10.1016/j.cub.2020.03.066
__________________________________________________________________
--- up 13 weeks, 2 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 23 21:30:12 2020
Date:
April 23, 2020
Source:
University of Illinois at Urbana-Champaign
Summary:
Researchers have demonstrated an inexpensive yet sensitive
smartphone-based testing device for viral and bacterial
pathogens that takes about 30 minutes to complete. The roughly
$50 smartphone accessory could reduce the pressure on testing
laboratories during a pandemic such as COVID-19.
FULL STORY
__________________________________________________________________
Most viral test kits rely on labor- and time-intensive laboratory
preparation and analysis techniques; for example, tests for the novel
coronavirus can take days to detect the virus from nasal swabs. Now,
researchers have demonstrated an inexpensive yet sensitive
smartphone-based testing device for viral and bacterial pathogens that
takes about 30 minutes to complete. The roughly $50 smartphone
accessory could reduce the pressure on testing laboratories during a
pandemic such as COVID-19.
The results of the new multi-institutional study, led by University of
Illinois at Urbana-Champaign electrical and computer engineering
professor Brian Cunningham and bioengineering professor Rashid Bashir,
are reported in the journal Lab on a Chip.
"The challenges associated with rapid pathogen testing contribute to a
lot of uncertainty regarding which individuals are quarantined and a
whole host of other health and economic issues," Cunningham said.
The study began with the goal of detecting a panel of viral and
bacterial pathogens in horses, including those that cause severe
respiratory illnesses similar to those presented in COVID-19, the
researchers said.
"Horse pathogens can lead to devastating diseases in animal
populations, of course, but one reason we work with them has to do with
safety. The horse pathogens in our study are harmless to humans,"
Cunningham said.
The new testing device is composed of a small cartridge containing
testing reagents and a port to insert a nasal extract or blood sample,
the researchers said. The whole unit clips to a smartphone.
Inside the cartridge, the reagents break open a pathogen's outer shell
to gain access to its RNA. A primer molecule then amplifies the genetic
material into many millions of copies in about 10 or 15 minutes, the
researchers said. A fluorescent dye stains the copies and glows green
when illuminated by blue LED light, which is then detected by the
smartphone's camera.
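The 10-to-15-minute window is plausible because the amplification is
exponential. The back-of-envelope sketch below is illustrative only;
the 30-second effective doubling time and the starting copy number are
assumed values, not figures from the paper.

    import math

    def copies(initial_copies, minutes, doubling_time_min=0.5):
        # exponential growth: one doubling every doubling_time_min
        return initial_copies * 2 ** (minutes / doubling_time_min)

    def minutes_to_reach(target, initial_copies, doubling_time_min=0.5):
        # invert the growth law to estimate the required reaction time
        return doubling_time_min * math.log2(target / initial_copies)

    print(f"{copies(10, 15):.1e} copies after 15 min")                  # ~1.1e10
    print(f"{minutes_to_reach(1e7, 10):.1f} min to 10 million copies")  # ~10 min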
"This test can be performed rapidly on passengers before getting on a
flight, on people going to a theme park or before events like a
conference or concert," Cunningham said. "Cloud computing via a
smartphone application could allow a negative test result to be
registered with event organizers or as part of a boarding pass for a
flight. Or, a person in quarantine could give themselves daily tests,
register the results with a doctor, and then know when it's safe to
come out and rejoin society."
There are a few preparatory steps currently performed outside of the
device, and the team is working on a cartridge that has all of the
reagents needed to be a fully integrated system. Other researchers at
the U. of I. are using the novel coronavirus genome to create a mobile
test for COVID-19, and making an easily manufactured cartridge that
Cunningham said would improve testing efforts.
Study co-authors with Cunningham and Bashir were Fu Sun, Anurup Ganguli
and Matthew B. Wheeler, of the U. of I.; Ryan Brisbin and David L.
Hirschberg, of RAIN Incubator; Krithika Shanmugam, of the University of
Washington; and veterinarian David M. Nash.
The National Science Foundation and the Center for Genomic Diagnostics
at the U. of I.'s Carl R. Woese Institute for Genomic Biology supported
this research.
Bashir also is the dean of the Grainger College of Engineering at
Illinois.
Cunningham also is affiliated with bioengineering and materials science
and engineering, the Carl R. Woese Institute for Genomic Biology, the
Holonyak Micro and Nanotechnology Lab, the Carle Illinois College of
Medicine and the Beckman Institute for Advanced Science and Technology
at Illinois.
Cunningham serves as a consultant to and owns stock in Reliant Immune
Diagnostics, the company that licensed the technology described in this
news release.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]University of Illinois at
Urbana-Champaign. Original written by Lois Yoksoulian. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Fu Sun, Anurup Ganguli, Judy Nguyen, Ryan Brisbin, Krithika
Shanmugam, David L. Hirschberg, Matthew B Wheeler, Rashid Bashir,
David M. Nash, Brian T Cunningham. Smartphone-Based Multiplex
30-minute Nucleic Acid Test of Live Virus from Nasal Swab Extract.
Lab on a Chip, 2020; DOI: [19]10.1039/D0LC00304B
__________________________________________________________________
--- up 13 weeks, 2 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
MeaTLoTioN@1337:1/101 to
All on Fri Apr 24 09:30:50 2020
On Thu, 23 Apr 2020 20:25:00 GMT
Black Panther wrote:
On 24 Apr 2020, alterego said the following...
I think the Network Coordinator should provide everyone with
their drink of choice... ;)
I'll second this! :)
I'll third this! Wait, I can't third it if I firsted it. (That really doesn't sound good if you read it aloud...) ;)
Making me thirsty now... I will buy you all a round in the local pub next
time we all meet up, how's that?
Can't say fairer than that right? =)
--
Best regards,
MeaTLoTioN
--- Mystic BBS/NNTP v1.12 A43 2019/03/02 (Linux/64)
* Origin: thE qUAntUm wOrmhOlE, rAmsgAtE, Uk. bbs.erb.pw (1337:1/101)
-
From
SpaceDaily@1337:3/111 to
All on Fri Apr 24 21:30:10 2020
Date:
April 24, 2020
Source:
University of Rochester
Summary:
Researchers have applied physics theory and calculations to
predict the presence of two new phenomena -- interspecies
radiative transition (IRT) and the breakdown of the dipole
selection rule -- in the transport of radiation in atoms and
molecules under high-energy-density (HED) conditions. The
research enhances an understanding of HED science and could lead
to more information about how stars and other astrophysical
objects evolve in the universe.
FULL STORY
__________________________________________________________________
Atoms and molecules behave very differently at extreme temperatures and
pressures. Although such extreme matter doesn't exist naturally on the
earth, it exists in abundance in the universe, especially in the deep
interiors of planets and stars. Understanding how atoms react under
high-pressure conditions -- a field known as high-energy-density
physics (HEDP) -- gives scientists valuable insights into the fields of
planetary science, astrophysics, fusion energy, and national security.
One important question in the field of HED science is how matter under
high-pressure conditions might emit or absorb radiation in ways that
are different from our traditional understanding.
In a paper published in Nature Communications, Suxing Hu, a
distinguished scientist and group leader of the HEDP Theory Group at
the University of Rochester Laboratory for Laser Energetics (LLE),
together with colleagues from the LLE and France, has applied physics
theory and calculations to predict the presence of two new phenomena --
interspecies radiative transition (IRT) and the breakdown of the dipole
selection rule -- in the transport of radiation in atoms and molecules
under HEDP conditions. The research enhances an understanding of HEDP
and could lead to more information about how stars and other
astrophysical objects evolve in the universe.
WHAT IS INTERSPECIES RADIATIVE TRANSITION (IRT)?
Radiative transition is a physical process occurring inside atoms and
molecules, in which their electron or electrons can "jump" between
different energy levels by either emitting or absorbing a photon.
Scientists find that, for matter in our everyday life, such
radiative transitions mostly happen within each individual atom or
molecule; the electron does its jumping between energy levels belonging
to the single atom or molecule, and the jumping does not typically
occur between different atoms and molecules.
However, Hu and his colleagues predict that when atoms and molecules
are placed under HED conditions, and are squeezed so tightly that they
become very close to each other, radiative transitions can involve
neighboring atoms and molecules.
"Namely, the electrons can now jump from one atom's energy levels to
those of other neighboring atoms," Hu says.
WHAT IS THE DIPOLE SELECTION RULE?
Electrons inside an atom have specific symmetries. For example, "s-wave
electrons" are always spherically symmetric, meaning they look like a
ball, with the nucleus located in the atomic center; "p-wave
electrons," on the other hand, look like dumbbells. D-waves and other
electron states have more complicated shapes. Radiative transitions
will mostly occur when the electron jumping follows the so-called
dipole selection rule, in which the jumping electron changes its shape
from s-wave to p-wave, from p-wave to d-wave, etc.
Under normal, non-extreme conditions, Hu says, "one hardly sees
electrons jumping among the same shapes, from s-wave to s-wave and from
p-wave to p-wave, by emitting or absorbing photons."
However, as Hu and his colleagues found, when materials are squeezed so
tightly into the exotic HED state, the dipole selection rule is often
broken down.
"Under such extreme conditions found in the center of stars and classes
of laboratory fusion experiments, non-dipole x-ray emissions and
absorptions can occur, which was never imagined before," Hu says.
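In textbook notation, the dipole selection rule constrains how the
orbital angular momentum quantum number l of the jumping electron may
change (a standard result, restated here for orientation):

    \Delta l = \pm 1    (dipole-allowed: s -> p, p -> d, ...)
    \Delta l = 0        (dipole-forbidden: s -> s, p -> p, ...)

The non-dipole x-ray emissions and absorptions predicted here
correspond to the normally forbidden \Delta l = 0 channels opening up
under extreme compression.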
USING SUPERCOMPUTERS TO STUDY HEDP
The researchers used supercomputers at both the University of
Rochester's Center for Integrated Research Computing (CIRC) and at the
LLE to conduct their calculations.
"Thanks to the tremendous advances in high-energy laser and
pulsed-power technologies, 'bringing stars to the Earth' has become
reality for the past decade or two," Hu says.
Hu and his colleagues performed their research using the
density-functional theory (DFT) calculation, which offers a quantum
mechanical description of the bonds between atoms and molecules in
complex systems. The DFT method was first described in the 1960s, and
was the subject of the 1998 Nobel Prize in Chemistry. DFT calculations
have been continually improved since. One such improvement to enable
DFT calculations to involve core electrons was made by Valentin
Karasiev, a scientist at the LLE and a co-author of the paper.
The results indicate there are new emission/absorption lines appearing
in the x-ray spectra of these extreme matter systems, arising from the
previously unknown channels of IRT and the breakdown of the dipole
selection rule.
Hu and Philip Nilson, a senior scientist at the LLE and co-author of
the paper, are currently planning future experiments that will involve
testing these new theoretical predictions at the OMEGA laser facility
at the LLE. The facility lets users create exotic HED conditions on
nanosecond timescales, allowing scientists to probe the unique
behavior of matter under extreme conditions.
"If proved to be true by experiments, these new discoveries will
profoundly change how radiation transport is currently treated in
exotic HED materials," Hu says. "These DFT-predicted new emission and
absorption channels have never been considered so far in textbooks."
__________________________________________________________________
Story Source:
[17]Materials provided by [18]University of Rochester. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. S. X. Hu, V. V. Karasiev, V. Recoules, P. M. Nilson, N. Brouwer, M.
Torrent. Interspecies radiative transition in warm and superdense
plasma mixtures. Nature Communications, 2020; 11 (1) DOI:
[19]10.1038/s41467-020-15916-3
__________________________________________________________________
--- up 13 weeks, 3 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Fri Apr 24 21:30:10 2020
Date:
April 24, 2020
Source:
ESA/Hubble Information Centre
Summary:
Hubble Space Telescope's iconic images and scientific
breakthroughs have redefined our view of the universe. To
commemorate three decades of scientific discoveries, this image
is one of the most photogenic examples of the many turbulent
stellar nurseries the telescope has observed during its 30-year
lifetime.
FULL STORY
__________________________________________________________________
Hubble Space Telescope's iconic images and scientific breakthroughs
have redefined our view of the Universe. To commemorate three decades
of scientific discoveries, this image is one of the most photogenic
examples of the many turbulent stellar nurseries the telescope has
observed during its 30-year lifetime. The portrait features the giant
nebula NGC 2014 and its neighbour NGC 2020 which together form part of
a vast star-forming region in the Large Magellanic Cloud, a satellite
galaxy of the Milky Way, approximately 163,000 light-years away. The
image is nicknamed the "Cosmic Reef" because it resembles an undersea
world.
On 24 April 1990 the Hubble Space Telescope was launched aboard the
space shuttle Discovery, along with a five-astronaut crew. Deployed
into low-Earth orbit a day later, the telescope has since opened a new
eye onto the cosmos that has been transformative for our civilization.
Hubble has revolutionised modern astronomy, not only for astronomers
but also for the public, taking them on a wondrous journey of
exploration and discovery. Hubble's seemingly never-ending, breathtaking celestial
snapshots provide a visual shorthand for its exemplary scientific
achievements. Unlike any other telescope before it, Hubble has made
astronomy relevant, engaging, and accessible for people of all ages.
To date, the mission has yielded 1.4 million observations and provided
data that astronomers around the world have used to write more than
17,000 peer-reviewed scientific publications, making it one of the most
prolific space observatories in history. Its rich data archive alone
will fuel future astronomy research for generations to come.
Each year, the NASA/ESA Hubble Space Telescope dedicates a small
portion of its precious observing time to taking a special anniversary
image, showcasing particularly beautiful and meaningful objects. These
images continue to challenge scientists with exciting new surprises and
to fascinate the public with ever more evocative observations.
This year, Hubble is celebrating this new milestone with a portrait of
two colourful nebulae that reveals how energetic, massive stars sculpt
their homes of gas and dust. Although NGC 2014 and NGC 2020 appear to
be separate in this visible-light image, they are actually part of one
giant star formation complex. The star-forming regions seen here are
dominated by the glow of stars at least 10 times more massive than our
Sun. These stars have short lives of only a few million years, compared
to the 10-billion-year lifetime of our Sun.
The sparkling centerpiece of NGC 2014 is a grouping of bright, hefty
stars near the centre of the image that has blown away its cocoon of
hydrogen gas (coloured red) and dust in which it was born. A torrent of
ultraviolet radiation from the star cluster is illuminating the
landscape around it. These massive stars also unleash fierce winds that
are eroding the gas cloud above and to the right of them. The gas in
these areas is less dense, making it easier for the stellar winds to
blast through them, creating bubble-like structures reminiscent of
brain coral, which have earned the nebula the nickname the "Brain
Coral."
By contrast, the blue-coloured nebula below NGC 2014 has been shaped by
one mammoth star that is roughly 200,000 times more luminous than our
Sun. It is an example of a rare class of stars called Wolf-Rayet stars.
They are thought to be the descendants of the most massive stars.
Wolf-Rayet stars are very luminous and have a high rate of mass loss
through powerful winds. The star in the Hubble image is 15 times more
massive than the Sun and is unleashing powerful winds, which have
cleared out the area around it. It has ejected its outer layers of gas,
sweeping them around into a cone-like shape, and exposing its searing
hot core. The behemoth appears offset from the centre because the
telescope is viewing the cone from a slightly tilted angle. In a few
million years, the star might become a supernova. The brilliant blue
colour of the nebula comes from oxygen gas that is heated to roughly
11,000 degrees Celsius, which is much hotter than the hydrogen gas
surrounding it.
Stars, both big and small, are born when clouds of dust and gas
collapse because of gravity. As more and more material falls onto the
forming star, it finally becomes hot and dense enough at its centre to
trigger the nuclear fusion reactions that make stars, including our
Sun, shine. Massive stars make up only a few percent of the billions of
stars in our Universe. Yet they play a crucial role in shaping our
Universe, through stellar winds, supernova explosions, and the
production of heavy elements.
"The Hubble Space Telescope has shaped the imagination of truly a whole
generation, inspiring not only scientists, but almost everybody," said
Günther Hasinger, Director of Science for the European Space Agency.
"It is paramount for the excellent and long-lasting cooperation between
NASA and ESA."
__________________________________________________________________
Story Source:
[17]Materials provided by [18]ESA/Hubble Information Centre. Note:
Content may be edited for style and length.
__________________________________________________________________
--- up 13 weeks, 3 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Fri Apr 24 21:30:10 2020
Technological advances will not help the world unless they lead to action
Date:
April 24, 2020
Source:
University of Melbourne
Summary:
The use of big data can not only help scientists chart the
degradation of the environment but can also be part of the solution
to achieve sustainability, according to a new commentary paper.
FULL STORY
__________________________________________________________________
The use of big data can not only help scientists chart the degradation
of the environment but can also be part of the solution to achieve
sustainability, according to a new commentary paper.
The paper, 'Opportunities for big data in conservation and
sustainability', published today in Nature Communications, said that
increased computing speed and data storage had grown the volume of big
data over the last 40 years, yet the planet was still facing serious
environmental decline.
Lead author Dr Rebecca Runting from the University of Melbourne's
School of Geography says that while we currently have an unprecedented
ability to generate, store, access and analyse data about the
environment, these technological advances will not help the world
unless they lead to action.
"Big data analyses must be closely linked to environmental policy and
management," Dr Runting said. "For example, many large companies
already possess the methodological, technical, and computational
capacity to develop solutions, so it is paramount that new developments
and resources are shared with government in a timely manner, in the
spirit of 'open data'."
The paper's authors noted that 2.3 million km^2 of forest was lost
between 2000 and 2012 and that dynamic marine and coastal ecosystems
have shown similar declines. An analysis of over 700,000 satellite
images shows that Earth has lost more than 20,000 km^2 of tidal flats
since 1984.
"In light of the COVID-19 pandemic, we are currently seeing governments
making rapid (health) decisions based on fairly sophisticated data
analysis," Dr Runting said. "There may be opportunities to learn from
this and achieve a similarly tight coupling of analysis and
decision-making in the environmental sector."
Co-author Professor James Watson from the University of Queensland said
with platforms like Google Earth Engine and the capacity of satellites
to track and send information quickly to computers, big data was
capable of identifying eco-health risks globally.
"What the big data revolution has helped us understand is the
environment is often doing worse than what we thought it was. The more
we map and analyse, the more we find the state of the environment,
albeit Antarctic ice sheets, wetlands, or forests, is dire. Big data
tells us we are running out of time," Professor Watson said.
"The good news is the big data revolution can help us better understand
risk. For example, we can use data to better understand where future
ecosystem degradation will take place and where these interact with
wildlife trade, so as to map pandemic risk."
Dr Runting said big data has been pivotal in quantifying alarming
spatial and temporal trends across Earth. For example, an automated
vessel tracking and monitoring system is being used to predict illegal
fishing activity in real-time.
"This has allowed governments quickly investigate particular vessels
that may be undertaking illegal fishing activity within their
jurisdiction, including within Australian waters," she said. Similarly,
Queensland's Statewide Landcover and Trees Study uses satellite imagery
to monitor woody vegetation clearing, including the detection of
illegal clearing.
Professor Watson cited a similar example. "Global Forest Watch has been
a game changer for monitoring the state of the world's forests in near
real time. This can help identify illegal activities and inform
active enforcement of forest conservation around the world," Professor
Watson said.
The paper also noted positive environmental changes due to human
intervention such as greening seen in large expanses in China, which
was driven by large scale national policies, including forest
conservation and payments for restoration.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]University of Melbourne. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Rebecca K. Runting, Stuart Phinn, Zunyi Xie, Oscar Venter, James E.
M. Watson. Opportunities for big data in conservation and
sustainability. Nature Communications, 2020; 11 (1) DOI:
[19]10.1038/s41467-020-15870-0
__________________________________________________________________
--- up 13 weeks, 3 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Fri Apr 24 21:30:10 2020
Date:
April 24, 2020
Source:
Brown University
Summary:
A team of researchers has demonstrated a way to help devices to
find each other in the ultra-fast terahertz data networks of the
future.
FULL STORY
__________________________________________________________________
When someone opens a laptop, a router can quickly locate it and connect
it to the local Wi-Fi network. That ability is a basic element of any
wireless network known as link discovery, and now a team of researchers
has developed a means of doing it with terahertz radiation, the
high-frequency waves that could one day make for ultra-fast wireless
data transmission.
Because of their high frequency, terahertz waves can carry hundreds of
times more data than the microwaves used to carry our data today. But
that high frequency also means that terahertz waves propagate
differently than microwaves. Whereas microwaves emanate from a source
in an omni-directional broadcast, terahertz waves propagate in narrow
beams.
"When you're talking about a network that's sending out beams, it
raises a whole myriad of questions about how you actually build that
network," said Daniel Mittleman, a professor in Brown's School of
Engineering. "One of those questions is how does an access point, which
you can think of as a router, find out where client devices are in
order to aim a beam at them. That's what we're thinking about here."
In a paper published in Nature Communications, researchers from Brown
and Rice University showed that a device known as a leaky waveguide can
be used for link discovery at terahertz frequencies. The approach
enables link discovery to be done passively, and in one shot.
The concept of a leaky waveguide is simple. It's just two metal plates
with a space between them where radiation can propagate. One of the
plates has a narrow slit cut into it, which allows a little bit of the
radiation to leak out. This new research shows the device can be used
for link discovery and tracking by exploiting one of its underlying
properties: that different frequencies leak out of the slit at
different angles.
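The mapping behind that "rainbow" follows from standard leaky-waveguide
physics: for the lowest transverse-electric mode of a parallel-plate
guide, each frequency above cutoff exits the slit at its own angle. The
sketch below illustrates the relation; the 1 mm plate separation and
the sample frequencies are assumed for illustration and are not taken
from the paper.

    import math

    C = 299_792_458.0  # speed of light, m/s

    def emission_angle_deg(freq_hz, plate_sep_m=1e-3):
        """Angle (measured from the waveguide axis) at which a given
        frequency leaks from the slit, for the lowest TE mode."""
        f_cutoff = C / (2 * plate_sep_m)  # TE1 cutoff, ~150 GHz for 1 mm
        if freq_hz <= f_cutoff:
            raise ValueError("below cutoff: this frequency does not propagate")
        cos_theta = math.sqrt(1.0 - (f_cutoff / freq_hz) ** 2)
        return math.degrees(math.acos(cos_theta))

    for f_ghz in (200, 300, 400, 600):
        print(f"{f_ghz} GHz -> {emission_angle_deg(f_ghz * 1e9):.1f} deg")

Each frequency maps to a unique angle, so a client that reports which
"color" it received has implicitly told the access point its angular
position.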
"We input a wide range of terahertz frequencies into this waveguide in
a single pulse, and each one leaks out simultaneously at a different
angle," said Yasaman Ghasempour, a graduate student at Rice and
co-author on the study. "You can think of it like a rainbow leaking
out, with each color representing a unique spectral signature
corresponding to an angle."
Now imagine a leaky waveguide placed on an access point. Depending upon
where a client device is relative to the access point, it's going to
see a different color coming out of the waveguide. The client just
sends a signal back to the access point that says, "I saw yellow," and
now the access point knows exactly where the client is, and can
continue tracking it.
"It is not just about discovering the link once," Yasaman said. "In
fact, the direction of transmission needs to be continually adjusted as
the client moves. Our technique allows for ultra-fast adaptation which
is the key to achieving seamless connectivity."
The setup also uses a leaky waveguide on the client side. On that side,
the range of frequencies received through the slit in the waveguide can
be used to determine the position of the router relative to the local
rotation of the device -- like when someone swivels their chair while
using a laptop.
Mittleman says that finding a novel way to make link discovery work in
the terahertz realm is important because existing protocols for link
discovery in microwaves simply won't work for terahertz signals. Even
the protocols that have been developed for burgeoning 5G networks,
which are much more directional than standard microwaves, aren't
feasible for terahertz. That's because as narrow as 5G beams are,
they're still around 10 times wider than the beams in a terahertz
network.
"I think some people have assumed that since 5G is somewhat
directional, this problem had been solved, but the 5G solution simply
isn't scalable," Mittleman said. "A whole new idea is needed. This is
one of those fundamental protocol pieces that you need to start
building terahertz networks."
Other co-authors on the paper were Rabi Shrestha and Aaron Charous from
Brown University, and Edward Knightly from Rice University. The work
was supported by Cisco, Intel and by the National Science Foundation.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Brown University. Note: Content may be
edited for style and length.
__________________________________________________________________
Journal Reference:
1. Yasaman Ghasempour, Rabi Shrestha, Aaron Charous, Edward Knightly,
Daniel M. Mittleman. Single-shot link discovery for terahertz
wireless networks. Nature Communications, 2020; 11 (1) DOI:
[19]10.1038/s41467-020-15761-4
__________________________________________________________________
--- up 13 weeks, 3 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Fri Apr 24 21:30:10 2020
Date:
April 24, 2020
Source:
Columbia University School of Engineering and Applied Science
Summary:
Researchers have designed biocompatible ion-driven soft
transistors that can perform real-time neurologically relevant
computation and a mixed-conducting particulate composite that
allows creation of electronic components out of a single
material. These have promise for bioelectronic devices that are
fast, sensitive, biocompatible, soft, and flexible, with
long-term stability in physiological environments such as the
human body. In particular, they could facilitate diagnosis and
monitoring of neurological disease.
FULL STORY
__________________________________________________________________
Dion Khodagholy, assistant professor of electrical engineering, is
focused on developing bioelectronic devices that are not only fast,
sensitive, biocompatible, soft, and flexible, but also have long-term
stability in physiological environments such as the human body. Such
devices would greatly improve human health, from monitoring in-home
wellness to diagnosing and treating neuropsychiatric diseases,
including epilepsy and Parkinson's disease. The design of current
devices has been severely constrained by the rigid, non-biocompatible
electronic components needed for safe and effective use, and solving
this challenge would open the door to a broad range of exciting new
therapies.
In collaboration with Jennifer N. Gelinas, Department of Neurology, and
the Institute for Genomic Medicine at Columbia University Irving
Medical Center, Khodagholy has recently published two papers, the first
in Nature Materials (March 16) on ion-driven soft and organic
transistors that he and Gelinas have designed to record individual
neurons and perform real-time computation that could facilitate
diagnosis and monitoring of neurological disease.
The second paper, published today in Science Advances, demonstrates a
soft, biocompatible smart composite -- an organic mixed-conducting
particulate material (MCP) -- that enables the creation of complex
electronic components which traditionally require several layers and
materials. It also enables easy and effective electronic bonding
between soft materials, biological tissue, and rigid electronics.
Because it is fully biocompatible and has controllable electronic
properties, MCP can non-invasively record muscle action potentials from
the surface of the arm and, in collaboration with Sameer Sheth and
Ashwin Viswanathan at Baylor College of Medicine's department of
neurosurgery, record large-scale brain activity during neurosurgical
procedures to implant deep brain stimulation electrodes.
"Instead of having large implants encapsulated in thick metal boxes to
protect the body and electronics from each other, such as those used in
pacemakers, and cochlear and brain implants, we could do so much more
if our devices were smaller, flexible, and inherently compatible with
our body environment," says Khodagholy, who directs the Translational
NeuroElectronics Lab at Columbia Engineering. "Over the past several
years, my group has been working to use unique properties of materials
to develop novel electronic devices that allow efficient interaction
with biological substrates -- specifically neural networks and the
brain."
Conventional transistors are made out of silicon, so they cannot
function in the presence of ions and water, and in fact break down
because of ion diffusion into the device. Therefore, the devices need
to be fully encapsulated in the body, usually in metal or plastic.
Moreover, although they work well with electrons, they are not very
effective at interacting with ionic signals, which is how the body's
cells communicate. As a result, these properties restrict the
abiotic/biotic coupling to capacitive interactions only on the surface
of the material, resulting in lower performance. Organic materials have
been used to overcome these limitations as they are inherently
flexible, but the electrical performance of these devices was not
sufficient to perform real-time brain signal recording and processing.
Khodagholy's team took advantage of both the electronic and the ionic
conduction of organic materials to create ion driven transistors they
call e-IGTs, or enhancement-mode, internal ion-gated organic
electrochemical transistors, that have embedded mobile ions inside
their channels. Because the ions do not need to travel long distances
to participate in the channel switching process, they can be switched
on and off quickly and efficiently. The transient responses depend on
electron-hole mobility rather than ion mobility, which combines with
high transconductance to yield a gain-bandwidth product several orders
of magnitude above that of other ion-based transistors.
The researchers used their e-IGTs to acquire a wide range of
electrophysiological signals, such as in vivo recording of neural
action impulses, and to create soft, biocompatible, long-term
implantable neural processing units for the real-time detection of
epileptic discharges.
"We're excited about these findings," says Gelinas. "We've shown that
E-IGTs offer a safe, reliable, and high-performance building block for
chronically implanted bioelectronics, and I am optimistic that these
devices will enable us to safely expand how we use bioelectronic
devices to address neurologic disease."
Another major advance is demonstrated by the researchers in their
Science Advances paper: enabling bioelectronic devices, specifically
those implanted in the body for diagnostics or therapy, to interface
effectively and safely with human tissue, while also making them
capable of performing complex processing. Inspired by electrically
active cells, similar to those in the brain that communicate with
electrical pulses, the team created a single material capable of
performing multiple, non-linear, dynamic electronic functions just by
varying the size and density of its composite mixed-conducting
particles.
"This innovation opens the door to a fundamentally different approach
to electronic device design, mimicking biological networks and creating
multifunctional circuits from purely biodegradable and biocompatible
components," says Khodagholy.
The researchers designed and created mixed-conducting particulate
(MCP)-based high-performance anisotropic films, independently
addressable transistors, resistors, and diodes that are pattern-free,
scalable, and biocompatible. These devices carried out a variety of
functions, including recording neurophysiologic activity from
individual neurons, performing circuit operations, and bonding
high-resolution soft and rigid electronics.
"MCP substantially reduces the footprint of neural interface devices,
permitting recording of high-quality neurophysiological data even when
the amount of tissue exposed is very small, and thus decreases the risk
of surgical complications," says Gelinas. "And because MCP is composed
of only biocompatible and commercially available materials, it will be
much easier to translate into biomedical devices and medicine."
Both the E-IGTs and MCP hold great promise as critical components of
bioelectronics, from wearable miniaturized sensors to responsive
neurostimulators. The E-IGTs can be manufactured in large quantities
and are accessible to a broad range of fabrication processes.
Similarly, MCP components are inexpensive and easily accessible to
materials scientists and engineers. In combination, they form the
foundation for fully implantable biocompatible devices that can be
harnessed both to benefit health and to treat disease.
Khodagholy and Gelinas are now working on translating these components
into functional long-term implantable devices that can record and
modulate brain activity to help patients with neurological diseases
such as epilepsy.
"Our ultimate goal is to create accessible bioelectronic devices that
can improve peoples' quality of life," says Khodagholy, "and with these
new materials and components, it feels like we have stepped closer to
that."
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Columbia University School of Engineering
and Applied Science. Original written by Holly Evarts. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal References:
1. Patricia Jastrzebska-Perfect, George D. Spyropoulos, Claudia Cea,
Zifang Zhao, Onni J. Rauhala, Ashwin Viswanathan, Sameer A. Sheth,
Jennifer N. Gelinas, and Dion Khodagholy. Mixed-conducting
particulate composites for soft electronics. Science Advances, 2020
DOI: [19]10.1126/sciadv.aaz6767
2. Claudia Cea, George D. Spyropoulos, Patricia Jastrzebska-Perfect,
José J. Ferrero, Jennifer N. Gelinas, Dion Khodagholy.
Enhancement-mode ion-based transistor as a comprehensive interface
and real-time processing unit for in vivo electrophysiology. Nature
Materials, 2020; DOI: [20]10.1038/s41563-020-0638-3
__________________________________________________________________
--- up 13 weeks, 3 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Fri Apr 24 21:30:12 2020
Date:
April 24, 2020
Source:
American Chemical Society
Summary:
Researchers report that a combination of cotton with natural
silk or chiffon can effectively filter out aerosol particles --
if the fit is good.
FULL STORY
__________________________________________________________________
In the wake of the COVID-19 pandemic, the U.S. Centers for Disease
Control and Prevention recommends that people wear masks in public.
Because N95 and surgical masks are scarce and should be reserved for
health care workers, many people are making their own coverings. Now,
researchers report in ACS Nano that a combination of cotton with
natural silk or chiffon can effectively filter out aerosol particles --
if the fit is good.
SARS-CoV-2, the new coronavirus that causes COVID-19, is thought to
spread mainly through respiratory droplets when an infected person
coughs, sneezes, speaks or breathes. These droplets form in a wide
range of sizes, but the tiniest ones, called aerosols, can easily slip
through the openings between certain cloth fibers, leading some people
to question whether cloth masks can actually help prevent disease.
Therefore, Supratik Guha at the University of Chicago and colleagues
wanted to study the ability of common fabrics, alone or in combination,
to filter out aerosols similar in size to respiratory droplets.
The researchers used an aerosol mixing chamber to produce particles
ranging from 10 nm to 6 μm in diameter. A fan blew the aerosol across
various cloth samples at an airflow rate corresponding to a person's
respiration at rest, and the team measured the number and size of
particles in air before and after passing through the fabric. One layer
of a tightly woven cotton sheet combined with two layers of
polyester-spandex chiffon -- a sheer fabric often used in evening gowns
-- filtered out the most aerosol particles (80-99%, depending on
particle size), with performance close to that of an N95 mask material.
Substituting the chiffon with natural silk or flannel, or simply using
a cotton quilt with cotton-polyester batting, produced similar results.
The researchers point out that tightly woven fabrics, such as cotton,
can act as a mechanical barrier to particles, whereas fabrics that hold
a static charge, like certain types of chiffon and natural silk, serve
as an electrostatic barrier. However, a 1% gap reduced the filtering
efficiency of all masks by half or more, emphasizing the importance of
a properly fitted mask.
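A useful way to see why stacking helps is that, if each layer filters
independently, the penetrations multiply. The sketch below illustrates
that arithmetic; the per-layer efficiencies and the leak fraction are
assumed numbers, not measurements from the study.

    def stack_efficiency(layer_effs):
        """Combined efficiency of stacked layers, assuming each layer
        filters independently: penetration multiplies layer by layer."""
        penetration = 1.0
        for eff in layer_effs:
            penetration *= (1.0 - eff)
        return 1.0 - penetration

    def with_leak(efficiency, leak_flow):
        """Air routed through a gap bypasses the fabric unfiltered;
        leak_flow is the fraction of total airflow through the gap,
        which for a small gap can far exceed its area fraction."""
        return (1.0 - leak_flow) * efficiency

    # one cotton layer plus two chiffon layers (assumed per-layer values)
    sealed = stack_efficiency([0.50, 0.60, 0.60])
    print(f"sealed: {sealed:.0%}")                  # 92%
    print(f"leaky:  {with_leak(sealed, 0.5):.0%}")  # 46% -- halved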
The authors acknowledge use of the U.S. Department of Energy's Center
for Nanoscale Materials user facility at Argonne National Laboratory
and funding from the U.S. Department of Defense's Vannevar Bush
Fellowship.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]American Chemical Society. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Abhiteja Konda, Abhinav Prakash, Gregory A. Moss, Michael Schmoldt,
Gregory D. Grant, Supratik Guha. Aerosol Filtration Efficiency of
Common Fabrics Used in Respiratory Cloth Masks. ACS Nano, 2020;
DOI: [19]10.1021/acsnano.0c03252
__________________________________________________________________
--- up 13 weeks, 3 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Fri Apr 24 21:30:14 2020
analyze
Date:
April 24, 2020
Source:
Ruhr-University Bochum
Summary:
Scientists have been able to shed new light on the properties of
water at the molecular level. In particular, they were able to
describe accurately the interactions between three water
molecules, which contribute significantly to the energy
landscape of water. The research could pave the way to better
understand and predict water behavior at different conditions,
even under extreme ones.
FULL STORY
__________________________________________________________________
An international team of scientists led by Professor Martina Havenith
from Ruhr-Universität Bochum (RUB) has been able to shed new light on
the properties of water at the molecular level. In particular, they
were able to describe accurately the interactions between three water
molecules, which contribute significantly to the energy landscape of
water. The research could pave the way to better understand and predict
water behaviour at different conditions, even under extreme ones. The
results have been published online in the journal Angewandte Chemie on
19 April 2020.
Interactions via vibrations
Although at first glance water looks like a simple liquid, it has many
unusual properties, one of them being that it is less dense when frozen
than when liquid. In the simplest description, a liquid is modelled by
the interactions between directly neighbouring molecules, which is
usually sufficient -- but not in the case of water: the interactions in
water dimers account for only 75 per cent of the energy that keeps
water together. Martina Havenith, head of the Bochum-based Chair
of Physical Chemistry II and spokesperson for the Ruhr Explores
Solvation (Resolv) Cluster of Excellence, and her colleagues from Emory
University in Atlanta, US, recently published an accurate description
of the interactions related to the water dimer. In order to get access
to the cooperative interactions, which make up the remaining 25 per cent
of the total interaction energy, the water trimer had to be investigated.
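The 75/25 split is the language of the many-body expansion, in which
the total interaction energy of a water cluster is decomposed into
pairwise and cooperative terms (standard notation, added here for
orientation):

    E_int = \sum_{i<j} \Delta E^{(2)}_{ij}
          + \sum_{i<j<k} \Delta E^{(3)}_{ijk} + \cdots

The two-body (dimer) terms account for roughly 75 per cent of the
binding, while the three-body terms -- the leading cooperative
contribution, first accessible through the trimer -- dominate the
remainder.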
Now, the team led by Martina Havenith, in collaboration with colleagues
from Emory University and the University of Mississippi, US, has been
able to describe for the first time in an accurate way the interaction
energy among three water molecules. They tested modern theoretical
descriptions against the result of the spectroscopic fingerprint of
these intermolecular interactions.
Obstacles for experimental research
For more than 40 years, scientists have developed computational
models and simulations to describe the energies involved in the water
trimer. Experiments, which rely on spectroscopy, have been less
successful, despite some pioneering insights from gas-phase studies. The
technique works by irradiating a water sample with radiation and
recording how much light has been absorbed. The resulting pattern
reflects the different types of excitation of intermolecular motions
involving more than one water molecule. Unfortunately, to obtain these
spectroscopic fingerprints for water dimers and trimers, one needs to
irradiate in the terahertz frequency region, and laser sources
providing high power in that region have been lacking.
This technical gap has been filled only recently. In the current
publication, the RUB scientists used the free electron lasers at
Radboud University in Nijmegen in The Netherlands, which allows for
high powers in the terahertz frequency region. The laser was applied
through tiny droplets of superfluid helium, which is cooled down at
extremely low temperatures, at minus 272,75 degrees Celsius. These
droplets can collect water molecules one by one, allowing to isolate
small aggregates of dimers and trimers. In this way the scientists were
able to irradiate exactly the molecules they wanted to and to acquire
the first comprehensive spectrum of the water trimer in the terahertz
frequency region.
The experimental observations of the intermolecular vibrations were
compared with, and interpreted using, high-level quantum calculations.
In this way the scientists could analyse the spectrum and assign up to
six different intermolecular vibrations.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Ruhr-University Bochum. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Martina Havenith-Newen, Raffael Schwan, Chen Qu, Devendra Mani,
Nitish Pal, Gerhard Schwaab, Joel M. Bowman, Gregory Tschumper.
Observation of the low frequency spectrum of water trimer as a
sensitive test of the water trimer potential and the dipole moment
surface. Angewandte Chemie International Edition, 2020; DOI:
[19]10.1002/anie.202003851
__________________________________________________________________
--- up 13 weeks, 3 days, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
Black Panther@1337:3/111 to
MeaTLoTioN on Fri Apr 24 22:46:18 2020
On 24 Apr 2020, MeaTLoTioN said the following...
Making me thirsty now... I will buy you all a round in the local pub next time we all meet up, how's that?
Can't say fairer than that right? =)
I might have to take you up on that. I do want to get to Europe before I get too old to enjoy it. :)
---
Black Panther(RCS)
Castle Rock BBS
--- Mystic BBS v1.12 A45 2020/02/18 (Linux/64)
* Origin: Castle Rock BBS - bbs.castlerockbbs.com - (1337:3/111)
-
From
MeaTLoTioN@1337:1/101 to
Black Panther on Sun Apr 26 09:37:44 2020
I might have to take you up on that. I do want to get to Europe before I get too old to enjoy it. :)
i hear that, well when this pandemic allows us all to get together again, if you do make it to jolly old england we will definitely have to find a nice
pub to sink a few in.
---
|14Best regards,
|11Ch|03rist|11ia|15n |11a|03ka |11Me|03aTLoT|11io|15N
|07 |08[|10eml|08] |
15ml@erb.pw |07 |08[|10web|08] |15www.erb.pw |07Ŀ |07 |08[|09fsx|08] |1521:1/158 |07 |08[|11tqw|08] |151337:1/101 |07 |07 |08[|12rtn|08] |1580:774/81 |07 |08[|14fdn|08] |152:250/5 |07
|07 |08[|10ark|08] |1510:104/2 |07
--- Mystic BBS v1.12 A43 2019/03/02 (Linux/64)
* Origin: thE qUAntUm wOrmhOlE, rAmsgAtE, Uk. bbs.erb.pw (1337:1/101)
-
From
SpaceDaily@1337:3/111 to
All on Mon Apr 27 21:30:06 2020
A direct, observation-based test of one of the pillars of cosmology
Date:
April 27, 2020
Source:
Carnegie Institution for Science
Summary:
The universe is full of billions of galaxies -- but their
distribution across space is far from uniform. Why do we see so
much structure in the universe today and how did it all form and
grow? A 10-year survey of tens of thousands of galaxies has
provided a new approach to answering this fundamental mystery.
FULL STORY
__________________________________________________________________
Credit: © Adanan / [17]Adobe Stock
The universe is full of billions of galaxies -- but their distribution
across space is far from uniform. Why do we see so much structure in
the universe today and how did it all form and grow?
A 10-year survey of tens of thousands of galaxies made using the
Magellan Baade Telescope at Carnegie's Las Campanas Observatory in
Chile provided a new approach to answering this fundamental mystery.
The results, led by Carnegie's Daniel Kelson, are published in Monthly
Notices of the Royal Astronomical Society.
"How do you describe the indescribable?" asks Kelson. "By taking an
entirely new approach to the problem."
"Our tactic provides new -- and intuitive -- insights into how gravity
drove the growth of structure from the universe's earliest times," said
co-author Andrew Benson. "This is a direct, observation-based test of
one of the pillars of cosmology."
The Carnegie-Spitzer-IMACS Redshift Survey was designed to study the
relationship between galaxy growth and the surrounding environment over
the last 9 billion years, when modern galaxies' appearances were
defined.
The first galaxies were formed a few hundred million years after the
Big Bang, which started the universe as a hot, murky soup of extremely
energetic particles. As this material expanded outward from the initial
explosion, it cooled, and the particles coalesced into neutral hydrogen
gas. Some patches were denser than others and, eventually, their
gravity overcame the universe's outward trajectory and the material
collapsed inward, forming the first clumps of structure in the cosmos.
The density differences that allowed for structures both large and
small to form in some places and not in others have been a longstanding
topic of fascination. But until now, astronomers' abilities to model
how structure grew in the universe over the last 13 billion years faced
mathematical limitations.
"The gravitational interactions occurring between all the particles in
the universe are too complex to explain with simple mathematics,"
Benson said.
So, astronomers either used mathematical approximations -- which
compromised the accuracy of their models -- or large computer
simulations that numerically model all the interactions between
galaxies, but not all the interactions occurring between all of the
particles, which was considered too complicated.
"A key goal of our survey was to count up the mass present in stars
found in an enormous selection of distant galaxies and then use this
information to formulate a new approach to understanding how structure
formed in the universe," Kelson explained.
The research team -- which also included Carnegie's Louis Abramson,
Shannon Patel, Stephen Shectman, Alan Dressler, Patrick McCarthy, and
John S. Mulchaey, as well as Rik Williams, now of Uber Technologies --
demonstrated for the first time that the growth of individual
proto-structures can be calculated and then averaged over all of space.
Doing this revealed that denser clumps grew faster, and less-dense
clumps grew more slowly.
They were then able to work backward and determine the original
distributions and growth rates of the fluctuations in density, which
would eventually become the large-scale structures that determined the
distributions of galaxies we see today.
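A toy illustration of that behaviour (not the survey's actual
machinery): even when every fluctuation grows by the same linear growth
factor, as in a matter-dominated universe where the overdensity delta
is proportional to the scale factor, a region that starts denser gains
density faster in absolute terms, so it pulls ahead of its less-dense
neighbours:

    # Toy linear growth: delta(a) = delta_initial * a (matter domination).
    # Illustrative only; the paper's calculation is far more detailed.
    def delta(delta_initial, a):
        return delta_initial * a

    for d0 in (0.001, 0.002):          # two initial overdensities
        print(d0, delta(d0, 1000.0))   # the denser clump ends up further ahead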
In essence, their work provided a simple, yet accurate, description of
why and how density fluctuations grow the way they do in the real
universe, as well as in the computational work that underpins our
understanding of the universe's infancy.
"And it's just so simple, with a real elegance to it," added Kelson.
The findings would not have been possible without the allocation of an
extraordinary number of observing nights at Las Campanas.
"Many institutions wouldn't have had the capacity to take on a project
of this scope on their own," said Observatories Director John Mulchaey.
"But thanks to our Magellan Telescopes, we were able to execute this
survey and create this novel approach to answering a classic question."
"While there's no doubt that this project required the resources of an
institution like Carnegie, our work also could not have happened
without the tremendous number of additional infrared images that we
were able to obtain at Kitt Peak and Cerro Tololo, which are both part
of the NSF's National Optical-Infrared Astronomy Research Laboratory,"
Kelson added.
__________________________________________________________________
Story Source:
[19]Materials provided by [20]Carnegie Institution for Science. Note:
Content may be edited for style and length.
__________________________________________________________________
Related Multimedia:
* [21]Illustration of distribution of density in the universe over
the last 9 billion years
__________________________________________________________________
Journal Reference:
1. Rik J Williams, John S Mulchaey, Patrick J McCarthy, Alan Dressler,
Stephen A Shectman, Shannon G Patel, Andrew J Benson, Louis E
Abramson, Daniel D Kelson. Gravity and the non-linear growth of
structure in the Carnegie-Spitzer-IMACS Redshift Survey. Monthly
Notices of the Royal Astronomical Society, 2020; 494 (2): 2628 DOI:
[22]10.1093/mnras/staa100
__________________________________________________________________
--- up 13 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon Apr 27 21:30:06 2020
Date:
April 27, 2020
Source:
University of New South Wales
Summary:
Not only does a universal constant seem annoyingly inconstant at
the outer fringes of the cosmos, the variation occurs along only
one direction, which is downright weird.
FULL STORY
__________________________________________________________________
Those looking forward to a day when science's Grand Unifying Theory of
Everything could be worn on a t-shirt may have to wait a little longer
as astrophysicists continue to find hints that one of the cosmological
constants is not so constant after all.
In a paper published in the journal Science Advances, scientists from
UNSW Sydney reported that four new measurements of light emitted from a
quasar 13 billion light years away reaffirm past studies that have
measured tiny variations in the fine structure constant.
UNSW Science's Professor John Webb says the fine structure constant is
a measure of electromagnetism -- one of the four fundamental forces in
nature (the others are gravity, weak nuclear force and strong nuclear
force).
"The fine structure constant is the quantity that physicists use as a
measure of the strength of the electromagnetic force," Professor Webb
says.
"It's a dimensionless number and it involves the speed of light,
something called Planck's constant and the electron charge, and it's a
ratio of those things. And it's the number that physicists use to
measure the strength of the electromagnetic force."
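Written out, the ratio Professor Webb describes is
alpha = e^2 / (4 * pi * epsilon_0 * hbar * c). A quick numerical check
with standard values of the constants:

    import math

    e = 1.602176634e-19      # elementary charge, C
    eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
    hbar = 1.054571817e-34   # reduced Planck constant, J*s
    c = 2.99792458e8         # speed of light, m/s

    alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
    print(alpha, 1 / alpha)  # ~0.0072974 and ~137.036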
The electromagnetic force keeps electrons whizzing around a nucleus in
every atom of the universe -- without it, all matter would fly apart.
Up until recently, it was believed to be an unchanging force throughout
time and space. But over the last two decades, Professor Webb has
noticed anomalies in the fine structure constant whereby
electromagnetic force measured in one particular direction of the
universe seems ever so slightly different.
"We found a hint that that number of the fine structure constant was
different in certain regions of the universe. Not just as a function of
time, but actually also in direction in the universe, which is really
quite odd if it's correct...but that's what we found."
LOOKING FOR CLUES
Ever the sceptic, when Professor Webb first came across these early
signs of slightly weaker and stronger measurements of the
electromagnetic force, he thought it could be a fault of the equipment,
or of his calculations or some other error that had led to the unusual
readings. It was while looking at some of the most distant quasars --
massive celestial bodies emitting exceptionally high energy -- at the
edges of the universe that these anomalies were first observed using
the world's most powerful telescopes.
"The most distant quasars that we know of are about 12 to 13 billion
light years from us," Professor Webb says.
"So if you can study the light in detail from distant quasars, you're
studying the properties of the universe as it was when it was in its
infancy, only a billion years old. The universe then was very, very
different. No galaxies existed, the early stars had formed but there
was certainly not the same population of stars that we see today. And
there were no planets."
He says that in the current study, the team looked at one such quasar,
which enabled them to probe back to when the universe was only a
billion years old, something that had never been done before. The team
made four measurements of the fine structure constant along a single
line of sight to this
quasar. Individually, the four measurements didn't provide any
conclusive answer as to whether or not there were perceptible changes
in the electromagnetic force. However, when combined with lots of other
measurements between us and distant quasars made by other scientists
and unrelated to this study, the differences in the fine structure
constant became evident.
A WEIRD UNIVERSE
"And it seems to be supporting this idea that there could be a
directionality in the universe, which is very weird indeed," Professor
Webb says.
"So the universe may not be isotropic in its laws of physics -- one
that is the same, statistically, in all directions. But in fact, there
could be some direction or preferred direction in the universe where
the laws of physics change, but not in the perpendicular direction. In
other words, the universe, in some sense, has a dipole structure to it.
"In one particular direction, we can look back 12 billion light years
and measure electromagnetism when the universe was very young. Putting
all the data together, electromagnetism seems to gradually increase the
further we look, while towards the opposite direction, it gradually
decreases. In other directions in the cosmos, the fine structure
constant remains just that -- constant. These new very distant
measurements have pushed our observations further than has ever been
reached before."
In other words, in what was thought to be an arbitrarily random spread
of galaxies, quasars, black holes, stars, gas clouds and planets --
with life flourishing in at least one tiny niche of it -- the universe
suddenly appears to have the equivalent of a north and a south.
Professor Webb is still open to the idea that somehow these
measurements made at different stages using different technologies and
from different locations on Earth are actually a massive coincidence.
"This is something that is taken very seriously and is regarded, quite
correctly with scepticism, even by me, even though I did the first work
on it with my students. But it's something you've got to test because
it's possible we do live in a weird universe."
But adding to the side of the argument that says these findings are
more than just coincidence, a team in the US working completely
independently of and unknown to Professor Webb's group made
observations about
X-rays that seemed to align with the idea that the universe has some
sort of directionality.
"I didn't know anything about this paper until it appeared in the
literature," he says.
"And they're not testing the laws of physics, they're testing the
properties, the X-ray properties of galaxies and clusters of galaxies
and cosmological distances from Earth. They also found that the
properties of the universe in this sense are not isotropic and there's
a preferred direction. And lo and behold, their direction coincides
with ours."
LIFE, THE UNIVERSE, AND EVERYTHING
While still wanting to see more rigorous testing of ideas that
electromagnetism may fluctuate in certain areas of the universe to give
it a form of directionality, Professor Webb says if these findings
continue to be confirmed, they may help explain why our universe is the
way it is, and why there is life in it at all.
"For a long time, it has been thought that the laws of nature appear
perfectly tuned to set the conditions for life to flourish. The
strength of the electromagnetic force is one of those quantities. If it
were only a few per cent different to the value we measure on Earth,
the chemical evolution of the universe would be completely different
and life may never have got going. It raises a tantalising question:
does this 'Goldilocks' situation, where fundamental physical quantities
like the fine structure constant are 'just right' to favour our
existence, apply throughout the entire universe?"
If there is a directionality in the universe, Professor Webb argues,
and if electromagnetism is shown to be very slightly different in
certain regions of the cosmos, the most fundamental concepts
underpinning much of modern physics will need revision.
"Our standard model of cosmology is based on an isotropic universe, one
that is the same, statistically, in all directions," he says.
"That standard model itself is built upon Einstein's theory of gravity,
which itself explicitly assumes constancy of the laws of Nature. If
such fundamental principles turn out to be only good approximations,
the doors are open to some very exciting, new ideas in physics."
Professor Webb's team believe this is the first step towards a far
larger study exploring many directions in the universe, using data
coming from new instruments on the world's largest telescopes. New
technologies are now emerging to provide higher quality data, and new
artificial intelligence analysis methods will help to automate
measurements and carry them out more rapidly and with greater
precision.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]University of New South Wales. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Michael R. Wilczynska, John K. Webb, Matthew Bainbridge, John D.
Barrow, Sarah E. I. Bosman, Robert F. Carswell, Mariusz P.
Dąbrowski, Vincent Dumont, Chung-Chi Lee, Ana Catarina Leite,
Katarzyna Leszczyńska, Jochen Liske, Konrad Marosek, Carlos J. A.
P. Martins, Dinko Milaković, Paolo Molaro, Luca Pasquini. Four
direct measurements of the fine-structure constant 13 billion years
ago. Science Advances, 2020; 6 (17): eaay9672 DOI:
[19]10.1126/sciadv.aay9672
__________________________________________________________________
--- up 13 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon Apr 27 21:30:08 2020
oxygen-ion conductivity
Date:
April 27, 2020
Source:
Tokyo Institute of Technology
Summary:
Scientists have discovered a layered perovskite that shows
unusually high oxide-ion conductivity, based on a new screening
method and a new design concept. Such materials are hard to come
by, so the discovery of this material, together with the new method
and design concept, will make the realization of many
environment-friendly technologies possible.
FULL STORY
__________________________________________________________________
Upon hearing the word "conductor" in reference to chemistry, most will
immediately think of the movement of electrons within a material. But
electrons are not the only particles that can move across a material;
oxide ions can too. Many materials scientists and engineers are
currently searching for materials with high oxide-ion conductivity.
Such materials have many potential applications, particularly in the
development of environmentally friendly technologies. For example,
oxide-ion conductors could be used in fuel cells, which directly
convert clean fuel such as hydrogen into electrical energy, or in
oxygen separation membranes, which could be useful in systems for
capturing the CO2 we produce by burning coal and other fossil fuels.
Unfortunately, high oxide-ion conductivity is achieved by only a
limited number of structure families of materials. The perovskites are
one such structure family. Perovskites and layered perovskites have
special crystal structures that sometimes exhibit outstanding physical
and chemical properties. Prof. Masatomo Yashima and colleagues from the
Tokyo Institute of Technology studied a class of layered perovskites, a
Dion-Jacobson phase, where two-dimensional perovskite-like "slabs" are
stacked and separated by a layer of alkali metal ions, such as the
cesium cation (Cs+). In their paper published in Nature Communications,
Professor Yashima and his colleagues explain their motivation:
"Numerous studies have been conducted on the electrical properties of
Dion-Jacobson phases, such as their proton, lithium-ion and sodium-ion
conduction. However, there are no reports on the oxide-ion conduction
in Dion-Jacobson phases."
In their study, the scientists first screened sixty-nine potential
Dion-Jacobson phases using the bond-valence method. This method allowed
them to calculate the energy barriers for oxide-ion migration in each
Dion-Jacobson phase, from which they identified CsBi[2]Ti[2]NbO[10-δ]
(CBTN) as a promising candidate because it has a low energy barrier and
does not contain expensive rare-earth elements. Further, they prepared
CBTN samples and found that the oxide-ion conductivity of CBTN was
higher than those of many other oxide-ion conductors, such as the
conventional yttria-stabilized zirconia.
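The screening step reduces to a simple filter: estimate the oxide-ion
migration barrier for each candidate phase and keep those with a low
barrier and no expensive rare-earth elements. A hedged Python sketch of
that logic, with hypothetical barrier values standing in for the
bond-valence calculations:

    # Illustrative screening loop; the barrier values are invented.
    RARE_EARTHS = {"Sc", "Y", "La", "Ce", "Pr", "Nd", "Sm", "Eu", "Gd"}

    candidates = [
        {"name": "CsBi2Ti2NbO10-d",
         "elements": {"Cs", "Bi", "Ti", "Nb", "O"}, "barrier_eV": 0.8},
        {"name": "RbLaNb2O7",
         "elements": {"Rb", "La", "Nb", "O"}, "barrier_eV": 1.9},
    ]

    hits = [p["name"] for p in candidates
            if p["barrier_eV"] < 1.0 and not (p["elements"] & RARE_EARTHS)]
    print(hits)  # -> ['CsBi2Ti2NbO10-d']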
To understand what causes such high oxide-ion conductivity in CBTN, the
scientists analyzed its crystal structure and watched how the structure
changes with temperature. Using a super-high-resolution neutron
diffractometer, SuperHRPD at J-PARC, they then identified several
possible paths across the crystal lattice through which oxide ions
could migrate at high temperatures. Most importantly, they discovered a
novel mechanism that seems to be one of the causes of the high
oxide-ion conductivity: a rise in temperature makes oxygen vacancies
appear, which facilitate oxide-ion migration. The large Cs cations and
the displacement of the Bi ions in the structure at high temperatures
expand the bottlenecks, enabling oxide-ion migration.
This study paves the way for finding inexpensive novel oxide-ion
conductors. Based on this oxide-ion conduction mechanism, one can
enhance the oxide-ion conductivities of materials of the CBTN family by
modifying the chemical composition of CBTN through the addition of
impurities (doping). "The present findings of high oxide-ion
conductivity in this new structure family, the Dion-Jacobson-type
CsBi[2]Ti[2]NbO[10-δ], and the new enlarged bottleneck concept
introduced, could facilitate the design of novel oxide-ion conductors
based on Dion-Jacobson phases," Prof. Yashima and his colleagues
conclude. The findings of this study open up the possibilities for many
novel applications that could lead to a sustainable future. In fact,
the present work was selected as an Editors' Highlight in Nature
Communications.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Tokyo Institute of Technology. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Wenrui Zhang, Kotaro Fujii, Eiki Niwa, Masato Hagihala, Takashi
Kamiyama, Masatomo Yashima. Oxide-ion conduction in the
Dion–Jacobson phase CsBi2Ti2NbO10−δ. Nature Communications, 2020;
11 (1) DOI: [19]10.1038/s41467-020-15043-z
__________________________________________________________________
--- up 13 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon Apr 27 21:30:08 2020
Microneedles made of silk-based material can target plant tissues for
delivery of micronutrients, hormones, or genes
Date:
April 27, 2020
Source:
Massachusetts Institute of Technology
Summary:
A new method developed by engineers may offer a starting point
for delivering life-saving treatments to plants ravaged by
diseases.
FULL STORY
__________________________________________________________________
While the human world is reeling from one pandemic, there are several
ongoing epidemics that affect crops and put global food production at
risk. Oranges, olives, and bananas are already under threat in many
areas due to diseases that affect plants' circulatory systems and that
cannot be treated by applying pesticides.
A new method developed by engineers at MIT may offer a starting point
for delivering life-saving treatments to plants ravaged by such
diseases.
These diseases are difficult to detect early and to treat, given the
lack of precision tools to access plant vasculature to treat pathogens
and to sample biomarkers. The MIT team decided to take some of the
principles involved in precision medicine for humans and adapt them to
develop plant-specific biomaterials and drug-delivery devices.
The method uses an array of microneedles made of a silk-based
biomaterial to deliver nutrients, drugs, or other molecules to specific
parts of the plant. The findings are described in the journal Advanced
Science, in a paper by MIT professors Benedetto Marelli and
Jing-Ke Weng, graduate student Yunteng Cao, postdoc Eugene Lim at MIT,
and postdoc Menglong Xu at the Whitehead Institute for Biomedical
Research.
The microneedles, which the researchers call phytoinjectors, can be
made in a variety of sizes and shapes, and can deliver material
specifically to a plant's roots, stems, or leaves, or into its xylem
(the vascular tissue involved in water transportation from roots to
canopy) or phloem (the vascular tissue that circulates metabolites
throughout the plant). In lab tests, the team used tomato and tobacco
plants, but the system could be adapted to almost any crop, they say.
The microneedles can not only deliver targeted payloads of molecules
into the plant, but they can also be used to take samples from the
plants for lab analysis.
The work started in response to a request from the U.S. Department of
Agriculture for ideas on how to address the citrus greening crisis,
which is threatening the collapse of a $9 billion industry, Marelli
says. The disease is spread by an insect called the Asian citrus
psyllid that carries a bacterium into the plant. There is as yet no
cure for it, and millions of acres of U.S. orchards have already been
devastated. In response, Marelli's lab swung into gear to develop the
novel microneedle technology, led by Cao as his thesis project.
The disease infects the phloem of the whole plant, including roots,
which are very difficult to reach with any conventional treatment,
Marelli explains. Most pesticides are simply sprayed or painted onto a
plant's leaves or stems, and little if any penetrates to the root
system. Such treatments may appear to work for a short while, but then
the bacteria bounce back and do their damage. What is needed is
something that can target the phloem circulating through a plant's
tissues, which could carry an antibacterial compound down into the
roots. That's just what some version of the new microneedles could
potentially accomplish, he says.
"We wanted to solve the technical problem of how you can have a precise
access to the plant vasculature," Cao adds. This would allow
researchers to inject pesticides, for example, that would be
transported between the root system and the leaves. Present approaches
use "needles that are very large and very invasive, and that results in
damaging the plant," he says. To find a substitute, they built on
previous work that had produced microneedles using silk-based material
for injecting human vaccines.
"We found that adaptations of a material designed for drug delivery in
humans to plants was not straightforward, due to differences not only
in tissue vasculature, but also in fluid composition," Lim says. The
microneedles designed for human use were intended to biodegrade
naturally in the body's moisture, but plants have far less available
water, so the material didn't dissolve and was not useful for
delivering the pesticide or other macromolecules into the phloem. The
researchers had to design a new material, but they decided to stick
with silk as its basis. That's because of silk's strength, its
inertness in plants (preventing undesirable side effects), and the fact
that it degrades into tiny particles that don't risk clogging the
plant's internal vasculature systems.
They used biotechnology tools to increase silk's hydrophilicity (making
it attract water), while keeping the material strong enough to
penetrate the plant's epidermis and degradable enough to then get out
of the way.
Sure enough, they tested the material on their lab tomato and tobacco
plants, and were able to observe injected materials, in this case
fluorescent molecules, moving all the way through the plant, from
roots to leaves.
"We think this is a new tool that can be used by plant biologists and
bioengineers to better understand transport phenomena in plants," Cao
says. In addition, it can be used "to deliver payloads into plants, and
this can solve several problems. For example, you can think about
delivering micronutrients, or you can think about delivering genes, to
change the gene expression of the plant or to basically engineer a
plant."
"Now, the interests of the lab for the phytoinjectors have expanded
beyond antibiotic delivery to genetic engineering and point-of-care
diagnostics," Lim adds.
For example, in their experiments with tobacco plants, they were able
to inject an organism called Agrobacterium to alter the plant's DNA --
a typical bioengineering tool, but delivered in a new and precise way.
So far, this is a lab technique using precision equipment, so in its
present form it would not be useful for agricultural-scale
applications, but the hope is that it can be used, for example, to
bioengineer disease-resistant varieties of important crop plants. The
team has also done tests using a modified toy dart gun mounted to a
small drone, which was able to fire microneedles into plants in the
field. Ultimately, such a process might be automated using autonomous
vehicles, Marelli says, for agricultural-scale use.
Meanwhile, the team continues to work on adapting the system to the
varied needs and conditions of different kinds of plants and their
tissues. "There's a lot of variation among them, really," Marelli says,
so you need to think about having devices that are plant-specific. For
the future, our research interests will go beyond antibiotic delivery
to genetic engineering and point-of-care diagnostics based on
metabolite sampling."
The work was supported by the Office of Naval Research, the National
Science Foundation, and the Keck Foundation.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Massachusetts Institute of Technology.
Original written by David L. Chandler. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. Yunteng Cao, Eugene Lim, Menglong Xu, Jing‐Ke Weng, Benedetto
Marelli. Precision Delivery of Multiscale Payloads to
Tissue‐Specific Targets in Plants. Advanced Science, 2020; 1903551
DOI: [19]10.1002/advs.201903551
__________________________________________________________________
--- up 13 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Mon Apr 27 21:30:08 2020
Date:
April 27, 2020
Source:
Vienna University of Technology
Summary:
Last summer, it was discovered that there are promising
superconductors in a special class of materials, the so-called
nickelates. However, it soon became apparent that these
initially spectacular results could not be reproduced by other research
groups. Scientists have now found the reason for this: In some
nickelates additional hydrogen atoms are incorporated into the
material structure. This changes the electrical behavior of the
material.
FULL STORY
__________________________________________________________________
Last summer, a new age for high-temperature superconductivity was
proclaimed -- the nickel age. It was discovered that there are
promising superconductors in a special class of materials, the
so-called nickelates, which can conduct electric current without any
resistance even at high temperatures.
However, it soon became apparent that these initially spectacular
results from Stanford could not be reproduced by other research groups.
TU Wien (Vienna) has now found the reason for this: In some nickelates
additional hydrogen atoms are incorporated into the material structure.
This completely changes the electrical behaviour of the material. In
the production of the new superconductors, this effect must now be
taken into account.
The search for High-Temperature Superconductors
Some materials are superconducting only near absolute zero -- such
superconductors are not suitable for technical applications.
Therefore, for decades, people have been looking for materials that
remain superconducting even at higher temperatures. In the 1980s,
"high-temperature superconductors" were discovered. What is referred to
as "high temperatures" in this context, however, is still very cold:
even high-temperature superconductors must be cooled strongly in order
to obtain their superconducting properties. Therefore, the search for
new superconductors at even higher temperatures continues.
"For a long time, special attention was paid to so-called cuprates,
i.e. compounds containing copper. This is why we also speak of the
copper age," explains Prof. Karsten Held from the Institute of Solid
State Physics at TU Wien. "With these cuprates, some important progress
was made, even though there are still many open questions in the theory
of high-temperature superconductivity today."
But for some time now, other possibilities have also been under
consideration. There was already a so-called "iron age" based on
iron-containing superconductors. In summer 2019, Harold Y. Hwang's
research group at Stanford then succeeded in demonstrating
high-temperature superconductivity in nickelates. "Based
on our calculations, we already proposed nickelates as superconductors
10 years ago, but they were somewhat different from the ones that have
now been discovered. They are related to cuprates, but contain nickel
instead of copper atoms," says Karsten Held.
The Trouble with Hydrogen
After some initial enthusiasm, however, it has become apparent in
recent months that nickelate superconductors are more difficult to
produce than initially thought. Other research groups reported that
their nickelates do not have superconducting properties. This apparent
contradiction has now been clarified at TU Wien.
"We analysed the nickelates with the help of supercomputers and found
that they are extremely receptive to hydrogen into the material,"
reports Liang Si (TU Vienna). In the synthesis of certain nickelates,
hydrogen atoms can be incorporated, which completely changes the
electronic properties of the material. "However, this does not happen
with all nickelates," says Liang Si, "Our calculations show that for
most of them, it is energetically more favourable to incorporate
hydrogen, but not for the nickelates from Stanford. Even small changes
in the synthesis conditions can make a difference." On Friday, 24 April
2020, the group of Ariando Ariando at NUS Singapore reported that they,
too, had succeeded in producing superconducting nickelates: they let
the hydrogen that is released during the production process escape
immediately.
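The energetic argument can be phrased as a sign test on a binding
energy: hydrogen uptake is favourable when incorporating an H atom
lowers the total energy relative to the pristine material plus half a
hydrogen molecule. A minimal sketch with invented numbers (the actual
values come from the supercomputer calculations):

    # dE = E(ABO2 + H) - E(ABO2) - 0.5 * E(H2), in eV (hypothetical values).
    def hydrogen_binding_energy(E_with_H, E_without_H, E_H2):
        return E_with_H - E_without_H - 0.5 * E_H2

    dE = hydrogen_binding_energy(-101.3, -94.6, -6.8)
    print("favourable" if dE < 0 else "unfavourable", dE)  # favourable -3.3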
Calculating the Critical Temperature with Supercomputers
At TU Wien new computer calculation methods are being developed and
used to understand and predict the properties of nickelates. "Since a
large number of quantum-physical particles always play a role here at
the same time, the calculations are extremely complex," says Liang Si,
"But by combining different methods, we are now even able to estimate
the critical temperature up to which the various materials are
superconducting. Such reliable calculations have not been possible
before." In particular, the team at TU Wien was able to calculate the
allowed range of strontium concentration for which the nickelates are
superconducting -- and this prediction has now been confirmed in
experiment.
"High-temperature superconductivity is an extremely complex and
difficult field of research," says Karsten Held. "The new nickelate
superconductors, together with our theoretical understanding and the
predictive power of computer calculations, open up a whole new
perspective on the great dream of solid state physics: a superconductor
at ambient temperature that hence works without any cooling."
__________________________________________________________________
Story Source:
[17]Materials provided by [18]Vienna University of Technology. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Liang Si, Wen Xiao, Josef Kaufmann, Jan M. Tomczak, Yi Lu, Zhicheng
Zhong, Karsten Held. Topotactic Hydrogen in Nickelate
Superconductors and Akin Infinite-Layer Oxides ABO2. Physical
Review Letters, 2020; 124 (16) DOI:
[19]10.1103/PhysRevLett.124.166402
__________________________________________________________________
--- up 13 weeks, 6 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:08 2020
Date:
April 28, 2020
Source:
NASA/Jet Propulsion Laboratory
Summary:
Scientists have finally figured out the precise timing of a
complicated dance between two enormous black holes, revealing
hidden details about the physical characteristics of these
mysterious cosmic objects.
FULL STORY
__________________________________________________________________
This image shows two massive black holes in the OJ 287 galaxy. The
smaller black hole orbits the larger one, which is also surrounded by a
disk of gas. When the smaller black hole crashes through the disk, it
produces a flare brighter than 1 trillion stars.
Credit: NASA/JPL-Caltech
Black holes aren't stationary in space; in fact, they can be quite
active in their movements. But because they are completely dark and
can't be observed directly, they're not easy to study. Scientists have
finally figured out the precise timing of a complicated dance between
two enormous black holes, revealing hidden details about the physical
characteristics of these mysterious cosmic objects.
The OJ 287 galaxy hosts one of the largest black holes ever found, with
over 18 billion times the mass of our Sun. Orbiting this behemoth is
another black hole with about 150 million times the Sun's mass. Twice
every 12 years, the smaller black hole crashes through the enormous
disk of gas surrounding its larger companion, creating a flash of light
brighter than a trillion stars -- brighter, even, than the entire Milky
Way galaxy. The light takes 3.5 billion years to reach Earth.
But the smaller black hole's orbit is oblong, not circular, and it's
irregular: It shifts position with each loop around the bigger black
hole and is tilted relative to the disk of gas. When the smaller black
hole crashes through the disk, it creates two expanding bubbles of hot
gas that move away from the disk in opposite directions, and in less
than 48 hours the system appears to quadruple in brightness.
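For scale, a quadrupling of brightness corresponds to a change of about
1.5 astronomical magnitudes, as a one-line check shows:

    import math
    # Magnitude change for a factor-of-4 brightening:
    print(-2.5 * math.log10(4))  # ~ -1.5 (negative = brighter)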
Because of the irregular orbit, the black hole collides with the disk
at different times during each 12-year orbit. Sometimes the flares
appear as little as one year apart; other times, as much as 10 years
apart. Attempts to model the orbit and predict when the flares would
occur took decades, but in 2010, scientists created a model that could
predict their occurrence to within about one to three weeks. They
demonstrated that their model was correct by predicting the appearance
of a flare in December 2015 to within three weeks.
Then, in 2018, a group of scientists led by Lankeswar Dey, a graduate
student at the Tata Institute of Fundamental Research in Mumbai, India,
published a paper with an even more detailed model they claimed would
be able to predict the timing of future flares to within four hours. In
a new study published in the Astrophysical Journal Letters, those
scientists report that their accurate prediction of a flare that
occurred on July 31, 2019, confirms the model is correct.
The observation of that flare almost didn't happen. Because OJ 287 was
on the opposite side of the Sun from Earth, out of view of all
telescopes on the ground and in Earth orbit, the black hole wouldn't
come back into view of those telescopes until early September, long
after the flare had faded. But the system was within view of NASA's
Spitzer Space Telescope, which the agency retired in January 2020.
After 16 years of operations, the spacecraft's orbit had placed it 158
million miles (254 million kilometers) from Earth, or more than 600
times the distance between Earth and the Moon. From this vantage point,
Spitzer could observe the system from July 31 (the same day the flare
was expected to appear) to early September, when OJ 287 would become
observable to telescopes on Earth.
"When I first checked the visibility of OJ 287, I was shocked to find
that it became visible to Spitzer right on the day when the next flare
was predicted to occur," said Seppo Laine, an associate staff scientist
at Caltech/IPAC in Pasadena, California, who oversaw Spitzer's
observations of the system. "It was extremely fortunate that we would
be able to capture the peak of this flare with Spitzer, because no
other human-made instruments were capable of achieving this feat at
that specific point in time."
Ripples in Space
Scientists regularly model the orbits of small objects in our solar
system, like a comet looping around the Sun, taking into account the
factors that will most significantly influence their motion. For that
comet, the Sun's gravity is usually the dominant force, but the
gravitational pull of nearby planets can change its path, too.
Determining the motion of two enormous black holes is much more
complex. Scientists must account for factors that might not noticeably
impact smaller objects; chief among them are something called
gravitational waves. Einstein's theory of general relativity describes
gravity as the warping of space by an object's mass. When an object
moves through space, the distortions turn into waves. Einstein
predicted the existence of gravitational waves in 1916, but they
weren't observed directly until 2015 by the Laser Interferometer
Gravitational Wave Observatory (LIGO).
The larger an object's mass, the larger and more energetic the
gravitational waves it creates. In the OJ 287 system, scientists expect
the gravitational waves to be so large that they can carry enough
energy away from the system to measurably alter the smaller black
hole's orbit -- and therefore the timing of the flares.
While previous studies of OJ 287 have accounted for gravitational
waves, the 2018 model is the most detailed yet. By incorporating
information gathered from LIGO's detections of gravitational waves, it
refines the window in which a flare is expected to occur to just 1 1/2
days.
To further refine the prediction of the flares to just four hours, the
scientists folded in details about the larger black hole's physical
characteristics. Specifically, the new model incorporates something
called the "no-hair" theorem of black holes.
Published in the 1960s by a group of physicists that included Stephen
Hawking, the theorem makes a prediction about the nature of black hole
"surfaces." While black holes don't have true surfaces, scientists know
there is a boundary around them beyond which nothing -- not even light
-- can escape. Some ideas posit that the outer edge, called the event
horizon, could be bumpy or irregular, but the no-hair theorem posits
that the "surface" has no such features, not even hair (the theorem's
name was a joke).
In other words, if one were to cut the black hole down the middle along
its rotational axis, the surface would be symmetric. (The Earth's
rotational axis is almost perfectly aligned with its North and South
Poles. If you cut the planet in half along that axis and compared the
two halves, you would find that our planet is mostly symmetric, though
features like oceans and mountains create some small variations between
the halves.)
Finding Symmetry
In the 1970s, Caltech professor emeritus Kip Thorne described how this
scenario -- a satellite orbiting a massive black hole -- could
potentially reveal whether the black hole's surface was smooth or
bumpy. By correctly anticipating the smaller black hole's orbit with
such precision, the new model supports the no-hair theorem, meaning our
basic understanding of these incredibly strange cosmic objects is
correct. The OJ 287 system, in other words, supports the idea that
black hole surfaces are symmetric along their rotational axes.
So how does the smoothness of the massive black hole's surface impact
the timing of the smaller black hole's orbit? That orbit is determined
mostly by the mass of the larger black hole. If it grew more massive or
shed some of its heft, that would change the size of the smaller black
hole's orbit. But the distribution of mass matters as well. A massive
bulge on one side of the larger black hole would distort the space
around it differently than if the black hole were symmetric. That would
then alter the smaller black hole's path as it orbits its companion and
measurably change the timing of the black hole's collision with the
disk on that particular orbit.
"It is important to black hole scientists that we prove or disprove the
no-hair theorem. Without it, we cannot trust that black holes as
envisaged by Hawking and others exist at all," said Mauri Valtonen, an
astrophysicist at University of Turku in Finland and a coauthor on the
paper.
Spitzer science data continues to be analyzed by the science community
via the Spitzer data archive located at the Infrared Science Archive
housed at IPAC at Caltech in Pasadena. JPL managed Spitzer mission
operations for NASA's Science Mission Directorate in Washington.
Science operations were conducted at the Spitzer Science Center at IPAC
at Caltech. Spacecraft operations were based at Lockheed Martin Space
in Littleton, Colorado. Caltech manages JPL for NASA.
For more information about Spitzer, visit:
* [17] https://www.nasa.gov/spitzer
* [18] http://www.spitzer.caltech.edu/
__________________________________________________________________
Story Source:
[19]Materials provided by [20]NASA/Jet Propulsion Laboratory. Note:
Content may be edited for style and length.
__________________________________________________________________
Related Multimedia:
* [21]YouTube video: Timing of Black Hole Dance Revealed by NASA
Spitzer Space Telescope
__________________________________________________________________
Journal Reference:
1. Seppo Laine, Lankeswar Dey, Mauri Valtonen, A. Gopakumar, Stanislaw
Zola, S. Komossa, Mark Kidger, Pauli Pihajoki, José L. Gómez,
Daniel Caton, Stefano Ciprini, Marek Drozdz, Kosmas Gazeas, Vira
Godunova, Shirin Haque, Felix Hildebrandt, Rene Hudec, Helen
Jermak, Albert K. H. Kong, Harry Lehto, Alexios Liakos, Katsura
Matsumoto, Markus Mugrauer, Tapio Pursimo, Daniel E. Reichart,
Andrii Simon, Michal Siwak, Eda Sonbas. Spitzer Observations of the
Predicted Eddington Flare from Blazar OJ 287. The Astrophysical
Journal, 2020; 894 (1): L1 DOI: [22]10.3847/2041-8213/ab79a4
__________________________________________________________________
--- up 14 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:08 2020
Date:
April 28, 2020
Source:
NASA/Jet Propulsion Laboratory
Summary:
The large near-Earth object is well known to astronomers and
will get no closer than 3.9 million miles to our planet.
FULL STORY
__________________________________________________________________
A large near-Earth asteroid will safely pass by our planet on Wednesday
morning, providing astronomers with an exceptional opportunity to study
the 1.5-mile-wide (2-kilometer-wide) object in great detail.
The asteroid, called 1998 OR2, will make its closest approach at 5:55
a.m. EDT (2:55 a.m. PDT). While this is known as a "close approach" by
astronomers, it's still very far away: The asteroid will get no closer
than about 3.9 million miles (6.3 million kilometers), passing more
than 16 times farther away than the Moon.
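The "more than 16 times farther away than the Moon" figure is easy to
verify from the numbers given:

    closest_km = 6.3e6    # 1998 OR2 closest approach, from the article
    lunar_km = 384_400    # mean Earth-Moon distance
    print(closest_km / lunar_km)  # ~16.4 lunar distances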
Asteroid 1998 OR2 was discovered by the Near-Earth Asteroid Tracking
program at NASA's Jet Propulsion Laboratory in July 1998, and for the
past two decades astronomers have tracked it. As a result, we
understand its orbital trajectory very precisely, and we can say with
confidence that this asteroid poses no possibility of impact for at
least the next 200 years. Its next close approach to Earth will occur
in 2079, when it will pass by closer -- only about four times the lunar
distance.
Despite this, 1998 OR2 is still categorized as a large "potentially
hazardous asteroid" because, over the course of millennia, very slight
changes in the asteroid's orbit may cause it to present more of a
hazard to Earth than it does now. This is one of the reasons why
tracking this asteroid during its close approach -- using telescopes
and especially ground-based radar -- is important, as observations such
as these will enable an even better long-term assessment of the hazard
presented by this asteroid.
Close approaches by large asteroids like 1998 OR2 are quite rare. The
previous close approach by a large asteroid was made by asteroid
Florence in September 2017. That 3-mile-wide (5-kilometer-wide) object
zoomed past Earth at 18 lunar distances. On average, we expect
asteroids of this size to fly by our planet this close roughly once
every five years.
Asteroids of this size reflect much more light
than smaller asteroids and are therefore easier to detect with
telescopes. Almost all near-Earth asteroids (about 98%) of the size of
1998 OR2 or larger have already been discovered, tracked and cataloged.
It is extremely unlikely there could be an impact over the next century
by one of these large asteroids, but efforts to discover all asteroids
that could pose an impact hazard to Earth continue.
JPL hosts the Center for Near-Earth Object Studies (CNEOS) for NASA's
Near-Earth Object Observations Program in NASA's Planetary Defense
Coordination Office.
More information about CNEOS, asteroids and near-Earth objects can be
found at:
[17] https://cneos.jpl.nasa.gov
For more information about NASA's Planetary Defense Coordination
Office, visit:
[18] https://www.nasa.gov/planetarydefense
For asteroid and comet news and updates, follow @AsteroidWatch on
Twitter:
[19] https://twitter.com/AsteroidWatch
__________________________________________________________________
Story Source:
[20]Materials provided by [21]NASA/Jet Propulsion Laboratory. Note:
Content may be edited for style and length.
__________________________________________________________________
--- up 14 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:08 2020
Date:
April 28, 2020
Source:
ESA/Hubble Information Centre
Summary:
The NASA/ESA Hubble Space Telescope has provided astronomers
with the sharpest view yet of the breakup of Comet C/2019 Y4
(ATLAS). The telescope resolved roughly 30 fragments of the
fragile comet on April 20 and 25 pieces on April 23.
FULL STORY
__________________________________________________________________
Hubble's new observations of Comet C/2019 Y4 (ATLAS).
Credit: NASA, ESA, D. Jewitt (UCLA), Quanzhi Ye (University of
Maryland)
The NASA/ESA Hubble Space Telescope has provided astronomers with the
sharpest view yet of the breakup of Comet C/2019 Y4 (ATLAS). The
telescope resolved roughly 30 fragments of the fragile comet on 20
April and 25 pieces on 23 April.
The comet was first discovered in December 2019 by the ATLAS (Asteroid
Terrestrial-impact Last Alert System) robotic astronomical survey
system in Hawaiʻi, USA. It brightened quickly until mid-March, and some
astronomers initially anticipated that it might become visible to the
naked eye in May, making it one of the most spectacular comets seen in
the last two decades. However, the comet abruptly began to dim,
leading astronomers to speculate that the icy core may be fragmenting,
or even disintegrating. ATLAS's fragmentation was confirmed by amateur
astronomer Jose de Queiroz, who photographed three pieces of the
comet on 11 April.
The Hubble Space Telescope's new observations of the comet's breakup on
20 and 23 April reveal that the broken fragments are all enveloped in a
sunlight-swept tail of cometary dust. These images provide further
evidence that comet fragmentation is probably common and might even be
the dominant mechanism by which the solid, icy nuclei of comets die.
"Their appearance changes substantially between the two days, so much
so that it's quite difficult to connect the dots," said David Jewitt of
UCLA, leader of one of two teams who imaged the doomed comet with
Hubble. "I don't know whether this is because the individual pieces are
flashing on and off as they reflect sunlight, acting like twinkling
lights on a Christmas tree, or because different fragments appear on
different days."
"This is really exciting -- both because such events are super cool to
watch and because they do not happen very often. Most comets that
fragment are too dim to see. Events at such scale only happen once or
twice a decade," said the leader of the second Hubble observing team,
Quanzhi Ye, of the University of Maryland.
Because comet fragmentation happens quickly and unpredictably, reliable
observations are rare. Therefore, astronomers remain largely uncertain
about the cause of fragmentation. One suggestion is that the original
nucleus spins itself into pieces because of the jet action of
outgassing from sublimating ices. As this venting is likely not evenly
dispersed across the comet, it enhances the breakup. "Further analysis
of the Hubble data might be able to show whether or not this mechanism
is responsible," said Jewitt. "Regardless, it's quite special to get a
look with Hubble at this dying comet."
Hubble's crisp images may yield new clues to the breakup. The telescope
has distinguished pieces as small as the size of a house. Before the
breakup, the entire nucleus may have been no more than the length of
two football fields.
The disintegrating comet ATLAS is currently located inside the orbit of
Mars and was approximately 145 million kilometres from Earth when the
latest Hubble observations were taken. The comet will make its
closest approach to Earth on 23 May at a distance of approximately 115
million kilometres, and eight days later it will skirt within 37
million kilometres of the Sun.
The Hubble Space Telescope is a project of international cooperation
between ESA and NASA.
__________________________________________________________________
Story Source:
[17]Materials provided by [18]ESA/Hubble Information Centre. Note:
Content may be edited for style and length.
__________________________________________________________________
Related Multimedia:
* [19]Images and videos of Hubble's new observations of Comet C/2019
Y4 (ATLAS)
__________________________________________________________________
--- up 14 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:08 2020
Date:
April 28, 2020
Source:
National Institutes of Natural Sciences
Summary:
Researchers have used the infrastructure of the former TAMA300
gravitational wave detector in Mitaka, Tokyo to demonstrate a
new technique to reduce quantum noise in detectors. This new
technique will help increase the sensitivity of the detectors
comprising a collaborative worldwide gravitational wave network,
allowing them to observe fainter waves.
FULL STORY
__________________________________________________________________
Researchers at the National Astronomical Observatory of Japan (NAOJ)
have used the infrastructure of the former TAMA300 gravitational wave
detector in Mitaka, Tokyo to demonstrate a new technique to reduce
quantum noise in detectors. This new technique will help increase the
sensitivity of the detectors comprising a collaborative worldwide
gravitational wave network, allowing them to observe fainter waves.
When it began observations in 2000, TAMA300 was one of the world's
first large-scale interferometric gravitational wave detectors. At that
time TAMA300 had the highest sensitivity in the world, setting an upper
limit on the strength of gravitational wave signals; but the first
detection of actual gravitational waves was made 15 years later in 2015
by LIGO. Since then detector technology has improved to the point that
modern detectors are observing several signals per month. The
scientific results obtained from these observations are already
impressive and many more are expected in the next decades. TAMA300 is
no longer participating in observations, but is still active, acting as
a testbed for new technologies to improve other detectors.
The sensitivity of current and future gravitational wave detectors is
limited at almost all the frequencies by quantum noise caused by the
effects of vacuum fluctuations of the electromagnetic fields. But even
this inherent quantum noise can be sidestepped. It is possible to
manipulate the vacuum fluctuations to redistribute the quantum
uncertainties, decreasing one type of noise at the expense of increasing
a different, less obstructive type of noise. This technique, known as
vacuum squeezing, has already been implemented in gravitational wave
detectors, greatly increasing their sensitivity to higher frequency
gravitational waves. But the optomechanical interaction between the
electromagnetic field and the mirrors of the detector causes the effects
of vacuum squeezing to change depending on the frequency. So at low
frequencies vacuum squeezing increases the wrong type of noise,
actually degrading sensitivity.
To overcome this limitation and achieve reduced noise at all
frequencies, a team at NAOJ composed of members of the in-house
Gravitational Wave Science Project and the KAGRA collaboration (but
also including researchers of the Virgo and GEO collaborations) has
recently demonstrated the feasibility of a technique known as frequency
dependent vacuum squeezing, at the frequencies useful for gravitational
wave detectors. Because the detector itself interacts with the
electromagnetic fields differently depending on the frequency, the team
used the infrastructure of the former TAMA300 detector to create a
field which itself varies depending on frequency. A normal (frequency
independent) squeezed vacuum field is reflected off a 300-m-long optical
cavity, such that a frequency dependence is imprinted and the field can
counteract the optomechanical effect of the interferometer.
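As a rough back-of-the-envelope illustration (not taken from the paper
itself), the bandwidth of a 300-m cavity sets the frequency scale at
which the reflected squeezed field changes character; the finesse in the
short Python sketch below is an assumed placeholder, not the
experiment's value.

    # Back-of-the-envelope scale (not from the paper): the pole frequency
    # of a 300-m filter cavity, which sets where the reflected squeezed
    # field picks up its frequency dependence.
    c = 299_792_458.0            # speed of light, m/s
    length = 300.0               # filter cavity length, m
    finesse = 4500.0             # hypothetical finesse, for illustration only

    fsr = c / (2 * length)       # free spectral range, Hz (~500 kHz)
    pole = fsr / (2 * finesse)   # cavity pole = half linewidth, Hz

    print(f"cavity pole ~ {pole:.0f} Hz")  # tens of Hz: inside the band of
                                           # gravitational wave detectors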
This technique will allow improved sensitivity at both high and low
frequencies simultaneously. This is a crucial result demonstrating a
key technology to improve the sensitivity of future detectors. Its
implementation, planned as a near term upgrade together with other
improvements, is expected to double the observation range of
second-generation detectors.
__________________________________________________________________
Story Source:
Materials provided by National Institutes of Natural Sciences.
Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Yuhang Zhao, Naoki Aritomi, Eleonora Capocasa, Matteo Leonardi,
Marc Eisenmann, Yuefan Guo, Eleonora Polini, Akihiro Tomura, Koji
Arai, Yoichi Aso, Yao-Chin Huang, Ray-Kuang Lee, Harald Lück, Osamu
Miyakawa, Pierre Prat, Ayaka Shoda, Matteo Tacca, Ryutaro
Takahashi, Henning Vahlbruch, Marco Vardaro, Chien-Ming Wu, Matteo
Barsuglia, Raffaele Flaminio. A frequency-dependent squeezed vacuum
source for broadband quantum noise reduction in advanced
gravitational-wave detectors. Physical Review Letters, 2020
__________________________________________________________________
--- up 14 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:10 2020
Date:
April 28, 2020
Source:
University of Illinois at Urbana-Champaign, News Bureau
Summary:
Miniature biological robots are making greater strides than
ever, thanks to the spinal cord directing their steps.
Researchers developed the tiny walking 'spinobots,' powered by
rat muscle and spinal cord tissue on a soft, 3D-printed hydrogel
skeleton. While previous generations of biological robots, or
bio-bots, could move forward by simple muscle contraction, the
integration of the spinal cord gives them a more natural walking
rhythm.
FULL STORY
__________________________________________________________________
Miniature biological robots are making greater strides than ever,
thanks to the spinal cord directing their steps.
University of Illinois at Urbana-Champaign researchers developed the
tiny walking "spinobots," powered by rat muscle and spinal cord tissue
on a soft, 3D-printed hydrogel skeleton. While previous generations of
biological robots, or bio-bots, could move forward by simple muscle
contraction, the integration of the spinal cord gives them a more
natural walking rhythm, said study leader Martha Gillette, a professor
of cell and developmental biology.
"These are the beginnings of a direction toward interactive biological
devices that could have applications for neurocomputing and for
restorative medicine," Gillette said.
The researchers published their findings in the journal APL
Bioengineering.
To make the spinobots, the researchers first printed the tiny skeleton:
two posts for legs and a flexible "backbone," only a few millimeters
across. Then, they seeded it with muscle cells, which grew into muscle
tissue. Finally, they integrated a segment of lumbar spinal cord from a
rat.
"We specifically selected the lumbar spinal cord because previous work
has demonstrated that it houses the circuits that control left-right
alternation for lower limbs during walking," said graduate student
Collin Kaufman, the first author of the paper. "From an engineering
perspective, neurons are necessary to drive ever more complex,
coordinated muscle movements. The most challenging obstacle for
innervation was that nobody had ever cultured an intact rodent spinal
cord before."
The researchers had to devise a method not only to extract the intact
spinal cord and then culture it, but also to integrate it onto the
bio-bot and culture the muscle and nerve tissue together -- all in such
a way that the neurons form junctions with the muscle.
The researchers saw spontaneous muscle contractions in the spinobots,
signaling that the desired neuro-muscular junctions had formed and the
two cell types were communicating. To verify that the spinal cord was
functioning as it should to promote walking, the researchers added
glutamate, a neurotransmitter that prompts nerves to signal muscle to
contract.
The glutamate caused the muscle to contract and the legs to move in a
natural walking rhythm. When the glutamate was rinsed away, the
spinobots stopped walking.
Next, the researchers plan to further refine the spinobots' movement,
making their gaits more natural. The researchers hope this small-scale
spinal cord integration is a first step toward creating in vitro models
of the peripheral nervous system, which is difficult to study in live
patients or animal models.
"The development of an in vitro peripheral nervous system -- spinal
cord, outgrowths and innervated muscle -- could allow researchers to
study neurodegenerative diseases such as ALS in real time with greater
ease of access to all the impacted components," Kaufman said. "There
are also a variety of ways that this technology could be used as a
surgical training tool, from acting as a practice dummy made of real
biological tissue to actually helping perform the surgery itself. These
applications are, for now, in the fairly distant future, but the
inclusion of an intact spinal cord circuit is an important step
forward."
__________________________________________________________________
Story Source:
Materials provided by University of Illinois at
Urbana-Champaign, News Bureau. Original written by Liz Ahlberg
Touchstone. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. C. D. Kaufman, S. C. Liu, C. Cvetkovic, C. A. Lee, G. Naseri
Kouzehgarani, R. Gillette, R. Bashir, M. U. Gillette. Emergence of
functional neuromuscular junctions in an engineered, multicellular
spinal cord-muscle bioactuator. APL Bioengineering, 2020; 4 (2):
026104 DOI: 10.1063/1.5121440
__________________________________________________________________
--- up 14 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:10 2020
Strings attached to hand, fingers create more realistic haptic feedback
Date:
April 28, 2020
Source:
Carnegie Mellon University
Summary:
Today's virtual reality systems can create immersive visual
experiences, but seldom do they enable users to feel anything --
particularly walls, appliances and furniture. A new device,
however, uses multiple strings attached to the hand and fingers
to simulate the feel of obstacles and heavy objects.
FULL STORY
__________________________________________________________________
Today's virtual reality systems can create immersive visual
experiences, but seldom do they enable users to feel anything --
particularly walls, appliances and furniture. A new device developed at
Carnegie Mellon University, however, uses multiple strings attached to
the hand and fingers to simulate the feel of obstacles and heavy
objects.
By locking the strings when the user's hand is near a virtual wall, for
instance, the device simulates the sense of touching the wall.
Similarly, the string mechanism enables people to feel the contours of
a virtual sculpture, sense resistance when they push on a piece of
furniture or even give a high five to a virtual character.
Cathy Fang, who will graduate from CMU next month with a joint degree
in mechanical engineering and human-computer interaction, said the
shoulder-mounted device takes advantage of spring-loaded strings to
reduce weight, consume less battery power and keep costs low.
"Elements such as walls, furniture and virtual characters are key to
building immersive virtual worlds, and yet contemporary VR systems do
little more than vibrate hand controllers," said Chris Harrison,
assistant professor in CMU's Human-Computer Interaction Institute
(HCII). User evaluation of the multistring device, as reported by
co-authors Harrison, Fang, Robotics Institute engineer Matthew Dworman
and HCII doctoral student Yang Zhang, found it was more realistic than
other haptic techniques.
"I think the experience creates surprises, such as when you interact
with a railing and can wrap your fingers around it," Fang said. "It's
also fun to explore the feel of irregular objects, such as a statue."
The team's research paper was named a best paper by the Conference on
Human Factors in Computing Systems (CHI 2020), which was scheduled for
this month but canceled due to the COVID-19 pandemic. The paper has now
been published in the conference proceedings in the Association for
Computing Machinery's Digital Library.
Other researchers have used strings to create haptic feedback in
virtual worlds, but typically they use motors to control the strings.
Motors wouldn't work for the CMU researchers, who envisioned a system
both light enough to be worn by the user and affordable for consumers.
"The downside to motors is they consume a lot of power," Fang said.
"They also are heavy."
Instead of motors, the team used spring-loaded retractors, similar to
those seen in key chains or ID badges. They added a ratchet mechanism
that can be rapidly locked with an electrically controlled latch. The
springs, not motors, keep the strings taut. Only a small amount of
electrical power is needed to engage the latch, so the system is energy
efficient and can be operated on battery power.
The researchers experimented with a number of different strings and
string placements, eventually concluding that attaching one string to
each fingertip, one to the palm and one to the wrist provided the best
experience. A Leap Motion sensor, which tracks hand and finger motions,
is attached to the VR headset. When it senses that a user's hand is in
proximity to a virtual wall or other obstacle, the ratchets are engaged
in a sequence suited to those virtual objects. The latches disengage
when the person withdraws their hand.
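In software terms, that control logic amounts to a per-string proximity
test against the virtual scene. The Python sketch below is purely
illustrative: the scene, threshold and hardware calls are invented
stand-ins, not the CMU system's actual code.

    # Hypothetical sketch of the latch logic described above; names and
    # values are invented for illustration.
    ENGAGE_DISTANCE = 0.02  # metres: assumed proximity threshold

    def distance_to_wall(point, wall_x=0.50):
        # Toy scene containing a single virtual wall at x = wall_x.
        return abs(wall_x - point[0])

    def set_latch(anchor, locked):
        # Stand-in for the electrically controlled ratchet latch.
        print(f"{anchor}: {'locked' if locked else 'free'}")

    def update(hand_pose):
        # One string per attachment point (five fingertips, palm, wrist):
        # lock the ratchet when that point reaches the virtual surface.
        for anchor, point in hand_pose.items():
            set_latch(anchor, distance_to_wall(point) < ENGAGE_DISTANCE)

    # One tracking frame: index fingertip at the wall, palm still clear of it.
    update({"index": (0.49, 0.0, 0.1), "palm": (0.42, 0.0, 0.1)})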
The entire device weighs less than 10 ounces. The researchers estimate
that a mass-produced version would cost less than $50.
Fang said the system would be suitable for VR games and experiences
that involve interacting with physical obstacles and objects, such as a
maze. It might also be used for visits to virtual museums. And, in a
time when physically visiting retail stores is not always possible,
"you might also use it to shop in a furniture store," she added.
__________________________________________________________________
Story Source:
Materials provided by Carnegie Mellon University. Note: Content
may be edited for style and length.
__________________________________________________________________
--- up 14 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:10 2020
spread
Date:
April 28, 2020
Source:
American Institute of Physics
Summary:
Models drawing on chaos theory find growth in nine countries
conforms to a power law curve and highlight the value of strict
social distancing and testing policies.
FULL STORY
__________________________________________________________________
Many months since the first COVID-19 outbreak in Wuhan, China,
countries continue to explore solutions that are effective at managing
the spread of the virus and culturally feasible to implement. Chaos
theory analysis has provided insight into how well infection prevention
strategies can be adopted by multiple countries.
Researchers in Brazil analyzed the growth of confirmed infected
COVID-19 cases across four continents to better characterize the spread
of the virus and examine which strategies are effective in reducing its
spread.
Their results, published in Chaos, from AIP Publishing, found that the
virus commonly grows along a power law curve, in which the social,
economic and geographical features of a particular area affect the
exponent of the growth rather than the traits of the infection itself.
"We decided to use our expertise to perform extensive numerical
analysis using the real-time series of the cumulative confirmed cases
of COVID-19 in order to search for answers about the spreading of this
pathogen," said author Cesar Manchein.
The study draws on data current through March 27 from Brazil, China,
France, Germany, Italy, Japan, South Korea, Spain and the United
States.
The group's approach draws on a technique called numerical modeling,
which leverages computing power to solve a set of differential equations
when drawing comparisons between groups.
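A power law N(t) = A * t^mu appears as a straight line on log-log axes,
which is how such a growth exponent can be estimated from cumulative
case counts. A minimal Python sketch on synthetic data (not the study's
code or data):

    # Minimal sketch (synthetic data, not the study's): a power law
    # N(t) = A * t**mu is a straight line in log-log space, so the growth
    # exponent mu can be estimated with a linear least-squares fit.
    import numpy as np

    t = np.arange(1, 31)            # days since the outbreak began
    cases = 5.0 * t**2.2            # synthetic cumulative cases, exponent 2.2

    mu, log_a = np.polyfit(np.log(t), np.log(cases), 1)
    print(f"fitted growth exponent mu = {mu:.2f}")   # recovers ~2.2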
The high correlation in power law curves between each of the countries
has allowed the group to single out generic effective strategies.
Softer quarantine measures, they write, are inefficient at flattening
curves compared to stricter isolation guidelines.
"Our results essentially show that an efficient strategy to avoid the
increase of the number of infected individuals by coronavirus combines
two actions: Keep to a high level of social distance and implement a
significant number of tests to identify and isolate asymptomatic
individuals," said author Rafael M. da Silva.
They mention that this combination of stay-at-home measures and more
aggressive disease testing is essentially the strategy used in South
Korea.
The researchers plan on continuing to apply real-world data to further
improve their model. Da Silva said the group hopes to use their models
to test distinct strategies that could avoid the use of long
quarantines.
"Physics and chaos theory researchers can have a fundamental role in
the battle against the coronavirus," said author Cesar Manchein. "From
the theoretical point of view, researchers can use their knowledge and
experience to study the time and territorial evolution of the disease."
__________________________________________________________________
Story Source:
Materials provided by American Institute of Physics. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Cesar Manchein, Eduardo L. Brugnago, Rafael M. da Silva, Carlos F.
O. Mendes, Marcus W. Beims. Strong correlations between power-law
growth of COVID-19 in four continents and the inefficiency of soft
quarantine strategies. Chaos: An Interdisciplinary Journal of
Nonlinear Science, 2020; 30 (4): 041102 DOI: 10.1063/5.0009454
__________________________________________________________________
--- up 14 weeks, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:16 2020
Date:
April 28, 2020
Source:
University of Texas at Austin
Summary:
A group of researchers has found a way to stabilize one of the
most challenging parts of lithium-sulfur batteries, bringing the
technology closer to becoming commercially viable.
FULL STORY
__________________________________________________________________
Lithium-sulfur batteries have been hailed as the next big step in
battery technology, promising significantly longer use for everything
from cellphones to electric vehicles on a single charge, while being
more environmentally sustainable to produce than current lithium-ion
batteries. However, these batteries don't last as long as their
lithium-ion counterparts, degrading over time.
A group of researchers in the Cockrell School of Engineering at The
University of Texas at Austin has found a way to stabilize one of the
most challenging parts of lithium-sulfur batteries, bringing the
technology closer to becoming commercially viable. The team's findings,
published today in Joule, show that creating an artificial
tellurium-containing layer in situ on top of the lithium metal inside
the battery can make it last four times longer.
"Sulfur is abundant and environmentally benign with no supply chain
issues in the U.S.," said Arumugam Manthiram, a professor of mechanical
engineering and director of the Texas Materials Institute. "But there
are engineering challenges. We've reduced a problem to extend the cycle
life of these batteries."
Lithium is a reactive element that tends to break down other elements
around it. Every cycle of a lithium-sulfur battery -- the process of
charging and discharging it -- can cause mossy, needle-like deposits to
form on the lithium-metal anode, the negative electrode of the battery.
This starts a reaction that can lead to the battery's overall
degradation.
The deposits break down the electrolyte that shuttles lithium ions back
and forth. This can trap some of the lithium, keeping the electrode
from delivering the full power necessary for the ultra-long use the
technology promises. The reaction can also cause the battery to
short-circuit and potentially catch fire.
The artificial layer formed on the lithium electrode protects the
electrolyte from being degraded and reduces the mossy structures that
trap lithium from forming during charges.
"The layer formed on lithium surface allows it to operate without
breaking down the electrolyte, and that makes the battery last much
longer," said Amruth Bhargav, who, along with fellow graduate student
Sanjay Nanda, co-authored the paper.
Manthiram added that this method can be applied to other lithium- and
sodium-based batteries. The researchers have filed a provisional patent
application for the technology.
"The stabilizing layer is formed by a simple in-situ process and
requires no expensive or complicated pre-treatment or coating
procedures on the lithium-metal anode," Nanda said.
Solving the instability of this part of the battery is key to extending
its cycle life and bringing about wider adoption. Manthiram said that
lithium-sulfur batteries are currently best suited for devices that
need lightweight batteries and can run for a long time on a single
charge and don't require a large number of charge cycles, such as
drones. But they have the potential to play an important role in
extending the range of electric vehicles and increasing renewable
energy adoption.
Both the positive and negative electrodes in lithium-sulfur batteries
hold 10 times as much charge capacity as the materials used in today's
lithium-ion batteries, Manthiram said, which means they can deliver
much more use out of a single charge. Sulfur is widely available as a
byproduct from the oil and gas industry, making the batteries
inexpensive to produce. Sulfur is also more environmentally friendly
than the metal oxide materials used in lithium-ion batteries.
__________________________________________________________________
Story Source:
Materials provided by University of Texas at Austin. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Sanjay Nanda, Amruth Bhargav, Arumugam Manthiram. Anode-free,
Lean-Electrolyte Lithium-Sulfur Batteries Enabled by
Tellurium-Stabilized Lithium Deposition. Joule, 2020; DOI:
10.1016/j.joule.2020.03.020
__________________________________________________________________
--- up 14 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:16 2020
Textbook formulas for describing heat flow characteristics, crucial in many industries, are oversimplified, study shows.
Date:
April 28, 2020
Source:
Massachusetts Institute of Technology
Summary:
Textbook formulas for describing heat flow characteristics,
crucial in many industries, are oversimplified, study shows.
FULL STORY
__________________________________________________________________
Whether it's water flowing across a condenser plate in an industrial
plant, or air whooshing through heating and cooling ducts, the flow of
fluid across flat surfaces is a phenomenon at the heart of many of the
processes of modern life. Yet, aspects of this process have been poorly
understood, and some have been taught incorrectly to generations of
engineering students, a new analysis shows.
The study examined several decades of published research and analysis
on fluid flows. It found that, while most undergraduate textbooks and
classroom instruction in heat transfer describe such flow as having two
different zones separated by an abrupt transition, in fact there are
three distinct zones. A lengthy transitional zone is just as
significant as the first and final zones, the researchers say.
The discrepancy has to do with the shift between two different ways
that fluids can flow. When water or air starts to flow along a flat,
solid sheet, a thin boundary layer forms. Within this layer, the part
closest to the surface barely moves at all because of friction, the
part just above that flows a little faster, and so on, until a point
where it is moving at the full speed of the original flow. This steady,
gradual increase in speed across a thin boundary layer is called
laminar flow. But further downstream, the flow changes, breaking up into
the chaotic whirls and eddies known as turbulent flow.
The properties of this boundary layer determine how well the fluid can
transfer heat, which is key to many cooling processes such as for
high-performance computers, desalination plants, or power plant
condensers.
Students have been taught to calculate the characteristics of such
flows as if there was a sudden change from laminar flow to turbulent
flow. But John Lienhard, the Abdul Lateef Jameel Professor of Water and
of mechanical engineering at MIT, made a careful analysis of published
experimental data and found that this picture ignores an important part
of the process. The findings were just published in the Journal of Heat
Transfer.
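For reference, the abrupt-transition picture being critiqued corresponds
to the classical textbook recipe: one laminar and one turbulent
correlation for the local Nusselt number, switched at a single critical
Reynolds number. A minimal Python sketch of that simplified model (the
standard flat-plate correlations, not Lienhard's new three-zone
correlation):

    # The classical "abrupt transition" textbook model for local heat
    # transfer on a flat plate: one laminar and one turbulent correlation,
    # switched at a critical Reynolds number (commonly ~500,000). This is
    # the oversimplification discussed above, not the paper's new fit.
    def local_nusselt(re_x, pr, re_crit=5.0e5):
        if re_x < re_crit:
            return 0.332 * re_x**0.5 * pr**(1 / 3)   # laminar zone
        return 0.0296 * re_x**0.8 * pr**(1 / 3)      # fully turbulent zone

    # Just below and above the critical Reynolds number the predicted heat
    # transfer jumps discontinuously; the gradual transition zone is missing.
    for re_x in (4.9e5, 5.1e5):
        print(f"Re_x = {re_x:.1e}: Nu_x = {local_nusselt(re_x, pr=0.7):.0f}")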
Lienhard's review of heat transfer data reveals a significant
transition zone between the laminar and turbulent flows. This
transition zone's resistance to heat flow varies gradually between
those of the two other zones, and the zone is just as long and
distinctive as the laminar flow zone that precedes it.
The findings could potentially have implications for everything from
the design of heat exchangers for desalination or other industrial
scale processes, to understanding the flow of air through jet engines,
Lienhard says.
In fact, though, most engineers working on such systems understand the
existence of a long transition zone, even if it's not in the
undergraduate textbooks, Lienhard notes. Now, by clarifying and
quantifying the transition, this study will help to bring theory and
teaching into line with real-world engineering practice. "The notion of
an abrupt transition has been ingrained in heat transfer textbooks and
classrooms for the past 60 or 70 years," he says.
The basic formulas for understanding flow along a flat surface are the
fundamental underpinnings for all of the more complex flow situations
such as airflow over a curved airplane wing or turbine blade, or for
cooling space vehicles as they reenter the atmosphere. "The flat
surface is the starting point for understanding how any of those things
work," Lienhard says.
The theory for flat surfaces was set out by the German researcher Ernst
Pohlhausen in 1921. But even so, "lab experiments usually didn't match
the boundary conditions assumed by the theory. A laboratory plate might
have a rounded edge or a nonuniform temperature, so investigators in
the 1940s, 50s, and 60s often 'adjusted' their data to force agreement
with this theory," he says. Discrepancies between otherwise good data
and this theory also led to heated disagreements among specialists in
the heat transfer literature.
Lienhard found that researchers with the British Air Ministry had
identified and partially solved the problem of nonuniform surface
temperatures in 1931. "But they weren't able to fully solve the
equation they derived," he says. "That had to wait until digital
computers could be used, starting in 1949." Meanwhile, the arguments
between specialists simmered on.
Lienhard says that he decided to take a look at the experimental basis
for the equations that were being taught, realizing that researchers
have known for decades that the transition played a significant role.
"I wanted to plot data with these equations. That way, students could
see how well the equations did or didn't work," he said. "I looked at
the experimental literature all the way back to 1930. Collecting these
data made something very clear: What we were teaching was terribly
oversimplified." And the discrepancy in the description of fluid flow
meant that calculations of heat transfer were sometimes off.
Now, with this new analysis, engineers and students will be able to
calculate temperature and heat flow accurately across a very wide range
of flow conditions and fluids, Lienhard says.
__________________________________________________________________
Story Source:
Materials provided by Massachusetts Institute of Technology.
Original written by David L. Chandler. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. John H. Lienhard. Heat Transfer in Flat-Plate Boundary Layers: A
Correlation for Laminar, Transitional, and Turbulent Flow. Journal
of Heat Transfer, 2020; 142 (6) DOI: 10.1115/1.4046795
__________________________________________________________________
--- up 14 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:16 2020
Date:
April 28, 2020
Source:
DOE/Princeton Plasma Physics Laboratory
Summary:
New research points to improved control of troublesome magnetic
islands in future fusion facilities.
FULL STORY
__________________________________________________________________
A key challenge to capturing and controlling fusion energy on Earth is
maintaining the stability of plasma -- the electrically charged gas
that fuels fusion reactions -- and keeping it millions of degrees hot
to launch and maintain fusion reactions. This challenge requires
controlling magnetic islands, bubble-like structures that form in the
plasma in doughnut-shaped tokamak fusion facilities. These islands can
grow, cool the plasma and trigger disruptions -- the sudden release of
energy stored in the plasma -- that can halt fusion reactions and
seriously damage the fusion facilities that house them.
Improved island control
Research by scientists at Princeton University and at the U.S.
Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL)
points toward improved control of the troublesome magnetic islands in
ITER, the international tokamak under construction in France, and other
future fusion facilities that cannot allow large disruptions. "This
research could open the door to improved control schemes previously
deemed unobtainable," said Eduardo Rodriguez, a graduate student in the
Princeton Program in Plasma Physics and first author of a paper in
Physics of Plasmas that reports the findings.
The research follows up on previous work by Allan Reiman and Nat Fisch,
which identified a new effect called "RF [radio frequency] current
condensation" that can greatly facilitate the stabilization of magnetic
islands. The new Physics of Plasmas paper shows how to make optimal use
of the effect. Reiman is a Distinguished Research Fellow at PPPL and
Fisch is a Princeton University professor and Director of the Princeton
Program in Plasma Physics and Associate Director of Academic Affairs at
PPPL.
Fusion reactions combine light elements in the form of plasma -- the
state of matter composed of free electrons and atomic nuclei -- to
generate massive amounts of energy in the sun and stars. Scientists
throughout the world are seeking to reproduce the process on Earth for
a virtually inexhaustible supply of safe and clean power to generate
electricity for all humanity.
The new paper, based on a simplified analytical model, focuses on use
of RF waves to heat the islands and drive electric current that causes
them to shrink and disappear. When the temperature gets sufficiently
high, complicated interactions can occur that lead to the RF current
condensation effect, which concentrates the current in the center of
the island and can greatly enhance the stabilization. But as the
temperature increases, and the gradient of the temperature between the
colder edge and the hot interior of the island grows larger, the
gradient can drive instabilities that make it more difficult to
increase the temperature further.
Point-counterpoint
This point-counterpoint is an important indicator of whether the RF
waves will accomplish their stabilizing goal. "We analyze the
interaction between the current condensation and the increased
turbulence from the gradient the heating creates to determine whether
the system is stabilized or not," Rodriguez says. "We want the islands
not to grow." The new paper shows how to control the power and aiming
of the waves to make optimal use of the RF current condensation effect,
taking account of the instabilities. "Focusing on this can lead to
improved stabilization of fusion reactors," Rodriguez said.
The researchers now plan to introduce new aspects into the model to
develop a more detailed investigation. Such steps include ongoing work
to incorporate the condensation effect into computer codes that model
the behavior of launched RF waves and their true effect. The
technique would ultimately be used in designing optimal island
stabilization schemes.
__________________________________________________________________
Story Source:
Materials provided by DOE/Princeton Plasma Physics Laboratory.
Original written by John Greenwald. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. E. Rodríguez, A. H. Reiman, N. J. Fisch. RF current condensation in
the presence of turbulent enhanced transport. Physics of Plasmas,
2020; 27 (4): 042306 DOI: 10.1063/5.0001881
__________________________________________________________________
--- up 14 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:16 2020
viruses
Date:
April 28, 2020
Source:
eLife
Summary:
Synthetic antibodies constructed using bacterial superglue can
neutralize potentially lethal viruses, according to a new study.
FULL STORY
__________________________________________________________________
Synthetic antibodies constructed using bacterial superglue can
neutralise potentially lethal viruses, according to a study published
on April 21 in eLife.
The findings provide a new approach to preventing and treating
infections of emerging viruses and could also potentially be used in
therapeutics for other diseases.
Bunyaviruses are mainly carried by insects, such as mosquitoes, and can
have devastating effects on animal and human health. The World Health
Organization has included several of these viruses on the Blueprint
list of pathogens likely to cause epidemics in humans in the face of
absent or insufficient countermeasures.
"After vaccines, antiviral and antibody therapies are considered the
most effective tools to fight emerging life-threatening virus
infections," explains author Paul Wichgers Schreur, a senior scientist
of Wageningen Bioveterinary Research, The Netherlands. "Specific
antibodies called VHHs have shown great promise in neutralising a
respiratory virus of infants. We investigated if the same antibodies
could be effective against emerging bunyaviruses."
Antibodies naturally found in humans and most other animals are
composed of four 'chains' -- two heavy and two light. VHHs are the
antigen-binding domains of heavy chain-only antibodies found in
camelids and are fully functional as a single domain. This makes VHHs
smaller and able to bind to pathogens in ways that human antibodies
cannot. Furthermore, the single chain nature makes them perfect
building blocks for the construction of multifunctional complexes.
In this study, the team immunised llamas with two prototypes of
bunyaviruses, the Rift Valley fever virus (RVFV) and the Schmallenberg
virus (SBV), to generate VHHs that target an important part of the
virus' infective machinery, the glycoprotein head. They found that RVFV
and SBV VHHs recognised different regions within the glycoprotein
structure.
When they tested whether the VHHs could neutralise the virus in a test
tube, they found that single VHHs could not do the job. Combining two
different VHHs had a slightly better neutralising effect against SBV,
but this was not effective for RVFV. To address this, they used
'superglue' derived from bacteria to stick multiple VHHs together as a
single antibody complex. The resulting VHH antibody complexes
efficiently neutralised both viruses, but only if the VHHs in the
complex targeted more than one region of the virus glycoprotein head.
Studies in mice with the best performing VHH antibody complexes showed
that these complexes were able to prevent death. The number of viruses
in the blood of the treated mice was also substantially reduced
compared with the untreated animals.
To work optimally in humans, antibodies need to have all the effector
functions of natural human antibodies. To this end, the team
constructed llama-human chimeric antibodies. Administering a promising
chimeric antibody to mice before infection prevented lethal disease in
80% of the animals, and treating them with the antibody after infection
prevented mortality in 60%.
"We've harnessed the beneficial characteristics of VHHs in combination
with bacterial superglues to develop highly potent virus neutralising
complexes," concludes senior author Jeroen Kortekaas, Senior Scientist
at Wageningen Bioveterinary Research, and Professor of the Laboratory
of Virology, Wageningen University, The Netherlands. "Our approach
could aid the development of therapeutics for bunyaviruses and other
viral infections, as well as diseases including cancer."
__________________________________________________________________
Story Source:
Materials provided by eLife. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. Paul J Wichgers Schreur, Sandra van de Water, Michiel Harmsen,
Erick Bermúdez-Méndez, Dubravka Drabek, Frank Grosveld, Kerstin
Wernike, Martin Beer, Andrea Aebischer, Olalekan Daramola, Sara
Rodriguez Conde, Karen Brennan, Dorota Kozub, Maiken Søndergaard
Kristiansen, Kieran K Mistry, Ziyan Deng, Jan Hellert, Pablo
Guardado-Calvo, Félix A Rey, Lucien van Keulen, Jeroen Kortekaas.
Multimeric single-domain antibody complexes protect against
bunyavirus infections. eLife, 2020; 9 DOI: 10.7554/eLife.52716
__________________________________________________________________
--- up 14 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:16 2020
Date:
April 28, 2020
Source:
U.S. Army Research Laboratory
Summary:
In recent years, lithium-ion batteries have become better at
supplying energy to soldiers in the field, but the current
generation of batteries never reaches its highest energy
potential. Researchers are extremely focused on solving this
challenge and providing the power soldiers demand.
FULL STORY
__________________________________________________________________
In recent years, lithium-ion batteries have become better at supplying
energy to Soldiers in the field, but the current generation of
batteries never reaches its highest energy potential. Army researchers
are extremely focused on solving this challenge and providing the power
Soldiers demand.
At the U.S. Army Combat Capabilities Development Command's Army
Research Laboratory, in collaboration with the University of Maryland,
scientists may have found a solution.
"We are very excited to demonstrate a new electrolyte design for
lithium ion batteries that improves anode capacity by more than five
times compared to traditional methods," said Army scientist Dr. Oleg
Borodin. "This is the next step needed to move this technology closer
to commercialization."
The team designed a self-healing, protective layer in the battery that
significantly slows down the electrolyte and silicon anode degradation
process, which could extend the lifespan of next generation lithium-ion
batteries.
Their latest battery design increased the number of possible cycles
from tens to over a hundred with little degradation. The journal Nature
Energy published their findings.
Here's how a battery works. A battery stores chemical energy and
converts it into electrical energy. Batteries have three parts: an
anode (-), a cathode (+), and the electrolyte. An anode is an electrode
through which the conventional current enters a polarized
electrical device. This contrasts with a cathode, through which current
leaves an electrical device.
The electrolyte keeps the electrons from going straight from the anode
to the cathode within the battery. In order to create better batteries,
Borodin said, you can increase the capacity of the anode and the
cathode, but the electrolyte has to be compatible between them.
Lithium-ion batteries generally use graphite anodes, which have a
capacity of about 370 milliamp hours (mAh) per gram. But anodes made
out of silicon can offer about 1,500 to 2,800 mAh per gram, or at least
four times as much capacity.
The researchers said silicon particle anodes, as opposed to traditional
graphite anodes, provide excellent alternatives, but they also degrade
much faster. Unlike graphite, silicon expands and contracts during a
battery's operation. As the silicon nanoparticles within the anode get
larger, they often crack the protective layer -- called the solid
electrolyte interphase -- that surrounds the anode.
The solid electrolyte interphase forms naturally when anode particles
make direct contact with the electrolyte. The resulting barrier
prevents further reactions from occurring and separates the anode from
the electrolyte. But when this protective layer becomes damaged, the
newly exposed anode particles will react continuously with electrolyte
until it runs out.
"Others have tried to tackle this problem by designing a protective
layer that expands when the silicon anode does," Borodin said.
"However, these methods still cause some electrolyte degradation, which
significantly shortens the lifetime of the anode and the battery."
The joint team at the University of Maryland and the Army Research
Laboratory decided to try a new approach. Instead of an elastic
barrier, the researchers designed a rigid barrier that doesn't break
apart -- even when the silicon nanoparticles expand. They developed a
lithium-ion battery with an electrolyte that forms a rigid lithium
fluoride solid electrolyte interphase, or SEI, when the electrolyte
interacts with the silicon anode particles, substantially reducing
electrolyte degradation.
"We successfully avoided the SEI damage by forming a ceramic SEI that
has a low affinity to the lithiated silicon particles, so that the
lithiated silicon can relocate at the interface during volume change
without damaging the SEI," said Prof. Chunsheng Wang, a professor of
Chemical and Biomolecular Engineering at the University of Maryland.
"The electrolyte design principle is universal for all alloy anodes and
opens a new opportunity to develop high-energy batteries."
The battery design that Borodin and Wang's group conceived demonstrated
a coulombic efficiency (the fraction of charge recovered in each cycle)
of 99.9 percent, which meant that only 0.1 percent of the energy was
lost to electrolyte degradation each cycle.
This is a significant improvement over conventional designs for
lithium-ion batteries with silicon anodes, which have a 99.5-percent
efficiency. While seemingly small, Borodin said this difference
translates to a cycle life more than five times longer.
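The five-fold figure follows from compounding the per-cycle loss. A
quick Python check, under the simplifying assumption that capacity
fades by the same factor every cycle:

    # Why 99.9 vs. 99.5 percent coulombic efficiency (CE) is a roughly
    # five-fold difference in cycle life, assuming (as a simplification)
    # that capacity after n cycles scales as CE**n.
    import math

    def cycles_until(retention, ce):
        # Number of cycles until the given fraction of capacity remains.
        return math.log(retention) / math.log(ce)

    for ce in (0.995, 0.999):
        print(f"CE = {ce:.1%}: ~{cycles_until(0.80, ce):.0f} cycles to 80%")
    # ~45 cycles at 99.5% vs ~223 cycles at 99.9%: about five times longer.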
"Experiments performed by Dr. Chunsheng Wang's group at the University
of Maryland showed that this new method was successful," Borodin said.
"However, it was successful not only for silicon but also for aluminum
and bismuth anodes, which shows the universality of the principle."
The new design also came with several other benefits. The battery's
higher capacity allowed the electrode to be markedly thinner, which
made the charging time much faster and the battery itself much lighter. In
addition, the researchers found that the battery could handle colder
temperatures better than normal batteries.
"For regular batteries, colder temperatures slow diffusion and may even
freeze the liquids inside the batteries," Borodin said. "But because
our design has a much higher capacity, ions have to diffuse shorter
distances, resulting in significantly improved low-temperature
operation, which is important for warfighters operating in
cold climates."
The team thanked the ARL Enterprise for Multiscale Modeling of
Materials program for its support during the research effort so far.
According to Borodin, the next step in the research is to develop a
larger cell with a higher voltage using this design. In light of this
goal, the team is currently looking into advances on the cathode
side of the lithium-ion battery.
__________________________________________________________________
Story Source:
Materials provided by U.S. Army Research Laboratory. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Ji Chen, Xiulin Fan, Qin Li, Hongbin Yang, M. Reza Khoshi, Yaobin
Xu, Sooyeon Hwang, Long Chen, Xiao Ji, Chongyin Yang, Huixin He,
Chongmin Wang, Eric Garfunkel, Dong Su, Oleg Borodin, Chunsheng
Wang. Electrolyte design for LiF-rich solid–electrolyte interfaces
to enable high-performance microsized alloy anodes for batteries.
Nature Energy, 2020; DOI: 10.1038/s41560-020-0601-1
__________________________________________________________________
--- up 14 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:16 2020
Date:
April 28, 2020
Source:
University of Vienna
Summary:
Software LipidCreator enables researchers to characterize 60
lipid classes in cells with mass spectrometry.
FULL STORY
__________________________________________________________________
Researchers increasingly aim at utilising the manifold functions of
lipids in our bodies, e.g. as blood fats or in blood coagulation, to
better understand and predict diseases. An international team around
Robert Ahrends at the Faculty of Chemistry of the University of Vienna
has now presented a groundbreaking tool for efficient lipid analysis in
the journal Nature Communications. Their software LipidCreator greatly
accelerates the analysis of specific lipid groups and lipid signal
molecules, and allows both their qualitative and quantitative
characterisation with mass spectrometry. The scientists applied the new
method successfully in an analysis of blood components.
Lipids have a great potential as biomarkers. Life as we know it is
wrapped in lipids, fats and waxes: they form cells and organelles,
convey information, protect our organism from the harsh environmental
conditions, and serve as energy building blocks. "It is not long ago
that we gained an idea about the diversity of lipid functions," says
biochemist Robert Ahrends, who started his tenure track professorship
in lipidomics -- i.e. the analysis of the total lipids of a cell,
tissue or organism -- at the University of Vienna at the beginning of
this year.
The innovative software LipidCreator can take lipidomics to the next
level. "The software enables scientists to come up with new targeted
lipidomics assays, to make them easily available to other labs, and to
retrieve and include comprehensive knowledge and data from other
studies, as the software also serves as an online database for
lipidomic research," study author Robert Ahrends from the Department of
Analytical Chemistry explains.
Based on the software, scientists now can quantify about 60 lipid
classes and their lipid signalling molecules in much bigger studies
than previously; they can quickly set up workflows for the analysis of
new target molecules, and easily check and validate the results.
Salvaging the treasure of lipids
Lipids are chemically very diverse. They have a complex structure and
consist of combinations of different building blocks, such as different
sugars, fatty acyl groups, and different types of bonds. Mass
spectrometry (MS) has become both faster and more sensitive in recent
years. Specialised developments of MS today enable the identification
of up to 500 lipids; the chemical components and structures of the
lipids can be decoded via the masses of the individual lipid fragments.
Despite the rapid growth of lipidomics,
comprehensive software solutions for targeted mass spectrometric
analyses of specific lipid groups have been lacking until now.
Clinical interest
Ahrends and his team have already applied their software, proving its
high potential for clinical applications: Lipids in different forms are
important sources of energy, which are transported within the blood. As
important factors in signal transmission between cells, they are also
involved in the activation of blood platelets (thrombocytes), which in
turn are important for blood clotting. Based on LipidCreator, the
scientists successfully characterised lipids in blood plasma and
analysed the role of lipids in platelet activation. According to the
scientists, data gained from these kinds of surveys might even help to
identify relevant factors for blood coagulation and for the development
of thrombosis.
__________________________________________________________________
Story Source:
Materials provided by University of Vienna. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Bing Peng, Dominik Kopczynski, Brian S. Pratt, Christer S. Ejsing,
Bo Burla, Martin Hermansson, Peter Imre Benke, Sock Hwee Tan, Mark
Y. Chan, Federico Torta, Dominik Schwudke, Sven W. Meckelmann,
Cristina Coman, Oliver J. Schmitz, Brendan MacLean, Mailin-Christin
Manke, Oliver Borst, Markus R. Wenk, Nils Hoffmann, Robert Ahrends.
LipidCreator workbench to probe the lipidomic landscape. Nature
Communications, 2020; 11 (1) DOI: 10.1038/s41467-020-15960-z
__________________________________________________________________
--- up 14 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:16 2020
Efficient painting method reaches nooks and crannies
Date:
April 28, 2020
Source:
Rutgers University
Summary:
Engineers have created a highly effective way to paint complex
3D-printed objects, such as lightweight frames for aircraft and
biomedical stents, that could save manufacturers time and money
and provide new opportunities to create 'smart skins' for
printed parts.
FULL STORY
__________________________________________________________________
Rutgers engineers have created a highly effective way to paint complex
3D-printed objects, such as lightweight frames for aircraft and
biomedical stents, that could save manufacturers time and money and
provide new opportunities to create "smart skins" for printed parts.
The findings are published in the journal ACS Applied Materials &
Interfaces.
Conventional sprays and brushes can't reach all nooks and crannies in
complex 3D-printed objects, but the new technique coats any exposed
surface and fosters rapid prototyping.
"Our technique is a more efficient way to coat not only conventional
objects, but even hydrogel soft robots, and our coatings are robust
enough to survive complete immersion in water and repeated swelling and
de-swelling by humidity," said senior author Jonathan P. Singer, an
assistant professor in the Department of Mechanical and Aerospace
Engineering in the School of Engineering at Rutgers University-New
Brunswick.
The engineers discovered new capabilities of a technology that creates
a fine spray of droplets by applying a voltage to fluid flowing through
a nozzle. This technique (electrospray deposition) has been used mainly
for analytical chemistry. But in recent decades, it has also been used
in lab-scale demonstrations of coatings that deliver vaccines,
light-absorbing layers of solar cells and fluorescent quantum dots
(tiny particles) for LED displays.
Using their approach, Rutgers engineers are building an accessory for
3D printers that will, for the first time, allow automated coating of
3D-printed parts with functional, protective or aesthetic layers of
paint. Their technique features much thinner and better-targeted paint
application, using significantly fewer materials than traditional
methods. That means engineers can use cutting-edge materials, such as
nanoparticles and bioactive ingredients, that would otherwise be too
costly in paints, according to Singer.
Next steps include creating surfaces that can change their properties
or trigger chemical reactions to create paints that can sense their
environment and report stimuli to onboard electronics. The engineers
hope to commercialize their technique and create a new paradigm of
rapid coating immediately after printing that complements 3D printing.
__________________________________________________________________
Story Source:
Materials provided by Rutgers University. Note: Content may be
edited for style and length.
__________________________________________________________________
Journal Reference:
1. Dylan A. Kovacevich, Lin Lei, Daehoon Han, Christianna Kuznetsova,
Steven E. Kooi, Howon Lee, Jonathan P. Singer. Self-Limiting
Electrospray Deposition for the Surface Modification of Additively
Manufactured Parts. ACS Applied Materials & Interfaces, 2020; DOI:
10.1021/acsami.9b23544
__________________________________________________________________
--- up 14 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Tue Apr 28 21:30:16 2020
Date:
April 28, 2020
Source:
Pohang University of Science & Technology (POSTECH)
Summary:
Medical researchers have developed wireless smart contact lenses
for diagnosis and treatment of diabetes.
FULL STORY
__________________________________________________________________
Diabetes is called an incurable disease because once it develops, it
does not disappear regardless of treatment with modern medicine. Having
diabetes means a life-long obligation of insulin shots and monitoring
of blood glucose levels. But what if you could control the secretion of
insulin just by wearing contact lenses?
Recently, a research team at POSTECH developed wirelessly driven 'smart
contact lens' technology that can detect diabetes and further treat
diabetic retinopathy just by wearing them.
Professor Sei Kwang Hahn and graduate students Do Hee Keum and
Su-Kyoung Kim of POSTECH's Department of Materials Science and
Engineering, and Professor Jae-Yoon Sim and graduate student Jahyun Koo
of Department of Electronics and Electrical Engineering have developed
a wireless powered smart contact lens that can diagnose and treat
diabetes by controlling drug delivery with electrical signals. The
findings were recently published in Science Advances. The smart contact
lenses developed by the research team are made of biocompatible
polymers and integrate biosensors and drug delivery and data
communication systems.
The research team verified that the glucose levels in the tears of
diabetic rabbits, analyzed by the smart contact lenses, matched their
blood glucose levels as measured by a conventional glucose sensor that
utilizes drawn blood. The team additionally confirmed that the drugs
encased in smart contact lenses could treat diabetic retinopathy.
Recently, by applying the platform technology of these smart contact
lenses, research has been conducted to expand the scope of
electroceuticals that use electrical stimulation to treat brain
disorders such as Alzheimer's and Parkinson's diseases, and mental
illnesses including depression.
The research team expects this development of self-controlled
therapeutic smart contact lenses with real-time biometric analysis to
be quickly applied to wearable healthcare industries.
Professor Sei Kwang Hahn, who led the research, stated, "Despite the
full-fledged research and development of wearable devices from global
companies, the commercialization of wireless-powered medical devices
for diagnosis and treatment of diabetes and retinopathy is
insufficient." He added, "We expect that this research will greatly
contribute to the advancement of related industries by being the first
in developing wireless-powered smart contact lenses equipped with drug
delivery system for diagnosis and treatment of diabetes, and treatment
of retinopathy."
This research was financially supported by Samsung Science and
Technology Foundation, the Global Frontier Project (Director: Professor
Kilwon Cho), the Mid-career Researcher Program from the National
Research Foundation of Korea, and World Class 300 Project of the
Ministry of SMEs and Startups. The research findings on smart contact
lens-based technologies were introduced in the January issue of Nature
Reviews Materials, which drew attention from academic circles. The
research team is preparing to carry out clinical trials for the safety
and validity assessment for commercialization of smart contact lenses
in collaboration with Interojo Inc.
__________________________________________________________________
Story Source:
Materials provided by Pohang University of Science & Technology
(POSTECH). Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Geon-Hui Lee, Hanul Moon, Hyemin Kim, Gae Hwang Lee, Woosung Kwon,
Seunghyup Yoo, David Myung, Seok Hyun Yun, Zhenan Bao, Sei Kwang
Hahn. Multifunctional materials for implantable and wearable
photonic healthcare devices. Nature Reviews Materials, 2020; 5 (2):
149 DOI: 10.1038/s41578-019-0167-3
__________________________________________________________________
--- up 14 weeks, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed Apr 29 21:30:10 2020
Astronomer discovers massive extrasolar planet with Maunakea telescope
Date:
April 29, 2020
Source:
W. M. Keck Observatory
Summary:
A team of astronomers has discovered a planet three times the
mass of Jupiter in the Kepler-88 system. The team found that
Kepler-88 d is the most massive known planet in this system -
not Kepler-88 c as previously thought.
FULL STORY
__________________________________________________________________
Our solar system has a king. The planet Jupiter, named for the most
powerful god in the Roman pantheon, has bossed around the other planets
through its gravitational influence. With more than three times the
mass of Saturn, and roughly 300 times that of Earth, Jupiter's
slightest movement is felt by all
the other planets. Jupiter is thought to be responsible for the small
size of Mars, the presence of the asteroid belt, and a cascade of
comets that delivered water to young Earth.
Do other planetary systems have gravitational gods like Jupiter?
A team of astronomers led by the University of Hawaii Institute for
Astronomy (UH IfA) has discovered a planet three times the mass of
Jupiter in a distant planetary system.
The discovery is based on six years of data taken at W. M. Keck
Observatory on Maunakea in Hawaii. Using the High-Resolution Echelle
Spectrometer (HIRES) instrument on the 10-meter Keck I telescope, the
team confirmed that the planet, named Kepler-88 d, orbits its star
every four years, and its orbit is not circular, but elliptical. At
three times the mass of Jupiter, Kepler-88 d is the most massive planet
in this system.
The system, Kepler-88, was already famous among astronomers for two
planets that orbit much closer to the star, Kepler-88 b and c (planets
are typically named alphabetically in the order of their discovery).
Those two planets have a bizarre and striking dynamic called mean
motion resonance. The sub-Neptune sized planet b orbits the star in
just 11 days, which is almost exactly half the 22-day orbital period of
planet c, a Jupiter-mass planet. The clockwork-like nature of their
orbits is energetically efficient, like a parent pushing a child on a
swing. Every two laps planet b makes around the star, it gets pumped.
The outer planet, Kepler-88 c, is twenty times more massive than planet
b, and so its force results in dramatic changes in the orbital timing
of the inner planet.
Astronomers observed these changes, called transit timing variations,
with the NASA Kepler space telescope, which detected the precise times
when Kepler-88 b crossed (or transited) between the star and the
telescope. Although transit timing variations (TTVs for short) have
been detected in a few dozen planetary systems, Kepler-88 b has some of
the largest timing variations. With transits arriving up to half a day
early or late, the system is known as "the King of TTVs."
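As a rough back-of-envelope illustration (ours, not the paper's), the
near 2:1 period ratio also sets the slow timescale on which these TTVs
should oscillate. The sketch below uses approximate literature periods
for planets b and c; all numbers are illustrative only.

  # Toy calculation: how close Kepler-88 b and c sit to an exact 2:1
  # resonance, and the "super-period" on which TTVs oscillate
  # (following Lithwick, Xie & Wu 2012). Periods are approximate.
  P_b = 10.95  # days, inner sub-Neptune
  P_c = 22.26  # days, outer Jupiter-mass planet

  print(f"Period ratio c/b = {P_c / P_b:.3f} (exact resonance: 2.000)")

  # Near a 2:1 pair, TTVs oscillate on P_ttv = 1 / |2/P_c - 1/P_b|.
  P_ttv = 1.0 / abs(2.0 / P_c - 1.0 / P_b)
  print(f"Expected TTV super-period ~ {P_ttv:.0f} days")

The small offset from an exact 2.000 ratio is what keeps the
super-period finite; the closer a pair sits to exact resonance, the
slower and larger the timing swings become.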
The newly discovered planet adds another dimension to astronomers'
understanding of the system.
"At three times the mass of Jupiter, Kepler-88 d has likely been even
more influential in the history of the Kepler-88 system than the
so-called King, Kepler-88 c, which is only one Jupiter mass," says Dr.
Lauren Weiss, Beatrice Watson Parrent Postdoctoral Fellow at UH IfA and
lead author on the discovery team. "So maybe Kepler-88 d is the new
supreme monarch of this planetary empire -- the empress."
Perhaps these extrasolar sovereign leaders have had as much influence
as Jupiter did for our solar system. Such planets might have promoted
the development of rocky planets and directed water-bearing comets
toward them. Dr. Weiss and colleagues are searching for similar royal
planets in other planetary systems with small planets.
__________________________________________________________________
Story Source:
Materials provided by W. M. Keck Observatory. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Lauren M. Weiss, Daniel C. Fabrycky, Eric Agol, Sean M. Mills,
Andrew W. Howard, Howard Isaacson, Erik A. Petigura, Benjamin
Fulton, Lea Hirsch, Evan Sinukoff. The Discovery of the
Long-Period, Eccentric Planet Kepler-88 d and System
Characterization with Radial Velocities and Photodynamical
Analysis. The Astronomical Journal, 2020; 159 (5): 242 DOI:
10.3847/1538-3881/ab88ca
__________________________________________________________________
--- up 14 weeks, 1 day, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed Apr 29 21:30:12 2020
It combines human knowledge and expertise with the speed and efficiency of 'smart' computer algorithms
Date:
April 29, 2020
Source:
DOE/SLAC National Accelerator Laboratory
Summary:
Researchers have developed a new tool, using machine learning,
that may make part of the accelerator tuning process up to five
times faster than previous methods.
FULL STORY
__________________________________________________________________
Each year, researchers from around the world visit the Department of
Energy's SLAC National Accelerator Laboratory to conduct hundreds of
experiments in chemistry, materials science, biology and energy
research at the Linac Coherent Light Source (LCLS) X-ray laser. LCLS
creates ultrabright X-rays from high-energy beams of electrons produced
in a giant linear particle accelerator.
Experiments at LCLS run around the clock, in two 12-hour shifts per
day. At the start of each shift, operators must tweak the accelerator's
performance to prepare the X-ray beam for the next experiment.
Sometimes, additional tweaking is needed during a shift as well. In the
past, operators have spent hundreds of hours each year on this task,
called accelerator tuning.
Now, SLAC researchers have developed a new tool, using machine
learning, that may make part of the tuning process five times faster
than previous methods. They described the method in Physical
Review Letters on March 25.
Tuning the beam
Producing LCLS's powerful X-ray beam starts with the preparation of a
high-quality electron beam. Some of the electrons' energy then gets
converted into X-ray light inside special magnets. The properties of
the electron beam, which needs to be dense and tightly focused, are a
critical factor in how good the X-ray beam will be.
"Even a small difference in the density of the electron beam can have a
huge difference in the amount of X-rays you get out at the end," says
Daniel Ratner, head of SLAC's machine learning initiative and a member
of the team that developed the new technique.
The accelerator uses a series of 24 special magnets, called quadrupole
magnets, to focus the electron beam similarly to how glass lenses focus
light. Traditionally, human operators carefully turned knobs to adjust
individual magnets between shifts to make sure the accelerator was
producing the X-ray beam needed for a particular experiment. This
process took up a lot of the operators' time -- time they could spend
on other important tasks that improve the beam for experiments.
A few years ago, LCLS operators adopted a computer algorithm that
automated and sped up this magnet tuning. However, it came with its own
disadvantages. It aimed at improving the X-ray beam by making random
adjustments to the magnets' strengths. But unlike human operators, this
algorithm had no prior knowledge of the accelerator's structure and
couldn't make educated guesses in its tuning that might have ultimately
led to even better results.
This is why SLAC researchers decided to develop a new algorithm that
combines machine learning -- "smart" computer programs that learn how
to get better over time -- with knowledge about the physics of the
accelerator.
"The machine learning approach is trying to tie this all together to
give operators better tools so that they can focus on other important
problems," says Joseph Duris, a SLAC scientist who led the new study.
A better beam, faster
The new approach uses a technique called a Gaussian process, which
predicts the effect a particular accelerator adjustment has on the
quality of the X-ray beam. It also generates uncertainties for its
predictions. The algorithm then decides which adjustments to try for
the biggest improvements.
For example, it may decide to try a dramatic adjustment whose outcome
is very uncertain but could lead to a big payoff. That means this new,
adventurous algorithm has a better chance than the previous algorithm
of making the tweaks needed to create the best possible X-ray beam.
The SLAC researchers also used data from previous LCLS operations to
teach the algorithm which magnet strengths have typically led to
brighter X-rays, giving the algorithm a way of making educated guesses
about the adjustments it should try. This equips the algorithm with
knowledge and expertise that human operators naturally have, and that
the previous algorithm lacked.
"We can rely on that physics knowledge, that institutional knowledge,
in order to improve the predictions," Duris says.
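In code, the core idea can be sketched roughly as follows. This is an
illustrative stand-in rather than SLAC's actual software: a Gaussian
process surrogate (here via scikit-learn) models a made-up pulse-energy
response to a single magnet setting, and an upper-confidence-bound rule
trades off trying uncertain settings against exploiting known good
ones. The pulse_energy function and every number in it are invented for
the sketch.

  # Minimal Bayesian-optimization loop over one hypothetical quadrupole
  # setting. The GP predicts pulse energy with uncertainty; the
  # acquisition rule (mean + 2*std) picks the next setting to try.
  import numpy as np
  from sklearn.gaussian_process import GaussianProcessRegressor
  from sklearn.gaussian_process.kernels import RBF

  rng = np.random.default_rng(0)

  def pulse_energy(k):  # invented stand-in for the beamline response
      return (np.exp(-0.5 * ((k - 1.3) / 0.4) ** 2)
              + 0.02 * rng.standard_normal())

  candidates = np.linspace(0.0, 3.0, 301).reshape(-1, 1)
  X = [[0.2], [2.8]]                      # two initial probe settings
  y = [pulse_energy(x[0]) for x in X]

  gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-3)
  for _ in range(15):
      gp.fit(np.array(X), np.array(y))
      mean, std = gp.predict(candidates, return_std=True)
      k_next = candidates[np.argmax(mean + 2.0 * std)]  # UCB rule
      X.append(list(k_next))
      y.append(pulse_energy(k_next[0]))

  best = int(np.argmax(y))
  print(f"best setting ~ {X[best][0]:.2f}, energy {y[best]:.3f}")

The real system optimizes many coupled magnets at once and, as
described above, starts from priors fit to archived tuning data rather
than from scratch.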
Insights into the magnets' relationships to each other also improved
the technique. The quadrupole magnets work in pairs, and to increase
their focusing power, the strength of one magnet in a pair must be
increased while the other's is decreased.
With the new process, tuning the quadrupole magnets has become about
three to five times faster, the researchers estimate. It also tends to
produce higher-intensity beams than the previously used algorithm.
"Our ability to increase our tuning efficiency is really, really
critical to being able to deliver a beam faster and with better quality
to people who are coming from all over the world to run experiments,"
says Jane Shtalenkova, an accelerator operator at SLAC who worked with
Duris, Ratner and others to develop the new tool.
Beyond LCLS
The same method can be extended to tune other electron or X-ray beam
properties that scientists may want to optimize for their experiments.
For example, researchers could apply the technique to maximize the
signal they get out of their sample after it's hit by LCLS's X-ray
beam.
This flexibility also makes the new algorithm useful for other
facilities.
"The nice thing about this machine learning algorithm is that you can
do tech transfer relatively easily," says Adi Hanuka, a SLAC scientist
who has been testing the technique at three other accelerators: SPEAR3,
the accelerator ring powering SLAC's Stanford Synchrotron Radiation
Lightsource (SSRL); PEGASUS at the University of California, Los
Angeles; and the Advanced Photon Source (APS) at DOE's Argonne National
Laboratory.
"This tool now exists in several labs," Hanuka says. "Hopefully, we'll
be integrating it into even more labs soon."
__________________________________________________________________
Story Source:
Materials provided by DOE/SLAC National Accelerator Laboratory.
Original written by Erika K. Carlson. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. J. Duris, D. Kennedy, A. Hanuka, J. Shtalenkova, A. Edelen, P.
Baxevanis, A. Egger, T. Cope, M. McIntire, S. Ermon, D. Ratner.
Bayesian Optimization of a Free-Electron Laser. Physical Review
Letters, 2020; 124 (12) DOI: 10.1103/PhysRevLett.124.124801
__________________________________________________________________
--- up 14 weeks, 1 day, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed Apr 29 21:30:12 2020
Date:
April 29, 2020
Source:
Texas A&M University
Summary:
Steady hands and uninterrupted, sharp vision are critical when
performing surgery on delicate structures like the brain or
hair-thin blood vessels. While surgical cameras have improved
what surgeons see during operative procedures, the 'steady hand'
remains to be enhanced -- new surgical technologies, including
sophisticated surgeon-guided robotic hands, cannot prevent
accidental injuries when operating close to fragile tissue.
FULL STORY
__________________________________________________________________
Steady hands and uninterrupted, sharp vision are critical when
performing surgery on delicate structures like the brain or hair-thin
blood vessels. While surgical cameras have improved what surgeons see
during operative procedures, the "steady hand" remains to be enhanced
-- new surgical technologies, including sophisticated surgeon-guided
robotic hands, cannot prevent accidental injuries when operating close
to fragile tissue.
In a new study published in the January issue of the journal Scientific
Reports, researchers at Texas A&M University show that by delivering
small, yet perceptible buzzes of electrical currents to fingertips,
users can be given an accurate perception of distance to contact. This
insight enabled users to control their robotic fingers precisely enough
to gently land on fragile surfaces.
The researchers said that this technique might be an effective way to
help surgeons reduce inadvertent injuries during robot-assisted
operative procedures.
"One of the challenges with robotic fingers is ensuring that they can
be controlled precisely enough to softly land on biological tissue,"
said Hangue Park, assistant professor in the Department of Electrical
and Computer Engineering. "With our design, surgeons will be able to
get an intuitive sense of how far their robotic fingers are from
contact, information they can then use to touch fragile structures with
just the right amount of force."
Robot-assisted surgical systems, also known as telerobotic surgical
systems, are physical extensions of a surgeon. By controlling robotic
fingers with movements of their own fingers, surgeons can perform
intricate procedures remotely, thus expanding the number of patients
to whom they can provide medical attention. Also, the tiny size of the
robotic fingers means that surgeries are possible with much smaller
incisions, since surgeons need not make large cuts to accommodate
their hands in the patient's body during operations.
To move their robotic fingers precisely, surgeons rely on live
streaming of visual information from cameras fitted on telerobotic
arms. Thus, they look into monitors to match their finger movements
with those of the telerobotic fingers. In this way, they know where
their robotic fingers are in space and how close these fingers are to
each other.
However, Park noted that just visual information is not enough to guide
fine finger movements, which is critical when the fingers are in the
close vicinity of the brain or other delicate tissue.
"Surgeons can only know how far apart their actual fingers are from
each other indirectly, that is, by looking at where their robotic
fingers are relative to each other on a monitor," Park said. "This
roundabout view diminishes their sense of how far apart their actual
fingers are from each other, which then affects how they control their
robotic fingers."
To address this problem, Park and his team came up with an alternate
way to deliver distance information that is independent of visual
feedback. By passing different frequencies of electrical currents onto
fingertips via gloves fitted with stimulation probes, the researchers
were able to train users to associate the frequency of current pulses
with distance, that is, increasing current frequencies indicated the
closing distance from a test object. They then tested whether users
receiving current stimulation along with visual information about
closing distance on their monitors did better at estimating proximity
than those who received visual information alone.
Park and his team also tailored their technology according to the
user's sensitivity to electrical current frequencies. In other words,
if a user was sensitive to a wider range of current frequencies, the
distance information was delivered with smaller steps of increasing
currents to maximize the accuracy of proximity estimation.
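A minimal sketch of that mapping might look as follows. This is our
illustration of the idea rather than the study's code, and every number
in it (distance range, frequency range, step count) is an invented
placeholder; a user sensitive to a wider frequency band would simply be
given more, finer steps.

  # Map fingertip-to-target distance onto a pulse frequency: closer
  # distance -> higher frequency, quantized into n_steps levels
  # spanning the user's perceptible range [f_min_hz, f_max_hz].
  def distance_to_frequency(distance_mm, max_distance_mm=20.0,
                            f_min_hz=10.0, f_max_hz=100.0, n_steps=8):
      d = min(max(distance_mm, 0.0), max_distance_mm)
      closeness = 1.0 - d / max_distance_mm  # 0 = far ... 1 = touching
      step = round(closeness * (n_steps - 1))
      return f_min_hz + step * (f_max_hz - f_min_hz) / (n_steps - 1)

  for d in (20, 15, 10, 5, 1, 0):
      print(f"{d:2d} mm -> {distance_to_frequency(d):5.1f} Hz")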
The researchers found that users receiving electrical pulses were more
aware of the proximity to underlying surfaces and could lower their
force of contact by around 70%, performing much better than the other
group. Overall, they observed that proximity information delivered
through mild electric pulses was about three times more effective than
the visual information alone.
Park said their novel approach has the potential to significantly
increase maneuverability during surgery while minimizing risks of
unintended tissue damage. He also said their technique would add little
to the existing mental load of surgeons during operative procedures.
"Our goal was to come up with a solution that would improve the
accuracy in proximity estimation without increasing the burden of
active thinking needed for this task," he said. "When our technique is
ready for use in surgical settings, physicians will be able to
intuitively know how far their robotic fingers are from underlying
structures, which means that they can keep their active focus on
optimizing the surgical outcome of their patients."
__________________________________________________________________
Story Source:
Materials provided by Texas A&M University. Original written by
Vandana Suresh. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Ziqi Zhao, Minku Yeo, Stefan Manoharan, Seok Chang Ryu, Hangue
Park. Electrically-Evoked Proximity Sensation Can Enhance Fine
Finger Control in Telerobotic Pinch. Scientific Reports, 2020; 10
(1) DOI: 10.1038/s41598-019-56985-9
__________________________________________________________________
--- up 14 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed Apr 29 21:30:12 2020
Date:
April 29, 2020
Source:
Massachusetts General Hospital
Summary:
Medical images for a wide range of diseases can be more easily
viewed, compared, and analyzed using a breakthrough open source
web-based imaging platform developed by Massachusetts General
Hospital (MGH) and collaborating researchers.
FULL STORY
__________________________________________________________________
Medical images for a wide range of diseases, including COVID-19, can
now be more easily viewed, compared, and analyzed using a breakthrough
web-based imaging platform developed by Massachusetts General Hospital
(MGH) and collaborating researchers.
The Open Health Imaging Foundation (OHIF) web viewer was originally
developed with grant support from the National Cancer Institute's
Informatics Technology for Cancer Research (NCI-ITCR) program for use
in cancer imaging research and clinical trials, where it is already
adopted by several leaders in the field. However, the OHIF Viewer and
its underlying Cornerstone libraries and tools can also be used for any
disease and are increasingly being used for COVID-19 projects.
"This viewer provides performance that you typically only get from an
installed application [software], but we do it through a web browser,"
says Gordon J. Harris, PhD, the corresponding author of a paper about
this viewer in Journal of Clinical Oncology: Clinical Cancer
Informatics. "This is a free, open-source extendable platform that is
already being used by projects worldwide."
Dr. Harris is director of the 3D Imaging Service at MGH and a professor
of radiology at Harvard Medical School. OHIF was founded in 2015 and is
led by a team including Dr. Harris and co-author collaborators Chris
Hafey, Rob Lewis, Steve Pieper, Trinity Urban, and Erik Ziegler.
The already popular free program is interoperable, commercial grade,
user-friendly and requires less technical support than a typical
commercial product. The software is "zero footprint," meaning it can be
run in a web browser from any computer without any software being
downloaded. It can be launched from a web server on a local computer,
or in the cloud. A user can also access it from multiple locations.
In addition, researchers can freely download, modify, and contribute
to the source code for the program (http://www.ohif.org;
http://www.cornerstonejs.org). Overall, the platform has been
downloaded more than 8,500 times, and has been translated into several
languages.
Three examples of projects using the OHIF Viewer and/or its underlying
Cornerstone libraries for COVID-19 imaging applications are:
* From Australia, the DetectED-X CovED virtual clinical environment
platform providing education on COVID-19 appearances on CT scans to
radiologists worldwide;
* From South Korea, the VUNO Med LungQuant and VUNO Med Chest X-ray
artificial intelligence (AI) programs for diagnosis of COVID-19;
* From Germany and Brazil, the Nextcloud DICOM Viewer -- an open
source, secure, fast, cloud-based, and simple web-based medical
image viewer being used to diagnose COVID-19 from sites across
Brazil, where it allows secure and fast diagnosis.
All of these applications are being provided for free to help support
efforts to address this worldwide pandemic.
Meanwhile, the OHIF Viewer has become a mainstay for an elite set of
cancer centers, through the Precision Imaging Metrics program developed
at MGH and the Dana-Farber/Harvard Cancer Center. Users of this program
perform over 25,000 oncology imaging assessments per year for over
1,000 active clinical trials with Precision Imaging Metrics. The
NCI-designated Cancer Centers that are members and use this platform
for clinical trials imaging informatics include:
* Dana-Farber/Harvard Cancer Center
* Yale Cancer Center
* Fred Hutchinson Cancer Research Center at University of Washington
* Huntsman Cancer Institute at University of Utah
* Winship Cancer Institute at Emory University
* Massey Cancer Center at Virginia Commonwealth University
* Medical College of Wisconsin
* Karmanos Cancer Center at Wayne State University
* Nationwide Children's Hospital (launching 2020)
"Many academic and industry projects are also using the OHIF platform
and associated Cornerstone tools for developing novel web-based imaging
applications, and machine learning companies are now also tapping into
it. We only hear about a fraction of the companies that are using it
since it is free for anyone to download and customize. Hundreds of
software developers around the world have adopted our platform and we
welcome contributions from the user community," says Harris.
__________________________________________________________________
Story Source:
Materials provided by Massachusetts General Hospital. Original
written by Brian Burns. Note: Content may be edited for style and
length.
__________________________________________________________________
Journal Reference:
1. Erik Ziegler, Trinity Urban, Danny Brown, James Petts, Steve D.
Pieper, Rob Lewis, Chris Hafey, Gordon J. Harris. Open Health
Imaging Foundation Viewer: An Extensible Open-Source Framework for
Building Web-Based Imaging Applications to Support Cancer Research.
JCO Clinical Cancer Informatics, 2020; (4): 336 DOI:
10.1200/CCI.19.00131
__________________________________________________________________
--- up 14 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed Apr 29 21:30:12 2020
Date:
April 29, 2020
Source:
The University of Hong Kong
Summary:
Researchers have developed a new method to accurately track the
spread of COVID-19 using population flow data, and have established
a new risk assessment model to identify high-risk locales of
COVID-19 at an early stage. The model serves as a valuable toolkit
for public health experts and policy makers implementing infectious
disease control during new outbreaks.
FULL STORY
__________________________________________________________________
An international research team led by the University of Hong Kong (HKU)
has developed a new method to accurately track the spread of COVID-19
using population flow data and has established a new risk assessment
model to identify high-risk locales of COVID-19 at an early stage,
providing a valuable toolkit for public health experts and policy
makers implementing infectious disease control during new outbreaks.
The study findings have been published in the journal Nature today
(April 29).
Dr. Jayson Jia, Associate Professor of Marketing at the Faculty of
Business and Economics of HKU and lead author of the study, and his
co-authors used nation-wide data provided by a major national carrier
in China to track population movement out of Wuhan between 1 January
and 24 January 2020, a period covering the annual Chunyun mass
migration before the Chinese Lunar New Year to a lockdown of the city
to contain the virus. The movement of over 11 million people travelling
through Wuhan to 296 prefectures in 31 provinces and regions in China
was tracked.
Differing from usual epidemiological models that rely on historical
data or assumptions, the team used real-time data about actual
movements focusing on aggregate population flow rather than individual
tracking. The data included any mobile phone user who had spent at
least two hours in Wuhan during the study period; locations were
detected whenever users had their phones switched on. Because only
aggregate data, and no individual-level data, was used, there was no
threat to consumer privacy.
Combining the population flow data with the number and location of
COVID-19 confirmed cases up to 19 February 2020 in China, Dr Jia's team
showed that the relative quantity of human movement from the disease
epicentre, in this case, Wuhan, directly predicted the relative
frequency and geographic distribution of the number of COVID-19 cases
across China. The researchers found that their model can statistically
explain 96% of the distribution and intensity of the spread of
COVID-19 across China.
The research team then used this empirical relationship to build a new
risk detection toolkit. Leveraging on the population flow data, the
researchers created an "expected growth pattern" based on the number of
people arriving from the risk source, i.e. the disease epicentre. The
team thereby developed a new risk model by contrasting expected growth
of cases against the actual number of confirmed cases for each city in
China, the difference being the "community transmission risk."
"If there are more confirmed cases than expected ones, there is a
higher risk of community spread. If there are fewer expected cases than
reported, it means that the city's preventive measures are particularly
effective or it can indicate that further investigation by central
authorities is needed to eliminate possible risks from inaccurate
measurement," explained Dr Jia.
"What is innovative about our approach is that we use misprediction to
assess the level of community risk. Our model accurately tells us how
many cases we should expect given travel data. We contrast this against
the confirmed cases using the logic that what cannot be explained by
imported cases and primary transmissions should be community spread,"
he added.
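Schematically, the logic reads roughly as below. This is an
illustrative toy rather than the authors' published model: confirmed
cases are regressed on population inflow from the epicentre, and cities
whose observed counts sit well above the inflow-based expectation are
flagged. The inflow and case numbers are invented placeholders.

  # Toy community-transmission index: residual of observed cases
  # against an inflow-based expectation (log-log linear fit).
  import numpy as np

  inflow = np.array([120_000, 45_000, 80_000, 10_000, 60_000], float)
  cases = np.array([410, 150, 500, 30, 190], float)

  # Fit log(cases) ~ a + b*log(inflow); polyfit returns [slope, icept].
  b, a = np.polyfit(np.log(inflow), np.log(cases), 1)
  expected = np.exp(a + b * np.log(inflow))

  # Positive index: more cases than inflow predicts (possible community
  # spread). Negative: containment working, or under-detection.
  index = np.log(cases) - np.log(expected)
  for city, r in zip("ABCDE", index):
      print(f"city {city}: community-transmission index {r:+.2f}")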
The approach is advantageous because it requires no assumptions or
knowledge of how or why the virus spreads, is robust to data reporting
inaccuracies, and only requires knowledge of relative distribution of
human movement. It can be used by policy makers in any nation with
available data to make rapid and accurate risk assessments and to plan
allocation of limited resources ahead of ongoing disease outbreaks.
"Our research indicates that geographic flow of people outperforms
other measures such as population size, wealth or distance from the
risk source to indicate the gravity of an outbreak." said Dr Jia.
Dr Jia is currently exploring with fellow researchers the feasibility
of applying this toolkit to other countries, and extending it to
situations where there are multiple COVID-19 epicentres. The team is
working with other national telecom carriers and seeking additional
data partners.
The study's co-authors are Jianmin Jia, Presidential Chair Professor at
the Chinese University of Hong Kong, Shenzhen (corresponding author);
Nicholas A. Christakis, Sterling Professor of Social and Natural
Science at Yale; Xin Lu, the National University of Defense Technology
in Changsha, China, and the Karolinska Institutet in Stockholm, Sweden;
Yun Yuan, Southwest Jiaotong University; Ge Xu, Hunan University of
Technology and Business.
__________________________________________________________________
Story Source:
Materials provided by The University of Hong Kong. Note:
Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Jayson S. Jia, Xin Lu, Yun Yuan, Ge Xu, Jianmin Jia, Nicholas A.
Christakis. Population flow drives spatio-temporal distribution of
COVID-19 in China. Nature, 2020; DOI: 10.1038/s41586-020-2284-y
__________________________________________________________________
--- up 14 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed Apr 29 21:30:18 2020
Date:
April 29, 2020
Source:
Cell Press
Summary:
In a Commentary published April 29 in the journal Joule, energy
and climate policy researchers in Switzerland and Germany
provide a framework for responsibly and meaningfully integrating
policies supporting the clean energy transition into the
COVID-19 response in the weeks, months, and years to come.
FULL STORY
__________________________________________________________________
The COVID-19 pandemic emerged at a time when climate and energy
policies were experiencing greater attention and -- in some cases --
greater momentum. But the ensuing global health emergency and economic
crisis mean that the circumstances under which these climate and energy
policies were conceived have drastically changed. In a Commentary
published April 29 in the journal Joule, energy and climate policy
researchers in Switzerland and Germany provide a framework for
responsibly and meaningfully integrating policies supporting the clean
energy transition into the COVID-19 response in the weeks, months, and
years to come.
"We're writing this commentary as COVID-19 fundamentally changes the
economic environment of the clean energy transition, requiring policy
makers to take major decisions within short timeframes," says senior
author Tobias S. Schmidt of ETH Zurich. "While many blogs or comments
put forward 'shopping' lists of which policies to enact or which
technologies to support, much of the advice lacked structure."
In their Commentary, Schmidt and his colleagues argue against small
"green wins" in the short-term that could prevent meaningful change in
the long-term. "Bailouts should exclude sectors that are clearly
incompatible with the Paris Agreement, such as tar sands development,
but at the same time, bailout decisions primarily have to consider the
societal value of uninterrupted service and of safeguarding jobs,"
Schmidt says. "Instead, policymakers should consider increasing their
leverage to shape business activities for Paris Agreement-compatible
pathways in the future, for instance, by taking equity stakes or
securing a say in the future strategy of bailed-out corporations."
"The general public should understand that the short-term emissions
reductions we are experiencing due to the lockdowns will not have major
effects on climate change," Schmidt says. "To decarbonize our energy
systems and industry, we need structural change, meaning more and not
less investment."
Once the immediate crisis has passed, when many countries will have to
address a major economic downturn, the authors say that low interest
rates and massive public spending could offer important opportunities
for the clean energy transition. "It is essential that we not repeat
the mistakes of the post-financial crisis bailouts, which often led to
massive increases in CO2 emissions," says Schmidt.
Going forward, he says, "We think the COVID-19 pandemic has reminded us
that we require policies that are proof against exogenous shocks, and we
hope that future research will support policy makers in developing
shock-proof policy designs."
This work was supported by the Swiss State Secretariat for Education,
Research and Innovation (SERI) as part of the European Union's Horizon
2020 research and innovation program project INNOPATHS.
__________________________________________________________________
Story Source:
Materials provided by Cell Press. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. Bjarne Steffen, Florian Egli, Michael Pahle, Tobias S. Schmidt.
Navigating the Clean Energy Transition in the COVID-19 Crisis.
Joule, 2020; DOI: 10.1016/j.joule.2020.04.011
__________________________________________________________________
--- up 14 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed Apr 29 21:30:18 2020
Date:
April 29, 2020
Source:
American Chemical Society
Summary:
When a dead body is found, one of the first things a forensic
pathologist tries to do is estimate the time of death. There are
several ways to do this, including measuring body temperature or
observing insect activity, but these methods don't always work
for corpses found in water. Now, researchers are reporting a
mouse study showing that certain proteins in bones could be used
for this determination.
FULL STORY
__________________________________________________________________
When a dead body is found, one of the first things a forensic
pathologist tries to do is estimate the time of death. There are
several ways to do this, including measuring body temperature or
observing insect activity, but these methods don't always work for
corpses found in water. Now, researchers are reporting a mouse study in
ACS' Journal of Proteome Research showing that certain proteins in
bones could be used for this determination.
An accurate estimate of when someone died can help investigators better
understand what happened to the person and can help them identify
possible murder suspects, if foul play was involved. However,
determining the length of time a body has been underwater, or the
post-mortem submerged interval (PMSI), can be very challenging. One way
is to examine the decomposition stage of several areas of the body, but
factors like water salinity, depth, tides, temperature, presence of
bacteria and scavengers can make PMSI estimation difficult. But bones
are stronger than soft tissues, and they lie deep within the body, so
the proteins within them might be shielded from some of these effects.
So, Noemi Procopio and colleagues wondered if monitoring the levels of
certain proteins in bones could reveal the amount of time that a
mouse's corpse is underwater, and also whether different types of water
mattered.
To find out, the researchers placed fresh mouse carcasses in bottles of
tap water, saltwater, pond water or chlorinated water. After a PMSI of
1 or 3 weeks, the team collected the tibia, or lower leg bones, from
the corpses, extracted the proteins and analyzed them by mass
spectrometry. The researchers found that the time since submersion had
a greater effect on protein levels than the different types of water.
In particular, a protein called fructose-bisphosphate aldolase A
decreased in bone with increasing PMSI. In pond water, a protein called
fetuin-A was more likely to undergo a chemical modification, called
deamidation, than in the other types of water, which could help reveal
if a body was once submerged in pond water and then moved. These and
other potential biomarkers identified in the study could be useful for
PMSI estimation in different aquatic environments, the researchers say.
__________________________________________________________________
Story Source:
Materials provided by American Chemical Society. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Haruka Mizukami, Bella Hathway, Noemi Procopio. Aquatic
Decomposition of Mammalian Corpses: A Forensic Proteomic Approach.
Journal of Proteome Research, 2020; DOI:
10.1021/acs.jproteome.0c00060
__________________________________________________________________
--- up 14 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed Apr 29 21:30:18 2020
Date:
April 29, 2020
Source:
American Chemical Society
Summary:
With the coronavirus pandemic temporarily shuttering hair
salons, many clients are appreciating, and missing, the ability
of hair dye to cover up grays or touch up roots. However,
frequent coloring, whether done at a salon or at home, can
damage hair and might pose health risks from potentially
cancer-causing dye components. Now, researchers have developed a
process to dye hair with synthetic melanin under milder
conditions than traditional hair dyes.
FULL STORY
__________________________________________________________________
With the coronavirus pandemic temporarily shuttering hair salons, many
clients are appreciating, and missing, the ability of hair dye to cover
up grays or touch up roots. However, frequent coloring, whether done at
a salon or at home, can damage hair and might pose health risks from
potentially cancer-causing dye components. Now, researchers reporting
in ACS Central Science have developed a process to dye hair with
synthetic melanin under milder conditions than traditional hair dyes.
Melanin is a group of natural pigments that give hair and skin their
varied colors. With aging, melanin disappears from hair fibers, leading
to color loss and graying. Most permanent hair dyes use ammonia,
hydrogen peroxide, small-molecule dyes and other ingredients to
penetrate the cuticle of the hair and deposit coloring. Along with
being damaging to hair, these harsh substances could cause allergic
reactions or other health problems in colorists and their clients.
Recently, scientists have explored using synthetic melanin to color
human hair, but the process required relatively high concentrations of
potentially toxic heavy metals, such as copper and iron, and strong
oxidants. Claudia Battistella, Nathan Gianneschi and colleagues at
Northwestern University wanted to find a gentler, safer way to get
long-lasting, natural-looking hair color with synthetic melanin.
The researchers tested different dyeing conditions for depositing
synthetic melanin on hair, finding that they could substitute mild heat
and a small amount of ammonium hydroxide for the heavy metals and
strong oxidants used in prior methods. They could produce darker hues
by increasing the concentration of ammonium hydroxide, or red and gold
shades by adding a small amount of hydrogen peroxide. Overall, the
conditions were similar to or milder than those used for commercially
available hair dyes. And the natural-looking colors deposited on the
hair surface rather than penetrating the cuticle, making damage less
likely. The colored layer persisted for at least 18 washes.
__________________________________________________________________
Story Source:
Materials provided by American Chemical Society. Note: Content
may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Claudia Battistella, Naneki C. McCallum, Karthikeyan Gnanasekaran,
Xuhao Zhou, Valeria Caponetti, Marco Montalti, Nathan C.
Gianneschi. Mimicking Natural Human Hair Pigmentation with
Synthetic Melanin. ACS Central Science, 2020; DOI:
10.1021/acscentsci.0c00068
__________________________________________________________________
--- up 14 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Wed Apr 29 21:30:18 2020
Date:
April 29, 2020
Source:
University of Sydney
Summary:
Scientists have developed a hypersensitive nano-sensor to detect
harmful 'free' iron disorders. The test could lead to earlier,
more accurate disease diagnosis.
FULL STORY
__________________________________________________________________
Chronic iron imbalances -- having either too little or too much iron in
the blood -- can result in medical conditions ranging from anaemia and
haemochromatosis through to more severe diseases, such as cancer,
Parkinson's Disease and Alzheimer's Disease.
Haemochromatosis is one of Australia's most common hereditary diseases
and the Australian Bureau of Statistics estimates approximately 780,000
people live with anaemia.
School of Biomedical Engineering PhD candidate and Sydney Nano
Institute student ambassador, Pooria Lesani, who is undertaking his
studies under the supervision of Professor Hala Zreiqat and Dr Zufu Lu,
has developed a multipurpose nanoscale bio-probe that allows
researchers to precisely monitor iron disorders in cells, tissue, and
body fluids at concentrations as low as 1/1000th of a millimolar.
The test is more sensitive and specific than blood testing currently
used to detect iron disorders, which begin at very low, cellular level
concentrations.
Using novel carbon-based fluorescent bio-nanoprobe technology, the
test, which involves minimally invasive subcutaneous or intravenous
injections, allows for a more accurate disease diagnosis before the
onset of symptoms, potentially allowing for the early treatment and
prevention of more serious diseases.
"More than 30% of the world's population lives with an iron imbalance,
which over time can lead to certain forms of cancer, as well as
Parkinson's Disease and Alzheimer's Disease," said Mr Lesani from the
Tissue Engineering and Biomaterials Research Unit and the ARC Centre
for Innovative BioEngineering.
"Current testing methods can be complex and time consuming. To counter
this, and to enable the early detection of serious diseases, we have
developed a hyper-sensitive and cost-efficient skin testing technique
for detecting iron in the body's cells and tissue.
"Our most recent testing demonstrated a rapid detection of free iron
ions with remarkably high sensitivity. Iron could be detected at
concentrations in the parts per billion range, a detection limit ten
times lower than that of previous nano-probes.
"Our sensor is multifunctional and could be applied to deep-tissue
imaging, involving a small probe that can visualise the structure of
complex biological tissues and synthetic scaffolds."
Tested on pig skin, the nanoprobe outperformed current techniques for
deep tissue imaging, and rapidly penetrated biological tissue to depths
of 280 micrometres and remained detectable at depths of up to 3,000
micrometres -- about three millimetres -- in synthetic tissue.
The team aims to test the nanoprobe in larger animal models, as well as
investigate other ways in which it can be used to determine the
structure of complex biological tissues.
"We hope to integrate the nanoprobe into a 'lab-on-a-chip' sensing
system -- a portable, diagnostic blood testing tool which could allow
clinicians to remotely monitor their patients' health.
"Lab-on-a-chip systems are relatively simple to operate and require
only small blood volume samples from the patient to gain an accurate
insight of potential ferric ion disorders in the body, assisting early
intervention and prevention of disease," he said.
The nano-sensors can also be made from agricultural and petrochemical
waste products, allowing for low-cost, sustainable manufacturing.
__________________________________________________________________
Story Source:
Materials provided by University of Sydney. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Pooria Lesani, Gurvinder Singh, Christina Marie Viray, Yogambha
Ramaswamy, De Ming Zhu, Peter Kingshott, Zufu Lu, Hala Zreiqat.
Two-Photon Dual-Emissive Carbon Dot-Based Probe: Deep-Tissue
Imaging and Ultrasensitive Sensing of Intracellular Ferric Ions.
ACS Applied Materials & Interfaces, 2020; 12 (16): 18395 DOI:
10.1021/acsami.0c05217
__________________________________________________________________
--- up 14 weeks, 1 day, 2 hours, 34 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 30 21:30:04 2020
Date:
April 30, 2020
Source:
Brown University
Summary:
Strange spots scattered across the Moon's nearside where bedrock
is conspicuously exposed are evidence of seismic activity set in
motion 4.3 billion years ago that could be ongoing today, the
researchers say.
FULL STORY
__________________________________________________________________
Researchers have discovered a system of ridges spread across the
nearside of the Moon topped with freshly exposed boulders. The ridges
could be evidence of active lunar tectonic processes, the researchers
say, possibly the echo of a long-ago impact that nearly tore the Moon
apart.
"There's this assumption that the Moon is long dead, but we keep
finding that that's not the case," said Peter Schultz, a professor in
Brown University's Department of Earth, Environmental and Planetary
Sciences and co-author of the research, which is published in the
journal Geology. "From this paper it appears that the Moon may still be
creaking and cracking -- potentially in the present day -- and we can
see the evidence on these ridges."
Most of the Moon's surface is covered by regolith, a powdery blanket of
ground-up rock created by the constant bombardment of tiny meteorites
and other impactors. Areas free of regolith where the Moon's bedrock is
exposed are vanishingly rare. But Adomas Valantinas, a graduate student
at the University of Bern who led the research while a visiting scholar
at Brown, used data from NASA's Lunar Reconnaissance Orbiter (LRO) to
spot strange bare spots within and surrounding the lunar maria, the
large dark patches on the Moon's nearside.
"Exposed blocks on the surface have a relatively short lifetime because
the regolith buildup is happening constantly," Schultz said. "So when
we see them, there needs to be some explanation for how and why they
were exposed in certain locations."
For the study, Valantinas used the LRO's Diviner instrument, which
measures the temperature of the lunar surface. Just as concrete-covered
cities on Earth retain more heat than the countryside, exposed bedrock
and blocky surfaces on the Moon stay warmer through the lunar night
than regolith-covered surfaces. Using nighttime observations from
Diviner, Valantinas turned up more than 500 patches of exposed bedrock
on narrow ridges following a pattern across the lunar nearside maria.
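The detection idea can be illustrated with a toy sketch; this is not
the mission pipeline, and the temperature grid and threshold below are
invented. Exposed rock cools more slowly than regolith, so nighttime
pixels standing well above the local background are flagged as
candidate exposures.

  # Flag anomalously warm nighttime pixels as candidate bare-rock spots.
  import numpy as np

  rng = np.random.default_rng(1)
  night_temp = rng.normal(95.0, 2.0, size=(50, 50))  # K, regolith
  night_temp[20:23, 30:34] += 12.0                   # warm blocky ridge

  anomaly = night_temp - np.median(night_temp)       # local background
  candidates = np.argwhere(anomaly > 8.0)            # threshold in K
  print(f"{len(candidates)} candidate exposed-rock pixels")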
A few ridges topped with exposed bedrock had been seen before, Schultz
says. But those ridges were on the edges of ancient lava-filled impact
basins and could be explained by continued sagging in response to
weight caused by the lava fill. But this new study discovered that the
most active ridges are related to a mysterious system of tectonic
features (ridges and faults) on the lunar nearside, unrelated to both
lava-filled basins and other young faults that crisscross the
highlands.
"The distribution that we found here begs for a different explanation,"
Schultz said.
Valantinas and Schultz mapped out all of the exposures revealed in the
Diviner data and found an interesting correlation. In 2014, NASA's
GRAIL mission found a network of ancient cracks in the Moon's crust.
Those cracks became channels through which magma flowed to the Moon's
surface to form deep intrusions. Valantinas and Schultz showed that the
blocky ridges seemed to line up just about perfectly with the deep
intrusions revealed by GRAIL.
"It's almost a one-to-one correlation," Schultz said. "That makes us
think that what we're seeing is an ongoing process driven by things
happening in the Moon's interior."
Schultz and Valantinas suggest that the ridges above these ancient
intrusions are still heaving upward. The upward movement breaks the
surface and enables regolith to drain into cracks and voids, leaving
the blocks exposed. Because bare spots on the Moon get covered over
fairly quickly, this cracking must be quite recent, possibly even
ongoing today. They refer to what they've found as ANTS, for Active
Nearside Tectonic System.
The researchers believe that the ANTS was actually set in motion
billions of years ago by a giant impact on the Moon's farside. In
previous studies, Schultz and a co-worker proposed that this impact,
which formed the 1,500-mile South Pole-Aitken Basin, shattered the
interior on the opposite side, the nearside facing the Earth. Magma
then filled
these cracks and controlled the pattern of dikes detected in the GRAIL
mission. The blocky ridges comprising the ANTS now trace the continuing
adjustments along these ancient weaknesses.
"This looks like the ridges responded to something that happened 4.3
billion years ago," Schultz said. "Giant impacts have long lasting
effects. The Moon has a long memory. What we're seeing on the surface
today is testimony to its long memory and secrets it still holds."
__________________________________________________________________
Story Source:
Materials provided by Brown University. Note: Content may be
edited for style and length.
__________________________________________________________________
Journal Reference:
1. P.H. Schultz, A. Valantinas. The origin of neotectonics on the
lunar nearside. Geology, 2020; DOI: 10.1130/G47202.1
__________________________________________________________________
--- up 14 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 30 21:30:04 2020
Date:
April 30, 2020
Source:
Cornell University
Summary:
To help future scientists make sense of what their telescopes
are showing them, astronomers have developed a spectral field
guide for rocky worlds orbiting white dwarf stars.
FULL STORY
__________________________________________________________________
The next generation of powerful Earth- and space-based telescopes will
be able to hunt distant solar systems for evidence of life on
Earth-like exoplanets -- particularly those that chaperone burned-out
stars known as white dwarfs.
The chemical properties of those far-off worlds could indicate that
life exists there. To help future scientists make sense of what their
telescopes are showing them, Cornell University astronomers have
developed a spectral field guide for these rocky worlds.
"We show what the spectral fingerprints could be and what forthcoming
space-based and large terrestrial telescopes can look out for," said
Thea Kozakis, doctoral candidate in astronomy, who conducts her
research at Cornell's Carl Sagan Institute. Kozakis is lead author of
"High-resolution Spectra and Biosignatures of Earth-like Planets
Transiting White Dwarfs," published in Astrophysical Journal Letters.
In just a few years, astronomers -- using tools such as the Extremely
Large Telescope, currently under construction in northern Chile's
Atacama Desert, and the James Webb Space Telescope, scheduled to launch
in 2021 -- will be able to search for life on exoplanets.
"Rocky planets around white dwarfs are intriguing candidates to
characterize because their hosts are not much bigger than Earth-size
planets," said Lisa Kaltenegger, associate professor of astronomy in
the College of Arts and Sciences and director of the Carl Sagan
Institute.
The trick is to catch an exoplanet's quick crossing in front of a white
dwarf, a small, dense star that has exhausted its energy.
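A quick back-of-envelope calculation (ours, not the paper's) shows why
such transits are so appealing: transit depth scales as the squared
ratio of planet radius to star radius, and a white dwarf is roughly
Earth-sized, an assumption used below.

  # Compare the transit depth of an Earth-size planet across the Sun
  # versus across a (roughly Earth-sized) white dwarf.
  R_earth = 6.371e6        # m
  R_sun = 6.957e8          # m
  R_wd = 1.0 * R_earth     # rough white dwarf radius (assumption)

  depth_sun = (R_earth / R_sun) ** 2
  depth_wd = min((R_earth / R_wd) ** 2, 1.0)  # cap: full occultation
  print(f"Earth transiting the Sun: depth ~ {depth_sun:.1e}")
  print(f"Earth transiting a white dwarf: depth ~ {depth_wd:.0%}")

Against a star not much bigger than the planet itself, the dimming is
enormous, which is what makes these quick crossings worth chasing.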
"We are hoping for and looking for that kind of transit," Kozakis said.
"If we observe a transit of that kind of planet, scientists can find
out what is in its atmosphere, refer back to this paper, match it to
spectral fingerprints and look for signs of life. Publishing this kind
of guide allows observers to know what to look for."
Kozakis, Kaltenegger and Zifan Lin assembled the spectral models for
different atmospheres at different temperatures to create a template
for possible biosignatures.
Chasing down these planets in the habitable zone of white dwarf systems
is challenging, the researchers said.
"We wanted to know if light from a white dwarf -- a long-dead star --
would allow us to spot life in a planet's atmosphere if it were there,"
Kaltenegger said.
This paper indicates that astronomers should be able to see spectral
biosignatures -- such as methane in combination with ozone or nitrous
oxide -- "if those signs of life are present," said Kaltenegger, who
said this research expands scientific databases for finding spectral
signs of life on exoplanets to forgotten star systems.
"If we would find signs of life on planets orbiting under the light of
long-dead stars," she said, "the next intriguing question would be
whether life survived the star's death or started all over again -- a
second genesis, if you will."
__________________________________________________________________
Story Source:
Materials provided by Cornell University. Original written by
Blaine Friedlander. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Thea Kozakis, Zifan Lin, Lisa Kaltenegger. High-resolution Spectra
and Biosignatures of Earth-like Planets Transiting White Dwarfs.
The Astrophysical Journal, 2020; 894 (1): L6 DOI:
10.3847/2041-8213/ab6f6a
__________________________________________________________________
--- up 14 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 30 21:30:04 2020
An analysis of 369 solar-like stars shows that solar brightness variations
are extremely weak
Date:
April 30, 2020
Source:
Max Planck Institute for Solar System Research
Summary:
By cosmic standards the sun is extraordinarily monotonous. For
the first time, the scientists compared the sun with hundreds of
other stars with similar rotation periods. Most displayed much
stronger variations. This raises the question whether the sun
has been going through an unusually quiet phase for several
millennia.
FULL STORY
__________________________________________________________________
The extent to which solar activity (and thus the number of sunspots and
the solar brightness) varies can be reconstructed using various methods
-- at least for a certain period of time. Since 1610, for example,
there have been reliable records of sunspots covering the Sun; the
distribution of radioactive varieties of carbon and beryllium in tree
rings and ice cores allows us to draw conclusions about the level of
solar activity over the past 9000 years. For this period of time,
scientists find regularly recurring fluctuations comparable in strength
to those of recent decades. "However, compared to the entire lifespan of
the Sun, 9000 years is like the blink of an eye," says MPS scientist
Dr. Timo Reinhold, first author of the new study. After all, our star
is almost 4.6 billion years old. "It is conceivable that the Sun has
been going through a quiet phase for thousands of years and that we
therefore have a distorted picture of our star," he adds.
Since there is no way of finding out how active the Sun was in primeval
times, scientists can only resort to the stars: Together with
colleagues from the University of New South Wales in Australia and the
School of Space Research in South Korea, the MPS researchers
investigated whether the Sun behaves "normally" in comparison to other
stars. This may help to classify its current activity.
To this end, the researchers selected candidate stars that resemble the
Sun in decisive properties. In addition to the surface temperature, the
age, and the proportion of elements heavier than hydrogen and helium,
the researchers looked above all at the rotation period. "The speed at
which a star rotates around its own axis is a crucial variable,"
explains Prof. Dr. Sami Solanki, director at MPS and co-author of the
new publication. A star's rotation contributes to the creation of its
magnetic field in a dynamo process in its interior. "The magnetic field
is the driving force responsible for all fluctuations in activity,"
says Solanki. The state of the magnetic field determines how often the
Sun emits energetic radiation and hurls particles at high speeds into
space in violent eruptions, how numerous dark sunspots and bright
regions on its surface are -- and thus also how brightly the Sun
shines.
A comprehensive catalogue containing the rotation periods of thousands
of stars has been available only for the last few years. It is based on
measurement data from NASA's Kepler Space Telescope, which recorded the
brightness fluctuations of approximately 150,000 main-sequence stars
(i.e. those that are in the middle of their lifetimes) from 2009 to
2013. The researchers scoured this huge sample and selected those stars
that rotate once around their own axis within 20 to 30 days. The Sun
needs about 24.5 days for this. The researchers were able to further
narrow down this sample by using data from the European Gaia Space
Telescope. In the end, 369 stars remained, which also resemble the Sun
in other fundamental properties.
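In code, the selection amounts to a handful of catalogue cuts. A
minimal sketch, assuming a pandas table of Kepler-Gaia cross-matched
stars; the file and the column names here are hypothetical:

    # Minimal sketch of the sample cut described above. The file
    # and the column names are hypothetical, not the catalogue's.
    import pandas as pd

    stars = pd.read_csv("kepler_gaia_crossmatch.csv")
    sunlike = stars[
        stars["prot_days"].between(20, 30)    # Sun: ~24.5 days
        & stars["teff"].between(5500, 6000)   # near-solar temperature
        & stars["feh"].between(-0.8, 0.3)     # near-solar composition
    ]
    print(len(sunlike), "solar-like stars selected")
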
A detailed analysis of the brightness variations of these stars from
2009 to 2013 reveals a clear picture. While solar irradiance fluctuated
on average by just 0.07 percent between active and inactive phases, the
other stars showed much larger variations, typically about five times
as strong. "We were very surprised that most
of the Sun-like stars are so much more active than the Sun," says Dr.
Alexander Shapiro of MPS, who heads the research group "Connecting
Solar and Stellar Variabilities."
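One common way to quantify such fluctuations is the range between the
5th and 95th percentiles of the normalized light curve. The sketch
below applies it to a synthetic quiet star whose amplitude is tuned, as
an illustrative assumption, to land near the Sun's 0.07 percent:

    # A common activity metric: the 5th-to-95th percentile range
    # of the normalized flux. Synthetic quiet-star light curve.
    import numpy as np

    def variability_range(flux):
        """Percent variability range of a normalized light curve."""
        norm = flux / np.median(flux)
        lo, hi = np.percentile(norm, [5, 95])
        return 100.0 * (hi - lo)

    t = np.linspace(0.0, 90.0, 4000)                     # days
    quiet = 1.0 + 0.00035 * np.sin(2 * np.pi * t / 24.5)
    print(f"{variability_range(quiet):.3f} % variability")  # ~0.07 %
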
However, it is not possible to determine the rotation period of all the
stars observed by the Kepler telescope. To do this, scientists have to
find certain periodically reappearing dips in the star's light curve.
These dips can be traced back to starspots that darken the stellar
surface, rotate out of the telescope's field of view and then reappear
after a fixed period of time. "For many stars, such periodic darkenings
cannot be detected; they are lost in the noise of the measured data and
in overlying brightness fluctuations," explains Reinhold. Viewed
through the Kepler telescope, even the Sun would not reveal its
rotation period.
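A standard tool for recovering a rotation period from such periodic
dips is the Lomb-Scargle periodogram, sketched below on synthetic
spot-modulated data using astropy. This is a generic illustration, not
necessarily the method behind the rotation catalogue used in the study.

    # Recover a rotation period from periodic dips with a
    # Lomb-Scargle periodogram; the light curve is synthetic.
    import numpy as np
    from astropy.timeseries import LombScargle

    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0.0, 400.0, 2000))          # days
    flux = 1.0 + 0.002 * np.sin(2 * np.pi * t / 24.5)   # starspot signal
    flux += 0.0005 * rng.standard_normal(t.size)        # noise

    freq, power = LombScargle(t, flux).autopower(
        minimum_frequency=1 / 50, maximum_frequency=1 / 5)
    print(f"best period: {1 / freq[np.argmax(power)]:.1f} days")  # ~24.5
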
The researchers therefore also studied more than 2500 Sun-like stars
with unknown rotation periods. Their brightness fluctuated much less
than that of the other group.
These results allow two interpretations. There could be a still
unexplained fundamental difference between stars with known and unknown
rotation period. "It is just as conceivable that stars with known and
Sun-like rotation periods show us the fundamental fluctuations in
activity the Sun is capable of," says Shapiro. This would mean that our
star has been unusually feeble over the past 9000 years and that on
very large time scales phases with much greater fluctuations are also
possible.
There is, however, no cause for concern. For the foreseeable future,
there is no indication of such solar "hyperactivity." On the contrary:
For the last decade, the Sun has been showing itself to be rather
weakly active, even by its own low standards. Predictions of activity
for the next eleven years indicate that this will not change soon.
__________________________________________________________________
Story Source:
Materials provided by Max Planck Institute for Solar System
Research. Note: Content may be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Timo Reinhold et al. The Sun is less active than other solar-like
stars. Science, 2020 DOI: 10.1126/science.aay3821
__________________________________________________________________
--- up 14 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 30 21:30:04 2020
Date:
April 30, 2020
Source:
KU Leuven
Summary:
Astronomers have captured images of the inner rims of
planet-forming disks located hundreds of light years away. These
disks of dust and gas, similar in shape to a music record, form
around young stars. The images shed new light on how planetary
systems are formed.
FULL STORY
__________________________________________________________________
An international team of astronomers has captured fifteen images of the
inner rims of planet-forming disks located hundreds of light years
away. These disks of dust and gas, similar in shape to a music record,
form around young stars. The images shed new light on how planetary
systems are formed. They were published in the journal Astronomy &
Astrophysics.
To understand how planetary systems, including our own, take shape, you
have to study their origins. Planet-forming or protoplanetary disks are
formed in unison with the star they surround. The dust grains in the
disks can grow into larger bodies, which eventually leads to the
formation of planets. Rocky planets like the Earth are believed to form
in the inner regions of protoplanetary disks, less than five
astronomical units (five times the Earth-Sun distance) from the star
around which the disk has formed.
Before this new study, several pictures of these disks had been taken
with the largest single-mirror telescopes, but these cannot capture
their finest details. "In these pictures, the regions close to the
star, where rocky planets form, are covered by only a few pixels," says
lead author Jacques Kluska from KU Leuven in Belgium. "We needed to
visualize these details to be able to identify patterns that might
betray planet formation and to characterize the properties of the
disks." This required a completely different observation technique.
"I'm thrilled that we now for the first time have fifteen of these
images," Kluska continued.
Image reconstruction
Kluska and his colleagues created the images at the European Southern
Observatory (ESO) in Chile by using a technique called infrared
interferometry. Using ESO's PIONIER instrument, they combined the light
collected by four telescopes at the Very Large Telescope observatory to
capture the disks in detail. However, this technique does not deliver
an image of the observed source. The details of the disks needed to be
recovered with a mathematical reconstruction technique. This technique
is similar to how the first image of a black hole was captured. "We had
to remove the light of the star, as it hindered the level of detail we
could see in the disks," Kluska explains.
"Distinguishing details at the scale of the orbits of rocky planets
like Earth or Jupiter (as you can see in the images) -- a fraction of
the Earth-Sun distance -- is equivalent to being able to see a human on
the Moon, or to distinguish a hair at a 10 km distance," notes
Jean-Philippe Berger of the Université Grenoble-Alpes, who as principal
investigator was in charge of the work with the PIONIER instrument.
"Infrared interferometry is becoming routinely used to uncover the
tiniest details of astronomical objects. Combining this technique with
advanced mathematics finally allows us to turn the results of these
observations into images."
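The numbers behind that comparison are easy to check. Taking PIONIER's
H-band wavelength of roughly 1.65 micrometres and VLTI baselines up to
about 130 metres -- round figures assumed here -- the diffraction limit
lambda/B lands in the same few-milliarcsecond regime as a person on the
Moon or a hair at 10 km:

    # Order-of-magnitude check of the resolution comparison above.
    # Wavelength, baseline, and object sizes are round assumptions.
    RAD_TO_MAS = 180 / 3.141592653589793 * 3600e3  # rad -> milliarcsec

    vlti = 1.65e-6 / 130.0       # lambda / baseline
    human = 2.0 / 3.844e8        # 2 m person at the Moon's distance
    hair = 1e-4 / 1e4            # 100 um hair seen from 10 km

    for name, angle in [("VLTI resolution", vlti),
                        ("human on the Moon", human),
                        ("hair at 10 km", hair)]:
        print(f"{name}: {angle * RAD_TO_MAS:.1f} mas")
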
Irregularities
Some findings immediately stand out from the images. "You can see that
some spots are brighter or less bright, like in the images above: this
hints at processes that can lead to planet formation. For example:
there could be instabilities in the disk that can lead to vortices
where the disk accumulates grains of space dust that can grow and
evolve into a planet."
The team will carry out additional research to identify what might lie
behind these irregularities. Kluska also plans new observations to
capture even more detail and to directly witness planet formation in
the disk regions that lie close to the star. Additionally, Kluska is
heading a team that has started to study 11 disks around other, older
types of stars, since it is thought these might also sprout planets.
__________________________________________________________________
Story Source:
Materials provided by KU Leuven. Note: Content may be edited
for style and length.
__________________________________________________________________
Related Multimedia:
* Images of protoplanetary disks around the R CrA and HD45677
stars, captured with ESO's Very Large Telescope Interferometer
__________________________________________________________________
Journal Reference:
1. J. Kluska, J.-P. Berger, F. Malbet, B. Lazareff, M. Benisty, J.-B.
Le Bouquin, O. Absil, F. Baron, A. Delboulbé, G. Duvert, A. Isella,
L. Jocou, A. Juhasz, S. Kraus, R. Lachaume, F. Ménard, R.
Millan-Gabet, J. D. Monnier, T. Moulin, K. Perraut, S. Rochat, C.
Pinte, F. Soulez, M. Tallon, W.-F. Thi, E. Thiébaut, W. Traub, G.
Zins. A family portrait of disk inner rims around Herbig Ae/Be
stars. Astronomy & Astrophysics, 2020; 636: A116 DOI:
10.1051/0004-6361/201833774
__________________________________________________________________
--- up 14 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 30 21:30:08 2020
Date:
April 30, 2020
Source:
Chalmers University of Technology
Summary:
For the first time, people with arm amputations can experience
sensations of touch in a mind-controlled arm prosthesis that
they use in everyday life. A study reports on three Swedish
patients who have lived, for several years, with this new
technology -- one of the world's most integrated interfaces
between human and machine.
FULL STORY
__________________________________________________________________
For the first time, people with arm amputations can experience
sensations of touch in a mind-controlled arm prosthesis that they use
in everyday life. A study in the New England Journal of Medicine
reports on three Swedish patients who have lived, for several years,
with this new technology -- one of the world's most integrated
interfaces between human and machine.
The advance is unique: the patients have used a mind-controlled
prosthesis in their everyday life for up to seven years. For the last
few years, they have also lived with a new function -- sensations of
touch in the prosthetic hand. This is a new concept for artificial
limbs, called neuromusculoskeletal prostheses because they are
connected to the user's nerves, muscles, and skeleton.
The research was led by Max Ortiz Catalan, Associate Professor at
Chalmers University of Technology, in collaboration with Sahlgrenska
University Hospital, University of Gothenburg, and Integrum AB, all in
Gothenburg, Sweden. Researchers at Medical University of Vienna in
Austria and the Massachusetts Institute of Technology in the USA were
also involved.
"Our study shows that a prosthetic hand, attached to the bone and
controlled by electrodes implanted in nerves and muscles, can operate
much more precisely than conventional prosthetic hands. We further
improved the use of the prosthesis by integrating tactile sensory
feedback that the patients use to mediate how hard to grab or squeeze
an object. Over time, the ability of the patients to discern smaller
changes in the intensity of sensations has improved," says Max Ortiz
Catalan.
"The most important contribution of this study was to demonstrate that
this new type of prosthesis is a clinically viable replacement for a
lost arm. No matter how sophisticated a neural interface becomes, it
can only deliver real benefit to patients if the connection between the
patient and the prosthesis is safe and reliable in the long term. Our
results are the product of many years of work, and now we can finally
present the first bionic arm prosthesis that can be reliably controlled
using implanted electrodes, while also conveying sensations to the user
in everyday life," continues Max Ortiz Catalan.
Since receiving their prostheses, the patients have used them daily in
all their professional and personal activities.
The new concept of a neuromusculoskeletal prosthesis is unique in that
it delivers several different features which have not been presented
together in any other prosthetic technology in the world:
* It has a direct connection to a person's nerves, muscles, and
skeleton.
* It is mind-controlled and delivers sensations that are perceived by
the user as arising from the missing hand.
* It is self-contained; all electronics needed are contained within
the prosthesis, so patients do not need to carry additional
equipment or batteries.
* It is safe and stable in the long term; the technology has been
used without interruption by patients during their everyday
activities, without supervision from the researchers, and it is not
restricted to confined or controlled environments.
The newest part of the technology, the sensation of touch, is possible
through stimulation of the nerves that used to be connected to the
biological hand before the amputation. Force sensors located in the
thumb of the prosthesis measure contact and pressure applied to an
object while grasping. This information is transmitted to the patients'
nerves leading to their brains. Patients can thus feel when they are
touching an object, its characteristics, and how hard they are pressing
it, which is crucial for imitating a biological hand.
"Currently, the sensors are not the obstacle for restoring sensation,"
says Max Ortiz Catalan. "The challenge is creating neural interfaces
that can seamlessly transmit large amounts of artificially collected
information to the nervous system, in a way that the user can
experience sensations naturally and effortlessly."
The implantation of this new technology took place at Sahlgrenska
University Hospital, led by Professor Rickard Brånemark and Doctor
Paolo Sassu. Over a million people worldwide suffer from limb loss, and
the end goal for the research team, in collaboration with Integrum AB,
is to develop a widely available product suitable for as many of these
people as possible.
"Right now, patients in Sweden are participating in the clinical
validation of this new prosthetic technology for arm amputation," says
Max Ortiz Catalan. "We expect this system to become available outside
Sweden within a couple of years, and we are also making considerable
progress with a similar technology for leg prostheses, which we plan to
implant in a first patient later this year."
More about: How the technology works:
The implant system for the arm prosthesis is called e-OPRA and is based
on the OPRA implant system created by Integrum AB. The implant system
anchors the prosthesis to the skeleton in the stump of the amputated
limb, through a process called osseointegration (osseo = bone).
Electrodes are implanted in muscles and nerves inside the amputation
stump, and the e-OPRA system sends signals in both directions between
the prosthesis and the brain, just like in a biological arm.
The prosthesis is mind-controlled, via the electrical muscle and nerve
signals sent through the arm stump and captured by the electrodes. The
signals are passed into the implant, which goes through the skin and
connects to the prosthesis. The signals are then interpreted by an
embedded control system developed by the researchers. The control
system is small enough to fit inside the prosthesis and it processes
the signals using sophisticated artificial intelligence algorithms,
resulting in control signals for the prosthetic hand's movements.
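That decoding step can be pictured as feature extraction followed by
classification over short signal windows. The Python sketch below uses
mean-absolute-value features and a toy linear scorer; it is a generic
pattern-recognition illustration, not the team's embedded algorithm.

    # Generic sketch of EMG decoding: windowed features are
    # classified into hand commands. Toy weights, not the real ones.
    import numpy as np

    def mav(window):
        """Mean absolute value per channel, a classic EMG feature."""
        return np.mean(np.abs(window), axis=0)

    def decode(window, weights, commands):
        """Pick the command whose linear score is highest."""
        return commands[int(np.argmax(weights @ mav(window)))]

    commands = ["rest", "open", "close"]
    weights = np.array([[0.1, 0.1],   # rest
                        [1.0, 0.2],   # open
                        [0.2, 1.0]])  # close
    window = np.random.default_rng(2).standard_normal((200, 2)) * [0.1, 0.9]
    print(decode(window, weights, commands))  # likely "close"
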
The touch sensations arise from force sensors in the prosthetic thumb.
The signals from the sensors are converted by the control system in the
prosthesis into electrical signals which are sent to stimulate a nerve
in the arm stump. The nerve leads to the brain, which then perceives
the pressure levels against the hand.
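In the feedback direction, the mapping can be as simple as scaling the
measured grasp force into a safe stimulation range. A minimal sketch,
with every threshold and range assumed purely for illustration:

    # Minimal sketch of the feedback path: thumb force mapped to a
    # nerve-stimulation current. All numbers are illustrative.
    def force_to_stim_mA(force_N, f_max=10.0, s_min=0.2, s_max=1.0):
        """Linearly map grasp force in [0, f_max] N to current in mA."""
        force = min(max(force_N, 0.0), f_max)  # clamp to sensor range
        return s_min + (s_max - s_min) * force / f_max

    for f in (0.0, 2.5, 10.0):                 # light touch .. firm grip
        print(f"{f:4.1f} N -> {force_to_stim_mA(f):.2f} mA")
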
The neuromusculoskeletal implant can connect to any commercially
available arm prosthesis, allowing it to operate more effectively.
More about: How the artificial sensation is experienced:
People who lose an arm or leg often experience phantom sensations, as
if the missing body part remains although not physically present. When
the force sensors in the prosthetic thumb react, the patients in the
study feel that the sensation comes from their phantom hand. Precisely
where on the phantom hand varies between patients, depending on which
nerves in the stump receive the signals. The lowest level of pressure
can be compared to touching the skin with the tip of a pencil. As the
pressure increases, the feeling becomes stronger and increasingly
'electric'.
More about: The research:
The current study dealt with patients with above-elbow amputations, and
this technology is close to becoming a finished product. The research
team is working in parallel with a new system for amputations below the
elbow. In those cases, instead of one large bone (humerus), there are
two smaller bones (radius and ulna) to which the implant needs to be
anchored. The group is also working on adapting the system for leg
prostheses.
In addition to applications within prosthetics, the permanent interface
between human and machine provides entirely new opportunities for
scientific research into how the human muscular and nervous systems
work.
Associate Professor Max Ortiz Catalan heads the Biomechatronics and
Neurorehabilitation Laboratory at Chalmers University of Technology and
is currently establishing the new Center for Bionics and Pain Research
at Sahlgrenska University Hospital, in close collaboration with
Chalmers and the University of Gothenburg, where this work will be
further developed and clinically implemented.
The research has been funded by the Promobilia Foundation, the
IngaBritt and Arne Lundbergs Research Foundation, Region Västra
Götaland (ALF grants), Vinnova, the Swedish Research Council, and the
European Research Council.
__________________________________________________________________
Story Source:
Materials provided by Chalmers University of Technology.
Original written by Johanna Wilde. Note: Content may be edited for
style and length.
__________________________________________________________________
Journal Reference:
1. Max Ortiz-Catalan, Enzo Mastinu, Paolo Sassu, Oskar Aszmann,
Rickard Brånemark. Self-Contained Neuromusculoskeletal Arm
Prostheses. New England Journal of Medicine, 2020; 382 (18): 1732
DOI: 10.1056/NEJMoa1917537
__________________________________________________________________
--- up 14 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)
-
From
SpaceDaily@1337:3/111 to
All on Thu Apr 30 21:30:08 2020
Date:
April 30, 2020
Source:
University of Maryland
Summary:
Scientists have transformed a 26,000-year-old manufacturing
process into an innovative approach to fabricating ceramic
materials that has promising applications for solid-state
batteries, fuel cells, 3D printing technologies, and beyond.
FULL STORY
__________________________________________________________________
Scientists in the University of Maryland (UMD)'s Department of
Materials Science and Engineering (MSE) have transformed a
26,000-year-old manufacturing process into an innovative approach to
fabricating ceramic materials that has promising applications for
solid-state batteries, fuel cells, 3D printing technologies, and
beyond.
Ceramics are widely used in batteries, electronics, and extreme
environments -- but conventional ceramic sintering (part of the firing
process used in the manufacture of ceramic objects) often requires
hours of processing time. To overcome this challenge, a Maryland
research team has invented an ultrafast high-temperature sintering
method that both meets the needs of modern ceramics and fosters the
discovery of new material innovations.
The study, led by Liangbing Hu, Herbert Rabin Distinguished Professor
of the A. James Clark School of Engineering and director of the Center
for Materials Innovation at UMD, was published on the May 1 cover of
Science. Chengwei Wang, an assistant research scientist in Hu's group,
served as first author on the study.
Conventional sintering techniques require a long processing time -- it
takes hours for a furnace to heat up, then several hours more to 'bake'
the ceramic material -- which is particularly problematic in the
development of electrolytes for solid-state batteries. Alternative
sintering technologies (such as microwave-assisted sintering, spark
plasma sintering, and flash sintering) are limited for a variety of
reasons, often because they are material-specific and/or expensive.
The Maryland team's new method of ultrafast high-temperature sintering
offers high heating and high cooling rates, an even temperature
distribution, and sintering temperatures of up to 3,000 degrees
Celsius. Combined, these processes require less than 10 seconds of
total processing time -- more than 1,000 times faster than the
traditional furnace approach to sintering.
"With this invention, we 'sandwiched' a pressed green pellet of ceramic
precursor powders between two strips of carbon that quickly heated the
pellet through radiation and conduction, creating a consistent
high-temperature environment that forced the ceramic powder to solidify
quickly," Hu said. "The temperature is high enough to sinter basically
any ceramic material. This patented process can be extended to other
membranes beyond ceramics."
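A back-of-envelope estimate shows why seconds can suffice: a small
pellet has very little thermal mass, so even modest radiative and
conductive power from the carbon strips produces an enormous heating
rate. All figures in this sketch are assumptions, not the paper's:

    # Lumped-capacitance estimate of the heating time. Mass, heat
    # capacity, and absorbed power are assumed round numbers.
    mass_kg = 2e-4     # ~0.2 g green pellet
    cp = 800.0         # J/(kg K), typical oxide ceramic
    power_W = 200.0    # absorbed from the carbon heaters
    dT = 3000.0        # temperature rise, K

    seconds = mass_kg * cp * dT / power_W
    print(f"time to heat: {seconds:.1f} s")  # ~2.4 s, ignoring losses
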
The study was conducted through close collaboration with Yifei Mo
(associate professor, UMD), J.C. Zhao (professor and department chair,
UMD), Howard Wang (visiting research professor, UMD), Jian Luo
(professor, UC San Diego), Xiaoyu Zheng (assistant professor, UCLA),
and Bruce Dunn (professor and department chair, UCLA).
"Ultrafast high-temperature sintering represents a breakthrough in
ultrafast sintering technologies, not only because of its general
applicability to a broad range of functional materials, but also due to
a great potential of creating non-equilibrium bulk materials via
retaining or generating extra defects," said Luo.
The rapid sintering technology is being commercialized through
HighT-Tech LLC, a UMD spinoff company focused on a range of
high-temperature technologies.
"This new method solves the key bottleneck problem in computation and
AI-guided materials discovery," said Mo. "We've enabled a new paradigm
for materials discovery with an unprecedented accelerated pace."
"We are delighted to see the pyrolysis time reduced from tens of hours
to a few seconds, preserving the fine 3D-printed structures after fast
sintering," Zheng said.
__________________________________________________________________
Story Source:
Materials provided by University of Maryland. Note: Content may
be edited for style and length.
__________________________________________________________________
Journal Reference:
1. Chengwei Wang, Weiwei Ping, Qiang Bai, Huachen Cui, Ryan Hensleigh,
Ruiliu Wang, Alexandra H. Brozena, Zhenpeng Xu, Jiaqi Dai, Yong
Pei, Chaolun Zheng, Glenn Pastel, Jinlong Gao, Xizheng Wang, Howard
Wang, Ji-Cheng Zhao, Bao Yang, Xiaoyu (rayne) Zheng, Jian Luo,
Yifei Mo, Bruce Dunn, Liangbing Hu. A general method to synthesize
and sinter bulk ceramics in seconds. Science, 2020 DOI:
10.1126/science.aaz7681
__________________________________________________________________
--- up 14 weeks, 2 days, 2 hours, 33 minutes
* Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1337:3/111)