Friday, 27 December 2024


A New Dogma Of Molecular Biology: A Paradigm Shift





Thomas Kuhn, in his groundbreaking book The Structure of Scientific Revolutions, described a paradigm shift in science as a new understanding of nature that:
Is sufficiently unprecedented to attract an enduring group of adherents.
Is open-ended, with many problems for the redefined group of practitioners to resolve.

This is precisely the nature of our new understanding of biology, which has occurred over the past twenty years and is now sufficiently advanced to offer a new paradigm.


The previous paradigm is captured in what is called the central dogma:
DNA → RNA → Protein → Phenotype


The dogma was enshrined in Jim Watson's epic 1965 textbook Molecular Biology of the Gene.

The theory holds that Gregor Mendel's concept of a gene (a discrete heritable trait or phenotype) is the consequence of a change in the text of DNA that alters the function of a protein and, therefore, the phenotype. Sickle cell anemia is a prime example. A single-letter change in the DNA that encodes the hemoglobin protein changes the structure of the protein so that it aggregates to create sickle-shaped red blood cells, leading to the blood clots that define the disease. While true, we now view this paradigm as a special case of a broader reality. Why?

There is a need for a new paradigm to account for a series of anomalous observations.

Over the past twenty years, we have made impressive progress in reading the complete genome sequences of hundreds of thousands of humans and tens of thousands of other species. We have uncovered countless variations among individuals in their inherited DNA sequence and associated those changes with thousands of inherited traits, including inherited diseases.

A big surprise: the vast majority of consequential changes in DNA sequence are not in the 2% of the human genome that specifies proteins but rather in the 90% or more of the genome that does not. In other words, what Gregor Mendel described as a trait is usually NOT a change in the sequence of a protein.

If most genes are not proteins, what are they?

Another revelation has also emerged from genomic studies. Recent studies reveal that animal and plant cells copy the great majority of DNA into RNA. Humans copy at least 70% of our genome into RNA, vastly more than the 2% specifying protein. Previously, non-protein-coding DNA was relegated to a trash heap called "junk DNA". Additionally, many variants that affect heritable traits occur in other non-protein-coding regulatory regions, including RNA transcription start, stop, and splicing sequences.

A third anomaly has been apparent for some time. Very simple organisms possess versions of the great majority of the roughly twenty thousand proteins found in complex animals, including us. How, then, do we account for the complexity of most multicellular organisms and ourselves?

In the early days of molecular biology, the universality of the genetic code prompted the Nobel Prize-winning biologist Jacques Monod to proclaim, "What is true for phage lambda is true for the elephant." Elucidating the human genome and those of other species revealed an enormous complexity in the organization of DNA and RNA not evident in viruses or bacteria.


A New Paradigm


We are now in the midst of a very exciting revolution to define what role this new view of DNA and the plethora of RNAs plays in our biology. A consensus is now emerging as a new biological theory. Whereas our twenty thousand proteins perform the necessary functions for life, it is the RNA, made mostly from the "junk DNA," that controls when and where proteins are made.

I describe this new paradigm as the DNA/RNA Dogma, a name that assigns equal importance to DNA and RNA and focuses on the control of protein expression as the key to understanding Mendelian inheritance. The DNA/RNA Dogma offers a more comprehensive and accurate picture of Mendel's genes and our complexity.

The new dogma now reads:

DNA ↔ RNA → Control → Protein → Phenotype


The insertion of the word "control" denotes our rapidly expanding knowledge of the role RNA plays in controlling when, where, in what amounts, and in what form proteins are made.

The new double arrow between DNA and RNA describes both the RNA-directed modification of DNA that affects RNA production and the reverse flow of genetic information from RNA not accounted for by the original central dogma. Like all diagrams, this one is oversimplified. For example, some non-coding RNAs affect phenotype.

Analogy may help. I now have extensive experience with Lego in building entire cities for my grandchildren. Lego sets come with a set of colored plastic parts and a set of instructions to assemble them. In this analogy, the plastic parts are the proteins, limited in number, each with a defined form. The instructions are the regulatory RNAs. With the same parts, you can build either a simple or complex structure. Change the instructions, and you change the structure. A single error in the instructions (or much less frequently in a building block) results in a fault in the final structure. There are many more words in the instructions than there are in the different types of building blocks. Most organisms produce very similar sets of proteins but differ markedly in the way those proteins are used. Kits for complex structures, like Ninjago sets, also include a minority of customized blocks analogous to specialized proteins.

The new Dogma meets Thomas Kuhn's definition of a paradigm shift. The DNA/RNA Dogma is sufficiently unprecedented to attract an enduring set of adherents. Indeed, many new studies elaborating on the discovery and details of new regulatory RNAs and genome structures are published each week.
The DNA/RNA Dogma is open-ended, with many problems to be resolved. The weekly accumulation of new discoveries highlights this theory's open-ended nature.

I liken the shift to the DNA/RNA Dogma to the paradigm shift from Newtonian physics to quantum theory. Newtonian physics explained much of what was then known about the physical world. Newtonian physics is not incorrect, but it is only a special case that applies to large objects. The original central dogma explained much of what was known about simple organisms, viruses, and bacteria but did not explain the biology of complex organisms.

New theories must not only explain vast new sets of observations and predict future observations, but they must also be accepted by most practitioners and the public. Thanks to COVID-19 RNA vaccines, the recent appreciation of the importance of RNA is now generally understood by the public. RNA biology is now also at the forefront of gene therapy, gene editing, and cancer therapies.

Source: William A. Haseltine

Friday, 20 December 2024

 

Exact shape of a single photon revealed for the first time

Light is something we interact with every day, but have you ever wondered what the shape of a single particle of light — the photon — actually looks like?

For the first time, researchers have defined the precise shape of a single photon, a huge leap forward in quantum physics and material science.

Dr. Benjamin Yuen from the University of Birmingham led the groundbreaking research that delves deep into the quantum level of light and matter interaction.

His team’s findings, published in Physical Review Letters, provide a visual representation of a photon while enhancing our understanding of how photons are emitted and shaped by their environment.

Understanding photons — the basics

A photon is the fundamental unit of light and all electromagnetic radiation. Unlike particles that have mass, photons are massless and always move at the speed of light.

They exhibit both particle-like and wave-like properties, a concept known as wave-particle duality.

Photons carry energy and momentum, determined by their frequency or wavelength. Simply put, the higher the frequency, the more energy a photon possesses.
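The frequency-energy relationship can be made concrete with a line of arithmetic. A minimal sketch using the exact SI values of Planck's constant and the speed of light; the example wavelengths are illustrative:

```python
# Photon energy from wavelength, E = h*f = h*c/lambda.
# Constants are exact SI definitions; wavelengths are illustrative.
H = 6.62607015e-34   # Planck constant, J*s
C = 299_792_458.0    # speed of light in vacuum, m/s

def photon_energy(wavelength_m: float) -> float:
    """Energy in joules of a photon with the given vacuum wavelength."""
    return H * C / wavelength_m

# Green light (~500 nm) carries roughly 4e-19 J per photon; halving the
# wavelength doubles the frequency and therefore doubles the energy.
green = photon_energy(500e-9)
uv = photon_energy(250e-9)
```

This is why ultraviolet photons can drive chemistry that visible photons cannot: each carries proportionally more energy.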

This unique dual nature allows photons to interact with matter in special ways, such as being absorbed or emitted by atoms.

These interactions are essential for processes like vision, photosynthesis, and the technology behind our electronic devices.

Photon shape and the “far field”

The interaction between photons and their environment leads to endless possibilities for how light exists and travels.

This limitless potential makes modeling these interactions exceptionally challenging — a puzzle that quantum physicists have been trying to solve for decades.

“Our calculations enabled us to convert a seemingly unsolvable problem into something that can be computed,” Dr. Yuen explained.

“And, almost as a byproduct of the model, we were able to produce this image of a photon, something that hasn’t been seen before in physics.”

By grouping the infinite possibilities of photon interactions into distinct sets, the team produced a model that describes not only how photons interact with the atoms or molecules that emit them, but also how the energy from that interaction travels into the distant surroundings, known as the “far field.”

Why do we care about photon shape?

We depend on photons more than we might realize. Our eyes detect photons, allowing us to see the world around us.

In technology, photons are crucial for solar panels that convert light into electricity and fiber optic cables that provide high-speed internet and communications.

Photons also play a significant role in scientific research. They are central to quantum electrodynamics (QED), the theory that describes how light and matter interact.

Photons act as force carriers for the electromagnetic force, one of the four fundamental forces in the universe.

They facilitate interactions between charged particles like electrons and protons, governing a wide range of physical phenomena.

Advancing quantum physics

“The geometry and optical properties of the environment have profound consequences for how photons are emitted, including defining the photons’ shape, color, and even how likely it is to exist,” explained co-author Professor Angela Demetriadou.

Understanding the precise shape and behavior of photons could revolutionize the way we design nanophotonic technologies.

This could lead to advancements in secure communication methods, improved detection of pathogens, and even control over chemical reactions at a molecular level.

“This work helps us to increase our understanding of the energy exchange between light and matter, and to better understand how light radiates into its nearby and distant surroundings,” Dr. Yuen added.

“Lots of this information had previously been thought of as just ‘noise’ — but there’s so much information within it that we can now make sense of and make use of.”

What happens next?

By comprehending how photons interact with their environment, scientists can engineer light-matter interactions for future applications.

Imagine better sensors that can detect diseases at an early stage, more efficient solar cells that provide cleaner energy, or advancements in quantum computing that could solve complex problems in seconds.

“By understanding this, we set the foundations to be able to engineer light-matter interactions for future applications,” says Dr. Yuen.

The possibilities are vast and could significantly impact various fields, from healthcare to technology and environmental science.

Photon shape and future technology

To sum it all up, this fascinating research represents a significant step forward in quantum physics.

By defining the precise shape of a single photon, the team has provided a tool that could help scientists and engineers design new technologies that harness the unique properties of light.

The findings not only answer long-standing questions about the fundamental nature of photons but also open up new questions and research paths.

As we continue to explore the quantum world, understanding the building blocks of light will be crucial in pushing the boundaries of what’s possible.

The visualization of a photon is a testament to human curiosity and our desire to understand the universe at its most fundamental level.

With this new knowledge, we’re not just looking at light differently — we’re poised to use it in ways we never thought possible.

The full study was published in the journal Physical Review Letters.


#AmoPhysics, #NuclearPhysics,#AtomicNuclei#NuclearReactions#Radioactivity#NuclearFission#NuclearFusion#NuclearEnergy#NuclearPower#FusionResearch#FissionReactors#RadioactiveDecay#NuclearMedicine#NuclearAstrophysics, #ParticleAcceleration#NuclearSafety#NuclearEngineering#NuclearWeapons#RadiationProtection#NuclearPolicy#NuclearWasteManagement.


Visit: amo-physics-conferences.scifat.com/


Award Nomination link: https://jut.li/OOqbg


For More Details : physics@scifat.com


Get Connected Here:
==================
Facebook : www.facebook.com/profile.php?id=100092029748922
Pinterest : in.pinterest.com/physicsresearch2000/
Youtube : www.youtube.com/channel/UCtntbI1RB0O0CFLpr575_fg
Instagram : www.instagram.com/amophysicsawards/


Thursday, 19 December 2024

The first colloidal quantum dot (CQD)-based laser for extended SWIR applications






Existing laser technologies for the extended short-wave infrared (SWIR) spectral range use expensive and complex materials, which limits their scalability and affordability.

ICFO researchers addressed these challenges by developing a novel approach based on colloidal quantum dots. Their laser emits coherent light in the extended SWIR range with large colloidal quantum dots made of lead sulfide (PbS).

This new technology using CQDs (colloidal quantum dots) solves some big problems while still working with silicon CMOS platforms, the same technology used to make computer chips.

The researchers used PbS (lead sulfide) quantum dots, the first semiconductor material to produce laser light over such a wide range of wavelengths. What’s impressive is that they did this without changing the quantum dots’ chemical makeup. These findings help bring us closer to creating smaller, more practical quantum dot lasers.

Additionally, the team was able to make PbS quantum dots emit laser light using a much faster method (nanosecond excitation), replacing the need for expensive, large laser equipment that normally uses femtosecond pulses. This was possible using larger quantum dots, allowing them to absorb more light (ten times more). As a result, they reduced the energy needed to start the laser, making the process much more efficient.

The ability to create low-cost, scalable infrared lasers in the extended SWIR (short-wave infrared) range solves key challenges in many technologies. This innovation could potentially revolutionize areas such as:
Detecting harmful gases.
Making eye-safe LIDAR systems.
Improving advanced photonic circuits.
Enhancing imaging in the SWIR range for biological applications.
Industries that use LIDAR systems, gas sensors, and medical imaging.

In addition, this breakthrough supports the development of photonic circuits that work well with silicon, which could lead to even smaller devices and wider use of this technology.

ICREA Prof. Gerasimos Konstantatos said, “Our work represents a paradigm shift in infrared laser technology. For the first time, we’ve achieved lasing in the extended SWIR range with solution-processed materials at room temperature, paving the way for practical applications and the development of more accessible technologies.”

Journal Reference: L. Whitworth, C. Rodá, M. Dalmases, N. Taghipour, M. Dosil, K. Nikolaidou, H. Dehghanpour, G. Konstantatos, Extended Short-Wave Infrared Colloidal Quantum Dot Lasers with Nanosecond Excitation. Adv. Mater. 2024, 2410207. DOI: 10.1002/adma.202410207

Wednesday, 18 December 2024

Quantum Metal Bullion Launches Innovative Gold-Backed Blockchain Token




Quantum Metal Bullion (QMB) PTY LTD Australia has officially launched the Quantum Metal Gold Token ($QMGT) project, marking a significant advancement in bullion-based technology. Partnering with Quantum Metal Digital Solutions Inc. (QMDSI), QMB aims to revolutionize the precious metals industry by leveraging blockchain technology to enhance transparency, security, and accessibility.

Simplifying Gold Ownership and Trade

The $QMGT project introduces a streamlined and secure method for owning, trading, and leveraging gold. Utilizing blockchain technology, the project ensures that transactions are transparent and immutable. This innovation is built on Web3 principles, aiming to create a precious metals marketplace and decentralized financial platform. Users can effortlessly convert their expenses and purchases into gold, thus enhancing the existing Quantum Metal Bullion global community model.

Quantum Metal Bullion has long promoted financial empowerment by allowing individuals and businesses to turn expenses into gold, effectively converting purchases into investments. The current model allows users to buy gold from QMB and convert their holdings into 85% fiat currency to settle financial obligations while retaining gold as a hedge. As gold prices appreciate, users can potentially have their loanable gold pay for itself. Despite its success, generating nearly $500 million in gross sales in 2023, the model has faced challenges due to legacy-based processes.

Addressing Inefficiencies with Blockchain

The $QMGT project aims to overcome these inefficiencies by integrating QMB’s proven model with blockchain technology and decentralized finance (DeFi). This integration enables seamless platform scaling, process improvement, and increased investor confidence through secure and transparent transactions. The project represents a significant leap forward in the precious metals sector by combining gold trading expertise with cutting-edge blockchain solutions.


Leadership Insights

Dato Lim Khong Soon, Founder of Quantum Metal Bullion PTY LTD, expressed excitement about the $QMGT project, emphasizing its potential to open new possibilities for gold ownership and investment. He highlighted the combination of their gold trading expertise with QMDSI’s blockchain solutions as a transformative step for the industry.

Erwin Carmelo T. Escudero, Chairman of Quantum Metal Digital Solutions, reiterated the company’s commitment to innovation and the transformative power of Web3 technology. He emphasized the creation of a unique ecosystem that empowers individuals and businesses, reflecting their dedication to leveraging blockchain for broader financial empowerment.

Key Features of the $QMGT Project

The $QMGT project offers several key features designed to enhance user experience and security:

Gold-backed Token: Each $QMGT token is backed by physical gold held in secure vaults, ensuring transparency and trust for users.

Web3 Ecosystem: The project includes a comprehensive ecosystem for gold trading, e-wallets, and DeFi platforms, all built on a robust blockchain foundation.

Regulatory Compliance: Adhering to the highest regulatory standards, $QMGT provides a safe and compliant environment for all transactions.

Community-Driven Approach: Actively engaging with and fostering a thriving community of gold enthusiasts and investors, the project aims to build a strong and supportive user base.

Expanding Opportunities in the Precious Metals Market

The launch of the $QMGT project by Quantum Metal Bullion signifies a major development in the integration of blockchain technology with precious metals trading. By addressing the inefficiencies of legacy-based processes and providing a transparent, secure platform for gold transactions, QMB is set to enhance investor confidence and broaden access to gold investment opportunities.

Through strategic partnerships and a commitment to innovation, Quantum Metal Bullion is poised to lead the way in the evolving landscape of Web3 and blockchain technology. The $QMGT project not only simplifies gold ownership and trading but also establishes a foundation for future advancements in the precious metals market.

As Quantum Metal Bullion continues to innovate and expand its offerings, the $QMGT project represents a significant step toward creating a more accessible and efficient marketplace for gold investors worldwide. With its emphasis on transparency, security, and community engagement, the project is set to redefine the standards of gold investment in the digital age.

Tuesday, 17 December 2024

Breaking Physics: Inside the Strange World of Quantum Metals



A new study examined how quantum critical metals, which behave unusually at low temperatures, challenge conventional physics theories.

The research reveals that these metals experience significant changes at quantum critical points, potentially informing the development of high-temperature superconductors.

Strange Metals and Quantum Fluctuations

A recent study led by Rice University physicist Qimiao Si sheds light on the mysterious behavior of quantum critical metals — materials that break the usual rules of physics at low temperatures. Published on December 9 in Nature Physics, the research explores quantum critical points (QCPs), where materials hover between two distinct states, such as being magnetic or nonmagnetic. These findings help explain the unique properties of these metals and offer new insights into high-temperature superconductors, which conduct electricity without resistance at relatively high temperatures.

At the heart of the study is quantum criticality, a state where materials become extremely sensitive to quantum fluctuations — tiny disruptions that change how electrons behave. While most metals follow well-established physical laws, quantum critical metals defy these expectations, displaying unusual and collective behaviors that have puzzled scientists for decades. Physicists refer to these systems as “strange metals.”

Role of Quasiparticles in Quantum Metals

“Our work dives into how quasiparticles lose their identity in strange metals at these quantum critical points, which leads to unique properties that defy traditional theories,” said Si, the Harry C. and Olga K. Wiess Professor of Physics and Astronomy and director of Rice’s Extreme Quantum Materials Alliance.

Quasiparticles, representing the collective behavior of electrons acting like individual particles, play a crucial role in energy and information transfer in materials. However, at QCPs, these quasiparticles vanish in a phenomenon known as Kondo destruction. Here magnetic moments in the material cease their usual interaction with electrons, dramatically transforming the metal’s electronic structure.

This change is evident in the Fermi surface, a map of possible electron states within the material. As the system crosses the QCP, the Fermi surface abruptly shifts, significantly altering the material’s properties.

Exploring Universal Behaviors

The study extends beyond heavy fermion metals — materials with unusually heavy electrons — to include copper oxides and certain organic compounds. All of these strange metals exhibit behaviors that defy traditional Fermi liquid theory, a framework used to describe electron motion in most metals. Instead, their properties align with fundamental constants such as Planck’s constant, governing the quantum relationship between energy and frequency.

Implications for Advanced Superconductors

The researchers identified a condition called dynamical Planckian scaling, where the temperature dependence of electronic properties mirrors universal phenomena such as the cosmic microwave background and the black-body radiation that approximates the emission of stars. This discovery underscores a shared organizational pattern across various quantum critical materials, offering insights into creating advanced superconductors.
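Planckian scaling centers on a timescale built from fundamental constants alone, ħ/(k_B T). A minimal arithmetic sketch; treating the order-one prefactor as exactly 1 is an assumption for illustration:

```python
# Planckian dissipation: in strange metals the electron scattering rate is
# set by temperature alone, 1/tau ~ k_B*T / hbar, with no material details.
# Sketch only: the order-one prefactor is taken as 1 for illustration.
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
KB = 1.380649e-23       # Boltzmann constant, J/K (exact SI value)

def planckian_time(temperature_k: float) -> float:
    """The Planckian timescale hbar / (k_B * T), in seconds."""
    return HBAR / (KB * temperature_k)

# At room temperature this is a few tens of femtoseconds, and it doubles
# when the temperature is halved -- pure temperature scaling.
tau_room = planckian_time(300.0)
tau_cold = planckian_time(150.0)
```

That the measured scattering rates of chemically unrelated strange metals all track this one scale is what suggests a shared organizing principle.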

Quantum Transitions in New Materials

The research implications extend to other quantum materials, including iron-based superconductors and those with intricate lattice structures. One example is CePdAl, a compound where the interplay of two competing forces — the Kondo effect and RKKY interactions — determines its electronic behavior. By studying these transitions, scientists hope to decode similar phenomena in other correlated materials, where complex interelectronic relationships dominate.

Observing how these forces shape the material at QCPs could help scientists better understand transitions in other correlated materials or those with complex interelectronic relationships.

Reference: “Quantum critical metals and loss of quasiparticles” by Haoyu Hu, Lei Chen and Qimiao Si, 9 December 2024, Nature Physics.
DOI: 10.1038/s41567-024-02679-7

This research, co-authored by Haoyu Hu and Lei Chen from Rice’s Department of Physics and Astronomy, Extreme Quantum Materials Alliance and Smalley-Curl Institute, was supported by the National Science Foundation, Air Force Office of Scientific Research, Robert A. Welch Foundation, Vannevar Bush Faculty Fellowship and European Research Council.




Tuesday, 10 December 2024

Comparing light scattering techniques for biologics development





Static light scattering (SLS) and dynamic light scattering (DLS) are robust investigative methods broadly utilized in the biopharmaceutical sector to characterize the solution properties of colloidal drug delivery systems, macromolecules, nanoparticles, and viral vectors.

Static light scattering

SLS includes multi-angle light scattering and right-angle light scattering, techniques that determine precise values of the molecular weight, weight-average molecular weight, and radius of gyration of target macromolecules.

Biopharmaceutical applications

SLS has many uses throughout the biopharmaceutical industry, including:
Aggregation and degradation propensity determination
Bioconjugate stability evaluation
Biopolymer characterization, e.g. bioconjugates, macromolecular complexes, peptides, proteins, RNA
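How SLS extracts molecular weight and radius of gyration can be sketched via the Zimm relation, Kc/R(θ) ≈ (1/Mw)(1 + q²Rg²/3): a line fit of Kc/R versus q² yields Mw from the intercept and Rg from the slope. A minimal sketch on synthetic, illustrative numbers (no real unit calibration is attempted):

```python
# Zimm/Debye analysis behind multi-angle light scattering, dilute limit:
#   K*c / R(theta) = (1/Mw) * (1 + q^2 * Rg^2 / 3)
# Intercept of Kc/R vs q^2 gives 1/Mw; slope gives Rg. Synthetic data only.

def fit_zimm(q2, kc_over_r):
    """Least-squares line y = a + b*x; return (Mw, Rg) as (1/a, sqrt(3b/a))."""
    n = len(q2)
    mx, my = sum(q2) / n, sum(kc_over_r) / n
    b = sum((x - mx) * (y - my) for x, y in zip(q2, kc_over_r))
    b /= sum((x - mx) ** 2 for x in q2)
    a = my - b * mx
    return 1.0 / a, (3.0 * b / a) ** 0.5

# Synthetic macromolecule: Mw = 150 kDa, Rg = 5 (in the q-units used below).
MW_TRUE, RG_TRUE = 150_000.0, 5.0
q2_vals = [0.0004, 0.0009, 0.0016, 0.0025]  # scattering vectors squared
data = [(1.0 / MW_TRUE) * (1.0 + q2 * RG_TRUE**2 / 3.0) for q2 in q2_vals]
mw_fit, rg_fit = fit_zimm(q2_vals, data)  # recovers ~150000 and ~5.0
```

Real instruments add concentration extrapolation and calibrated optical constants, but the angular fit above is the core of the measurement.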

Dynamic light scattering

DLS measures temporal fluctuations in scattered light intensity to determine the hydrodynamic radius, polydispersity, and stability of macromolecules in solution.
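The route from measured fluctuations to size runs through the Stokes-Einstein relation, R_h = k_B T / (6πηD): the autocorrelation decay yields a diffusion coefficient D, which converts to a hydrodynamic radius. A minimal sketch; the diffusion coefficient below is an illustrative antibody-like value:

```python
# Stokes-Einstein conversion at the heart of DLS:
#   R_h = k_B * T / (6 * pi * eta * D)
# The diffusion coefficient used here is an illustrative value, not a
# measurement from any specific instrument.
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_radius(d_m2_per_s, temp_k=298.15, eta_pa_s=0.00089):
    """R_h in meters; viscosity defaults to water at 25 C (Pa*s)."""
    return KB * temp_k / (6.0 * math.pi * eta_pa_s * d_m2_per_s)

# D ~ 4e-11 m^2/s in water corresponds to an R_h of roughly 6 nm,
# the few-nanometer scale typical of a monomeric protein therapeutic.
r_h = hydrodynamic_radius(4e-11)
```

Because R_h scales inversely with D, aggregation (slower diffusion) shows up directly as an apparent size increase, which is why DLS is a workhorse for stability screening.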

Biopharmaceutical applications

DLS has numerous use cases in the biopharmaceutical industry, including:
Biologic formulation development
Colloidal stability analysis
Nanoparticle characterization
Quality control

Static vs. dynamic light scattering

SLS and DLS both rely on the behavior of scattered light but offer complementary data about the samples under investigation.

SLS assesses colloidal stability and offers precise values of molar mass, weight-average molecular weight, and radius of gyration for all biologic classes.

DLS is ideal for characterizing colloidal dispersions, macromolecules, and nanoparticles, and it determines the polydispersity and hydrodynamic radius of macromolecules and macromolecular complexes in solution. It does not precisely determine absolute molecular weight or molecular weight distributions, only size.

SLS and DLS are complementary methods

Combining SLS and DLS data offers a broader picture of the stability of target molecules and complexes. The two techniques are often employed together to better understand the biophysical and solution properties of all biologics.

The ARGEN platform utilizes SLS in a patented configuration to enable faster biopharmaceutical development. ARGEN complements R&D instrument portfolios and swiftly evaluates biopolymer stability landscapes during early-phase formulation development.

Utilizing ARGEN to accelerate biopolymer formulation R&D

ARGEN is a patented SLS tool used in biopolymer research to overcome critical challenges. The platform is the equivalent of having multiple static light scattering devices in one benchtop instrument.

ARGEN provides accelerated in-situ, real-time stability monitoring, manufacturing stress modeling, shelf-life determination, and rapid parallel analysis, enabling more efficient formulation development.

Its features include:
Real-time stability monitoring
Kinetics of oligomeric state transitions
Bioprocessing stress modeling
Parallel analysis
Low-temperature analysis
Versatility
Time reduction
Quality control and process development

These capabilities can create more stable and reliable biopolymer formulations, accelerating biopharma R&D efforts and reducing the time and resources spent on unsuccessful candidates.


Monday, 9 December 2024


A novel antenna design can measure faint cosmological radio-frequency signals





The Universe is approximately 13.8 billion years old. Shortly after the Big Bang, it was too hot and dense for atoms to exist; matter took the form of free electrons, protons, and light nuclei such as helium and lithium.

Radiation also coexisted with matter; we now observe it as the Cosmic Microwave Background (CMB). Distortions in the CMB's spectrum carry vital information about the early Universe.

One such distortion occurred during the Epoch of Recombination when the Universe cooled and expanded. This led to the transition of matter from a fully ionized plasma to neutral hydrogen and helium atoms. As this happened, photons were emitted in a process known as Cosmological Recombination Radiation (CRR), adding a distortion to the CMB spectrum.

It is a significant challenge to detect Cosmological Recombination Radiation (CRR), which is nine orders of magnitude fainter than the CMB. The CMB temperature is about 3 kelvin, equivalent to roughly -270°C.

CRR's detection would provide crucial confirmation of our understanding of the Universe's thermal and ionization history. It would also offer a unique opportunity to measure the abundance of helium in the Universe before stars began producing it in their cores. However, due to CRR's extraordinarily faint and elusive nature, the scientific community faces the challenge of developing sufficiently sensitive instruments to detect this radiation.

Scientists at the Raman Research Institute in Bangalore have developed a novel antenna design to perform sky measurements in the 2.5 – 4 Gigahertz (GHz) frequency range. This frequency range is considered optimal for detecting faint cosmic recombination radiation (CRR) signals, which have never been detected before. These elusive signals contain crucial information that could enhance our understanding of the Universe’s thermal and ionization history.

This unique ground-based broadband antenna can detect signals as faint as one part in 10,000.

Keerthipriya Sathish, lead author of the paper and Research Scientist at RRI, said, “For the sky measurements we plan to perform, the broadband antenna offered us the highest sensitivity compared to other antennas designed for the same bandwidth. The metric of being frequency-independent over the wideband and ensuring smooth frequency performance is unconventional, something only a custom design, such as ours, could achieve. An off-the-shelf wideband antenna won’t work.”

A fantail antenna has been proposed for detecting the CRR due to its unique design and stable frequency performance. This antenna features a dual-polarized dipole design with four arms, each shaped like a fantail. Its key advantage is that it maintains the same radiation pattern across frequencies, with only a ±1% variation in characteristics. This allows it to consistently target the same patch of the sky across its full operational bandwidth of 1.5 GHz (2.5 to 4 GHz), which is crucial for distinguishing spectral distortions from galactic foregrounds.

The antenna weighs 150 grams and measures 14 cm x 14 cm in a square box shape. It consists of a flat, low-loss dielectric substrate with the antenna etched in copper on top and an aluminum ground plate on the bottom. A thick foam layer, radio-transparent and housing the antenna’s connectors, sits between these plates, ensuring the antenna is lightweight yet robust for its purpose.

Mayuri Rao, faculty, RRI, said, “The antenna has a sensitivity of around 30 millikelvins (mK) across the 2.5-4 GHz frequency range, enabling it to detect tiny temperature variations in the sky. Even before scaling it to an array, this antenna will enable exciting first-science results once integrated with its custom receiver.”

“We plan to study a reported excess radiation in the sky from a previous experiment at 3.3 GHz, attributed to exotic physics, including Dark Matter annihilation. Such experiments with this antenna will help inform improvements in the antenna and experiment design to go all the way to the sensitivity needed for a CRR detection.”
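For scale, the quoted 30 mK sensitivity can be related to the ideal (Dicke) radiometer equation, ΔT = T_sys / √(B·τ). The system temperature below is an illustrative assumption, not a figure from the paper; a minimal sketch:

```python
import math

# Ideal (Dicke) radiometer equation: dT = T_sys / sqrt(B * tau).
# T_sys is an assumed, illustrative value; B and dT are from the article.
T_sys = 100.0        # assumed system temperature, kelvin
B = 1.5e9            # bandwidth in hertz (2.5-4 GHz)
dT_target = 0.030    # 30 mK sensitivity quoted above

# Integration time needed to average noise down to the target sensitivity
tau = (T_sys / dT_target) ** 2 / B
print(f"integration time for 30 mK: {tau * 1e3:.1f} ms")
```

Under this assumption, a 30 mK noise level is reached in well under a second of integration; pushing toward the nanokelvin-level CRR is what motivates scaling the antenna into an array.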

The authors noted, “An antenna array will be deployed in radio-quiet locations, where there is minimal or no radio frequency interference. The design of this planar antenna is such that it is easily fabricated using methods similar to those used in Printed Circuit Board (PCB) printing. Thus, this design offers high machining accuracy and consistency during replication for multiple-element arrays, is portable and easily deployable.”

#AmoPhysics, #NuclearPhysics, #AtomicNuclei, #NuclearReactions, #Radioactivity, #NuclearFission, #NuclearFusion, #NuclearEnergy, #NuclearPower, #FusionResearch, #FissionReactors, #RadioactiveDecay, #NuclearMedicine, #NuclearAstrophysics, #ParticleAcceleration, #NuclearSafety, #NuclearEngineering, #NuclearWeapons, #RadiationProtection, #NuclearPolicy, #NuclearWasteManagement


Visit: amo-physics-conferences.scifat.com/


Member Nomination link: https://x-i.me/amocon
Award Nomination link: https://x-i.me/amonom


For More Details : physics@scifat.com


Get Connected Here:
==================
Facebook : www.facebook.com/profile.php?id=100092029748922
Pinterest : in.pinterest.com/physicsresearch2000/
Youtube : www.youtube.com/channel/UCtntbI1RB0O0CFLpr575_fg
Instagram : www.instagram.com/amophysicsawards/

Saturday, 7 December 2024

Alternate Timelines Can’t Help You, Quantum Physicists Say


The multiverse offers no escape from our reality—which might be a very good thing

By George Musser






As memes go, it wasn’t particularly viral. But for a couple of hours on the morning of November 6, the term “darkest timeline” trended in Google searches, and several physicists posted musings on social media about whether we were actually in it. All the probabilities expressed in opinion polls and prediction markets had collapsed into a single definite outcome, and history went from “what might be” to “that just happened.” The two sides in this hyperpolarized U.S. presidential election had agreed on practically nothing—save for their shared belief that its outcome would be a fateful choice between two diverging trajectories for our world.




That raises rather obvious (but perhaps pointless) questions: Could a “darkest timeline” (or any other “timeline,” for that matter) be real? Somewhere out there in the great beyond, might there be a parallel world in which Kamala Harris electorally triumphed instead?

It turns out that, outside of fostering escapist sociopolitical fantasies and putting a scientific gloss on the genre of counterfactual history, the notion of alternate timelines is in fact something physicists take very seriously. The concept most famously appears in quantum mechanics, which predicts a multiplicity of outcomes—cats that are both alive and dead and all that. If a particle of light—a photon—strikes a mirror that is only partially silvered, the particle can, in a sense, both pass through and reflect off that surface—two mutually exclusive outcomes, known in physics parlance as a superposition. Only one of those possibilities will manifest itself when an observation is made, but until then, the particle juggles both possibilities simultaneously. That’s what the mathematics says—and what experiments confirm. For instance, you can create a superposition and then uncreate it by directing the light onto a second partially silvered mirror. That wouldn’t be possible unless both possibilities remained in play. Although this feature is usually framed in terms of subatomic particles, it is thought to be ubiquitous across all scales in the universe.
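The partially silvered mirror described above can be sketched numerically. Below, a 50/50 beam splitter mixes the two path amplitudes (Hadamard convention; sign conventions vary, but the physics doesn't), showing a superposition created by one splitter and undone by a second:

```python
import math

def beam_splitter(a0, a1):
    """50/50 beam splitter acting on the two path amplitudes."""
    s = 1 / math.sqrt(2)
    return s * (a0 + a1), s * (a0 - a1)

photon = (1.0, 0.0)                    # definite path: all amplitude in path 0
after_one = beam_splitter(*photon)     # equal superposition of both paths
after_two = beam_splitter(*after_one)  # second splitter "uncreates" it

print([round(a * a, 3) for a in after_one])  # [0.5, 0.5]: both paths in play
print([round(a * a, 3) for a in after_two])  # [1.0, 0.0]: definite path again
```

That the second splitter restores the original definite path only works because both amplitudes remained in play between the mirrors, which is the point the paragraph above makes.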

What supports the idea that these timelines are real, and not just imaginative fictions, is that they can “interfere” with one another, either enhancing or diminishing the probability of their occurrence. That is, something that might have happened but doesn’t has a measurable effect on what does, as if the former reaches from the shadowy realm of the possible into the world of the actual.

Consider the bomb detector that physicists Avshalom Elitzur and Lev Vaidman proposed in 1993 and that has since been demonstrated (fortunately not with real bombs): Perform the experiment with the partially silvered mirror but place a light-sensitive bomb along one of the two paths the photon can take. This blockage prevents you from uncreating the superposition to restore the traveling photon to its original state. It does so even if the bomb never goes off, indicating that the photon never touched it. The mere possibility that the photon could strike the bomb affects what happens. In theory, you could use this principle—known as counterfactual definiteness—to take x-ray images of cells without subjecting them to damaging radiation. In an emerging subject known as counterfactual quantum computing, a computer outputs a value even if you never press the “run” button.
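The bomb-detector probabilities follow from the same two-path amplitude picture. In this sketch, a bomb blocking one arm absorbs (measures) that amplitude, so only the surviving amplitude recombines at the second splitter:

```python
import math

def beam_splitter(a0, a1):
    """50/50 beam splitter acting on the two path amplitudes."""
    s = 1 / math.sqrt(2)
    return s * (a0 + a1), s * (a0 - a1)

# No bomb: the arms interfere, and the photon always exits the bright port.
bright, dark = beam_splitter(*beam_splitter(1.0, 0.0))
print(round(bright ** 2, 3), round(dark ** 2, 3))   # 1.0 0.0

# Bomb blocking arm 1: that amplitude is absorbed rather than recombined.
a0, a1 = beam_splitter(1.0, 0.0)
p_explode = a1 ** 2                   # 0.5: photon struck the bomb
out0, out1 = beam_splitter(a0, 0.0)   # surviving amplitude recombines alone
p_bright = out0 ** 2                  # 0.25: inconclusive outcome
p_dark = out1 ** 2                    # 0.25: bomb revealed, photon untouched
print(p_explode, p_bright, p_dark)
```

A click at the dark port is impossible when both arms are open, so a quarter of the time the bomb's presence is detected even though the photon never reached it.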

One way to think about counterfactual definiteness is known as the many-worlds interpretation. A photon striking a mirror causes the cosmic timeline to branch, creating one world in which the particle passes through the mirror and one in which it reflects off that surface. Each of us is stuck inside our world and therefore sees only one outcome at a time, but the other is still there, visible to an inhabitant of the alternate world. All such worlds, taken together, constitute a “multiverse.”

Whether they agree with the many-worlds interpretation or not, physicists and philosophers certainly love to argue about it. Some admire its elegance; others grouse about conceptual difficulties such as the slippery matter of what exactly constitutes a “world.” Quantum theory not only allows multiple worlds but also offers an infinity of ways to define them.

In all the debate over many worlds, though, the key insight of the idea’s originator, physicist Hugh Everett, is often forgotten. Everett developed his view in reaction to assumptions by other physicists that, because we can see only one of the possibilities of a superposition if a particle enters into that state, something must cause all the other possibilities to be discarded. In other words, some mechanism must collapse the superposition—perhaps the act of observation itself or some sporadic randomness inherent to the fabric of reality. Everett noticed a fallacy in this reasoning: it will always look as though the superposition has collapsed, even if it remains intact. The reason is that, in making our observation, we interact with the particle, and together we and it become a single combined system. Because the particle is in superposition, so are we. But we can’t tell. Everett’s fundamental point is this: We are part of the reality we seek to observe, yet no part can fully apprehend the whole, and thus our view is limited. Multiple timelines arise in the hidden recesses imposed by our very embedding within the universe.

Other branches of physics also conceive of existence as comprising forking timelines. Physicists consider counterfactuals when calculating the path of a particle; according to what they call the principle of least action, even a classical particle that exhibits no distinctively quantum effects susses out all the possibilities. In statistical physics, researchers study particles by the septillion by thinking in terms of “ensembles,” which are another kind of multiverse, spanning all the possible ways the particles can be arranged and evolve. Over time, the particles explore all possibilities open to them. We sense their machinations indirectly as the flow of heat and establishment of thermodynamic equilibrium. Going outside physics, evolutionary biologists also routinely talk about multiple timelines: If you reran the evolution of species, would things turn out the same?


All these scientific issues are rooted in a fundamental puzzle: What does it mean to be possible but not actual? Why is there something rather than something else? The physicist Paul Davies has called this the “puzzle of what exists.” It touches not just on esoteric ideas about branching timelines but also on aspects of everyday life such as causation. To say that something causes something else, there must be the possibility that the “something else” would never have happened in the first place. In astrobiologist Sara Imari Walker’s recent book on the physics of life, Life As No One Knows It, she noted that the entire observable universe doesn’t contain enough material to create every single possible small organic molecule, let alone big ones such as the DNA strands we know and love. For her, living things distinguish themselves by making molecules and other structures that are otherwise vanishingly unlikely to exist. Life blazes a path through the void of possibility space.
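Walker's combinatorial point is easy to check at the level of orders of magnitude. The numbers below are assumed round figures for scale only, not values from her book: the count of possible linear peptides 100 residues long, built from 20 amino acids, against the commonly quoted ~10^80 atoms in the observable universe:

```python
import math

# Assumed round numbers, for scale only (not figures from the book)
n_possible = 20 ** 100   # distinct linear 100-residue peptides
n_atoms = 10 ** 80       # commonly quoted atoms in the observable universe

print(n_possible > n_atoms)            # True
print(round(math.log10(n_possible)))   # ~130: fifty orders of magnitude more
```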

Perhaps some deep rule selects the actual reality from among the possible realities, but efforts to identify that principle have been serially dashed. It is hard to argue that ours is the best of all possible worlds. Nor, despite what the 19th-century philosopher Arthur Schopenhauer proclaimed, does it seem to be the worst—things could always get worse, Google searches for the “darkest timeline” notwithstanding. For many, such as philosopher David Lewis and cosmologist Max Tegmark, the most straightforward conclusion is that all possible realities exist.

The real question, then, is not whether there are other timelines; there certainly are. Rather it is why we see only one. Perhaps life or intelligence would not be possible if the branching were too evident to us. Physics is replete with such preconditions for our existence. For instance, if temporal flow did not have a directionality—an arrow of time—there could be no lasting change, no memories, no intelligence, no agency. Keeping other timelines hidden might be of similar importance. Quantum superposition may serve some specialized functions in our bodies, but otherwise it—along with any traces of alternate timelines—is dissipated in biology’s vigorous exchange of material and energy with the environment. The very nature of intelligence is to be selective; we would be paralyzed if we had to assay boundless infinitudes. Rather than holding open all possibilities, a mind must settle—at least tentatively—on one. The effort required to make that choice—and, from there, to act upon it—may be key to giving us at least the subjective feeling of free will.

So be careful what you wish for. In dark hours we may imagine alternate timelines and long for escape to another, but we seem to be inseparable from our own. Were it easier to flit between them, we might arrive only at oblivion. Like it or not, we’re stuck in this one—if we want to change it, we’ll have to do that the old-fashioned way.


George Musser is a contributing editor at Scientific American and author of Putting Ourselves Back in the Equation (2023) and Spooky Action at a Distance (2015), both published by Farrar, Straus and Giroux. Follow him on Mastodon @gmusser@mastodon.social, Bluesky @gmusser.bsky.social and Threads @georgemusserjr@threads.net


Friday, 6 December 2024

Molecular Blueprint Could Redesign Parkinson’s Disease Therapeutics




GPR6, a G protein-coupled receptor, is expressed primarily in the medium spiny neurons of the striatum, specifically within the striatopallidal pathway. This pathway is heavily affected by the loss of dopamine-producing neurons in Parkinson’s disease (PD). Targeting GPR6 is therefore a promising route to a nondopaminergic treatment of PD, one with a reduced risk of dyskinesia and other side effects. Now, a new preclinical study sheds light on the structure and function of GPR6. The insights could guide future research and the rational design of more selective GPR6-targeting drugs, with greater efficacy and fewer side effects for patients.

The findings are published in Science Signaling in an article titled, “Structural Insights into the High Basal Activity and Inverse Agonism of the Orphan Receptor GPR6 Implicated in Parkinson’s Disease.”


“GPR6 is an orphan G protein-coupled receptor with high constitutive activity found in D2-type dopamine receptor–expressing medium spiny neurons of the striatopallidal pathway, which is aberrantly hyperactivated in Parkinson’s disease,” the researchers wrote. “Here, we solved crystal structures of GPR6 without the addition of a ligand (a pseudo-apo state) and in complex with two inverse agonists, including CVN424, which improved motor symptoms in patients with Parkinson’s disease in clinical trials.”

Parkinson’s disease causes tremors, rigidity, and a loss of mobility over time, eventually leaving patients disabled. This immobility occurs due to the death of dopamine-releasing (or dopaminergic) neurons in the substantia nigra, a small but vital area of the brain that controls movement and cognition.

Treatments for Parkinson’s disease that restore dopamine levels can temporarily relieve symptoms, but no current therapy halts the underlying degeneration of these neurons. However, some promising treatments in early trials target GPR6, which is abundant in the medium spiny neurons of the striatopallidal pathway. This receptor has high basal activity, meaning it can exert biological effects even when no activating ligand is bound. Moreover, the striatopallidal neurons that express it act to suppress movement, and they tend to be abnormally active in Parkinson’s disease.


Mahta Barekatain, PhD, from the University of Southern California, and collaborators used mass spectrometry, mutagenesis, and computer models to analyze the structure of G protein-bound GPR6. The researchers discovered potential mechanisms behind the high basal activity and inverse agonism of GPR6.

They also solved the structures of G protein-bound GPR6 in complex with two compounds that suppress its basal activity (inverse agonists), including one named CVN424. CVN424 is a non-dopamine therapy that inhibits GPR6 and has shown promise in clinical trials.

The structures and results from the new study could guide the rational design of drugs that modulate GPR6 signaling.