Natural Science: Popular Science – Opening Post
I am a physics graduate, so I have a basic grounding in the natural sciences; I find reports of new scientific findings easy to follow, and I enjoy reading them. In the past, although 《中華雜誌》 was a journal of political commentary and the humanities, it occasionally introduced results from research in the natural sciences, and every year it reported news of the Nobel Prize winners. While in college I translated for 《中華》 a report on pulsars in astronomy. Around 1980, my classmate and good friend 王家堂 introduced me to the "popular science" world of high-energy physics, and I have often read books in that area ever since. I have therefore kept up my interest in physics, and in time moved naturally into cosmology. These three points are the background to this blog's frequent reposting of reports and papers on the natural sciences.
This post has been revised 3 times.
A New Account of the Origin of Life ---- Caroline Delbert
I can't quite follow what this report is saying; still, any news on where life came from is worth filing for future reference.

Scientists Say We May Have Been Wrong About the Origin of Life
Ancient evidence suggests a new twist in how we all got here.
Caroline Delbert, 10/24/25

Here's what you'll learn when you read this story:
* In a peer-reviewed analysis, scientists quantify amino acids before and after our "last universal common ancestor."
* The last universal common ancestor (LUCA) is the single life form that branched into everything since.
* Earth four billion years ago may help us check for life on one of Saturn's moons today.

Scientists are making a case for adjusting our understanding of how exactly genes first emerged. For a while, there's been a consensus about the order in which the building-block amino acids were "added" into the box of Lego pieces that build our genes. But according to genetic researchers at the University of Arizona, our previous assumptions may reflect biases in our understanding of biotic (living) versus abiotic (non-living) sources. In other words, our current working model of gene history could be undervaluing early protolife (which included forerunners like RNA and peptides) compared with what emerged with and after the beginning of life.

Our understanding of these extremely ancient times will always be incomplete, but it's important for us to keep researching early Earth. The scientists explain that any improvements in that understanding could not only allow us to know more of our own story, but also help us search for the beginnings of life elsewhere in the universe.

In this paper, published in the peer-reviewed journal Proceedings of the National Academy of Sciences, researchers led by senior author Joanna Masel and first author Sawsan Wehbi explain that vital pieces of our proteins (a.k.a. amino acids) date back four billion years—to the last universal common ancestor (LUCA) of all life on Earth.
These chains of dozens or more amino acids, called protein domains, are "like a wheel" on a car, Wehbi said in a statement: "It's a part that can be used in many different cars, and wheels have been around much longer than cars."

The group used specialized software and National Center for Biotechnology Information data to build an evolutionary (so to speak) tree of these protein domains, which were not theorized or observed until the 1970s. Our knowledge of these details has grown by leaps and bounds.

One big paradigm shift proposed by this research is the idea that we should rethink the order in which the 20 essential genetic amino acids emerged from the stew of early Earth. The scientists argue that the current model overemphasizes how often an amino acid appeared in an early life form, leading to a theory that the amino acid found in the highest saturation must have emerged first. This folds into existing research, like a 2017 paper suggesting that our amino acids represent the best of the best, not just a "frozen accident" of circumstances. In the paper, the scientists say that amino acids could even have come from different portions of young Earth, rather than from the entire thing as a uniform environment.

Tryptophan, the maligned "sleepy" amino found in Thanksgiving turkey, was a particular standout to the scientists (its letter designation is W). "[T]here is scientific consensus that W was the last of the 20 canonical amino acids to be added to the genetic code," the scientists wrote. But they found 1.2% W in the pre-LUCA data and just 0.9% after LUCA. Those values may seem small, but that's a 25% difference. Why would the last amino acid to emerge be more common before the branching of all resulting life? The team theorized that the chemical explanation might point to an even older version of the idea of genetics. As in all things evolutionary, there's no intuitive reason why any one successful thing must be the only one of its kind or family to ever exist.
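The "25% difference" between those two tryptophan frequencies can be checked in a couple of lines. A minimal sketch in Python; the 1.2% and 0.9% figures are the only inputs taken from the article, everything else is just arithmetic:

```python
# Tryptophan (W) frequencies quoted in the article:
pre_luca = 0.012    # 1.2% of residues in pre-LUCA protein domains
post_luca = 0.009   # 0.9% in post-LUCA domains

# The "25% difference" is the drop measured relative to the pre-LUCA value.
relative_drop = (pre_luca - post_luca) / pre_luca
print(f"{relative_drop:.0%}")  # 25%
```

So the quoted 25% is a relative decrease, not a difference of percentage points.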
“Stepwise construction of the current code and competition among ancient codes could have occurred simultaneously,” the scientists conclude. And, tantalizingly, “[a]ncient codes might also have used noncanonical amino acids.” These could have emerged around the alkaline hydrothermal vents that are believed to play a key role in how life began, despite the fact that the resulting life forms did not live there for long.

To apply this theory to the rest of the universe, we don’t have to go far, either. “[A]biotic synthesis of aromatic amino acids might be possible in the water–rock interface of Enceladus’s subsurface ocean,” the scientists explain. That’s only as far as Saturn. Maybe a Solar System block party is closer than we think.
Two of the World's Biggest Fault Lines Are Synchronized -- Tim Newcomb
Index: Fault Line: 斷層線; Subduction Zone: 俯沖帶, 隱沒帶; turbidite: 濁積岩

I repost this report not only because earthquakes are major natural disasters that strike without warning, but also in memory of my freshman year in the geology department.

Two of the Biggest Fault Lines in the World Are Synched Together. That Could Be Disastrous.
Tim Newcomb, 10/15/25

(See the original page for a photo of where the two fault lines meet.)

Here's what you'll learn when you read this story:
* Researchers have discovered evidence of "partial synchronization" of two of the world's most famous fault lines—the northern San Andreas Fault and the Cascadia Subduction Zone.
* The relationship between the sites means an earthquake in one zone can trigger an earthquake in the other, in a phenomenon called "stress triggering."
* The triggers can go either way, with a quake in the San Andreas Fault causing a quake in the Cascadia Subduction Zone or vice versa.

Researchers hoping to unravel the dance of earthquake movement along North America's West Coast discovered a disturbing fact: two of the world's largest fault lines sometimes work in conjunction. History, as it turns out, shows that an earthquake in either the Cascadia Subduction Zone or the San Andreas Fault can trigger an earthquake in the other.

In a new study published in the journal Geosphere, researchers from Oregon State University—led by Chris Goldfinger, a marine geologist and geophysicist—show evidence of what's called "partial synchronization" of the northern San Andreas Fault and Cascadia Subduction Zone. Partial synchronization basically means that an earthquake event in one zone has a history of triggering events in the other, and the historical evidence of "significant interaction" between the two (and the potential for more in the future) should be taken as a warning. The core piece of evidence of this relationship comes from the seabed.
The team mined 130 sediment cores (dating back 3,100 years) from the Mendocino Triple Junction—where the Cascadia Subduction Zone's Juan de Fuca plate and Gorda plate (under the North American Plate) rendezvous with the San Andreas Fault off the coast of northern California. There, the sediment layers showed unusual turbidite action (turbidites are the layers formed by marine landslides moving along the ocean floor), which is often a telltale sign of an earthquake.

A typical turbidite features coarse sediment at the bottom, with the finer silt settling at the top. At the Mendocino Triple Junction, however, that structure was flipped and "seem[ed] to be upside down with all the sand at the top," Goldfinger told Scientific American. "And as far as we know, gravity hasn't changed." That likely implies those unique turbidite formations were stacked by two earthquakes, one from each zone, in quick succession—a matter of years or even minutes apart. The study shows that eight of the turbidite beds have "substantial temporal overlap" between the Cascadia Subduction Zone and the San Andreas Fault, and that the last major synchronized earthquake event occurred around 1700.

Goldfinger compared the situation to tuning a radio. "When you tune an old radio, you're essentially causing one oscillator to vibrate at the same frequency as the other one," he said. "When these faults synchronize, one fault could tune up the other and cause earthquakes in pairs."

Just because it's been over 300 years since the last paired quake, however, doesn't mean future possibilities are eliminated. "We could expect that an earthquake on one of the faults alone would draw down the resources of the whole country to respond to it," Goldfinger told The Guardian. "If they both went off together, then you've got potentially San Francisco, Portland, Seattle, and Vancouver [B.C.]
all in an emergency situation in a compressed timeframe."

While there were only eight major events, the evidence showed the two areas were so closely linked that near-simultaneous quakes weren't so much a rarity as a rule. "In the paper we stuck to the geology instead of dwelling on the potential doom and gloom," Goldfinger said. "But it's pretty clear that if something like this happened—and we think the evidence for it is strong—we need to be prepared."
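Goldfinger's radio analogy is the classic picture of coupled oscillators. As a toy illustration only (the frequencies and coupling constant below are invented, and nothing here models actual fault mechanics), two phase oscillators in a Kuramoto-style model drift apart when weakly coupled, but lock to a constant offset once the coupling is strong enough:

```python
import math

def simulate(k, w1=1.0, w2=1.3, dt=0.01, steps=20000):
    """Integrate two coupled phase oscillators (a two-oscillator
    Kuramoto model) and return the final phase difference."""
    th1, th2 = 0.0, 0.0
    for _ in range(steps):
        d = th2 - th1  # phase difference before this step
        th1 += (w1 + k * math.sin(d)) * dt
        th2 += (w2 - k * math.sin(d)) * dt
    return th2 - th1

# Below the locking threshold |w2 - w1| / 2 = 0.15, the phase difference
# keeps growing; above it, the oscillators lock at a constant offset
# satisfying sin(offset) = (w2 - w1) / (2 * k).
print(simulate(k=0.05), simulate(k=0.5))
```

The point of the analogy is only this threshold behaviour: coupled systems with similar natural rhythms can fall into step, which is what "partial synchronization" of the two fault zones loosely evokes.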
Book Review: The Story of Astrophysics in Five Revolutions -- Mark Mortimer
The Story of Astrophysics in Five Revolutions
Mark Mortimer, 08/03/25

(See the original page for the cover of The Story of Astrophysics in Five Revolutions.)

In the northern hemisphere, we're getting on to enjoying summer time, which traditionally includes vacationing. Typically, vacations are a time to pause from work and remember life's possibilities beyond work. Now, perhaps you the vacationer want to rekindle a brief fling you had with science, or maybe begin a new science tryst. Ersilia Vaudo's book "The Story of Astrophysics in Five Revolutions" could be just the impetus necessary for such a diversion.

In "The Story of Astrophysics in Five Revolutions", Vaudo details the progress made by humankind in the science of astrophysics. The book is fairly brief considering all that we've accumulated. But her narration is that of someone sharing a passion rather than simply reciting statements. Vaudo reminds us that the study of science includes observation and emotions. For instance, just imagine, as she describes, staring up at the night sky and wondering if and when all the other galaxies will disappear due to our exponentially expanding universe. That exercise should generate a powerful emotional response!

While this book divides astrophysics into five revolutions, or chapters, I'm not seeing a need for these separations. As commonly done, the book advances in temporal steps from Galileo to Green (apologies for missing out any of the many included references). Similarly, the subject matter advances from the idea of gravity as a force propagating through an ether, on to special relativity, and up to string theory. I find it interesting to read how Vaudo shows developments that initially originate from observation, then from imagination, and finally from mathematics. Altogether, the book makes for a solid read.

Given that Vaudo works with the European Space Agency, the book is somewhat Europe-centric.
Understandably, the historical aspect of astrophysics is European, as Europe is the source of scientific knowledge for most of the Western world. Some of the more recent elements in the book include CERN, ESA's Euclid, and CUORE. Admirably, Vaudo includes the names of many previous and current researchers, as well as references to their work. These would readily serve as jumping-off points for those interested in delving deeper.

For me, this book is satisfactorily light on rigid science (e.g., no equations to be found) yet heavy on concept. Further, given its easy, pleasant prose, it makes for an enjoyable summer read. As there's not a strong connection from one line of reasoning to the next, it's also fairly easy to pick up and put down if the weather warrants. Last, note that the original text was written in Italian; kudos to the translator for producing such a joyful manuscript.

Vacationing can let your mind wander. Some let their minds wander away from the anthropocentric view and consider humans as just part of a very vast space we call the universe, or maybe a multiverse. Ersilia Vaudo, in her book "The Story of Astrophysics in Five Revolutions", presents a temporally aligned progression of our understanding of our existence within this great space. And she does so with a lightness that quells concerns of the unknown while adding a thoughtfulness to any day, vacationing or otherwise.

Mark Mortimer: After a wonderful career, I have returned to my passion: space. At Universe Today, I advocate that everyone look as far into the future as they dare and then envision how our current actions can bring about this potential utopia. Our future is what we make of today.
Four Ways to Interpret Quantum Mechanics ---- Carlo Rovelli
Four ways to interpret quantum mechanics

Editor's Note: Orthodox quantum mechanics is empirically flawless, but founded on an awkward interface between quantum systems and classical probes. In this feature, Carlo Rovelli – himself the originator of the relational interpretation – describes the major schools of thought on how to make sense of a purely quantum world.

Carlo Rovelli, 07/09/25

One hundred years after its birth, quantum mechanics is the foundation of our understanding of the physical world. Yet debates on how to interpret the theory – especially the thorny question of what happens when we make a measurement – remain as lively today as during the 1930s. The latest recognition of the fertility of studying the interpretation of quantum mechanics was the award of the 2022 Nobel Prize in Physics to Alain Aspect, John Clauser and Anton Zeilinger. The motivation for the prize pointed out that the bubbling field of quantum information, with its numerous current and potential technological applications, largely stems from the work of John Bell at CERN in the 1960s and 1970s, which in turn was motivated by the debate on the interpretation of quantum mechanics.

The majority of scientists use a textbook formulation of the theory that distinguishes the quantum system being studied from "the rest of the world" – including the measuring apparatus and the experimenter, all described in classical terms. Used in this orthodox manner, quantum theory describes how quantum systems react when probed by the rest of the world. It works flawlessly.

Sense and sensibility

The problem is that the rest of the world is quantum mechanical as well. There are of course regimes in which the behaviour of a quantum system is well approximated by classical mechanics. One may even be tempted to think that this suffices to solve the difficulty. But this leaves us in the awkward position of having a general theory of the world that only makes sense under special approximate conditions.
Can we make sense of the theory in general? Today, variants of four main ideas stand at the forefront of efforts to make quantum mechanics more conceptually robust. They are known as physical collapse, hidden variables, many worlds and relational quantum mechanics. Each appears to me to be viable a priori, but each comes with a conceptual price to pay. The latter two may be of particular interest to the high-energy community, as the first two do not appear to fit well with relativity.

Probing physical collapse

(Figure: Upper limits on a "mass proportional" physical-collapse model in which the wavefunction is localised with a length rc at a rate λ. Shaded regions are excluded by cold-atom interferometry (magenta), gravitational-wave detectors (green), cantilever experiments (blue), bulk heating constraints (cyan) and searches for spontaneous X-ray emission by the Majorana Demonstrator at the Sanford Underground Research Facility (red). Prior limits from spontaneous X-ray emission are shown in orange. The theoretical lower bounds are indicated by the solid black curve. Credit: adapted from Majorana Collab. 2022 Phys. Rev. Lett. 129 080401. See the original page for the figure.)

The idea of physical collapse is simple: we are missing a piece of the dynamics. There may exist a yet-undiscovered physical interaction that causes the wavefunction to "collapse" when the quantum system interacts with the classical world in a measurement. The idea is empirically testable. So far, all laboratory attempts to find violations of the textbook Schrödinger equation have failed (see "Probing physical collapse" figure), and some models for these hypothetical new dynamics have been ruled out by measurements.

The second possibility, hidden variables, follows on from Einstein's belief that quantum mechanics is incomplete.
It posits that the theory's predictions are exactly correct, but that there are additional variables, beyond those in the usual formulation, describing what is going on: the reason quantum predictions are probabilistic is our ignorance of these other variables. The work of John Bell shows that the dynamics of any such theory will have some degree of non-locality (see "Non-locality" image). In the non-relativistic domain there is a good example of a theory of this sort, which goes under the name of de Broglie–Bohm, or pilot-wave, theory. This theory has non-local but deterministic dynamics capable of reproducing the predictions of non-relativistic quantum-particle dynamics. As far as I am aware, all existing theories of this kind break Lorentz invariance, and the extension of hidden-variable theories to quantum-field-theoretical domains appears cumbersome.

Relativistic interpretations

Let me now come to the two ideas that are naturally closer to relativistic physics. The first is the many-worlds interpretation – a way of making sense of quantum theory without either changing its dynamics or adding extra variables. It is described in detail in this edition of CERN Courier by one of its leading contemporary proponents (see "The minimalism of many worlds"), but the main idea is the following: being a genuine quantum system, the apparatus that makes a quantum measurement does not collapse the superposition of possible measurement outcomes – it becomes a quantum superposition of the possibilities, as does any human observer.

(Image: Non-locality. In the 1960s and 1970s, John Bell reignited interest in the foundations of quantum mechanics, laying the groundwork for the now-vibrant field of quantum information. On the blackboard, entangled particles are shown emerging from a central source and travelling to distant detectors. Their spacelike separation ensures that no signal travelling at or below the speed of light could pass between them. Bell's theorem suggests that no theory based on local hidden variables alone can fully account for the correlations predicted by quantum mechanics and since confirmed by experiment. Credit: CERN PhotoLab Image 82-6-270. See the original page for the photo.)

If we observe a singular outcome, says the many-worlds interpretation, it is not because one of the probabilistic alternatives has actualised in a mysterious "quantum measurement". Rather, it is because we have split into a quantum superposition of ourselves, and we just happen to be in one of the resulting copies. The world we see around us is thus only one of the branches of a forest of parallel worlds in the overall quantum state of everything. The price to pay to make sense of quantum theory in this manner is to accept the idea that the reality we see is just a branch in a vast collection of possible worlds that include innumerable copies of ourselves.

Relational interpretations are the most recent of the four kinds mentioned. They similarly avoid physical collapse or hidden variables, but do so without multiplying worlds. They stay closer to the orthodox textbook interpretation, but with no privileged status for observers. The idea is to think of quantum theory in a manner closer to the way it was initially conceived by Born, Jordan, Heisenberg and Dirac: namely in terms of transition amplitudes between observations rather than quantum states evolving continuously in time, as emphasised by Schrödinger's wave mechanics (see "A matter of taste" image).

Observer relativity

The alternative to taking the quantum state as the fundamental entity of the theory is to focus on the information that an arbitrary system can have about another arbitrary system. This information is embodied in the physics of the apparatus: the position of its pointer variable, the trace in a bubble chamber, a person's memory or a scientist's logbook.
After a measurement, these physical quantities "have information" about the measured system, as their values are correlated with a property of the observed system. Quantum theory can be interpreted as describing the relative information that systems can have about one another. The quantum state is interpreted as a way of coding the information about a system available to another system. What looks like a multiplicity of worlds in the many-worlds interpretation becomes nothing more than a mathematical accounting of possibilities and probabilities.

(Image: A matter of taste. Relational quantum mechanics develops the perspectives of Dirac (left) and Heisenberg (centre), while the many-worlds interpretation leans more heavily on Schrödinger's (right) conception of the wave function as primitive. Credit: American Institute of Physics. See the original page for the photo.)

The relational interpretation reduces the content of the physical theory to how systems affect other systems. This is like the orthodox textbook interpretation, but made democratic. Instead of a preferred classical world, any system can play a role that is a generalisation of the Copenhagen observer. Relativity teaches us that velocity is a relative concept: an object has no velocity by itself, but only relative to another object. Similarly, quantum mechanics, interpreted in this manner, teaches us that all physical variables are relative. They are not properties of a single object, but ways in which an object affects another object.

The QBism version of the interpretation restricts its attention to observing systems that are rational agents: they can use observations and make probabilistic predictions about the future. Probability is interpreted subjectively, as the expectation of a rational agent. The relational interpretation proper does not accept this restriction: it considers the information that any system can have about any other system.
Here, "information" is understood in the simple physical sense of correlation described above. Like many worlds – to which it is not unrelated – the relational interpretation does not add new dynamics or new variables. Unlike many worlds, it does not ask us to think about parallel worlds either. The conceptual price to pay is a radical weakening of a strong form of realism: the theory does not give us a picture of a unique objective sequence of facts, but only perspectives on the reality of physical systems, and how these perspectives interact with one another. Only quantum states of a system relative to another system play a role in this interpretation.

The many-worlds interpretation is very close to this. It supplements the relational interpretation with an overall quantum state, interpreted realistically, achieving a stronger version of realism at the price of multiplying worlds. In this sense, the many-worlds and relational interpretations can be seen as two sides of the same coin.

I have only sketched here the most discussed alternatives, and have tried to be as neutral as possible in a field of lively debates in which I have my own strong bias (towards the fourth solution). Empirical testing, as I have mentioned, can only probe the physical-collapse hypothesis.

There is nothing wrong, in science, in using different pictures for the same phenomenon. Conceptual flexibility is itself a resource. Specific interpretations often turn out to be well adapted to specific problems. In quantum optics it is sometimes convenient to think that there is a wave undergoing interference, as well as a particle that follows a single trajectory guided by the wave, as in the pilot-wave hidden-variable theory. In quantum computing, it is convenient to think that different calculations are being performed in parallel in different worlds.
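Bell's non-locality result can be made concrete with a few lines of arithmetic. A minimal sketch: the singlet-state correlation E(a, b) = -cos(a - b) is the standard quantum prediction, and the four measurement angles below are the textbook CHSH choice (not taken from this article). Any theory of local hidden variables must satisfy |S| ≤ 2, while the quantum prediction reaches 2√2:

```python
import math

def E(a, b):
    # Quantum prediction for the spin correlation of a singlet pair
    # measured along directions a and b: E(a, b) = -cos(a - b).
    return -math.cos(a - b)

# Textbook CHSH settings (radians): two for Alice, two for Bob.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination: local hidden variables give |S| <= 2.
S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(S)  # ~2.828, i.e. 2*sqrt(2), violating the classical bound
```

Experiments of the kind recognised by the 2022 Nobel Prize measure exactly such combinations and find the quantum value.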
My own field of loop quantum gravity treats spacetime regions as quantum processes: here the relational interpretation merges very naturally with general relativity, because spacetime regions themselves become quantum processes affecting each other.

Richard Feynman famously wrote that "every theoretical physicist who is any good knows six or seven different theoretical representations for exactly the same physics. He knows that they are all equivalent, and that nobody is ever going to be able to decide which one is right at that level, but he keeps them in his head, hoping that they will give him different ideas for guessing." I think that this is where we are in trying to make sense of our best physical theory. We have various ways to make sense of it. We do not yet know which of these will turn out to be the most fruitful in the future.

Carlo Rovelli, Aix-Marseille University

Further reading:
C Rovelli 2021 Helgoland (Penguin).
C Rovelli 2021 arXiv:2109.09170.
A Bassi et al. 2023 arXiv:2310.14969.
A Valentini 2024 arXiv:2409.01294.
The Rule of Species Distribution and Dispersal -- D. Orf
Scientists Just Proved That All Life on Earth Follows One Simple Rule
A new study provides empirical evidence for a rule explored by biogeographers for centuries.
Darren Orf, 06/19/25

(See the original page for the photo.)

Here's what you'll learn when you read this story:
* The organization of life on Earth follows a simple, hidden rule known as "core-to-transition organization."
* Hypothesized by biogeographers for centuries, a new study finally finds empirical evidence of this phenomenon using geographic dispersion data across five separate taxa.
* This shows how a majority of species originate from "core regions," but those species suited to heat and drought often colonize areas beyond those regions.

The Earth is home to incredibly remarkable and diverse biomes that host millions of species worldwide. (George Lucas managed to create an entire galaxy far, far away for Star Wars using just the natural wonders mostly found in the state of California.) Although life appears relatively well-distributed across countries and continents—barring Antarctica, of course—a new study suggests that biodiversity isn't so much an evenly distributed blanket across the planet as it is a "core-to-transition" organization.

This is the insight gleaned from a new article—published earlier this month in the journal Nature Ecology & Evolution—analyzing how organisms are divided into biogeographical regions, or bioregions, across the planet's surface. An international team of scientists from Sweden, Spain, and the U.K. examined the global distribution maps of species across a variety of limbs on the tree of life, including amphibians, birds, dragonflies, mammals, marine rays, reptiles, and even trees. Because of this vast swath of differing types of life, the researchers expected that species distribution would vary wildly due to environmental and historical factors.
However, what they discovered is that life all around the world proliferates through a very similar process. First, there is a core area where life appears to flourish, and from there, species tend to radiate outward—hence "core-to-transition" organization. "In every bioregion, there is always a core area where most species live," Rubén Bernardo-Madrid, a co-author of the study from Umeå University, said in a press statement. "From that core, species expand into surrounding areas, but only a subset manages to persist. It seems these cores provide optimal conditions for species survival and diversification, acting as a source from which biodiversity radiates outward."

These "core" regions are immensely important, as they cover only about 30 percent of the world's surface yet contain more biodiversity than the other 70 percent. These regions likely evolved because they were originally refuges from the devastation brought on by past climatic events, such as the Last Glacial Maximum. The study also shows that, overall, species must be well adapted to heat and drought to colonize new areas beyond these core bioregions. "The predictability of the pattern and its association with environmental filters can help to better understand how biodiversity may respond to global change," Joaquín Calatayud, co-author of the study from Rey Juan Carlos University, said in a press statement.

Of course, this core-to-transition organization idea isn't a new one. Biogeographers have illustrated this phenomenon over the centuries, but this is the first time that empirical evidence has confirmed these long-standing suspicions. Understanding the relationship between life and these bioregions can help inform conservation decisions while predicting how certain species may respond to a new type of climatic uncertainty—anthropogenic climate change.
“Our core-to-transition hypothesis and results,” the authors wrote, “show that global variations in species richness can be better understood by unravelling the genesis of regional hotspots and the subsequent filtering of species to the rest of the biogeographical region.”
Is There Dark Matter in the Universe? -- M. L. Corredoira
This article and "Challenging the Big Bang Theory" (this column, 2025/06/11) can both be read as case studies for the piece "An Introduction to the Foundations of Physics" (this column, 2025/06/20).

A universe without dark matter
The trouble with the cosmological consensus
Martín López-Corredoira, 06/16/25

Editor's Note: The standard cosmological model, critics argue, is built on unobserved phenomena like dark matter, but is defended by the mainstream despite mounting contradictions. However, physicist Martín López-Corredoira argues that there is no elusive dark matter particle waiting to be found that would explain the Big Bang. Instead, we may need a patchwork of explanations: modified gravity, baryonic matter, and contextual fixes. It's time to abandon the search for a one-size-fits-all cosmic theory.

Our current picture of the cosmos is one of an expanding universe governed by gravity, born from a hot beginning and stretched uniformly across vast scales. But this vision only holds together when we invoke mysterious unseen forces -- inflation, dark energy, and more. This prevailing cosmological model -- also known as the Big Bang or Lambda-CDM (ΛCDM) -- has countless problems, but it is mainstream, and almost all the scientific and economic efforts of the community of astrophysicists and theoretical physicists dealing with the question are focused on the search for evidence that can confirm it. Other models explaining the cosmos -- for example those without a Big Bang beginning, without an inflationary period, without dark energy and dark matter -- do not receive the same attention. But if any of them are correct, the cosmos could be very different from the mainstream version we have been taught as dogma.

The existence of dark or invisible matter detectable only through its gravitational influence has been known to astronomers for a long time. In 1844, Friedrich Wilhelm Bessel argued that the observed proper motions of the stars Sirius and Procyon could be explained only in terms of the presence of faint companion stars.
In 1846, Urbain Jean Joseph Le Verrier and John Couch Adams independently predicted the existence of Neptune based on calculations of the anomalous motions of Uranus. Le Verrier later proposed the existence of the planet Vulcan to explain anomalies in the orbit of Mercury, but on that occasion he failed, because the solution was not invisible matter but a change in the laws of gravitation, as Einstein showed years later with General Relativity.

In 1933, Swiss astronomer Fritz Zwicky was studying rich clusters of galaxies -- large clusters containing hundreds or thousands of individual galaxies. He applied the virial theorem to calculate how much matter should be present based on the gravitational forces needed to hold these clusters together. There appeared to be about 60 times more matter than could be accounted for by all the visible stars and gas.

In 1939, Horace W. Babcock first showed the need for dark matter in an individual galaxy by measuring how fast stars were rotating in the outer regions of the M31 galaxy, also known as Andromeda. The rotational velocity was faster than it should have been based on visible matter. At that time, however, the majority of astronomers were not yet convinced of the need for dark matter haloes in galaxies. They became more convinced in the 1970s with the rotation curves measured by Albert Bosma using radio telescopes, and by Vera Rubin, William Kent Ford Jr., and Norbert Thonnard using optical telescopes.

Vera Rubin is the most popular name among those cited, possibly because she is a woman and there is a desperate aim of feminists within science to remake the role of some relevant women in science.
There were even repeated complaints that she was not given a Nobel Prize; and a telescope bearing her name [the “Vera Rubin Telescope”, formerly known as the Legacy Survey of Space and Time (LSST)] is the only one forbidden to go by an acronym (VRT), unlike all the rest of the telescopes in the world, in order to make visible that it carries the name of a woman. In any case, Vera Rubin was neither the first nor the only pioneer investigating the rotation curves of galaxies, and rotation curves alone were not enough to support the present-day idea of (non-baryonic) dark matter.

Cosmology has indeed played a very important role in the idea of dark matter on galactic scales. The first predictions based on cosmic microwave background radiation (CMBR) anisotropies -- tiny temperature variations -- were wrong. It was predicted in the 1960s that the relative fluctuations in temperature should be one part in a hundred or a thousand; however, fluctuations of this amplitude could not be found in the observations of the 1970s. To solve this problem, non-baryonic dark matter (baryons are the ordinary particles, such as protons and neutrons, that together with electrons make up normal matter) was introduced ad hoc.

Shortly afterwards, the connection between particle physics and the missing mass problem in galaxies arose. Many astrophysicists considered that the dark matter haloes surrounding galaxies and galaxy clusters might consist of a gas of non-baryonic particles rather than faint stars or other astrophysical objects. This was a felicitous notion for which there was no proof: nothing directly connects the problem of the amplitude of the cosmic microwave background radiation anisotropies with the rotation curves of galaxies or the missing mass in clusters, but the idea was pushed by leading cosmologists, who made it fashionable throughout the astrophysical community.
This shows a mindset that is common among astrophysicists: accepting facts only when there is a theory to support them with an explanation -- a not-so-empirical approach that has dominated the development of cosmology.

Part of the success of these non-baryonic dark matter scenarios in the haloes of galaxies was due to the good agreement of simulations of large-scale structure with the observed distributions of galaxies. At first, in the 1980s, in the attempt to fit the data using hot dark matter composed of neutrinos, the simulations showed that very large structures should form first and only later fragment into galaxy-sized haloes, which did not match the observations. But later cold dark matter models were more successful, at least on large scales. This tendency to sell a failure of a prediction as a success for a model via the ad hoc introduction of some convenient form of unknown dark matter still prevails.

That there is some dark matter, either baryonic or non-baryonic, is clear, but how much is there, and what is its nature? The success of the standard model in converting a hypothesis into a solid theory depends strongly on the answer to these unanswered questions. Some authors have been led to question the very existence of this dark matter on galactic scales, since the evidence for it is weak and the predictions do not fit the observations: cold dark matter faces a “small-scale crisis”, since some features of galaxies are very different from the predictions of the cosmological model. Issues such as predicted angular momentum in galaxies much lower than observed, cusped haloes, bars rotating too fast without signs of interaction with the halo, a lack of dynamical friction, missing satellites, central densities of dwarfs in simulations much higher than the observed ones, satellite planes, and other problems challenge the standard picture.
This has led some prominent astrophysicists, such as Pavel Kroupa, to claim that non-baryonic dark matter does not exist at all. Nonetheless, most researchers are still eagerly trying to find solutions that make the data compatible with standard models including dark matter, assuming a priori that the model “must be” correct.

Even if the rotation curves of spiral galaxies suggest that there should be some dark matter, the latest analyses of our own galaxy, the Milky Way, show that the amount of dark matter is much lower than was expected some decades ago. Other analyses, based on the distribution of velocities of stars in the vertical direction, conclude that, although there may be some dark matter, the data are also compatible with a total lack of dark matter within Newtonian gravity. In any case, no proof has been given that this putative dark matter is non-baryonic. Rotation curves in spiral galaxies can also be explained with magnetic fields, non-circular orbits in the outer disc, alternative gravity theories, or types of dark matter different from the non-baryonic cold dark matter proposed in the standard model. The most popular alternative to dark matter is the modification of the laws of gravity proposed in MOND (Modified Newtonian Dynamics), which modifies the Newtonian law at very low accelerations.

Perhaps the most severe caveat against the hypothesis of non-baryonic dark matter is that, after the great effort expended in looking for it, it has not yet been found. Microlensing surveys -- astronomical observations of invisible objects temporarily brightening background stars like a magnifying glass -- show that our galaxy’s halo contains far fewer dim stars and brown dwarfs than would be necessary to account for dark matter haloes. Neither massive black hole haloes nor intermediate-mass primordial black holes provide a consistent scenario. The nature of dark matter has been investigated, and there are no suitable astrophysical candidates.
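The MOND modification mentioned above can be stated compactly; a sketch of the standard formulation, with a_0 the empirical acceleration scale (about 1.2 × 10⁻¹⁰ m s⁻²):

```latex
% MOND replaces Newtonian dynamics below a characteristic acceleration a_0:
a\,\mu\!\left(\frac{a}{a_0}\right) = a_N,\qquad
\mu(x)\to 1 \;(x\gg 1),\qquad \mu(x)\to x \;(x\ll 1).
% Deep-MOND limit: a = \sqrt{a_N a_0}. For a circular orbit around mass M,
% a = v^2/r and a_N = G M / r^2, so
\frac{v^4}{r^2\,a_0} = \frac{G M}{r^2}
\quad\Longrightarrow\quad
v^4 = G M a_0,
% independent of radius: the rotation curve comes out flat with no dark halo.
```

The radius-independence of v in the deep-MOND limit is precisely the flat-rotation-curve behavior that dark haloes were invoked to explain.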
The other possibility is that dark matter is not concentrated in any kind of astrophysical object but in a gas of exotic non-baryonic particles. There are three main types of candidates:

1) particles predicted by the supersymmetry hypothesis, which are electrically neutral and not strongly interacting, including superpartners of neutrinos, photons, Z bosons, Higgs bosons, gravitons, and others;

2) axions, typically extremely lightweight particles about a billionth the mass of an electron, originally proposed to resolve certain problems in quantum chromodynamics; and

3) Weakly Interacting Massive Particles (WIMPs), particles that interact through the weak force.

Many attempts to search for exotic particles have also ended without success. Technologies used to directly detect a dark matter particle have failed to obtain any positive result. Usually, the scientists involved in these projects attribute their failure to detect anything to the inability of the detectors to reach the necessary low cross section of the interaction, or to the possibility that they may be 3-4 orders of magnitude below the possible flux of gamma rays emitted by dark matter, and they ask for more funding to continue to feed their illusions. This is a never-ending story, another race between Achilles and the tortoise. It will never constitute a falsification of the Lambda-CDM model, because although a successful detection would confirm the standard paradigm, non-detection is not used to discard it.

In my opinion, the problem of “dark matter” is not a single problem but a composite of many different issues within astrophysics that may have different solutions.
The idea that the same non-baryonic dark matter needed to explain the low anisotropies in the cosmic microwave background radiation is also going to solve the large-scale structure distribution, the lack of visible matter in clusters, the dispersion of velocities of their galaxies, the measurements of gravitational lensing, the rotation curves, and more, is a happy fantasy that has dominated astrophysics for the last 50 years. It would be wonderful if we also had a happy ending, with the discovery of the particles of dark matter that constitute the dark haloes of galaxies. But in the absence of that outcome, it would be prudent to bet on a combination of different elements to explain the entire set of unexplained phenomena: possibly some baryonic dark matter in some cases, possibly a modification of gravity as part of the explanation for a wide set of events, and maybe cold dark matter dominating some phenomena and hot dark matter others.

Certainly, a unified picture with a unique non-baryonic type of cold dark matter explaining everything would be a simpler and more elegant hypothesis. But reality may not conform to our aesthetic standards.

This text contains some excerpts from Chapter 3 of the book Fundamental Ideas in Cosmology. Scientific, philosophical and sociological critical perspective.

Martín López Corredoira is a Staff Researcher at the Instituto de Astrofísica de Canarias (Canary Islands, Spain) and the author of The Twilight of the Scientific Age. He holds PhDs in both Philosophy and Physics.
This post has been revised 1 time.
|
Detecting where the universe's previously unobservable normal matter has been hiding -- Robert Lea
|
Recommended: 0 |
|
Scientists find universe's missing matter while watching fast radio bursts shine through 'cosmic fog'

"Fast radio bursts shine through the fog of the intergalactic medium, and by precisely measuring how the light slows down, we can weigh that fog, even when it's too faint to see."

Robert Lea, 06/16/25

Astronomers have used fast radio bursts (FRBs) -- brief, bright radio signals from distant galaxies -- to pinpoint the location of the universe’s “missing” matter in the space between galaxies. | Credit: Jack Madden, IllustrisTNG, Ralf Konietzka, Liam Connor/CfA [See the original page for the illustration]

Half of the universe's ordinary matter was missing -- until now. Astronomers have used mysterious but powerful explosions of energy called fast radio bursts (FRBs) to detect the universe's missing "normal" matter for the first time.

This previously missing stuff isn't dark matter, the mysterious substance that accounts for around 85% of the material universe but remains invisible because it doesn't interact with light. Instead, it is ordinary matter made out of atoms (composed of baryons) that does interact with light but has until now simply been too dark to see.

Though this puzzle might not get as much attention as the dark matter conundrum -- at least we knew what this missing matter is, while the nature of dark matter is unknown -- its AWOL status has nonetheless been a frustrating problem in cosmology. The missing baryonic matter problem has persisted because this matter is spread incredibly thinly through halos that surround galaxies and in diffuse clouds that drift in the space between galaxies.

Now, a team of astronomers has discovered and accounted for this missing everyday matter by using FRBs to illuminate the wispy structures lying between us and the distant sources of these brief but powerful bursts of radio waves.
"The FRBs shine through the fog of the intergalactic medium, and by precisely measuring how the light slows down, we can weigh that fog, even when it's too faint to see," study team leader Liam Connor, a researcher at the Center for Astrophysics, Harvard & Smithsonian (CfA), said in a statement.

FRBs are FAB searchlights for missing matter

FRBs are pulses of radio waves that often last for mere milliseconds, but in this brief time they can emit as much energy as the sun radiates in 30 years. Their origins remain something of a mystery. That's because the short duration of these flashes and the fact that most occur only once make them notoriously hard to trace back to their source. Yet for some time, their potential to help "weigh" the matter between galaxies has been evident to astronomers.

Though thousands of FRBs have been discovered, not all were suitable for this purpose. That's because, to act as a gauge of the matter between the FRB and Earth, the energy burst has to have a localized point of origin with a known distance from our planet. Thus far, astronomers have only managed to perform this localization for about 100 FRBs.

Connor and colleagues, including California Institute of Technology (Caltech) assistant professor Vikram Ravi, utilized 69 FRBs from sources at distances of between 11.7 million and about 9.1 billion light-years away. The FRB from this maximum distance, FRB 20230521B, is the most distant FRB source ever discovered.

An animation shows the random appearance of fast radio bursts (FRBs) across the sky. | Credit: NRAO Outreach/T. Jarrett (IPAC/Caltech); B. Saxton, NRAO/AUI/NSF [See the original page for the animation]

Of the 69 FRBs used by the team, 39 were discovered by the Deep Synoptic Array (DSA), a network of 110 radio telescopes located at Caltech's Owens Valley Radio Observatory (OVRO). The DSA was built with the specific mission of spotting and localizing FRBs to their home galaxies. Once this had been done, instruments at Hawaii's W. M.
Keck Observatory and at the Palomar Observatory near San Diego were used to measure the distance between Earth and these FRB-source galaxies. Many of the remaining FRBs were discovered by the Australian Square Kilometre Array Pathfinder (ASKAP), a network of radio telescopes in Western Australia that has excelled in the detection and localization of FRBs since it began operations.

As FRBs pass through matter, the light that comprises them is split into different wavelengths, just as sunlight passing through a prism creates a rainbow pattern. The amount of separation between these different wavelengths can be used to determine how much matter lies in the clouds or structures that the FRBs pass through.

"It's like we're seeing the shadow of all the baryons, with FRBs as the backlight," Ravi explained. "If you see a person in front of you, you can find out a lot about them. But if you just see their shadow, you still know that they're there and roughly how big they are."

The team's results allowed them to determine that approximately 76% of the universe's normal matter lurks in the space between galaxies, known as the intergalactic medium. They found a further 15% is locked up in the vast diffuse haloes around galaxies. The remaining 9% is concentrated within the galaxies themselves, in the form of stars and cold galactic gas.

The distribution calculated by the team agrees with predictions delivered by advanced simulations of the universe and its evolution, but it represents the first observational evidence of this distribution.

The team's results could lead to a better understanding of how galaxies grow. For Ravi, however, this is just the first step toward FRBs becoming a vital tool in cosmology, aiding our understanding of the universe. The next step in this development may well be Caltech's planned radio telescope, DSA-2000.
This radio array, set to be constructed in the Nevada desert, could spot and localize as many as 10,000 FRBs every year. This should both boost our understanding of these powerful blasts of radio waves and increase their usefulness as probes of the universe's baryonic matter content.

The team's research was published on Monday (June 16) in the journal Nature Astronomy.
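The "measuring how the light slows down" in Connor's quote refers to the dispersion delay, which grows with the column density of free electrons (the dispersion measure, DM) along the line of sight. A minimal sketch of the delay scaling, assuming the standard dispersion constant of 4.149 ms GHz² pc⁻¹ cm³ (function names are my own):

```python
# Sketch: how a fast radio burst's dispersion measure (DM, in pc cm^-3) maps
# to a frequency-dependent arrival delay. The denser the intergalactic "fog"
# of electrons, the larger the delay -- which is why FRBs can be used to
# weigh gas that is too faint to see directly.

K_DM_MS = 4.149  # dispersion constant, ms GHz^2 pc^-1 cm^3 (standard value)

def delay_ms(dm: float, freq_ghz: float) -> float:
    """Arrival delay (ms) at freq_ghz relative to infinite frequency."""
    return K_DM_MS * dm / freq_ghz**2

def sweep_ms(dm: float, f_lo_ghz: float, f_hi_ghz: float) -> float:
    """Extra delay (ms) of the low-frequency band edge relative to the high edge."""
    return delay_ms(dm, f_lo_ghz) - delay_ms(dm, f_hi_ghz)

if __name__ == "__main__":
    dm = 500.0  # illustrative DM, pc cm^-3
    print(delay_ms(dm, 1.0))        # ~2074.5 ms at 1 GHz
    print(sweep_ms(dm, 0.4, 0.8))   # sweep across a 400-800 MHz observing band
```

Inverting this relation, a measured delay plus a known source distance yields the average electron density along the path, which is how the 76/15/9 percent budget above was assembled.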
This post has been revised 1 time.
|
Challenging the "Big Bang" theory -- Sarah Knapton
|
Recommended: 1 |
|
The article below and this one (this column, 2025/06/20) can be read as case studies for "A brief introduction to the foundations of physics" (that column, 2025/06/20).

Big Bang theory is wrong, claim scientists
Researchers suggest universe was formed following gravitational collapse that generated massive black hole

Sarah Knapton, 06/10/25

A black hole pictured by the Spitzer space telescope - AFP/Getty Images [See the original page for the photo]

The Big Bang theory is wrong and the universe is sitting inside a black hole, scientists have suggested.

Since the 1930s – when Belgian theoretical physicist Georges Lemaître proposed that the universe emerged from a “primeval atom” – researchers have believed that everything that exists exploded from a single point of infinite density, or singularity.

An international team of physicists, led by the University of Portsmouth’s Institute of Cosmology and Gravitation, has now suggested instead that the universe formed following a huge gravitational collapse that generated a massive black hole. Matter within the black hole was crunched down before huge amounts of stored energy caused it to bounce back like a compressed spring, creating our universe.

The new theory has been named Black Hole Universe and suggests that, rather than the universe being born from nothing, it is the continuation of a cosmic cycle. It also suggests that the edge of our universe is the event horizon of a black hole, from which light cannot escape, making it impossible for us to see beyond into our parent universe. And it implies that other black holes may also contain unseen universes.

How the universe could have been born in a black hole:

1. An accretion disc collapses to form a black hole, with matter crunched down to a highly dense point

2. Matter reaches maximum density and bounces back, forming a universe

3. The edge of the universe is actually the event horizon of a black hole, from which no light can escape

[See the original page for the illustration]

Prof Enrique Gaztañaga said: “We’ve shown that gravitational collapse does not have to end in a singularity and found that a collapsing cloud of matter can reach a high-density state and then bounce, rebounding outward into a new expanding phase.
“What emerges on the other side of the bounce is a universe remarkably like our own. Even more surprisingly, the rebound naturally produces a phase of accelerated expansion driven not by a hypothetical field but by the physics of the bounce itself.

“We now have a fully worked-out solution that shows the bounce is not only possible – it’s inevitable under the right conditions.”

Defying quantum mechanics

The Big Bang theory was based on classical physics, but scientists have struggled to make it fit with the known effects of quantum mechanics, which set a limit on how much matter can be compressed. Physicists such as Roger Penrose and Prof Stephen Hawking had suggested that gravitational collapse inside a black hole must lead to a singularity, but under the new model that does not need to happen. Matter does not need to crunch down infinitely, just enough so that it can bounce back. Unlike the Big Bang theory, the new model aligns with both the general theory of relativity and quantum physics.

Prof Gaztañaga added: “In contrast to the famous singularity theorems by Penrose and Hawking, which assume that matter can be compressed indefinitely, we rely on the fact that quantum physics places fundamental limits on how much matter can be compressed.

“The Black Hole Universe also offers a new perspective on our place in the cosmos. In this framework, our entire observable universe lies inside the interior of a black hole formed in some larger ‘parent’ universe.

“We are not special. We are not witnessing the birth of everything from nothing, but rather the continuation of a cosmic cycle – one shaped by gravity, quantum mechanics, and the deep interconnections between them.”

The theory that the universe might exist inside a black hole was first proposed in 1972 by Raj Kumar Pathria, an Indian theoretical physicist, but gained little traction. However, recent observations by the James Webb Space Telescope have reignited interest in the concept.
Anomaly of galaxies’ rotation

In March, images of early galaxies showed that two thirds were spinning clockwise, with the remaining third rotating anti-clockwise. In a random universe, the distribution should be even – so something was causing an anomaly. One explanation is that the universe was born rotating, which would occur if it had been created in the interior of a black hole.

Lior Shamir, an associate professor of computer science at Kansas State University, said: “That explanation agrees with theories such as black hole cosmology, which postulates that the entire universe is the interior of a black hole.”

Black holes form when the core of a massive star collapses under its own gravity, leading to a supernova explosion. They cannot be seen because of the strong gravity that is pulling light into the black hole’s centre. However, scientists can see the effects of its strong gravity on the stars and gases around it, and it sometimes forms an accretion disc of spiralling gas which emits X-rays.

Under the theory of black hole cosmology, each black hole could produce a new “baby universe” connected to the outside universe through an Einstein-Rosen bridge, or “wormhole”.

Scientists are hoping that the new model may be able to explain other mysteries of the universe, such as the origin of supermassive black holes, the nature of dark matter, or the formation and evolution of galaxies.

The new research was published in the journal Physical Review D.
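A long-noted coincidence often cited in discussions of black-hole cosmology: at the critical density, the Schwarzschild radius of the mass contained within the Hubble radius equals the Hubble radius itself. A rough numerical check (constants rounded; H0 = 70 km/s/Mpc is an assumed value):

```python
import math

# Compare the Schwarzschild radius of the mass enclosed within the Hubble
# radius (at critical density) to the Hubble radius itself. Algebraically the
# ratio is exactly 1; numerically it differs only by rounding of constants.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8            # speed of light, m s^-1
H0 = 70e3 / 3.086e22   # Hubble constant, s^-1 (assumed: 70 km/s/Mpc)

hubble_radius = C / H0                                   # m
rho_crit = 3 * H0**2 / (8 * math.pi * G)                 # kg m^-3
mass = rho_crit * (4 / 3) * math.pi * hubble_radius**3   # kg

schwarzschild_radius = 2 * G * mass / C**2               # m

print(schwarzschild_radius / hubble_radius)  # ~1.0
```

This coincidence does not by itself show we live inside a black hole, but it explains why the event-horizon language in the article maps so naturally onto the edge of the observable universe.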
This post has been revised 3 times.
|
Another scenario that might have endowed molecules with life -- Mindy Weisberger
|
Recommended: 1 |
|
Scientists redid an experiment that showed how life on Earth could have started. They found a new possibility

Mindy Weisberger, CNN, 03/28/25

“Microlightning” exchanges among water droplets could have sparked the building blocks of life on ancient Earth, new research finds. Here, a wave breaks on White Sand Beach on the Thai island of Koh Chang. - Frank Bienewald/LightRocket/Getty Images [See the original page for the photo]

“It’s alive! IT’S ALIVE!” In the 1931 movie “Frankenstein,” Dr. Henry Frankenstein howling his triumph was an electrifying moment in more ways than one. As massive bolts of lightning and energy crackled, Frankenstein’s monster stirred on a laboratory table, its corpse brought to life by the power of electricity. Electrical energy may also have sparked the beginnings of life on Earth billions of years ago, though with a bit less scenery-chewing than that classic film scene.

Earth is around 4.5 billion years old, and the oldest direct fossil evidence of ancient life — stromatolites, or microscopic organisms preserved in layers known as microbial mats — is about 3.5 billion years old. However, some scientists suspect life originated even earlier, emerging from accumulated organic molecules in primitive bodies of water, a mixture sometimes referred to as primordial soup.

But where did that organic material come from in the first place? Researchers decades ago proposed that lightning caused chemical reactions in ancient Earth’s oceans and spontaneously produced the organic molecules. Now, new research published March 14 in the journal Science Advances suggests that fizzes of barely visible “microlightning,” generated between charged droplets of water mist, could have been potent enough to cook up amino acids from inorganic material. Amino acids — organic molecules that combine to form proteins — are life’s most basic building blocks and would have been the first step toward the evolution of life.
“It’s recognized that an energetic catalyst was almost certainly required to facilitate some of the reactions on early Earth that led to the origin of life,” said astrobiologist and geobiologist Dr. Amy J. Williams, an associate professor in the department of geosciences at the University of Florida.

For amino acids to form, they need nitrogen atoms that can bond with carbon. Freeing up atoms from nitrogen gas requires severing powerful molecular bonds and takes an enormous amount of energy, according to Williams, who was not involved in the research. “Lightning, or in this case, microlightning, has the energy to break molecular bonds and therefore facilitate the generation of new molecules that are critical to the origin of life on Earth,” Williams told CNN in an email.

Mist and microlightning

To recreate a scenario that may have produced Earth’s first organic molecules, researchers built upon experiments from 1953, when American chemists Stanley Miller and Harold Urey concocted a gas mixture mimicking the atmosphere of ancient Earth. Miller and Urey combined ammonia (NH3), methane (CH4), hydrogen (H2) and water, enclosed their “atmosphere” inside a glass sphere and jolted it with electricity, producing simple amino acids containing carbon and nitrogen. The Miller-Urey experiment, as it is now known, supported the scientific theory of abiogenesis: that life could emerge from nonliving molecules.

For the new study, scientists revisited the 1953 experiments but directed their attention toward electrical activity on a smaller scale, said senior study author Dr. Richard Zare, the Marguerite Blake Wilbur Professor of Natural Science and professor of chemistry at Stanford University in California. Zare and his colleagues looked at electricity exchange between charged water droplets measuring between 1 micron and 20 microns in diameter. (The width of a human hair is 100 microns.) “The big droplets are positively charged.
The little droplets are negatively charged,” Zare told CNN. “When droplets that have opposite charges are close together, electrons can jump from the negatively charged droplet to the positively charged droplet.”

American chemist Stanley Miller, using original laboratory equipment, recreates the Miller-Urey experiment, which supported the scientific theory that life could emerge from nonliving molecules. - Roger Ressmeyer/Corbis/Getty Images [See the original page for the photo]

The researchers mixed ammonia, carbon dioxide, methane and nitrogen in a glass bulb, then sprayed the gases with water mist, using a high-speed camera to capture faint flashes of microlightning in the vapor. When they examined the bulb’s contents, they found organic molecules with carbon-nitrogen bonds. These included the amino acid glycine and uracil, a nucleotide base in RNA.

“We discovered no new chemistry; we have actually reproduced all the chemistry that Miller and Urey did in 1953,” Zare said. Nor did the team discover new physics, he added — the experiments were based on known principles of electrostatics. “What we have done, for the first time, is we have seen that little droplets, when they’re formed from water, actually emit light and get this spark,” Zare said. “That’s new. And that spark causes all types of chemical transformations.”

Water and life

Lightning was likely too infrequent to produce amino acids in quantities sufficient for life, researchers say. - Mariana Suarez/AFP/Getty Images [See the original page for the photo]

Lightning is a dramatic display of electrical power, but it is also sporadic and unpredictable. Even on a volatile Earth billions of years ago, lightning may have been too infrequent to produce amino acids in quantities sufficient for life — a fact that has cast doubt on such theories in the past, Zare said. Water spray, however, would have been more common than lightning.
A more likely scenario is that mist-generated microlightning constantly zapped amino acids into existence in pools and puddles, where the molecules could accumulate and form more complex molecules, eventually leading to the evolution of life. “Microdischarges between oppositely charged water microdroplets make all the organic molecules observed previously in the Miller-Urey experiment,” Zare said. “We propose that this is a new mechanism for the prebiotic synthesis of molecules that constitute the building blocks of life.”

However, even with the new findings about microlightning, questions remain about life’s origins, he added. While some scientists support the notion of electrically charged beginnings for life’s earliest building blocks, an alternative abiogenesis hypothesis proposes that Earth’s first amino acids were cooked up around hydrothermal vents on the seafloor, produced by a combination of seawater, hydrogen-rich fluids and extreme pressure. Yet another hypothesis suggests that organic molecules didn’t originate on Earth at all. Rather, they formed in space and were carried here by comets or fragments of asteroids, a process known as panspermia.

“We still don’t know the answer to this question,” Zare said. “But I think we’re closer to understanding something more about what could have happened.” Though the details of life’s origins on Earth may never be fully explained, “this study provides another avenue for the formation of molecules crucial to the origin of life,” Williams said. “Water is a ubiquitous aspect of our world, giving rise to the moniker ‘Blue Marble’ to describe the Earth from space. Perhaps the falling of water, the most crucial element that sustains us, also played a greater role in the origin of life on Earth than we previously recognized.”

Mindy Weisberger is a science writer and media producer whose work has appeared in Live Science, Scientific American and How It Works magazine.
This post has been revised 2 times.
|
Have we gotten "dark energy" completely wrong? -- Ben Turner
|
Recommended: 1 |
|
'The universe has thrown us a curveball': Largest-ever map of space reveals we might have gotten dark energy totally wrong

Ben Turner, 03/20/25

Findings from the Dark Energy Spectroscopic Instrument (DESI) suggest that dark energy could be evolving over time. If they're right, cosmology will need a new model.

Astronomers studying the largest-ever map of the cosmos have found hints that our best understanding of the universe is due for a major rewrite. The analysis, which looked at nearly 15 million galaxies and quasars spanning 11 billion years of cosmic time, found that dark energy — the presumed-to-be-constant force driving the accelerating expansion of our universe — could be weakening. Or at least this is what the data, collected by the Dark Energy Spectroscopic Instrument (DESI), suggest when combined with information taken from star explosions, the cosmic microwave background and weak gravitational lensing.

[See the original page for the video]

If the findings hold up, it means that one of the most mysterious forces controlling the fate of our universe is even weirder than first thought — and that something is very wrong with our current model of the cosmos. The researchers' findings were published in multiple papers on the preprint server arXiv and presented March 19 at the American Physical Society's Global Physics Summit in Anaheim, California, so they have not yet been peer-reviewed.

"It's true that the DESI results alone are consistent with the simplest explanation for dark energy, which would be an unchanging cosmological constant," co-author David Schlegel, a DESI project scientist at the Lawrence Berkeley National Laboratory in California, told Live Science. "But we can't ignore other data that extend to both the earlier and later universe. Combining [DESI's results] with those other data is when it gets truly weird, and it appears that this dark energy must be 'dynamic,' meaning that it changes with time."
The evolving cosmos

Dark energy and dark matter are two of the universe's most puzzling components. Together they make up roughly 95% of the cosmos, but because they do not interact with light, they can't be detected directly. Yet these components are key ingredients in the reigning Lambda cold dark matter (Lambda-CDM) model of cosmology, which maps the growth of the cosmos and predicts its end. In this model, dark matter is responsible for holding galaxies together and accounts for their otherwise inexplicably powerful gravitational pulls, while dark energy explains why the universe's expansion is accelerating. But despite countless observations of these hypothetical dark entities shaping our universe, scientists are still unsure where they came from, or what they even are.

Currently, the best theoretical explanation for dark energy comes from quantum field theory, which describes the vacuum of space as filled with a sea of fluctuating quantum fields, creating an intrinsic energy density in empty space. In the aftermath of the Big Bang, this energy increases as space expands, creating more vacuum and more energy to push the universe apart faster. This suggestion helped scientists tie dark energy to the cosmological constant — a hypothetical inflationary energy growing with the fabric of space-time throughout the universe's life. Einstein named it Lambda in his theory of general relativity.

"The problem with that theory is that the numbers don't add up," said Catherine Heymans, a professor of astrophysics at the University of Edinburgh and the Astronomer Royal for Scotland, who was not involved in the study. "If you say: 'Well, what sort of energy would I expect from this sort of vacuum?' It's very, very, very, very different from what we measure," she told Live Science. "It's kind of exciting that the universe has thrown us a curveball here," she added.
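The mismatch Heymans alludes to is the cosmological constant problem; schematically (the ~120-orders-of-magnitude figure is the commonly quoted estimate, with the naive theoretical value obtained by cutting off vacuum modes at the Planck scale):

```latex
% Naive quantum-field-theory estimate of the vacuum energy density, cut off
% at the Planck scale, versus the observed dark-energy density:
\rho_{\mathrm{vac}}^{\mathrm{QFT}} \sim \frac{c^7}{\hbar G^2} \approx 10^{113}\ \mathrm{J\,m^{-3}},
\qquad
\rho_{\Lambda}^{\mathrm{obs}} \approx 10^{-9}\ \mathrm{J\,m^{-3}},
% a mismatch of roughly 120 orders of magnitude.
```

This is the quantitative content of "the numbers don't add up": the theoretically expected vacuum energy is vastly larger than the value the accelerating expansion actually requires.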
An artist's illustration of the universe's evolution to the present day, with its expansion being driven by dark energy. (Illustration available on the original page.)

Scanning the dark universe

To figure out whether dark energy is changing over time, the astronomers turned to three years' worth of data from DESI, which is mounted on the Nicholas U. Mayall 4-meter Telescope in Arizona. DESI pinpoints the monthly positions of millions of galaxies to study how the universe expanded up to the present day. By compiling DESI's observations, which include nearly 15 million of the best-measured galaxies and quasars (ultra-bright objects powered by supermassive black holes), the researchers came up with a strange result.

Taken on their own, the telescope's observations are in "weak tension" with the Lambda-CDM model, suggesting dark energy may be losing strength as the universe ages, but without enough statistical significance to break with the model. But when paired with other observations, such as the universe's leftover light from the cosmic microwave background, supernovas, and the gravitational warping of light from distant galaxies, the likelihood that dark energy is evolving grows. In fact, it pushes the observations' disagreement with the standard model as high as 4.2 sigma, on the cusp of the five-sigma threshold physicists use as the "gold standard" for heralding a new discovery.

Whether this result will strengthen or fade as more data arrive is unclear, but astrophysicists are growing more confident that the discrepancy will not simply disappear. "These data seem to indicate that either dark energy is becoming less important today, or it was more important early in the universe," Schlegel said. Astronomers say that further answers will come from a flotilla of new experiments investigating the nature of dark matter and dark energy in our universe.
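For readers unfamiliar with "sigma" significance: a result's sigma level corresponds to the probability of seeing at least that large a fluctuation by pure chance, assuming Gaussian statistics. A minimal sketch of the conversion (using the two-sided convention; conventions vary between fields):

```python
import math

def sigma_to_p(sigma):
    """Two-sided Gaussian tail probability for a given sigma level."""
    return math.erfc(sigma / math.sqrt(2.0))

p_tension = sigma_to_p(4.2)   # the combined-data tension reported here
p_gold = sigma_to_p(5.0)      # the five-sigma "gold standard"

print(f"4.2 sigma -> p ~ {p_tension:.1e}")   # roughly a few in 100,000
print(f"5.0 sigma -> p ~ {p_gold:.1e}")      # roughly 6 in 10,000,000
```

This is why 4.2 sigma is described as being "on the cusp": the chance of such a fluke is already tiny, but still tens of times larger than the discovery threshold demands.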
These include the Euclid space telescope, NASA's Nancy Grace Roman Space Telescope, and DESI itself, which is now in its fourth of five years scanning the sky and will measure 50 million galaxies and quasars by the time it's done.

"I think it's fair to say that this result, taken at face value, appears to be the biggest hint we have about the nature of dark energy in the [rough] 25 years since we discovered it," Adam Riess, a professor of astronomy at Johns Hopkins University who won the 2011 Nobel Prize in physics for his team's 1998 discovery of dark energy, told Live Science. "If confirmed, it literally says dark energy is not what most everyone thought, a static source of energy, but perhaps something even more exotic."

Another report covering much the same findings: Dark energy isn't what we thought – and that may transform the cosmos

Ben Turner is a U.K.-based staff writer at Live Science. He covers physics and astronomy, among other topics like tech and climate change. He graduated from University College London with a degree in particle physics before training as a journalist. When he's not writing, Ben enjoys reading literature, playing the guitar and embarrassing himself with chess.
This post has been revised 2 times.