|
Natural Science: Popularization – Opening Post for the Column
|
Views: 2,984 | Replies: 18 | Recommendations: 3
|
|
I am a physics graduate, so I have a basic grounding in the natural sciences; I find reports on new scientific developments easy to follow, and I enjoy reading them.

The 《中華雜誌》, though a journal of political commentary and humanistic scholarship, occasionally introduced results from research in the natural sciences, and every year it reported on the Nobel Prize winners. While in college I translated for 《中華》 an article on pulsars in astronomy.

Around 1980, my classmate and good friend 王家堂 introduced me to the world of popular-science writing on high-energy physics, and I have often read books in that area ever since. I have thus kept up my interest in physics, which in time led me naturally into cosmology.

These three points are the background to this blog's frequent reposting of reports and papers on the natural sciences.
This post has been revised 3 times.
|
Cosmology Needs New Ideas -- R. Lea
|
'Our understanding of the universe may be incomplete': James Webb Space Telescope data suggests we need a 'new cosmic feature' to explain it all

"The discrepancy between the observed expansion rate of the universe and the predictions of the standard model suggests that our understanding of the universe may be incomplete."

Robert Lea, 12/09/24

Credit: NASA, ESA, CSA, STScI, Jose M. Diego (IFCA), Jordan C. J. D'Silva (UWA), Anton M. Koekemoer (STScI), Jake Summers (ASU), Rogier Windhorst (ASU), Haojing Yan (University of Missouri)
[See the original page for the photo]

New observations from the James Webb Space Telescope (JWST) have corroborated data from its predecessor, the Hubble Space Telescope, to determine that something is missing from our recipe of the cosmos.

The JWST conducted its largest survey yet of the accelerating expansion of the cosmos, as scientists attempt to discover why the universe is expanding faster today than our picture of its infancy, billions of years ago, says it should. Currently, scientists theorize that the accelerating expansion is caused by a placeholder element, "dark energy," but they really need to know what dark energy actually is before a conclusive explanation can be found.

JWST's survey served to cross-check observations made by Hubble that suggested a discrepancy in measurements of the rate of cosmic expansion, known as the Hubble constant. This issue has been termed the "Hubble tension," and these new findings show that errors in data from the long-serving space telescope of the same name are not responsible for it. As the Hubble tension can't be accounted for by either our best models of the universe or errors in Hubble measurements, an extra ingredient still seems to be needed in our cosmic recipe.
"The discrepancy between the observed expansion rate of the universe and the predictions of the standard model suggests that our understanding of the universe may be incomplete," team leader Adam Riess, an astrophysicist at Johns Hopkins University, said in a statement. "With two NASA flagship telescopes now confirming each other's findings, we must take this [Hubble tension] problem very seriously — it's a challenge but also an incredible opportunity to learn more about our universe."

In 2011, Riess shared the Nobel Prize in Physics for the discovery of the accelerating expansion of the universe, which is attributed to dark energy, a mysterious force driving that acceleration. This new research builds upon that Nobel Prize-winning work.

What is the Hubble tension?

Because the expansion of the universe operates on very large scales, the Hubble tension isn't something that affects our everyday lives, or even the solar system or the Milky Way. The discrepancy becomes really problematic when considering the distances between galaxies and the larger structure of the universe. That means cosmologists can't really understand the evolution of the universe until they know the cause of the Hubble tension.

The Hubble tension arises from the fact that there are two ways to calculate the Hubble constant. Scientists can use things like Type Ia supernovas or variable stars, which they call "standard candles," to measure the distances from Earth to the galaxies that host them and then determine how rapidly these galaxies are moving away. They can also use our models of cosmic evolution to "wind forward" the universe and calculate what the Hubble constant should be today.

However, when measurements of the Hubble constant are taken in the local universe, they are higher than the value predicted by working forward using the best model we have for cosmic evolution, the Lambda Cold Dark Matter (LCDM) model, also known as the Standard Model of Cosmology.
[Figure: a diagram showing the evolution of the universe according to the prevailing cold dark matter model; observations of El Gordo could throw this model into doubt. See the original page for the caption and diagram.]

The LCDM-based method gives a value for the Hubble constant of about 152,000 miles per hour per megaparsec (68 kilometers per second per megaparsec, or Mpc), while measurements based on telescope observations regularly give a higher value of between 157,000 mph per Mpc and 170,000 mph per Mpc (70 to 76 km/s/Mpc). An Mpc is equivalent to about 3.26 million light-years, or roughly 19 billion billion miles (31 billion billion kilometers), so this is a huge discrepancy, one which scientists feared was too large to be explained by uncertainties in observations.

Looks like they were right! Hubble was right!

To confirm Hubble's findings, Riess and colleagues turned to the largest sample of data collected by the JWST during its first two years of operations, which came from two different projects. To measure the Hubble constant, they used three independent methods to determine the distances to other galaxies.

First, they used so-called "Cepheid variables," pulsating stars considered the gold standard for measuring cosmic distances. The team then cross-checked this with measurements based on carbon-rich stars and the brightest red giants in the same galaxies.

The team particularly homed in on galactic distances measured by Hubble. The team's research with the JWST covered about a third of the full sample of galaxies as seen by Hubble, using the galaxy Messier 106 (M106), also known as NGC 4258 and located around 23 million light-years away in the constellation Canes Venatici, as a reference point.

[Photo: a dusty-looking section of space with orange and red streaks concentrated around a glowing greenish center. See the original page.]

This not only helped them produce the most precise local measurements of the Hubble constant to date, but it also independently verified that Hubble's distance measurements were accurate.
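The unit conversions quoted in the article are easy to check. The sketch below (plain Python; the input values of 68 and 72.6 km/s/Mpc are taken from the article) converts the Hubble constant from km/s per Mpc to mph per Mpc and sizes the tension between the predicted and locally measured values:

```python
# Convert Hubble-constant values between km/s per Mpc and mph per Mpc,
# using the figures quoted in the article.

KM_PER_MILE = 1.609344  # definition of the statute mile

def kms_per_mpc_to_mph_per_mpc(h0_km_s: float) -> float:
    """km/s per Mpc -> miles per hour per Mpc."""
    return h0_km_s * 3600.0 / KM_PER_MILE

h0_predicted = 68.0  # LCDM prediction, km/s/Mpc
h0_measured = 72.6   # JWST local measurement, km/s/Mpc

mph_predicted = kms_per_mpc_to_mph_per_mpc(h0_predicted)  # ~152,000 mph/Mpc
mph_measured = kms_per_mpc_to_mph_per_mpc(h0_measured)    # ~162,400 mph/Mpc

# The tension: the local value exceeds the early-universe prediction by ~7%.
tension_percent = 100.0 * (h0_measured - h0_predicted) / h0_predicted
```

Running this reproduces the article's rounded figures (152,000 and 162,400 mph per Mpc), a discrepancy of just under 7 percent.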
The galaxies observed by the JWST yielded a Hubble constant of around 162,400 mph per Mpc (72.6 km/s/Mpc), nearly identical to the value of 162,849 mph per Mpc (72.8 km/s/Mpc) found by Hubble for the same galaxies. This eliminates the possibility that the Hubble tension is just an artifact arising from significant bias in the long-serving space telescope's measurements.

"The JWST data is like looking at the universe in high definition for the first time and really improves the signal-to-noise of the measurements," team member and Johns Hopkins University graduate student Siyang Li said.

Of course, this means there is still a problem of Hubble tension that needs to be tackled. Johns Hopkins cosmologist Marc Kamionkowski, who was not involved with this study, thinks that solving the Hubble tension requires a new element in our models of the universe. He has an idea of what this element may be.

"One possible explanation for the Hubble tension would be if there was something missing in our understanding of the early universe, such as a new component of matter — early dark energy — that gave the universe an unexpected kick after the Big Bang," Kamionkowski said in the statement. "And there are other ideas, like funny dark matter properties, exotic particles, changing electron mass, or primordial magnetic fields that may do the trick. Theorists have license to get pretty creative."

The team's research was published on Monday (Dec. 9) in the Astrophysical Journal.

Related Stories:
— James Webb Space Telescope spies never-before-seen star behavior in distant nebula (video, photo)
— Galactic penguin honors the 2nd anniversary of James Webb Space Telescope's 1st images
— James Webb Space Telescope directly images its coldest exoplanet target yet
|
The Theory of Gravity, Corroborated by Black Hole Images -- Robert Lea
|
I repost the article below to do my small part in promoting popular-science education.

Glossary:
anatomy: here, the structure (of a black hole); also anatomy, dissection, the structure of plants and animals, the body, an analysis
black hole, mimetic: the structure and properties of a black hole in "mimetic gravity"
caveat: here, a qualifier; a warning, a caution
curvature: the degree of bending
ergosphere: the region outside a rotating black hole; the black hole's energy/matter intake zone (胡卜凱's translation)
event horizon: the event horizon; the black hole's barrier zone (胡卜凱's translation)
magnum opus: a masterpiece, a great work
gravity, mimetic: mimetic gravity
mimic: imitative, simulated, not genuine, feigned
Schwarzschild solution: the spherically symmetric vacuum solution of Einstein's field equations, describing a non-rotating black hole
singularity: the end point of spacetime (胡卜凱's translation)
singularity, naked: a naked singularity; a singularity inside a black hole that is not surrounded by an event horizon (胡卜凱's reading)
spacetime: space-time

Black hole images deliver a deathblow to alternative theory of gravity

Robert Lea, 11/26/24

Images of the supermassive black holes wouldn't have been possible if mimetic gravity was the right recipe for gravity.
[See the original page for the photo]

Researchers have examined the historic images of supermassive black holes — Sagittarius A* (Sgr A*) at the heart of the Milky Way and the black hole at the center of the galaxy Messier 87 (M87) — to rule out an alternative to our current best theory of gravity. In doing so, the team behind this research also helped to confirm the existence of dark energy and dark matter, the two most mysterious and difficult-to-explain aspects of the universe.

Despite being arguably the most ubiquitous "force" experienced by humanity, gravity hasn't necessarily been easy to explain. Newton's theory of gravity was a good early attempt and still works perfectly well for relatively small-scale calculations, but it starts to fail when considering massive objects, even struggling to explain the wobbly orbit of Mercury.

In 1915, Einstein put forward the theory of general relativity, suggesting that gravity is not a force in the traditional sense but instead arises from the curvature of space and time, united as a four-dimensional entity called "spacetime," caused by the presence of mass. The more mass an object has, the greater the curvature of spacetime and, thus, the greater the gravitational influence of that object.
One of the most remarkable aspects of general relativity is the number of concepts it predicted, including black holes and gravitational waves, that were later verified by evidence. General relativity is one of the most tested theories in science, and it has survived every experimental challenge thrown at it. It has thus supplanted Newton's theory of gravity.

General relativity isn't perfect, however. One of the major problems with Einstein's magnum opus is that cosmological theories built upon it can't account for the so-called "dark universe": dark energy, the mysterious force that drives the acceleration of the universe's expansion, and dark matter, the strange "stuff" that out-populates ordinary matter by five to one but remains effectively invisible.

The dark universe problem for Einstein

Dark energy accounts for an estimated 70% of the universe's matter/energy budget, while dark matter accounts for a further 25% of that budget. This means that everything we see in the universe around us, all the stars, planets, moons, asteroids, animals, etc., accounts for just 5% of the contents of the universe. No wonder most scientists are desperate to discover what dark energy and dark matter are.

Why the caveat "most"? That's because other scientists propose that dark matter and dark energy don't exist. Instead, they suggest that the effects we attribute to them are a consequence of the fact that general relativity isn't the "right recipe" for gravity. These researchers posit theories of "modified gravity" that do away with the need for the dark universe to exist. Some modify Newton's theory of gravity; others attempt to extend general relativity.

One of the most credible modified gravity theories is mimetic gravity, suggested in 2013 by researchers Slava Mukhanov and Ali Chamseddine.
Mimetic gravity extends general relativity, leading to the appearance of a dust-like perfect fluid that can mimic cold dark matter at the cosmological level and can explain the late-time acceleration of the cosmic expansion attributed to dark energy.

To surpass and supplant general relativity, any modified gravity theory must also explain the phenomena in the universe that conform to Einstein's 1915 theory. That is where the images of black holes come in.

[Note: the word "explain" is missing from the original text; I have supplied it to complete the grammar and the sense. It was probably a typo or an omission.]

In April 2019, when scientists from the Event Horizon Telescope (EHT) revealed the first-ever image of a black hole to the public, the supermassive black hole M87*, they expressed how surprised they were that it almost exactly conformed to the appearance of a black hole and its surroundings predicted by general relativity. This was compounded in May 2022, when the first image of "our black hole," Sgr A*, also tightly conformed to expectations and closely resembled M87*, despite the fact that the latter is much more massive than the Milky Way's supermassive black hole.

Thus, it is only natural to put theories of modified gravity up against observations of the supermassive black holes M87* and Sgr A* collected by the EHT, a global network of instruments that effectively creates a single Earth-sized telescope. That is exactly what the authors of a new paper set out to do.

"In a sense, the mere fact that we can see these images rules out mimetic gravity!" said University of Trento researcher Sunny Vagnozzi. "In short, our findings completely rule out baseline mimetic gravity, which was previously one of the least unlikely modified gravity-based models for dark matter and dark energy. In some sense, it empirically gives even more support to the fact that dark matter and dark energy may be 'real' and not the effect of modifications of gravity."

The anatomy of black holes: general relativity vs. mimetic gravity

To understand why the team's research and the EHT images are bad news for supporters of mimetic gravity, it is necessary to delve into the anatomy of black holes a little bit.

All black holes are considered to be composed of a central singularity, an inestimably small region of space with infinite density where the laws of physics fail, and an outer boundary called an "event horizon." The event horizon is the point at which the gravitational influence of the black hole becomes so great that not even light is fast enough to escape. Thus, anything that passes the event horizon of a black hole is on a one-way trip to the central singularity.

Around the event horizon is a region of space that is constantly dragged along with the rotation of the black hole due to its immense gravity. It is impossible for matter to sit still in this region, called the "ergosphere." Further out is matter whipping around the black hole at near-light speeds, causing it to glow. This matter appears as a striking golden ring in the images of M87* and Sgr A*, with the shadow of the black hole appearing at the center of each ring.

That is, if general relativity is the correct recipe for gravity and if a solution to its equations called the Schwarzschild solution accurately describes the anatomy of black holes.

Mimetic gravity has two different ideas about black holes. Vagnozzi explained that one of the two natural classes of objects in mimetic gravity is a naked singularity: a central singularity that is not bounded by a light-trapping event horizon. No event horizon would have meant no EHT image. The second possible object predicted in mimetic gravity is a so-called "mimetic black hole." If the EHT had imaged one of these objects when it snapped M87* or Sgr A*, researchers would have seen an image with a much smaller dark region at its heart than the dark region that was seen in these black hole images.
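To give a sense of the scales the EHT is testing: general relativity makes a concrete prediction for the apparent size of a black hole's shadow. For a non-rotating (Schwarzschild) black hole, the shadow's angular diameter is about 2√27 GM/(c²D). A minimal sketch follows; the masses and distances used are rough literature values for M87* and Sgr A*, not figures from the article:

```python
import math

# Physical constants (SI)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
PC = 3.086e16      # one parsec in meters
MICROARCSEC = math.radians(1 / 3600 / 1e6)  # one microarcsecond in radians

def shadow_diameter_uas(mass_solar: float, distance_pc: float) -> float:
    """Angular diameter of a Schwarzschild black hole's shadow, in microarcseconds.

    The photon capture cross-section gives a shadow of linear diameter
    2 * sqrt(27) * GM / c^2; dividing by distance gives the angular size.
    """
    linear = 2 * math.sqrt(27) * G * mass_solar * M_SUN / C**2  # meters
    return linear / (distance_pc * PC) / MICROARCSEC

# Rough literature values (assumptions, not from the article):
m87_shadow = shadow_diameter_uas(6.5e9, 16.8e6)  # M87*: ~40 microarcseconds
sgra_shadow = shadow_diameter_uas(4.3e6, 8.2e3)  # Sgr A*: ~50 microarcseconds
```

The EHT measured roughly 42 microarcseconds for M87* and roughly 52 for Sgr A*, close to these Schwarzschild estimates, which is why the images so tightly constrain alternatives whose predicted shadows are much smaller or absent.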
"We demonstrated that the naked singularity does not cast a shadow. It should not lead to an image in EHT observations," Vagnozzi said. "To use an everyday-life analogy, say EHT images are actually the reflection of ourselves we see in the mirror. If I were a mimetic naked singularity, I would look in the mirror and see no reflection. If I were a mimetic black hole, my image in the mirror would be much smaller than it actually is. This analogy is stretching it a lot, but it should give an idea of what is happening."

Vagnozzi explained that although the interpretation of EHT data to create the images of M87* and Sgr A* is a complex process with some margin of error, this possible uncertainty simply isn't significant enough for the team's conclusion to be incorrect. The researcher stresses that the team's work rules out only a "baseline" version of mimetic gravity, adding that more complex mimetic gravity theories with further adjustments to general relativity could still be possible.

"This is absolutely a demonstration of the importance of the EHT and its observations. It demonstrates that the EHT has the potential to rule out candidate theories of dark matter and dark energy which were previously completely viable," Vagnozzi said. "The takeaway message is very important: any theory that claims to explain dark matter and dark energy needs to be consistent not only with cosmological observations but also with observations of black holes, and this provides a highly non-trivial test of many such models, which may be inconsistent with EHT images. We believe this idea deserves to be explored in much more detail."

Reference: Mohsen Khodadi, Sunny Vagnozzi, and Javad T. Firouzjaee, "Event Horizon Telescope observations exclude compact objects in baseline mimetic gravity," Scientific Reports (2024). DOI: 10.1038/s41598-024-78264-y
|
Cell-Level Maps of the Human Body's Organs ------ RAMAKRISHNAN/UNGAR
|
Scientists map out the human body one cell at a time

ADITHI RAMAKRISHNAN and LAURA UNGAR, 11/21/24

This image provided by Ana-Maria Cujba shows blood vessels in a portion of the human small intestine, March 21, 2024. (Ana-Maria Cujba/Wellcome Sanger Institute via AP)
[See the original page for the image]

This image provided by Nathan Richoz shows a T cell aggregate in a human trachea biopsy on July 12, 2021, at the University of Cambridge in Cambridge, England. (Nathan Richoz/Clatworthy Lab/University of Cambridge via AP)
[See the original page for the image]

Researchers have created an early map of some of the human body's estimated 37.2 trillion cells. Each type of cell has a unique role, and knowing what all the cells do can help scientists better understand health and diseases such as cancer.

Scientists focused on certain organs — plotting the jobs of cells in the mouth, stomach and intestines, as well as cells that guide how bones and joints develop. They also explored which cells group into tissues, where they're located in the body and how they change over time.

They hope the high-resolution, open-access atlas — considered a first draft — will help researchers fight diseases that damage or corrupt human cells. "When things go wrong, they go wrong with our cells first and foremost," said Aviv Regev, co-chair of the Human Cell Atlas consortium, who was involved with the research.

The findings were published Wednesday in Nature and related journals. The group plans to release a more complete atlas in 2026, profiling cells across 18 organs and body systems, including the skin, heart, breasts and more.

The current cell map not only charts the many types of human cells but also shows the relationships of cells to each other, said Dr. Timothy Chan, a cancer expert at the Cleveland Clinic. Chan said it's a deep dive into human biology that's sure to have practical impact, such as identifying and treating cancer cells.
"Different types of cells have different Achilles' heels," said Chan, who was not involved in the studies. "This is going to be a boon" for cancer research.

Scientists are also creating other atlases that could help them learn more about the underpinnings of health and disease in specific parts of the body. With brain atlases, they're seeking to understand the structure, location and function of the many types of brain cells. A new gut microbiome atlas looks at the collection of microorganisms in the intestines, which plays a key role in digestion and immune-system health.

The Associated Press Health and Science Department receives support from the Howard Hughes Medical Institute's Science and Educational Media Group. The AP is solely responsible for all content.

ADITHI RAMAKRISHNAN is a science reporter for The Associated Press, based in New York. She covers research and new developments related to space, early human history and more. LAURA UNGAR covers medicine and science on the AP's Global Health and Science team. She has been a health journalist for more than two decades.

Related Stories:
Científicos crean mapa del cuerpo humano célula por célula
The dark energy pushing our universe apart may not be what it seems, scientists say
SpaceX launches giant Starship rocket, but aborts attempt to catch booster with mechanical arms
|
The Risks of 'Gain-of-Function' Research -- Ross Pomeroy
|
Over the past two or three years, I began noticing the term "gain-of-function" appearing in science and technology reporting. Because it is so specialized, I never bothered to spend time on it. The headline below is rather sensational, but the content is plain and easy to follow; it may be a good opportunity to get acquainted with the term.

The article mentions that the COVID-19 virus could have come from this kind of experiment, though the author stresses that the chance is small; that is worth noting. The author also implies that it could become a biological weapon, which deserves vigilance and attention.

Glossary:
airborne: here, spread through the air; also in the air, air-transported, in flight
antigenic: carrying antigens (my translation, subject to expert correction)
dank: cold and damp
gain-of-function research: research that expands a gene's function
lyssavirus: the genus Lyssavirus
pathogen: an organism that causes disease
rabies: an acute, almost invariably fatal viral disease of the nervous system
spelunking: cave exploration

The Gain-of-Function Experiment That Could 'Eliminate Humans From the Face of the Earth'

Ross Pomeroy, 06/15/24

A Google search for "Frio Cave" makes the Uvalde County, Texas destination look like a tourists' dream. One quickly learns that the cave is home to tens of millions of Mexican free-tailed bats, and that you can sometimes witness the flapping horde streaming out of their dark, dank home just before sunset, clouding the sky in a "once in a lifetime experience." But Frio Cave has a darker history that visitors' websites don't mention. More than fifty years ago, two humans contracted rabies while spelunking there.

That humans would get infected with rabies while visiting a bat-infested cave isn't altogether surprising. Bats are a reservoir for the terrifying disease, which is 99% fatal to humans once symptoms — like hyperactivity, hallucinations, seizures, and fear of water — develop. A simple bite from one of the millions of bats could have transmitted a lyssavirus that triggers rabies. However, in this instance, the spelunkers apparently weren't bitten. Rather, it seems they caught the virus from the air itself.

A team of scientists subsequently investigated. They found that rabies virus could be transmitted to animals housed in empty cages within the cave, apparently just via the atmosphere itself. Moreover, the virus was isolated from samples collected via air-condensation techniques.

The episode raised a disturbing prospect. Had rabies, the deadliest virus for humankind, gone airborne? To be clear, it had not, at least not in a manner that would result in ultra-contagious, human-to-human spread.
The sheer number of rabies-carrying bats in the cave likely transformed it into a "hot-box" of infection. Rabies is still transmitted almost entirely through bites and scratches from infected animals, and it is rapidly inactivated by sunlight and heat. However, for safety, members of the general public are now only allowed to enter Frio Cave on guided tours that remain near the mouth of the cave.

That doesn't mean that the rabies virus couldn't mutate to become transmissible through the air. It's an RNA virus, and these are known to have high mutation rates. Indeed, scientists have found "a vast array of antigenic variants of this pathogen in a wide range of animal hosts and geographic locations." Moreover, as two Italian scientists wrote in a 2021 article, "Even single amino acid mutations in the proteins of Rabies virus can considerably alter its biological characteristics, for example increasing its pathogenicity and viral spread in humans, thus making the mutated virus a tangible menace for the entire mankind."

Another possible route for this to occur would be through a "gain-of-function" experiment, in which researchers employ gene editing to tweak the rabies virus, making it evade current vaccines and endowing it with the ability to spread through the air like measles or influenza. Gain-of-function research has earned increased public scrutiny of late, as there's a small, outside chance it may have produced SARS-CoV-2, the virus that causes COVID-19.

Paul Offit, a professor of pediatrics at the Children's Hospital of Philadelphia and co-inventor of a rotavirus vaccine, commented on the potential to augment rabies through gain-of-function research in a recent Substack post: "In the absence of an effective vaccine, it could eliminate humans from the face of the earth. The good news is that no one has tried to make rabies virus more contagious. But that doesn't mean that it's not possible or that no one would be willing to try."
|
What Is the 'Three-Body Problem,' and Is It Really Unsolvable? -- Skyler Ware
|
Continuing the popular-science coverage of the "three-body (interaction) problem." See the post of 2024/04/13 in this column, as well as two other related articles, article 1 and article 2 (posted 2024/04/12 in that column).

What is the 3-body problem, and is it really unsolvable?

Skyler Ware, 06/06/24

The three-body problem is a physics conundrum that has boggled scientists since Isaac Newton's day. But what is it, why is it so hard to solve, and is the sci-fi series of the same name really possible?

The real-life "Tatooine" planet Kepler-16b orbits two suns at once, illustrating the infamous three-body problem. (Image credit: NASA/JPL-Caltech)
[See the original page for the photo]

A rocket launch. Our nearest stellar neighbor. A Netflix show. All of these things have something in common: They must contend with the "three-body problem." But exactly what is this thorny physics conundrum?

The three-body problem describes a system containing three bodies that exert gravitational forces on one another. While it may sound simple, it's a notoriously tricky problem and "the first real worry of Newton," Billy Quarles, a planetary dynamicist at Valdosta State University in Georgia, told Live Science.

In a system of only two bodies, like a planet and a star, calculating how they'll move around each other is fairly straightforward: Most of the time, those two objects will orbit roughly in a circle around their center of mass, and they'll come back to where they started each time. But add a third body, like another star, and things get a lot more complicated. The third body attracts the two orbiting each other, pulling them out of their predictable paths.

The motion of the three bodies depends on their starting state — their positions, velocities and masses. If even one of those variables changes, the resulting motion could be completely different. "I think of it as if you're walking on a mountain ridge," Shane Ross, an applied mathematician at Virginia Tech, told Live Science. "With one small change, you could either fall to the right or you could fall to the left. Those are two very close initial positions, and they could lead to very different states."
There aren't enough constraints on the motions of the bodies to solve the three-body problem with equations, Ross said. But some solutions to the three-body problem have been found. For example, if the starting conditions are just right, three bodies of equal mass could chase one another in a figure-eight pattern. Such tidy solutions are the exception, however, when it comes to real systems in space.

Certain conditions can make the three-body problem easier to parse. Consider Tatooine, Luke Skywalker's fictional home world from "Star Wars" — a single planet orbiting two suns. Those two stars and the planet make up a three-body system. But if the planet is far enough away and orbiting both stars together, it's possible to simplify the problem.

This artist's impression illustrates Kepler-16b, the first directly detected circumbinary planet, which is a planet that orbits two stars. (Image credit: NASA/JPL-Caltech)
[See the original page for the photo]

"When it's the Tatooine case, as long as you're far enough away from the central binary, then you think of this object as just being a really fat star," Quarles said. The planet doesn't exert much force on the stars because it's so much less massive, so the system becomes similar to the more easily solvable two-body problem. So far, scientists have found more than a dozen Tatooine-like exoplanets, Quarles told Live Science.

But often, the orbits of the three bodies never truly stabilize, and the three-body problem gets "solved" with a bang. The gravitational forces could cause two of the three bodies to collide, or they could fling one of the bodies out of the system forever — a possible source of "rogue planets" that don't orbit any star, Quarles said. In fact, three-body chaos may be so common in space that scientists estimate there may be 20 times as many rogue planets as there are stars in our galaxy.

When all else fails, scientists can use computers to approximate the motions of bodies in an individual three-body system.
That makes it possible to predict the motion of a rocket launched into orbit around Earth, or to predict the fate of a planet in a system with multiple stars.

With all this tumult, you might wonder if anything could survive on a planet like the one featured in Netflix's "3 Body Problem," which — spoiler alert — is trapped in a chaotic orbit around three stars in the Alpha Centauri system, our solar system's nearest neighbor. "I don't think in that type of situation, that's a stable environment for life to evolve," Ross said. That's one aspect of the show that remains firmly in the realm of science fiction.

Skyler Ware is a freelance science journalist covering chemistry, biology, paleontology and Earth science. She was a 2023 AAAS Mass Media Science and Engineering Fellow at Science News. Her work has also appeared in Science News Explores, ZME Science and Chembites, among others. Skyler has a Ph.D. in chemistry from Caltech.

Related:
Cosmic 'superbubbles' might be throwing entire galaxies into chaos, theoretical study hints
'Mathematically perfect' star system being investigated for potential alien technology
How common are Tatooine worlds? Mathematicians find 12,000 new solutions to 'unsolvable' 3-body problem
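As the article notes, when analytic solutions fail, computers can approximate an individual three-body system, and the sensitivity to initial conditions that Ross describes falls straight out of such a simulation. A minimal sketch in plain Python follows (leapfrog integration with gravitational softening; the masses and starting positions are arbitrary illustrative values, not from any real system):

```python
import math

def accelerations(pos, masses, g=1.0, soft=0.05):
    """Pairwise Newtonian gravity with a small softening term, which
    prevents numerical blow-up during close encounters."""
    n = len(pos)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = pos[j][0] - pos[i][0]
            dy = pos[j][1] - pos[i][1]
            r2 = dx * dx + dy * dy + soft * soft
            inv_r3 = 1.0 / (r2 * math.sqrt(r2))
            acc[i][0] += g * masses[j] * dx * inv_r3
            acc[i][1] += g * masses[j] * dy * inv_r3
    return acc

def integrate(pos, vel, masses, dt=1e-3, steps=2000):
    """Leapfrog (kick-drift-kick) integration; returns final positions."""
    pos = [p[:] for p in pos]
    vel = [v[:] for v in vel]
    acc = accelerations(pos, masses)
    for _ in range(steps):
        for i in range(len(pos)):
            vel[i][0] += 0.5 * dt * acc[i][0]
            vel[i][1] += 0.5 * dt * acc[i][1]
            pos[i][0] += dt * vel[i][0]
            pos[i][1] += dt * vel[i][1]
        acc = accelerations(pos, masses)
        for i in range(len(pos)):
            vel[i][0] += 0.5 * dt * acc[i][0]
            vel[i][1] += 0.5 * dt * acc[i][1]
    return pos

# Three equal masses released at rest: they fall toward one another and scatter.
masses = [1.0, 1.0, 1.0]
start = [[-1.0, 0.0], [1.0, 0.0], [0.0, 0.7]]
rest = [[0.0, 0.0], [0.0, 0.0], [0.0, 0.0]]
end_a = integrate(start, rest, masses)

# Repeat with one coordinate nudged by one part in 10^8.
nudged = [p[:] for p in start]
nudged[0][0] += 1e-8
end_b = integrate(nudged, rest, masses)

# The tiny initial difference is amplified by the dynamics, illustrating
# the "mountain ridge" sensitivity described in the article.
divergence = max(
    math.hypot(a[0] - b[0], a[1] - b[1]) for a, b in zip(end_a, end_b)
)
```

This is the brute-force approach real codes refine with adaptive step sizes and higher-order integrators; the qualitative lesson, that nearby starting states drift apart, is the same.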
|
Genes and Intellectual Disability -- Emily Cooke
|
New genetic cause of intellectual disability potentially uncovered in 'junk DNA'

Emily Cooke, 06/01/24

Mutations in "junk DNA" could be responsible for rare genetic cases of intellectual disability, new research hints.

Scientists have uncovered a rare genetic cause of intellectual disability in a historically overlooked part of the human genome: so-called junk DNA. This knowledge could someday help to diagnose some patients with these disorders, the researchers say.

An intellectual disability is a neurodevelopmental disorder that appears during childhood and is characterized by intellectual difficulties that affect people's learning, practical skills and ability to live independently. Such conditions affect approximately 6.5 million Americans. Factors such as complications during birth can trigger intellectual disabilities. However, in most cases, the disorders have an underlying genetic cause. So far, around 1,500 genes have been linked with various intellectual disabilities, but clinicians are still not always able to identify the specific cause of every patient's condition.

One possible explanation for this gap in knowledge is that previous approaches to reading DNA have focused on only a tiny portion of it: the roughly 2% of the genome that codes for proteins, known as coding DNA. The other 98% of the genome contains DNA that doesn't code for proteins. This DNA was once considered "junk DNA," but scientists are now discovering that it actually performs critical biological functions.

In a new study, published Friday (May 31) in the journal Nature Medicine, scientists used whole-genome sequencing technology to identify a rare genetic mutation within non-coding DNA that seems to contribute to intellectual disability. The team compared the whole genomes of about 5,530 people who have a diagnosed intellectual disability with those of about 46,400 people without the conditions.
These data were gathered from the U.K.-based 100,000 Genomes Project. The researchers discovered that 47 of the people with intellectual disabilities — about 0.85% — carried mutations in a gene called RNU4-2. They then validated this finding in three additional large, independent genetic databases, bringing the total number of cases to 73.

RNU4-2 doesn't code for a protein but rather for an RNA molecule, a cousin of DNA; RNA's code can either be translated into proteins or stand on its own as a functional molecule. The RNA made by RNU4-2 forms part of a molecular complex called the spliceosome, which helps refine RNA molecules after their codes are copied down from DNA by "splicing" out certain snippets of the code.

To further determine the prevalence of this new disorder, the team launched a separate analysis of the genomes of another 5,000 people in the U.K. who'd been diagnosed with "neurodevelopmental abnormality," a term that refers to any deviation from "normal" in a child's neurodevelopment. This analysis revealed that, of those 5,000 people, 21 carried mutations in RNU4-2. That made these mutations the second most common type seen in the overall group, after mutations on the X chromosome known to cause a disorder called Rett syndrome. If changes in RNU4-2 can be confirmed as a cause of intellectual disability, this finding hints that the mutations may contribute significantly to a variety of conditions.

The new study joins a second that also linked RNU4-2 to intellectual disabilities. The research has opened up "an exciting new avenue in ID [intellectual disability] research," Catherine Abbott, a professor of molecular genetics at the University of Edinburgh in the U.K. who was not involved in either study, told Live Science in an email.
"These findings reinforce the idea that ID can often result from mutations that have a cumulative downstream effect on the expression of hundreds of other genes," Abbott said. RNA molecules that don't make proteins often help control the activity of genes, turning them on or off. The findings also stress the importance of sequencing the whole genome rather than just coding DNA, she said. The scientists behind the new study say the findings could be used to diagnose certain types of intellectual disability. The team now plans to investigate the precise mechanism by which RNU4-2 causes intellectual disabilities — for now, they've only uncovered a strong correlation.

Emily Cooke is a health news writer based in London, United Kingdom. She holds a bachelor's degree in biology from Durham University and a master's degree in clinical and therapeutic neuroscience from Oxford University. She has worked in science communication, medical writing and as a local news reporter while undertaking journalism training. In 2018, she was named one of MHP Communications' 30 journalists to watch under 30. (emily.cooke@futurenet.com)
|
A New Theory of Gravity for Cosmology -- Claudia de Rham
|
|
|
|
If I have not misunderstood Professor de Rham, she is attempting to replace a concept that (at present) cannot be measured, "dark energy," with a concept that is (at present) very hard to measure, a "massive graviton," in order to build a new theory of gravity that explains the accelerating expansion of the universe. I am not a physicist, but if I have read her correctly, this piece comes across a little like wordplay.

New theory of gravity solves accelerating universe
Massive Gravity and the end of dark energy
Claudia de Rham, 05/03/24

The universe is expanding at an accelerating rate, but Einstein’s theory of General Relativity and our knowledge of particle physics predict that this shouldn’t be happening. Most cosmologists pin their hopes on Dark Energy to solve the problem. But, as Claudia de Rham argues, Einstein’s theory of gravity is incorrect over cosmic scales; her new theory of Massive Gravity limits gravity’s force in this regime, explains why acceleration is happening, and eliminates the need for Dark Energy.

The beauty of cosmology is that it often connects the infinitely small with the infinitely big – but within this beauty lies the biggest embarrassment in the history of physics. According to Einstein’s theory of General Relativity and our knowledge of particle physics, the accumulated effect of all infinitely small quantum fluctuations in the cosmos should be so dramatic that the Universe itself should be smaller than the distance between the Earth and the Moon. But as we all know, our Universe spans tens of billions of light years: it clearly stretches well beyond the moon. This is the “Cosmological Constant Problem”. Far from being only a small technical annoyance, this problem is the biggest discrepancy in the whole history of physics.

The theory of Massive Gravity, developed by my colleagues and me, seeks to address this problem. For this, one must do two things. First, we need to explain what leads to this cosmic acceleration; second, we need to explain why it leads to the observed rate of acceleration – no more, no less. Nowadays it is quite popular to address the first point by postulating a new kind of Dark Energy fluid to drive the cosmic acceleration of the Universe.
As for the second point, a popular explanation is the anthropic principle: if the Universe was accelerating at a different rate, we wouldn’t be here to ask ourselves the questions. To my mind, both these solutions are unsatisfactory. In Massive Gravity, we don’t address the first point by postulating a new form of as yet undiscovered Dark Energy but rather by relying on what we do know to exist: the quantum nature of all the fundamental particles we are made out of, and the consequent vacuum energy, which eliminates the need for dark energy. This is a natural resolution for the first point and would have been adopted by scientists a long time ago if it wasn’t for the second point: how can we ensure that the immense levels of vacuum energy that fill the Universe don’t lead to too fast an acceleration? We can address this by effectively changing the laws of gravity on cosmological scales and by constraining the effect of vacuum energy. The Higgs Boson and the Nature of Nothingness At first sight, our Universe seems to be filled with a multitude of stars within galaxies. These galaxies are gathered in clusters surrounded by puffy “clouds” of dark matter. But is that it? Is there anything in between these clusters of galaxies plugged in filaments of dark matter? Peeking directly through our instruments, most of our Universe appears to be completely empty, with empty cosmic voids stretching between clusters of galaxies. There are no galaxies, nor gas, nor dark matter nor anything else really tangible we can detect within these cosmic voids. But are they completely empty and denuded of energy? To get a better picture of what makes up “empty space,” it is useful to connect with the fundamental particles that we are made of. I still vividly remember watching the announcement of the discovery of the Higgs boson in 2012. 
By now most people have heard of this renowned particle and how it plays an important role in our knowledge of particle physics, as well as how it is responsible for giving other particles mass. However, what’s even more remarkable is that the discovery of the Higgs boson and its mechanism reveals fundamental insights into our understanding of nothingness. To put it another way, consider “empty space" as an area of space where everything has been wiped away down to the last particle. The discovery of the Higgs boson indicates that even such an ideal vacuum is never entirely empty: it is constantly bursting with quantum fluctuations of all known particles, notably that of the Higgs. This collection of quantum fluctuations I'll refer to as the “Higgs bath.” The Higgs bath works as a medium, influencing other particles swimming in it. Light or massless particles, such as photons, don’t care very much about the bath and remain unaffected. Other particles, such as the W and Z bosons that mediate the Weak Force, interact intensely with the Higgs bath and inherit a significant mass. As a result of their mass the Weak Force they mediate is fittingly weakened. Accelerating Expansion When zooming out to the limits of our observable Universe we have evidence that the Universe is expanding at an accelerating speed, a discovery that led to the 2011 Nobel Prize in Physics. This is contrary to what we would have expected if most of the energy in the Universe was localized around the habitable regions of the Universe we are used to, like clusters of galaxies within the filaments of Dark Matter. In this scenario, we would expect the gravitational attraction pulling between these masses to lead to a decelerating expansion. So what might explain our observations of acceleration? We seem to need something which fights back against gravity but isn’t strong enough to tear galaxies apart, which exists everywhere evenly and isn't diluted by the expansion of the cosmos. 
This “something” has been called Dark Energy. A different option is vacuum energy. We’ve long known that the sea of quantum fluctuations has dramatic effects on other particles, as with the Higgs Bath, so it’s natural to ask about its effect on our Universe. In fact, scientists have been examining the effect of this vacuum energy for more than a century, and long ago realized that its effects on cosmological scales should lead to an accelerated expansion of the Universe, even before we had observations that indicated that acceleration was actually happening. Now that we know the Universe is in fact accelerating, it is natural to go back to this vacuum energy and estimate the expected rate of cosmic acceleration it leads to. The bad news is that the rate of this acceleration would be way too fast. The estimated acceleration rate would be wrong by at least twenty-eight orders of magnitude! This is the “Cosmological Constant Problem” and is also referred to as the “Vacuum Catastrophe.” Is our understanding of the fundamental particles incorrect? Or are we using Einstein’s theory of General Relativity in a situation where it does not apply?

The Theory of Massive Gravity

Very few possibilities have been suggested. The one I would like to consider is that General Relativity may not be the correct description of gravity at large cosmological scales, where gravity remains untested. In Einstein’s theory of gravity, the graviton, like the photon, is massless and gravity has an infinite reach. This means objects separated by cosmological scales, all the way up to the size of the Universe, are still under the gravitational influence of each other and of the vacuum energy that fills the cosmos between them.
Even though locally the effect of this vacuum energy is small, when you consider its effect accumulated over the whole history and volume of the Universe, its impact is gargantuan, bigger than everything else we can imagine, so that the cosmos would be dominated by its overall effect. Since there is a lot of vacuum energy to take into account, this leads to a very large acceleration, much larger than what we see, which again is the Cosmological Constant Problem.

The solution my colleagues and I have suggested is that perhaps we don’t need to account for all this vacuum energy. If we only account for a small fraction of it, then it would still lead to a cosmic acceleration but with a much smaller rate, compatible with the Universe in which we live. Could it be that the gravitational connection that we share with the Earth, with the rest of the Galaxy and our local cluster only occurs because we are sufficiently close to one another? Could it be that we do not share that same gravitational connection with very distant objects, for instance with distant stars located on the other side of the Universe some 10 thousand million trillion km away? If that were the case, there would be far less vacuum energy to consider, and this would lead to a smaller cosmic acceleration, resolving the problem.

In practice, what we need to do is understand how to weaken the range of gravity. But that’s easy: nature has already shown us how to do that. We know that the Weak Force is weak and has a finite range because the W and Z bosons that carry it are massive particles. So, in principle, all we have to do is give a mass to the graviton. Just as Einstein himself tried to include a Cosmological Constant in his equations, what we need to do is add another term which acts as the mass of the graviton, dampening the dynamics of gravitational waves and limiting the range of gravity. By making the graviton massive we now have ourselves a theory of “massive gravity.” Easy!
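The range-limiting effect of a mediator mass can be illustrated with the standard Yukawa form of the potential, where the mediator's Compton wavelength sets an exponential cutoff on an otherwise 1/r force. A minimal sketch in toy units (the value of `lam` is an arbitrary assumption for illustration, not a physical estimate of the graviton's range):

```python
import math

def newtonian(r: float) -> float:
    """Massless mediator: plain 1/r potential, infinite reach."""
    return 1.0 / r

def yukawa(r: float, lam: float) -> float:
    """Massive mediator: the Compton wavelength `lam` cuts off the range."""
    return math.exp(-r / lam) / r

lam = 1.0  # toy units: assumed range of gravity
for r in (0.1, 1.0, 10.0):
    # Well inside `lam` the two agree; far beyond it the Yukawa
    # potential is exponentially suppressed.
    print(f"r={r:5.1f}  1/r={newtonian(r):8.3f}  yukawa={yukawa(r, lam):10.3e}")
```

At r = 0.1 the suppression factor is only e^(-0.1) ≈ 0.9, so local gravity is untouched, while at r = 10 it is e^(-10) ≈ 5e-5: this is the sense in which a graviton mass lets distant vacuum energy drop out of the accounting.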
The possibility that gravity could have a finite range is not a question for science-fiction. In fact Newton, Laplace and many other incredible scientists after them contemplated the possibility. Even following our development of quantum mechanics and the Standard Model, many including Pauli and Salam considered the possibility of gravitons with mass. But that possibility was always entirely refuted! Not because it potentially contradicts observations – quite the opposite, it could solve the vacuum catastrophe and explain why our Universe’s expansion is accelerating – but rather because models of massive gravity appeared to be haunted by “ghosts.” Ghosts are particles with negative energies that would cause everything we know, including you, me, the whole Universe, and possibly the structure of space and time to decay instantaneously. So if you want a theory of massive gravity you need to either find a way to get rid of these ghosts or to “trap” them. For decades, preventing these supernatural occurrences seemed inconceivable. That’s until Gregory Gabadadze, Andrew Tolley and I found a way to engineer a special kind of “ghost trap” that allowed us to trick the ghost to live in a constrained space and do no harm. One can think of this like an infinite loop-Escherian impossible staircase in which the Ghosts may exist and move but ultimately end up nowhere. Coming up with a new trick was one thing, but convincing the scientific community required even more ingenuity and mental flexibility. Even in science, there are many cultures and mathematical languages or scientific arguments that individuals prefer. So, throughout the years, whenever a new colleague had a different point of view, we were bound to learn their language, translate, and adjust our reasoning to their way of thinking. We had to repeat the process for years until no stone remained unturned. 
Overcoming that process was never the goal, though: it was only the start of the journey that would allow us to test our new theory of gravity. If gravitons have a mass, this mass should be tiny, smaller than the mass of all the other massive particles, even lighter than the neutrino. Consequently, detecting it may not be straightforward. Nevertheless, different features will appear which may make it possible to measure it. The most promising way to test for massive gravity involves observations of gravitational waves. If gravitons are massive, then we’d expect that low-frequency gravitational waves will travel ever so slightly slower than high-frequency ones. Unfortunately, this difference would be too slight to measure with current ground-based observatories. However, we should have better luck with future observatories. Missions like the Pulsar Timing Array, LISA and the Simons Observatory will detect gravitational waves with smaller and smaller frequencies, making possible the observations we need.

Whether the massive gravity theory developed by my collaborators and me will survive future tests is of course presently unknown, but the possibility is now open. After all, even if the outcome isn’t certain, when it comes to challenging the biggest discrepancy of the whole history of science, addressing the Cosmological Constant Problem, eliminating the need for dark energy, and reconciling the effect of vacuum energy with the evolution of the Universe, some risks may be worth taking.

Claudia de Rham is a theoretical physicist working at the interface of gravity, cosmology and particle physics at Imperial College London. She recently published her first book, The Beauty of Falling: A Life in Pursuit of Gravity, with Princeton University Press. This article is presented in partnership with Closer To Truth, an esteemed partner for the 2024 HowTheLightGetsIn Hay Festival.
Dive deeper into the profound questions of the universe with thousands of video interviews, essays, and full episodes of the long-running TV show at their website: www.closertotruth.com. You can see Claudia de Rham live, debating in ‘Dark Energy and The Universe’ alongside Priya Natarajan and Chris Lintott and ‘Faster Than Light’ with Tim Maudlin and João Magueijo at the upcoming HowTheLightGetsIn Festival on May 24th-27th in Hay-on-Wye.
|
Unravelling the Mystery of Life's Origin: Five Breakthrough Discoveries from the Past Five Years – S. Jordan/L. G. de Chalonge
|
|
|
|
Unravelling life’s origin: five key breakthroughs from the past five years Seán Jordan/Louise Gillet de Chalonge, 05/02/24 There is still so much we don’t understand about the origin of life on Earth. The definition of life itself is a source of debate among scientists, but most researchers agree on the fundamental ingredients of a living cell. Water, energy, and a few essential elements are the prerequisites for cells to emerge. However, the exact details of how this happens remain a mystery. Recent research has focused on trying to recreate in the lab the chemical reactions that constitute life as we know it, in conditions plausible for early Earth (around 4 billion years ago). Experiments have grown in complexity, thanks to technological progress and a better understanding of what early Earth conditions were like. However, far from bringing scientists together and settling the debate, the rise of experimental work has led to many contradictory theories. Some scientists think that life emerged in deep-sea hydrothermal vents, where the conditions provided the necessary energy. Others argue that hot springs on land would have provided a better setting because they are more likely to hold organic molecules from meteorites. These are just two possibilities which are being investigated. Here are five of the most remarkable discoveries over the last five years. Reactions in early cells What energy source drove the chemical reactions at the origin of life? This is the mystery that a research team in Germany has sought to unravel. The team delved into the feasibility of 402 reactions known to create some of the essential components of life, such as nucleotides (a building block of DNA and RNA). They did this using some of the most common elements that could have been found on the early Earth. These reactions, present in modern cells, are also believed to be the core metabolism of LUCA, the last universal common ancestor, a single-cell, bacterium-like organism. 
For each reaction, they calculated the changes in free energy, which determines if a reaction can go forward without other external sources of energy. What is fascinating is that many of these reactions were independent of external influences like adenosine triphosphate, a universal source of energy in living cells. The synthesis of life’s fundamental building blocks didn’t need an external energy boost: it was self-sustaining.

Volcanic glass

Life relies on molecules to store and convey information. Scientists think that RNA (ribonucleic acid) strands were precursors to DNA in fulfilling this role, since their structure is simpler. The emergence of RNA on our planet has long puzzled researchers. However, some progress has been made recently. In 2022, a team of collaborators in the US generated stable RNA strands in the lab. They did it by passing nucleotides through volcanic glass. The strands they made were long enough to store and transfer information. Volcanic glass was present on the early Earth, thanks to frequent meteorite impacts coupled with high volcanic activity. The nucleotides used in the study are also believed to have been present at that time in Earth’s history. Volcanic rocks could have facilitated the chemical reactions that assembled nucleotides into RNA chains.

Hydrothermal vents

Carbon fixation is a process in which CO₂ gains electrons. It is necessary to build the molecules that form the basis of life. An electron donor is necessary to drive this reaction. On the early Earth, H₂ could have been the electron donor. In 2020, a team of collaborators showed that this reaction could spontaneously occur and be fuelled by environmental conditions similar to deep-sea alkaline hydrothermal vents in the early ocean. They did this using microfluidic technology, devices that manipulate tiny volumes of liquids to perform experiments by simulating alkaline vents.
This pathway is strikingly similar to how many modern bacterial and archaeal cells (single-cell organisms without a nucleus) operate.

The Krebs Cycle

In modern cells, carbon fixation is followed by a cascade of chemical reactions that assemble or break down molecules, in intricate metabolic networks that are driven by enzymes. But scientists are still debating how metabolic reactions unfolded before the emergence and evolution of those enzymes. In 2019, a team from the University of Strasbourg in France made a breakthrough. They showed that ferrous iron, a type of iron that was abundant in early Earth’s crust and ocean, could drive nine out of 11 steps of the Krebs Cycle. The Krebs Cycle is a biological pathway present in many living cells. Here, ferrous iron acted as the electron donor for carbon fixation, which drove the cascade of reactions. The reactions produced all five of the universal metabolic precursors – five molecules that are fundamental across various metabolic pathways in all living organisms.

Building blocks of ancient cell membranes

Understanding the formation of life’s building blocks and their intricate reactions is a big step forward in comprehending the emergence of life. However, whether they unfolded in hot springs on land or in the deep sea, these reactions would not have gone far without a cell membrane. Cell membranes play an active role in the biochemistry of a primitive cell and its connection with the environment. Modern cell membranes are mostly composed of compounds called phospholipids, which contain a hydrophilic head and two hydrophobic tails. They are structured in bilayers, with the hydrophilic heads pointing outward and the hydrophobic tails pointing inward. Research has shown that some components of phospholipids, such as the fatty acids that constitute the tails, can self-assemble into those bilayer membranes in a range of environmental conditions. But were these fatty acids present on the early Earth?
Recent research from Newcastle University, UK gives an interesting answer. Researchers recreated the spontaneous formation of these molecules by combining H₂-rich fluids, likely present in ancient alkaline hydrothermal vents, with CO₂-rich water resembling the early ocean. This breakthrough aligns with the hypothesis that stable fatty acid membranes could have originated in alkaline hydrothermal vents, potentially progressing into living cells. The authors speculated that similar chemical reactions might unfold in the subsurface oceans of icy moons, which are thought to have hydrothermal vents similar to terrestrial ones. Each of these discoveries adds a new piece to the puzzle of the origin of life. Regardless of which ones are proved correct, contrasting theories are fuelling the search for answers. As Charles Darwin wrote: False facts are highly injurious to the progress of science for they often long endure: but false views, if supported by some evidence, do little harm, for everyone takes a salutary pleasure in proving their falseness; and when this is done, one path towards error is closed and the road to truth is often at the same time opened. Seán Jordan, Associate professor, Dublin City University Louise Gillet de Chalonge, PhD Student in Astrobiology, Dublin City University
|
The Three-Body Problem: From Astrophysics to Human Relationships – Avi Loeb
|
|
|
|
Probably because my own grasp falls short, I finished this article feeling thoroughly baffled. The title mentions "human interactions," yet the text that actually touches on that topic amounts to at most an eighth of the whole piece. Dr. Loeb's purpose in writing it was perhaps simply to deliver the following line:

Four years ago, I recommended this novel to the creators of a new series on Netflix, which recently came to fruition.

Mr. Liu Cixin's novel "The Three-Body Problem" and the Netflix series adapted from it were also cited in this column's 04/12/24 post about the Cultural Revolution, as the lead-in to that talk program. So although I have not worked out the main point the author wants to make, I am reposting the piece anyway, to join in the fun and to give Mr. Liu and Netflix a bit of free publicity.

THE THREE-BODY PROBLEM: FROM CELESTIAL MECHANICS TO HUMAN INTERACTIONS
AVI LOEB, 04/04/24

There are striking analogies between the interpersonal relationships of humans and the gravitational interaction of physical bodies in space. Consider a two-body system. In both realms, the systems can have stable configurations, leading to long-lived marriages or stellar binaries. But when a third body interacts strongly with these systems, a non-hierarchical three-body system often displays chaos, with one of the members ejected and the other two remaining bound. This brings up analogies with interpersonal relationships when a third body is added to a non-hierarchical two-body system.

The chaotic gravitational dynamics in a system of three stars inspired the storyline for the novel “The Three-Body Problem” by the Chinese science fiction writer Cixin Liu. The book describes a planet in the triple star system Alpha Centauri, whose unpredictable chaotic dynamics motivate a civilization born there to travel towards Earth, which possesses a stable orbit around the Sun. Four years ago, I recommended this novel to the creators of a new series on Netflix, which recently came to fruition.

The restricted three-body problem involves a stable orbit of two large bodies accompanied by a small third body. In this case, the satellite resembles a child living with two parents, a configuration that often, but not always, displays stability. In 1975, the Scottish astronomer Douglas C. Heggie wrote a paper in which he simulated the evolution of pairs of stars embedded in a star cluster.
Heggie compared the binding energy per unit mass of each stellar binary to the characteristic energy of the background cluster members. He found that binaries which are more tightly bound than the background average tend to get tighter as a result of interactions with the background stars. Conversely, binaries which are more loosely bound than the background get wider and eventually detach. This resulted in Heggie’s law: “Hard binaries get harder, and soft binaries get softer.” This law rings a bell regarding married couples in a closed society of background people who interact intensely with them.

The above-mentioned analogies are surprising, given that gravity is attractive, whereas human interactions are both attractive and repulsive. In electromagnetism, charges of equal sign repel each other, whereas charges of opposite sign are attracted to each other. This is different from human interactions, where people with aligned views are attracted to each other, and those with opposite views repel each other.

The main difference between a collection of charged particles, a so-called plasma, and a collection of gravitating bodies is that electric interactions can be screened. An embedded charge tends to attract opposite charges around it, resulting in the so-called Debye sphere, outside of which this charge has no influence. The neutralization of embedded charges makes a plasma behave like a neutral fluid on scales much larger than the Debye scale. In contrast, gravity cannot be screened because all known gravitating masses are positive. The long-range nature of gravity, with no screening, allows it to dominate the evolution of the Universe. All other known forces, including electromagnetism and the weak and strong interactions, are much stronger than gravity on small scales, but they do not reach the cosmic scales on which gravity is most effective.
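The screening radius described above has a standard quantitative form, the Debye length, λ_D = sqrt(ε₀ k_B T / (n e²)), beyond which an embedded charge's influence is neutralized. A quick sketch; the density and temperature used are illustrative assumptions (roughly solar-wind-like), not values from the article:

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m
KB   = 1.381e-23   # Boltzmann constant, J/K
E    = 1.602e-19   # elementary charge, C

def debye_length(n_e: float, t_e: float) -> float:
    """Debye screening length in meters for electron density n_e (m^-3)
    and electron temperature t_e (K)."""
    return math.sqrt(EPS0 * KB * t_e / (n_e * E ** 2))

# Assumed, illustrative plasma parameters
print(f"{debye_length(5e6, 1e5):.1f} m")   # on the order of 10 m here
```

Beyond a few such lengths the plasma is effectively neutral, which is exactly the behavior Loeb contrasts with unscreenable gravity.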
Another difference between a plasma and a collection of gravitating bodies is that the latter is dynamically unstable. The core of a star cluster, with more binding energy per star than its envelope, tends to transport energy outwards, just like a hot object embedded in a cold environment. As energy is drained from the core, it condenses to a higher density where it becomes even “hotter”. This results in a gravothermal instability, during which the collapse process accelerates as the interaction time among the stars gets shorter as the cluster core gets denser.

In 1957, the Austrian-British astrophysicist Hermann Bondi wrote a paper in which he considered the existence of negative masses in Albert Einstein’s theory of gravity. A negative mass would repel a positive mass away from it and attract another negative mass towards it. Given that, a pair of positive and negative masses of equal magnitude could accelerate together up to the speed of light. The negative mass would push away the positive mass, which in turn would pull the negative mass along for the ride. The runaway pair would accelerate indefinitely without any need for fuel or a propulsion system. Energy conservation would not be violated because the sum of the two masses is zero.

Does the real Universe contain runaway pairs of positive and negative masses that accelerate close to the speed of light over billions of years? A runaway pair of equal and opposite-sign masses would not exert a net gravitational influence at large distances because the two components sum to a zero total mass. However, if the runaway pair passes close to a gravitational wave observatory, like LIGO-Virgo-KAGRA, it could induce a brief gravitational signal that could be detected at distances comparable to the separation between the positive and negative masses. The signal would be characterized by a pulse of gravitational attraction followed by a pulse of gravitational repulsion, or the other way around.
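Bondi's runaway pair is easy to sanity-check in Newtonian toy units: give two bodies equal and opposite masses and both accelerate in the same direction, so their separation stays fixed while the pair speeds up with no fuel. A minimal one-dimensional sketch (all unit values are arbitrary assumptions):

```python
G = 1.0                      # toy units
m_pos, m_neg = 1.0, -1.0     # equal-magnitude, opposite-sign masses
x_pos, x_neg = 1.0, 0.0      # positive mass leads the negative one
v_pos = v_neg = 0.0
dt = 0.01

for _ in range(1000):
    r = x_pos - x_neg
    # Newtonian acceleration of each body due to the other:
    # the positive mass is repelled (pushed ahead) by the negative mass,
    # while the negative mass is attracted (pulled along) by the positive one.
    a_pos = G * m_neg * (x_neg - x_pos) / abs(r) ** 3
    a_neg = G * m_pos * (x_pos - x_neg) / abs(r) ** 3
    v_pos += a_pos * dt
    v_neg += a_neg * dt
    x_pos += v_pos * dt
    x_neg += v_neg * dt

# Separation is essentially unchanged while both velocities grew identically
print(x_pos - x_neg, v_pos, v_neg)
```

Because the two accelerations are identical at every step, the pair coasts off together indefinitely; with relativistic dynamics the runaway would asymptote to the speed of light, as Bondi noted, and energy bookkeeping balances because the total mass is zero.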
Given that the net gravitational effect of runaway pairs is zero, they have no effect on the mass budget of the Universe or its expansion history. However, it would be intriguing to search for them. If we ever find material with a negative mass, we could use it for gravitational propulsion. Alternatively, if we ever encounter an alien spacecraft that maneuvers with no associated engine or fuel, we should check whether its creators used negative mass to propel it. After all, we know that the expansion of the universe is accelerating due to the repulsive gravity generated by an unknown substance called “dark energy.” If we could bottle this substance in a thin enclosure, we might possess a negative mass object that could enable our future exploration of interstellar space.

Avi Loeb is the head of the Galileo Project, founding director of Harvard University’s Black Hole Initiative, director of the Institute for Theory and Computation at the Harvard-Smithsonian Center for Astrophysics, and the former chair of the astronomy department at Harvard University (2011-2020). He is a former member of the President’s Council of Advisors on Science and Technology and a former chair of the Board on Physics and Astronomy of the National Academies. He is the bestselling author of “Extraterrestrial: The First Sign of Intelligent Life Beyond Earth” and a co-author of the textbook “Life in the Cosmos”, both published in 2021. His new book, titled “Interstellar”, was published in August 2023.
|
New Data on Dark Energy and What They Imply -- Dennis Overbye
|
|
|
|
A Tantalizing ‘Hint’ That Astronomers Got Dark Energy All Wrong
Scientists may have discovered a major flaw in their understanding of that mysterious cosmic force. That could be good news for the fate of the universe.
An interactive flight through millions of galaxies mapped using coordinate data from the Dark Energy Spectroscopic Instrument, or DESI. Credit: Fiske Planetarium, University of Colorado Boulder, and DESI Collaboration. (See the original page for the video.)
Dennis Overbye, 04/04/24

On Thursday, astronomers who are conducting what they describe as the biggest and most precise survey yet of the history of the universe announced that they might have discovered a major flaw in their understanding of dark energy, the mysterious force that is speeding up the expansion of the cosmos.

Dark energy was assumed to be a constant force in the universe, both currently and throughout cosmic history. But the new data suggest that it may be more changeable, growing stronger or weaker over time, reversing or even fading away.

“As Biden would say, it’s a B.F.D.,” said Adam Riess, an astronomer at Johns Hopkins University and the Space Telescope Science Institute in Baltimore. He shared the 2011 Nobel Prize in Physics with two other astronomers for the discovery of dark energy, but was not involved in this new study. “It may be the first real clue we have gotten about the nature of dark energy in 25 years,” he said. (卜凱: B.F.D. = big fucking deal)

That conclusion, if confirmed, could liberate astronomers — and the rest of us — from a longstanding, grim prediction about the ultimate fate of the universe. If the work of dark energy were constant over time, it would eventually push all the stars and galaxies so far apart that even atoms could be torn asunder, sapping the universe of all life, light, energy and thought, and condemning it to an everlasting case of the cosmic blahs. Instead, it seems, dark energy is capable of changing course and pointing the cosmos toward a richer future.
The key words are “might” and “could.” The new finding has about a one-in-400 chance of being a statistical fluke, a degree of uncertainty called three sigma, which is far short of the gold standard for a discovery, called five sigma: one chance in 1.7 million. In the history of physics, even five-sigma events have evaporated when more data or better interpretations of the data emerged.

This news comes in the first progress report, published as a series of papers, by a large international collaboration called the Dark Energy Spectroscopic Instrument, or DESI. The group has just begun a five-year effort to create a three-dimensional map of the positions and velocities of 40 million galaxies across 11 billion years of cosmic time. Its initial map, based on the first year of observations, includes just six million galaxies. The results were released today at a meeting of the American Physical Society in Sacramento, Calif., and at the Rencontres de Moriond conference in Italy.

DESI has generated the largest-ever 3-D map of the universe. Earth is depicted at the bottommost point of one magnified section. Credit: Claire Lamman/DESI Collaboration; custom colormap package by cmastro (see image at the original page)

“So far we’re seeing basic agreement with our best model of the universe, but we’re also seeing some potentially interesting differences that could indicate that dark energy is evolving with time,” Michael Levi, the director of DESI, said in a statement issued by the Lawrence Berkeley National Laboratory, which manages the project.

The DESI team had not expected to hit pay dirt so soon, Nathalie Palanque-Delabrouille, an astrophysicist at the Lawrence Berkeley lab and a spokeswoman for the project, said in an interview. The first year of results was designed simply to confirm what was already known, she said: “We thought that we would basically validate the standard model.” But the unknown leaped out at them.
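The sigma-to-probability figures quoted above follow from the tail probability of a Gaussian distribution. A minimal sketch (an editor's illustration, not part of the original article): the generic two-sided three-sigma fluke probability is about 1 in 370, which the article rounds to "one in 400," and five sigma is about 1 in 1.7 million.

```python
import math

def two_sided_p(n_sigma: float) -> float:
    """Two-sided tail probability of a standard Gaussian beyond
    +/- n_sigma, i.e. the chance of a fluke at that significance."""
    return math.erfc(n_sigma / math.sqrt(2))

for n in (3, 5):
    p = two_sided_p(n)
    print(f"{n} sigma: p = {p:.2e}, about 1 in {1 / p:,.0f}")
# 3 sigma -> roughly 1 in 370 (the article rounds to 1 in 400)
# 5 sigma -> roughly 1 in 1.7 million
```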
When the scientists combined their map with other cosmological data, they were surprised to find that it did not quite agree with the otherwise reliable standard model of the universe, which assumes that dark energy is constant and unchanging. A varying dark energy fit the data points better.

“It’s certainly more than a curiosity,” Dr. Palanque-Delabrouille said. “I would call it a hint. Yeah, it’s not yet evidence, but it’s interesting.”

But cosmologists are taking this hint very seriously. Wendy Freedman, an astrophysicist at the University of Chicago who has led efforts to measure the expansion of the universe, praised the new survey as “superb data.” The results, she said, “open the potential for a new window into understanding dark energy, the dominant component of the universe, which remains the biggest mystery in cosmology. Pretty exciting.”

Michael Turner, an emeritus professor at the University of Chicago who coined the term “dark energy,” said in an email: “While combining data sets is tricky, and these are early results from DESI, the possible evidence that dark energy is not constant is the best news I have heard since cosmic acceleration was firmly established 20-plus years ago.”

In an artist’s rendering, light from quasars passes through intergalactic clouds of hydrogen gas. The light offers clues to the structure of the distant cosmos. Credit: NOIRLab/NSF/AURA/P. Marenfeld and DESI Collaboration (see image at the original page)

Dark energy entered the conversation in 1998, when two competing groups of astronomers, including Dr. Riess, discovered that the expansion of the universe was speeding up rather than slowing, as most astronomers had expected. The initial observations seemed to suggest that this dark energy was acting just like a famous fudge factor — denoted by the Greek letter Lambda — that Einstein had inserted into his equations to explain why the universe didn’t collapse from its own gravity. He later called it his worst blunder. But perhaps he spoke too soon.
As formulated by Einstein, Lambda was a property of space itself: The more space there was as the universe expanded, the more dark energy there was, pushing ever harder and eventually leading to a runaway, lightless future. Dark energy took its place in the standard model of the universe known as L.C.D.M., composed of 70 percent dark energy (Lambda), 25 percent cold dark matter (an assortment of slow-moving exotic particles) and 5 percent atomic matter. So far that model has been bruised but not broken by the new James Webb Space Telescope. But what if dark energy were not constant, as the cosmological model assumed? (卜凱: L.C.D.M. = Lambda cold dark matter)

At issue is a parameter called w, which is a measure of the density, or vehemence, of the dark energy. In Einstein’s version of dark energy, this number remains constant, with a value of −1, throughout the life of the universe. Cosmologists have been using this value in their models for the past 25 years. But this version of dark energy is merely the simplest one. “With DESI we now have achieved a precision that allows us to go beyond that simple model,” Dr. Palanque-Delabrouille said, “to see if the density of dark energy is constant over time, or if it has some fluctuations and evolution with time.”

The DESI project, 14 years in the making, was designed to test the constancy of dark energy by measuring how fast the universe was expanding at various times in the past. To do that, scientists outfitted a telescope at Kitt Peak National Observatory with 5,000 fiber-optic detectors that could conduct spectroscopy on that many galaxies simultaneously and find out how fast they were moving away from Earth.

An animated 3-D model of DESI’s focal plane. The movement of the 5,000 robotic positioners is coordinated so that they don’t bump into one another.
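The role of w can be made concrete with a short sketch (an editor's illustration, not from the article). For dark energy with a constant equation of state w, the energy density scales with the cosmic scale factor a as ρ ∝ a^(−3(1+w)), so w = −1 means the density never changes. A common way to let w evolve in such analyses is the CPL form w(a) = w0 + wa(1 − a); the numbers below are illustrative only, not DESI's fit.

```python
import math

def rho_de_ratio(a: float, w0: float = -1.0, wa: float = 0.0) -> float:
    """Dark-energy density at scale factor a, relative to today (a = 1),
    for the CPL parametrization w(a) = w0 + wa*(1 - a).
    With w0 = -1, wa = 0 (Einstein's constant Lambda) the ratio is
    exactly 1 at every epoch: the density never changes."""
    return a ** (-3.0 * (1.0 + w0 + wa)) * math.exp(-3.0 * wa * (1.0 - a))

# Constant Lambda: same density even when the universe was half its size
print(rho_de_ratio(0.5))                      # -> 1.0
# An evolving dark energy (illustrative numbers, not DESI's result)
print(rho_de_ratio(0.5, w0=-0.9, wa=-0.3))
```

Measuring the expansion rate at many past epochs, as DESI does, is what lets the data distinguish a flat density history (constant Lambda) from an evolving one.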
Credit: David Kirkby/DESI Collaboration (see image at the original page)

As a measure of distance, the researchers used bumps in the cosmic distribution of galaxies, known as baryon acoustic oscillations. These bumps were imprinted on the cosmos by sound waves in the hot plasma that filled the universe when it was just 380,000 years old. Back then, the bumps were a half-million light-years across. Now, 13.5 billion years later, the universe has expanded a thousandfold, and the bumps — which are now 500 million light-years across — serve as convenient cosmic measuring sticks.

The DESI scientists divided the past 11 billion years of cosmic history into seven spans of time. (The universe is 13.8 billion years old.) For each, they measured the size of these bumps and how fast the galaxies in them were speeding away from us and from each other. (卜凱: on the age of the universe, see the 2024/03/19 post in this column)

When the researchers put it all together, they found that the usual assumption — a constant dark energy — didn’t work to describe the expansion of the universe. Galaxies in the three most recent epochs appeared closer than they should have been, suggesting that dark energy could be evolving with time.

“And we do see, indeed, a hint that the properties of dark energy would not correspond to a simple cosmological constant” but instead may “have some deviations,” Dr. Palanque-Delabrouille said. “And this is the first time we have that.” But, she emphasized again, “I wouldn’t call it evidence yet. It’s too, too weak.”

Time and more data will tell the fate of dark energy, and of cosmologists’ battle-tested model of the universe. “L.C.D.M. is being put through its paces by precision tests coming at it from every direction,” Dr. Turner said. “And it is doing well. But, when everything is taken together, it is beginning to appear that something isn’t right or something is missing. Things don’t fit together perfectly. And DESI is the latest indication.” Dr.
Riess of Johns Hopkins, who had an early look at the DESI results, noted that the “hint,” if validated, could pull the rug out from under other cosmological measurements, such as the age or size of the universe. “This result is very interesting and we should take it seriously,” he wrote in his email. “Otherwise why else do we do these experiments?”

Dennis Overbye is the cosmic affairs correspondent for The Times, covering physics and astronomy.
本文於 修改第 1 次