udn City › Politics & Society › Current Affairs Forum【時事論壇】› Discussion Board: Knowledge and Issues
Moderator: 胡卜凱
Natural Science: Popular Science – Opening Post
Views: 3,220 | Replies: 21 | Recommended: 3

胡卜凱

Recommended by (3): 亓官先生, 嵩麟淵明, 胡卜凱

I am a physics graduate, so I have a basic grounding in the natural sciences; I find reports on new scientific findings easy to follow, and I enjoy reading them.

Although the《中華雜誌》(Chung Hwa Magazine) was a journal of political commentary and humanistic scholarship, it occasionally introduced results from natural-science research, and every year it carried news of the Nobel Prize winners. While in college, I translated for it a report on pulsars in astronomy.

Around 1980, my classmate and good friend 王家堂 introduced me to the world of popular-science writing on high-energy physics, and I have read books in this area ever since. I have thus kept up my interest in physics, which in time led me naturally into cosmology.

These three points are the background to this blog's frequent reposting of reports and papers on the natural sciences.



This post has been revised 3 times.
Trackback URL: https://city.udn.com/forum/trackback.jsp?no=2976&aid=7219386
Unraveling the Riddle of Creation -- IAI Panel
Recommended: 1



This is one of a series of panel discussions being run by the Institute of Art and Ideas (IAI); besides the moderator, there are three scholars on the panel. I am not in the habit of watching videos, so I did not watch it all the way through. Since this blog has reposted many reports and papers on the origin of the universe, both in the past and recently, I am introducing it here to share with friends interested in this topic.

This forum has reposted several essays published on that institute's blog; generally speaking, their quality is quite high. Note, however, that the blog can only be read by subscribers.

The riddle of the beginning
Making sense of the beginning of the universe

Philosophy for our times
1000+ Debates from the world's leading thinkers



This post has been revised 3 times.
Trackback URL: https://city.udn.com/forum/trackback.jsp?no=2976&aid=7243341
Has Dark Energy Been Disproven? ------ Royal Astronomical Society
Recommended: 1



Now the scientists will be at each other's throats. Even in the natural sciences, disputes between theories have never been merely about right and wrong, or fame and standing; they are also battles over research funding and faculty turf.

Dark energy 'doesn't exist' so can't be pushing 'lumpy' universe apart, physicists say

Royal Astronomical Society, 12/20/24

This graphic offers a glimpse of the history of the universe, as we currently understand it. The cosmos began expanding with the Big Bang, but around 10 billion years later the expansion strangely began to accelerate, thanks to a theoretical phenomenon termed dark energy. Credit: NASA, licence type Attribution (CC BY 4.0). (See the original page for the graphic.)

One of the biggest mysteries in science — dark energy — doesn't actually exist, according to researchers looking to solve the riddle of how the universe is expanding.

Their analysis has been published in the journal Monthly Notices of the Royal Astronomical Society: Letters.

For the past 100 years, physicists have generally assumed that the cosmos is growing equally in all directions. They employed the concept of dark energy as a placeholder to explain unknown physics they couldn't understand, but the contentious theory has always had its problems.

Now a team of physicists and astronomers at the University of Canterbury in Christchurch, New Zealand, is challenging the status quo, using improved analysis of supernova light curves to show that the universe is expanding in a more varied, "lumpier" way.

The new evidence supports the "timescape" model of cosmic expansion, which doesn't have a need for dark energy because the differences in stretching light aren't the result of an accelerating universe but instead a consequence of how we calibrate time and distance.

It takes into account that gravity slows time, so an ideal clock in empty space ticks faster than inside a galaxy.

The model suggests that a clock in the Milky Way would be about 35 percent slower than the same one at an average position in large cosmic voids, meaning billions more years would have passed in voids. This would in turn allow more expansion of space, making it seem like the expansion is getting faster when such vast empty voids grow to dominate the universe.
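As a back-of-envelope illustration of the arithmetic behind this claim: only the 35 percent figure comes from the article; the 13.8-billion-year input and the exact interpretation of "35 percent slower" (galaxy clock ticking at 0.65 of the void rate) are assumptions for this sketch.

```python
# Rough sketch of the timescape claim: a Milky Way clock runs ~35% slower
# than an ideal clock in a large cosmic void (figure from the article).

GALAXY_RATE = 0.65        # assumed: Milky Way clock ticks per void-clock tick
AGE_IN_GALAXY = 13.8e9    # assumed: years elapsed on a Milky Way clock

# Years elapsed on a void clock over the same interval
age_in_void = AGE_IN_GALAXY / GALAXY_RATE
extra_years = age_in_void - AGE_IN_GALAXY

print(f"Void clock: {age_in_void/1e9:.1f} billion years "
      f"({extra_years/1e9:.1f} billion more than in a galaxy)")
```

Under these assumptions, roughly 7 billion more years elapse in the voids, which is the sense in which "billions more years would have passed" there.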

Professor David Wiltshire, who led the study, said, "Our findings show that we do not need dark energy to explain why the universe appears to expand at an accelerating rate.

"Dark energy is a misidentification of variations in the kinetic energy of expansion, which is not uniform in a universe as lumpy as the one we actually live in."

He added, "The research provides compelling evidence that may resolve some of the key questions around the quirks of our expanding cosmos.

"With new data, the universe's biggest mystery could be settled by the end of the decade."

Dark energy is commonly thought to be a weak anti-gravity force which acts independently of matter and makes up around two thirds of the mass-energy density of the universe.

The standard Lambda Cold Dark Matter (ΛCDM) model of the universe requires dark energy to explain the observed acceleration in the rate at which the cosmos is expanding.

Scientists base this conclusion on measurements of the distances to supernova explosions in distant galaxies, which appear to be farther away than they should be if the universe's expansion were not accelerating.

However, the present expansion rate of the universe is increasingly being challenged by new observations.

Firstly, evidence from the afterglow of the Big Bang—known as the Cosmic Microwave Background (CMB)—shows the expansion of the early universe is at odds with current expansion, an anomaly known as the "Hubble tension."


In addition, recent analysis of new high precision data by the Dark Energy Spectroscopic Instrument (DESI) has found that the ΛCDM model does not fit as well as models in which dark energy is "evolving" over time, rather than remaining constant.

Both the Hubble tension and the surprises revealed by DESI are difficult to resolve in models which use a simplified 100-year-old cosmic expansion law—Friedmann's equation.

This assumes that, on average, the universe expands uniformly—as if all cosmic structures could be put through a blender to make a featureless soup, with no complicating structure. However, the present universe actually contains a complex cosmic web of galaxy clusters in sheets and filaments that surround and thread vast empty voids.
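For reference, the simplified 100-year-old expansion law referred to above is the Friedmann equation, which for a perfectly homogeneous and isotropic universe reads:

```latex
% Friedmann equation for a homogeneous, isotropic universe
% a(t): scale factor, \rho: average matter-energy density,
% k: spatial curvature, \Lambda: cosmological constant
H^2 = \left(\frac{\dot{a}}{a}\right)^2
    = \frac{8\pi G}{3}\rho - \frac{kc^2}{a^2} + \frac{\Lambda c^2}{3}
```

The timescape approach questions the assumption built into this equation: that a single scale factor a(t) describes the average expansion everywhere, in voids and galaxies alike.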

Professor Wiltshire added, "We now have so much data that in the 21st century we can finally answer the question—how and why does a simple average expansion law emerge from complexity?

"A simple expansion law consistent with Einstein's general relativity does not have to obey Friedmann's equation."

The researchers say that the European Space Agency's Euclid satellite, which was launched in July 2023, has the power to test and distinguish the Friedmann equation from the timescape alternative. However, this will require at least 1,000 independent high quality supernovae observations.

When the proposed timescape model was last tested in 2017, the analysis suggested it was only a slightly better fit than the ΛCDM as an explanation for cosmic expansion, so the Christchurch team worked closely with the Pantheon+ collaboration team who had painstakingly produced a catalog of 1,535 distinct supernovae.

They say the new data now provides "very strong evidence" for timescape. It may also point to a compelling resolution of the Hubble tension and other anomalies related to the expansion of the universe.

Further observations from Euclid and the Nancy Grace Roman Space Telescope are needed to bolster support for the timescape model, the researchers say, with the race now on to use this wealth of new data to reveal the true nature of cosmic expansion and dark energy.

More information: Antonia Seifert et al., Supernovae evidence for foundational change to cosmological models, Monthly Notices of the Royal Astronomical Society: Letters (2024). DOI: 10.1093/mnrasl/slae112
Journal information: Monthly Notices of the Royal Astronomical Society: Letters. Provided by the Royal Astronomical Society.

This post has been revised 1 time.
Trackback URL: https://city.udn.com/forum/trackback.jsp?no=2976&aid=7243211
What the "Big Bang" Hypothesis Actually Says - Elisha Sauers
Recommended: 1



Ms. Sauers's exposition of the Big Bang hypothesis is quite concise, and as popular-science writing it captures the key points of the "hypothesis." The article below can be read alongside the two reports posted in this column on 2024/12/12 and 2024/05/13.

What most people think they know about the Big Bang is wrong

A rapid stretching of the universe.

Elisha Sauers, Mashable, 12/14/24

Cosmic inflation tries to describe one brief but crucial phase in the Big Bang that launched the universe onto its expansion course.  Credit: Christine Daniloff / MIT / ESA / Hubble / NASA
(See the original page for the image.)

Many textbooks and science educators have attempted to describe the Big Bang as the birth of the universe — an explosive start that happened at a specific point creating matter and flinging it into the void like shrapnel from a grenade.

But the Big Bang is not really the moment of creation — more like its aftermath. The Big Bang didn't emerge from a particular location in space, and it wasn't an explosion — at least not in the traditional sense.

Popular culture — and cosmologists, begrudgingly — made the unfortunate mistake of adopting a name for the theory that even evokes the sound of a gunpowder blast. So… bazinga?

"It's often said that the whole universe we can now observe was once compressed into a volume the size of a golf ball," wrote John Mather, a Nobel Prize-winning astrophysicist and senior project scientist for NASA's James Webb Space Telescope, in an essay for Theedge.org. "But we should imagine that the golf ball is only a tiny piece of a universe that was infinite even then."

When the universe was still in its infancy, less than 1 billion years old, star formation fed on hydrogen that emerged from the Big Bang. Credit: NASA / ESA / A. Schaller (for STScI) illustration
(See the original page for the image.)

The Big Bang Theory describes an event when existing space — much hotter, denser, and smaller at the time — suddenly and rapidly started stretching out. The primitive universe was a scalding goulash of tiny particles, light, and energy, but as it expanded, space cooled enough to allow important processes to occur, such as forming atoms and elements. The expansion continues today.

That's it. It doesn't suggest what the conditions were before expansion. It doesn't suppose what the universe is expanding into. It doesn't even explain what caused the expansion in the first place. And there are reasons why trying to imagine the event as an explosion can lead to some misinformed conclusions.

"No reputable scientist will claim that we understand in detail what happened at the exact moment when the universe began. We just don't," said Don Lincoln, senior scientist at Fermilab in Illinois, in a video. "In spite of the fact that we don't know everything about how the universe began, I'm constantly staggered by the fact that we know so much."

The Big Bang pertains to the visible universe 

To understand the Big Bang — and Mather's previous comment — it's first important to clarify that this theory applies to the visible universe, not the universe as a whole. The visible universe is a bubble of the cosmos centered on our perspective from Earth, with a radius determined by the speed of light. The entire bubble is about 92 billion light-years wide.

The bubble's size is not determined by the range of telescopes, but by the literal limitation of light. There is a maximum distance from which photons could have traveled to an observer in the age of the universe. This boundary is known as the cosmic light horizon: Any potential signals beyond it haven't had time to reach us — and they never will, not even billions of years into the future. That's because at a certain extreme distance, far-flung objects recede faster than the speed of light.

So what's beyond this bubble? No one knows because it's unseeable, but scientists could speculate there's more universe. After all, with the expansion of space, scientists are aware that, every second, thousands of stars are escaping our view, beyond that horizon.

Where exactly did the Big Bang happen?

The Big Bang should be thought of as a "point" in time but not happening at a particular place. Astronomers will often say that the Big Bang happened everywhere, which is a confounding idea if you've been thinking of the Big Bang like a detonating bomb.

Imagine instead a hypothetical scenario where space was condensed within a speck, like a pinhead-sized balloon. Then imagine that this tiny balloon somehow inflated to the size of an orange. In this analogy, you can begin to understand why there is no "origin point" for the Big Bang: Nothing left the pinhead where it began; the pinhead point got exponentially bigger.

This is one of the reasons why many astrophysicists say everywhere in the knowable universe could be considered part of the Big Bang's center. There was no particular site from which bits were blown away, according to the theory.

Astronomer Edwin Hubble used the 100-inch Hooker telescope in California to observe that galaxies were receding in space in all directions. Credit: NASA / Edwin P. Hubble Papers / Huntington Library
(See the original page for the image.)

The Big Bang wasn't really an explosion

Scientific observations support the idea of rapid universal expansion versus an explosion. If there had been a firecracker-type blast that scattered matter outward, for example, the laws of physics would dictate that debris farther from the place where it exploded would be moving faster than the stuff closer to that starting point.

"That's because objects far away from the firecracker have to be moving faster. That's how they got far away," Lincoln said.

But that is not what astronomers see. In the cosmos, the space between galaxies is increasing, in all directions — not just relative to a central spot. Astronomer Edwin Hubble, for whom the Hubble Space Telescope was later named, discovered this in 1929.

Using the 100-inch Hooker Telescope in California, Hubble noted that the farther a galaxy was from the Milky Way, Earth's home galaxy, the faster it seemed to be receding. He figured this out by plotting 24 nearby galaxies' velocities and distances. The plot showed that everything was drifting uniformly, at speeds proportional to distance, in all directions.

The rate of expansion has been dubbed the Hubble Constant. Two years after Hubble's observations, a Belgian astronomer and priest, Georges Lemaître, used this premise to publish the first Big Bang-like theory to explain the beginnings of the universe.

Cosmologists believe the universe has expanded over 13.8 billion years since the Big Bang. Credit: Britt Griswold (Maslow Media Group) / NASA illustration
(See the original page for the image.)

How astronomers know the universe is expanding

With Hubble's finding that space itself is expanding, scientists have been able to estimate the age of the universe. The formula for velocity — which you might have learned in high school — is distance divided by time. Scientists already know the speeds of galaxies and their distances, so they can figure out the duration by dividing distance by speed.

If scientists rewind the clock from the present day to the time when everything in the knowable universe crumples back into that small deflated balloon, they find that this occurred about 13.8 billion years ago.

So, if the universe is 13.8 billion years old, one might incorrectly assume that the visible bubble of the universe has a radius of 13.8 billion light-years, with an overall width of 27.6 billion light-years. But the universe isn't standing still, and the distance between objects isn't fixed. The expansion of space explains the discrepancy between 27.6 billion light-years and 92 billion light-years, the diameter of the visible universe.

Have scientists disproved the Big Bang?

Scientists have not disproved the Big Bang Theory, but they have discovered disagreements in the rate of expansion — the Hubble Constant — from different research teams' measurements. The disagreement is known as the Hubble tension.

In short, speed measurements based on telescope observations of the present universe are somewhat higher than projections based on known conditions of the universe during its infancy. For the past few years, astronomers have considered that something is causing the expansion rate to speed up. Studies using the Webb telescope have found that the small-but-significant divergence in the expansion rate is probably not the result of miscalculations but an aspect of the universe that is not yet understood.

As scientists work to solve this mystery, the Big Bang might need some tweaking, but so far this disparity has not upended the bottom line, which is that space was once smaller and hotter, then it suddenly stretched out, and it's still expanding.

A map of the Cosmic Microwave Background. U.S. physicists Arno Penzias and Robert Wilson unintentionally discovered the Cosmic Microwave Background, which fills the visible universe. Credit: ESA / Planck Collaboration
(See the original page for the image.)

The expansion rate of the early universe

Researchers have calculated the expansion rate of the baby universe using data from the so-called Cosmic Microwave Background. U.S. physicists Arno Penzias and Robert Wilson accidentally discovered this phenomenon, a faint afterglow from 380,000 years after the Big Bang, using a radio telescope in 1965.

Around the same time, a separate team at Princeton University had predicted that such waves should exist. If astronomers were archaeologists, this discovery would be akin to finding the earliest fossil of light. It is the oldest thing in the universe anyone has seen.

This heat signature, radiating from atoms that are now more than 46 billion light-years away and stretched into microwaves, fills the sky. The European Space Agency's Planck mission mapped the microwaves to measure teensy fluctuations in temperature. These slight variations allow scientists to infer the expansion rate at the time.

How 'cosmic inflation' theory fits into the Big Bang

Cosmic inflation tries to describe one brief but crucial phase in the Big Bang narrative that launched the universe onto its expansion timeline.

Alan Guth, a theoretical physicist at MIT, put forward the idea in 1980. It suggests that some repulsive form of gravity, something like dark energy, drove the universe's rapid expansion for an early instant. This phase would have lasted for a fraction of a trillionth of a second. Then, the energy that propelled inflation turned off.

"I usually describe inflation as a theory of the 'bang' of the Big Bang," Guth said in a 2014 Q&A by the university. "In its original form, the Big Bang theory never was a theory of the bang. It said nothing about what banged, why it banged, or what happened before it banged."

During the inflation phase, the tiny universe would have expanded at a rate faster than light. And get this: It wouldn't have broken any laws of physics.

"It's true that nothing can move through space faster than light, but there are no restrictions on how fast space can expand," Lincoln said.

How the 'Big Bang' got its name

Fred Hoyle, an astronomer and well-known science communicator in the United Kingdom, is largely credited with coining the term "big bang" in 1949. He was in many ways the Neil deGrasse Tyson of his time. But today many astrophysicists and cosmologists lament that the misnomer stuck.

During a BBC broadcast, Hoyle described theories based on the idea that "all the matter in the universe was created in one big bang at a particular time in the remote past," according to a transcript published in a BBC magazine. He later mentioned the phrase again in his 1950 book "The Nature of the Universe."

Hoyle balked at the idea of a sudden origin of the universe, but he didn't use the words "big bang" disparagingly, according to a recent essay about it in the journal Nature. Instead, he meant to convey the hypothesis with descriptive metaphors to help get the point across over radio.

Bazinga, indeed.


SEE ALSO: Webb telescope spots proof of the first stars to light the universe

This post has been revised 2 times.
Trackback URL: https://city.udn.com/forum/trackback.jsp?no=2976&aid=7242918
Cosmology Needs New Ideas -- R. Lea
Recommended: 1



'Our understanding of the universe may be incomplete': James Webb Space Telescope data suggests we need a 'new cosmic feature' to explain it all

"The discrepancy between the observed expansion rate of the universe and the predictions of the standard model suggests that our understanding of the universe may be incomplete. "

Robert Lea, 12/09/24

Credit: NASA, ESA, CSA, STScI, Jose M. Diego (IFCA), Jordan C. J. D’Silva (UWA), Anton M. Koekemoer (STScI), Jake Summers (ASU), Rogier Windhorst (ASU), Haojing Yan (University of Missouri)
(See the original page for the image.)

New observations from the James Webb Space Telescope (JWST) have corroborated data from its predecessor, the Hubble Space Telescope, to determine something is missing from our recipe of the cosmos.

The JWST conducted its largest survey yet of the accelerating expansion of the cosmos, as scientists attempt to discover why the universe is expanding faster today than our picture of its infancy, billions of years ago, says it should. Currently, scientists theorize that the accelerating expansion is caused by a placeholder element, "dark energy," but they really need to know what dark energy actually is before a conclusive explanation can be found.

JWST's survey served to cross-check observations made by Hubble that suggested a discrepancy in measurements of the rate of cosmic expansion, known as the Hubble constant. This issue has been termed the "Hubble tension," and these new findings show that errors in data from the long-serving space telescope of the same name are not responsible for it.

As the Hubble tension can't be accounted for by either our best models of the universe or errors in Hubble measurements, an extra ingredient still seems to be needed in our cosmic recipe.

"The discrepancy between the observed expansion rate of the universe and the predictions of the standard model suggests that our understanding of the universe may be incomplete," team leader Adam Riess, an astrophysicist at Johns Hopkins University, said in a statement. "With two NASA flagship telescopes now confirming each other's findings, we must take this [Hubble tension] problem very seriously — it's a challenge but also an incredible opportunity to learn more about our universe."

In 2011, Riess shared the Nobel Prize in Physics for the discovery of the accelerating expansion of the universe, driven by the mysterious force now called dark energy. This new research builds upon that Nobel Prize-winning work.

What is the Hubble tension?

Because the expansion of the universe operates on very large scales, the Hubble tension isn't something that affects us in everyday life, or even on the scale of the solar system or the Milky Way.

The discrepancy becomes really problematic when considering the distances between galaxies and the larger structure of the universe. That means cosmologists can't really understand the evolution of the universe until they know the cause of the Hubble tension.

The Hubble tension arises from the fact that there are two ways to calculate the Hubble constant.

Scientists can use things like distances to Type Ia supernovas or variable stars, which they call "standard candles," to measure the distances from Earth to the galaxies that host them and then determine how rapidly these galaxies are moving away.

They can also use our models of cosmic evolution to "wind forward" the universe and calculate what the Hubble constant should be today.

However, when measurements of the Hubble constant are taken in the local universe, they are higher than the value predicted by working forward using the best model we have for cosmic evolution, the Lambda Cold Dark Matter (LCDM) model, also known as the Standard Model of Cosmology.

A diagram showing the evolution of the universe according to the prevailing cold dark matter model. Observations of El Gordo could throw this model into doubt
(See the original page for the caption and diagram.)

The LCDM-based method gives a value for the Hubble constant of about 152,000 miles per hour per megaparsec (68 kilometers per second per megaparsec, or Mpc), while measurements based on telescope observations regularly give a higher value of between 157,000 mph per Mpc and 170,000 mph per Mpc (70 to 76 km/s/Mpc).

An Mpc is equivalent to 3.26 million light-years, and a single light-year is about 5.9 trillion miles (9.5 trillion kilometers), so this is a huge discrepancy, one which scientists feared was too large to be explained by uncertainties in observations.
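The mph-per-Mpc figures quoted above are straightforward unit conversions of the km/s values; a quick sketch of the conversion:

```python
# Convert a Hubble-constant value from km/s per Mpc to mph per Mpc.

KM_PER_MILE = 1.609344
SECONDS_PER_HOUR = 3600

def kms_to_mph(kms: float) -> float:
    """Convert a speed from km/s to miles per hour."""
    return kms * SECONDS_PER_HOUR / KM_PER_MILE

# The two extremes contrasted in the article:
print(f"{kms_to_mph(68):,.0f} mph per Mpc")   # LCDM prediction (~152,000)
print(f"{kms_to_mph(76):,.0f} mph per Mpc")   # upper observed value (~170,000)
```

The same function reproduces the article's other figures (e.g. 72.6 km/s/Mpc gives roughly 162,400 mph per Mpc).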

Looks like they were right!

Hubble was right!

To confirm the findings of Hubble, Riess and colleagues turned to the largest sample of data collected by the JWST during its first two years of operations, which came from two different projects.

To measure the Hubble constant, they used three independent methods to determine the distance to other galaxies. First, they used so-called "Cepheid variables," pulsating stars considered the gold standard for measuring cosmic distances. The team then cross-checked this with measurements based on carbon-rich stars and the brightest red giants across the same galaxies.

The team particularly homed in on galactic distances measured by Hubble.

The team's research with the JWST covered about a third of the full sample of galaxies seen by Hubble, using the galaxy Messier 106 (M106), also known as NGC 4258 and located around 23 million light-years away in the constellation Canes Venatici, as a reference point.

A dusty-looking section of space with orange and red streaks concentrated around a glowing greenish center.
(See the original page for the image.)

This not only helped them produce the most precise local measurements of the Hubble constant to date, but it also independently verified that Hubble's distance measurements were accurate.

The galaxies observed by the JWST yielded a Hubble constant of around 162,400 mph per Mpc (72.6 km/s/Mpc), nearly identical to the value of 162,849 mph per Mpc (72.8 km/s/Mpc) found by Hubble for the same galaxies.

This eliminates the possibility that the Hubble tension is just an artifact arising from significant bias in the long-serving space telescope's measurements.

"The JWST data is like looking at the universe in high definition for the first time and really improves the signal-to-noise of the measurements," team member and Johns Hopkins University graduate student Siyang Li said.

Of course, this means there is still a Hubble tension problem that needs to be tackled, because the expansion of the universe works on very large scales.

Johns Hopkins cosmologist Marc Kamionkowski, who was not involved with this study, thinks that solving the Hubble tension requires a new element in our models of the universe. He has an idea of what this element may be.

"One possible explanation for the Hubble tension would be if there was something missing in our understanding of the early universe, such as a new component of matter — early dark energy — that gave the universe an unexpected kick after the Big Bang," Kamionkowski said in the statement. "And there are other ideas, like funny dark matter properties, exotic particles, changing electron mass, or primordial magnetic fields that may do the trick.

"Theorists have license to get pretty creative.”

The team's research was published on Monday (Dec. 9) in the Astrophysical Journal.


Related Stories:

— James Webb Space Telescope spies never-before-seen star behavior in distant nebula (video, photo)
— Galactic penguin honors the 2nd anniversary of James Webb Space Telescope's 1st images
— James Webb Space Telescope directly images its coldest exoplanet target yet

This post has been revised 2 times.
Trackback URL: https://city.udn.com/forum/trackback.jsp?no=2976&aid=7242713
The Theory of Gravity Corroborated by Black Hole Images -- Robert Lea
Recommended: 1



I repost the article below here, doing my small part to promote popular-science education.

Index

anatomy: here, the structural analysis of a black hole; also anatomy, dissection, the structure of an organism, the body, a close analysis
black hole, mimetic: the structure and properties of black holes under the theory of "mimetic gravity"
caveat: here, a qualifier or pointer; also a warning, a caution
curvature: the degree or rate of bending
ergosphere: 動圈 (the region just outside a rotating black hole); 黑洞能量/物質攝取區 (胡卜凱's rendering)
event horizon: 事件穹界; (黑洞)阻隔區 (胡卜凱's rendering)
magnum opus: a masterpiece, a major or representative work
gravity, mimetic: 擬態引力
mimic: imitative, simulated, not genuine, feigned
Schwarzschild solution
singularity: 「時-空」終結點, the end point of spacetime (胡卜凱's rendering)
singularity, naked: 裸奇異點; a singularity inside a black hole that is not enclosed by an event horizon (胡卜凱's reading)
spacetime: 「時-空」


Black hole images deliver a deathblow to alternative theory of gravity 

Robert Lea, 11/26/24

Images of the supermassive black holes wouldn't have been possible if mimetic gravity were the right recipe for gravity. (See the original page for the image.)

Researchers have examined the historical images of supermassive black holes — Sagittarius A* (Sgr A*) at the heart of the Milky Way and the black hole at the center of the galaxy Messier 87 (M87) — to rule out an alternative to our current best theory of gravity.

In doing so, the team behind this research also help to confirm the existence of dark energy and dark matter, the two most mysterious and difficult-to-explain aspects of the universe. 

Despite being arguably the most ubiquitous “force” experienced by humanity, gravity hasn’t necessarily been easy to explain. Newton’s theory of gravity was a good early attempt and still works perfectly well for relatively small-scale calculations, but it starts to fail when considering massive objects, even struggling to explain the wobbly orbit of Mercury.

In 1915, Einstein put forward the theory of general relativity, suggesting that gravity is not a force in the traditional sense but instead arises from the curvature of space and time, united as a four-dimensional entity called "spacetime," caused by the presence of mass. The more mass an object has, the greater the curvature of spacetime and, thus, the greater the gravitational influence of that object.

One of the most remarkable aspects of general relativity is the number of concepts that it predicted, including black holes and gravitational waves, that later came to be evidentially verified. General relativity is one of the most tested theories in science, and it has survived every experimental challenge thrown at it. It has thus supplanted Newton's theory of gravity.

General relativity isn't perfect, however. One of the major problems with Einstein's magnum opus is the fact that cosmological theories built upon it, which tell the story of the universe, can't account for the so-called "dark universe." That is dark energy, the mysterious force that drives the acceleration of the universe's expansion, and dark matter, the strange "stuff" that out-populates ordinary matter by five to one but remains effectively invisible.

The dark universe problem for Einstein

Dark energy accounts for an estimated 70% of the universe’s matter/energy budget, while dark matter accounts for a further 25% of that budget. This means that everything we see in the universe around us, all the stars, planets, moons, asteroids, animals, etc… account for just 5% of the contents of the universe. No wonder most scientists are desperate to discover what dark energy and dark matter are. 

Why the caveat "most"?

That's because other scientists propose that dark matter and dark energy don't exist. Instead, they suggest that the effects we attribute to them are a consequence of the fact that general relativity isn't the "right recipe" for gravity. These researchers posit theories of "modified gravity" that do away with the need for the dark universe to exist. Some modify Newton's theory of gravity; others attempt to extend general relativity.

One of the most credible modified gravity theories is mimetic gravity, suggested in 2013 by researchers Slava Mukhanov and Ali Chamseddine. Mimetic gravity extends general relativity, leading to the appearance of a dust-like perfect fluid that can mimic cold dark matter at a cosmological level and can explain the late-time acceleration of the cosmic expansion attributed to dark energy.

To surpass and supplant general relativity, one thing any modified gravity theory must also do is (to explain) the phenomena in the universe that conform to Einstein's 1915 theory. That is where the images of black holes come in.
[The phrase "(to explain)" is not in the original text; I added it to complete the grammar and the sense. It is probably a typo or an omission.]

In April 2019, when scientists from the Event Horizon Telescope (EHT) revealed the first-ever image of a black hole to the public, the supermassive black hole M87*, they expressed how surprised they were that it almost exactly conformed to the appearance of a black hole and its surroundings predicted by general relativity. This was compounded in May 2022, when the first image of "our black hole," Sgr A*, also tightly conformed to expectations and closely resembled M87*, despite the fact that the latter is much more massive than the Milky Way's supermassive black hole.

Thus, it is only natural to put theories of modified gravity up against observations of the supermassive black holes M87* and Sgr A* collected by the EHT, a global network of instruments that effectively creates a single Earth-sized telescope. That is exactly what the authors of a new paper set out to do.

"In a sense, the mere fact that we can see these images rules out mimetic gravity!" said University of Trento researcher Sunny Vagnozzi. "In short, our findings completely rule out baseline mimetic gravity, which was previously one of the least unlikely modified gravity-based models for dark matter and dark energy.

“In some sense, it empirically gives even more support to the fact that dark matter and dark energy may be ‘real’ and not the effect of modifications of gravity.”

The anatomy of black holes: General relativity vs. mimetic gravity

To understand why the team’s research and the EHT images are bad news for supporters of mimetic gravity, it is necessary to delve into the anatomy of 
black holes a little bit. 

All black holes are considered to be composed of a central singularity, an inestimably small region of space of infinite density where the laws of physics fail, and an outer boundary called an “event horizon.” The event horizon is the point at which the gravitational influence of the black hole becomes so great that not even light is fast enough to escape. Thus, anything that passes the event horizon of a black hole is on a one-way trip to the central singularity. 

Around the event horizon is a region of space that is constantly dragged along with the rotation of the black hole due to its immense gravity. It is impossible for matter to sit still in this region, called the “ergosphere.” Further out is matter whipping around the black hole at near-light speeds, causing it to glow. This matter appears as a striking golden ring in the images of M87* and Sgr A*, with the shadow of the black holes appearing in the center of these rings.

That is, if general relativity is the correct recipe for gravity and if a solution to its equations called the Schwarzschild solution accurately describes the anatomy of black holes.
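For a sense of the scales involved, the Schwarzschild solution gives the event-horizon radius of a non-rotating black hole as r_s = 2GM/c². A minimal sketch (the black hole masses below are commonly quoted approximate values that I am supplying, not figures from this article):

```python
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # one solar mass, kg

def schwarzschild_radius(mass_kg):
    """Event-horizon radius of a non-rotating black hole: r_s = 2GM/c^2."""
    return 2 * G * mass_kg / c ** 2

# Approximate masses (commonly quoted values, not from this article):
sgr_a_star = schwarzschild_radius(4.0e6 * M_SUN)   # the Milky Way's black hole
m87_star = schwarzschild_radius(6.5e9 * M_SUN)     # M87*

print(f"Sgr A*: ~{sgr_a_star / 1e9:.1f} million km")
print(f"M87*:  ~{m87_star / 1e9:.0f} million km")
```

Despite being millions to billions of solar masses, both horizons are tiny at galactic distances, which is why imaging them required an Earth-sized virtual telescope.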

Mimetic gravity has two different ideas about black holes. Vagnozzi explained that one of the two natural classes of objects in mimetic gravity is a naked singularity. This is a central singularity that is not bounded by a light-trapping event horizon. No event horizon would have meant no EHT image.

The second possible object predicted in mimetic gravity is a so-called “mimetic black hole.” If the EHT had imaged one of these objects when it snapped M87* or Sgr A*, what researchers would have seen is an image with a much smaller dark region at its heart than the dark region that was seen in these black hole images.

“We demonstrated that the naked singularity does not cast a shadow. It should not lead to an image in EHT observations,” Vagnozzi said. “To use an everyday life analogy, say EHT images are actually the reflection of ourselves we see in the mirror. If I were a mimetic naked singularity, I would look in the mirror and see no reflection. If I were a mimetic black hole, my image in the mirror would be much smaller than it actually is.

“This analogy is stretching it a lot, but it should give an idea of what is happening.”

Vagnozzi explained that although the interpretation of EHT data to create the images of M87* and Sgr A* is a complex process with some margin of error, this possible uncertainty simply isn’t significant enough for the team’s conclusion to be incorrect.

The researcher stresses that the research conducted by the team rules out only a “baseline” version of mimetic gravity, adding that more complex mimetic gravity theories with more adjustments to general relativity could still be possible.

“This is absolutely a demonstration of the importance of the EHT and its observations. It demonstrates that EHT has the potential to rule out candidate theories of dark matter and dark energy, which were previously completely viable,” Vagnozzi said.

“The takeaway message is very important: any theory that claims to explain dark matter and dark energy needs not only be consistent with cosmological observations but also with observations of black holes, and this provides a highly non-trivial test of many such models, which may be inconsistent with EHT images. 

“We believe this idea deserves to be explored in much more detail.”


Reference:

Mohsen Khodadi, Sunny Vagnozzi, and Javad T. Firouzjaee, “Event Horizon Telescope observations exclude compact objects in baseline mimetic gravity,” Scientific Reports (2024). DOI: 10.1038/s41598-024-78264-y

本文於 修改第 1 次
回應 回應給此人 推薦文章 列印 加入我的文摘
引用網址:https://city.udn.com/forum/trackback.jsp?no=2976&aid=7241885
A Cellular Map of the Human Body’s Organs ------ RAMAKRISHNAN/UNGAR
推薦1


胡卜凱
等級:8
留言加入好友

 
文章推薦人 (1)

胡卜凱

Scientists map out the human body one cell at a time

 
ADITHI RAMAKRISHNAN and LAURA UNGAR, 11/21/24

This image provided by Ana-Maria Cujba shows blood vessels in a portion of the human small intestine, March 21, 2024. (Ana-Maria Cujba/Wellcome Sanger Institute via AP)
請至原網頁查看圖片

This image provided by Nathan Richoz shows a T cell aggregate in a human trachea biopsy on July 12, 2021, at the University of Cambridge in Cambridge, England. (Nathan Richoz/Clatworthy Lab/University of Cambridge via AP)
請至原網頁查看圖片

Researchers have created an early map of some of the human body’s 
estimated 37.2 trillion cells.

Each type of cell has a unique role, and knowing what all the cells do can help scientists better understand health and diseases such as cancer.

Scientists focused on certain organs — plotting the jobs of cells in the mouth, stomach and intestines, as well as cells that guide how bones and joints develop. They also explored which cells group into tissues (
組織), where they’re located in the body and how they change over time.

They hope the high-resolution, open-access atlas — considered a first draft — will help researchers 
fight diseases that damage or corrupt human cells.

“When things go wrong, they go wrong with our cells first and foremost,” said Aviv Regev, co-chair of the Human Cell Atlas consortium who was involved with the research.

The findings were published Wednesday in Nature and related journals.

The group plans to release a more complete atlas in 2026, profiling cells across 18 organs and body systems. That includes the skin, heart, breasts and more.

The current cell map not only charts the many types of human cells, but it also shows the relationships of cells to each other, said Dr. Timothy Chan, a cancer expert at the Cleveland Clinic.

Chan said it’s a deep dive into human biology that’s sure to have practical impact such as identifying and treating cancer cells.

“Different types of cells have different Achilles’ heels,” said Chan, who was not involved in the studies. “This is going to be a boon” for cancer research.

Scientists are also creating other atlases that could help them learn more about the underpinnings of health and disease in specific parts of the body.

With brain atlases, they’re seeking to understand the structure, location and function of the many types of brain cells. A new gut microbiome atlas looks at the collection of microorganisms in the intestines, which plays a key role in digestion and immune system health.


The Associated Press Health and Science Department receives support from the Howard Hughes Medical Institute’s Science and Educational Media Group. The AP is solely responsible for all content.

ADITHI RAMAKRISHNAN is a science reporter for The Associated Press, based in New York. She covers research and new developments related to space, early human history and more.

LAURA UNGAR covers medicine and science on the AP’s Global Health and Science team. She has been a health journalist for more than two decades.


引用網址:https://city.udn.com/forum/trackback.jsp?no=2976&aid=7241461
The Risks of “Gain-of-Function” Research -- Ross Pomeroy
胡卜凱

Over the past two or three years, I noticed the term “gain-of-function” beginning to appear in science and technology reporting. Because it is so specialized, I never bothered to spend time on it. The article below has a rather sensational title but is plainly written and easy to follow; it may be a good opportunity to get acquainted with the term.

The article mentions that the COVID-19 virus may have come from this kind of experiment, although the author stresses that the chance is small; that is interesting. The author also hints that such a virus could become a biological-warfare weapon; that deserves vigilance and attention.

索引 (Glossary)

airborne:此處指空氣傳播的;在空中的、空運的、飛行中的
antigenic:具抗原性的(這是我的翻譯,有待專家指正)
dank:陰冷潮濕的,濕冷的
Gain-of-Function Research:(基因)機能擴張研究
lyssavirus:麗沙病毒屬
pathogen:病原體
rabies:狂犬病
spelunking:洞穴探察


The Gain-of-Function Experiment That Could 'Eliminate Humans From the Face of the Earth'

Ross Pomeroy, 06/15/24

A Google search for "Frio Cave" makes the Uvalde County, Texas destination look like a tourists' dream. One quickly learns that the cave is home to tens of millions of Mexican free-tailed bats, and that you can sometimes witness the flapping horde streaming out of their dark, dank home just before sunset, clouding the sky in a "once in a lifetime experience."

But Frio Cave has a darker history that visitors websites don't mention. More than fifty years ago,
two humans contracted rabies while spelunking there.

That humans would get infected with rabies while visiting a bat-infested cave isn't altogether surprising. Bats are a reservoir for the terrifying disease, which is 99% fatal to humans once symptoms like hyperactivity, hallucinations, seizures, and fear of water develop. A simple bite from one of the millions of bats could have transmitted a lyssavirus that triggers rabies. However, in this instance, the spelunkers apparently weren't bitten. Rather, it seems they caught the virus from the air itself.

A team of scientists subsequently
investigated. They found that rabies virus could be transmitted to animals housed in empty cages within the cave, apparently just via the atmosphere itself. Moreover, the virus was isolated from samples collected via air condensation techniques.

The episode raised a disturbing prospect. Had rabies, the deadliest virus for humankind, gone airborne?

To be clear, it had not, at least not in a manner that would result in ultra-contagious, human-to-human spread. The sheer number of rabies-carrying bats in the cave likely transformed it into a "hot-box" of infection. Rabies is still transmitted almost entirely through bites and scratches from infected animals, and it is rapidly inactivated by sunlight and heat. However, for safety, members of the general public are now only allowed to enter Frio Cave on guided tours that remain near the mouth of the cave.

That doesn't mean that rabies virus couldn't mutate to become transmitted through the air. It's an RNA virus, and these are known to have high mutation rates. Indeed, scientists have
found "a vast array of antigenic variants of this pathogen in a wide range of animal hosts and geographic locations."

Moreover, as two Italian scientists wrote in a 2021
article, "Even single amino acid mutations in the proteins of Rabies virus can considerably alter its biological characteristics, for example increasing its pathogenicity and viral spread in humans, thus making the mutated virus a tangible menace for the entire mankind."

Another possible route for this to occur would be through a "gain-of-function" experiment, in which researchers employ gene-editing to tweak the rabies virus, making it evade
current vaccines and endowing it with the ability to spread through the air like measles or influenza. Gain-of-function research has earned increased public scrutiny of late as there's a small, outside chance it may have produced SARS-CoV-2, the virus that causes Covid-19.

Paul Offit, a professor of pediatrics at the Children's Hospital of Philadelphia and co-inventor of a rotavirus vaccine,
commented on the potential to augment rabies through gain-of-function in a recent Substack post.

"In the absence of an effective vaccine, it could eliminate humans from the face of the earth. The good news is that no one has tried to make rabies virus more contagious. But that doesn’t mean that it’s not possible or that no one would be willing to try."


引用網址:https://city.udn.com/forum/trackback.jsp?no=2976&aid=7231232
What Is the “Three-Body Problem,” and Is It Really Unsolvable? - Skyler Ware
胡卜凱

Continuing the popular-science coverage of the “three-body (interaction) problem.” See the 2024/04/13 post in this column, as well as two other related pieces, article 1 and article 2 (2024/04/12 in that column).


What is the 3-body problem, and is it really unsolvable?

Skyler Ware, 06/06/24

The three-body problem is a physics conundrum that has boggled scientists since Isaac Newton's day. But what is it, why is it so hard to solve and is the sci-fi series of the same name really possible?

The real-life “Tatooine” planet Kepler-16b orbits two suns at once, illustrating the infamous three-body problem. (Image credit: NASA/JPL-Caltech) 請至原網頁觀看示意圖

A rocket launch. Our nearest stellar neighbor. A Netflix show. All of these things have something in common: They must contend with the "three-body problem." But exactly what is this thorny physics conundrum?

The three-body problem describes a system containing three bodies that exert gravitational forces on one another. While it may sound simple, it's a notoriously tricky problem and "the first real worry of Newton," Billy Quarles, a planetary dynamicist at Valdosta State University in Georgia, told Live Science.

In a system of only two bodies, like a planet and a star, calculating how they'll move around each other is fairly straightforward: Most of the time, those two objects will orbit roughly in a circle around their center of mass, and they'll come back to where they started each time. But add a third body, like another star, and things get a lot more complicated. The third body attracts the two orbiting each other, pulling them out of their predictable paths.

The motion of the three bodies depends on their starting state — their positions, velocities and masses. If even one of those variables changes, the resulting motion could be completely different.

"I think of it as if you're walking on a mountain ridge," Shane Ross, an applied mathematician at Virginia Tech, told Live Science. "With one small change, you could either fall to the right or you could fall to the left. Those are two very close initial positions, and they could lead to very different states."

There aren't enough constraints on the motions of the bodies to solve the three-body problem with equations, Ross said.

But some solutions to the three-body problem have been found. For example, if the starting conditions are just right, three bodies of equal mass could chase one another in a figure-eight pattern. Such tidy solutions are the exception, however, when it comes to real systems in space.

Certain conditions can make the three-body problem easier to parse. Consider Tatooine, Luke Skywalker's fictional home world from "Star Wars" — a single planet orbiting two suns. Those two stars and the planet make up a three-body system. But if the planet is far enough away and orbiting both stars together, it's possible to simplify the problem.

This artist's impression illustrates Kepler-16b, the first directly detected circumbinary planet, that is, a planet that orbits two stars. (Image credit: NASA/JPL-Caltech)
請至原網頁查看照片

"When it's the Tatooine case, as long as you're far enough away from the central binary, then you think of this object as just being a really fat star," Quarles said. The planet doesn't exert much force on the stars because it's so much less massive, so the system becomes similar to the more easily solvable two-body problem. So far, scientists have found more than a dozen Tatooine-like exoplanets, Quarles told Live Science.

But often, the orbits of the three bodies never truly stabilize, and the three-body problem gets "solved" with a bang. The gravitational forces could cause two of the three bodies to collide, or they could fling one of the bodies out of the system forever — a possible source of "rogue planets" that don't orbit any star, Quarles said. In fact, three-body chaos may be so common in space that scientists estimate there may be 20 times as many rogue planets as there are stars in our galaxy.

When all else fails, scientists can use computers to approximate the motions of bodies in an individual three-body system. That makes it possible to predict the motion of a rocket launched into orbit around Earth, or to predict the fate of a planet in a system with multiple stars.
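The numerical approach described above can be sketched in a few lines: integrate Newton's pairwise gravity with a symplectic (leapfrog) scheme, whose energy drift stays bounded. The initial conditions below are illustrative values of my own choosing, not from the article; conserved total energy serves as the sanity check:

```python
import numpy as np

def accelerations(pos, masses, G=1.0):
    """Newtonian gravitational acceleration on each body from all the others."""
    acc = np.zeros_like(pos)
    for i in range(len(masses)):
        for j in range(len(masses)):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += G * masses[j] * r / np.linalg.norm(r) ** 3
    return acc

def leapfrog_step(pos, vel, masses, dt):
    """One kick-drift-kick leapfrog step (symplectic integrator)."""
    vel = vel + 0.5 * dt * accelerations(pos, masses)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, masses)
    return pos, vel

def total_energy(pos, vel, masses, G=1.0):
    """Kinetic plus pairwise potential energy; conserved by the exact dynamics."""
    kinetic = 0.5 * np.sum(masses * np.sum(vel ** 2, axis=1))
    potential = 0.0
    for i in range(len(masses)):
        for j in range(i + 1, len(masses)):
            potential -= G * masses[i] * masses[j] / np.linalg.norm(pos[j] - pos[i])
    return kinetic + potential

# Three equal masses in an arbitrary bound configuration with zero net momentum
# (illustrative numbers, not taken from the article).
masses = np.array([1.0, 1.0, 1.0])
pos = np.array([[1.0, 0.0], [-0.5, 0.8], [-0.5, -0.8]])
vel = np.array([[0.0, 0.4], [-0.3, -0.2], [0.3, -0.2]])

e0 = total_energy(pos, vel, masses)
for _ in range(1000):  # integrate to t = 1 with dt = 1e-3
    pos, vel = leapfrog_step(pos, vel, masses, dt=1e-3)
drift = abs(total_energy(pos, vel, masses) - e0)
print(f"relative energy drift: {drift / abs(e0):.2e}")
```

Because the system is chaotic, rerunning this with one coordinate nudged slightly produces trajectories that soon diverge, which is exactly the sensitivity to initial conditions described above.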


With all this tumult, you might wonder if anything could survive on a planet like the one featured in Netflix's "3 Body Problem," which — spoiler alert — is trapped in a chaotic orbit around three stars in the Alpha Centauri system, our solar system's nearest neighbor.

"I don't think in that type of situation, that's a stable environment for life to evolve," Ross said. That's one aspect of the show that remains firmly in the realm of science fiction.
 


Skyler Ware is a freelance science journalist covering chemistry, biology, paleontology and Earth science. She was a 2023 AAAS Mass Media Science and Engineering Fellow at Science News. Her work has also appeared in Science News Explores, ZME Science and Chembites, among others. Skyler has a Ph.D. in chemistry from Caltech.



引用網址:https://city.udn.com/forum/trackback.jsp?no=2976&aid=7230651
Genes and Intellectual Disability - Emily Cooke
胡卜凱

New genetic cause of intellectual disability potentially uncovered in 'junk DNA'

Emily Cooke, 06/01/24

Mutations in "junk DNA" could be responsible for rare genetic cases of intellectual disability, new research hints.

Scientists have uncovered a rare genetic cause of intellectual disability in a historically overlooked part of the human genome: so-called junk DNA.

This knowledge could someday help to diagnose some patients with these disorders, the researchers say.

An intellectual disability is a neurodevelopmental disorder that
appears during childhood and is characterized by intellectual difficulties that impact people's learning, practical skills and ability to live independently. Such conditions affect approximately 6.5 million Americans.

Factors such as
complications during birth can trigger intellectual disabilities. However, in most cases, the disorders have an underlying genetic cause. So far, around 1,500 genes have been linked with various intellectual disabilities — but clinicians are still not always able to identify the specific cause of every patient's condition.

One possible explanation for this gap in knowledge is that
previous approaches for reading DNA have only focused on a tiny portion of it. Specifically, they've looked at the roughly 2% of the genome that codes for proteins, known as coding DNA. About 98% of the genome contains DNA that doesn't code for proteins. This DNA was once considered "junk DNA," but scientists are now discovering that it actually performs critical biological functions.

In a new study, published Friday (May 31) in the journal
Nature Medicine, scientists used whole-genome sequencing technology to identify a rare genetic mutation within non-coding DNA that seems to contribute to intellectual disability.

The team compared the whole genomes of nearly 5,530 people who have a diagnosed intellectual disability to those of about 46,400 people without the conditions. These data were gathered from the U.K.-based
100,000 Genomes Project.

The researchers discovered that 47 of the people with intellectual disabilities — about 0.85% — carried mutations in a gene called RNU4-2. They then validated this finding in three additional large, independent genetic databases, bringing the total number of cases to 73.
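The quoted figure of about 0.85% follows directly from the reported counts; a quick check:

```python
# Counts as reported in the text above.
cases_with_mutation = 47
cohort_size = 5530  # "nearly 5,530 people" with a diagnosed intellectual disability

prevalence_pct = 100 * cases_with_mutation / cohort_size
print(f"{prevalence_pct:.2f}%")  # matches the article's "about 0.85%"
```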

RNU4-2 doesn't code for proteins but rather for an
RNA molecule, a cousin of DNA; RNA's code can either be translated into proteins or stand on its own as a functional molecule. The RNA made by RNU4-2 makes up part of a molecular complex called the spliceosome. The spliceosome helps to refine RNA molecules after their codes are copied down from DNA by "splicing" out certain snippets of the code.

To further determine the prevalence of this new disorder, the team then launched a separate analysis where they looked at the genomes of another 5,000 people in the U.K. who'd been diagnosed with "neurodevelopmental abnormality." This is a
term that refers to any deviation from "normal" in the neurodevelopment of a child.

The team's analysis revealed that, out of those 5,000 people, 21 carried mutations in RNU4-2. That made the mutations the second most common type seen in the overall group, following mutations on the X chromosome known to cause a disorder called Rett syndrome. If changes in RNU4-2 can be confirmed as a cause of intellectual disability, this finding hints that the mutations may contribute significantly to a variety of conditions.

The new study joins a second that also linked RNU4-2 to intellectual disabilities. The research has opened up "an exciting new avenue in ID [intellectual disability] research," Catherine Abbott, a professor of molecular genetics at the University of Edinburgh in the U.K. who was not involved in either study, told Live Science in an email.

"These findings reinforce the idea that ID can often result from mutations that have a cumulative downstream effect on the expression of hundreds of other genes," Abbott said. RNA molecules that don't make proteins often help control the activity of genes, turning them on or off. The findings also stress the importance of sequencing the whole genome rather than just coding DNA, she said.

The scientists behind the new study say the findings could be used to diagnose certain types of intellectual disability.

The team now plans to investigate the precise mechanism by which RNU4-2 causes intellectual disabilities — for now, they've only uncovered a strong correlation.


Emily Cooke is a health news writer based in London, United Kingdom. She holds a bachelor's degree in biology from Durham University and a master's degree in clinical and therapeutic neuroscience from Oxford University. She has worked in science communication, medical writing and as a local news reporter while undertaking journalism training. In 2018, she was named one of MHP Communications' 30 journalists to watch under 30. (emily.cooke@futurenet.com)



引用網址:https://city.udn.com/forum/trackback.jsp?no=2976&aid=7230231
A New Theory of Gravity in Cosmology -- Claudia de Rham
胡卜凱

If I have not misunderstood Professor de Rham, she is attempting to use a concept that is (at present) difficult to measure, the “massive graviton,” to replace a concept that (at present) cannot be measured at all, “dark matter,” in order to build a new theory of gravity that explains the accelerating expansion of the universe.

I am not a physicist; I can only say that, if I have not misunderstood her, this piece reads a bit like a play on words.


New theory of gravity solves accelerating universe

Massive Gravity and the end of dark energy

Claudia de Rham, 05/03/24

The universe is expanding at an accelerating rate, but Einstein’s theory of General Relativity and our knowledge of particle physics predict that this shouldn’t be happening. Most cosmologists pin their hopes on Dark Energy to solve the problem. But, as Claudia de Rham argues, Einstein’s theory of gravity is incorrect over cosmic scales; her new theory of Massive Gravity limits gravity’s force in this regime, explains why the acceleration is happening, and eliminates the need for Dark Energy.


The beauty of cosmology is that it often connects the infinitely small with the infinitely big – but within this beauty lies the biggest embarrassment in the history of physics.

According to Einstein’s theory of General Relativity and our knowledge of particle physics, the accumulated effect of all infinitely small quantum fluctuations in the cosmos should be so dramatic that the Universe itself should be smaller than the distance between the Earth and the Moon. But as we all know, our Universe spans over tens of billions of light years: it clearly stretches well beyond the moon.

This is the “Cosmological Constant Problem”. Far from being only a small technical annoyance, this problem is the biggest discrepancy in the whole history of physics. The theory of Massive Gravity, developed by my colleagues and me, seeks to address this problem.

For this, one must do two things. First, we need to explain what leads to this cosmic acceleration; second, we need to explain why it leads to the observed rate of acceleration – no more no less. Nowadays it is quite popular to address the first point by postulating a new kind of
Dark Energy fluid to drive the cosmic acceleration of the Universe. As for the second point, a popular explanation is the anthropic principle: if the Universe was accelerating at a different rate, we wouldn’t be here to ask ourselves the questions.

To my mind, both these solutions are unsatisfactory. In Massive Gravity, we don’t address the first point by postulating a new form of as yet undiscovered Dark Energy but rather by relying on what we do know to exist: the quantum nature of all the fundamental particles we are made out of, and the consequent vacuum energy, which eliminates the need for dark energy. This is a natural resolution for the first point and would have been adopted by scientists a long time ago if it wasn’t for the second point: how can we ensure that the immense levels of vacuum energy that fill the Universe don’t lead to too fast an acceleration? We can address this by effectively changing the laws of gravity on cosmological scales and by constraining the effect of vacuum energy.

The Higgs Boson and the Nature of Nothingness

At first sight, our Universe seems to be filled with a multitude of stars within galaxies. These galaxies are gathered in clusters surrounded by puffy “clouds” of dark matter. But is that it? Is there anything in between these clusters of galaxies plugged in filaments of dark matter? Peeking directly through our instruments, most of our Universe appears to be completely empty, with empty cosmic voids stretching between clusters of galaxies. There are no galaxies, nor gas, nor dark matter nor anything else really tangible we can detect within these cosmic voids. But are they completely empty and denuded of energy? To get a better picture of what makes up “empty space,” it is useful to connect with the fundamental particles that we are made of.

I still vividly remember watching the announcement of the discovery of the Higgs boson in 2012. By now most people have heard of this renowned particle and how it plays an important role in our knowledge of particle physics, as well as how it is responsible for giving other particles mass. However, what’s even more remarkable is that the discovery of the Higgs boson and its mechanism reveals fundamental insights into our understanding of nothingness.

To put it another way, consider “empty space" as an area of space where everything has been wiped away down to the last particle. The discovery of the Higgs boson indicates that even such an ideal vacuum is never entirely empty: it is constantly bursting with quantum fluctuations of all known particles, notably that of the Higgs.

This collection of quantum fluctuations I'll refer to as the “Higgs bath.” The Higgs bath works as a medium, influencing other particles swimming in it. Light or massless particles, such as photons, don’t care very much about the bath and remain unaffected. Other particles, such as the W and Z bosons that mediate the Weak Force, interact intensely with the Higgs bath and inherit a significant mass. As a result of their mass the Weak Force they mediate is fittingly weakened.

Accelerating Expansion

When zooming out to the limits of our observable Universe we have evidence that the Universe is expanding at an accelerating speed, a discovery that led to the 2011 Nobel Prize in Physics. This is contrary to what we would have expected if most of the energy in the Universe was localized around the habitable regions of the Universe we are used to, like clusters of galaxies within the filaments of Dark Matter. In this scenario, we would expect the gravitational attraction pulling between these masses to lead to a decelerating expansion.

So what might explain our observations of acceleration? We seem to need something which fights back against gravity but isn’t strong enough to tear galaxies apart, which exists everywhere evenly and isn't diluted by the expansion of the cosmos. This “something” has been called Dark Energy.

A different option is vacuum energy. We’ve long known that the sea of quantum fluctuations has dramatic effects on other particles, as with the Higgs Bath, so it’s natural to ask about its effect on our Universe. In fact, scientists have been examining the effect of this vacuum energy for more than a century, and long ago realized that its effects on cosmological scales should lead to an accelerated expansion of the Universe, even before we had observations that indicated that acceleration was actually happening. Now that we know the Universe is in fact accelerating, it is natural to go back to this vacuum energy and estimate the expected rate of cosmic acceleration it leads to.

The bad news is that the rate of this acceleration would be way too fast. The estimated acceleration rate would be wrong by at least twenty-eight orders of magnitude! This is the “Cosmological Constant Problem” and is also referred to as the “Vacuum Catastrophe.”

Is our understanding of the fundamental particles incorrect? Or are we using Einstein’s theory of General Relativity in a situation where it does not apply?

The Theory of Massive Gravity

Very few possibilities have been suggested. The one I would like to consider is that General Relativity may not be the correct description of gravity at large cosmological scales where gravity remains untested.

In Einstein’s theory of gravity, the graviton like the photon is massless and gravity has an infinite reach. This means objects separated by cosmological scales, all the way up to the size of the Universe, are still under the gravitational influence of each other and of the vacuum energy that fills the cosmos between them. Even though locally the effect of this vacuum energy is small, when you consider its effect accumulated over the whole history and volume of the Universe, its impact is gargantuan, bigger than everything else we can imagine, so that the cosmos would be dominated by its overall effect. Since there is a lot of vacuum energy to take into account, this leads to a very large acceleration, much larger than what we see, which again is the Cosmological Constant Problem. The solution my colleagues and I have suggested is that perhaps we don’t need to account for all this vacuum energy. If we only account for a small fraction of it, then it would still lead to a cosmic acceleration but with a much smaller rate, compatible with the Universe in which we live.

Could it be that the gravitational connection that we share with the Earth, with the rest of the Galaxy and our local cluster only occurs because we are sufficiently
close to one another?

Could it be that we do not share that same gravitational connection with very distant objects, for instance with distant stars located on the other side of the Universe some 10 thousand million trillion km away? If that were the case, there would be far less vacuum energy to consider and this would lead to a smaller cosmic acceleration, resolving the problem.

In practice, what we need to do is understand how to weaken the range of gravity. But that’s easy: nature has already showed us how to do that. We know that the Weak Force is weak and has a finite range distance because the W and Z bosons that carry it are massive particles. So, in principle, all we have to do is simply to give a mass to the graviton. Just as Einstein himself tried to include a Cosmological Constant in his equations, what we need to do is add another term which acts as the mass of the graviton, dampening the dynamics of gravitational waves and limiting the range of gravity. By making the graviton massive we now have ourselves a theory of “massive gravity.” Easy!
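The weakening mechanism described above is the standard Yukawa suppression: a force carried by a particle of mass m falls off as e^(−mr)/r rather than 1/r, so it effectively switches off beyond the Compton wavelength 1/m. A minimal sketch in natural units (the numbers are illustrative, not from the article):

```python
import math

def newtonian(r, G=1.0, M=1.0):
    """Massless-graviton (infinite-range) potential: V = -GM/r."""
    return -G * M / r

def yukawa(r, m_g, G=1.0, M=1.0):
    """Potential mediated by a massive carrier: V = -GM * exp(-m_g * r) / r
    (natural units; 1/m_g is the carrier's Compton wavelength, i.e. the force's range)."""
    return -G * M * math.exp(-m_g * r) / r

m_g = 1.0  # graviton mass; range of gravity ~ 1/m_g
# Well inside the range the two agree; far outside, Yukawa is exponentially suppressed.
for r in (0.01, 1.0, 100.0):
    ratio = yukawa(r, m_g) / newtonian(r)
    print(f"r = {r:6}: V_yukawa / V_newton = {ratio:.3e}")
```

The ratio is exactly e^(−m_g·r), which is why vacuum energy spread across cosmological distances would contribute far less to the local acceleration if the graviton were massive.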

The possibility that gravity could have a finite range is not a question for science-fiction.

In fact Newton, Laplace and many other incredible scientists after them contemplated the possibility. Even following our development of quantum mechanics and the Standard Model, many including Pauli and Salam considered the possibility of gravitons with mass.

But that possibility was always entirely refuted! Not because it potentially contradicts observations – quite the opposite, it could solve the vacuum catastrophe and explain why our Universe’s expansion is accelerating – but rather because models of massive gravity appeared to be haunted by “ghosts.” Ghosts are particles with negative energies that would cause everything we know, including you, me, the whole Universe, and possibly the structure of space and time to decay instantaneously. So if you want a theory of massive gravity you need to either find a way to get rid of these ghosts or to “trap” them.

For decades, preventing these supernatural occurrences seemed inconceivable. That is, until Gregory Gabadadze, Andrew Tolley and I found a way to engineer a special kind of “ghost trap” that allowed us to trick the ghost into living in a constrained space and doing no harm. One can think of this as an Escherian impossible staircase, an infinite loop in which the ghosts may exist and move but ultimately end up nowhere.

Coming up with a new trick was one thing, but convincing the scientific community required even more ingenuity and mental flexibility. Even in science, there are many cultures and mathematical languages or scientific arguments that individuals prefer. So, throughout the years, whenever a new colleague had a different point of view, we were bound to learn their language, translate, and adjust our reasoning to their way of thinking. We had to repeat the process for years until no stone remained unturned. Overcoming that process was never the goal, though: it was only the start of the journey that would allow us to test our new theory of gravity.

If gravitons have a mass, this mass should be tiny, smaller than the mass of all the other massive particles, even lighter than the neutrino. Consequently, detecting it may not be straightforward. Nevertheless, different features will appear which may make it possible to measure it.

The most promising way to test for massive gravity involves observations of gravitational waves. If gravitons are massive, then we’d expect that low-frequency gravitational waves will travel ever so slightly slower than high-frequency ones. Unfortunately, this difference would be too slight to measure with current ground-based observatories. However, we should have better luck with future observatories. Missions like the Pulsar Timing Array, LISA and the Simons Observatory will detect gravitational waves with smaller and smaller frequencies, making possible the observations we need.

Whether the massive gravity theory developed by my collaborators and me will survive future tests is of course presently unknown, but the possibility is now open. After all, even if the outcome isn’t certain, when it comes to challenging the biggest discrepancy in the whole history of science, addressing the Cosmological Constant Problem, eliminating the need for dark energy, and reconciling the effect of vacuum energy with the evolution of the Universe, some risks may be worth taking.


Claudia de Rham is a theoretical physicist working at the interface of gravity, cosmology and particle physics at Imperial College London. She has recently published her first book, The Beauty of Falling: A Life in Pursuit of Gravity, with Princeton University Press.

This article is presented in partnership with Closer To Truth, an esteemed partner for the 2024 HowTheLightGetsIn Hay Festival. Dive deeper into the profound questions of the universe with thousands of video interviews, essays, and full episodes of the long-running TV show at their website: www.closertotruth.com.

You can see Claudia de Rham live, debating in ‘Dark Energy and The Universe’ alongside Priya Natarajan and Chris Lintott and ‘Faster Than Light’ with Tim Maudlin and João Magueijo at the upcoming HowTheLightGetsIn Festival on May 24th-27th in Hay-on-Wye.



引用網址:https://city.udn.com/forum/trackback.jsp?no=2976&aid=7228963