The Latest "Theory of Everything" --- Physics arXiv Blog
Topology: The Secret Ingredient In The Latest Theory of Everything
Combine topology with symmetry and add a sprinkling of quantum mechanics. The result? A powerful new theory of everything
The Physics arXiv Blog, 10/08/12
Topology is the study of shape, in particular the properties that are preserved when a shape is squeezed, stretched and battered but not torn or ripped.
In the past, topology was little more than an amusing diversion for mathematicians doodling about the difference between donuts and dumplings.
But that is beginning to change. In recent years, physicists have begun to use topology to explain some of the most important puzzles at the frontiers of physics.
For example, certain quantum particles cannot form pairs but do form triplets called Efimov states. That's curious -- surely the bonds that allow three particles to bond together should also allow two to become linked?
Actually, no and topology explains why. The reason is that the mathematical connection between these quantum particles takes the form of a Borromean ring: three circles intertwined in such a way that cutting one releases the other two. Only three rings can be connected in this way, not two. Voila!
But this kind of topological curiosity is merely the tip of the iceberg if Xiao-Gang Wen at the Perimeter Institute for Theoretical Physics, in Waterloo, Canada, is to be believed.
Today, Wen combines topology, symmetry and quantum mechanics in a new theory that predicts the existence of new states of matter, unifies various puzzling phenomena in solid state physics and allows the creation of artificial vacuums populated with artificial photons and electrons.
So where to start? Wen begins by explaining the fundamental role of symmetry in the basic states of matter such as liquids and solids. A symmetry is a property that remains invariant under a transformation of some kind.
So in a liquid, for example, atoms are randomly distributed and so the liquid looks the same if it is displaced in any direction by any distance. Physicists say it has a continuous translation symmetry.
However, when a liquid freezes, the atoms become locked into a crystal lattice and a different symmetry applies. In this case, the lattice only appears the same if it is displaced along the crystal axis by a specific distance. So the material now has discrete translation symmetry and the original symmetry is broken.
In other words, when the material undergoes a phase change, it also undergoes a change in symmetry, a process that physicists call symmetry breaking.
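Stated as equations, this is a sketch of the two symmetries (ρ here stands for the material's average density; the notation is ours, not the article's):

```latex
% Liquid: continuous translation symmetry -- the density is invariant
% under a shift by any displacement vector a
\rho(\mathbf{r} + \mathbf{a}) = \rho(\mathbf{r}) \quad \text{for all } \mathbf{a}

% Crystal: discrete translation symmetry -- invariance survives only for
% integer multiples of the lattice vector a_0
\rho(\mathbf{r} + n\,\mathbf{a}_0) = \rho(\mathbf{r}) \quad \text{for } n \in \mathbb{Z}
```

Freezing shrinks the symmetry group from all displacements to the lattice displacements; that shrinkage is the symmetry breaking.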
But in addition to the four ordinary phases of matter -- liquid, solid, gas and plasma -- physicists have discovered many quantum phases of matter such as superconductivity, superfluidity, and so on.
These phases are also the result of symmetry breaking but symmetry alone cannot explain what's going on.
So physicists have turned to topology to help. It turns out that the mathematics of quantum mechanics has topological properties that, when combined with symmetry, explain how these phases form.
This kind of work has led to the discovery of additional phases of matter such as topological conductors and insulators.
The important point here is that the properties of these systems are guaranteed not by the ordinary laws of physics but by the topological properties of quantum mechanics, just like the Borromean rings that explain the Efimov states described earlier.
Xiao-Gang Wen's approach is to explore the properties of matter when the topological links between particles become much more general and complex. He generalises these links, thinking of them as strings that can connect many particles together. In fact, he considers the way many strings can form net-like structures that have their own emergent properties.
So what kind of emergent properties do these string-nets have? It turns out that string-nets are not so different from ordinary matter. String-nets can support waves, which Xiao-Gang Wen says are formally equivalent to photons.
That makes string nets a kind of "quantum ether" through which electromagnetic waves travel. That's a big claim.
Wen also says that various properties of string nets are equivalent to fundamental particles such as electrons. And that it may be possible to derive the properties of other particles too. That's another big idea.
Of course, no theory is worth more than a bag of beans unless it makes testable predictions about the universe.
Wen says that his theory has significant implications for the states of matter that existed soon after the Big Bang but doesn't develop the idea into specific predictions.
Presumably, the same ought to be true of other extreme astrophysical phenomena. For example, it'd be interesting to see what conditions this kind of approach places on the nature of black holes.
Wen also says that it ought to be possible to manipulate the topological properties of materials to create artificial vacuums complete with artificial photons and artificial particles like electrons. In other words, topology is the key to creating entirely new worlds in the lab.
Clearly, Wen's ideas will take some digesting. And the implications he discusses need to be firmed up into specific experimental predictions.
But it's not the first time we've come across the notion that topology plays a more fundamental role in the universe than anyone imagined. We explored a similar idea a couple of years ago.
Physicists have known for many decades that symmetry plays a powerful role in the laws of physics. In fact, it's fair to say that symmetry has changed the way we think about the universe.
It's just possible that adding topology to the mix could be equally revolutionary.
Ref: arxiv.org/abs/1210.1281: Topological Order: From Long-Range Entangled Quantum Matter To An Unification Of Light And Electrons
http://www.technologyreview.com/view/429528/topology-the-secret-ingredient-in-the-latest/
"Theory of Everything" 2.1 - A. Gefter
Theoretical physics: Complexity on the horizon
Amanda Gefter, 05/30/14
When physicist Leonard Susskind gives talks these days, he often wears a black T-shirt proclaiming “I ♥ Complexity”. In place of the heart is a Mandelbrot set, a fractal pattern widely recognized as a symbol for complexity at its most beautiful.
That pretty much sums up his message. The 74-year-old Susskind, a theorist at Stanford University in California, has long been a leader in efforts to unify quantum mechanics with the general theory of relativity -- Albert Einstein's framework for gravity. The quest for the elusive unified theory has led him to advocate counter-intuitive ideas, such as superstring theory or the concept that our three-dimensional Universe is actually a two-dimensional hologram. But now he is part of a small group of researchers arguing for a new and equally odd idea: that the key to this mysterious theory of everything is to be found in the branch of computer science known as computational complexity.
This is not a subfield to which physicists have tended to look for fundamental insight. Computational complexity is grounded in practical matters, such as how many logical steps are required to execute an algorithm. But if the approach works, says Susskind, it could resolve one of the most baffling theoretical conundrums to hit his field in recent years: the black-hole firewall paradox, which seems to imply that either quantum mechanics or general relativity must be wrong. And more than that, he says, computational complexity could give theorists a whole new way to unify the two branches of their science -- using ideas based fundamentally on information.
Behind a firewall
It all began 40 years ago, when physicist Stephen Hawking at the University of Cambridge, UK, realized that quantum effects would cause a black hole to radiate photons and other particles until it completely evaporates away.
As other researchers were quick to point out, this revelation brings a troubling contradiction. According to the rules of quantum mechanics, the outgoing stream of radiation has to retain information about everything that ever fell into the black hole, even as the matter falling in carries exactly the same information through the black hole's event horizon, the boundary inside which the black hole's gravity gets so strong that not even light can escape. Yet this two-way flow could violate a key law of quantum mechanics known as the no-cloning theorem, which dictates that making a perfect copy of quantum information is impossible.
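The no-cloning theorem follows from the linearity of quantum mechanics alone. A standard sketch: suppose a single unitary operation $U$ could copy two different states $|\psi\rangle$ and $|\phi\rangle$ onto a blank register $|0\rangle$:

```latex
% Cloning assumption for two states:
U\,|\psi\rangle|0\rangle = |\psi\rangle|\psi\rangle,
\qquad
U\,|\phi\rangle|0\rangle = |\phi\rangle|\phi\rangle

% Unitarity preserves inner products, so
\langle\psi|\phi\rangle = \langle\psi|\phi\rangle^{2}
\;\Rightarrow\; \langle\psi|\phi\rangle \in \{0,\,1\}
```

So any two clonable states must be either identical or orthogonal; a machine that perfectly copies an arbitrary unknown state is impossible.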
Happily, as Susskind and his colleagues observed [1] in 1995, nature seemed to sidestep any such violation by making it impossible to see both copies at once: an observer who remains outside the horizon cannot communicate with one who has fallen in. But in 2012, four physicists at the University of California, Santa Barbara -- Ahmed Almheiri, Donald Marolf, Joseph Polchinski and James Sully, known collectively as AMPS -- spotted a dangerous exception to this rule [2]. They found a scenario in which an observer could decode the information in the radiation, jump into the black hole and then compare that information with its forbidden duplicate on the way down.
AMPS concluded that nature prevents this abomination by creating a blazing firewall just inside the horizon that will incinerate any observer -- or indeed, any particle -- trying to pass through. In effect, space would abruptly end at the horizon, even though Einstein's gravitational theory says that space must be perfectly continuous there. If AMPS's theory is true, says Raphael Bousso, a theoretical physicist at the University of California, Berkeley, “this is a terrible blow to general relativity”.
Does not compute
Fundamental physics has been in an uproar ever since, as practitioners have struggled to find a resolution to this paradox. The first people to bring computational complexity into the debate were Stanford’s Patrick Hayden, a physicist who also happens to be a computer scientist, and Daniel Harlow, a physicist at Princeton University in New Jersey. If the firewall argument hinges on an observer's ability to decode the outgoing radiation, they wondered, just how hard is that to do?
Impossibly hard, they discovered. A computational-complexity analysis showed that the number of steps required to decode the outgoing information would rise exponentially with the number of radiation particles that carry it. No conceivable computer could finish the calculations until long after the black hole had radiated all of its energy and vanished, along with the forbidden information clones. So the firewall has no reason to exist: the decoding scenario that demands it cannot happen, and the paradox disappears.
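The force of this argument is the mismatch between an exponential and a polynomial. A toy illustration (the growth laws and numbers below are illustrative stand-ins, not the actual Harlow-Hayden calculation):

```python
# Toy comparison (stand-in growth laws, not the Harlow-Hayden result):
# the decoding cost is assumed to grow exponentially with the number of
# radiation particles n, while the black hole's lifetime grows only
# polynomially, so the decoding cannot finish before evaporation.

def decoding_steps(n: int) -> int:
    """Assumed exponential decoding cost: 2**n steps."""
    return 2 ** n

def evaporation_steps(n: int) -> int:
    """Assumed polynomial evaporation time: n**3 steps."""
    return n ** 3

# Beyond a modest n, decoding always outlasts evaporation.
for n in (10, 20, 40, 80):
    too_slow = decoding_steps(n) > evaporation_steps(n)
    print(f"n={n}: decoding outlasts evaporation? {too_slow}")
```

Whatever the exact polynomial, an exponential overtakes it; that is the sense in which no conceivable computer can finish in time.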
Hayden was sceptical of the result at first. But then he and Harlow found much the same answer for many types of black hole [3]. “It did seem to be a robust principle,” says Hayden: “a conspiracy of nature preventing you from performing this decoding before the black hole had disappeared on you.”
The Harlow–Hayden argument made a big impression on Scott Aaronson, who works on computational complexity and the limits of quantum computation at the Massachusetts Institute of Technology in Cambridge. “I regard what they did as one of the more remarkable syntheses of physics and computer science that I've seen in my career,” he says.
It also resonated strongly among theoretical physicists. But not everyone is convinced. Even if the calculation is correct, says Polchinski, “it is hard to see how one would build a fundamental theory on this framework”. Nevertheless, some physicists are trying to do just that. There is a widespread belief in the field that the laws of nature must somehow be based on information. And the idea that the laws might actually be upheld by computational complexity -- which is defined entirely in terms of information -- offers a fresh perspective.
It certainly inspired Susskind to dig deeper into the role of complexity. For mathematical clarity, he chose to make his calculations in a theoretical realm known as anti-de Sitter space (AdS). This describes a cosmos that is like our own Universe in the sense that everything in it, including black holes, is governed by gravity. Unlike our Universe, however, it has a boundary -- a domain where there is no gravity, just elementary particles and fields governed by quantum physics. Despite this difference, studying physics in AdS has led to many insights, because every object and physical process inside the space can be mathematically mapped to an equivalent object or process on its boundary. A black hole in AdS, for example, is equivalent to a hot gas of ordinary quantum particles on the boundary. Better still, calculations that are complicated in one domain often turn out to be simple in the other. And after the calculations are complete, the insights gained in AdS can generally be translated back into our own Universe.
Increasing complexity
Susskind decided to look at a black hole sitting at the centre of an AdS universe, and to use the boundary description to explore what happens inside a black hole's event horizon. Others had attempted this and failed, and Susskind could see why after he viewed the problem through the lens of computational complexity. Translating from the boundary of the AdS universe to the interior of a black hole requires an enormous number of computational steps, and that number increases exponentially as one moves closer to the event horizon [4]. As Aaronson puts it, “the black hole's interior is protected by an armour of computational complexity”.
Furthermore, Susskind noticed, the computational complexity tends to grow with time. This is not the increase of disorder, or entropy, that is familiar from everyday physics. Rather, it is a pure quantum effect arising from the way that interactions between the boundary particles cause an explosive growth in the complexity of their collective quantum state.
If nothing else, Susskind argued, this growth means that complexity behaves much like a gravitational field. Imagine an object floating somewhere outside the black hole. Because this is AdS, he said, the object can be described by some configuration of particles and fields on the boundary. And because the complexity of that boundary description tends to increase over time, the effect is to make the object move towards regions of higher complexity in the interior of the space. But that, said Susskind, is just another way of saying that the object will be pulled down towards the black hole. He captured that idea in a slogan [4]: “Things fall because there is a tendency toward complexity.”
Another implication of increasing complexity turns out to be closely related to an argument [5] that Susskind made last year in collaboration with Juan Maldacena, a physicist at the Institute for Advanced Study in Princeton, New Jersey, and the first researcher to recognize the unique features of AdS. According to general relativity, Susskind and Maldacena noted, two black holes can be many light years apart yet still have their interiors connected by a space-time tunnel known as a wormhole. But according to quantum theory, these widely separated black holes can also be connected by having their states 'entangled', meaning that information about their quantum states is shared between them in a way that is independent of distance.
After exploring the many similarities between these connections, Susskind and Maldacena concluded that they were two aspects of the same thing -- that the black hole's degree of entanglement, a purely quantum phenomenon, will determine the wormhole's width, a matter of pure geometry.
With his latest work, Susskind says, it turns out that the growth of complexity on the boundary of AdS shows up as an increase in the wormhole's length. So putting it all together, it seems that entanglement is somehow related to space, and that computational complexity is somehow related to time.
Susskind is the first to admit that such ideas by themselves are only provocative suggestions; they do not make up a fully fledged theory. But he and his allies are confident that the ideas transcend the firewall paradox.
“I don't know where all of this will lead,” says Susskind. “But I believe these complexity–geometry connections are the tip of an iceberg.”
Nature 509, 552–553 (29 May 2014). doi:10.1038/509552a
Corrections
This article inadvertently underplayed the role of Daniel Harlow in bringing computational complexity to fundamental physics -- he worked with Patrick Hayden from the start of their project. The text has been corrected to reflect this.
References
1. Lowe, D. A., Polchinski, J., Susskind, L., Thorlacius, L. & Uglum, J. Phys. Rev. D 52, 6997 (1995).
2. Almheiri, A., Marolf, D., Polchinski, J. & Sully, J. J. High Energy Phys. 2013, 62 (2013).
3. Harlow, D. & Hayden, P. J. High Energy Phys. 2013, 85 (2013).
4. Susskind, L. Preprint available at http://arxiv.org/abs/1402.5674 (2014).
5. Maldacena, J. & Susskind, L. Fortschr. Phys. 61, 781–811 (2013).
http://www.nature.com/news/theoretical-physics-complexity-on-the-horizon-1.15285
"Theory of Everything" 2.0 - Z. Merali
A Meta-Law to Rule Them All: Physicists Devise a “Theory of Everything”
“Constructor theory” unites in one framework how information is processed in the classical and quantum realms
Zeeya Merali, Scientific American, 05/26/14
“Once you have eliminated the impossible,” the fictional detective Sherlock Holmes famously opined, “whatever remains, however improbable, must be the truth.” That adage forms the foundational principle of “constructor theory” -- a candidate “theory of everything” first sketched out by David Deutsch, a quantum physicist at the University of Oxford, in 2012. His aim was to find a framework that could encompass all physical theories by determining a set of overarching “meta-laws” that describe what can happen in the universe and what is forbidden. In a May 23 paper posted to the physics preprint server, arXiv, constructor theory claims its first success toward that goal by unifying the two separate theories that are currently used to describe information processing in macroscopic, classical systems as well as in subatomic, quantum objects.
Computer scientists currently use a theory developed by the American mathematician and cryptographer Claude Shannon in the 1940s to describe how classical information can be encoded and transmitted across noisy channels efficiently -- what, for instance, is the most data that can be streamed, in theory, down a fiber-optic cable without becoming irretrievably corrupted. At the same time, physicists are striving to build quantum computers that could, in principle, exploit peculiar aspects of the subatomic realm to perform certain tasks at a far faster rate than today’s classical machines.
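Shannon's result makes the fiber-optic claim quantitative. A minimal sketch using the Shannon-Hartley capacity formula (the bandwidth and signal-to-noise figures below are invented for illustration):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit C = B * log2(1 + S/N): the maximum rate at
    which information can cross a noisy channel with arbitrarily small
    error, no matter how clever the encoding scheme."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# A hypothetical 1 GHz channel at 30 dB SNR (linear ratio 1000) tops
# out just under 10 Gbit/s, regardless of the coding used.
limit_bps = shannon_capacity(1e9, 1000.0)
print(f"capacity = {limit_bps / 1e9:.2f} Gbit/s")
```

It is exactly this kind of hard ceiling that has no quantum counterpart in Shannon's framework, which is the gap Deutsch and Marletto are trying to close.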
But the principles defined by Shannon’s theory cannot be applied to information processing by quantum computers. In fact, Deutsch notes, physicists have no clear definition for what “quantum information” even is or how it relates to classical information. “If we want to make progress in finding new algorithms for quantum computers, we need to understand what quantum information actually is!” he says. “So far, the algorithms that have been discovered for quantum computers have been surprises that were discovered by blundering about because we have no underlying theory to guide us.”
In 2012 Deutsch outlined constructor theory, which, he believes, could provide the underlying foundation for a grand unification of all theories in both the classical and quantum domains. This latest paper is a first step toward that larger goal -- a demonstration of how classical and quantum information can be used to unify two physical theories. According to constructor theory, the most fundamental components of reality are entities -- “constructors” -- that perform particular tasks, accompanied by a set of laws that define which tasks are actually possible for a constructor to carry out. For instance, a kettle with a power supply can serve as a constructor that can perform the task of heating water. “The language of constructor theory gives a natural way to describe the most fundamental principles that must be obeyed by all subsidiary theories, like conservation of energy,” explains Chiara Marletto, a quantum physicist also at Oxford, who co-authored the new paper. “You simply say that the task of creating energy from nothing is impossible.”
Dean Rickles, a philosopher of physics at the University of Sydney who was not involved in the development of the theory, is intrigued by its potential to unify nature’s laws. “It’s a very curious new take on the notion of a theory of everything,” he says. “In principle, everything possible in our universe could be written down in a big book consisting of nothing but tasks [and in] this big book will also be encoded all of the laws of physics.”
To develop their description of information, Deutsch and Marletto homed in on one key task that is possible in classical systems but impossible in quantum systems: the ability to make a copy. Since the 1980s physicists have known that it is impossible to make an identical copy of an unknown quantum state. In their new paper Deutsch and Marletto define a classical information medium as one in which states can all be precisely copied. They then work out which tasks must be possible in such a system to remain in line with Shannon’s theory.
The collaborators then go on to define the concept of a “superinformation” medium that encodes messages that specify particular physical states -- in this case, one in which copying is impossible. They discovered that a special subset of their superinformation media display the properties associated with quantum information processing. “We found that with this one constraint in place telling you what you cannot do in a superinformation medium -- the task of copying -- you end up discovering the weird new information-processing power that is a property of quantum systems,” Marletto says.
The team showed that with this restriction on copying in place a number of other properties begin to emerge: Measuring the state of a superinformation medium will inevitably disturb it -- a feature commonly associated with quantum systems. But because it is forbidden to make an exact copy of certain sets of states in a superinformation medium this forces some uncertainty into the outcome of the measurement.
The team has also shown that entanglement -- the spooky property that binds quantum objects together so that they act in tandem, no matter how far apart they are -- also arises naturally, once this constraint on copying is in place. According to Marletto, the crucial property of a system containing two entangled states is that the information stored in the combined system is more than the information that can be gleaned just by examining each member of the pair individually. The quantum whole is more than the sum of its parts.
In their paper Deutsch and Marletto demonstrate that information can be encoded in two superinformation media in such a way that it is impossible to retrieve it by measuring the single subsystems separately -- that is, entanglement is inevitable. Similarly, in a classical medium, entanglement is impossible. “The appealing thing about this formalism is the way that common features of quantum mechanics fall out,” says Patrick Hayden, a quantum physicist at Stanford University, adding: “I have real respect for the creative thinking behind constructor theory and its ambitions.” He notes, however, that there are competing attempts by other researchers to develop a deeper understanding of quantum mechanics, including ideas based on copying, and as yet it is too early to say which, if any, will prove to be the best description.
Rickles agrees that it will take time for physicists to verify that the theory -- which has not yet passed through peer review -- is truly successful at uniting classical and quantum information theory. But if affirmed, it would give a boost to Deutsch’s goal to help in the hunt for the long-sought theory of quantum gravity, uniting the currently incompatible quantum theory and general relativity. “This is the first time in the history of science that it’s known that our deepest theories are wrong, so it’s obvious that we need a deeper theory,” Deutsch says.
Rickles believes that constructor theory has the potential to prescribe meta-laws that general relativity and quantum theory must obey. “The meta-laws are more stable creatures, they survive scientific revolutions,” he says. “Having such principles in hand gives us a better grasp of the nature of reality. I’d say that’s a pretty good advantage.”
http://www.scientificamerican.com/article/a-meta-law-to-rule-them-all-physicists-devise-a-theory-of-everything/
See also "The Origin of Time's Direction" (Time's Arrow Traced to Quantum Source – N. Wolchover) under the "Hypotheses on the Nature of Time and Space" section.
Efforts to Unify Gravity and Quantum Mechanics - N. Wolchover
Physicists Eye Quantum-Gravity Interface
Natalie Wolchover, 10/31/13
It starts like a textbook physics experiment, with a ball attached to a spring. If a photon strikes the ball, the impact sets it oscillating very gently. But there’s a catch. Before reaching the ball, the photon encounters a half-silvered mirror, which reflects half of the light that strikes it and allows the other half to pass through.
What happens next depends on which of two extremely well-tested but conflicting theories is correct: quantum mechanics or Einstein’s theory of general relativity; these describe the small- and large-scale properties of the universe, respectively.
In a strange quantum mechanical effect called “superposition,” the photon simultaneously passes through and reflects backward off the mirror; it then both strikes and doesn’t strike the ball. If quantum mechanics works at the macroscopic level, then the ball will both begin oscillating and stay still, entering a superposition of the two states. Because the ball has mass, its gravitational field will also split into a superposition.
But according to general relativity, gravity warps space and time around the ball. The theory cannot tolerate space and time warping in two different ways, which could destabilize the superposition, forcing the ball to adopt one state or the other.
Knowing what happens to the ball could help physicists resolve the conflict between quantum mechanics and general relativity. But such experiments have long been considered infeasible: Only photon-size entities can be put in quantum superpositions, and only ball-size objects have detectable gravitational fields. Quantum mechanics and general relativity dominate in disparate domains, and they seem to converge only in enormously dense, quantum-size black holes. In the laboratory, as the physicist Freeman Dyson wrote in 2004, “any differences between their predictions are physically undetectable.”
In the past two years, that widely held view has begun to change. With the help of new precision instruments and clever approaches for indirectly probing imperceptible effects, experimentalists are now taking steps toward investigating the interface between quantum mechanics and general relativity in tests like the one with the photon and the ball. The new experimental possibilities are revitalizing the 80-year-old quest for a theory of quantum gravity.
“In the final showdown between quantum mechanics and gravity, our understanding of space and time will be completely changed.”
“The biggest single problem of all of physics is how to reconcile gravity and quantum mechanics,” said Philip Stamp, a theoretical physicist at the University of British Columbia. “All of a sudden, it’s clear there is a target.”
Theorists are thinking through how the experiments might play out, and what each outcome would mean for a more complete theory merging quantum mechanics and general relativity. “Neither of them has ever failed,” Stamp said. “They’re incompatible. If experiments can get to grips with that conflict, that’s a big deal.”
Quantum Nature
At the quantum scale, rather than being “here” or “there” as balls tend to be, elementary particles have a certain probability of existing in each of the locations. These probabilities are like the peaks of a wave that often extends through space. When a photon encounters two adjacent slits on a screen, for example, it has a 50-50 chance of passing through either of them. The probability peaks associated with its two paths meet on the far side of the screen, creating interference fringes of light and dark. These fringes prove that the photon existed in a superposition of both trajectories.
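The fringes come from adding the two path amplitudes before taking the squared magnitude, rather than adding the probabilities themselves. A minimal sketch with idealized equal-amplitude paths:

```python
import cmath
import math

def fringe_intensity(phase_difference: float) -> float:
    """Relative detection probability when two equal-amplitude paths
    recombine with a given relative phase: |(1 + e^{i*phi})/2|^2,
    which works out to cos^2(phi/2)."""
    amplitude = 0.5 * (1.0 + cmath.exp(1j * phase_difference))
    return abs(amplitude) ** 2

# Bright fringe when the two paths arrive in phase; dark fringe when
# they arrive half a wavelength apart and cancel.
print(fringe_intensity(0.0))      # in phase: maximum brightness
print(fringe_intensity(math.pi))  # out of phase: cancellation
```

If the photon took only one definite path, the intensity would be flat in the phase difference; the cos²-shaped fringes are the signature of the superposition.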
But quantum superpositions are delicate. The moment a particle in a superposition interacts with the environment, it appears to collapse into a definite state of “here” or “there.” Modern theory and experiments suggest that this effect, called environmental decoherence, occurs because the superposition leaks out and envelops whatever the particle encountered. Once leaked, the superposition quickly expands to include the physicist trying to study it, or the engineer attempting to harness it to build a quantum computer. From the inside, only one of the many superimposed versions of reality is perceptible.
A single photon is easy to keep in a superposition. Massive objects like a ball on a spring, however, “become exponentially sensitive to environmental disturbances,” explained Gerard Milburn, director of the Center for Engineered Quantum Systems at the University of Queensland in Australia. “The chances of any one of their particles getting disturbed by a random kick from the environment is extremely high.”
Because of environmental decoherence, the idea of probing quantum superpositions of massive objects in tabletop experiments seemed for decades to be dead in the water. “The problem is getting the isolation, making sure no disturbances come along other than gravity,” Milburn said. But the prospects have dramatically improved.
Dirk Bouwmeester, an experimental physicist who splits his time between the University of California, Santa Barbara, and Leiden University in the Netherlands, has developed a setup much like the photon-and-ball experiment, but replacing the ball on its spring with an object called an optomechanical oscillator — essentially a tiny mirror on a springboard. The goal is to put the oscillator in a quantum superposition of two vibration modes, and then see whether gravity destabilizes the superposition.
Ten years ago, the best optomechanical oscillators of the kind required for Bouwmeester’s experiment could wiggle back and forth 100,000 times without stopping. But that wasn’t long enough for the effects of gravity to kick in. Now, improved oscillators can wiggle one million times, which Bouwmeester calculates is close to what he needs in order to see, or rule out, decoherence caused by gravity. “Within three to five years, we will prove quantum superpositions of this mirror,” he said. After that, he and his team must reduce the environmental disturbances on the oscillator until it is sensitive to the impact of a single photon. “It’s going to work,” he insists.
Markus Aspelmeyer, a professor of physics at the University of Vienna, is equally optimistic. His group is developing three separate experiments at the quantum-gravity interface — two for the lab and one for an orbiting satellite. In the space-based experiment, a nanosphere will be cooled to its lowest energy state of motion, and a laser pulse will put the nanosphere in a quantum superposition of two locations, setting up a situation much like a double-slit experiment. The nanosphere will behave like a wave with two interfering peaks as it moves toward a detector. Each nanosphere can be detected in only a single location, but after multiple repetitions of the experiment, interference fringes will appear in the distribution of the nanospheres’ locations. If gravity destroys superpositions, the fringes won’t appear for nanospheres that are too massive.
The group is designing a similar experiment for Earth’s surface, but it will have to wait. At present, the nanospheres cannot be cooled enough, and they fall too quickly under Earth’s gravity, for the test to work. But “it turns out that optical platforms on satellites actually already meet the requirements that we need for our experiments,” said Aspelmeyer, who is collaborating with the European Aeronautic Defence and Space Company in Germany. His team recently demonstrated a key technical step required for the experiment. If it gets off the ground and goes as planned, it will reveal the relationship between the mass of the nanospheres and decoherence, pitting gravity against quantum mechanics.
The researchers laid out another terrestrial experiment last spring in Nature Physics. Many proposed quantum gravity theories involve modifications to Heisenberg’s uncertainty principle, a cornerstone of quantum mechanics that says it isn’t possible to precisely measure both the position and momentum of an object at the same time. Any deviation from Heisenberg’s formula should show up in the position-momentum uncertainty of an optomechanical oscillator, because the oscillator is massive enough to be affected by gravity. The uncertainty itself is immeasurably small — a blurriness just 100-million-trillionth the width of a proton — but Igor Pikovski, a theorist in Aspelmeyer’s group, has discovered a backdoor route to detecting it. Pikovski calculates that when a light pulse strikes the oscillator, the pulse’s phase (the position of its peaks and troughs) will undergo a discernible shift that depends on the uncertainty. Deviations from the predictions of traditional quantum mechanics could be experimental evidence of quantum gravity.
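One standard way such modifications are written in the quantum-gravity literature (a generic "generalized uncertainty principle," not necessarily the exact form tested in the Nature Physics proposal) adds a momentum-dependent correction to Heisenberg's bound:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
\left[\, 1 + \beta \left(\frac{\Delta p}{M_{\mathrm{P}}\, c}\right)^{2} \right]
```

Here \(M_{\mathrm{P}}\) is the Planck mass and \(\beta\) is a dimensionless parameter that ordinary quantum mechanics sets to zero; an experiment like Pikovski's constrains \(\beta\) by bounding the size of the anomalous phase shift.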
Aspelmeyer’s group has begun taking the first experimental steps toward realizing the idea. Pikovski’s proposal “provides us with a quite, I have to admit, unexpected improvement in performance,” Aspelmeyer said. “We are all a little surprised, actually.”
The Showdown
Many physicists expect quantum theory to prevail. They believe the ball on a spring should, in principle, be able to exist in two places at once, just as a photon can. The ball’s gravitational field should be able to interfere with itself in a quantum superposition, just as the photon’s electromagnetic field does. “I don’t see why these concepts of quantum theory that have proven to be right for the case of light should fail for the case of gravity,” Aspelmeyer said.
But the incompatibility of general relativity and quantum mechanics itself suggests that gravity might behave differently. One compelling idea is that gravity could act as a sort of inescapable background noise that collapses superpositions.
“While you can get rid of air molecules and electromagnetic radiation, you can’t screen out gravity,” said Miles Blencowe, a professor of physics at Dartmouth College. “My view is that gravity is sort of like the fundamental, unavoidable, last-resort environment.”
(Figure: In an optomechanical oscillator, light confined between two mirrors causes one of the mirrors to oscillate on a spring. Experimentalists plan to use such devices to pit quantum mechanics against general relativity.)
The background-noise idea was conceived in the 1980s and 1990s by Lajos Diósi of the Wigner Research Center for Physics in Hungary and, separately, by Roger Penrose of Oxford University. According to Penrose’s model, a discrepancy in the curvature of space and time could accumulate during a superposition, eventually destroying it. The more massive or energetic the object involved and, thus, the larger its gravitational field, the more quickly “gravitational decoherence” would happen. The space-time discrepancy ultimately results in an irreducible level of noise in the position and momentum of particles, consistent with the uncertainty principle.
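The scaling Penrose describes is usually summarized by a single order-of-magnitude formula (the standard statement of the Diósi–Penrose estimate):

```latex
\tau \;\sim\; \frac{\hbar}{E_{G}}
```

where \(\tau\) is the lifetime of the superposition and \(E_{G}\) is the gravitational self-energy of the difference between the two superposed mass distributions. The more mass is displaced between the two branches, the larger \(E_{G}\) becomes and the faster the superposition decoheres — which is why the effect is invisible for photons but potentially measurable for a mirror or a nanosphere.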
“That would be a wonderful result if the ultimate reason for the uncertainty principle and the puzzling features of quantum physics are due to some quantum effects of space and time,” Milburn said.
Inspired by the possibility of experimental tests, Milburn and other theorists are expanding on Diósi and Penrose’s basic idea. In a July paper in Physical Review Letters, Blencowe derived an equation for the rate of gravitational decoherence by modeling gravity as a kind of ambient radiation. His equation contains a quantity called the Planck energy, which equals the mass of the smallest possible black hole. “When we see the Planck energy we think quantum gravity,” he said. “So it may be that this calculation is touching on elements of this undiscovered theory of quantum gravity, and if we had one, it would show us that gravity is fundamentally different than other forms of decoherence.”
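For scale, the Planck energy Blencowe mentions is fixed entirely by fundamental constants, \(E_{\mathrm{P}} = \sqrt{\hbar c^{5}/G}\). A quick check with Python's standard library (CODATA values):

```python
import math

# Fundamental constants (SI units, CODATA 2018)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2

# Planck energy: E_P = sqrt(hbar * c^5 / G)
planck_energy = math.sqrt(hbar * c**5 / G)  # about 1.96e9 J
# Its mass equivalent -- the "smallest possible black hole" scale
planck_mass = planck_energy / c**2          # about 2.18e-8 kg

print(f"Planck energy: {planck_energy:.3e} J")
print(f"Planck mass:   {planck_mass:.3e} kg")
```

About two billion joules, or roughly 1.2 × 10^19 GeV: enormous for a single quantum, which is why any appearance of this scale in a decoherence rate is taken as a fingerprint of quantum gravity.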
Stamp is developing what he calls a “correlated path theory” of quantum gravity that pinpoints a possible mathematical mechanism for gravitational decoherence. In traditional quantum mechanics, probabilities of future outcomes are calculated by independently summing the various paths a particle can take, such as its simultaneous trajectories through both slits on a screen. Stamp found that when gravity is included in the calculations, the paths connect. “Gravity basically is the interaction that allows communication between the different paths,” he said. The correlation between paths results once more in decoherence. “No adjustable parameters,” he said. “No wiggle room. These predictions are absolutely definite.”
At meetings and workshops, theorists and experimentalists are working closely to coordinate the various proposals and plans for testing them. They say it’s a mutually motivating situation.
“In the final showdown between quantum mechanics and gravity, our understanding of space and time will be completely changed,” Milburn said. “We’re hoping these experiments will lead the way.”
https://www.simonsfoundation.org/quanta/20131031-physicists-eye-quantum-gravity-interface/