The preprint of the paper that is the subject of the (not so great) phys.org article at the top is <https://arxiv.org/abs/2205.08767>. An accessible HTML5 version is available at <https://ar5iv.org/abs/2205.08767> (arxiv->ar5iv, the latter expands to a link within ar5iv.labs.arxiv.org).
Hawking radiation is a semiclassical result: the curved spacetime is classical General Relativity and the scalar field (in which Hawking quanta arise near the central black hole) is quantum.
The dynamical spacetime creates -- through the equivalence principle -- an acceleration between past observers and future observers, and this acceleration corresponds with the Unruh effect. The Unruh effect rests on the definition of a vacuum as a state in which an observer sees no particles, and on the fact that when an observer accelerates, a no-particle state may be transformed into a state with particles. Equivalently, differently-accelerated observers will count different numbers of particles in a spacetime-filling quantum field. (A family of observers may count no particles, i.e., for them it's vacuum.)
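For a sense of scale (a standard textbook result, not something specific to this paper): a uniformly accelerating observer with proper acceleration a perceives the Minkowski vacuum as a thermal bath at the Unruh temperature

```latex
T_{\mathrm{U}} = \frac{\hbar\, a}{2\pi c\, k_{\mathrm{B}}}
\approx 4\times 10^{-21}\,\mathrm{K}\times\left(\frac{a}{1\ \mathrm{m/s^2}}\right)
```

which is why the effect is utterly negligible at everyday accelerations. Hawking's black hole temperature has the same form, with a replaced by the horizon's surface gravity.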
The important part here is that both a dynamical spacetime ("gravity") and a relativistic quantum field are needed for Hawking radiation.
So, "[can] spacetime alone ... create light?" No. There must be a matter field filling the spacetime. That matter field, if quantum, can look like it has no particles in it to some observers, but not all observers. The dynamical evolution of the spacetime can cause observers' counts of particles to evolve.
> gravitational waves ... carry energy, so can be transformed into light
The paper is about how, given:
* a massless quantum field theory proxying for light
* a quantum field theory in which gravitation is mediated by a massless spin-2 boson
* a dense medium with a (light-) refractive index greater than 1
* standing gravitational waves of significant amplitude, arising where gravitational radiation from widely separated sources converges within the dense medium and somehow [a] cancels out polarization and [b] sits within a wide (compared to the wavelength) patch of flat spacetime
* the non-light massive and massless particles within the medium couple very weakly to the incoming gravitational radiation
* the particles of the refracting medium couple weakly to the "light" field, and generate practically no spacetime curvature even in bulk
then the light-proxying particles may be produced via a process which the authors compare with electron-positron pair production and Cherenkov radiation. (Although they make the latter comparison very breezily, without delving into the cross section of light-by-light scattering.)
There are weaknesses in this list of requirements, some of which the authors admit require further study.
The key point though is that their mechanism cannot work in vacuum.
It absolutely requires that the light travel significantly slower than the gravitational radiation (which in turn is assumed to travel at c, even in the non-vacuum in which light travels slower than that) and that far-from-negligible momentum be lost by the incoming gravitational radiation as it passes through the refracting medium.
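To put rough numbers on the "slower light" requirement (illustrative values only, not the paper's own calculation; the water index used is the commonly quoted visible-light value):

```python
# Illustrative comparison: light in water vs. gravitational radiation,
# which is assumed to travel at c even inside the medium.
C = 299_792_458.0   # speed of light in vacuum, m/s
N_WATER = 1.33      # approximate refractive index of water for visible light

v_light = C / N_WATER        # phase velocity of light in water, m/s
deficit = 1.0 - v_light / C  # fractional speed deficit vs. gravitational waves

print(f"light in water: {v_light:.3e} m/s, about {deficit:.0%} slower than c")
```

So in water, light lags the incoming gravitational radiation by roughly a quarter of c, which is what opens up the Cherenkov-like kinematics the authors invoke.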
> great experiment here
The last paragraph in the Conclusions and Discussion section suggests there may be avenues for experimenting with the ideas in the paper.
> The key point though is that their mechanism cannot work in vacuum. It absolutely requires that the light travels significantly slower than the gravitational radiation
I am not a physicist, but I understand we are talking about Universe so early after Big Bang that it wasn't yet transparent to light. There simply wasn't vacuum yet if by vacuum you mean electromagnetic waves being able to travel long distances.
I am not sure what you're getting at exactly. Although the paper does touch on early universe cosmology, the authors do their principal analysis using the refractive value for water, and there were no interstellar water clouds before early supernovae started generating oxygen. The authors also explicitly contemplate observables generated by LIGO-accessible compact binary mergers ("compact" here means black holes and neutron stars), all of which postdate the first stars.
In physical cosmology (and especially considering alternatives to General Relativity) it is very common to consider the possibility that some effect is strong in the very early universe and so weak as to be undetectable at present times (or even as early as the first galaxies or the surface of last scattering). Examples include auxiliary gravitational fields ("bimetric" theories, for example) that decay in the early universe, variable-speed-of-light/variable-Newton's-constant theories, and so forth.
Although one might think "hm, it's very convenient that an important effect only happens so early that we cannot use telescopes to see it", there is very good evidence for electroweak unification and cosmic inflation, both of which terminated (in different ways) in the very early universe, and are (or arguably were) too difficult to directly observe this late in the universe's history. Additionally, such indirect evidence as exists is well within our reach.
That the hypothesized graviton-photon mechanism cannot work in vacuum makes it at least very difficult to test (or observe with telescopes) today; however, the final section of the paper does suggest that if it happens in nature, where it happens is likely to become accessible to us in due course. This is not a theory that has a hard cut-off in the early universe; it is just a hypothesis that to be realized requires a configuration of e.g. binaries and molecular clouds that is not very close to what we commonly observe. (Double-binary compact objects in dusty environments might end up being commonplace though, and in those settings one could expect changes in "multimessenger" signals if the authors' ideas are correct. It's amazing how many star systems are turning out to be triples, and we know of triple-compact-star systems; there are a number of known quadruples like DI Chamaeleontis; and Gamma Cassiopeiae is a system of at least seven ~stellar mass bodies.)
AFAIK there exists no popular belief that physics was different in the early universe. The physics was the same; the only thing that was different was the physical conditions, meaning everything was densely packed together.

If you, even for a moment, assume that the laws were different in the early universe, then you have essentially lost any possibility of predicting anything.
It's not that things become unpredictable; it's that such an assumption can capture mispredictions (actual and possible) of the things colloquially called "laws" of physics.
Spontaneous symmetry breaking has been at the root of at least three Nobel prizes, and is crucial to understanding the differences in physical systems at very high energies both in laboratories and in extreme astrophysical settings, at both early and approximately present times in the universe.
The early universe was in a high-energy state, being very much hotter and denser than the later universe, as you say. There are several epochs -- notably the https://en.wikipedia.org/wiki/Electroweak_epoch -- where symmetry breaking is important, and where using the lower-energy theory (electromagnetism, in this example) simply does not work: results are (if even calculable) manifestly wrong, leading to a universe with a very different cosmic microwave background, and very different chemistry and nuclear physics.
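A textbook illustration of that kind of symmetry breaking (a generic Higgs-style potential, not anything specific to the linked paper): at low temperature the potential

```latex
V(\phi) = -\mu^2\,|\phi|^2 + \lambda\,|\phi|^4 , \qquad \mu^2, \lambda > 0
```

has its minima on the circle |\phi| = \mu/\sqrt{2\lambda} rather than at \phi = 0, so the vacuum picks out a direction and breaks the symmetry. At sufficiently high temperature, thermal corrections of order +T^2\,|\phi|^2 flip the sign of the effective mass term, the minimum moves back to \phi = 0, and the symmetry is restored -- which is the sense in which the hot early universe ran on different physics than today.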
I think at best one might say that theories with broken symmetries could still have those symmetries (i.e., the breaking may be reversible under "different ... physical conditions", for instance if our universe surprisingly evolved toward a Big Crunch); however, treating that as a denial of the possibility of different physics in the early universe is probably something you'd have to take up with philosophers or lexicographers for now.
Additionally, there is no reason to just assume (and refuse to trace out the implications if wrong, or to validate) that physical constants are constant everywhere and everywhen. Putting some spacetime-location-dependent function on constants like G, k_B, \alpha, \Lambda, c has at the very least proven instructive in further understanding the concordance (standard) models of particle physics and cosmology, where those constants are taken as constant everywhere and at all times in the universe. Indeed, parameterizing apparent constants is outright productive science. See e.g. <https://en.wikipedia.org/wiki/Test_theories_of_special_relat...> for a scratch-the-surface set of details; additionally, the ideas at <https://en.wikipedia.org/wiki/Variable_speed_of_light#Relati...> are at least [a] interesting, [b] testable, and [c] improve the testability of the families of theories in which these constants are assumed truly constant (i.e., everywhere and everywhen).
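As a toy sketch of what "putting a spacetime-location-dependent function on a constant" means in practice (a hypothetical linear parameterization; epsilon and the redshift values are illustrative, not from any specific published test theory):

```python
# Toy parameterization: promote the "constant" c to a function of
# cosmological redshift z. epsilon = 0 recovers the standard assumption.
C0 = 299_792_458.0  # present-day speed of light, m/s

def c_of_z(z: float, epsilon: float = 0.0) -> float:
    """Hypothetical c(z) = C0 * (1 + epsilon * z); purely illustrative."""
    return C0 * (1.0 + epsilon * z)

# The concordance models correspond to epsilon = 0 at every redshift:
assert c_of_z(z=1100.0) == C0

# Any nonzero epsilon produces a deviation that grows with redshift,
# i.e. is strongest in the early universe and tiny today -- which is
# why such effects can be hard to constrain with present-day telescopes.
print(c_of_z(1100.0, epsilon=1e-5) / C0)  # z ~ 1100: surface of last scattering
```

Fitting bounds on a parameter like epsilon against observations at different redshifts is exactly how the "constants are constant" assumption gets tested rather than merely assumed.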
> popular belief
Well, I guess your "popular" could outweigh a literature search. But for scientists:
It's been about half a century since Kenneth Wilson and Nikolay Bogolyubov explored rescaling and renormalization, and nowadays practically every physical theory is written down as, considered as, or is being adapted towards <https://en.wikipedia.org/wiki/Effective_field_theory> (EFT). It is common that different EFTs apply to the same physical configuration as some characteristic scale is crossed, and it is possible that physical theories will be EFTs all the way down (and all the way up), with the concept of fundamental becoming a relation between families of theories. (For example, Newtonian gravitation is less fundamental than General Relativity, because the former can be derived from the latter (and not the reverse), not because General Relativity is known to be correct at all scales.)