The black hole firewall is a relatively recent idea. On Wikipedia you can read how it “was proposed in 2012 by Ahmed Almheiri, Donald Marolf, Joseph Polchinski, and James Sully as a possible solution to an apparent inconsistency in black hole complementarity”. Their proposal is known as the AMPS firewall, and the title of their paper is black holes: complementarity or firewalls?
They cannot all be true
They start by saying “we argue that the following three statements cannot all be true: (i) Hawking radiation is in a pure state, (ii) the information carried by the radiation is emitted from the region near the horizon, with low energy effective field theory valid beyond some microscopic distance from the horizon, and (iii) the infalling observer encounters nothing unusual at the horizon”. That sounds like a good start. They also refer to the black hole information paradox, wherein a black hole is said to destroy such things as baryon number. In physics if you hit a paradox it means something’s wrong somewhere, as per reductio ad absurdum. They then talk about black hole complementarity, which is where information is allegedly both “reflected at the event horizon and passes through the event horizon”. They say that this can’t be right, and that there are some “sharp and perhaps unpalatable alternatives”. They also say “the tensions noted in this work may lead the reader to wonder whether even the most basic coarse-grained properties of Hawking emission as derived in  are to be trusted”. This is promising stuff. Sadly they miss the trick by saying “the thermodynamic picture of black holes now rests on many pillars that remain intact”. But on the plus side they do conclude that “the most conservative resolution is that the infalling observer burns up”. This sounds reasonable given that the distant observer allegedly sees Susskind’s elephant get thermalized. So it sounds reasonable that the black hole is surrounded by a “ring of fire”:
Ring of fire image by Sam Chivers from New Scientist
There’s a reader-friendly article on all this by Zeeya Merali in the Huffington Post. It’s called black hole ‘firewall’ theory challenges Einstein’s equivalence principle. She tells us that the team’s verdict shocked the physics community because their firewall would violate a foundational tenet of physics known as the equivalence principle. She quotes Raphael Bousso saying the firewall idea “shakes the foundations of what most of us believed about black holes”. She quotes Steve Giddings describing the situation as “a crisis in the foundations of physics that may need a revolution to resolve”. She also talks about the monogamy of entanglement.
The monogamy of entanglement
The monogamy of entanglement is another paradox, wherein the emitted Hawking radiation particle can’t be entangled with two systems at once. It can’t be entangled both with the particle that fell into the black hole and with the Hawking radiation that was previously emitted. As I said, in physics if you hit a paradox it means something’s wrong. If you hit two, it means something’s definitely wrong. As to what, well, there’s the rub. Merali tells us that “to escape this paradox, Polchinski and his co-workers realized, one of the entanglement relationships had to be severed”. She says they were reluctant to abandon the entanglement needed to encode information in the Hawking radiation, so they severed the binding between the escaping particle and its infalling twin. She quotes Polchinski saying “it’s a violent process, like breaking the bonds of a molecule, and it releases energy”. That’s an odd thing to say, because energy is released when bonds are formed, not when they’re broken. Merali also quotes Ted Jacobson saying “it was outrageous to claim that giving up Einstein’s equivalence principle is the best option”. That’s odd too. Because it isn’t outrageous at all.
Penrose did his own thing
As to why, I need to rewind. In 1964 Roger Penrose wrote a paper on gravitational collapse and space-time singularities. He started by saying “the discovery of the quasi-stellar radio sources has stimulated renewed interest in the question of gravitational collapse”. He went on to say this: “it has been suggested by some authors that the enormous amounts of energy that these objects apparently emit may result from the collapse of a mass of the order of (10⁶–10⁸)M☉ to the neighborhood of its Schwarzschild radius”. That sounds reasonable too, and it is in line with the mass deficit. But oddly enough, Penrose appears to have ignored it. He also said “the general situation with regard to a spherically symmetrical body is well known”. He was referring to Oppenheimer and Snyder’s 1939 frozen star paper on continued gravitational contraction. But he ignored that too. Because he was pushing point-singularities:
Image from Gravitational collapse and space-time singularities by Roger Penrose
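For a sense of scale, the Schwarzschild radius rₛ = 2GM/c² for the masses Penrose quoted is a one-line calculation. This is a standard textbook sketch, not anything taken from Penrose’s paper:

```python
# Schwarzschild radius r_s = 2GM/c^2 for masses of 10^6 and 10^8 solar masses
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

for solar_masses in (1e6, 1e8):
    r_s = 2 * G * solar_masses * M_sun / c**2
    print(f"{solar_masses:.0e} solar masses: r_s ≈ {r_s:.2e} m")
```

For 10⁶ solar masses that comes out at roughly 3 million kilometres, a few times the radius of the Sun.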
He wasn’t interested in what Einstein said about why light curves. Or in Einstein’s 1939 paper. That’s where Einstein said light rays “take an infinitely long time (measured in “coordinate time”) to reach the point r = μ/2”. Nor did Penrose seem interested two years later, in his 1966 Adams Prize essay, where he appealed to authority by referring to Eddington-Finkelstein coordinates. Even though neither Eddington nor Finkelstein “ever wrote down these coordinates or the metric in these coordinates”. Most of all, Penrose didn’t seem interested in how gravity works, or why matter falls down, or what a black hole really was. It’s as if he was doing his own thing, playing fast and loose with what had gone before.
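Einstein’s infinite-coordinate-time remark is easy to illustrate. For an ingoing radial light ray in Schwarzschild coordinates, dr/dt = −c(1 − rₛ/r), which integrates to a coordinate time that diverges logarithmically as r approaches rₛ. A minimal sketch in geometric units, using the standard Schwarzschild-coordinate algebra with rₛ in place of Einstein’s μ/2:

```python
import math

# Coordinate time for an ingoing radial light ray, from r0 down to r:
# dr/dt = -c*(1 - rs/r)  integrates to
# t = ((r0 - r) + rs*ln((r0 - rs)/(r - rs))) / c
def coord_time(r0, r, rs=1.0, c=1.0):
    return ((r0 - r) + rs * math.log((r0 - rs) / (r - rs))) / c

# Approaching the horizon at rs = 1 from r0 = 10: t grows without bound
for r in (2.0, 1.1, 1.01, 1.001, 1.0001):
    print(f"r = {r:<7} t = {coord_time(10.0, r):.2f}")
```

Each factor-of-ten step closer to the horizon adds roughly the same coordinate time, which is the signature of a logarithmic divergence: the horizon is never reached in finite coordinate time.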
The frozen star was frozen out
Hawking seems to have done something similar. In a brief history of time he referred indirectly to Oppenheimer and Snyder’s 1939 frozen star paper. But he told a twisted tale. He said this: “Oppenheimer’s work was then rediscovered and extended by a number of people. The picture that we now have from Oppenheimer’s work is as follows. The gravitational field of the star changes the paths of light rays in space-time from what they would have been had the star not been present. The light cones, which indicate the paths followed in space and time by flashes of light emitted from their tips, are bent slightly inward near the surface of the star. This can be seen in the bending of light from distant stars observed during an eclipse of the sun. As the star contracts, the gravitational field at its surface gets stronger and the light cones get bent inward more. This makes it more difficult for light from the star to escape, and the light appears dimmer and redder to an observer at a distance. Eventually, when the star has shrunk to a certain critical radius, the gravitational field at the surface becomes so strong that the light cones are bent inward so much that light can no longer escape”. That’s not what Oppenheimer and Snyder said. Hawking must have known this, and about the Physics Today article introducing the black hole by Remo Ruffini and John Wheeler. They said the collapse is continuing because even after an infinite time, as measured by a distant observer, the collapse is still not complete. That’s why they said “in this sense the system is a frozen star”. But Hawking doesn’t even mention the word frozen. The frozen star was frozen out.
Einstein was frozen out too
Einstein was frozen out too. The word Einstein occurs 62 times in a brief history of time. But Hawking didn’t refer to Einstein saying why light curves. Nor did he refer to Einstein’s 1939 paper about light rays taking forever to reach the event horizon. Hawking was appealing to Einstein’s authority whilst flatly contradicting him. Which is why Hawking got it wrong in his 1976 paper on the breakdown of predictability in gravitational collapse. That’s where he said this: “as was shown in a series of papers by Penrose and this author³⁻⁶, a space-time singularity is inevitable in such circumstances provided that general relativity is correct”.
General relativity is correct
General relativity is correct, it’s one of the best-tested theories we’ve got. The evidence started coming in in 1919, a mere three years after the theory was published and with a war on, courtesy of Arthur Eddington. Compare and contrast with Hawking radiation. Not only that, but the evidence of optical clocks says the speed of light is spatially variable. That’s what Einstein said time and time again. But Hawking didn’t refer to that. Instead he referred to the principle of equivalence and said gravity is always attractive, and that “this leads to singularities in any reasonable theory of gravitation”. But he didn’t refer to the event horizon as a singularity like Einstein did. Instead he said the “apparent” singularity at the event horizon was “simply due to a bad choice of coordinates”. His general relativity wasn’t Einstein’s general relativity. It was some ersatz popscience version of the real thing.
But the waterfall analogy is not
That’s why Hawking talked about “such a strong gravitational field that even the ‘outgoing’ light rays from it are dragged back”. He didn’t realise that light goes slower when it’s lower, so the ascending light beam speeds up. So in a strong gravitational field, it doesn’t get dragged back at all. It speeds up all the more. That’s what Einstein’s variable speed of light says, that’s what optical clocks say, and that’s what general relativity says. Yes, general relativity is correct, but the waterfall analogy is not. We do not live in some Chicken-Little world where space is falling down:
Image from Sunil Bisoyi’s blog, said to be from a Leonard Susskind article in Scientific American April 1999
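Whatever one makes of the waterfall picture, the “slower when it’s lower” talk does correspond to a standard textbook quantity: in Schwarzschild coordinates the coordinate speed of a radial light ray, as reckoned by the distant observer, is c(1 − rₛ/r), which falls to zero at the horizon. A minimal sketch:

```python
# Coordinate speed of a radial light ray in Schwarzschild coordinates,
# as reckoned by the distant observer: v(r) = c * (1 - rs/r)
def coord_light_speed(r, rs=1.0, c=1.0):
    return c * (1.0 - rs / r)

for r in (100.0, 10.0, 2.0, 1.1, 1.0):
    print(f"r/rs = {r:>6}: v/c = {coord_light_speed(r):.4f}")
```

Locally, of course, any observer always measures light at c; the formula describes the distant observer’s bookkeeping.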
That means Hawking’s understanding of gravity and singularities was not correct. Which is why he talked about the event horizon not as a place where the coordinate speed of light is zero, but as a surface which “emits with equal probability all configurations of particles”. Spontaneously, like worms from mud. Regardless of the infinite time dilation. Regardless of the very reason why the light can’t get out. And then without a trace of irony, Hawking talked about a “principle of ignorance”, and said “something obviously goes badly wrong”. It did indeed.
Something obviously goes badly wrong
As for when, it looks like things started to go wrong after Einstein died, in the “golden age” of general relativity in the sixties. Things were definitely badly wrong by 1983 when Gerard ‘t Hooft wrote his paper on the ambiguity of the equivalence principle and Hawking’s temperature. He said the laws governing a system in a gravitational field can be obtained by viewing the field as generated by an acceleration relative to an inertial frame. He said “strictly speaking this procedure only works if the gravitational field is homogeneous”. It’s as if he’d never read Einstein’s Leyden Address. That’s where Einstein described a gravitational field as a place where space was “neither homogeneous nor isotropic”. Furthermore ‘t Hooft also said “by homogeneous we mean that there is an inertial frame in which the entire gravitational field disappears”. It’s as if he’d never read Einstein’s Relativity: The Special and General Theory. That’s where Einstein said it’s “impossible to choose a body of reference such that, as judged from it, the gravitational field of the Earth (in its entirety) vanishes”. As Peter Brown points out in Einstein’s gravitational field, a uniform gravitational field has no tidal forces and thus no space-time curvature, so it’s a contradiction in terms. So it’s not a good idea for ‘t Hooft to claim that uniform gravitational fields “occur in Nature almost by definition” when they don’t occur in nature at all. Nor is it a good idea to claim that Hawking radiation “originates in a region where the collapsing matter has a close to infinite kinetic energy per particle”. Not when gravity doesn’t add any energy to a falling particle. Gravity converts potential energy, which is mass-energy, into kinetic energy. Hence when the latter is radiated away, the particle is left with a mass deficit. And last but not least, it is not a good idea to use the wrong equivalence principle.
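The mass-deficit arithmetic can be sketched with a Newtonian estimate: a particle lowered slowly from far away to radius r gives up potential energy GMm/r, so if that energy is radiated away the fractional deficit is GM/(rc²) = rₛ/(2r). This is only an order-of-magnitude sketch, and the Newtonian formula breaks down close to the horizon:

```python
# Newtonian estimate of the fractional mass deficit for matter lowered
# slowly to radius r (in units of the Schwarzschild radius rs) and
# allowed to radiate the released energy away:
#   deficit = G*M / (r*c^2) = rs / (2*r)
def mass_deficit_fraction(r_over_rs):
    return 1.0 / (2.0 * r_over_rs)

for r in (100.0, 10.0, 3.0):
    print(f"r = {r:>5} rs: deficit ≈ {mass_deficit_fraction(r):.1%}")
```

On this estimate matter settling at r = 3rₛ has shed around 17% of its original mass-energy, which is the sort of efficiency invoked for quasar luminosities.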
The wrong equivalence principle
In his 1993 paper the stretched horizon and black hole complementarity, Leonard Susskind said “the belief is based on the equivalence principle”, and “it seems certain that a freely falling observer experiences nothing out of the ordinary when crossing the horizon”. Only it doesn’t. The equivalence principle doesn’t say the falling observer experiences nothing out of the ordinary when he encounters the hard unyielding ground, and it doesn’t say the falling observer experiences nothing out of the ordinary when he encounters a black hole. As for what it does say, see Kevin Brown’s mathspages article on the many principles of equivalence. He tells us how the definition of the equivalence principle has undergone several changes over the years, and that “the modern statement of the strong equivalence principle, of the assertion that the laws of physics are the same for all frames of reference (i.e., independent of velocity) is also conceptually quite distinct from the original meaning of Einstein’s equivalence principle”. The story starts with Einstein’s happiest thought in 1907. That’s when he thought of an observer in free-fall from the roof of a house, and realized the observer wasn’t feeling any force. It was as if there was no force on the observer. Or on anything falling alongside him. Setting aside air resistance and any surrounding buildings, falling would feel like inertial motion without acceleration. Hence page 149 of MTW describes the equivalence principle thus: “in a freely falling (non-rotating) laboratory occupying a small region of spacetime, the laws of physics are those of special relativity”. You can read the same thing on the Wikipedia equivalence principle article: “the outcome of any local non-gravitational experiment in a freely falling laboratory is independent of the velocity of the laboratory and its location in spacetime”. However in 1939 Einstein said light rays take an infinitely long time to reach the event horizon. 
That doesn’t square with a freely-falling observer experiencing “nothing out of the ordinary when crossing the horizon”. That’s because it isn’t Einstein’s principle of equivalence. John Norton explained what it actually was in his 1985 paper what was Einstein’s principle of equivalence? He said it was a special relativity principle that dealt only with fields that could be transformed away. This means Einstein was talking about the accelerating observer, not the inertial observer. This is why Kevin Brown says the term equivalence principle should be reserved for assertions about the “sameness” of the effects of gravitation and extrinsic acceleration. And why Norton talked of an old view and a new view, and said “the equivalence of all frames embodied in this new view goes well beyond the result that Einstein himself claimed in 1916”.
The Einstein equivalence principle
Chung Lo said essentially the same in 2015 in rectification of general relativity. He said “many were confused that his 1916 equivalence principle was the same 1911 assumption of equivalence that has been proven invalid”. He blames Wheeler and others: “Wheeler led his school at Princeton University while his colleagues, Sciama and Zel’dovich (another H-bomb maker) developed the subject at Cambridge University and the University of Moscow”. Dennis Sciama strongly influenced Penrose and of course Hawking. Lo goes on to say “the misinterpretations of the theory and errors such as the singularity theorems have been accepted as part of the faith”. It sounds like heavy stuff, but it’s true. Take another look at the Wikipedia equivalence principle article: “this was developed by Robert Dicke as part of his program to test general relativity. Two new principles were suggested, the so-called Einstein equivalence principle and the strong equivalence principle, each of which assumes the weak equivalence principle as a starting point”. The Einstein equivalence principle isn’t Einstein’s equivalence principle. The Wikipedia article also says “the outcome of any local non-gravitational experiment in a freely falling laboratory is independent of the velocity of the laboratory and its location in spacetime”. Even though Einstein didn’t. The article also says the fine-structure constant must not depend on where it’s measured. Even though the fine structure constant is a running constant that varies with energy density, and so must vary with gravitational potential. Solar probe plus was going to test this, but it’s now a purely heliophysics mission. Why a fine structure experiment isn’t on board is a mystery to me. Something else that’s a mystery to me is why anybody should take the wrong principle of equivalence so far that they end up with trips to the end of time and back, and elephants in two places at once.
The right equivalence principle
As for the right equivalence principle, I’m afraid that’s been taken too far too. In the Wikipedia equivalence principle article you can read how being on the surface of the Earth is like being inside an accelerating spaceship. That’s the equivalence principle most people know about. Look at Google Images, and that’s what you see:
Image by various contributors and Google
Note however that the two situations are not exactly the same. Standing still in inhomogeneous space is not identical to accelerating through homogeneous space. When you’re in your windowless room either on Earth or in the spaceship, light curves and matter falls down. But like Wikipedia says, the room has to be small enough so that tidal effects are negligible. As for how small, note that Einstein said the special theory of relativity is “nowhere precisely realized in the real world”. It’s only valid “in the infinitesimal”. Your room has to be an infinitesimal room for the principle of equivalence to be exactly valid there. This is why on page 20 of Einstein’s gravitational field, Peter Brown quoted John Synge talking about the midwife. Synge said the principle of equivalence performed the essential office of midwife at the birth of general relativity. But then he suggested that “the midwife be buried with appropriate honours”. See pages ix and x in the preface to Synge’s relativity: the general theory. The bottom line is that the principle of equivalence is only valid in an infinitesimal region. It’s only valid in a region of zero size, which is no region at all. That’s why in the Einstein digital papers you can read that “it was presented as a heuristic principle for finding the theory rather than as one of the theory’s main tenets”. Why people think it’s one of the theory’s main tenets is another mystery.
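The “small enough room” caveat can be made quantitative. Across a room of height h at distance r from a mass M, the Newtonian tidal (differential) acceleration is roughly 2GMh/r³; it shrinks with h but never vanishes for any finite room, which is the point of the “infinitesimal” qualification. A rough sketch using Earth-surface numbers:

```python
# Newtonian tidal acceleration across a room of height h at distance r
# from a mass M: delta_g ≈ 2*G*M*h / r^3 (nonzero for any finite h)
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_earth = 5.972e24  # kg
r_earth = 6.371e6   # m

def tidal(h, M=M_earth, r=r_earth):
    return 2 * G * M * h / r**3

# A 3-metre room at the Earth's surface:
print(f"delta_g ≈ {tidal(3.0):.1e} m/s^2")
```

That comes out near 10⁻⁵ m/s², about one part per million of g: negligible for most purposes, but never exactly zero.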
Gamma ray bursts
Yet another mystery is gamma-ray bursts. See for example the 2008 NASA article gamma-ray bursts: the mystery continues by Tony Phillips and Dauna Coulter. Or see the NASA HEASARC article the mysterious gamma ray bursts by Joslyn Schoemer et al. They say “one of astronomy’s most baffling mysteries is the undiscovered source of sudden, intense bursts of gamma rays”. Hawking referred to them in a brief history of time. He said “in fact bursts of gamma rays from space have been detected by satellites originally constructed to look for violations of the Test Ban Treaty. These seem to occur about sixteen times a month and to be roughly uniformly distributed in direction across the sky”. He was hinting that they might be caused by black hole explosions associated with Hawking radiation. Don’t forget that when Penrose wrote his epoch-making singularities paper in 1964 he said “the discovery of the quasi-stellar radio sources has stimulated renewed interest in the question of gravitational collapse”. Of course quasars aren’t quite the same as gamma-ray bursts, and the latter weren’t declassified until 1973. But they’re the same kettle of fish, to do with black holes and gravitational collapse. This is the sort of stuff that resurrected interest in general relativity. Only the real mystery is why gamma ray bursts are still a mystery. Especially when Friedwardt Winterberg explained them in 2001.
See the 2013 AMPS paper an apologia for firewalls. Tucked away in the conclusion is footnote 31, containing reference 87 to Winterberg’s 2001 paper gamma ray bursters and Lorentzian relativity. Winterberg talks about the direct conversion of an entire stellar rest mass into gamma ray energy. The nub of it is this: “if the balance of forces holding together elementary particles is destroyed near the event horizon, all matter would be converted into zero rest mass particles which could explain the large energy release of gamma ray bursters”. See the Wikipedia gamma ray burst article and note that “a typical burst releases as much energy in a few seconds as the Sun will in its entire 10-billion-year lifetime”. Also see the emission mechanisms section where you can read that: “some gamma-ray bursts may convert as much as half (or more) of the explosion energy into gamma-rays”. As I write Winterberg’s proposal is mentioned in the progenitors section of the Wikipedia article, albeit in a single line.
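The comparison with the Sun’s lifetime output is easy to check with round numbers: the solar luminosity times ten billion years comes to roughly 10⁴⁴ joules, which is indeed the order of magnitude quoted for a typical burst’s isotropic-equivalent energy:

```python
# Total energy the Sun radiates over a 10-billion-year lifetime,
# for comparison with a typical gamma-ray burst
# (~1e44 J and upward, isotropic-equivalent)
L_sun = 3.828e26                 # solar luminosity, W
lifetime_s = 1e10 * 3.156e7      # 10 billion years in seconds

E_total = L_sun * lifetime_s
print(f"E ≈ {E_total:.1e} J")
```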
Perhaps Winterberg’s gamma ray bursters and Lorentzian relativity hasn’t received much attention because it looks like unfamiliar territory. For example Winterberg talked about an ether. Some people might not like that because Einstein is said to have done away with the ether. But some people don’t know that in 1920 Einstein referred to space as the ether of general relativity. Winterberg also said “the event horizon appears first at the center of the collapsing body, thereafter moving radially outward”. Some people might not like that because it sounds back to front. But some people don’t know about frozen stars growing like hailstones: You’re a water molecule. You alight upon the surface of the hailstone. You can’t pass through this surface. But you are presently surrounded by other water molecules, and eventually buried by them. So whilst you can’t pass through the surface, the surface can pass through you. Winterberg also said in the limit v = c, massive particles become unstable and break up into zero rest-mass particles, and that “for v > c there can be no static equilibrium”. Some people might not like that because massive particles can never reach the speed of light. Only they can, and I’m not talking about Cherenkov radiation. Back in the day when John Michell was talking about dark stars, the idea was that a dark star has an escape velocity that “equals or exceeds the speed of light”. You can flip this around and reason that if I dropped you at some great distance from a black hole, you would survive the fall to the event horizon. At the event horizon you would be moving at the speed of light, or faster. Especially since you could have fired your gedanken boosters and accelerated towards the black hole. Then you might think you could go even faster as you continued to fall towards the point-singularity. But you can’t. You can’t go faster than light because of the wave nature of matter. 
We can make electrons out of light in pair production, and we can diffract electrons. You are made out of electrons. And other things too, but the same principle applies. You are made of matter, in a very real way matter is made of light, and matter cannot go faster than the light from which it is made.
When matter falls into a black hole
By now you may have spotted the fly in the ointment. The crucial point is this: matter falls down when it’s in a place where there’s a gradient in the speed of light. That’s what Einstein said, and that’s what optical clocks say too. It’s because of the wave nature of matter. When you fall towards a black hole it’s because the speed of light is reducing. The reducing speed of light is transformed into your downward motion. The more it reduces the faster you fall. You fall towards the event horizon, faster and faster. All the while the speed of light is getting slower and slower. Falling bodies don’t stop accelerating, and they don’t slow down. The descending light beam does, but your falling body does not. So there has to be some crossover point where you would end up going faster than the local speed of light. Your velocity v would exceed the “coordinate” speed of light c at that location. Can you fall faster than the local speed of light? No. Relativity says no, the wave nature of matter says no, and so do quasars. So you don’t have to worry about spaghettification. Or about the AMPS firewall. Matter cannot go faster than the light from which it is made, so something else happens. Something more dramatic. Something Einstein should have predicted in his 1939 paper where he said light rays and material particles take an infinitely long time to reach the event horizon:
Public domain image by NASA, see dying supergiant stars implicated in hours-long gamma-ray bursts
BOOM! A gamma ray burst happens. When I talked about dropping you such that you survived the fall to the event horizon, I was being economical with the truth. Because you were never going to survive the fall to the event horizon. Because you’re made of electrons and things, and gravity converts potential energy which is mass-energy into kinetic energy. This reduces the mass-energy of those electrons and things, and you can only take this so far. The electron is a 511keV electron because h is what it is, and only one E=hf energy yields the stable spin ½ standing-wave Poynting-vector thing that we call an electron. Reduce the mass-energy and it’s like trying to make a 411keV electron. The rotational energy flow cannot confine itself. The electron cannot persist as an electron. It’s something like stretching a helical spring. You can’t stretch it straighter than straight. When you try to do so, it breaks. In similar vein the wave that is the electron breaks. And because you are made of electrons, you are thermalised and ionised and marmalised. You are annihilated in a catastrophic 100% conversion of matter into energy. Every electron, every proton, and every last neutron is ripped apart and rendered down to gamma photons. And doubtless neutrinos, but they depart at the speed of light too so what’s the difference? The difference is that you do not encounter a firewall at the black hole event horizon. Because you are your own firewall, before you ever get to that event horizon.
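For what it’s worth, the E = hf arithmetic for the 511 keV electron is uncontroversial, whatever one makes of the wave-model interpretation: dividing the rest energy by Planck’s constant gives a frequency of around 1.24 × 10²⁰ Hz:

```python
# Frequency corresponding to the electron's 511 keV rest energy via E = h*f
h = 6.626e-34    # Planck constant, J s
eV = 1.602e-19   # joules per electronvolt

E = 511e3 * eV   # electron rest energy, J
f = E / h
print(f"f ≈ {f:.2e} Hz")
```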
You can take comfort in the conclusion that photons and neutrinos depart in different directions. Think of the right-hand rule and the left-hand rule, and stick your fingers out in orthogonal directions. Charge isn’t conserved, but angular momentum is. The directions of the resultant photons and neutrinos depend on the charge of the original particle. You can also take comfort in the conclusion that there is no Hawking radiation, and no information paradox either. I don’t, because I don’t care about information. What I care about is why all this isn’t common knowledge. It’s all so straightforward. It’s all so obvious, so simple when you’ve read the Einstein digital papers and know that the speed of light is not constant. But when I look around on the internet, I don’t see people talking about the Einstein digital papers. What I see is Winterberg making dark comments about string theorists and groupthink and priority and censorship. We’ll come back to that another time, because it’s important. But for now there’s some other dark stuff we need to look at. Called dark matter.