The quantum entanglement story began in 1935 with the EPR paper. That’s where Einstein, Podolsky, and Rosen said quantum mechanics must be incomplete, because it predicts a system in two different states at the same time. Later that year Bohr replied saying spooky action at a distance could occur. Then Schrödinger came up with a paper where he talked of entanglement, a paper where he used his cat to show how ridiculous the two-state situation was, and a paper saying he found spooky action at a distance to be repugnant. He also compared it to Voodoo magic, where a savage *“believes that he can harm his enemy by piercing the enemy’s image with a needle”*.

*Credits: NASA/JPL-Caltech, see Particles in Love: Quantum Mechanics Explored in New Study | NASA*

Then in 1952 Bohm came up with two hidden variables papers which retained the spooky action at a distance. He also came up with the EPRB experiment, which was the subject of a paper in 1957. Then in 1964 Bell came up with papers On the Problem of Hidden Variables in Quantum Mechanics and On the Einstein Podolsky Rosen Paradox. Bell said the issue was resolved in the way Einstein would have liked least, and gave a mathematical “proof” which is usually called Bell’s theorem. Then in 1969 we had the CHSH paper on a Proposed Experiment to Test Local Hidden-Variable Theories by Clauser et al. Then in 1972 we had Clauser and Freedman’s Experimental Test of Local Hidden-Variable Theories. Then in 1981 we had Experimental Tests of Realistic Local Theories via Bell’s Theorem by Aspect et al, followed by two further papers in 1982. Then in 1998 we had Violation of Bell’s inequality under strict Einstein locality conditions by Zeilinger et al.

*Convinced the physics community in general that local realism is untenable*

All these so-called Bell test experiments used photons and polarizing filters. They featured ever-increasing complexity to cater for so-called loopholes, and are said to have *“convinced the physics community in general that local realism is untenable”*. So much so, that the 2022 physics Nobel Prize was awarded to Clauser, Aspect, and Zeilinger *“for experiments with entangled photons, establishing the violation of Bell inequalities and pioneering quantum information science”*. This Nobel prize was awarded exactly a hundred years after Bohr was awarded a Nobel prize in 1922. There’s just one problem. It’s all bullshit.

*Quantum bullshit from the man who said reality does not exist until you measure it*

The bullshit started in 1935 with Bohr’s reply to the EPR paper. Bohr’s paper was rambling, off-topic, and pretentious. He didn’t address the issue at all. Instead he gave a lofty lecture on complementarity and the double slit experiment, then energy and time, and space and time. In the middle of all the condescending bluster he slipped in this: *“an influence on the very conditions which define the possible types of predictions regarding the future behavior of the system”*. That’s spooky action at a distance. He also slipped in this: *“we see that the argumentation of the mentioned authors does not justify their conclusion”*. No, we don’t. We see Bohr ducking the issue. We see quantum bullshit from the man who said reality does not exist until you measure it. From the man who said the Moon isn’t there if nobody looks. From the man who said quantum mechanics surpasseth all human understanding, and you can never hope to understand it, so don’t even try. Bohr even had the gall to say the *“quantum-mechanical description of physical phenomena would seem to fulfill, within its scope, all rational demands of completeness”*. Even though quantum mechanics doesn’t tell us what a photon is, how pair production works, or what the electron is. Even though it dismisses electron spin as an abstract notion via a spinning faster than light non-sequitur. Despite the hard scientific evidence of the Einstein-de Haas effect, Larmor precession, and the wave nature of matter. And do note the use of the word *rational*. Bohr was just so patronising. You will not find one person who has a good word to say about his paper. Even Clauser said he found Bohr’s ideas *“muddy and difficult to understand”*.

*A whole layer cake of bullshit*

The next load of bullshit came from Bohm with his 1952 papers. I say that because people talk about de Broglie-Bohm theory, and de Broglie won his 1929 Nobel Prize for the discovery of the wave nature of electrons. Did Bohm somehow miss the bit that said matter is, by its nature, a wave motion? He couldn’t have. Which means he was going off on one when he said the wavefunction was a mathematical representation of a real field which exerts a force on a particle akin to an electromagnetic field. Especially when he then said it can *“transmit uncontrollable disturbances instantaneously from one particle to another”*. In one fell swoop Bohm had contradicted the wave nature of matter, dug up spooky action at a distance, and then pulled the stake from its heart. He had resurrected a myth and set it free to stalk the land. Not only that, but in 1957 he ignored the original EPR position v momentum debate, and said *spooky action at a distance* applied to the Stern-Gerlach measurements of spin ½ particles.

*Spindle torus animation by Adrian Rossiter, see antiprism.com, resized and reversed by me via EZgif*

He said *“if the x component is definite, then the y and z components are indeterminate”*. That makes as much sense as saying the South side of a whirlpool is definitely spinning from West to East, but the spin on the East side is indeterminate. Conservation of angular momentum says it isn’t. But Bohm was so full of bullshit he even said *“in any single case, the total angular momentum will not be conserved”*. On top of that, he said it was only practicable to test the EPR paradox via *“the polarization properties of correlated photons”*. Photon polarization is nothing like spin ½. This is just a whole layer cake of bullshit from a wannabe mystic who didn’t understand the photon, pair production and annihilation, the electron, the positron, or anything else.

*A barrelful of blarney*

The next shedload of bullshit came in Bell’s 1964 paper. It was a career-promoting *Einstein-was-wrong* hit piece. A hit piece that peddled Copenhagen mysticism and was as clear as mud. Bell didn’t address the physics of spin ½. Instead he gave a rambling smoke-and-mirrors mathematical “proof” which culminated in him pulling a rabbit out of a hat and making the grandiose claim that *“the signal involved must propagate instantaneously”*. How on Earth can anybody pretend to prove *that* via mere mathematics? With no experimental evidence at all? Bell’s whole paper was a barrelful of blarney from a 1960s particle physicist who hadn’t read any of the realist papers from the 1920s and 1930s. In the introduction, he gave some history and referred to Bohm and others. Then he made the claim that non-locality is characteristic of any theory which *“reproduces exactly the quantum mechanical predictions”*. In section II he talked about an entangled pair of spin ½ particles. He defined terms A(a, λ) = ±1 and B(b, λ) = ±1 to describe a Stern-Gerlach measurement on particle 1 at magnet angle a, and a similar measurement on particle 2 at magnet angle b. His equation 2 said the expectation value for such measurements was P(a, b) = ∫ dλ ρ(λ) A(a, λ) B(b, λ), where λ denoted a set of hidden variables, A being a result of measuring particle 1 and B being a result of measuring particle 2. That’s A for Alice and B for Bob.

*Introducing an unwarranted linear relationship*

Bell also said a hidden-variable expectation value should equal the quantum-mechanical singlet state expectation value ⟨σ₁·a σ₂·b⟩ = −a·b, but that *“it will be shown that this is not possible”*. Then in section III he said that for unit vectors this is −cos θ, where θ is the angle between the magnet settings. There’s nothing wrong with that. But there’s plenty wrong with his hidden variable in the form of *“a unit vector λ, with a uniform probability distribution”*. That’s a straw man hidden variable. It’s nothing like the real hidden variable. The real hidden variable is that spin ½ is a real rotation in two orthogonal directions, just like Darwin said in 1928:

*Sinusoidal strip by me, GNUFDL spinor image by Slawkb, Torii by Adrian Rossiter, S-orbital from Encyclopaedia Britannica*

Stern-Gerlach measurements yielding +1 and -1 just don’t do it justice, especially since a magnetic field causes Larmor precession. Those Stern-Gerlach magnets don’t “measure” spin, they just give you a pointer to it, and whilst doing so, they alter it. So that’s another hidden variable. So Bell’s section III is useless. However section IV is worse than useless, because that’s where the sleight-of-hand lies. In equation 13 Bell said A(a, λ) = −B(a, λ), which means your A and B sets of measurements will be opposite if your magnet angles are the same. Then in his equation 14, he treated a set of measurements as a simple number, and said the expectation value was P(a, b) = −∫ dλ ρ(λ) A(a, λ) A(b, λ). That’s saying you can get the same results using only your Alice measurements, because they’re the opposite of your Bob measurements. That means you’re introducing an unwarranted linear relationship.

*When Bell snuck in that linear relationship, he kicked a cosine under the rug*

Then Bell threw in another measurement at angle c, and after a bit of jiggery-pokery came up with expression 15, which is 1 + P(b, c) ≥ |P(a, b) − P(a, c)|. That’s Bell’s inequality. As to what it means, let me draw you a picture:

One plus your bc term is greater than or equal to your ab term plus your minus ac term, which isn’t really negative. Your expectations add up in a sensible fashion, because there is no magic. You can ignore the rest of Bell’s maths because of this statement: *“Nor can the quantum mechanical correlation (3) be arbitrarily closely approximated by the form (2). The formal proof of this may be set out as follows”*. It’s not important. What’s important is that when Bell snuck in that linear relationship, he kicked a cosine under the rug. That resulted in a straw man “classical prediction” that takes a linear form. Hence Bell’s inequality is usually illustrated with a picture like the one below, with a red straight-line classical prediction, and a blue curved-line quantum mechanical prediction:

*CCASA image by Richard Gill, see Bell’s theorem – Wikipedia. Caption: The best possible local realist imitation (red) for the quantum correlation of two spins in the singlet state (blue), insisting on perfect anti-correlation at zero degrees, perfect correlation at 180 degrees. Many other possibilities exist for the classical correlation subject to these side conditions, but all are characterized by sharp peaks (and valleys) at 0, 180, 360 degrees, and none has more extreme values (±0.5) at 45, 135, 225, 315 degrees. These values are marked by stars in the graph, and are the values measured in a standard Bell-CHSH type experiment: QM allows ±1/√2 = ±0.707, local realism predicts ±0.5 or less.*

This concerns two spin ½ particles with opposite spins. It’s saying the classical prediction for the correlation at 45° is -0.5, whilst the quantum mechanical prediction is -0.707. Special thanks to Physics Forums and Richard Gill for this picture. It’s from an old version of the Wikipedia Bell’s Theorem article. Gill has written a number of papers on the subject of Bell’s theorem.
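You can check those numbers with a few lines of Python. This is my own sketch (the angle choices a = 0°, b = 45°, c = 90° and the function names are mine): it plugs a straight-line correlation and the −cos θ correlation into expression 15, 1 + P(b, c) ≥ |P(a, b) − P(a, c)|.

```python
import math

def p_linear(theta_deg):
    # straight-line "classical" correlation: -1 at 0°, +1 at 180°
    return -1.0 + 2.0 * theta_deg / 180.0

def p_cosine(theta_deg):
    # quantum-mechanical singlet correlation: P = -cos(theta)
    return -math.cos(math.radians(theta_deg))

def bell_holds(p, ab, ac, bc):
    # Bell's expression 15: 1 + P(b,c) >= |P(a,b) - P(a,c)|
    return 1.0 + p(bc) >= abs(p(ab) - p(ac))

# with a = 0°, b = 45°, c = 90°: ab = 45°, ac = 90°, bc = 45°
print(bell_holds(p_linear, 45, 90, 45))  # True: 0.5 >= 0.5, right on the boundary
print(bell_holds(p_cosine, 45, 90, 45))  # False: 0.293 is less than 0.707
```

At 45° the linear model gives the −0.5 of the red line and the cosine gives the −0.707 of the blue curve; the linear model scrapes through the bound exactly, whilst the cosine breaks it, which is the claimed “violation”.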

*In Bell’s toy model, correlations fall off linearly*

Don’t think Gill made a mistake with this. That heavyweight Stanford article I mentioned includes Abner Shimony amongst the authors, and says this: *“In Bell’s toy model, correlations fall off linearly with the angle between the device axes”*. In addition, you can see much the same picture in Alain Aspect’s paper Bell’s Theorem: The Naive View of an Experimentalist:

*Image by Alain Aspect*

This concerns two photons with the same polarization, so we don’t have the exact same curve, but it’s still cosinusoidal, and the meaning is the same. Again there’s a straight-line classical prediction, and a curved-line quantum mechanical prediction. Aspect mentioned Malus’s Law, which I’ve mentioned previously. It’s to do with optics, and it’s of crucial interest. So much so that I want to show you something from the Wikipedia Bell’s theorem article:

What it’s saying is vectors a and b are orthogonal, whilst vector c is at a 45° angle to both of them. P is a correlation. When you combine a and b, you get a result that’s zero. That’s cos 90°. Combine a and c, or b and c, and you get a result that’s –√2/2, which is –0.707. Note that 0.707 is cos 45°. The claim is that 0.707 is not less than 1 – 0.707 = 0.293, therefore *“there is no local hidden variable model that can reproduce the predictions of quantum mechanics”*. But there is something that can. Polarizing filters.
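Here’s that arithmetic as a quick sketch (the variable names are mine; the correlation is P = −cos θ as per the singlet state):

```python
import math

# the Wikipedia example: a and b orthogonal, c at 45° to both,
# with correlation P = -cos(angle between the settings)
P_ab = -math.cos(math.radians(90))  # 0.0
P_ac = -math.cos(math.radians(45))  # about -0.707
P_bc = -math.cos(math.radians(45))  # about -0.707

# Bell-type bound: |P(a,c) - P(a,b)| <= 1 + P(b,c)
print(abs(P_ac - P_ab))  # about 0.707
print(1 + P_bc)          # about 0.293, so the bound fails
```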

*Polarizing filters*

Take two polarizing filters and place them in a light beam, one orthogonal to the other. No light gets through. Then take a third filter and put it between the first two at a 45° angle. Now some of the light does get through:

*Crossed polarizer images from Rod Nave’s **hyperphysics*

The third polarizing filter is sometimes presented as something mysterious, but that mystery is based upon a lack of understanding. Darel Rex Finley explained it well in his 2004 article Third-Polarizing-Filter Experiment Demystified – How It Works. He described how a horizontal polarizing filter doesn’t just let through a horizontally polarized light wave. It lets through the horizontal *component* of a light wave polarized at some angle, also rotating it so that it ends up horizontal. If the light wave is horizontally polarized, it lets it all through. If the light wave is polarized at 22.5° it lets almost all of it through whilst rotating it by 22.5°. If the light wave is polarized at 45° it lets some of it through whilst rotating it by 45°. If the light wave is polarized at 67.5° it lets less of it through whilst rotating it by 67.5°. If the light wave is polarized at 90°, it lets none of it through. Here are Finley’s depictions:

*Polarization images by Darel Rex Finley, see **Third-Polarizing-Filter Experiment Demystified – How It Works*

There’s an adjacent-over-hypotenuse cosine function at work here. Cos 0° = 1, cos 22.5° = 0.923, cos 45° = 0.707, cos 67.5° = 0.382, cos 90° = 0. As Finley said, there are no spooky quantum properties, and there’s nothing mysterious about it. Instead it’s a straightforward chain of cause and effect yielding rational, comprehensible results. It’s closely related to Malus’s law, which gives the intensity of the transmitted light as I = I₀ cos²θ. There’s a square in there because *“the transmitted intensity is proportional to the amplitude squared”*. When θ is 0°, all the light gets through, when θ is 90°, none of the light gets through, and when θ is 45°, half the light gets through because cos²θ is 0.707 x 0.707 = 0.5. Note that the same is true for a beam of unpolarized light going through a polarizing filter. It’s a mixture of polarizations, and half the light gets through.
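The three-filter arithmetic is easy to verify. Here’s a minimal sketch for ideal polarizers (the `malus` helper is my own), treating each filter as transmitting the cos² component and re-aligning the wave:

```python
import math

def malus(intensity, theta_deg):
    """Transmitted intensity through an ideal polarizer at angle
    theta_deg to the light's polarization: I = I0 * cos^2(theta)."""
    return intensity * math.cos(math.radians(theta_deg)) ** 2

I0 = 1.0  # light already polarized at 0° by the first filter

# two crossed filters: 0° then 90° -> nothing gets through
print(malus(I0, 90))             # ~0 (floating-point cos 90° is ~6e-17)

# insert a 45° filter between them: 0° -> 45° -> 90°
print(malus(malus(I0, 45), 45))  # 0.25 (to rounding): a quarter gets through
```

A quarter of the 0°-polarized light survives the 45° + 90° pair, where the 90° filter alone would pass nothing. With unpolarized input, halve these figures again.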

*It describes a cosine-like curve because that’s how polarizing filters work *

It’s similar when your polarizing filters are at opposite ends of an entanglement experiment. You emit a random assortment of photon pairs towards your polarizing filters at A and B, and on average half of them get through A. If your photon pair have the same polarization and your polarizing filters have the same alignment, then if you detect a photon at A you will detect a photon at B. If you rotate filter B round by 90° and repeat, then if you detect a photon at A you will not detect a photon at B. There’s a sliding scale between the two extremes, only it isn’t linear. It describes a cosine-like curve because that’s how polarizing filters work. When you know this, articles like Bell’s Theorem explained, Bell’s theorem and Bell’s Theorem with Easy Math seem totally lame. So does Clauser and Freedman’s experiment. Take another look at their results. Freedman’s inequality is δ = | R(22½°) / R₀ – R(67½°) / R₀ | – ¼ ≤ 0. That’s where R is the coincidence rate for two-photon detection with both polarizing filters in place, and R₀ is the coincidence rate for two-photon detection with no polarizing filters in place. The former divided by the latter is about 0.5 when the two polarizers have the same orientation and so angle ϕ is zero. This is what you’d expect, because on average half the light gets through polarizer A. It isn’t quite 0.5 because polarizers aren’t perfect:

*Image from **Experimental Test of Local Hidden-Variable Theories** by Clauser and Freedman*

Then as you rotate polarizer B, the coincidence rate drops off, following a cosine-like curve. Looking at the plot above, at 22½° it’s about 0.4, at 67½° it’s about 0.1, so δ = 0.4 – 0.1 – 0.25 = 0.05. Turning to Malus’s law, the 0.5 at zero degrees corresponds to cos² 0° = 1 divided by two. So for cos² 22.5° we have 0.923 x 0.923 = 0.852, which we divide by 2 to give 0.426. Then for cos² 67.5° we have 0.382 x 0.382 = 0.146, which we divide by 2 to give 0.073. So δ = 0.426 – 0.073 – 0.25 = 0.103. That isn’t less than or equal to zero, so Freedman’s inequality is broken. But not because of some magical mysterious spooky action at a distance. Because of the way polarizers work.
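Here’s that sum as a script, assuming ideal polarizers (the 0.5 factor is the half of the pairs that get through polarizer A on average; the function name is mine):

```python
import math

def r_ratio(phi_deg):
    # Malus's-law prediction for R(phi)/R0 with ideal polarizers:
    # half the photons pass A on average, then cos^2(phi) of those pass B
    return 0.5 * math.cos(math.radians(phi_deg)) ** 2

delta = abs(r_ratio(22.5) - r_ratio(67.5)) - 0.25
print(round(delta, 3))  # 0.104, i.e. greater than zero
```

Unrounded, δ comes out at about 0.104 rather than the 0.103 you get from rounded intermediate values; either way it exceeds zero.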

*Those photons weren’t entangled at all *

Not only that, but Clauser and Freedman’s photons weren’t even entangled. The wavelengths were 581 nanometres and 406 nanometres. They weren’t the same wavelength because the photons were not produced at the same time. Clauser and Freedman referred to an intermediate state lifetime of 5 nanoseconds. Those photons were produced 5 nanoseconds apart. A photon travels 1.5 metres in 5 nanoseconds. By the time the second photon was emitted, the first photon was through the polarizer. Those photons weren’t entangled at all. Aspect used the same calcium cascade. So his photons weren’t entangled either. That’s why you can do a Bell-type Polarization Experiment With Pairs Of Uncorrelated Optical Photons, and get the same result. This is why Al Kracklauer was talking about Malus’s law twenty years ago. And why Dean L Mamas wrote a no-nonsense “realist” paper called Bell tests explained by classical optics without quantum entanglement. It’s brief and to the point, saying this: *“The observed cosine dependence in the data is commonly attributed to quantum mechanics; however, the cosine behavior can be attributed simply to the geometry of classical optics with no need for quantum mechanics”*. Quite. I found out about this because Gill wrote a paper in 2022 dissing Mamas by moving the goalposts and saying the quantum mechanical cosine is twice the classical cosine. Gill used the trick of changing both polarizer angles. We are dealing with correlations here, so it’s the difference between the polarizer angles that matters, not the difference of each from some reference angle. Cos 2θ is not equal to 2 cos θ. It’s like what Joy Christian said, the terms do not commute.
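For what it’s worth, that kind of classical-optics picture is easy to sketch as a toy Monte Carlo. This is my own construction, not Mamas’s calculation: each photon pair gets a shared random polarization, and each polarizer passes its photon with Malus-law probability cos² of the relative angle.

```python
import math
import random

def coincidence_rate(phi_deg, n=100_000, seed=42):
    """Fraction of pairs where both photons pass: a shared random
    polarization per pair, polarizer A at 0° and polarizer B at phi,
    each transmitting with Malus-law probability cos^2."""
    random.seed(seed)
    phi = math.radians(phi_deg)
    hits = 0
    for _ in range(n):
        pol = random.uniform(0.0, math.pi)  # shared polarization
        p_a = math.cos(pol) ** 2            # Malus at polarizer A (0°)
        p_b = math.cos(pol - phi) ** 2      # Malus at polarizer B (phi)
        if random.random() < p_a and random.random() < p_b:
            hits += 1
    return hits / n

for angle in (0, 22.5, 45, 67.5, 90):
    print(angle, round(coincidence_rate(angle), 3))
```

The coincidence rate falls off smoothly and cosinusoidally with the angle between the polarizers, not linearly. How closely such a model can match the quantum prediction point for point is precisely what the Mamas-Gill exchange is about.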

*Is Spookiness Under Threat?*

Note that Gill also wrote a paper dissing Joy Christian, who had the temerity to challenge spooky action at a distance in 2007. See Mark Buchanan’s New Scientist article Quantum Entanglement: Is Spookiness Under Threat? It referred to Christian’s paper Disproof of Bell’s Theorem by Clifford Algebra Valued Local Variables. After that, Christian got cancelled for challenging quantum entanglement. He was dropped by the Perimeter Institute and Oxford University, because of people like Gill, who talks about Bell denialists. Scott Aaronson used the same word when he was dissing Christian. He also said he’d *“decided to impose a moratorium, on this blog, on all discussions about the validity of quantum mechanics in the microscopic realm, the reality of quantum entanglement, or the correctness of theorems such as Bell’s Theorem”*. That’s what you call safe-space censorship. He also said this: *“Imagine, for example, that there existed a devoted band of crackpots who believed, for complicated, impossible-to-pin-down reasons of topology and geometric algebra, that triangles actually have five corners. These crackpots couldn’t be persuaded by rational argument”*. And so on. Scott Aaronson is a cheerleader for quantum computing.

*Physicists Create a Wormhole Using a Quantum Computer*

There’s a lot of that sort of stuff in physics. Propaganda and censorship have been par for the course for decades, along with ad-hominem abuse and de-platforming. The tofu-eating wokerati learned it from the tenured academics living a life of ease on the public purse. That’s why you can’t read about people* like Kracklauer, Mamas, and Christian in Nature, or in Quanta magazine. Instead you can read how Physicists Create a Wormhole Using a Quantum Computer. Wooo! You can also read how *“the power of a quantum computer grows exponentially with each additional entangled qubit”*. The problem is that quantum entanglement is bullshit, so entangled qubits are bullshit, so quantum computing is bullshit too. That’s why it still hasn’t delivered anything, and never ever will. What else would you expect when the quantum entanglement story is a fairy tale? It’s a castle in the air, built of bullshit, blarney, and bollocks, all mixed in with straw men and non-sequiturs, all held aloft by hype, hokum and hogwash. It is sophistry peddled by shysters and charlatans. It is cargo-cult pseudoscience promoted by quantum quacks who are spinning you a yarn and playing you for a sucker. The emperor has no clothes, and scientific fraud leaves a nasty taste in the mouth. Remember that the next time you read some jam-tomorrow puff-piece about quantum information science.

* It’s like finding a whole cave wall full of messages from a whole host of different Arne Saknussemms. So many people have been here before. But you only find them once you know you have to search on Entanglement and Malus’s law. Here are a few:

Disproof of Bell’s Theorem: Further Consolidations by Joy Christian, who refers to Bell’s own derivation of Malus’s law.

A Comparison of Bell’s Theorem and Malus’s Law: Action-at-a-Distance is not Required in Order to Explain Results of Bell’s Theorem Experiments by Austin J Fearnley who said *“there is an enforceable duality between results of Malus’ Law experiments and the results from Bell experiments”.*

Disentangling entanglement by Antony R Crofts, who said *“Violation of the expectation values of Bell’s theorem have been taken to exclude all ‘local realistic’ explanations of quantum entanglement. A simple computer simulation demonstrates that Bell’s inequalities are violated by a local realistic model that depends only on a statistical implementation of the law of Malus in the measurement context”.*

Entanglement: A Contrarian View by A F Kracklauer, who said *“This comparison is made on the basis of Malus’ Law”*.

Polarization Correlation of Entangled Photons Derived Without Using Non-local Interactions by Kurt Jung, who said *“In consequence Bell’s inequalities are irrelevant”*.

Analyzer Output Correlations Compared for Entangled or Non-entangled Photons by Gary Gordon, who said *“Our predicted results for this non-entangled photon case are an exact match to those reported in the literature for the analysis and experimental outcomes for the entangled photon case”*.

A New Model for Linear Polarizing Filter by Herb Savage, who said *“This new single-photon model for a linear polarizing filter shows the same apparent violation of Bell’s theorem as current loophole-free experiments”*.

Fatal Flaws in Bell’s Inequality Analyses Omitting Malus Law and Wave Physics Born Rule by Arthur S Dixon, who talked about the conscious or negligent omission of Malus’ law in the furtherance of scientific fame and fortune. His reference 43 refers to a paper in The Physics Teacher called Bell’s Theorem and Einstein’s ‘Spooky Actions’ from a Simple Thought Experiment, by Fred Kuttner and Bruce Rosenblum, who give a reference 11 to Malus’s law on page 128.

Rotational Invariance, Phase Relationships and the Quantum Entanglement Illusion by Caroline H Thompson, along with The tangled methods of quantum entanglement experiments. She said *“They appear to be rejecting papers that endanger the accepted dogma”*.

Classical Interpretation of EPR-Bell Test Photon Correlation Experiments by Thomas Smid, who said *“All experiments are claimed to rule out the existence of Hidden Variables as Bell’s Inequality is violated and the coincidence rate (for initially unpolarized radiation) is observed to follow the well known Malus law”*.

Experimental Counterexample to Bell’s Locality Criterion by Ghenadie N. Mardari who said *“If polarization measurements are interpreted as transformations, then there is no mystery to explain”.*

Visualization of quantum entanglement by Stefan Heusler, who said *“Visualization of the angular dependency of the transmission probability p_Ĥ(θ) = |⟨Ĥ|Ω_Polarisation⟩|² of vertical polarized light. After many measurements, Malus law is recovered, indicated as red line”*.

One of the “best” videos on YouTube about Bell’s inequality (https://youtu.be/sAXxSKifgtU) makes the exact same mistake. First it derives the inequality by assuming equal probability of test outcomes (i.e. a linear probability distribution), then it “proves” the violation by introducing an angle-dependent probability (i.e. a non-linear distribution)!!!

You should take a look at all the praising comments. Sometimes I wonder if it’s just me or everyone else really has gone mad.

Many thanks Leon. I’ll check that out and probably add it to the list or include it in the article.

Hi John,

Thanks again for the article, I knew I had to ask you to write about the Bell-related Nobel prizes 🙂

And Leon, I also sometimes have the feeling a lot of people have lost their mind, it seems all is upside down. And everybody is parroting the current and “majority-backed” thing without proper questioning (Covid for sure was an eye opener). It seems our world has gotten dumber, it should be the other way round!

In my work I deal with optics and lasers so I knew something was not right. It also appears that these “scientists” hide behind their fancy math and difficult terms and ridicule outsiders as not smart enough to understand. And also give the typical reply that it is “proven by measurements”; what they don’t realize is that the setup of the measurement is key in understanding the outcome, and wrong assumptions lead to wrong conclusions. And not sure how the saying goes, but if you want a miracle, you will get a miracle… (The Large Hadron Collider comes to mind; I always wondered whether all those billions of euros actually explained anything additional.)

New insights and discoveries are always done by outsiders, and these days it is increasingly difficult to swim against the stream (which I believe is going really into the drain or abyss).

Happy to have this place so I don’t lose my mind 🙂

Good stuff, Raf. As far as I know the saying is this:

When a Church needs a miracle, a Church gets a miracle. Apparently it goes back to the old days of pilgrims, saints, relics, and weeping statues. I can’t say exactly where I got that from, and will have a look to see if I can dig it up. Either way, it seems to be very true of “Big Science”.

Another thing that is related to this is the delayed choice quantum eraser (actually it is more or less the same as Wheeler’s delayed choice) double slit experiment. I also looked into how they did this experiment and actually the result does not show what they conclude from it.

What the detectors at D3 and D4 show is not the typical interference pattern, and it comes from the prism-crystal they use to split the beam. This guy explains it well: https://yrayezojeqrgexue.quora.com/The-delayed-choice-quantum-eraser-The-delayed-choice-quantum-eraser-is-one-of-the-most-hyped-experiments-in-popular-sci?ch=10&oid=50697428&share=092c6fc8&srid=QT6x&target_type=post

He also mentions Sabine, though to be fair her video only dipped a little toe into the matter, and as always without really explaining why… Nowadays anyway she is more interested in keeping her subscribers entertained.

Many thanks Raf. I’ve always felt cynical about the delayed choice quantum eraser, but I’ve never had a proper look at it. I will now. I will watch the video and take it from there.

I actually saw Marlan Scully give a seminar on his delayed choice quantum eraser experiment.

…

and just like all QM entanglement phenomenon, it sounds exciting.


It sounds like you do a double slit but record which slit, so you don’t get a pattern on the screen. Then you erase the data and magically the image on the screen is changed to an interference pattern, after it has already been recorded. Amazing!


But what really happens is you get two blurs on the screen, one from each slit, when you record which slit the photon went through. Then when you ‘erase’ the which-slit data, you somehow get two different sets of data. Each of these sets corresponds to an interference pattern. However, the interference patterns are offset and their sum is equal to the two blurs you originally got.


At the end I asked, “but clearly the image on the screen never changes”. He said, it’s just the way you analyze the data that changes.


I know I hand-waved through it, and the experiment is significantly more complex, but to me, it is not as exciting as it sounds. Unless I just don’t understand the exciting part. Everyone else seems very excited by it.

That’s interesting Doug. You know, the more I find out about this sort of thing, the more I learn that there is no substance to it. This is not good. Especially since the issue is not limited to quantum physics.

Hi Doug, interesting story and you had the right gut feeling. Indeed on the screen nothing changes, it is how the data is processed, and the reason they have these offset patterns (which of course add up to the REAL pattern on the screen) is because of the crystal that is used; it has something to do with the structure inside the crystal. They wrongly attribute this pattern to the typical double slit result (and make it appear it is the same).

It is the same story as with the polarizers in Bell’s experiments, they don’t seem to understand that these physical elements in the optical path do affect the result. It seems they are blind to this…

What I also think happens is that the experimentalists probably know this, but keep it under wraps so both the theorists and the experimentalists can confirm each other and everybody is happy and can continue their tax-payer-funded work (and of course woo the general public with this “magic”, and since they don’t understand it, the funding can go on endlessly…). They even have the guts to ask for a crazy amount of funding to build an even bigger collider, which will lead to no new understanding.

And the trouble right now is that people are getting a lot of funding for quantum information science. See this for example:


https://qubitreport.com/quantum-computing-business-and-industry/2021/08/06/375m-fund-marks-a-new-era-for-uk-tech-innovation-quantum-computing/


“The Qubit Report – Because Quantum is Coming, £375M Fund Marks a New Era for UK Tech Innovation, Quantum Computing, August 6, 2021”.

https://www.reuters.com/article/us-usa-quantum-funding-idUSKBN25M0Y9


“The U.S. Department of Energy on Wednesday said it will provide \$ 625 million over the next five years for five newly formed quantum information research hubs as it tries to keep ahead of competing nations like China on the emerging technology. The funding is part of $1.2 billion earmarked in the National Quantum Initiative Act in 2018.”.

PS: if you want to put a dollar symbol in the comments, put a backslash \ in front of it.

The answer to all this Bullshito over spending on cubits is obvious John, all of us need to form a shell company together and start applying for grants !

And because of my brilliantly smashing expertise in Bullshito, is exactly why I should be put in charge of the marketing department.

What do you think Dr. Duffield ?

Grants are not for the likes of you and me, Greg. Only academic charlatans are permitted to live a life of ease on the public purse… for standing four-square in the way of scientific progress.

Hey I tried to post a comment, and was denied. I’ll try again.

Hey John,

I’m glad you’ve been posting more frequently. I always enjoy them.

I asked you last month if you’ve read Thomas Kuhn’s works. I’ve been recently interested in the sociology of science, and I’ve been reading as much on the matter as I can. Something important I’ve taken away from this is an understanding that science in reality is very much a sociological process. As much as we’d love to believe that science is a forum in which the best ideas rise to the top, and everyone argues for theories based on evidence and simplicity, understandability and reasonableness, and all the things we think would be obvious to judge theories on, it really seems like science is driven by many other human tendencies: pride and charisma and popularity and such all play into theory making. Any theory outside the orthodoxy is up against greater scrutiny and is held to much greater standards.

I don’t know if you have yet developed a concise, cohesive model, but with this blog you hint towards foundations you believe to be important. These all seem like very rational foundations. You seem to imply that new foundations would do away with the current mysticism found in quantum mechanics. You argue that quantum and particle physics are foundationally flawed. The new foundations for a new theory that could fix the current flawed theories in physics would supplant foundations that sit beneath behemoths of theory.

What I’m trying to say is that you’re up against a pretty huge challenge. For the physics community today to throw away a theory that has had so much development, in terms of money and man-hours spent on it, and the mountains of publications, the bar is set incredibly high. The new replacement theory would probably have to derive nearly all results and explain nearly all paradigmatic experiments in current physics. I personally can’t imagine myself, or any single person creating de novo such a comprehensive corpus of work.

I understand that none of the work that went into the creation of the current theories of modern physics came close to the rigor and comprehensiveness that would be required today. This doesn’t seem fair, but I think it actually does make sense from a sociological perspective. The beginning of quantum mechanics was riddled with vague, poorly written publications, but, to understand why this was acceptable at the time, we need to realize that quantum mechanics emerged at a very particular time, during which the theory and the practice were evolving together. This means that to be accepted into the theory making, one only had to find a way to explain one result, since there were so many new, and altogether unexplained results. Nothing had to supplant old, established theories. Maybe in many cases, old established philosophies had to be supplanted or modified, but those the scientific community tends to forsake much more easily than explanations of practical results. Today, any work on fixing quantum mechanics or particle physics would involve not explaining new results, but tearing away foundations and rebuilding. This is a project very few people within the community are willing to work on or fund. This is a project that I think would only be accepted if, in one swift motion, the old foundations could be sufficiently torn down, and replaced, leaving all, or most, current results standing. I think that this is a project that simply could not succeed piecemeal.

I guess this is all prelude to ask, what is your intent with this blog? To spark curiosity and prove to your readers that questioning the scientific orthodoxy is not only reasonable, but in this case well justified?

Do you have any aspiration to head a project to attempt to convince the scientific community as a whole that there exists a better theory?

To motivate others to go into science, and bring rationality into their field in whatever incremental ways they can?

I’m also curious how much you know about Stephen Wolfram’s project on a foundational theory? There was a very interesting time during the early stages of the pandemic where a bunch of scientists took time to pursue their own foundational theories. Stephen Wolfram and Eric Weinstein are two of the more prominent examples. I find this time to be really interesting. While neither project was entirely successful, I think this period of time shows something really interesting about the scientific community. It seems like many people harbor some feeling that there needs to be a foundational change, but it’s only worth their time to pursue when they have the free time to play around. Something else interesting I found was that the Lone Wolf approach that Weinstein took was entirely untenable. Not necessarily that his theory was bad (it was), but he was entirely unable to convince anyone else that his theory was worth anything. He couldn’t explain it, he couldn’t back it up, and it couldn’t derive specific results. Wolfram’s approach was very different. He started an open source community, guided by a few foundational principles, and set the power of hobbyists to work. Personally I think Wolfram’s project will come to be seen as similar to string theory: certainly interesting for the tools it develops, but definitely irrelevant to reality. Regardless, he was highly successful in showing that (within a type of toy universe) there seems to be some inevitability to the structures of relativity and quantum mechanics.

Anyways, rambling aside, do you see any value in attempting a similar project to Wolfram’s? It seems you could maybe structure a program to probe your ideas’ foundations. Create some sort of basic assumptions (I’m not entirely sure what those would be; maybe that the electron is geometrical, measurement involves some frequency transform, things like that) and outline goals for derivations.

I’d love to hear your thoughts, and thanks again for all the thought provoking posts. It’s all much appreciated.

For some reason I’m unable to post. I’ll try again.

Hi Eric. Apologies for the trouble you’ve had posting. The Akismet antispam can be a pain in the arse. Email me at johnduffield at btconnect dot com for anything that’s gone adrift, and I will put it up here. As for my aims, I’m just a guy who wants to see some advancement in physics, and thence the world. As to whether this website actually helps with that, well, I hope so. When it comes to addressing all the points in your post, it’s a big job, so I will go through it and get back to you. Meanwhile the moot point is this: I hope I can shift things along a smidge. Maybe I can, and maybe I can’t, but either way, I think it’s worth trying.

Hi Eric, blogs like this, which cover the history of theory development, point out faults, inconsistencies, and misinterpretations, and provide alternative solutions (which I do agree will require a lot more exploration, but can be an inspiration for future generations), are very rare. I think the comment filter is a good thing. Not to make this an echo chamber, but to provide a meaningful discussion (based on fact, logic, and understanding); other public forums suffer in that respect…

What John is pointing out is that the current theories are flawed and full of beliefs and assumptions (actually, for me it is now very similar to organised religion…) that keep the majority stuck in a certain thinking pattern. The reason I am now following the discussion closely is that the whole particle aspect is the key thing to overthrow with something new. I do think string theory was onto something good (because everything in essence is a vibration) but it was overly complicated, untestable, and so on. What the new theory needs to do is explain why an electron, proton, neutron (and all other flavours detected in collisions) have a certain energy content, calculate the fine structure constant, and many other things that are not explained at all by the current model. It should also take away the mysteries and magic that are currently involved (the whole dark energy/dark matter and big bang inflation shenanigans are good examples). The truth will always prevail, and I do believe this blog has its impact. I am grateful for all the time John puts into it.

Sorry to be slow replying, Eric, you’re in italics, I’m not.

.

I’m glad you’ve been posting more frequently. I always enjoy them..

Thanks. I try to do one a month, but I work for a living, and the day job has been tough of late.

.

I asked you last month if you’ve read Thomas Kuhn’s works. I’ve been recently interested in the sociology of science, and I’ve been reading as much on the matter as I can. Something important I’ve taken away from this is an understanding that science in reality is very much a sociological process. As much as we’d love to believe that science is a forum in which the best ideas rise to the top, and everyone argues for theories based on evidence and simplicity, understandability and reasonableness, and all the things we think would be obvious to judge theories on, it really seems like science is driven by many other human tendencies: pride and charisma and popularity and such all play into theory making. Any theory outside the orthodoxy is up against greater scrutiny and is held to much greater standards..

I’m sure science is a sociological process. However I think for physics, it’s more than just that. So much more that the field is up to its neck in corruption and dishonesty. I look at quantum entanglement, and it’s out-and-out scientific fraud, with bile directed at people who challenge the orthodoxy. I think particle physics is also troubled, with Goebbelesque lies-to-children, and the total censorship of electron papers et cetera. Meanwhile gravitational physics flatly contradicts Einstein, and if you try to point that out, you will get banned from every physics forum on the planet. Write a paper saying “the speed of light is spatially variable” and it will never see the light of day. Instead what will, is multiverse moonshine and holographic pseudoscience. The more I learn, the more I think that the situation is absolutely appalling.

.

I don’t know if you have yet developed a concise, cohesive model.

Just an outline. See https://physicsdetective.com/the-theory-of-everything/ . It’s basically William Kingdon Clifford’s 1870 space theory of matter.

.

but with this blog you hint towards foundations you believe to be important. These all seem like very rational foundations. You seem to imply that new foundations would do away with the current mysticism found in quantum mechanics. You argue that quantum and particle physics are foundationally flawed. The new foundations for a new theory that could fix the current flawed theories in physics would supplant foundations that sit beneath behemoths of theory..

It isn’t quite like that. Note my strapline, courtesy of Bert Schroer: “Perhaps the past, if looked upon with care and hindsight, may teach us where we possibly took a wrong turn”. Those new foundations you’re referring to are old foundations, and I didn’t come up with them. The foundations of quantum physics were laid down by the “realists” like de Broglie, Schrödinger, Darwin, and so on. The foundations of gravitational physics were laid down by Einstein and others. All these foundations are still there, but the behemoths you mention are not built on them.

.

What I’m trying to say is that you’re up against a pretty huge challenge. For the physics community today to throw away a theory that has had so much development, in terms of money and man-hours spent on it, and the mountains of publications, the bar is set incredibly high. The new replacement theory would probably have to derive nearly all results and explain nearly all paradigmatic experiments in current physics. I personally can’t imagine myself, or any single person creating de novo such a comprehensive corpus of work..

I’m just doing my bit. Just spread the word Eric. There’s plenty of young physicists who can use this stuff to good advantage.

.

I understand that none of the work that went into the creation of the current theories of modern physics came close to the rigor and comprehensiveness that would be required today. This doesn’t seem fair, but I think it actually does make sense from a sociological perspective. The beginning of quantum mechanics was riddled with vague, poorly written publications, but, to understand why this was acceptable at the time, we need to realize that quantum mechanics emerged at a very particular time, during which the theory and the practice were evolving together..

I’m afraid I don’t agree, Eric. There are some great papers written in the 1920s and 1930s. However history is written by the victors. And in quantum mechanics, the victors were the Copenhagen school, who insisted that quantum mechanics surpasseth all human understanding. They even stole Schrödinger’s cat and used it to promote quantum weirdness. Start reading the articles here: https://physicsdetective.com/articles/historical-articles/. Let me know if links to papers no longer work, and I will fix them. (I have two issues with the old papers: 1) link rot and 2) Elsevier etc playing whack-a-mole with Sci-Hub).

.

This means that to be accepted into the theory making, one only had to find a way to explain one result, since there were so many new, and altogether unexplained results. Nothing had to supplant old, established theories. Maybe in many cases, old established philosophies had to be supplanted or modified, but those the scientific community tends to forsake much more easily than explanations of practical results. Today, any work on fixing quantum mechanics or particle physics would involve not explaining new results, but tearing away foundations and rebuilding. This is a project very few people within the community are willing to work on or fund..

I fear that if they don’t, they will, in the end, be thrown to the wolves. Then there will be no funding, and no community.

.

This is a project that I think would only be accepted if, in one swift motion, the old foundations could be sufficiently torn down, and replaced, leaving all, or most, current results standing. I think that this is a project that simply could not succeed piecemeal..

I certainly think there’s a problem. For example particle physics has dug itself into a hole with The Standard Model. How can anybody at CERN ever admit that the Higgs mechanism flatly contradicts E=mc², wherein the mass of a body is a measure of its energy content? They can’t. They’ve painted themselves into a corner with particle “discoveries”, and they can’t get out. One day I will look into that, and plot an escape route.

.

I guess this is all prelude to ask, what is your intent with this blog? To spark curiosity and prove to your readers that questioning the scientific orthodoxy is not only reasonable, but in this case well justified?.

I’m just doing my bit to try to make the world a better place. See “the future isn’t what it used to be” for more: https://physicsdetective.com/future/

.

Do you have any aspiration to head a project to attempt to convince the scientific community as a whole that there exists a better theory?.

No. Well, I suppose you could say I’m already heading a project. A project of one.

.

To motivate others to go into science, and bring rationality into their field in whatever incremental ways they can?.

Yes.

.

I’m also curious how much you know about Stephen Wolfram’s project on a foundational theory? There was a very interesting time during the early stages of the pandemic where a bunch of scientists took time to pursue their own foundational theories. Stephen Wolfram and Eric Weinstein are two of the more prominent examples..

I took a look at it, and thought there were no foundations present.

.

I find this time to be really interesting. While neither project was entirely successful, I think this period of time shows something really interesting about the scientific community. It seems like many people harbor some feeling that there needs to be a foundational change, but it’s only worth their time to pursue when they have the free time to play around. Something else interesting I found was that the Lone Wolf approach that Weinstein took was entirely untenable. Not necessarily that his theory was bad (it was), but he was entirely unable to convince anyone else that his theory was worth anything. He couldn’t explain it, he couldn’t back it up, and it couldn’t derive specific results..

As above.

.

Wolfram’s approach was very different. He started an open source community, guided by a few foundational principles, and set the power of hobbyists to work. Personally I think Wolfram’s project will come to be seen as similar to string theory: certainly interesting for the tools it develops, but definitely irrelevant to reality. Regardless, he was highly successful in showing that (within a type of toy universe) there seems to be some inevitability to the structures of relativity and quantum mechanics..

Noted.

.

Anyways, rambling aside, do you see any value in attempting a similar project to Wolfram’s? It seems you could maybe structure a program to probe your ideas’ foundations. Create some sort of basic assumptions (I’m not entirely sure what those would be; maybe that the electron is geometrical, measurement involves some frequency transform, things like that) and outline goals for derivations..

A lot of the ideas you read on this blog are ideas I’ve dug up by reading other people’s papers. There is some original content by me, but not so much. For example, the electron model is from Williamson and van der Mark’s 1996 paper Is the electron a photon with toroidal topology? See https://5p277b.n3cdn1.secureserver.net/wp-content/uploads/electron.pdf

.

I’d love to hear your thoughts, and thanks again for all the thought provoking posts. It’s all much appreciated..

My pleasure Eric. It’s good to talk.

.

For some reason I’m unable to post. I’ll try again..

Sorry. Before I got Akismet, I was getting a hundred spam comments a day, some of them pretty awful. Now the spam filter can be over-zealous at times.

“and providing alternative solutions”

I have one, so do many others. There are so many that professors won’t even consider looking at any. Hence, getting anyone with solid knowledge to review your work is difficult. Fortunately, there is a lot of solid knowledge, much of it ready to let go of the current baggage, in the hobbyist community.

You may find it interesting. The links (embedded in the post) and a diagram and equation sheets are here:

physicsdiscussionforum.org/graphical-representation-of-the-refraction-steerin-t2529.html

I’m currently trying to take the step away from static gravity fields to interacting ones.

physicsdiscussionforum.org/fizeau-follows-fresnel-s-fractional-formula-favori-t2558.html

“(actually for me it is now very similar to organised religion…),”

This is a common criticism, including by me. Check out my second comment under my physics article.

“The truth will always prevail…”

That is the intent of the Scientific Method. Verify your hypotheses to eke out the truth. Where most people go wrong is to claim that Science has found the truth. Or that there is only one plausible model for anything.

I’ll take a look Cedron. Apologies though, I’ve been run off my feet of late, so it might be the weekend. The job is tough, and there ain’t enough hours in the day.

Hey John,

Thanks for the long thought-out reply. Like you, I’ve been busy of late. I’ll look it over more thoroughly once I have the time tonight.

No worries on the spam filter, that’s totally understandable.

Thanks for referencing my 2019 quote on Malus & Bell:

A Comparison of Bell’s Theorem and Malus’s Law: Action-at-a-Distance is not Required in Order to Explain Results of Bell’s Theorem Experiments by Austin J Fearnley who said “there is an enforceable duality between results of Malus’ Law experiments and the results from Bell experiments”.

I have written four more papers on vixra since then, all pertaining to Malus and Bell. My ideas and models have continued to change, but I still hold that Malus is crucial to the Bell experiment. I feel too old to write any more papers!

I came across your posts by chance today and am pleased that not too much time has elapsed since you posted.

I noticed a duality between Bell and Malus calculations by following Susskind’s online Theoretical Minimum course. He gave a simple example of the QM calculation. It became apparent to me only much later (my being a naive amateur) that he had only proved Malus rather than Bell, as he measured at zero degrees and 45 degrees, and his polarised pair were aligned with zero degrees and 180 degrees, e.g. up & down polarisations. So he already had a pre-polarised beam in the zero-degrees direction. That is, he had used the equivalent of a Malus polarising filter to start off the Bell calculation. I have noticed this in some wiki proofs also. (Though I know there is also a more complicated proof used by QM.)

Later, I started playing with Malus’s law for photons and for electrons [Io cos^2(theta/2)] for a concrete example. Again, after some time and very slow realisations, I thought that I had surely seen the Malus calculation before. Not a mundane calculation, but the term 0.5*(0.5-0.5/sqrt2) rang a bell in my head as exactly the one used by Susskind for Bell.
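A quick numerical check bears this out: the term 0.5*(0.5-0.5/sqrt2) is exactly the Malus-type joint probability ½·sin²(θ) evaluated at θ = 22.5° (taking 22.5° as the analyser offset, which is my reading of the angles involved; it is the usual Bell-test angle).

```python
import math

# Susskind's Bell-calculation term, as quoted above
susskind_term = 0.5 * (0.5 - 0.5 / math.sqrt(2))

# Malus-type joint probability: half of sin^2(theta) at theta = 22.5 degrees
# (the 22.5-degree offset is an assumption, not stated explicitly above)
malus_prob = 0.5 * math.sin(math.radians(22.5)) ** 2

# Both come out to roughly 0.0732, since sin^2(22.5deg) = (1 - 1/sqrt(2))/2
print(susskind_term, malus_prob)
```

The identity follows from the half-angle formula sin²(22.5°) = (1 − cos 45°)/2.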

Next, the bad news: I could not get 0.707 for a Bell experiment using Malus. In fact, using my newer gyroscopic model for electrons/photons I could only get r=0.375 ish. What you have posted about dynamic hidden variables also agrees with my gyroscopic model. The dynamic model of particle motion throws out CFD (counterfactual definiteness). Why only r=0.375? Because the dynamic model is even worse (more fuzzy, and with even more variation to attenuate the correlation below 0.5). (Not sure why the value is 0.375; I have not reported this value in a paper.)

Next, the good news (in my opinion at any rate) is that I get 0.707 for Bell if I use Malus with retrocausality thrown in. Retrocausality caused by antiparticles moving backwards in time.

Malus + retrocausality => Bell r=0.707

Retrocausality eschews the need, if there ever was one, for entanglement.
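For what it’s worth, the 0.5-versus-0.707 gap is easy to reproduce numerically. Here is a minimal Monte Carlo sketch of a naive deterministic Malus-style local model (my own illustration under assumed uniform hidden polarisations, not the gyroscopic model), compared with the quantum prediction:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
lam = rng.uniform(0.0, np.pi, n)   # shared hidden polarisation per pair
theta = np.pi / 8                  # 22.5 degrees, the usual Bell-test offset

# Deterministic local model: each side outputs +/-1 by a Malus-style sign rule
A = np.sign(np.cos(2 * lam))             # analyser at 0 degrees
B = np.sign(np.cos(2 * (lam - theta)))   # analyser at 22.5 degrees

E_local = (A * B).mean()   # local-model correlation, about 0.5 at this angle
E_qm = np.cos(2 * theta)   # quantum prediction, 1/sqrt(2), about 0.707

print(round(E_local, 2), round(E_qm, 3))
```

The deterministic sign rule gives the well-known sawtooth correlation 1 − 4θ/π, which equals 0.5 at 22.5°, while the quantum cosine gives 0.707.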

Gill has pointed me at useful papers on quantum cryptography and quantum computers. I am coming to the view that retrocausality can allow both to happen. But it happens, in my opinion, without entanglement but with the power of statistics to jointly manipulate unknown separate states in optical fibres.

Best

Austin Fearnley

aka ben6993

Austin: utmost apologies, your comment was in the spam folder. I’m sorry, but I can’t see why. If you have lots of hyperlinks it can end up there, or some dodgy “medical” words. It’s a shame your papers are on viXra rather than the arXiv, but not surprising. I will have a look. For now I have to say I’m not fond of the retrocausality I’m afraid. I take a very realist plain-vanilla stance when it comes to my thoughts on physics, which, it would seem, is in essence William Kingdon Clifford’s Space Theory of Matter. I plan to write about the delayed choice quantum eraser at some point. Meanwhile see https://physicsdetective.com/the-electron/ for what might be some useful hyperlinks, including the Williamson / van der Mark paper that describes the electron as a photon in a closed path. So IMHO if Malus’s law explains the experimental results for light, it will do the same for electrons and positrons, especially since the positron is a time-reversed electron. That’s what the second image in the above article is attempting to portray. Both the electron and the positron are light in a spin ½ double-loop path with two orthogonal rotations, but the positron is “going round the other way”. I reversed the animated gif to go from one to the other.

John, no apology needed. I had a second try at reCAPTCHA after the first one timed out. Maybe that caused a problem.

I have become accustomed to vixra: 15 papers there. As I am not a physics professional, it causes me no offence or embarrassment. For me it is a quick and useful service to datestamp my ideas. And I can still read papers by professionals on arxiv. I also do not think my papers would look good on arxiv. I would need to do a Gauss on my papers by removing the scaffolding to make them more professional and obscure enough to look right in that setting. Quite the opposite of what I want.

I followed your link on the electron, and Williamson, and realised that I have very recently seen a video conference this month by Vivian Robinson on the work of Williamson and van der Mark

https://www.youtube.com/watch?v=pMohlK94120

So that is very new for me. My own ideas on electron and photon structure are in my papers on my preon model where I construct all elementary particles from four preons.

Just throwing in some retrocausal speculations about Möbius closures… Joy Christian uses an S^3 model with his old analogy of skaters dancing in the reverse direction on the far side of the Möbius strip. He met some opposition from claims that he was illegitimately using εijk=1 and εijk=-1 within the same Clifford calculation, which ought only to use one trivector setting.

I am looking at this differently by letting the dancers keep their spatial metrics but reversing their time metrics, with the same overall effect. This could be a time restraint constricting a particle in this loop. And if the loop is broken you could release two particles, one with a +ve time direction and one with a -ve time direction. Probably into dS and AdS spaces coexisting alongside one another, like rationals and irrationals coexisting on the real line. G ‘t Hooft has an interesting idea online about a particle passing through a Black Hole instantly and coming out on the far side with an opposite time direction. I speculate that the particle experienced time going into the BH but experiences an anti direction of time coming out. Times cancel, appearing to an outside observer to have no total time elapsed. So I would say that BHs do have interiors, but maybe have opposite time directions on opposite sides. Very far fetched.

In my retro model for Bell, the observer of the antiparticle measurement is using that measurement as a polarising filter for a Malus experiment. The other observer measures the partner particles in a second measurement which is the analyser filter. So it is pure Malus experiment in practice, not Bell. Retro action means that the only particle polarisations within the normal flight paths of the Bell experiment are +a, -a, +b and -b. This again is a pure Malus setup and allows r=0.707. Without retro, the particle polarisations take all random values, and this limits r to 0.5. Removing retro removes the action of the polarising filter. Malus is dual in my mind to Bell, but without retro I cannot think how to make it provide r=0.707.

Austin

Austin: noted re the papers. I will take a look at them sometime. And thanks re the Vivian Robinson video: https://www.youtube.com/watch?v=pMohlK94120 . I know Vivian, he was in a “Nature of Light and Particles” discussion group I was in. I bowed out because I just didn’t have enough free time.

.

Noted re the preons. I’m not fond of them I’m afraid: https://en.wikipedia.org/wiki/Preon. I very much dislike point particles. Back in the 1920s Schrödinger and other “realists” talked about the wave nature of the electron, and described it as a wave in a closed path. But IMHO the Copenhagen school adopted Frenkel’s point-particle electron to spite the competition, and it was all downhill from there.

.

I have no issue with skaters dancing in the reverse direction on the far side of a Möbius strip. Or with reversing their time metric. That’s essentially what I did when I reversed the animation in the second image in the above post. The thing on the left depicts an electron, the thing on the right depicts a positron. The positron is a “time reversed electron”. But it isn’t travelling backwards through time. The energy flow moves through space. It’s going backwards when compared to the electron. But that energy flow is in essence a photon regardless of which way it’s going. Hence when you annihilate the electron with the positron, you usually get two photons. That’s all. There’s no evidence of any particles going back in time, and no evidence of Anti-de Sitter space either. I looked into the history of that sort of thing in https://physicsdetective.com/a-compressed-prehistory-of-dark-energy/. As for ‘t Hooft’s idea of a particle passing through a black hole and coming out on the far side with an opposite time direction, I’m afraid I take a similar view. I take note of Einstein saying a gravitational field was a place where the speed of light is spatially variable. This is backed up by the hard scientific evidence of NIST optical clocks, which go slower when they’re lower. There is no actual thing called time being measured by such a clock. It goes slower when it’s lower because light goes slower when it’s lower. I would urge you to read https://physicsdetective.com/articles/articles-on-gravity-and-cosmology/ to avoid spending time on fruitless speculative avenues.

I don’t see any issue with the polarising filters giving a Malus Law result. When you have light going from left-to-right through two polarizing filters, the first filter A means you then have polarized light going into the second filter B. In the Clauser & Freedman experiment, the calcium cascade emitted polarized photons in opposite directions. One photon goes through filter A, the other goes through filter B. The first photon is just a time-reversed version of the light that went through the first filter.
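Malus’s law itself is uncontroversial: the fraction of already-polarised light passing a second filter at angle θ to the first is cos²θ. A minimal sketch (the function name is mine, for illustration only):

```python
import math

def malus_transmission(angle_deg: float) -> float:
    """Fraction of polarised light passing a second polarising filter
    set at angle_deg to the polarisation axis (Malus's law: I/I0 = cos^2 theta)."""
    return math.cos(math.radians(angle_deg)) ** 2

# Parallel filters pass everything, crossed filters pass (essentially) nothing,
# and 45 degrees passes half.
print(malus_transmission(0), malus_transmission(45), malus_transmission(90))
```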

.

Apologies, I have to go. I will look at your other comment when I can.

Now that I have lost belief in entanglement I see the space metric knitted from disjoint regions of εijk=1 space and εijk=-1 space..

But there’s no experimental support for that, Austin. I am now very sceptical of speculations which have no support.

.

antiparticles are travelling backwards in thermodynamic time..

I’m sorry Austin, but when you think some more about time, I am sure you will no longer believe that there’s any kind of travel through it. See https://physicsdetective.com/the-nature-of-time/

.

Also, which you will not like, is that εijk=1 and εijk=-1 space regions may need to be connected by a Rosen-Einstein bridge for two such regions to interact..

I don’t like it. It’s just a fantasy. I’ve referred to the Einstein-Rosen paper in the article on time I mentioned above.

.

So I am not averse to the idea of microscopic black holes on a chip. But if they exist, then they are ten-a-penny and not worth any ridiculous hype..

I am, because most of what you read about black holes is ridiculous hype.

.

CCC leads me to Penrose having only photons left over at a CCC node. That might fit in with photons being integral to electrons as in the Williamson et al model..

The important point is that Williamson and van der Mark were not the first people to come up with the idea of a wave in a closed path. Louis de Broglie had a brief paper ( https://sci-hub.tw/10.1038/112540a0 ) published in Nature in 1923. That’s where he said “the wave is tuned with the length of the closed path”. Check out the Wikipedia article on the toroidal ring model: https://en.wikipedia.org/w/index.php?title=Toroidal_ring_model.

.

But it also leads me to think that pure energy is also being over-hyped or over-used. One cannot in my opinion obtain matter from pure energy..

I’m afraid that’s what E=mc² is all about Austin. A photon is pure energy, because it’s a wave. Take all the energy out of a wave, and it no longer exists. But if you don’t, you can make matter out of it.

.

My preon model constructs all fundamental particles using just four preons….

I’m sorry Austin, I just can’t empathise with your preons. Or with gluons. Or the Higgs boson, or the W or Z boson, or weak isospin. If you read more of the articles here, you will appreciate that this is not some casual decision.

I have a few ideas which, because they do not conform to standard theory, may possibly be relevant themes for this site?

Starting with entanglement, which we already considered.

When I retired and took to learning physics again, after a 38-year break, I came to praise Caesar, not to bury him. I even had my own idea from Rasch measurement theory that entanglement could be responsible for creating the space metric. And I still fully support the CCC model (I have a vixra paper on this) where the metric of space breaks down at a CCC node. Now that I have lost belief in entanglement I see the space metric knitted from disjoint regions of εijk=1 space and εijk=-1 space. These trivectors should not coexist locally, and yet they must if I am to claim that antiparticles are travelling backwards in thermodynamic time. I also believe that these εijk=1 space and εijk=-1 space regions underpin current experimental results for Bell, quantum computing and quantum cryptography.

Another thing, which you will not like, is that the εijk=1 and εijk=-1 space regions may need to be connected by an Einstein-Rosen bridge for two such regions to interact. So I am not averse to the idea of microscopic black holes on a chip. But if they exist, then they are ten-a-penny and not worth any ridiculous hype.

CCC leads me to Penrose having only photons left over at a CCC node. That might fit in with photons being integral to electrons as in the Williamson et al model (though not in my own preon model). Or even with everything being made from photons. But it also leads me to think that pure energy is being over-hyped or over-used. One cannot in my opinion obtain matter from pure energy. I am against just getting any old types of matter from an input of a given quantity of energy. At a CCC node the universe has no metric and therefore can reset at a spatial point. But the CCC-node universe is not made of pure energy; instead it is made of photons. So that leads me to my preon model. [Note: I admit that I rather like BECs and am unsure where they sit with respect to entanglement. The CCC metric loss however arises because of loss of fermions, not because of BEC entanglement.]

My preon model constructs all fundamental particles using just four preons (with details in my vixra papers). I have modeled many interactions, and all interactions preserve preon content. Count the preons into an interaction and they are all counted out again afterwards. I used the known decay paths set out in the particle bible. It was always possible to construct the decay paths using preons. If pure energy were sufficient, there should be no need to list a range of decay paths, as anything could happen. I was also able to model leptoquark structures in terms of preons using the principle of conservation of preons in interactions.

To construct fundamental particles, especially the gluons, from preons I had to cope with superposition, which is the only slightly less evil sister of entanglement. And equally non-existent. I coped with superposition by manipulating the three particle generations and increasing the number of preons per generation. That made sense, as higher generations of fermions are the heavier particles, so maybe more preons are in them. This led me to making three generations of bosons: 1st photon; 2nd Z; and 3rd gluons (and higgs). If a gluon can enable interaction 1 OR interaction 2, then that gluon’s contents can be such that taking preons to enable 1 does not leave enough left over to carry out interaction 2. But if another gluon had more of the necessary preons then it would have enough to carry out BOTH interactions together. So one can build up supposed superpositions by design of gluon structure.

Another problem is that one can use field theory to explain all interactions, but one cannot do this with a particle theory. So how did I manage with my preon model, which is a particle model? The problem lies with weak isospin, which is not conserved in interactions, not even when an electron emits a photon. Note that weak isospin (+1/2 or -1/2) is the only quantum property of the higgs field.

To make a particle model that coped with all interactions I had to create a 1/4-higgs in generation 1 and a 1/2-higgs in generation 2. The normal higgs is generation 3. So the normal higgs has four times as many preons as the 1/4-higgs. I assume that the 1/4-higgs is very light in mass, as it is involved when an electron emits a photon. The preons in the higgs supply the preons for the emission of the photon, rather than any particle being made from pure energy.

Another pet hate is supposedly superposed particle-antiparticles, e.g. https://www.ox.ac.uk/news/2021-06-08-subatomic-particle-seen-changing-antiparticle-and-back-first-time

All such supposed morphs can be explained with exchanges of preons as in a normal interaction. But if I eschew superposition then how do we explain interference effects? My electron has four preons but also has 96 hexarks (turtles all the way down) and it is really ‘any integer n’ times 96. So plenty of scope there for interference effects.

I am just going through Mr Tompkins’ magical world, quickly, and I do not accept that a particle tunnels through an impossible hurdle without changing its form first in an interaction. And then back again. [Mr Tompkins is surprised to find his car outside the garage.] Is there anything left …

Austin

Great post, John ! Txs for commenting on my post and getting in touch again. I am swamped by other work, but you seem to do great bringing some sense to the ‘physics’ scene !

Merci Jean, et moi aussi. I am trying to do my bit. Somebody asked me to do an article on quantum entanglement, and I was surprised to find out that it’s just Malus’s Law. There is no substance to it.

I need to write up more properly in a paper my negative attitude to superposition with respect to gluon structure in my preon model, and also neutrino oscillation due to preon exchanges at interactions. And more generally my case against the idea of particle and antiparticle inter-morphing as though it were somehow mystical. When I get the energy to do so.

I don’t think there’s anything wrong with superposition, Austin. Let’s not forget that we have hard scientific evidence for the wave nature of matter. But as for gluons and preons, I’m afraid to say there is no actual evidence for their existence. Despite what people say about gluon jets, nobody has ever seen a free gluon, and the gluons in ordinary hadrons are virtual. Nobody has ever seen particles morphing into antiparticles either. I would encourage you to spend your energy on other matters. Whilst you’re thinking about that, you might like to take a look at https://physicsdetective.com/what-the-proton-is-not/.

.

PS: I’m sorry to have been slow replying to your other posts. I’ve had to do various things this week. I will take a look now.

Detective: thanks for pointing out the issues with entanglement. I think either there needs to be a description of how you can produce the results in the Bell tests without spooky action or nonlocality, or there really is something spooky like entanglement going on… or it really is as simple as just a polarizer, as you suggest.

.

I think Sandra is saying it’s not as easy as saying it’s just the same as a polarizer, and idk if it really matters how the entangled photons were produced in the older papers; there have been more recent experiments with the detectors over a km apart and the results still cannot be explained without entanglement, as far as I can tell… unless it really is just the same as a simple polarizer, but I think there’s more to it. However, when you pointed out the similarity it made me a lot more comfortable questioning entanglement, and I feel like these questions are unwelcome in some physics communities, like… it’s been proven already, stop questioning!

.

Personally, I don’t like entanglement, spooky action, nonlocal or faster-than-light effects, and the curve is similar to polarization, but idk if it’s as easy as saying they are wrong, it’s just a polarizer… I think you are pointing us to something bigger.

.

The next step would be a proof of this, and Sandra seems to believe this is impossible. If you’re good at coding it might be possible to code up a Bell experiment with Malus’s law and see if the results are the same as the Bell tests; then it would be obvious there’s nothing spooky.
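For what it’s worth, a minimal sketch of such a simulation in Python (my own illustration, not anything from the papers discussed here): each photon pair shares a random hidden polarization λ, and each detector independently passes its photon with Malus-law probability cos^2(θ − λ). For the standard CHSH angles this local model gives S ≈ √2 ≈ 1.41, below the classical bound of 2 and well below the quantum prediction of 2√2 ≈ 2.83.

```python
import math
import random

def chsh_local_malus(n=100_000, seed=0):
    """CHSH statistic S for a local model: each photon pair shares a hidden
    polarization lam, and each detector independently passes its photon
    with Malus-law probability cos^2(theta - lam)."""
    rng = random.Random(seed)
    a1, a2 = 0.0, math.pi / 4              # Alice's two polarizer settings
    b1, b2 = math.pi / 8, 3 * math.pi / 8  # Bob's two polarizer settings

    def correlation(ta, tb):
        total = 0
        for _ in range(n):
            lam = rng.uniform(0, math.pi)  # shared hidden polarization
            A = 1 if rng.random() < math.cos(ta - lam) ** 2 else -1
            B = 1 if rng.random() < math.cos(tb - lam) ** 2 else -1
            total += A * B
        return total / n

    return abs(correlation(a1, b1) - correlation(a1, b2)
               + correlation(a2, b1) + correlation(a2, b2))

print(chsh_local_malus())  # ~1.41 for this local model; QM predicts ~2.83
```

Note that the independence of the two pass/fail decisions is doing all the work here; any simulator that reproduces the full entangled cos 2Δ correlation has to drop that assumption somewhere.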

.

At least that’s my two cents, I’m in two minds about it, and it’s as clear as mud right now for me. I’ll try to be more humble in my lack of understanding.

It’s a castle in the air, Doug. A myth that has grown over the years. Plenty of people will tell you how magical and mysterious entanglement is. But nobody will tell you that Clauser and Freedman’s photon pairs weren’t even entangled. Or Aspect’s. Zeilinger’s were, but see Analyzer Output Correlations Compared for Entangled or Non-entangled Photons by Gary Gordon. He said

“Our predicted results for this non-entangled photon case are an exact match to those reported in the literature for the analysis and experimental outcomes for the entangled photon case”. That’s your next step, it’s already been done. But none of those authors I listed at the end of the article ever get any air time. “It’s just a polarizer” isn’t going to feature in Nature any time soon. They won’t print any electron papers either.

When I get some free time I’ll code up a simulator where you can stick in any equation you want for the off-axis measurements, cos^2, linear, whatever. But I have a feeling Bell’s theorem is famous for a reason and I will always just get back the classical line unless I include entanglement. We will see.

.

Crazy papers get published all the time, I assume there would be a lot more papers talking about this if it was all baloney.

.

Looking forward to making the plots myself.

Yes, crazy papers get published all the time, like the Wormhole publicity stunt, see Woit and Nature. But if you were to write a non-crazy paper explaining why, after considered research, you thought quantum entanglement was merely Malus’s law in action, your paper would never see the light of day. Do not underestimate the strength of vested interest.

Doug: Let’s forget polarizers for a second. What we need is some property, or combination of properties, which explains our experimental results. Our tests show that whenever we measure our entangled particles (electron spin, photon polarization, you name it) WITH THE SAME measurement type (i.e. same angle, or same spin orientation) we ALWAYS get perfectly correlated results for both particles, no matter how far apart they are (so if we are talking photons we’ll measure the same polarization; if we are talking electrons we’ll measure opposite spins).

If we instead choose 2 different measurement types for each partner particle, our result will not be the same every time, but will depend on the relationship between those two measurements. Since for both electron spin and photon polarization the difference in our apparatuses is a certain ANGLE, the correlations between partners will depend on that angle. It turns out to be a cos^2 dependence.

Now the big problem: how can the particles tell what setting the two apparatuses are when they are created? How can they tell “we’re going to give the same result, because both measurements are the exact same” from “the measurements are different, let’s follow a cosine rule!” before they reach their destination?

You have two options. 1) The particles don’t know anything in advance, but communicate to their partner what their measurement is at the moment of detection. This clearly implies FTL.

2) The particles agree to follow a pre-determined plan, so that they can be consistent when they finally know how they will be measured. Let’s expand 2) a bit further, taking back polarizers into consideration (just for simplicity of argument, but it can be ANY measurement type).

If the plan is “let’s always give the same result”, they will fail the cosine rule when the two measurements are different. If the plan is “let’s decide NOW, we’ll adapt at the moment of measurement” (let’s say they both choose to have vertical polarization for simplicity): if the particles find two vertically polarized measurements, GREAT! They give the same result. If the two angles are different? GREAT! We’ll get the cosine rule, because they can choose in advance that for each angle there will be a different probability to pass, and this will result in a cosine distribution of probabilities. But wait, what happens if the two detectors are not vertically aligned (and neither at 90° from the vertical), but are at the same angle as each other? The two photons should always give the same answer, but in this scenario they must decide on a cosine probability. Which means that at least some of the time the two photons will give different answers (because only cos^2(0) = 1 and cos^2(90) = 0 are certainties).

Every other possible plan you can conceive is a variation of these two.

There is NO WAY for the two particles to choose a consistent plan dependent on angle that is in accordance with the experimental results. Quantum entanglement is not reducible to Malus’s law, because not all measurement types are polarization experiments*. But we do tend to show entanglement in action with polarizers, simply because it’s the easiest experiment we can do. The macroscopic behavior of a polarizer is a CONSEQUENCE of the probabilistic distribution of each single photon, so it’s rather weird to say “quantum entanglement is just Malus’s law in action”; that’s having it backwards.

*a different kind of entanglement is involved in momentum measurements. Entangled partners always have momenta of opposite sign, whose sum will always equal a known constant. And that does not depend on an angle.
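Sandra’s counting argument can be made concrete with a three-angle toy version (my own illustration in Python; the specific angles are my choice, not anything from the thread): give every pair a pre-agreed pass/block answer for settings 0°, 120° and 240°, so equal settings always agree. Enumerating all eight possible plans shows that two different settings then agree at least 1/3 of the time, while the cos^2 rule demands cos^2(120°) = 1/4.

```python
from itertools import combinations, product

# Three polarizer settings 120 degrees apart. The cos^2 rule predicts
# that two DIFFERENT settings agree with probability cos^2(120) = 0.25.
pairs = list(combinations(range(3), 2))  # the three distinct-setting pairs

# Enumerate every pre-agreed plan: pass (1) or block (0) for each setting,
# and find the plan with the LOWEST agreement rate across distinct settings.
worst_case = min(
    sum(plan[i] == plan[j] for i, j in pairs) / len(pairs)
    for plan in product([0, 1], repeat=3)
)
print(worst_case)  # 1/3: every plan agrees at least a third of the time
```

Since 1/3 > 1/4, no assignment of pre-existing answers can match the cos^2 statistics at these angles, which is Sandra’s (and Bell’s) point in miniature.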

Hi Sandra: Thanks so much for the detailed explanation. I’ve seen it derived and read similar explanations to yours above, although I think yours has been the best so far. Unfortunately it is still muddy in my mind. I can follow the math for Bell’s inequality but I’m an experimentalist at heart and statistics are not my favorite. Usually I will write a sim in matlab to see it in action with some visualization and then it becomes clearer.

.

“If the plan is “let’s always give the same result”, they will fail the cosine rule when the two measurements are different.” —- this is the part that I need to expand upon and see in action. I’m assuming it’s obvious to mathematicians and it will be obvious to me after I see it running in code. I’m also assuming this has been done 100s of times by many grad students who didn’t believe it at first, and the result is always the same: “damn, guess the Bell tests ring true”.

.

So do you have an opinion about what is actually going on? As far as I remember the wave function doesn’t represent reality at all, it’s just the probability. Do you believe there is an underlying reality, or that we really can’t go any further than the wave function to describe anything? In another comment it sounded like you think it is FTL effects; if so, is it just a cruel joke that there’s no FTL communication possible? To me, it seems like either there is FTL and somehow there will eventually be FTL comms, or there isn’t this spookiness and there is a workaround of entanglement somehow. And yes, I know this classical thinking of the universe goes against the results we have so far.

Hi Doug, no need to do any experiments; actually there were experiments done with non-entangled photons, and also simulations, and they show the same thing, because this is just classical physics. The papers are available but like John said they do not see the light of day.

I will repeat what I said in reply to Sandra in the other topic:

The crucial fantasy QM has is that there is something like a point particle. There is no evidence to support this, not even the famous double slit. What is always observed (measured) is interaction, at a certain quantized level.

For photons and their interaction with polarizers there is a very basic experiment that shows this clearly:

https://www.youtube.com/watch?v=0-8tQlOhCBA&ab_channel=UMDPhysicsVideos%21

you cannot explain these results if you hold on to the idea that light consists of particles that go through or not…

The thing I struggle with is the amplitude of these wave packets. I have also read that the amplitude of photons is constant (which in itself is very intriguing and must have some underlying cause; my personal guess is this is related to the “structure” of the aether), but in order for the intensity of light to change through a polarizer something needs to decrease, and it is not the frequency… so for me it seems the amplitude will be affected.

Perhaps @John that could be something interesting to explore in the future (I know you have touched upon the amplitude of a photon but I think more is needed, and I am not convinced it is constant). This is very important for the way all other elements are built: electron , proton (I am very charmed by the idea the proton is like a trefoil, my favorite shape)…

One last thing, I do think entangled particles are real, i.e. if there are events that create 2 particles at the same time it is very logical that both particles have some properties opposite to each other (which balance to zero, or the same as before the event). However these properties are then fixed and intrinsic: whatever you measure in the future will always show opposite values. The QM interpretation that the property is only there at the moment of measurement (and therefore you need instant change for the other) is pure nonsense, illogical, and pure belief (that is why I refer to mainstream science as the new religion…) which not a single experiment has ever demonstrated (of course after you analyse how they did the experiment, which John so expertly did for us). All the best, peace and love, Raf

Raf: apologies, your comment was in the spam filter. I’m afraid Akismet can be overly zealous at times. But I’m relieved that you were able to re-post it. I agree that there are no point particles. I find it amazing that serious physicists talk about point particles despite the clear evidence for the wave nature of matter. As far as I can tell it was promoted by the Copenhagen school to counter the realist guys like Schrodinger, who talked about waves in closed paths. I watched the video: https://www.youtube.com/watch?v=0-8tQlOhCBA&ab_channel=UMDPhysicsVideos%21. I think it’s a shame he talked about separate E and B fields, and didn’t say the polarizers alter the polarization of the light. It’s me who has said the amplitude of all photons is the same, and that action h can be expressed as momentum x distance because there’s a real distance involved. I don’t think the photon amplitude can be altered in the lab. So I don’t think a polarizer can change the amplitude of the photons. When the intensity of the light is reduced, I think it must be because the number of photons getting through is reduced. I’m glad you like the trefoil proton. But sadly I don’t think you’ll be reading any papers about that in Nature any time soon. I too think that the properties of particles are real, and I reject the QM interpretation that says measurement of one particle instantly changes the other particle. That might be mainstream, but it’s a castle in the air with no foundation. It’s currently held up by hype and hot air, but it can’t stay aloft forever.

Hey Raf: if there are a few papers saying there is no entanglement and thousands saying there is, it seems illogical to believe just the ones you want. Did you go through every line of math in those papers to make sure there’s not an error? I didn’t, so just because it’s published doesn’t mean it magically negates all the other papers, or that there aren’t errors.

.

The triple polarizer just shows how polarizers work; LCD screens are fun polarizer experiments too. I don’t see how either of these proves anything else.

.

If there is a work around for entanglement, and I’m hoping there is, nature probably would publish it, they love flashy stuff that just touches the surface.

.

John has inspired me to play with it more and maybe nobody has delved into it deep enough, but my confidence is waning after rethinking my initial confidence that it’s been overlooked and there’s a simple workaround.

.

P.S. I am a gullible person, I used to think quantized inertia by McCulloch was a good theory, and since I’m not in physics I have the freedom to believe in crazy things without pushback.

Doug: there really is no need for math or simulations, it’s very simple once you get it.

.

“If the plan is “let’s always give the same result”, they will fail the cosine rule when the two measurements are different.” by this I mean that if the photons were entangled in such a way as to always pass BOTH polarizers, or get blocked at BOTH, they would reproduce experiments where the polarizers are at the same angle, but not when they are at different angles, where you can have only one of them pass the test.

.

With the second plan you can reproduce experiments where the angles are different, but you won’t reproduce the results when the angles are the same, because sometimes you’ll get different answers at the two detectors. These two plans are complementary, and yet inconsistent with one another, just like possible local hidden variables.

.

I don’t think FTL is involved. My belief is that we’re misrepresenting the fundamental structure of our universe, but how is the big question. Extra dimensions? Many worlds? Time is an illusion? Who knows. All I know is that so far everyone that claimed to have the answer turned out not making anything of it.

.

Raf: again, Bell’s argument has nothing to do with particles, waves, QM, GR, or whatever. It’s a logic argument. It takes what we know from the results of experiments and rules out those explanations that are not consistent with those results. The only fundamental assumption is that the subject of measurement is a local entity, meaning that it can only be influenced by its immediate surroundings, limited by the speed of light (in this sense even waves are local). Bell’s theorem says this assumption is NOT consistent with experiment. That’s it.

.

“but in order for the intensity of light to change through a polarizer something needs to decrease, and it is not the frequency… so for me it seems the amplitude will be affected.” It’s the number of photons making it through the polarizer that determines the final amplitude.

.

“entangled particles are real ie if there are events that create 2 particles at the same time it is very logical that both particles have some properties opposite of each other (which balance to zero, or same as before the event). however these properties are then Fixed and intrinsic.” They are not fixed and intrinsic, at the very least not for each particle separately. The reason is given by my reply to Doug above: if you assume so you end up in contradiction with experiment, either predicting correctly for equal angles and wrongly for different angles, or vice versa.

Hi Sandra, thank you for the reply; I am still confused about what the output after the polarizer really is. The confusion is caused by the different pictures you can find related to this.

So in your view the outcome is probabilistic: a polarized photon (in the direction of the filter) will come out (or not) with a probability corresponding to the angle? What happens in this case is that the photon that passed has changed polarization (it was aligned by the filter), but all other properties stay the same. The photons that did not pass were absorbed in the filter. When there is another filter at a different angle after that, it is the same story, with a probability related to the relative angle between the filters. OK, that explains the 3-filter experiment, and is consistent with Malus’s law.
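That picture can be checked in a few lines of Python (my own toy model, assuming ideal polarizers and the per-photon probabilistic rule described above): each photon passes a filter with probability cos^2 of the relative angle, and comes out re-polarized along it. The sketch reproduces the 3-filter numbers: about 12.5% of an unpolarized beam survives 0°/45°/90°, while 0°/90° alone blocks everything.

```python
import math
import random

def transmit(filters, n=100_000, seed=1):
    """Fraction of photons from an unpolarized beam that pass a sequence
    of ideal polarizers. Each photon passes a filter with probability
    cos^2(delta) and comes out re-polarized along that filter."""
    rng = random.Random(seed)
    passed = 0
    for _ in range(n):
        pol = rng.uniform(0, 180)  # random initial polarization
        ok = True
        for f in filters:
            if rng.random() < math.cos(math.radians(f - pol)) ** 2:
                pol = f            # re-aligned by the filter it passed
            else:
                ok = False         # absorbed
                break
        passed += ok
    return passed / n

print(transmit([0, 45, 90]))  # ~0.125: the middle filter lets light through
print(transmit([0, 90]))      # ~0.0:   crossed polarizers block everything
```

On this model the intensity drop is entirely a photon-count effect: each surviving photon is unchanged in amplitude, and only the number getting through falls.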

However there is also another option which I was describing, namely that the amplitude of the photon is reduced, and that the component perpendicular to the filter is absorbed. I am still considering this as an option, not sure if this can be fully ruled out… because the probabilistic effect comes back in the measurement of this weaker photon… with exactly the same outcome as in the first explanation. I would appreciate if someone could guide me to something that can rule this out (because remember the photoelectric effect does not depend on the intensity, it is the frequency that matters). Also I mentioned this before but I really would like to know about a single photon detector, this seems technically very challenging…

Now back to Bell: I am afraid I have to disagree with you, but so be it. The assumptions that were made to come up with the formula involve linearity (and also non-correlated measurements), but then you cannot use a device in your experiment that is not linear (that is also the reason why at angles 0, 45, and 90 there is no violation, and there is a correlation between the measurements, described by Malus’s law). So this reference classical curve is not correct.

Edit: John not Jeff, so sorry John!! Your blog obviously inspired basically everything I wrote in my previous post.

No problemo Doug. But as for extra dimensions, many worlds, and time is an illusion: oh FFS, when will the pseudoscience woo ever stop? As for Bell’s argument, it’s straw-man smoke-and-mirrors sleight-of-hand. It was always meant to be confusing, so you couldn’t understand it. Peddled by an opportunist bullshitter out to make a name for himself by saying Einstein was wrong. See the fourth diagram in my article? The blue rectangle? Bell’s argument is merely an “inequality” based on a straight-line chart. Only Malus’s law means there is no straight line in the experiments. In similar vein there is no FTL communication because there is no FTL. There is no magic, there is no spooky action at a distance, there is no entanglement. The emperor has no clothes Doug, despite what they all say.

I hope you are right… I want to believe… The next step would be proving this mathematically, but I fear it’s not so easy or perhaps, as Sandra says, impossible. And until there’s some proof showing you get the Bell test results without entanglement, it will be impossible to convince the wider community even if it is in fact true.

.

Hopefully it is just as you say, but I think it would be something we haven’t thought of, something you alluded to. Perhaps a rotation about an axis perpendicular to the direction the photons/electrons are travelling, correlated at the source, that somehow produces the results we see in the experiments.

PPS, and obviously then the black hole is no longer black. This should have been at the end of the long post I just made earlier.

Just in case my long post didn’t go through that I refer to with the PPS above, it is copied below, hopefully I don’t create a double post John!

Hi Sandra: Interestingly enough, I still don’t get it. And now since you, John and Raf are all saying don’t waste time by writing a sim, I definitely will. To me, if the difference in angles is theta, it would seem like the correlation difference being cos^2(theta) would follow logically, and not the linear relationship that is the ‘classical result’. So I’m clearly missing something that will be obvious in hindsight. Writing the sim will clear things up because I can see how everything works and will have to write everything down myself and can implement entangled or classical states to see the difference.
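On the linear-vs-cos^2 point, here is one way a sim usually makes it visible (a toy of mine, not the simulator Doug is planning): in a deterministic local model where each detector simply checks whether the shared hidden polarization lies within 45° of its own axis, the correlation falls off linearly with the angle between the detectors, while the quantum/Malus prediction is cos 2Δ. That linear fall-off is the “straight-line chart” classical result.

```python
import math
import random

def local_E(delta_deg, n=200_000, seed=2):
    """Correlation E(delta) for a deterministic local model: both detectors
    see the same hidden polarization lam; a detector at angle t reads +1
    when lam lies within 45 degrees of t (mod 180), else -1."""
    rng = random.Random(seed)

    def read(t, lam):
        d = abs((t - lam + 90) % 180 - 90)  # folded angle in [0, 90]
        return 1 if d < 45 else -1

    total = 0
    for _ in range(n):
        lam = rng.uniform(0, 180)
        total += read(0, lam) * read(delta_deg, lam)
    return total / n

for d in (0, 22.5, 45, 67.5, 90):
    print(d, round(local_E(d), 3), round(math.cos(math.radians(2 * d)), 3))
# The local model comes out linear in delta (E = 1 - delta/45);
# the quantum prediction is cos(2*delta).
```

The two curves agree at 0°, 45° and 90° and disagree in between, which is exactly where the Bell-type comparisons are made.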

.

I dislike extra dimensions, really hate many worlds (although it’s interesting, it’s basically the opposite of original QM: it completely believes the wave function is real) and think we live in a 3d universe such as the one discussed on this blog, although entanglement is still a thorn. Let me go on a rant that’s been in my mind for a while; perhaps you will find it interesting, amusing or most likely confusing, since I’m just typing it here without any diagrams.

.

I think rewriting all the equations as a universe made of a non-Newtonian fluid, where hitting it softly it springs back slowly (long radio waves/low freq.) and hitting it hard it springs back hard/fast (gamma rays/high freq.), could be a start. I like the idea of pair production in this blog. I’m not convinced the Möbius strip or trefoil knot are correct (they might well be, but idk), but a closed-loop wave spinning so that it looks like a sphere over a full rotation is a nice idea for the electron. If the mass/energy is equally distributed but the E field is like a cosine, then the exact shape of the photon and the knot shape it makes when it’s an electron would determine the gyromagnetic ratio. If the actual equations were known, or inferred somehow, and you could simulate an electron as a closed-loop photon, then you could just keep pumping up the energy and see if there are islands of stability at the same mass ratios of electron/proton and semi-stable states for things like the muon/tau (and also calculate the gyromagnetic ratios for all of them). After this, things like the idea of the neutron presented in this blog might be able to be addressed.

.

I also think the goal is basically a single field (E), so a photon is a wave in E with a rotation (a changing curl in E instead of using B) along the axis of travel. This rotation in E helps in particle production, since in some respect it can help each half of the photon spin together into electrons and positrons.

.

So then a spinning electron is shooting out ‘virtual photons’, waves of electric field that push other electrons away and pull positrons towards them. And then B is really an average over a full cycle of the electron frequency; the E field is actually spiraling outwards from the spinning electron, and it is not an isotropic field (except over an entire rotation of the electron). Imagine a circle on the screen spinning around a north-south axis, the classic electron picture with the magnetic field pointed north-south. For it to be isotropic over a rotation, the E field at the top is very weak while the E field at the sides is much stronger. So directly above it, along the magnetic field axis, it is like a corkscrew of electric field radiating upwards. Now imagine these electric field waves spiraling outwards in this corkscrew fashion from the electron and hitting another electron above it: one side gets hit first, then the other, and how this will affect the electron above depends on that electron’s orientation, since it’s spinning too and isn’t isotropic. I assume this is what produces the B-field effects, which are actually caused by this finer structure of the asymmetry of the waves emitted by the spinning electron on timescales of the electron frequency.

.

If that is true and can be described, then if I understand the idea of magnetohydrodynamics, we can infer that gravity acts in the same way (i.e., we can’t get frame dragging etc. unless it’s also like the B-field effect described above). Perhaps photons do feel an E field, where one half of the photon is pushed away and one half is pulled, but it’s going to be on the timescales of the electron frequency (or higher frequencies for the protons), so we cannot detect this except as the average results over time: a decrease in the speed of the photon, or gravity. This also leads to the idea that you can never hit a zero speed of light, because then the spinning electron doesn’t spin, doesn’t emit waves, and so doesn’t contribute to slowing down photons (i.e., doesn’t contribute to gravity). If studying objects very close to a black hole shows a deviation from GR, this could be a sign that corrections like these are needed. And obviously the black hole is no longer black.

.

Don’t know if this all makes sense in this post, but it’s clearer in my mind, where I can picture things that I don’t show here.

.

P.S. John, I also like the idea of the neutrino in this blog and dark matter not actually being matter at all but just a feature of the empty space in that area. Perhaps Sandra has other opinions on dark matter.

Doug: please do a simulation. I think simulations will tell us something important. For a start, there is no clear “official” description of what a photon is. I’ve tried to describe it here, but I’m not sure I’ve got it right. I do however think the electron and proton descriptions are generally correct, because of the g factors. Credit to Williamson and van der Mark for the electron. The spin ½ does suggest to me that the photon has a rotation to it. But I don’t think an electron is shooting out any virtual photons. I think electrons and positrons move like vortices. Counter-rotating vortices attract, co-rotating vortices repel. See the first picture in https://physicsdetective.com/what-charge-is/. Note the grey arrows denoting linear electric force and rotational magnetic force. The electromagnetic field of the electron looks like a standing field, but it’s really a field variation, a wave, wrapped up Möbius style. You have to cut a long thin sinusoidal strip of paper to really “get it”. Then form it into a Möbius strip. It goes round twice, and the thin parts combine with the thick parts such that it’s the same thickness all round. In a similar vein, the electron’s field is isotropic. The field is only spiralling outward because it’s like frame dragging. Take a look at some pictures of the gravitomagnetic field. The electron goes round in circles in a uniform magnetic field because of Larmor precession, which is akin to gyroscopic precession. There’s nothing coming out of the magnet, and there’s no field of force pushing the electron around in circles. It does that itself, like a little motor boat with a stuck rudder. I don’t think gravity is quite the same; I think it’s instead a simple refraction. Light curves downwards because a gravitational field is a place where the speed of light varies. At the black hole event horizon, the speed of light is zero. That’s what Einstein said, and I think he was right. That means black holes are black, like the scientific evidence says. The light doesn’t get out because the upward light beam is motionless. Thanks re the dark matter. Isn’t physics fun!

Doug: the ‘classical’ result stems from assuming a binary hidden property for photons. Each photon would have a hidden variable that can take on only two values, “pass test” or “don’t pass test”, each with a 50% probability. This is analogous to the first plan I proposed, with both photons agreeing on passing or not passing, except it only applies to single-photon experiments. And as I mentioned, this clearly does not result in a cosine rule; hence it’s linear and agrees only at 45°. Which is the whole point of the thought experiment: showing that linear relationships don’t agree with experimental results. The second part of the argument is that even if you assume a non-linear relationship you still get the wrong result (the second plan I talked about).

.

I don’t have a strong opinion on dark matter; we just don’t have enough data to say one way or another. I do like John’s proposal a lot, with it being fundamentally a variable spatial energy density, but that comes with some problems as well, like the redshift relationships (it can still be right, but we’d need to assume the distance ladder is wrong in some other way).

.

.

Raf: aside from the fact that the amplitude of a single photon in a beam is not at all well defined in quantum mechanics, I don’t see why you need to assume a photon is only partially absorbed by the filter. Polarizers are made of atoms, and a photon can only be absorbed by an atom completely or not at all. There IS evidence of partial quantum jumps in electrons, but the energy is always given back as the full photon.

.

.

Let me clear up a final confusion point: the quantization of light is a direct result of the quantization of energy levels in atoms. The electromagnetic field itself is a continuous entity that can take on any value (accelerating charges emit non-quantized radiation). The ONLY way to produce photons is by direct interaction of the EM field with matter. Just like free strings can vibrate in any mode you want, but as soon as you put those strings on a guitar you can only play certain notes.

Guess my post about a single field unifying E, B and gravity was too confusing; I was hoping it would blow someone’s mind.

.

I’ll try to get this posted with figures on a website when I actually have some free time for this hobby project.

.

Thanks for all the inspiration I’ve gotten from this website!

Sorry to be slow replying Doug. I’ve had a tough week.

No worries at all! I know the feeling! Let’s hope for some better weeks ahead

This comment is to both Sandra and John: So I went ahead and made the simulations, it all makes sense now.

.

I think Sandra’s explanation of entanglement was assuming I already knew the basics, which I guess I never got. Seems obvious in hindsight as I thought it might.

.

I will try to explain/handwave. Let’s say a source emits two entangled, vertically polarized photons:

.

Classically, when you measure off axis, at say 30 degrees, there is a cos^2(30) chance it will go through one polarizer.

And if the other polarizer is at 50, there’s a cos^2(50) chance it went through that one. So the chance it went through both is cos^2(30)*cos^2(50) ~ 31%.

.

However with an entangled photon pair, if we measure one at 30, it instantly (spookiness here) slams the other photon to this 30 degree angle. So then the chance of the other one going through the 50 degree polarizer is now cos^2(50-30) -> cos^2(20). So then the chance of it going through both is now cos^2(30)*cos^2(20) ~ 66 %.
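In code form, that arithmetic looks like this (a quick Python sketch of the same numbers; my actual scripts are MATLAB, so this is just an illustration):

```python
import math

def both_pass(theta_a, theta_b, entangled):
    """Chance that both vertically polarized photons pass polarizers
    at theta_a and theta_b (degrees), with or without the 'spookiness'."""
    p_a = math.cos(math.radians(theta_a)) ** 2
    if entangled:
        # the measurement at A 'slams' the partner photon to theta_a,
        # so B's chance depends only on the angle difference
        p_b = math.cos(math.radians(theta_b - theta_a)) ** 2
    else:
        p_b = math.cos(math.radians(theta_b)) ** 2
    return p_a * p_b

print(round(both_pass(30, 50, False), 2))  # 0.31
print(round(both_pass(30, 50, True), 2))   # 0.66
```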

.

And that is an easy, handwavy way to explain entanglement to a dense mind like mine. But I probably still wouldn’t have believed it if I hadn’t coded it up. I guess it matters which polarizer it went through first, so there’s the whole business of Bell’s inequalities, which eventually gets it all down to just theta, where theta is the difference between the angles.

.

I did the Bell test and the CHSH game. I think the game made it clearer to me, but doing the whole Bell test was interesting, since it is more complicated to get the S parameter as a function of four angles.
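The quantum prediction behind that S parameter can be sketched in a few lines (Python here rather than my MATLAB, using the standard photon correlation E(a,b) = cos(2(a-b)), so it’s the textbook formula rather than my actual script):

```python
import math

def E(a, b):
    """Quantum correlation for polarization-entangled photons,
    with analyzer angles a and b in degrees: E = cos(2(a-b))."""
    return math.cos(2 * math.radians(a - b))

def S(a1, a2, b1, b2):
    """CHSH S parameter for the four analyzer settings."""
    return E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

# the standard optimal settings: A at 0/45 degrees, B at 22.5/67.5
print(round(S(0, 45, 22.5, 67.5), 3))  # 2.828 = 2*sqrt(2), beating |S| <= 2
```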

.

Here are the published MATLAB scripts in PDF form; scroll down to see the results and figures. It’s not perfect, since it was only 10^4 iterations.

.

CHSH game

.

Bell test

.

I’ll throw these on my GitHub at some point too. If you are going to give this to lots of people to use/play with, it would be nice if you let me know, so I can put my name and affiliation on the code 🙂

Doug: well done for working so hard on this. And apologies for not replying sooner – I’ve been way too busy (again) this week. But I’m sorry, because like Raf said, your model has the usual assumptions built in. See this?

.

“However with an entangled photon pair, if we measure one at 30, it instantly (spookiness here) slams the other photon to this 30 degree angle”.

The problem with that is that you aren’t just measuring the polarized light. Polarizer A alters the polarization of the light passing through polarizer A. That’s missing from your model. It presumes that polarizer A doesn’t alter the polarization of light passing through polarizer A, but instead alters the polarization of light passing through polarizer B. I wish I had the time to model this like you have done, but I’m afraid I don’t. Sorry.

Doug: think about the situation where one photon goes through two polarizing filters A and B. If it gets through A, the chance of it getting through B is cos²θ, where θ is the difference in angle between A and B. If it doesn’t get through A, you don’t measure anything. The entanglement experiments are similar, but there you have two identical photons moving in opposite directions, one through A, the other through B. If you don’t measure a photon coming out of A you don’t have a pair, so you don’t measure anything. So again you have a cos²θ result.

I used to think this way just last week.

.

You can see my code and modify it as you see fit. You can clearly see in the CHSH game that simple pairs of photons emitted at the source will never allow you to beat the 75% chance.

.

In the Bell experiment, if the photons are vertically polarized and A is at vertical, you don’t get anything interesting; it’s just how you say. It’s the off-axis stuff that beats classical.

.

The classical version you suggest, with the cos^2 put in (you can see it in the code), will never give more than S=2, but it happens with quantum entanglement.

.

It helps to code it up, and then you will see entanglement does something different. Or if you figure a way out of it, then it will easily be publishable. But you can’t just say it, you have to show it.

.

Good luck! I’ll definitely play around with it more myself to see if there’s a way out; that would be great 🙂

.

Aspect did both these experiments, his papers on this are freely available.

Sorry, you’ll have to add the cos^2 classical polarizer yourself to the CHSH game. I did try it, and it’s worse than the 75% classical max anyway. You can add anything you want; lots have tried, and as Sandra said, all have failed. It’s really just an entangled state with spookiness that can beat 75%. Or at least, so far no one has figured it out. It’s much simpler than doing the full Bell test and calculating S over 4 angles, so that’s where I would recommend starting.

Let me try to explain the game; it’s really interesting. Two bits, x and y, both random (0 or 1), are sent to A and B, who can’t communicate with each other. Then A and B send back either 0 or 1, where the goal is xor(A,B) = and(x,y).

.

If A and B always choose zero, they win in every case except where x=y=1. So 75%. There is no strategy of A and B choices that does better. There are 16 options and I run all of them in the script.
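Those 16 options are easy to brute-force (a Python sketch of the same check; my actual script is MATLAB):

```python
from itertools import product

def win_rate(fa, fb):
    """Win rate of deterministic strategies fa, fb, each a tuple
    (output if received bit 0, output if received bit 1)."""
    wins = 0
    for x, y in product((0, 1), repeat=2):
        if (fa[x] ^ fb[y]) == (x & y):
            wins += 1
    return wins / 4

strategies = list(product((0, 1), repeat=2))  # 4 per side -> 16 pairs
best = max(win_rate(fa, fb) for fa in strategies for fb in strategies)
print(best)  # 0.75: no pair of local deterministic strategies beats 75%
```

Shared randomness doesn’t help either, since it’s just a mixture of these deterministic pairs.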

.

If you send vertically polarized photons to A and B along with the x and y, it doesn’t help them beat the game. How could it, unless it includes information about x and y? There’s nothing you can send that would help if it doesn’t say what x and y were.

.

But if you send an entangled pair, and A and B now decide what to send back based on their spin measurement, there’s a way to do it. Basically B is sitting there measuring the spin of the entangled pair, getting spin up 50 percent of the time and spin down 50 percent, and they choose the axis they measure along depending on whether the y they got was 0 or 1. And A is measuring along one axis, also getting half up and half down. But then A decides to measure along a different axis, depending on whether the x they got was 0 or 1. This instantly changes B’s measurements. And this allows them to beat the game and get 85%.

.

It’s insane, but it means that entanglement has spooky instant action across a distance. You can easily comment out the line that changes B’s measurements, the one that slams the entangled bit to the angle A measured it at, and you won’t get 85% anymore… And they have done the experiments and got 85%. So it means there’s no way out and that it’s, as far as we know, an instant effect. Maybe there is some upper speed limit, many times c; we will see, because they keep doing entangled photons across larger distances.
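Here’s a minimal Monte Carlo sketch of that strategy (Python rather than my MATLAB, and it uses the handwavy ‘slam’ picture from above, so treat it as a toy model, not the full QM calculation):

```python
import math
import random

random.seed(0)

A_ANGLE = {0: 0.0, 1: 45.0}    # A's analyzer angle for input bit x
B_ANGLE = {0: 22.5, 1: -22.5}  # B's analyzer angle for input bit y

def play_round():
    x, y = random.randint(0, 1), random.randint(0, 1)
    a, b = A_ANGLE[x], B_ANGLE[y]
    r_a = random.randint(0, 1)  # A's outcome is 50/50 on its own
    # the 'slam': A's measurement fixes the partner photon to A's axis,
    # so B agrees with probability cos^2 of the analyzer angle difference
    p_same = math.cos(math.radians(a - b)) ** 2
    r_b = r_a if random.random() < p_same else 1 - r_a
    return (r_a ^ r_b) == (x & y)

n = 100_000
wins = sum(play_round() for _ in range(n)) / n
print(wins)  # ~0.85 = cos^2(22.5 deg), above the 75% classical limit
```

Commenting out the ‘slam’ (making r_b an independent 50/50 bit) drops the win rate back to 50% or below the 75% bound, whichever strategy you pick.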

.

Blew my mind so much. The game is a really creative idea, and I feel like this proves without a doubt that entanglement is a real instant effect, so I guess locality really does go out the window.

.

So then it seems obvious that information about x from A was sent to B instantly. But since x and y were random to begin with, no useful information was sent. And it’s also been proven that you can’t send actual information unless you can beat this game by more than 85%! And the experiments can’t get higher… Damn.

Hi Doug, thanks for the simulations. I hope it gave you some satisfaction creating them, but you used the same assumptions about the classical method as Bell did, which in my opinion are wrong.

It is no surprise then that you get the same results as all previous experiments (which led to the Nobel Prize…).

The key is that the calculation of the “classical” method makes assumptions which do not hold up in real life, because the polarizers do change the photons and therefore there is a correlation between the polarizers (which is why the QM method uses theta, the difference in angles).

I linked to the video of the 3 polarizers to remind us about this. I know a lot of the explanation in the video is not good (yes John, I agree, that is a typical problem with many of these science videos online; it seems that if you repeat a lie over and over it becomes true…). There is nothing QM about this experiment; it just rules out the whole notion that there is something like a point particle. The reason why your simulation and the Bell equation give the “classical” result is because they treat the photons as particles that go through or not, where the measurement has no effect and there is no correlation. However, the reality is that this is not true with polarisers. There is a correlation between the measurements, and you have to work with theta, the difference in angles, not the absolute angles like you have in your simulation. Two setups are possible: 1. with entangled photons, they have the same polarisation when they leave, so you have to use the difference between this random polarisation and the actual filter afterwards. The expected result from “classical” is exactly the QM one, because it is simply Malus’ law. 2. non-entangled, but both photons go through the same polariser to start with, so they have the same polarisation. You will again have the same result, because in the calculation you need to use the difference between the first and second polariser.

BTW, have you noticed in the last graph of your Bell simulation that there are actually points in the graph that are exactly the same as QM, and there are also points at theta 0 that give 0 for E(a,b), which is clearly wrong! The reason why you have all these points in between is because the formula used for the “classical” is wrong, and honestly I think you accept the truth of that statement too easily.

Raf: “The key is that the calculation of the “classical” method makes assumptions which do not hold up in real life because the polarizers do change the photons and therefore there is a correlation between the polarizers”

.

The whole point of using entangled states is that you get a “copy” of the particle you’re measuring. As you say, the polarizers in a row for the same photon can be explained away as the polarizer changing the state. With entanglement you solve this by only measuring each photon once. The incredible thing is that your measurements will still be affected just as in the single photon case, with the added trouble that your detectors could each literally be light years away from the photon source, and since the choice of measurement angle is independent for each polarizer and random, you get “spooky action at a distance”.

.

.

“There is nothing QM about this experiment, it just rules out the whole notion that there is something like a point particle.”

.

Right, but it doesn’t really give us any hint about the “true” nature.

.

.

“The reason why your simulation and the bell equation gives the result for the “classical” is because they treat the photons as particles that go through or not”

.

Yes, that is the first part of the argument. The second part is that even if you introduce local variables to get Malus’ law, you still don’t get the experiment right. Explaining further:

.

“1. with entangled photons, the have the same polarisation when they leave, then you have to use the difference in this random polarisation and the actual filter afterwards. The expected result from “classical” is exactly the QM, because it is simply the Malus law.”

.

The photons DON’T have a definite random polarization when they leave. If they did, then for measurements where the difference in angle between polarizers is 0, but not aligned with the random “secret” polarization, they would not always give the same result (both pass, or both not pass), because each photon would have an INDEPENDENT, local cos^2(x) chance of passing or not (referred to its difference in angle with the polarizer it encounters). For this to work you’d need the photons to have a probability correlated with the difference in angles of the polarizers, which you can’t know in advance (because it’s random), and the instruments can in principle be space-separated. Which implies spooky action.

.

.

You are too fixated on the meaning of Malus’ law, when I already explained that we use it as an experimental fact to deduce the absorption probability for single photons. The problem is that the photons don’t follow this rule consistently if we assume each has an independent existence. The only way to do it is to consider the two photons as a single indivisible system that can have correlated local effects, but you’ll quickly find out that this implies that this very structure does not exist in physical space but in probability space. I’ll explain this separately in another comment.

.

If I had to sum it up, the gist is “you can’t know the difference in angle between the polarizers faster than the speed of light, but photons apparently can, and this messes our brains up”.

Actually disregard the probability space sentence, I got confused rewriting the comment multiple times and didn’t edit out that part.

Hi Sandra, thanks. I knew you were going to reply to this (“The photons DON’T have a definite random polarization when they leave.”), and I was planning to write something about it in my original reply, but it was already getting too long to include this.

I think we disagree here, and I am on John’s side. For me they do have a definite polarization; everything is sort of defined at all times. The reason for the uncertainty principle is a measurement issue, not because the wavefunction needs to collapse (BTW, try to explain the mechanism of that, because nobody ever has…). The wavefunction is just a mathematical expression, as in it is not real. I know we will keep disagreeing about this, but this is fundamental… Not a single experiment has really shown this, since all the measurements involved have their own impact on the result, like here with the polarisers. Since the photons do have definite properties, the outcome is known, and governed by simple physics…

I think the polarization is definite in that there’s no magic, and that Sandra is just repeating the mantra that polarizer A merely measures the polarization of a photon passing through it, and somehow instantly affects events at polarizer B. Even now it looks like she won’t admit to herself that polarizer A alters the polarization at polarizer A. Sadly it looks like Doug is effectively stating the same about measuring spin up and spin down. An electron moves round in circles in a uniform magnetic field because the latter causes a Larmor precession of the electron spin. It rotates it. The non-uniform magnetic field of a Stern-Gerlach detector also alters electron spin.

.

By the by, you might want to look at https://physicsdetective.com/the-double-slit-experiment/ where I liken wavefunction collapse to an optical Fourier transform. Having read about the experiments by Aephraim Steinberg et al and Jeff Lundeen et al, I think of the wavefunction as something real rather than something probabilistic. I think of it as something like Bohm’s pilot wave, but a pilot wave that isn’t piloting anything.

If all you do with a detection is change the polarization of an already-set secret polarization, how then do you always get a 50% chance of detecting the photon? If Malus’ law were acting on the single photon, you’d expect your detection at one detector to be biased towards congruent polarizations, but that way you don’t get a cos^2 dependence on the difference between the angles. I can show you with math:

.

For each possible angle x formed by the detector with the secret polarization, at one detector you have an average (integral of cos^2(x))/2π = 0.5 chance of finding the photon. Good.

The other detector measures at a different angle, which is simply x plus a known offset c. The chance of detecting there is sin^2(x+c) (because the photon polarizations are orthogonal), whose average is still 0.5 regardless of c.

.

What is the average chance of detecting both photons at some difference angle c (which corresponds to the experimental distribution)? It’s simply given by the average of the function cos^2(x)sin^2(x+c). You can see for yourself this does not give you the desired chance at any angle: 0° gives 1/8, 45° gives 1/4, and 90° gives 3/8, none of which match the observed correlations.
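If you don’t trust the integrals, you can check those averages numerically (a quick Python sketch, my notation):

```python
import math

def avg_joint(c_deg, n=100_000):
    """Numerical average of cos^2(x) * sin^2(x+c) over uniform x in [0, 2pi)."""
    c = math.radians(c_deg)
    return sum(
        math.cos(x) ** 2 * math.sin(x + c) ** 2
        for x in (2 * math.pi * i / n for i in range(n))
    ) / n

for c in (0, 45, 90):
    print(c, round(avg_joint(c), 3))  # 0.125, 0.25, 0.375 -- never cos^2(c)
```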

.

.

Remember, it’s only when you bring together the two sets of synchronous results that you can build the correlations. For example:

.

x v

x x

x x

v x

v v

v v

.

In the sets above, it’s obvious you detected only half the photons at each side. Yet, taken together, 4/6 results agree, which is more than half. Of course this is just an example; imagine the same type of dataset with hundreds of detections, but with an 85% agreement for angles of 22.5°.

Hi Sandra,

One last reply and then I’ll leave it at this. Your statement about the 50% is wrong. That only applies to randomly polarized light that is uniformly distributed over all angles. The individual chance for each photon follows Malus’ law, and there is no difference between the QM calculation and my explanation.

Two scenarios: the photons from the source (they don’t need to be entangled) first pass through the same polariser at angle 0, and then at each end through another polariser: A is again at 0, but B has turned to 22.5°. The chance you will detect something at both ends follows exactly Malus’ law, because A will always detect in this case (it stays at 0), and B gives you the 85% because of Malus (NOT because of QM). I believe this experiment was done in the past and showed exactly these results…

Second scenario: entangled photons. Basically all this does is remove the first polariser from the previous experiment, since both photons have the same polarisation to start with. Then the same thing will happen with whatever polariser at whatever angle you put after it. The chance of passing is NOT 50% for each photon, but given by Malus… and since both photons have the same DEFINITE polarisation, you get the coincidence numbers from the actual experiment.

I think you get confused by looking at this in the sense that all is random (as in, not defined until measured), and I think this Copenhagen interpretation is the cause of the whole mess we are in. Of course there is probability/uncertainty involved, but that is because everything is a vibration, a wiggly thing, and is never a point-like object (that last aspect is only an illusion due to the nature of measurement/detection).

Raf: that’s not how the correlations work. You’re assuming every single photon pair always has the same polarization, aligned with A (which is equivalent to assuming A has some kind of FTL influence on how the photon is generated). This definitely works for every angle: at A, the chance is cos^2(0)=1; at B it is cos^2(x). The total chance is 1*cos^2(x), which is obviously Malus’ law. I must also point out that in entangled pairs the photons’ polarizations are not the same but actually always ORTHOGONAL; this doesn’t really matter in the actual calculations though.

.

What happens if in your second scenario A is at 22.5° and B is at 45°, but the photon polarization is still at 0°? Then your combined chance is cos^2(22.5)*cos^2(45) = 43%, which is a no-no. The point you’re missing is that in the experiment we find the photons follow Malus’ law for the DIFFERENCE in angle between the two detectors, not at each single detector.

.

What we see at each detector is essentially RANDOM, which is why I introduced a 50% chance. At the same detector setting, each time we send a photon pair we get PASS half the time and NOT PASS the other half. If it were NOT random, as you claim, but followed Malus’ distribution (which would mean each photon always has the same polarization), then you would not see the correlations, because at that point the pairs would not be entangled anymore.
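To make this concrete, here is a sketch (Python, a toy model I’m writing for illustration, not anyone’s actual experiment code) of the “definite shared polarization” picture. At zero angle difference it only agrees 75% of the time, while entangled pairs agree 100% of the time:

```python
import math
import random

random.seed(1)

def agree_rate(delta_deg, n=200_000):
    """Agreement rate when both photons carry one definite shared
    polarization and each passes its polarizer per Malus' law."""
    delta = math.radians(delta_deg)
    agree = 0
    for _ in range(n):
        lam = random.uniform(0.0, math.pi)  # the shared 'secret' polarization
        a = random.random() < math.cos(lam) ** 2          # polarizer A at 0
        b = random.random() < math.cos(lam - delta) ** 2  # polarizer B at delta
        agree += (a == b)
    return agree / n

for d in (0, 22.5, 45):
    print(d, agree_rate(d))
# this model gives ~0.75, ~0.68, ~0.50; entangled pairs follow cos^2(delta):
# 1.00, 0.85, 0.50 -- a definite shared polarization can't reach those numbers
```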

.

.

.

Doug: “did you ever try showing that B is just a feature of the E field anisotropy at timescales of the electron frequency? Which requires you think an electron is not a spherical point particle, which maybe you do?”

I don’t understand what you’re asking here. I don’t think the electron is a point particle; the standard model doesn’t assume so either, and John probably knows this. In QFT the electron is an excitation of a field and is an extended entity: each point of the field where the excitation is present is assigned a probability of finding the electron. That’s all.

Whether I agree with the probabilistic interpretation is another matter. For that I think we need to actually find a way to differentiate between a real wavefunction and a mathematical one; until then it’s philosophy.

Hi Sandra,

I hope with this reply I might manage to convince you, so here goes:

The first scenario (which I think you agree on) was demonstrated in a scientific way, which John also mentioned: see https://arxiv.org/abs/2002.02723

The second case is where you are calculating with probabilities; however, that is not what is done in the actual experiment. It measures coincidences. I suggest you read the 1972 paper of Freedman and Clauser again. The trick is how they come up with the coincidences: “For a given run, R(y)/R0 was calculated by summing counts for all configurations corresponding to angle y and dividing by half the sum of the counts in the adjacent periods of the sequence in which both polarizers were moved.”

The reason they have to do it this way is mainly because they don’t know how many photons are generated, and to get a rate you need this baseline.

As a result, they only take 25% of the detections into account (this is actually the only place where the calculation you used applies: 0.5 × 0.5 = 0.25, a 25% chance of both detecting).

Now comes the crucial point: if there is a detection at A, there is a cos^2(theta) probability it will also be detected at B. I know QM says 50%, but that is not true; that is a wrong assumption, and actually scenario 1 proves my point of view, because both photons have the same polarization (or orthogonal, it does not really matter).

The same applies to detection at B: the probability, if detected, is likewise 85% at A when the angle is 22.5°.

Doug: if you program your game like this, you will get exactly the same results as the real experiment. I think some people did this already, which is why I said don’t waste your time…

Honestly I don’t know why people keep sticking to this 50% for passing or not. It clearly depends on the polarisation the photon actually has (and NO, it does not appear like magic out of thin air), and it is thus governed by Malus’ law.

Many thanks Raf.

Raf: most serious Bell tests use a polarizing beam splitter (PBS) or similar (such as a Wollaston prism). These have a detection probability near 100%. Instead of recording a hit or a miss, they record H for Horizontal or V for Vertical (or T for transmit and R for reflect). When there is agreement for the 2 photons, it is because the results are HH or VV. When there is disagreement, the results are HV or VH.

.

https://arxiv.org/pdf/quant-ph/9810080.pdf

.

This means that all photons are taken into account.

.

Your scenario 1 does not involve entangled photons, and there’s clearly no mystery there.

.

.

.

All: previously I was asked what my opinion on Bell’s results is. I mentioned that I did not believe in an FTL influence. I also implied I think the probabilistic interpretation of QM and the Born rule are garbage pseudoscience (John will very likely agree with me here). If you can digest some matrix calculus, I highly suggest you take a look at “Quaternions, Spinors and the Hopf Fibration: Hidden Variables in Classical Mechanics”:

.

https://arxiv.org/abs/1601.02569

.

TLDR, ELI5: the fundamental misunderstanding in QM is related to quaternions (which are the extension of complex numbers). Quaternions can represent rotations in 3D but also in 4D. That’s why we can’t make sense of anything: we’re attributing quantum properties to rotations in 3D only, while they’re actually 4D rotations. Spin is the perfect example: the hidden variable is the rotation in the 4th dimension. The Hopf fibration, which John and other users have mentioned multiple times, is the projection of the hypersphere onto the surface of a sphere: if we say that the electron’s structure is that of the Hopf fibration, what we’re actually saying is that the electron is a system that rotates in 4D, but we can only ever detect a 3D portion of the whole thing.

.

Quantum entanglement involves two particles, so we need to swap from quaternions to octonions to take into account all degrees of freedom. But if we do, the system goes back to being semi-classical and determinate. This is the source of the non-locality.

.

Think of a 3D banana in Flatland. Flatlanders will only be able to see two circles corresponding to the points of the banana, and will think these are actually different particles. But the truth is that there is only one object there.

.

You can extend this reasoning to all of spacetime, and all the things in it. Mathematically, you can PROVE it’s actually all one single object, viewed from the different perspectives of our limited 3D perception.

.

Now the following will be my speculation, the paper doesn’t talk about it: the fundamental reason we don’t see these “extra dimensions” is that the electromagnetic interaction (which ALL our senses are based on) has a U(1) symmetry. In plain talk, this means that different configurations of the extradimensional space correspond to the exact same pure EM field configuration, so we can’t possibly notice the difference; on the other hand, we can infer their existence, via quantum effects of lower symmetry. It depends entirely on how we “move” on the surface of the hypersphere.

After all, John’s model of the electron is that of an inflated overlapping torus. You can’t have such a structure in 3D only, you need an extra degree of freedom so that the inner and outer surfaces don’t actually collide with one another.

Raf, I think you are misunderstanding the CHSH game. Read Aspect’s experiments on it, do the game without entangled photons: you can’t get 85%. It’s actually a simple game, and the code is simple too, it’s just confusing the first couple of times you do it. You just say there are errors in the code and in everybody’s experiments etc, but as Sandra says, they’re taking all photons into account and there’s literally nothing besides an FTL effect that can get you over 75%.
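For anyone who wants to check that 75% ceiling themselves, here is a minimal Python sketch (an illustration of the standard argument, not the code Doug mentions) that brute-forces every deterministic local strategy in the CHSH game:

```python
from itertools import product

def win_rate(alice, bob):
    """Fraction of the four question pairs (x, y) won by a deterministic
    strategy: the players win when a XOR b equals x AND y."""
    wins = sum((alice[x] ^ bob[y]) == (x & y)
               for x, y in product([0, 1], repeat=2))
    return wins / 4

# Each player's answer may depend only on their own question, giving
# 4 x 4 = 16 deterministic local strategies. Shared randomness is just a
# mixture of these, so it cannot beat the best deterministic one.
best = max(win_rate(a, b)
           for a in product([0, 1], repeat=2)
           for b in product([0, 1], repeat=2))
print(best)  # 0.75
```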

.

Sandra, how can it not be FTL, or at least FTL in the projection onto our 3D space?

Doug: it’s not FTL because the object has always been a single entity, it doesn’t become “connected” at the moment of measurement.

When you detect the photon through the polarizer you’re performing a rotation at 2 different points in 3D space, but the result is still pre-determined by the extra degree of freedom in 4D. In that sense, the act of measurement itself is very local, but the object you’re measuring is not. There is a VR game that I highly suggest you try if you have the chance, which lets you play around with higher-dimensional solids.

Hi Sandra,

Makes sense, I just meant we would see it as an FTL effect in the projection onto our 3D space. Having an extended entity across 4D space that can be rotated, where the other side rotates at the same time, is still FTL as far as I can tell. Unless I am misunderstanding and it is extremely small in the 4D space. I was thinking about it like the banana analogy, where the size in 3D is at least as large as the distance between the two circles it projects in 2D.

Doug: think about the different measurements as looking at the same thing from different angles. In the case of the flatlander banana, imagine you’re tilting the plane at one of the two circles with respect to the banana. What you’re actually doing is making a different projection on your 3D space.

But yes, I guess entanglement does look like FTL. The point though is to remember that it isn’t, and that photons and electrons are not 4D bananas.

Thank you Sandra, I will look into it, I can follow most things, however I am an engineer, not a mathematician or physicist, so I see things more from how they could work.

Also thank you for bringing up the 4th dimension, as I believe (hopefully someone can find a proof for this) that indeed that is how everything works. My job is in 3D printing, so I can print any shape, but it led me to a path where actually all we experience is like a projection from 4D space. I am very interested in the VR app, can you tell me what it is? I fully get the banana analogy, and yes, that can be a very reasonable explanation that all is one and a projection or intersection in 4D space. This is actually very consistent with spiritual teachings (I am an adherent of Advaita Vedanta), and that space is like the Akashic records…

Personally I have given a lot of thought to the structure of the ether (and dimensionally it is related to the Planck length) and came intuitively to the 24-cell: “The 24-cell does not have a regular analogue in 3 dimensions. It is the only one of the six convex regular 4-polytopes which is not the four-dimensional analogue of one of the five regular Platonic solids. It is the unique regular polytope, in any number of dimensions, which has no regular analogue in the adjacent dimension, either below or above”. The reason I came to that is that it is 4D space-filling, and the cross-section is also space-filling in 3D, from cube to rhombic dodecahedron, which can perfectly explain how the ether is excited in a continuous way, and you can explain photons etc… I plan to one day model this in 3D, maybe with this VR tool it would be an even better experience.

All the best, peace and love, Raf

All: can I remind you that the goalposts have moved repeatedly. They originally claimed that a polarization measurement at A alters a polarization measurement at B, which had to be instantaneous spooky action at a distance because it didn’t match Bell’s straight-line plot. Then after we point out Malus’s law, the goalposts then claim that we still have instantaneous spooky action at a distance because the prediction is half the plotted cosine, even though we don’t know what photons are in the apparatus. Then when we reiterate that a polarization measurement at A performs a rotation at A and a measurement at B performs a rotation at B, we now have a 4D banana wherein a measurement at A performs a rotation at both A and B, whilst the rotation at B has somehow vanished. Might I remind you of this in the article:

.

“Not only that, but Clauser and Freedman’s photons weren’t even entangled. The wavelengths were 581 nanometres and 406 nanometres. They weren’t the same wavelength because the photons were not produced at the same time. Clauser and Freedman referred to an intermediate state lifetime of 5 nanoseconds. Those photons were produced 5 nanoseconds apart. A photon travels 1.5 metres in 5 nanoseconds. By the time the second photon was emitted, the first photon was through the polarizer. Those photons weren’t entangled at all. Aspect used the same calcium cascade. So his photons weren’t entangled either”.

Perhaps I need to write a new article on a later “loophole free” experiment to demonstrate that the explanation is simple and mundane, with no requirement for any quantum weirdness.

It’s the same claim all the time: the measurement at A rotates the later measurement at B. I even have the option in the code I wrote to enable Malus’s law only, for both Bell’s test and the CHSH game, and the results without spooky action don’t match the experimental results. Additionally, there have been a lot of entanglement tests since then with single-photon detectors, and they all show the same thing: spooky action.

.

Nobody can explain the CHSH game without entanglement; even the 4D idea from Sandra instantly rotates and gives spooky action across space. So it’s all the same: spooky action exists and has been tested time and time again, with better and better experiments.
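For reference, the textbook quantum figures being appealed to here can be computed directly (a sketch; note the photon version of the experiment uses half the analyser angles of the spin-½ version):

```python
import math

# Tsirelson's bound: the quantum value of the CHSH combination is 2*sqrt(2),
# versus the local-hidden-variable bound of 2.
S_quantum = 2 * math.sqrt(2)

# The matching CHSH-game win probability, cos^2(pi/8) ~ 85.4%, versus the
# 75% classical ceiling.
p_win = math.cos(math.pi / 8) ** 2

print(round(S_quantum, 3), round(p_win, 4))  # 2.828 0.8536
```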

.

This is how the military passes one-time codes completely securely, using entanglement. So it even has applications nowadays.

.

Fun stuff!

Doug: I’m afraid quantum cryptography doesn’t actually use quantum entanglement. It’s all part of the myth I’m afraid. Google on “quantum cryptography doesn’t use entanglement”. I will write a future article on the better and better experiments. Fun stuff indeed!

Thanks, Raf: The points at 0 on the classical graph for theta = 0 are very simple: it’s when you start with vertically polarized photons but your polarizers are horizontal. It’s just like your video.

.

Like I have tried to say, I think the game is much simpler to understand, see my previous post, and it quickly shows there is no way out with anything classical or anything that doesn’t instantly change the correlation between A and B measurements.

.

Relating it to your triple polarizer, think of entangled photons separated by light years: how does the orientation you set the polarizers at on one side instantly affect the measurements done at the other side, light years away?

.

It has to be something spooky.

The conundrum is that information is still limited by light speed. Even though I didn’t like it or believe it, nature doesn’t care and the results of experiments have proved it to be this way. It was just that I didn’t understand the test.

Doug, I interpret theta = 0 as there being no difference between the polarisers, so you should always get the same result at both ends, so that is why I say it is wrong: the coincidence should be 1, like on the graph to the right.

The other game you did has the same issues as the one with polarisers. The Stern-Gerlach type of experiments to measure spin also affect the state of the particles (from the wiki page: “Instead they alter the state by observing it (as in light polarization)”), and measurements at different angles are correlated because the particles, in my conviction, do have definite properties and the outcome is predictable (if you know the beginning state, of course).

I think the whole discussion boils down to whether or not you believe that particles have definite properties at all times, or, as QM proposes, that their state is undefined until you measure it (and thus comes the spookiness and all the rest with it…). I think all of John’s articles so far are in line with the first interpretation… And after careful analysis I have not seen any definite proof of the second interpretation (because by measuring, you actually affect the state…)

When you measure one all the way at A, it affects the state you measure at B. And that’s an instant effect. Go through the game, there are no alternatives. Random photons at random polarizations sent to A and B don’t help you win the CHSH game; you need an instant effect across the distance between A and B to do that, which is entanglement.

.

When you understand the game, it becomes obvious that your analysis is incorrect.

Hi Doug, the game outcome is no surprise since it was scripted with the desired outcome; there is no way for the classical one in that calculation to get more than 75%, and since you included the term 1/sqrt(2) from the Born rule in your calculations, you get the 85%. There is no mystery or unexplained outcome here…

BTW the CHSH game is a thought experiment; I don’t think this was ever done in real life, so there is no “proof”, and you will need these single qubits and a device to measure them accurately. I think this will be very challenging, and from what I have heard so far about error correction (due to decoherence and quantum noise), it will most likely be impossible. And even if they manage to do this, it still would not prove anything different when you interpret it such that the entangled particles have a defined property at all times, so if you measure at A, the state at B always matches, no matter how far apart…

What I propose is that if we do an experiment where the outcome of the experiment is linear, you will see that there is no problem at all… what Bell seems to have swept under the carpet is the linearity of the measurement/detection. The fact that at angles 0, 45 and 90 of the polariser there is no violation is also a clear sign of this… I cannot think, however, of such a device that can do this, with uniform probability over all inputs/angles. I am 100% sure there will not be any outcome different from “classical”…
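That agreement at 0, 45 and 90 degrees is easy to tabulate. A quick Python sketch comparing the straight-line coincidence curve with the cos² curve (illustrative numbers only):

```python
import math

# Compare a straight-line (linear-in-angle) coincidence curve with the
# cosine-type curve cos^2(theta): they meet at 0, 45 and 90 degrees and
# differ most around 22.5 and 67.5 degrees.
for deg in (0, 22.5, 45, 67.5, 90):
    linear = 1 - deg / 90
    cosine = math.cos(math.radians(deg)) ** 2
    print(f"{deg:5.1f} deg  linear={linear:.4f}  cos^2={cosine:.4f}")
```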

Also, the plot you don’t like in the Bell test has four terms: you subtract away the chance it went through one but got the opposite result on the other. So it’s not the simple way you’re thinking, and the way I was thinking of last week, just pairs emitted at the source. The spookiness has to be instant across the distance for these tests to work.

.

But again, the game is simpler to understand IMHO. But both tests prove the same thing. They prove it so well I have given up trying to find a way out and just accept nonlocality now.

One day Doug, I will try to show you the way out via a model akin to yours. I suspect it may feature a photon like the penultimate picture here: https://physicsdetective.com/the-photon/.

Until then, I’m afraid we’re going to have to agree to differ on this.

Sounds like a plan! Great discussion, I wouldn’t have kept trying to understand the game if it wasn’t for y’all, thanks for a great physics website.

Thanks Doug. I try! And some, like the wife, say I am trying!

Haha 🙂 it’s like the only non-mainstream physics site that’s still interesting!

Rest assured Doug, that one day all this stuff will be mainstream!

Thanks all for the nice interactions, I am afraid it is a never-ending story. The only thing I find sad is that QM seems to have taken over as the only possible explanation for these experiments, which have also intrigued me in the past, but now I come to the sobering conclusion that there are more reasonable explanations possible. The constancy of the speed of light is another one of these…

All the best, looking forward to the next article!

Thanks Raf. I’m afraid to say that I find the state of contemporary physics to be extremely sobering. Gravitational physics is full of weird and wonderful things like wormholes that lack experimental support. Quantum physics is akin to magick that surpasseth all human understanding. Particle physics is like cataloguing short-lived ephemera without even trying to understand the small list of stable particles. I think it’s a sad state of affairs. I look forward to things changing, and I for one will do what I can to help that along. I shall have a think about that next article. I will try to write about something positive and uplifting.

Hey John! Hey Family! What is everyone’s opinion on the Nature Physics by Springer News website?

I was able to pull up several interesting articles by searching for “quantum knot theories”. Those articles suggest that a few brave matter and particle physicists, not QM theorists, are on the forefront of “connecting the dots” towards questioning the SM and embracing better, classically based ideas; all of which are in turn based on their own respected findings!

Greg: do you have a link for that? When I had a look ( https://www.nature.com/search?journal=nphys&q=quantum%20knot ) the articles that came up were fairly old.

Will do, unfortunately via personal email. My computer skills are still minimalistic.

Thanks. I have them. I fear I may have some email issues coming up. There’s changes happening with my Internet Service Provider (ISP).

Hi Sandra, I like this explanation better than the others but I’m going to go out on a limb and say they won’t be convinced, it’s not clear enough. It’s also still confusing with the 6 sets where there are doubles, two xx and two vv without explanation of x or v. And it might be easier to just do entangled states with the same polarization for explanations, or they will claim this is how it really is.

.

Idk, the CHSH game convinced me it literally can’t be anything, secret polarizations, or literally anything except some effect much, much faster than c. And personally I don’t know if anything else would have convinced me, no matter how clear the explanation is. Going through the experiment/game step by step was the only way for me.

Doug: yes I know it wasn’t very clear, but I can only do so much on the comment section. The thing to take away is that correlations build up by comparing two entirely random sets. Mathematically, a pre-determined photon polarization is in principle fine with random outcomes, but it can’t explain the joint results. If I were any good with latex I would have made the equations much clearer.

Well said. I do find it interesting how some people are convinced right away and others (like myself) take forever to really buy it. I’m just curious, did you immediately see the spookiness was a requirement for the joint results or did you have to go back home and go through the theorems/math line by line to convince yourself?

.

I used to be good at LaTeX but nowadays I just use MS Word because I’m lazy and it’s good enough 😉

.

Also, even after being convinced there’s spookiness I find it much more palatable that it’s just FTL rather than having extra dimensions, etc. Now if only there was a way to communicate via entanglement…

Sandra: did you ever try showing that B is just a feature of the E field anisotropy at timescales of the electron frequency? Which requires you think an electron is not a spherical point particle, which maybe you do?

.

John agrees that it’s not a point particle obviously, but I think John said that this idea is a lost cause, and B is fundamental just like E.

.

It was in the really long post I made a month or so ago. I was just wondering what your opinion on these two points is. And John, please correct me if I mischaracterized your opinion on this.

Doug: I’m not sure I’ve said it’s a lost cause. What I have said is that B and E are two aspects of the same thing. See for example https://physicsdetective.com/the-theory-of-everything/, where I use a canoe analogy to describe E as the spatial derivative of the electromagnetic-wave photon, and B as the time derivative. Lower down I talk about the electron as a “wrapped and trapped” photon, so B and E still apply, but now the electromagnetic wave is configured to look like an all-round standing field. Hence the wave nature of matter. You might like to read:

.

https://physicsdetective.com/the-electron/

https://physicsdetective.com/the-screw-nature-of-electromagnetism/

https://physicsdetective.com/what-charge-is/

https://physicsdetective.com/the-mystery-of-mass-is-a-myth/

I love the mystery of mass article, that was fantastic! I have actually read every article on here, I couldn’t help myself. So now I’m thinking of how to progress further, what are the equations, etc. For example, I like the article about making a high gravity/low speed of light chamber in a ship to slow you down to live longer, or throw a black hole off to the side of your ship when you want to change direction without feeling any g force. But we gotta figure out how to exploit this to make these devices.

.

For example, does a black hole not behave like general relativity says, but instead like I think you were saying: time slows down for it so much that you break conservation of momentum?

.

So I was tinkering around with doing the math for a metal-shell superconducting spaceship with high-voltage pulses on the outside of the ship; if you can squish enough electrons close enough together to get that effect, then they will pull the protons in the metal hull forwards, and you could create a propellant-less spaceship.

.

But there’s two unknowns there, the big one being are the general relativity equations wrong and momentum conservation breaks, like in that article you have where you can steer a spaceship by creating a black hole near it.

Doug: sorry to be slow replying. Thanks re the mass article.

.

I didn’t mean you create or throw a black hole off the side of the ship. I meant you create a gravitational field on one side of the ship. This would be akin to the gravitational field of a black hole, so the ship changes direction rapidly with no g-forces on the occupants. As for how to make something like that, well, I wish I had more time and money.

.

As for the black hole, I just can’t see how the usual method by which matter falls down can apply. As far as I can tell, an electron is light in a closed chiral spin ½ loop, and it falls down because of a refraction. If the light is part of a black hole, then the infinite gravitational time dilation means it’s moving at a speed of zero. So how can it possibly get refracted? So I’m left saying some zillion-ton rock will fall towards a gazillion-ton black hole, but the black hole won’t fall towards the rock. I am not happy about breaking conservation of momentum, but I can see no other option.

.

Methinks the propellant-less spaceship has to work another way. Do forgive me for keeping mum about that. But like I was saying, you don’t steer a spaceship by creating a black hole near it. You create an artificial gravitational field near it.

John: I like the idea of creating an artificial gravitational field; however, I haven’t heard of any experiment able to do that, or even any idea of how to do that. So I figure just try to create a regular gravitational field. There’s some interesting stuff on using a plasma pinch / z-pinch / Bennett pinch, where eventually gravity will take over. So if you can run something like that around the outer hull of your ship, maybe you can build a UFO.

There are a lot of comments here, and I have not read all of them so not sure if this has already been suggested. But it seems to me the best way to prove that the Bell Inequality is not proving anything special is to come up with an experiment where you can replicate the Bell Inequality result using clearly visible macroscopic objects. That is, create an apparatus where some visible object will pass through some opening with Pr = cos^2(theta). Think of passing a metal washer through a magnetic grating. An “entangled” washer can be passed in the opposite direction and will have the same angle, and thus the same Pr of passing through. If you can demonstrate that visible objects can produce the same outcome as photons in a Bell test, it should warrant an explanation from QM sycophants.

Brad: see the fourth image in the article, where I gave a depiction of Bell’s Inequality. It’s the blue rectangle: Bell’s Inequality predicts a straight line result, and Clauser and Freedman’s experiment yielded a cosine-like result. This was used to claim the existence of instantaneous action-at-a-distance. However it’s a false claim: photons passing through polarizers do not match Bell’s Inequality because polarizers rotate photons. Hence the three-polarizer effect and Malus’s law. The response to that is as per the comment I posted earlier today. People like Gill, who wants Joe Public to believe in instantaneous action-at-a-distance, shift the goalposts. If I were somehow able to create an apparatus, I’m sure they’d shift the goalposts again. But thanks for the suggestion anyway.

John: you’ve been saying this a lot both in comments and in the article. Can you explain thoroughly what you mean by “photons passing through polarizers do not match Bell’s Inequality because polarizers rotate photons”? I already explained how the correlations arise and showed how, mathematically, the percentages don’t work out if you assume photons have a definite polarization and the polarizers rotate them according to a cosine rule, and Doug showed you the CHSH game. Since you keep on the same line it’s clear that we are not talking about the same thing.

.

.

.

Also, I’m obviously not saying Bell’s experiment is explained by 4D bananas. That was just a pictorial example.

.

“we now have a 4D banana wherein a measurement at A performs a rotation at both A and B, whilst the rotation at B has somehow vanished.”

.

This is a gross misrepresentation of what I said; perhaps the example was not as effective as I thought. Aside from the obvious fact that particles are not bananas, even in that example nothing disappeared: in Flatland, the planes crossing the 3D object tilt when performing some measurement. This is simply a way to represent a 4th degree of freedom on the object, and is still not exactly what is going on (if you read the article I linked, the actual hidden variables are the global and azimuthal phase on the 3-sphere). If we insist on the analogy, the “way the planes are tilted” is not the hidden variable, but the missing information after the projection of the 3D object is. If the two angles of the polarizer are very similar, you’re going to “sample” a very similar portion of the 4D object, hence you’ll get very similar projections (= very similar results). Another pictorial way to put it is: the actual polarization of the photon does not lie exactly in our 3D plane, and your polarizer will project it back down by rotating it.

.

I might work out the math if I get the time.

Sandra: you are in italics.

.

John: you’ve been saying this a lot both in comments and in the article. Can you explain thoroughly what you mean by “photons passing through polarizers do not match Bell’s Inequality because polarizers rotate photons”?

Bell’s Inequality yields a straight-line “classical” result. We do not see this because polarizers rotate photons in line with Malus’s law, thereby yielding a curved cosine plot. The curved cosine plot is claimed to be proof that there are no hidden variables, when it is not. Point this out, and people like Gill will say the classical result yields a cosine plot that is only half the experimental result, and so on.

.

I already explained how the correlations arise and showed how, mathematically, the percentages don’t work out if you assume photons have a definite polarization and the polarizers rotate them according to a cosine rule, and Doug showed you the CHSH game. Since you keep on the same line it’s clear that we are not talking about the same thing.

We are talking about the same thing, but you and Doug are employing mathematics and gameplay to try to prove the existence of instantaneous action at a distance, whilst I focus in on the nature of the photon, the experimental evidence, and the physics. For example I’ve said repeatedly that Clauser and Freedman’s photons weren’t even entangled, but you appear to have refrained from commenting on that.

.

Also, I’m obviously not saying Bell’s experiment is explained by 4D bananas. That was just a pictorial example.

Good. I’m not fond of abstract concepts as a substitute for evidential physics. Re “we now have a 4D banana wherein a measurement at A performs a rotation at both A and B, whilst the rotation at B has somehow vanished.”

.

This is a gross misrepresentation of what I said, perhaps the example was not as effective as I thought. Aside the obvious that particles are not bananas, even in that example nothing disappeared: in Flatland, the planes crossing the 3D object tilt when performing some measurement.

You said “Spin is the perfect example: the hidden variable is the rotation in the 4th dimension” and “the electron is a system that rotates in 4D, but we can only ever detect a 3D portion of the whole thing”. I absolutely disagree with this. You then went on to say “Think of a 3D banana in Flatland. Flatlanders will only be able to see two circles corresponding to the points of the banana, and will think these are actually different particles. But the truth is that there is only one object there”. I disagree with that too. The truth is that there are two objects there, be they photon pairs, or electrons and positrons. You also said “You can extend this reasoning to all of spacetime, and all the things in it. Mathematically, you can PROVE it’s actually all one single object, viewed from the different perspectives of our limited 3D perception“. I absolutely disagree with that too. The truth (for Clauser & Freedman and Aspect) is that there are two photons, emitted at different times and different frequencies. Please note something I’ve said elsewhere: spacetime is an abstract mathematical arena which models space at all times. There is no motion in spacetime, but we live in a world of space and motion. The map is not the territory.

.

This is simply a way to represent a 4th degree of freedom on the object, and is still not exactly what is going on (if you read the article I linked, the actual hidden variables are the global and azimuthal phase on the 3-sphere).

I don’t have an issue with this. The toroidal electron might appear to have a spherical symmetry, but it has additional complexity. (Note that what it doesn’t have is any actual surfaces. Martin van der Mark depicted it as a torus with a surface, but you have to try to imagine more and more “onion ring” layers. Only these layers don’t exist either. It’s just a wave in space, in a closed path).

.

If we insist on the analogy, the “way the planes are tilted” is not the hidden variable, but the missing information after the projection of the 3D object is. If the two angles of the polarizer are very similar, you’re going to “sample” a very similar portion of the 4D object, hence you’ll get very similar projections (= very similar results). Another pictorial way to put it is: the actual polarization of the photon does not lie exactly in our 3D plane, and your polarizer will project it back down by rotating it.

I have no issue with additional complexity, but I must insist that there’s no evidence of an extra spatial dimension. For a flatlander analogy, the additional complexity might be stretching the sheet horizontally. This might be interpreted as an extra spatial dimension, but that isn’t what it is. It’s a degree of freedom, a dimension in the sense of measure, but not an extra spatial dimension that offers freedom of movement. IMHO you can apply the same logic to 3D space. I would urge you to read https://physicsdetective.com/the-psychology-of-belief/ and try to test any concepts you hold such as hidden spatial dimensions. I think it can be an uncomfortable experience, but rewarding in the end.

.

I might work out the math if I get the time.

What would be more productive would be for you to point out the best paper you can find which claims to demonstrate loophole-free evidence for instantaneous action at a distance. Then I will write an article on it.

John: to both

.

“For example I’ve said repeatedly that Clauser and Freedman’s photons weren’t even entangled, but you appear to have refrained from commenting on that.”

.

and

.

“the best paper you can find which claims to demonstrate loophole-free evidence for instantaneous action at a distance. Then I will write an article on it.”

.

I had already provided a link, maybe you missed it. Serious Bell tests use parametric down conversion to generate a photon pair entangled in polarization.

https://arxiv.org/pdf/quant-ph/9810080.pdf

.

.

“I have no issue with additional complexity, but I must insist that there’s no evidence of an extra spatial dimension. For a flatlander analogy, the additional complexity might be stretching the sheet horizontally. This might be interpreted as an extra spatial dimension, but it isn’t what it is. It’s a degree of freedom, a dimension in the sense of measure, but not an extra spatial dimension that offers freedom of movement.”

.

The mathematics employ quaternions, which are used to encode rotations, not metric distortions. For the latter you use tensors. Quaternions work for 3D rotations (the symmetry group SO(3)) but also 4D rotations (SU(2), the symmetry of spinors). I’d wager the evidence for the extra dimension is the behaviour of spinors, but regardless.

.

.

You might want to take a look at this animation by the same author of the quaternion article I posted previously:

.

https://www.youtube.com/watch?v=NWii0C71ocQ

.

This is showing the path bosons and fermions (integer and half-integer spins) take on the Bloch sphere. Each path is the equivalent of a half rotation (fermions) or a full rotation (bosons) of the surface of a higher-dimensional sphere, i.e. one rotation can correspond to multiple paths.

Sandra: I will look at both items and give a considered reply, perhaps as an article.

.

Meanwhile, please will you have a look at https://physicsdetective.com/the-electron/ . You don’t need 4D rotations, just two orthogonal 3D rotations. You may also like to read https://physicsdetective.com/a-worble-embracing-itself/ which is more historical.

I’m not a smart man, and the leap of faith that quantum entanglement takes for me is too much. I’ve been attempting to learn as much as I can on this topic after the initial shock factor of the unexplored potential got me all excited for the future. Now, however, I find myself in disbelief. Without even getting into the math, the emphasis on observation is off-putting and seemingly self-deifying. I struggle to believe my measurement of something changes its outcome, never mind that of another particle that is wholly disconnected.

Can someone please explain in another way, for the less enlightened like me, the validity of the experiments that are verifying quantum entanglement at record distances? Is there a metric available that provides the over and under on how often these particles demonstrate entanglement, or is it a shotgun blast with cherry-picked results?

Please help me learn, this is legitimately keeping me up at nights.

Sorry to be slow replying, Jesse. I had company this weekend.

.

Can someone please explain in another way, for the less enlightened like me, the validity of the experiments that are verifying quantum entanglement at record distances?

I think they are not valid at all. When you wade through the morass of deliberate confusion, you find that Freedman and Clauser’s experiment, and the Aspect experiment, employed “cascade” photons emitted with different frequencies at different times. They weren’t even entangled to begin with. On top of that, the experiments employed a specious mathematical smoke-and-mirrors claim from Bell that classical physics predicted a straight-line chart, and that a cosine chart was proof positive of spooky action-at-a-distance. However, this totally ignored Malus’s law I = I₀ cos²θ, which is associated with the way a polarizing filter rotates a photon, which is why a third polarizer inserted between two others results in light transmission. Zeilinger did not use cascade photons, he used a beam splitter, so he was dealing with photons emitted at the same time with the same frequency, but the Malus’s law issue is still there.
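To put actual numbers on the three-polarizer point, here is a minimal Python sketch of Malus’s law, treating each ideal polarizer as both attenuating the light and re-aligning its polarization to the filter axis (the rotation referred to above):

```python
import math

def malus(intensity, theta_deg):
    # Malus's law I = I0 * cos^2(theta): transmission through an ideal
    # polarizer at angle theta relative to the light's polarization.
    return intensity * math.cos(math.radians(theta_deg)) ** 2

I0 = 1.0

# Two crossed polarizers (0 then 90 degrees): essentially nothing gets through.
crossed = malus(malus(I0, 0), 90)
print(crossed)   # ~0

# Insert a third polarizer at 45 degrees between them. Each stage re-aligns
# the polarization to its own axis, so every step is now a 45-degree jump.
stage1 = malus(I0, 0)       # 1.0, polarization now along 0 degrees
stage2 = malus(stage1, 45)  # 0.5, polarization now along 45 degrees
stage3 = malus(stage2, 45)  # from 45 to 90 degrees is another 45-degree step
print(stage3)    # ~0.25: a quarter of the light passes the "blocking" filter
```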

.

Is there a metric available that provides the over and under on how often these particles demonstrate entanglement, or is it a shotgun blast with cherry-picked results?

I don’t think there’s a metric available, or cherry-picked results. I’d say the results are claimed to be measurements of a local photon that alter the remote photon, when what’s really happening is that the measurement of the local photon alters the local photon.

Just my two cents: the cosine graph for entanglement is the ‘S’ parameter, not Malus’s law for detecting a single photon. Aspect’s later experiments used spontaneous down-conversion to create entangled photons at the same time, and he played the CHSH game and beat what classical physics suggests. He also did separation and detection faster than light could travel, to close the remaining loopholes. Current experiments can detect single photons, and I don’t know if there are any remaining loopholes.
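For anyone following along, the ‘S’ parameter here is the CHSH combination S = E(a,b) − E(a,b′) + E(a′,b) + E(a′,b′), where E is the correlation between the outcomes at the two detectors. Any local hidden-variable theory obeys |S| ≤ 2, whereas quantum mechanics predicts E = cos 2θ for angle difference θ (in one common convention), reaching |S| = 2√2 at the standard angle settings. A quick Python sketch, assuming that textbook correlation function:

```python
import math

def E(a_deg, b_deg):
    # Quantum-mechanical correlation for polarization-entangled photons
    # at polarizer angles a and b (textbook convention E = cos 2(a-b)).
    return math.cos(2 * math.radians(a_deg - b_deg))

# The standard CHSH angle choices: a=0, a'=45, b=22.5, b'=67.5 degrees.
a, a2, b, b2 = 0.0, 45.0, 22.5, 67.5
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(S)   # ~2.828, i.e. 2*sqrt(2), above the local bound of 2
```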

.

Either way, check out triple entanglement; then it’s an exact outcome that goes against classical physics: https://www.mdpi.com/2218-1997/5/10/209, see eqs. 30 and 31. So now you don’t have to worry that the graphs look similar: it’s one outcome for any classical state and a different outcome for entanglement.

Doug, I was expecting this paper to describe an experiment, but it doesn’t. It’s more of a meta paper describing the field. For example, in the introduction we can read this:

.

The paper is organized in the following way: In Section 2, we give a brief overview of bipartite entanglement.

The overview is heavily mathematical. There is no physics content, and no mention of Malus’s law. Section 4 gives a very brief history, but it is just so scant. Compare and contrast with the history I gave. Also note that the brief history ends with this:

.

A loophole-free Bell inequality violation was obtained using electron spins separated by 1.3 kilometers [52].

These people don’t even know what electron spin is, or that a magnetic field causes Larmor precession and so alters the spin. Thereafter, yes, like I said, this is a meta paper describing other people’s research, without question. I’m sorry Doug, it’s just another layer of the mythology. Is there a particular paper of Aspect’s you’d like me to look at?

Thank you for the reply! I’m glad I’m not the only skeptic here. I feel like this topic has become pop culture.

Sorry, I’ll try to find a better paper on triple entanglement. They have done the experiments; I’ll send you a link when I find my book describing the experiment (it must be around here somewhere!).

.

I’ve tried to convince you the CHSH game only works with entanglement, but you haven’t touched it. I’ve also tried to show how the ‘S’ parameter for correlation in Aspect’s papers and Bell’s inequality aren’t possible using Malus’s law.

.

Not sure what would convince you. Here’s a paper from ’98 that used a BBO crystal producing entangled photons simultaneously, with km of separation at the detectors: https://arxiv.org/abs/quant-ph/9810080

Doug: the CHSH game is only a game. It isn’t physics. Sorry if I skipped over your S parameter. The Zeilinger paper you linked to starts by saying this:

.

The stronger-than-classical correlations between entangled quantum systems, as first discovered by Einstein, Podolsky and Rosen (EPR) in 1935 [1]….

That’s bullshit. Einstein, Podolsky, and Rosen did not “discover” stronger-than-classical correlations. See https://physicsdetective.com/quantum-entanglement-i/ where I talked about the EPR paper. I said what we have here is something like two photons produced by some event, which then move apart in opposite directions. Measure the momentum of the first photon, and the second photon is left in the state ψk(x2). Measure the position of the first photon, and the second photon is left in a different state φk(x2). However nothing can travel faster than light, so there’s no way to affect the second photon, and it can’t have two different states, because that would mean there were two different realities. Hence they said “we are forced to conclude that the quantum-mechanical description of physical reality given by wave functions is not complete”. The Zeilinger paper then says this:

.

occupied a central position in the discussions of the foundations of quantum mechanics. After Bell’s discovery [2] that EPR’s implication to explain the correlations using hidden parameters would contradict the predictions of quantum physics….

That’s more bullshit. Bell didn’t “discover” anything. He ignored Malus’s law and put up a straw-man argument obscured by mathematical smoke and mirrors to claim that hidden parameters would predict a straight-line graph. Zeilinger surely knew the history. And yet he distorts it. Doug, how can you not notice this? I will give comments on the rest of the Zeilinger paper over the weekend.

> polarizers rotate photons

I’m definitely with you on this. Malus’s law shows the results of so-called Bell experiments are trivial. The outcome of such tests is self-evident given the nonlinear relationship between polarizer angle and the probability of passing through. FWIW you don’t even need to stack polarizers to produce this cosinusoidal relationship if you can generate photon pairs with the same polarization (fancy folks might call these entangled photons). Here is MATLAB code I wrote to simulate such an experiment…

clc; close all; clear;

t = 180;      % number of polarizer-2 angle settings
n = 10000;    % photon pairs per setting

Polar1Angles = zeros(1,t);          % polarizer 1 fixed at 0 degrees
Polar2Angles = linspace(0,180,t);   % polarizer 2 swept from 0 to 180
PrPassed = zeros(1,t);              % preallocate results

for i = 1:t
    % Each pair shares one random polarization angle, 0..360 degrees
    Photon1Angles = randi(361,1,n) - 1;
    Photon2Angles = Photon1Angles;
    % Malus's law gives each photon's probability of passing its polarizer
    PrP1 = cosd(Photon1Angles - Polar1Angles(i)).^2;
    PrP2 = cosd(Photon2Angles - Polar2Angles(i)).^2;
    Photon1Passed = PrP1 > rand(1,n);
    Photon2Passed = PrP2 > rand(1,n);
    % Fraction of pairs with matching outcomes (both passed or both blocked)
    PrPassed(i) = mean(Photon1Passed == Photon2Passed);
end

plot(Polar2Angles, 1-PrPassed);

——–

Here’s what that plot looks like…

https://ibb.co/jJFhxKP

Wouldn’t this suggest a Bell inequality could be produced from a setup using just a single polarizer at each end of the room? I still think the result is trivial. But at least in this configuration you don’t need to worry about what happens to photon orientations after passing through (multiple) polarizers.

That plot looks good, Brad. Sadly I don’t know MATLAB, but I’m an IT guy, so that sort of thing is easy for me. I will look into it.

“Photon1Passed = PrP1 > rand(1,n);

Photon2Passed = PrP2 > rand(1,n);

PrPassed(i) = mean(Photon1Passed == Photon2Passed);

end

plot(Polar2Angles,1-PrPassed);”

Pardon me, I don’t use MATLAB much at all.

What does this last bit do? What does Photon1Passed return? I don’t get what PrPassed(i) is doing either.

You can use ChatGPT to translate it to Python or whatever language you want, but I find the code confusing as to why it is done this way.
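Since translating to Python came up, here is a line-for-line rendering of Brad’s MATLAB snippet (assuming NumPy; the plot call is replaced by printing the endpoint values, and the generator is seeded for repeatability):

```python
import numpy as np

t = 180       # number of polarizer-2 angle settings
n = 10000     # photon pairs per setting

polar1_angles = np.zeros(t)             # polarizer 1 fixed at 0 degrees
polar2_angles = np.linspace(0, 180, t)  # polarizer 2 swept from 0 to 180
pr_passed = np.zeros(t)

rng = np.random.default_rng(42)
for i in range(t):
    # Each pair shares one random polarization angle, 0..360 degrees
    # (the MATLAB randi(361,1,n)-1).
    photon1_angles = rng.integers(0, 361, n)
    photon2_angles = photon1_angles
    # Malus's law gives each photon's probability of passing its polarizer.
    pr_p1 = np.cos(np.radians(photon1_angles - polar1_angles[i])) ** 2
    pr_p2 = np.cos(np.radians(photon2_angles - polar2_angles[i])) ** 2
    photon1_passed = pr_p1 > rng.random(n)
    photon2_passed = pr_p2 > rng.random(n)
    # Fraction of pairs with matching outcomes (both pass or both blocked).
    pr_passed[i] = np.mean(photon1_passed == photon2_passed)

print(round(pr_passed[0], 2), round(pr_passed[-1], 2))  # ~0.75 at aligned angles
```

Brad plots 1 - pr_passed; pr_passed itself peaks near 0.75 when the polarizers are aligned and dips to about 0.25 near 90 degrees.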

.

Basically, PrP1 > rand(1,n) checks, element by element, how many times each of the n values in PrP1 is greater than a fresh set of n random numbers uniformly distributed from 0 to 1.

.

Then PrPassed is the normalized mean of PrP1 and PrP2, however PrP1 and PrP2 were compared to different sets of ‘n’ random numbers from 0 to 1.

.

Then the plot is the polar2 angle, since polar 1 was set at 0… vs… ‘1-PrPassed’, so like the inverse.

.

And it never goes to 0 and never goes to 1.

.

Idk, not sure about any of it, the math was done with VV or HH photons?, but the plot is 1-PrPassed, maybe to emulate VH photons? And comparing to random numbers? Either way, if you don’t plot the 1-PrPassed and instead PrPassed, it’ll have a max at 0 and 180 kinda like the Bell plots I made assuming there is no entanglement. I kinda see this as all smoke and mirrors, but maybe I missed something and actually this shows entanglement is all bs, idk.

And also obviously S in the Bell experiments is never calculated at all, minor details 😉

Doug: “Then PrPassed is the normalized mean of PrP1 and PrP2, however PrP1 and PrP2 were compared to different sets of ‘n’ random numbers from 0 to 1.”

This makes very little sense to me. You should be measuring correlations, not averages, and that is done by taking measurements at the same angle difference multiple times. As it is, the code is simply cycling through random angle orientations and taking an average.

.

I think the use of the random numbers is simply a way to turn the cosine value into a probabilistic one: the higher the cosine, the more likely it is for the value to exceed the random number. I still don’t understand what Photon1Passed returns; it can’t be either 0 or 1, otherwise the average wouldn’t produce that sine curve.

Sandra,

.

Actually, maybe the code is fine. It calculates the probabilities, then PrPassed is the percentage of the time both photons did the same thing. Then the plot is 1-PrPassed instead of just PrPassed.

.

So it should be a cosine plot instead of a sine.

.

But either way, the S statistic in the Bell test isn’t calculated, and plotting PrPassed just gives the classical result.
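As a footnote to that, one can bolt the CHSH calculation onto the same Malus-law local model and confirm it never beats the classical bound. A rough Python sketch (assuming NumPy; the ±1 outcome convention and the standard CHSH angles are my additions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000   # photon pairs per angle setting

def correlation(a_deg, b_deg):
    # Shared hidden variable: each pair carries one random polarization angle.
    lam = rng.uniform(0, 360, n)
    # Malus's law as a pass probability at each polarizer; outcomes are +/-1.
    o1 = np.where(np.cos(np.radians(lam - a_deg)) ** 2 > rng.random(n), 1, -1)
    o2 = np.where(np.cos(np.radians(lam - b_deg)) ** 2 > rng.random(n), 1, -1)
    return np.mean(o1 * o2)

a, a2, b, b2 = 0.0, 45.0, 22.5, 67.5   # standard CHSH angle settings
S = correlation(a, b) - correlation(a, b2) + correlation(a2, b) + correlation(a2, b2)
print(abs(S))   # ~1.41 for this model, well inside the local bound of 2
```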