
Re: Coming Soon!

Posted: Thu Nov 04, 2021 8:24 pm
by FrediFizzx
Ok, back to the original question. I think I need to do some slight fixin' on it.
FrediFizzx wrote: This expression seems a bit odd to me.

E(a, b) = P(++) + P(--) - P(+-) - P(-+)

In order to get the probabilities for each of the four outcome pairs in a large simulation, they first have to be averaged over many trials per (a-b) angle. It seems to me that in a proper simulation each of the four probabilities is going to converge to 1/4 for a very large number of trials. At least that is what I am finding with our latest simulation.

Ave ++ = 0.248903
Ave -- = 0.248803
Ave +- = 0.246508
Ave -+ = 0.255786

That was for 10,000 trials. For 5 million trials,

Ave ++ = 0.249787
Ave -- = 0.249991
Ave +- = 0.250293
Ave -+ = 0.249929

Much closer to 1/4 each. So, for analytical purposes, it doesn't seem unreasonable to assign 1/4 to each of the four outcome pair probabilities.
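That convergence is easy to check independently. Here is a minimal Python stand-in (not the actual simulation code from the thread) that draws outcome pairs from the singlet-state probabilities at a uniformly random (a-b) and tallies the four frequencies:

```python
import math
import random

random.seed(1)

def sample_pair(delta):
    """Draw one (A, B) outcome pair from the singlet probabilities
    for a setting difference delta (radians):
      P(++) = P(--) = (1/2) sin^2(delta/2)
      P(+-) = P(-+) = (1/2) cos^2(delta/2)
    """
    p = 0.5 * math.sin(delta / 2) ** 2
    q = 0.5 * math.cos(delta / 2) ** 2
    r = random.random()
    if r < p:
        return (+1, +1)
    elif r < 2 * p:
        return (-1, -1)
    elif r < 2 * p + q:
        return (+1, -1)
    return (-1, +1)

n = 200_000
counts = {(1, 1): 0, (-1, -1): 0, (1, -1): 0, (-1, 1): 0}
for _ in range(n):
    delta = random.uniform(0, 2 * math.pi)  # average over all (a-b) angles
    counts[sample_pair(delta)] += 1

fractions = {k: v / n for k, v in counts.items()}
```

With this many trials each fraction lands within about a percent of 0.25, matching the tables above.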
Ok, now for the next part of this.

QM assigns for those 4 outcome probabilities,

P(++) = P(--) = (1/2) sin^2((a-b)/2)
P(+-) = P(-+) = (1/2) cos^2((a-b)/2)

Again, in a simulation with many trials, we have to average over all the (a-b) angles. Lo and behold, when we do that we obtain,

<(1/2) sin^2((a-b)/2)> = 1/4,
<(1/2) cos^2((a-b)/2)> = 1/4,

Because <sin^2((a-b)/2)> = <cos^2((a-b)/2)> = 1/2 when averaged over all angles.

So, it seems to me that, analytically speaking, all of the parts of the original E(a, b) expression are equal to 1/4.
Ok, now a question. Since all the P(++)'s, etc. are equal to 1/4 and the average of (1/2) sin^2((a-b)/2) is 1/4, etc., does that prove that P(++) = (1/2) sin^2((a-b)/2), etc. for our analytical situation? Keeping in mind that P(++) and <(1/2) sin^2((a-b)/2)>, etc. are actually averages.
______________________________________________________________________________________________________________________________
Ok, what we actually have here for the simulation is P(++) = 1/4 when averaged over the trials.
So, the question is actually this. Does,

P(++) = <(1/2) sin^2((a-b)/2)> ??

Now the only way to equate those two would be by their totals or lengths of the lists. So, let's see. For a million trials, there are 250,000 events that are (++), and there are 250,000 events predicted by (1/2) sin^2((a-b)/2), which makes sense since basically there would be a 1/4 each of the sin^2 and cos^2 probabilities. So, they match quantity-wise. But what we are actually trying to find out is this,

P(++) = (1/2) sin^2((a-b)/2). Is that true for our simulation? Now, we know that the probability of getting (++) for the simulation is 1/4 so

P(++) = 1/4.

Now, we also know the average of getting,

<(1/2) sin^2((a-b)/2)> = 1/4.

So,

P(++) = <(1/2) sin^2((a-b)/2)> = 1/4.

Then by virtue of what QM says about it, I think we can say that,

P(++) = (1/2) sin^2((a-b)/2)

for both the simulation and the analytical formulas. But maybe more proof is needed?
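Both averages in that chain are easy to verify numerically. A quick sketch (a check added here, assuming a uniform spread of (a-b) over a full turn, not code from the original post):

```python
import math

# Equally spaced angle differences d = 2*pi*k/N over one full turn.
# For sin^2 and cos^2 this Riemann sum over a whole period is
# essentially exact, so both averages come out at 0.25.
N = 100_000
avg_half_sin2 = sum(0.5 * math.sin(math.pi * k / N) ** 2 for k in range(N)) / N
avg_half_cos2 = sum(0.5 * math.cos(math.pi * k / N) ** 2 for k in range(N)) / N
# avg_half_sin2 ≈ 0.25 and avg_half_cos2 ≈ 0.25
```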
.

Re: Coming Soon!

Posted: Fri Nov 05, 2021 12:02 am
by gill1109
Great that this new forum is up and running!

Fred, in the original formula
E(a, b) = P(++) + P(--) - P(+-) - P(-+)
the probabilities are understood as conditional probabilities, or probabilities given a fixed experimental condition; i.e. P(++) is shorthand for P(++ | a,b), and similarly for the other three.

Thus:

P(++ | a,b) = N(++ | a, b) / N(a, b)

In the earliest literature where something like this formula was first used, the quantities were moreover not (estimated) probabilities (relative frequencies) but absolute frequencies. One reason for this is that in a decent experiment N(a, b) is roughly constant, so one could replace all relative frequencies with absolute frequencies.

In the earliest experiments, one also had no idea of the total number of *emissions* N(a, b) because of detector inefficiency. One made the fair sampling assumption that detection (yes or no) depends neither on hidden variables nor on settings. So one used total numbers N(++ | a, b) etc. instead of relative frequencies.

By the way, I have to apologise: your claim: “I'm claiming in the context of a simulation where x = RandomReal[0, 2 pi] is dependent on the number of trials, that if < f > = < g >, then f = g” is absolutely correct.

(A mathematician might add the phrase “with probability one” to the condition "if < f > = < g >".)

Re: Coming Soon!

Posted: Fri Nov 05, 2021 1:57 am
by FrediFizzx
@gill1109 I really don't understand this yet.

P(++ | a,b) = N(++ | a, b) / N(a, b)

What is N(++ | a, b) exactly?
.

Re: Coming Soon!

Posted: Fri Nov 05, 2021 3:02 am
by gill1109
FrediFizzx wrote: Fri Nov 05, 2021 1:57 am @gill1109 I really don't understand this yet.

P(++ | a,b) = N(++ | a, b) / N(a, b)

What is N(++ | a, b) exactly?
N(++ | a, b) would be the number of trials in which Alice sees outcome "+" and Bob sees outcome "+", when Alice has setting "a" and Bob has setting "b".

N(a, b) would be the number of all trials when Alice has setting "a" and Bob has setting "b"

Of course, different people may have different definitions of what is a "trial". I think of it as a time-slot.

Even then, if Alice's outcome could be "+", "-", or "no detection", you might only want to count those trials in which Alice and Bob both have a "+" or a "-".
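Those counting rules can be sketched in a few lines. This is an illustration with made-up trial records and setting labels (the record format is an assumption, not Fred's data layout), using None for "no detection":

```python
from collections import Counter

def tally(records):
    """One record per time-slot: (Alice setting, Bob setting, A, B).
    Returns N(xy | a, b) and N(a, b), counting only time-slots in
    which both sides actually detected something."""
    pair_counts = Counter()   # N(xy | a, b)
    slot_counts = Counter()   # N(a, b)
    for a, b, x, y in records:
        if x is None or y is None:
            continue          # drop slots without a double detection
        slot_counts[(a, b)] += 1
        pair_counts[(a, b, x, y)] += 1
    return pair_counts, slot_counts

records = [
    ("a", "b", +1, +1),
    ("a", "b", +1, -1),
    ("a", "b", None, +1),   # Alice: no detection -> excluded
    ("a", "b", +1, +1),
]
pair_counts, slot_counts = tally(records)
p_pp = pair_counts[("a", "b", +1, +1)] / slot_counts[("a", "b")]  # 2/3
```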

Re: Coming Soon!

Posted: Fri Nov 05, 2021 4:33 am
by FrediFizzx
gill1109 wrote: Fri Nov 05, 2021 3:02 am
FrediFizzx wrote: Fri Nov 05, 2021 1:57 am @gill1109 I really don't understand this yet.

P(++ | a,b) = N(++ | a, b) / N(a, b)

What is N(++ | a, b) exactly?
N(++ | a, b) would be the number of trials in which Alice sees outcome "+" and Bob sees outcome "+", when Alice has setting "a" and Bob has setting "b".

N(a, b) would be the number of all trials when Alice has setting "a" and Bob has setting "b"

Of course, different people may have different definitions of what is a "trial". I think of it as a time-slot.

Even then, if Alice's outcome could be "+", "-", or "no detection", you might only want to count those trials in which Alice and Bob both have a "+" or a "-".
That is sort of like my pp2.

Code:

{{0, 1}, {0, 2}, {0, 3}, {0, 4}, {0, 5}, {1, 6}, {1, 7}, {2, 8}, {2, 
  9}, {3, 10}, {8, 11}, {5, 12}, {10, 13}, {14, 14}, {13, 15}, {9, 
  16}, {17, 17}, {11, 18}, {22, 19}, {16, 20}, {26, 21}, {36, 
  22}, {48, 23}, {38, 24}, {48, 25}, {47, 26}, {62, 27}, {68, 
  28}, {64, 29}, {88, 30}, {63, 31}, {97, 32}, {88, 33}, {102, 
  34}, {103, 35}, {121, 36}, {125, 37}, {156, 38}, {171, 39}, {157, 
  40}, {164, 41}, {181, 42}, {211, 43}, {218, 44}, {242, 45}, {234, 
  46}, {206, 47}, {273, 48}, {289, 49}, {308, 50}, {269, 51}, {323, 
  52}, {345, 53}, {384, 54}, {362, 55}, {325, 56}, {360, 57}, {396, 
  58}, {446, 59}, {452, 60}, {488, 61}, {510, 62}, {474, 63}, {556, 
  64}, {541, 65}, {575, 66}, {548, 67}, {530, 68}, {574, 69}, {647, 
  70}, {674, 71}, {648, 72}, {656, 73}, {690, 74}, {814, 75}, {771, 
  76}, {814, 77}, {804, 78}, {785, 79}, {862, 80}, {838, 81}, {896, 
  82}, {912, 83}, {892, 84}, {940, 85}, {962, 86}, {950, 87}, {1010, 
  88}, {1027, 89}, {998, 90}, {1045, 91}, {1089, 92}, {1017, 
  93}, {1082, 94}, {1109, 95}, {1113, 96}, {1145, 97}, {1229, 
  98}, {1116, 99}, {1180, 100}, {1216, 101}, {1179, 102}, {1255, 
  103}, {1280, 104}, {1284, 105}, {1216, 106}, {1286, 107}, {1240, 
  108}, {1228, 109}, {1301, 110}, {1333, 111}, {1282, 112}, {1356, 
  113}, {1261, 114}, {1355, 115}, {1306, 116}, {1346, 117}, {1369, 
  118}, {1330, 119}, {1365, 120}, {1455, 121}, {1341, 122}, {1350, 
  123}, {1298, 124}, {1374, 125}, {1413, 126}, {1395, 127}, {1340, 
  128}, {1423, 129}, {1364, 130}, {1308, 131}, {1306, 132}, {1353, 
  133}, {1327, 134}, {1395, 135}, {1320, 136}, {1308, 137}, {1320, 
  138}, {1346, 139}, {1388, 140}, {1248, 141}, {1316, 142}, {1262, 
  143}, {1212, 144}, {1245, 145}, {1205, 146}, {1217, 147}, {1161, 
  148}, {1143, 149}, {1142, 150}, {1163, 151}, {1037, 152}, {1040, 
  153}, {1072, 154}, {999, 155}, {957, 156}, {920, 157}, {966, 
  158}, {899, 159}, {830, 160}, {850, 161}, {791, 162}, {732, 
  163}, {673, 164}, {706, 165}, {622, 166}, {602, 167}, {560, 
  168}, {559, 169}, {475, 170}, {470, 171}, {413, 172}, {316, 
  173}, {344, 174}, {236, 175}, {233, 176}, {174, 177}, {159, 
  178}, {88, 179}, {42, 180}, {13, 181}, {55, 182}, {122, 183}, {146, 
  184}, {195, 185}, {240, 186}, {283, 187}, {371, 188}, {365, 
  189}, {442, 190}, {417, 191}, {462, 192}, {515, 193}, {569, 
  194}, {673, 195}, {642, 196}, {660, 197}, {677, 198}, {725, 
  199}, {805, 200}, {799, 201}, {901, 202}, {866, 203}, {916, 
  204}, {979, 205}, {1005, 206}, {1015, 207}, {1044, 208}, {1049, 
  209}, {1064, 210}, {1104, 211}, {1147, 212}, {1101, 213}, {1193, 
  214}, {1179, 215}, {1208, 216}, {1209, 217}, {1184, 218}, {1300, 
  219}, {1321, 220}, {1319, 221}, {1306, 222}, {1266, 223}, {1376, 
  224}, {1328, 225}, {1319, 226}, {1373, 227}, {1385, 228}, {1335, 
  229}, {1394, 230}, {1308, 231}, {1398, 232}, {1252, 233}, {1345, 
  234}, {1400, 235}, {1355, 236}, {1383, 237}, {1407, 238}, {1383, 
  239}, {1271, 240}, {1378, 241}, {1377, 242}, {1401, 243}, {1332, 
  244}, {1354, 245}, {1396, 246}, {1324, 247}, {1320, 248}, {1323, 
  249}, {1323, 250}, {1339, 251}, {1273, 252}, {1351, 253}, {1291, 
  254}, {1269, 255}, {1300, 256}, {1278, 257}, {1272, 258}, {1267, 
  259}, {1175, 260}, {1215, 261}, {1190, 262}, {1174, 263}, {1132, 
  264}, {1130, 265}, {1129, 266}, {1108, 267}, {1052, 268}, {1128, 
  269}, {1011, 270}, {1081, 271}, {1075, 272}, {1016, 273}, {952, 
  274}, {949, 275}, {974, 276}, {836, 277}, {924, 278}, {878, 
  279}, {909, 280}, {879, 281}, {865, 282}, {820, 283}, {779, 
  284}, {826, 285}, {728, 286}, {757, 287}, {782, 288}, {725, 
  289}, {663, 290}, {657, 291}, {660, 292}, {637, 293}, {678, 
  294}, {569, 295}, {579, 296}, {564, 297}, {542, 298}, {522, 
  299}, {492, 300}, {484, 301}, {465, 302}, {455, 303}, {437, 
  304}, {391, 305}, {411, 306}, {383, 307}, {369, 308}, {331, 
  309}, {304, 310}, {308, 311}, {304, 312}, {274, 313}, {226, 
  314}, {235, 315}, {235, 316}, {230, 317}, {216, 318}, {222, 
  319}, {190, 320}, {191, 321}, {177, 322}, {147, 323}, {137, 
  324}, {138, 325}, {126, 326}, {120, 327}, {103, 328}, {84, 
  329}, {84, 330}, {74, 331}, {77, 332}, {66, 333}, {66, 334}, {48, 
  335}, {45, 336}, {48, 337}, {44, 338}, {40, 339}, {37, 340}, {34, 
  341}, {27, 342}, {25, 343}, {13, 344}, {17, 345}, {16, 346}, {6, 
  347}, {8, 348}, {5, 349}, {3, 350}, {3, 351}, {3, 352}, {2, 
  353}, {1, 354}, {1, 355}, {1, 356}, {0, 357}, {0, 358}, {0, 
  359}, {0, 360}, {0, 361}}
That didn't work so well; I should have taken a picture. The first number of each pair is how many ++'s for the (a-b) angle given by the second number. Well, I can easily do "a" for the second number and "b" for the third.

Here is a picture that is perhaps easier to understand. The counts are large on some bins because there were 1 million trials.

[Image: plot of ++ counts per one-degree (a-b) bin]

The total of the ++'s is 251,613 over all "a" and "b", which is what we would expect: about 1/4 of the trials.
.

Re: Coming Soon!

Posted: Fri Nov 05, 2021 7:14 am
by gill1109
OK, you showed me the 361 pairs of numbers which I would denote by ( N(++ | d ), d ) where d = a - b runs from 1 to 361.

Add the counts all together and I would denote the total by N(++).

Re: Coming Soon!

Posted: Fri Nov 05, 2021 8:13 am
by FrediFizzx
gill1109 wrote: Fri Nov 05, 2021 7:14 am OK, you showed me the 361 pairs of numbers which I would denote by ( N(++ | d ), d ) where d = a - b runs from 1 to 361.

Add the counts altogether and I would denote that by N(++).
It is not RUNS 1 to 361. Those are the (a-b) angles at one degree increments (IOW, bins). The run was still a million trials. We are counting ++'s per angle.

Going back to P(++ | a,b) = N(++ | a, b) / N(a, b): P(++ | a,b) is also an average, besides being a probability.
.

Re: Coming Soon!

Posted: Fri Nov 05, 2021 3:30 pm
by Justo
gill1109 wrote: Fri Nov 05, 2021 12:02 am Great that this new forum is up and running!

Fred, in the original formula
E(a, b) = P(++) + P(--) - P(+-) - P(-+)
the probabilities are understood as conditional probabilities, or probabilities given a fixed experimental condition; i.e. P(++) is shorthand for P(++ | a,b), and similarly for the other three.

Thus:

P(++ | a,b) = N(++ | a, b) / N(a, b)

In the earliest literature where something like this formula was first used, they were moreover not (estimated) probabilities (relative frequencies), but absolute frequencies. One reason for this is that in a decent experiment, N(a, b) is roughly constant. So one could replace all relative frequencies with absolute frequencies.
Ok. However, to test the inequality you need mean values. How do you calculate means from absolute frequencies?
gill1109 wrote: Fri Nov 05, 2021 12:02 am By the way, I have to apologise: your claim: “I'm claiming in the context of a simulation where x = RandomReal[0, 2 pi] is dependent on the number of trials, that if < f > = < g >, then f = g” is absolutely correct.

(A mathematician might add the phrase “with probability one” to the condition "if < f > = < g >".)
I do not understand this. "<f> = <g>, then f = g" is obviously false in general. Of course, that does not rule out f = g happening in a particular case. But why should we have f(x) = g(x) in this particular case?

Re: Coming Soon!

Posted: Fri Nov 05, 2021 4:17 pm
by FrediFizzx
Justo wrote: Fri Nov 05, 2021 3:30 pm
gill1109 wrote: Fri Nov 05, 2021 12:02 am By the way, I have to apologise: your claim: “I'm claiming in the context of a simulation where x = RandomReal[0, 2 pi] is dependent on the number of trials, that if < f > = < g >, then f = g” is absolutely correct.

(A mathematician might add the phrase “with probability one” to the condition "if < f > = < g >".)
I do not understand this. "<f>=<g>, then f=g" is obviously false in general. Of course, it does not mean that it could happen f=g in a particular case. Why we should have f(x)=g(x) for this particular case?
Akkk!! Same mistake everyone else made. "in general". We are not talking about "in general".
.

Re: Coming Soon!

Posted: Fri Nov 05, 2021 10:40 pm
by gill1109
Justo wrote: Fri Nov 05, 2021 3:30 pm I do not understand this. "<f>=<g>, then f=g" is obviously false in general. Of course, it does not mean that it could happen f=g in a particular case. Why we should have f(x)=g(x) for this particular case?
Fred is considering the case when X1 … Xn are a random sample from a uniform distribution. Think in particular of the case n = 1. Then <f> = f(X) where X is uniform on [0, 2 pi]. If <f> = <g> with probability 1 (I.e., with certainty!), then f = g almost everywhere. Under some regularity conditions (e.g., f and g are continuous) then f = g everywhere.

I didn’t figure out a proof for larger “n” but I’m sure it’s true, up to some sensible regularity assumptions.

Re: Coming Soon!

Posted: Sat Nov 06, 2021 4:33 am
by Justo
gill1109 wrote: Fri Nov 05, 2021 10:40 pm
Justo wrote: Fri Nov 05, 2021 3:30 pm I do not understand this. "<f>=<g>, then f=g" is obviously false in general. Of course, it does not mean that it could happen f=g in a particular case. Why we should have f(x)=g(x) for this particular case?
Fred is considering the case when X1 … Xn are a random sample from a uniform distribution. Think in particular of the case n = 1. Then <f> = f(X) where X is uniform on [0, 2 pi]. If <f> = <g> with probability 1 (I.e., with certainty!), then f = g almost everywhere. Under some regularity conditions (e.g., f and g are continuous) then f = g everywhere.

I didn’t figure out a proof for larger “n” but I’m sure it’s true, up to some sensible regularity assumptions.
Now it makes sense to me. Since I was not the only one who did not understand I think the problem was the way he explained it.

Re: Coming Soon!

Posted: Sat Nov 06, 2021 5:17 am
by FrediFizzx
Justo wrote: Sat Nov 06, 2021 4:33 am Now it makes sense to me. Since I was not the only one who did not understand I think the problem was the way he explained it.
Justo, apparently you need to actually read and understand more of what I say. It was fully explained before this new forum was started. Please pay closer attention if you are going to comment on something. I am really tired of seeing you post nonsense and have to call you out for it. But occasionally you do have some good things to say.
.

Re: Coming Soon!

Posted: Sat Nov 06, 2021 11:32 am
by gill1109
gill1109 wrote: Fri Nov 05, 2021 10:40 pm
Justo wrote: Fri Nov 05, 2021 3:30 pm I do not understand this. "<f>=<g>, then f=g" is obviously false in general. Of course, it does not mean that it could happen f=g in a particular case. Why we should have f(x)=g(x) for this particular case?
Fred is considering the case when X1 … Xn are a random sample from a uniform distribution. Think in particular of the case n = 1. Then <f> = f(X) where X is uniform on [0, 2 pi]. If <f> = <g> with probability 1 (I.e., with certainty!), then f = g almost everywhere. Under some regularity conditions (e.g., f and g are continuous) then f = g everywhere.

I didn’t figure out a proof for larger “n” but I’m sure it’s true, up to some sensible regularity assumptions.
Here’s a proof in the general case. Suppose f and g are not equal almost everywhere. Then there is a set of positive probability where f > g, or a set of positive probability where f < g. Consider the first case. There is then, for any n, positive probability that all of X1, X2, .., Xn lie in this set, consequently that <f> > <g>. In the second case we get a similar result for <f> < <g>. In either case: <f> is not certain to equal <g>.

Conclusion: f and g are essentially the same function if and only if <f> = <g> is certainly true.
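The positive-probability step in that proof can be illustrated with a small Monte Carlo demo (added here as an illustration, not part of gill1109's argument). Take f and g with equal expectations that differ except on a measure-zero set; with a sample of size n = 1 the "averages" are just f(X) and g(X), and they almost never agree:

```python
import math
import random

random.seed(42)

# f and g both have expectation 1/2 over a uniform angle, but they
# agree only at x = pi/4 + k*pi/2, a set of probability zero.
f = lambda x: math.sin(x) ** 2
g = lambda x: math.cos(x) ** 2

trials = 1000
mismatches = 0
for _ in range(trials):
    x = random.uniform(0, 2 * math.pi)
    if abs(f(x) - g(x)) > 1e-6:   # |f - g| = |cos(2x)| > 0 almost surely
        mismatches += 1
# mismatches comes out at (essentially) all 1000 draws: equal
# expectations do not make <f> = <g> hold with probability one,
# which is exactly the contrapositive of the conclusion above.
```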

Re: Coming Soon!

Posted: Sat Nov 06, 2021 1:01 pm
by FrediFizzx
I don't think we have had the simple version of the awesome simulation with 3D vectors. So, here it is with no quaternions for those that are challenged by quaternions. :D 5 million trials; one degree resolution. Though we lose some explanation power without quaternions.

[Image: plot of the simulated correlation vs. (a-b) for 5 million trials]

A thing of real beauty. If that ain't convergin' to -a.b, I'll eat my hat, except I don't have one. :D

Cloud File.

https://www.wolframcloud.com/obj/fredif ... e-forum.nb

Direct Files.

download/newCS-34--3D-simple-forum.pdf
download/newCS-34--3D-simple-forum.nb

But even this more simple version blows up to a jillion pieces Bell's junk physics theory and Gill's junk theory! Enjoy!!! :mrgreen: :mrgreen: :mrgreen:
.

Re: Coming Soon!

Posted: Sat Nov 06, 2021 11:50 pm
by gill1109
FrediFizzx wrote: Sat Nov 06, 2021 6:49 pm Here is 10 million trials to go with the 5 million trials.

[Image: plot of the 10-million-trial correlation vs. (a-b)]

WoW! You Bell fanatics are doomed for good. :mrgreen: :mrgreen: :mrgreen:
.
I'm happy for you, Fred!

How's the paper on this model coming on?

Re: Re: Coming Soon!

Posted: Sun Nov 07, 2021 1:40 am
by FrediFizzx
@gill1109 Thanks. Paper is in a lull right now. It will NOT be coming soon after all.

So if there are any non-nonsense questions or comments, NOW is the time to ask.
.

Re: Coming Soon!

Posted: Sun Nov 07, 2021 1:23 pm
by FrediFizzx
Here is 10 million trials to go with the 5 million trials.

[Image: plot of the 10-million-trial correlation vs. (a-b)]

WoW! Is it SCREAMING -a.b or what? You Bell fanatics are doomed for good. :mrgreen: :mrgreen: :mrgreen:
.

Re: Coming Soon!

Posted: Mon Nov 08, 2021 10:07 pm
by gill1109
FrediFizzx wrote: Fri Nov 05, 2021 8:13 am
gill1109 wrote: Fri Nov 05, 2021 7:14 am OK, you showed me the 361 pairs of numbers which I would denote by ( N(++ | d ), d ) where d = a - b runs from 1 to 361.

Add the counts altogether and I would denote that by N(++).
It is not RUNS 1 to 361. Those are the (a-b) angles at one degree increments (IOW, bins). The run was still a million trials. We are counting ++'s per angle.

Going back to P(++ | a,b) = N(++ | a, b) / N(a, b), P(++ | a,b) is an average also besides being a probability.
.
I said “d runs from 1 to 361”.
That means: d takes successively the values 1, 2, …, 361.
I know that d is a - b.

Yes, P(++ | a,b) is an average, and it is a relative frequency. A statistical estimate of a probability.

Yes, your graph screams negative cosine. Try blowing up the differences by a factor of the square root of the number of trials (i.e., the number of trials for each data point, i.e. the numbers in the denominator of each fraction). If the result looks like random noise, you could be on target now.

Re: Coming Soon!

Posted: Mon Nov 08, 2021 11:20 pm
by FrediFizzx
gill1109 wrote: Mon Nov 08, 2021 10:07 pm
FrediFizzx wrote: Fri Nov 05, 2021 8:13 am
gill1109 wrote: Fri Nov 05, 2021 7:14 am OK, you showed me the 361 pairs of numbers which I would denote by ( N(++ | d ), d ) where d = a - b runs from 1 to 361.

Add the counts altogether and I would denote that by N(++).
It is not RUNS 1 to 361. Those are the (a-b) angles at one degree increments (IOW, bins). The run was still a million trials. We are counting ++'s per angle.

Going back to P(++ | a,b) = N(++ | a, b) / N(a, b), P(++ | a,b) is an average also besides being a probability.
.
I said “d runs from 1 to 361”.
That means: d takes successively the values 1, 2, …, 361.
I know that d is a - b.

Yes, P(++ | a,b) is an average, and it is a relative frequency. A statistical estimate of a probability.

Yes, your graph screams negative cosine. Try blowing up the differences by a factor of the square root of the number of trials (i.e., the number of trials for each data point, i.e. the numbers in the denominator of each fraction). If the result looks like random noise, you could be on target now.
I don't need to do that. I'm on target as good as one can probably get using +/-1's. Remember, it is impossible to produce a continuous wave with +/-1's. You can always find a hole somewhere in-between data points. Diether-Gull law. :D

Re: Re: Coming Soon!

Posted: Tue Nov 09, 2021 11:33 am
by gill1109
You are nicely on target, I agree. The next thing I want to see are the errors, blown up by a factor sqrt N.

Here is a perfect simulation of the EPR-B correlations, with setting differences 0, 1, ..., 360 degrees. I use the same number of trials, N, for each angle difference. I did the experiment with N = 1e02, 1e04 and 1e06.

https://rpubs.com/gill1109/epr-b

I blew up estimate minus truth by sqrt N. Notice one gets to see what looks like pure noise, roughly of the same size however large N is, but with an amplitude which has a pretty shape. Any statistician can write down the formula for you. I will add that later. It will be fun to see if your "noise" looks like what it ought to look like!
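The sqrt-N blow-up can be sketched as follows. This is a Python stand-in for the idea (the details are assumptions on my part, not a copy of the R notebook at the rpubs link): for each setting difference, draw n product outcomes A·B = ±1 with P(A·B = +1) = sin²(d/2), so that E[A·B] = -cos(d), then scale estimate minus truth by sqrt(n).

```python
import math
import random

random.seed(7)

def estimate_E(delta_deg, n):
    """Estimate E for one setting difference from n trials drawn
    exactly from the singlet probabilities; truth is -cos(delta)."""
    d = math.radians(delta_deg)
    p_plus = math.sin(d / 2) ** 2          # P(A*B = +1) = P(++) + P(--)
    s = sum(1 if random.random() < p_plus else -1 for _ in range(n))
    return s / n

n = 10_000
scaled_residuals = []
for deg in range(0, 361, 10):
    err = estimate_E(deg, n) - (-math.cos(math.radians(deg)))
    scaled_residuals.append(math.sqrt(n) * err)   # blow up by sqrt(N)
# The scaled residuals stay O(1) however large n is; their standard
# deviation at difference d is sqrt(1 - cos^2(d)) = |sin(d)| -- the
# amplitude with the "pretty shape" mentioned above.
```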