
Search for Future Influence from L.H.C.

arXiv:0707.1919v2 [hep-ph] 16 Jul 2007

Holger B. Nielsen 1
The Niels Bohr Institute, University of Copenhagen, DK-2100 Copenhagen, Denmark, and

Masao Ninomiya 2
Yukawa Institute for Theoretical Physics, Kyoto University, Kyoto 606-8502, Japan

Abstract

We propose an experiment which consists of pulling a card and using it to decide restrictions on the running of L.H.C. at CERN, such as luminosity, beam energy, or a total shutdown. The purpose of such an experiment is to look for influence from the future, backward causation. Since L.H.C. shall produce particles of a mathematically new type of fundamental scalars, i.e. the Higgs particles, there is potentially a chance to find hitherto unseen effects, such as an influence going from the future to the past, which we suggest in the present paper.

1 On leave of absence to CERN, Geneva, from 1 Aug. 2007 to 31 May 2008.
2 Also working at Okayama Institute for Quantum Physics, Kyoyama 1, Okayama 700-0015, Japan.

Usually it is believed, by causality, that backward causation [1], in the sense of what happens at a later time influencing what happens earlier, does not occur. But each time we pass a new energy scale, so as to produce for example a type of particle with new mathematical properties, we should retest the principles that worked well in earlier experiments. This model of ours is one for the initial conditions of the Universe, in a similar way as the no-boundary initial condition postulated by Hartle and Hawking [2]. We suggest that our theoretical model building [3-5] especially calls for such an experiment. When the Higgs particle is produced, we shall retest whether there could be influence from the future so that, for instance, the potential production of a large number of Higgs particles in a certain time development would cause a prearrangement by which the large number of Higgs productions is avoided. Such prearrangements may be considered influence from the future. One of us (H.B.N.) has contemplated over the past several years the idea of influence from the future in other settings [6, 7]. One also finds such future influence on effective coupling constants in Baby Universe theories [8-11], and in some models behind the multiple point principle [7, 12, 13]. We ourselves used the baby universe theory [14].

Our model with imaginary part of action


In previous publications we described our model by simply putting up a functional called P[path], depending on the path (which could be most easily chosen as a classical path of all the fields in the universe), that denotes the probability of just this path being realized. The idea should be that this P should be calculable from some physically reasonable formula from the path. Since we would like P to depend on the path in a way that obeys the usual physical symmetries and the principle of locality in space-time, it is expected to be of such a form as

P \propto e^{-2 S_I[\mathrm{path}]}     (1)

where S_I[path] has a similar form as that of the action,

S_I[\mathrm{path}] = \int L_I(x) \sqrt{-g}\, d^4x .     (2)

The most elegant formulation of our model is simply to allow the action S[path] in the Feynman path-way integral \int e^{iS[\mathrm{path}]} \mathcal{D}[\mathrm{path}] to be complex,

S[\mathrm{path}] = S_R[\mathrm{path}] + i S_I[\mathrm{path}] ,     (3)

and then assume that the imaginary part of the Lagrangian density L_I(x) in (2) has much the same form as the real part L_R(x) in, say, the Standard Model, or some extension thereof. Only the coefficients, such as \frac{1}{2g_N^2}, Z, m^2, etc., of the various terms

-\frac{1}{2g_N^2} F^a_{\mu\nu}(x) F^{a\,\mu\nu}(x) , \qquad Z_\psi\, \bar\psi \slashed{D} \psi , \qquad \ldots     (4)

etc. are not the same for L_I as for L_R, but the forms of the field dependences are the same; only the renormalization factors Z and the other coupling constants differ between L_R and L_I.

At first sight a model of this type seems to be obviously false, since e^{-2 S_I[\mathrm{path}]}, which gives the probability of the development path of the universe, would a priori depend strongly on what goes on today or later. Thus such effects would look as if the universe were prearranged to achieve various goals, namely those that would obtain the largest possible negative contributions to S_I. However, we believe to have found some arguments that the importance of the inflation era should be much bigger than that of the present era in selecting the path to be realized, and hence the dependence on what goes on today is strongly suppressed. In this way we claim to obtain the second law of thermodynamics out of our model. The governing of the development of the universe so as to avoid production of Higgs particles is suggested in our previous works to be the rather tiny left-over effect of this general government that makes S_I as negative as possible. We have already put forward a model for unifying the equations of motion and the choice of the initial conditions, or better the selection of the solutions of the equations of motion to be realized. It is at least some unification to get the selection of the solution to be realized by some law. The very unusual feature of this type of model is that such an imaginary part of the action,

S_I = \int L_I \, dt ,     (5)

which leads to a probability weighting

e^{-2 S_I} ,     (6)

depends not only on the happenings in the very first moment of the existence of the Universe, but also on what happens at all times. If we did not provide detailed speculations that the main effect in selecting the solution to be realized comes from the big bang era, our model would be falsified by the upper bound on the occurrence of prearranged events, or say by the second law of thermodynamics. We have, however, some rather naturally working mechanisms [3, 4] that can make the effects of the imaginary part of the action negligible under some conditions that likely prevail until L.H.C. starts colliding beams. In fact we have in earlier articles [3, 4] argued that the imaginary part L_I of the Lagrangian would be constant - and thus unimportant - in the following [3] cases:

1) If particles are either (a) nonrelativistic and conserved, or (b) massless (i.e. say photons), then the S_I-effect will be negligible.

2) Even with relativistic particles the effect of S_I vanishes provided the Lagrangian L_R + iL_I is homogeneous in a field type and has only one, by symmetry restrictions, independent coefficient; for example, the Standard Model quark and lepton Lagrangian with SU(3) x SU(2) x U(1) is homogeneous of second order in the quark and lepton fields, and we have only one term until the Yukawa couplings are added. But the latter can be chirally transformed to have no independent phase relative to one of the kinetic terms, say the right-handed one.

3) In addition, the imaginary part of the Lagrangian L_I for a Yang-Mills theory gets forbidden provided there exist monopoles [3, 15].

In daily life the point 1) is enough to suppress the effects of influence from the future, via L_I, so that no prearrangements would occur strongly there. However, high energy physics machines with their relativistic particles could, if it were only for 1), influence their past, and for instance such influence could have meant that these machines would have been met with bad luck by prearrangement and got their funds cut so as not to work. Seemingly there were no such effects of bad luck for relativistic accelerators such as ISR, wherein the particles were even stored for long times. To rescue our model from being already falsified by ISR etc., we could, however, make the in our opinion very mild speculation that fundamentally there exist magnetic monopoles [15] as far as allowed for the Yang-Mills fields in the Standard Model. Such monopoles, together with the remark that the Lagrangian of the fermions - quarks and leptons - is homogeneous in the fermion fields, could by means of 2) and 3) provide the argument that even for the high energy experiments so far no effects of bad or good luck should have been observed. However, the Higgs particle is the first fundamental scalar to be investigated, and the arguments 1), 2), 3) above may very likely not be sufficient for eliminating influence-from-the-future effects concerned with the Higgs. Thus it is really not unrealistic that precisely at the first production of a large number of Higgses our model expectation, that is influence from the future, would show up. Very interesting in this connection is that the S.S.C. in Texas [16] would accidentally have been the first machine to produce Higgses on a large scale. However, it was actually stopped after a quarter of the tunnel was built - almost a remarkable piece of bad luck.

Proposal of the experiment


If we just very generally consider a model in which the probability P(s) = e^{-2 S_I(s)} for a solution s to the equations of motion to be realized is a function of what happens on this solution s at all different times t, we should be able to see influences from the future. If, as is suggested above, P(s) would depend on whether Higgs particles are produced or not in large amounts during the world development s, then the actually realized development would seem to either seek or avoid Higgs production. It seems most likely that production of Higgs particles should lead to smaller P(s) than no Higgs production, since otherwise there would presumably already have been produced lots of Higgs particles in nature somehow. With this model we expect that a Higgs producing machine will be stopped by some accident or another if the effect is sufficiently large, and it gets exponentiated with the number of Higgses produced:

P(s_{\mathrm{with\ machine}}) \approx C^{\#\mathrm{Higgses}}\, P(s_{\mathrm{without}}) .     (7)
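To get a feeling for how strong the exponentiation in (7) is, here is a small numerical illustration; the per-Higgs factor C and the number of Higgses are purely hypothetical values chosen for the example, not numbers taken from the text:

C^{\#\mathrm{Higgses}} = 0.99^{10^{6}} = e^{10^{6}\ln 0.99} \approx e^{-1.0\times 10^{4}} ,

so even a suppression factor per Higgs very close to unity makes a development with millions of Higgses overwhelmingly improbable compared with one producing none.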

The experiment proposed in the present article is to give the foresight, so to speak, a chance of avoiding having to close L.H.C. by some funding cut or other bad-luck accident, as happened to the S.S.C., by instead playing a game of pulling a card from a well mixed stack concerning the running of L.H.C.. We have in mind that one produces a big stack of cards on which there are written various restrictions for the running of L.H.C., such as for example "allow to produce only 10 Higgs particles". On most of the cards there should just be written "use L.H.C. freely", so that they cause no restrictions. But on a very small fraction of the cards there should be restrictions on luminosity or beam energies or some combination. On one card one may even have "close (shut down) L.H.C.". The crucial idea of this proposal is that if our model were true, then the most likely development s - with the P(s) \propto e^{-2 S_I(s)} factor included - would be a development in which one of the cards with strong restrictions on the Higgs production at L.H.C. is pulled.
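A minimal sketch of the card-drawing step, in Python, just to make the proposed protocol concrete; the probabilities used here (p_close, p_restrict) are illustrative placeholders and not values fixed by the paper:

import random

def draw_card(p_close=5e-6, p_restrict=1e-5, seed=None):
    # One draw from the proposed stack: p_close is the chance of the single
    # "close L.H.C." card, p_restrict the combined chance of the milder
    # restriction cards; all remaining cards say "use L.H.C. freely".
    rng = random.Random(seed)
    u = rng.random()
    if u < p_close:
        return "close L.H.C."
    if u < p_close + p_restrict:
        return "restrict luminosity / beam energy"
    return "use L.H.C. freely"

# A single draw decides the rules of the run; under ordinary probability
# theory a restrictive card is essentially never seen.
print(draw_card(seed=2007))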

Estimation of probabilities and choice of closing L.H.C. probability for game


We think that before settling the rules of the game of cards about restricting L.H.C., one should carefully discuss what is the most economical and optimal value to choose for, for instance, the probability of the "close L.H.C." card. Thus we suggest a serious discussion before settling it, of course limited in time by the starting plans. In order to give an idea about what probability p to choose for closing L.H.C. - or essentially closing; we postpone the partial closings or milder restrictions to the next section - we shall introduce the following symbols for the relevant probabilities:

r: the probability that our model is right, so that there is a prearrangement mechanism ensuring that L.H.C. will not come to run.

a: the probability that, without any such mysterious interference, L.H.C. will by accident fail so as not to start.

d: the average excess damage occurring under an accidental bad-luck event preventing L.H.C. from working, relative to the value of L.H.C. itself.

p: the probability of the "close L.H.C." card, or an essentially "close L.H.C." card, in the stack. (We can choose p.)

The numbers r, a and p should of course be very small, whereas the excess average damage d presumably is of order unity. One could, however, estimate that this extra damage presumably involves even human lives, so that several people may be killed during some explosion stopping L.H.C.. In that case the damage could turn out appreciably more severe than the pure loss of L.H.C. itself. So we might take the loss d - measured in units of the value of L.H.C. - to be, say, an order of magnitude bigger than the value of L.H.C. itself, i.e. we could reasonably take d \approx 10.

Taking it that r, a and p are all small, the most likely outcome is of course that L.H.C. just starts up quite as expected; there is only the small probability a that it has a normal accident, and the probability p of an accidental stop due to the card "close L.H.C.". In the case - with probability r - that our model is right, L.H.C. cannot be allowed to start up. But now it can fail in two ways: with probability \frac{p}{a+p} r it will be stopped by the card "close L.H.C.", and with probability \frac{a}{a+p} r it will have a seemingly normal accident, with likely extra damage given by the factor d. To give these probability estimates we used that the two types of stopping should occur with relative probability a : p, just as if our theory were wrong.

We can now estimate the average cost due to the various failures in the natural units of L.H.C. values. Let us indeed denote by C the average loss due to severe failure, in units of the price, say 3.2 \times 10^9 CHF (Swiss Francs), of L.H.C. itself:

C = p + a(d+1) + r\left(\frac{p}{a+p} + \frac{a}{a+p}(d+1)\right) .     (8)

Here we took the loss (d+1) by a seemingly natural failure as the excess loss d plus the loss 1 of the machine itself. Simplifying, we get

C = \bigl(p + a(d+1)\bigr)\left(1 + \frac{r}{a+p}\right) .     (9)

Since p is at our disposal, one would say that we should choose it for ethical and economical reasons so as to minimize the loss in L.H.C.-price units. This minimization occurs for

\frac{\partial C}{\partial p} = \left(1 + \frac{r}{a+p}\right) + \bigl(p + a(d+1)\bigr)\left(-\frac{r}{(a+p)^2}\right) = 1 - \frac{ard}{(a+p)^2} = 0 ,     (10)

which leads to

p^2 + 2pa + a^2 - ard = 0 ,     (11)

which has the roots

p = -a \pm \sqrt{ard} .     (12)

Of course we have to let the chance for the closing card p be non-negative, and thus we can only choose p \geq 0. If

\sqrt{ard} < a , \quad \text{i.e.} \quad r < \frac{a}{d} ,     (13)

i.e. if the chance r for our theory to be right is less than the chance for a natural failure of L.H.C. divided by the excess damage factor d, it would not be optimal to make our card game for any possible value, and it would cause damage to perform our experiment; one should then only do it in order to get our theory confirmed (or falsified). If, however, one judges that the chance of correctness of our model is so big that

r > \frac{a}{d} ,     (14)

then it would be uneconomic and unethical not to perform our card game. In this case, supposing we compute only with orders of magnitude, the optimal value of p to choose for damage-avoidance purposes should be

p = -a + \sqrt{ard} \approx \sqrt{ard} .     (15)

But under this assumption of it being relevant, \sqrt{ard} \gtrsim a. This means what is obvious anyway: unless the chance p of pulling "close L.H.C." is at least as big, order-of-magnitude-wise, as the chance of a normal failure of L.H.C., it is of no use for preventing damage to perform our proposed card game. Also, of course, it means that if we choose p \ll a, then even if our theory were right it would stop L.H.C. by a normal failure rather than by our card game.
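As a small numerical sketch of this optimization: the functions below just encode eqs. (9) and (12), while the input values of r, a and d are illustrative assumptions, not numbers advocated in the text.

import math

def expected_cost(p, r, a, d):
    # Average loss C in units of the L.H.C. price, eq. (9):
    # C = (p + a*(d+1)) * (1 + r/(a+p)).
    return (p + a * (d + 1)) * (1.0 + r / (a + p))

def optimal_p(r, a, d):
    # Positive root p = -a + sqrt(a*r*d) of eq. (11); only meaningful
    # when r > a/d, otherwise no positive optimum exists (eq. (13)).
    p = -a + math.sqrt(a * r * d)
    return max(p, 0.0)

# Illustrative numbers (assumptions, not the paper's): a = 1e-6, d = 10, r = 1e-4.
r, a, d = 1e-4, 1e-6, 10
p_star = optimal_p(r, a, d)
print(p_star, expected_cost(p_star, r, a, d), expected_cost(0.0, r, a, d))

With these example inputs the card game lowers the expected loss by roughly an order of magnitude compared with not playing it at all (p = 0).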

Consideration on checking our model


The purpose of doing the proposed card play about potential restrictions or closing of L.H.C. is, in addition to the economical or ethical attempt to rescue L.H.C. from even worse fates, to make a very clean test of our model. Crudely, a superficially normal accident stopping L.H.C. from starting would already be strong evidence for our model. However, it would be numerically even cleaner if it were stopped by the card play. Then one would have a very clean knowledge of the statistical accuracy with which our prearrangement effect had worked and had been tested. To know in advance a good estimate for the probability a that L.H.C. would suffer an ordinary failure is not so easy. Therefore one could more easily talk away such a natural failure and say that in spite of it one should not trust our theory: "Oh, it was some diplomatic bad accident." With a card play, on the other hand, the showing up of a single special card among 2 million cards could only be achieved either by a card magician or by a model like ours. Of course in principle a statistical freak would be possible, but in practice not!

But in order that our model be safely confirmed we have to choose p so small that a pulling of "close L.H.C." would indeed be convincing. To suggest a number for that we may recall that the discovery of the Higgs particle is supposed to be performed via a 5 standard deviation peak. A 5 standard deviation peak occurs by accident in a band only with probability 5 \times 10^{-7}, i.e. one in 2 million. If one should trust these 5 standard deviation discoveries even if one had, say, 10 mass bands in which to look for the Higgs peak, it would mean that one would accept a discovery even if an accidental reproduction of the data would occur with probability 10 \times 5 \times 10^{-7} = 5 \times 10^{-6}, or one in 200,000. An experiment of our proposed type with such a p \approx 5 \times 10^{-6} - the value barely needed for a discovery - would stop L.H.C. with this probability, and that would on the average mean an expense equal to 5 \times 10^{-6} \times (cost of L.H.C.) = 5 \times 10^{-6} \times 3.3 \times 10^{9} CHF \approx 1.7 \times 10^{4} CHF (Swiss Francs). But this average loss of 17,000 CHF would be compensated by the danger of a natural stop - by explosion or bankruptcy of CERN or the like - caused by our effect of S_I, in the case our theory were right. Compensation in average cost would occur provided:

1. it were more unlikely than p = 5 \times 10^{-6} that L.H.C. would have a normal failure, and

2. d, the excess loss, times the chance r that our model were true, i.e. dr, were bigger than p \approx 5 \times 10^{-6}.

If one includes the danger that the reason for the failure of L.H.C. could be war between the member states of CERN, the extra damage d could of course be very big, but that sounds exaggerated. Presumably we should take d \approx 1 to 10, say 5, and then

dr \approx 5r \gtrsim 5 \times 10^{-6} = p_{\mathrm{suggest}} \quad \text{for} \quad r \gtrsim 10^{-6} .     (16)

I.e. if there were just one chance in a million that our model were right, and if a normal failure were extremely seldom, then the 17,000 CHF would already be paid for. If our model had more than one chance in a million, one might rather begin to worry that taking only p \approx 5 \times 10^{-6} might cause too much danger. Also, if the chance a of a normal failure by itself were bigger than 5 \times 10^{-6}, our card experiment would fail in the sense that the card pulled would not be "close L.H.C." even if our model were true. Nevertheless we might come to believe our model in that case by seeing a natural accident stopping L.H.C.. But that would be less clearly convincing of our model, and it could be appreciably more expensive. We think that it is urgent to perform an honest estimate of the reliability of the L.H.C. construction being finished so as to have the machine work, i.e. to estimate a. Such an estimate of a could be crucial for the decision as to what p to choose, i.e. for the rules of the game about L.H.C..
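A short numerical check of the bookkeeping in this section, in Python; p and the L.H.C. price follow the numbers quoted above, while the values taken for d and r are merely example assumptions:

# Expected loss caused by the card game itself, and the break-even
# condition d*r > p discussed in the text.
p = 5e-6                   # probability of the "close L.H.C." card
lhc_price_chf = 3.3e9      # price of L.H.C. in CHF, as quoted in the text
expected_game_loss = p * lhc_price_chf
print(expected_game_loss)  # about 16,500 CHF, the text's "17,000 CHF"

d = 5                      # assumed excess-damage factor, "1 to 10, say 5"
r = 1e-6                   # assumed chance that the model is right
print(d * r >= p)          # True exactly at the break-even point of eq. (16)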

Partial closings
It would presumably pay to make not only a single "close L.H.C." possibility that could be pulled, but to allow several incomplete closings on some cards in the stack. It would of course be a bit less of a loss to pull "allow only 20 Higgses" rather than a total "close L.H.C." card. There could be many variants of the restriction cards: limits on beam energy, limits on luminosity, or on the lifetime of the machine, or a card could postpone the start. But all the strong restriction cards together should only have some small probability p_0, so that just pulling one of them should convincingly confirm the truth of our model.

Action for particles from action for fields


In this article we are suggesting that the Higgs particle, which we have not yet studied well, will lead to our influence-from-the-future effects, while we suggest - because otherwise our model would already be falsified - that the effect is not present for the already found particles: quarks, leptons, and gauge particles. In order to explain the speciality of the Higgs particle we shall here study how the action for a classical-particle approximation to a field theory comes about. In the usual case of just a real action for the (quantum) field theory you can identify particles as wave packets in the field moving along. But then the action you should use for a particle propagating in space-time is essentially the accumulated phase of the wave packet,

S_R^{\mathrm{part}} \simeq 2\pi \cdot \#(\text{wave oscillations}) ,     (17)

where these wave oscillations are the phase rotations of the wave packet representing the particle on the field propagating in space-time. It may be a little surprising that the action S_R^{\mathrm{part}} for the particle description of the world is not simply the contribution to the field theory action S_R from the wave packet propagating and representing the particle. Indeed it is easy to see that if the Lagrangian with respect to a certain type of field is homogeneous, as for instance the \psi-involving part of the Lagrangian

\bar\psi (\slashed{D} - m + g_y \phi) \psi ,     (18)

then the Lagrangian can be reconstructed from the equation-of-motion expression

(\slashed{D} - m + g_y \phi) \psi = 0 ,     (19)

and it follows that in the classical field approximation the action for the \psi field is zero (on shell). So it is needed that the effective action for the particle description is not simply the contribution to the field theory action S_R, because then we would have got just zero contribution from all the free pieces of the particle (in between interaction points).

Now we already saw that the main physical significance of the imaginary part S_I of the action is that a development path is assigned a probability P \propto e^{-2S_I}, so that S_I gets the meaning of minus half the logarithm of the probability weight. In going over from the field description to a description with particles, the weight P, and thus S_I = -\frac{1}{2}\log P, should be the same if we describe the same wave development in the two different languages or description modes. Thus, contrary to what we just claimed for the real part S_R, namely that

S_R^{\mathrm{part}} \neq S_R ,     (20)

we need for the imaginary part - due to its physical significance - to have the correspondence

S_I^{\mathrm{part}} = S_I     (21)

in the situations corresponding to each other. But now it is easily seen that the argument for the action being zero in the homogeneous case works to make both the imaginary S_I and the real S_R parts of the action zero. Because phenomenologically we do not seem to see any prearrangement effects involving quarks and leptons - accelerators with relativistic electrons, not producing say Higgs particles, have run without interruptions - we have to take this to mean that in the particle description S_I^{\mathrm{part}} = 0 for the particles with homogeneous action.
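As a compact illustration of the homogeneity argument used above with (18) and (19) - a sketch, using only the fact that (18) is of first degree in \bar\psi:

L_\psi = \bar\psi\,(\slashed{D} - m + g_y\phi)\,\psi = \bar\psi\,\frac{\partial L_\psi}{\partial \bar\psi}\,, \qquad \frac{\partial L_\psi}{\partial \bar\psi} = (\slashed{D} - m + g_y\phi)\,\psi \stackrel{\text{on shell}}{=} 0 \quad\Longrightarrow\quad L_\psi\big|_{\text{on shell}} = 0 ,

so every free stretch of a quark or lepton wave packet contributes nothing to either S_R or S_I, in accordance with (20) and (21).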

What is so special about the Higgs?


The special property of the Higgs that makes it such a favourite candidate for showing up effects due to our imaginary part of the action S_I, or Lagrangian L_I, is that 1) it is not a gauge particle, so no monopole-existence argument could be used to exclude imaginary coefficients, and 2) there are, even in the free part of the Higgs Lagrangian, two terms - the kinetic term |D_\mu H|^2 and the mass term m^2|H|^2 - which have by symmetries unrestricted coefficients, so that these independent coefficients could have different phases. Also for quarks and leptons one has at first independent coefficients on the kinetic and the mass terms, but for them one can perform a chiral transformation

\psi_L \to \psi_L , \qquad \psi_R \to e^{i\theta}\psi_R ,     (22)

which can be adjusted to bring the mass and the kinetic coefficients to have the same phase. Thereby the imaginary part of the Lagrangian L_I^{quarks & leptons} is forced to be proportional to the real part L_R^{quarks & leptons}. Since then the Lagrangians are homogeneous of second order, one gets L_R^{quarks & leptons} = 0 using the equations of motion, and thus also L_I^{quarks & leptons} = 0.

The Higgs Lagrangian

L_{\mathrm{HIGGS}}(x) = Z|\partial_\mu H|^2 - m^2|H|^2 - \frac{\lambda}{4}|H|^4     (23)

is not even homogeneous, due to the \frac{\lambda}{4}|H|^4 term, which is of fourth order contrary to the rest, which is of second order; this could be a further reason for it lacking an argument forcing L_I^{\mathrm{Higgs}} to vanish. The equations of motion, from differentiating with respect to H^* and with respect to H,

Z\partial_\mu\partial^\mu H + m^2 H + \frac{2\lambda}{4}|H|^2 H = 0 ,     (24)

Z\partial_\mu\partial^\mu H^* + m^2 H^* + \frac{2\lambda}{4}|H|^2 H^* = 0 ,     (25)

will, by multiplication with the fields H^* and H respectively, and adding and subtracting, not - as in the homogeneous case - lead to both the real and the imaginary part of the field theory action being zero, but rather to the Lagrangian on-shell values

L_R = \frac{\lambda_R}{4}|H|^4 , \qquad L_I = \frac{\lambda_I}{4}|H|^4 .     (26)

Here we have expanded the self-coupling into real and imaginary parts,

\lambda = \lambda_R + i\lambda_I .     (27)
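The step from (24)-(25) to (26) can be seen in a compressed way as follows - a sketch in which the complex coupling \lambda is treated formally as a single parameter and total derivatives are dropped under the action integral:

H^* \times (24): \quad Z H^*\partial_\mu\partial^\mu H + m^2|H|^2 + \frac{2\lambda}{4}|H|^4 = 0 \;\longrightarrow\; Z|\partial_\mu H|^2 - m^2|H|^2 = \frac{2\lambda}{4}|H|^4 \quad \text{(up to a total derivative)} ,

so that on shell

L_{\mathrm{HIGGS}} = \frac{2\lambda}{4}|H|^4 - \frac{\lambda}{4}|H|^4 = \frac{\lambda}{4}|H|^4 ,

whose real and imaginary parts are just the values quoted in (26).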

Now for the Higgs field one shall have in mind that there is a big background, or vacuum expectation value, H field, and that what matters here is in fact only the extra contribution coming from a true particle propagating through this vacuum and described by a wave packet in H. We may consider a single Higgs particle described by a wave packet H_{\mathrm{wp}} in the Higgs field. Then we obtain, in the well known background-field case with H = H_{\mathrm{bg}} = \text{constant in space-time} = 246\ \mathrm{GeV}/\sqrt{2}, that

|H|^4 \approx |H_{\mathrm{bg}}|^4 + 4|H_{\mathrm{bg}}|^2 |H_{\mathrm{wp}}|^2 + \ldots .     (28)

This means that we get a contribution to S_I, which again is identified with the particle S_I^{\mathrm{part}}, given as

S_I^{\mathrm{part}} = \frac{\lambda_I}{4}\, 4|H_{\mathrm{bg}}|^2 \int |H_{\mathrm{wp}}|^2\, d^4x = \lambda_I |H_{\mathrm{bg}}|^2 \int |H_{\mathrm{wp}}|^2\, d^4x .     (29)

Now the density in 3-space of genuine Higgs particles is

\rho \approx E_H |H_{\mathrm{wp}}|^2 ,     (30)

so that

S_I^{\mathrm{part}} = \lambda_I |H_{\mathrm{bg}}|^2 \int \frac{\rho}{E_H}\, d^3x\, dt .     (31)

For just one particle we have the normalization

\int \rho\, d^3x = 1 .     (32)

Clearly, for Higgs particles with a reasonably well defined energy E_H, the eigentime differential is

d\tau = \frac{m_H}{E_H}\, dt ,     (33)

and thus we simply get

S_I^{\mathrm{part}} = \lambda_I |H_{\mathrm{bg}}|^2 \frac{1}{m_H} \int d\tau .     (34)

Therefore the imaginary action to use for a description in terms of a few Higgses is, as expected, the eigentime integral \int d\tau multiplied by the constant \lambda_I |H_{\mathrm{bg}}|^2 \frac{1}{m_H}. We do not truly know the imaginary part \lambda_I of the self-coupling, but a priori the guess would be that, being dimensionless, it is of order unity, or rather of the same order as the real part \lambda_R, which is of order 1/3 say.

Estimation of the effect of a Higgs particle


We see that the contribution to S_I from a Higgs particle, seen from its rest frame, with the lifetime \tau, i.e. with

\int_{\mathrm{production}}^{\mathrm{decay}} d\tau = \tau ,     (35)

is

S_I^{\mathrm{part}} = \frac{\lambda_I |H_{\mathrm{bg}}|^2}{m_H}\, \tau .     (36)

This contribution will lower the probability. With H of the order of magnitude of the vacuum expectation value H = H_{\mathrm{bg}}, say half H_{\mathrm{bg}}, and even if we put the Higgs width [17] as large as say 1 GeV, the order of magnitude of the exponent in the probability decrease factor becomes of the order 100, which exponentiated becomes so big that no Higgses would be allowed to achieve such a long lifetime. Rather, we should expect the Higgses to be brought, by prearranged accident, to decay much faster. We would expect an effective allowed width rather to be of the order

\frac{1}{\tau} \approx \frac{2\lambda_I |H_{\mathrm{bg}}|^2}{m_H} .     (37)

Looking for this broadening of the Higgs width according to the effect of our model might in itself be a very interesting prediction [3]. But once the broadening takes place, the effective probability decrease factor will only be of order unity, or at least no smaller than of the order of the factor

\frac{\Gamma_{\mathrm{a\ priori}}}{2\lambda_I |H_{\mathrm{bg}}|^2 / m_H} .

This would mean a factor of the order 1/O(100) rather than e^{-O(100)}. Because of such a mechanism of making the Higgs decay miraculously fast, the suppression by each Higgs by more than something of order unity may be avoided, and thus a few Higgses might be allowed - such as could already have been seen at L.E.P. - but huge amounts of Higgses should be totally avoided. So machines like L.H.C., making lots of Higgses, should be stopped quickly, before having made more than a few Higgs particles! Concerning particles other than the Higgs, they lack a self-interaction term of fourth order in the fields, except for the Yang-Mills field. Indeed, because of renormalizability requirements, the fermion fields \psi, \bar\psi for quarks and leptons cannot be allowed to occur to more than second order. For them thus S_I^{\mathrm{part}} = 0, while the Yang-Mills imaginary L_I contribution we seek to suppress by our argument about assuming monopoles [3, 15].

3 If we take it that our model implies MPP [7, 12, 13], it should give the lowest allowed Standard Model Higgs mass [18], allowing only b\bar{b}-decay and a much smaller width, by a factor 500.
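To make these orders of magnitude concrete, a small numerical sketch in Python; the values chosen for \lambda_I, m_H and the a priori width are illustrative assumptions in the spirit of the text, not numbers the paper fixes:

import math

# Per-Higgs contribution to S_I in the rest frame, as in eq. (36):
# S_I_part = lambda_I * |H_bg|^2 * tau / m_H, with tau ~ 1/Gamma.
lambda_I = 1.0 / 3.0           # assumed imaginary self-coupling, "of order 1/3"
H_bg = 246.0 / math.sqrt(2.0)  # GeV, vacuum expectation value used in the text
m_H = 115.0                    # GeV, an assumed Higgs mass
Gamma = 1.0                    # GeV, the "as large as say 1 GeV" width

S_I_per_higgs = lambda_I * H_bg**2 / (m_H * Gamma)
print(2.0 * S_I_per_higgs)     # exponent of the suppression e^{-2 S_I}: roughly 10^2

# Effective broadened width of eq. (37), 1/tau ~ 2*lambda_I*|H_bg|^2/m_H,
# and the per-Higgs suppression once the broadening has taken place:
Gamma_eff = 2.0 * lambda_I * H_bg**2 / m_H
print(Gamma_eff, Gamma / Gamma_eff)  # suppression ~ Gamma/Gamma_eff ~ 1/O(100)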


Conclusion and Outlook


We have put forward a proposal for what is really an experiment at L.H.C. looking for an effect of influence from the future, as proposed in our own model, which is most nicely described by there being an imaginary part S_I of the action S. The experiment is exceedingly primitive in as far as it consists simply of a card-pulling game arranged so that with a probability p of the order of 5 x 10^{-6} some severe restriction on the running of L.H.C. - essentially a closing - is imposed. If indeed a restriction card - which has the very low probability p of about 5 x 10^{-6} - were pulled, this would essentially mean that our model, or something like it, had to be true! If, however, just a card meaning no restriction is pulled, our theory would be falsified, unless a seemingly normal accidental stopping of L.H.C. occurs.

It must be warned that if our model were true and no such game about restricting L.H.C. strongly were played, or if the probability p in the game for restricting were too small, then a normal (seemingly accidental) closure should occur. This could be potentially more damaging than just the loss of L.H.C. itself. Therefore, not performing our card game proposal (or performing it with an insufficiently big p) could - if our model were correct - cause considerable danger.

Of course, a priori - as just a proposed effect to look for - the chance r of such a model being right is very low. However, we have already published a few papers [3-5] on this type of backward causation model, and several predictions seem to be phenomenologically good: for instance we can claim to have speculations leading to a cosmological constant of the same order as the matter density [5]. That is to say that our model is promising with respect to solving the cosmological constant problem and the "why today" problem. Also we claim that it is promising with respect to explaining why there should be a bottom in the Hamiltonian [5]. A further consequence is the principle of many degenerate vacua [7, 12, 13] (MPP = multiple point principle), on which one of us has worked for many years with a bit of success. Finally, let us mention that we are working on an article suggesting that our type of model could be claimed to be called for to cope with the measurement problem in quantum mechanics [19]. If one wants to have the settling of an eigenvalue of a measured quantity before the enhancement of the signal in the measurement instrument has occurred, then some sort of backward causation seems to be called for: without the signal enhancement, can one really say if it is a genuine measurement?

We think that before really performing the proposed experiment we should carefully discuss and evaluate the most optimal choice of the rules of the game. Most important to choose is p, the chance assigned in the game for an essential closing of L.H.C.. However, we might, in the case of an essential closure being imposed, get interesting information about details of our in such a case confirmed model by having many cards with various partial closings. Say, how many Higgses would it be allowed to let L.H.C. produce before closing completely? Such an allowance of tiny amounts of Higgs production and running of L.H.C. could, if our model were true, give us some information on details of the model. It is presumably very profitable to organize for several possibilities of partial closings. Such possibilities - which should be arranged by the rules to be very unlikely unless our model were true - might, in case our type of model were right, tell us e.g. about the size of \lambda_I.

Acknowledgments
We acknowledge the Niels Bohr Institute (Copenhagen) and the Yukawa Institute for Theoretical Physics for their hospitality extended to one of us each. The work is supported by Grant-in-Aids for Scientific Research on Priority Areas, Number of Area 763 "Dynamics of Strings and Fields", from the Ministry of Education, Culture, Sports, Science and Technology, Japan. We also acknowledge discussions with colleagues, especially John Renner Hansen, about the S.S.C.

References
[1] Jan Faye, The Reality of the Future, Odense University Press.

[2] J. B. Hartle and S. W. Hawking, Phys. Rev. D28 (1983) 2960-2975.

[3] H. B. Nielsen and M. Ninomiya, "Future Dependent Initial Conditions from Imaginary Part in Lagrangian", Proceedings of the 9th Workshop "What Comes Beyond the Standard Models", Bled, 16-26 September 2006, DMFA Zaloznistvo, Ljubljana; hep-ph/0612032.

[4] H. B. Nielsen and M. Ninomiya, "Law Behind Second Law of Thermodynamics - Unification with Cosmology", JHEP 03 (2006) 057; hep-th/0602020.

[5] H. B. Nielsen and M. Ninomiya, "Unification of Cosmology and Second Law of Thermodynamics: Proposal for Solving Cosmological Constant Problem, and Inflation", Progress of Theoretical Physics, Vol. 116, No. 5 (2006); hep-th/0509205, YITP-05-43, OIQP-05-09.

[6] H. B. Nielsen and S. E. Rugh, "Arrows of time and Hawking's no-boundary proposal", Niels Bohr Institute Activity Report 1995.

[7] H. B. Nielsen and C. Froggatt, "Influence from the future", School and Workshops on Elementary Particle Physics, Corfu, Greece, September 3-24, 1995.

[8] S. Coleman, Nucl. Phys. B307 (1988) 867.

[9] S. Coleman, Nucl. Phys. B310 (1988) 643.

[10] T. Banks, Nucl. Phys. B309 (1988) 493.

[11] S. W. Hawking, Phys. Lett. 134B (1984) 403.

[12] D. L. Bennett and H. B. Nielsen, Int. J. Mod. Phys. A9 (1994) 5155-5200.

[13] H. B. Nielsen and M. Ninomiya, "Degenerate vacua from unification of second law of thermodynamics with other laws", hep-th/0701018.

[14] Reviews: H. B. Nielsen and M. Ninomiya, Lecture notes of the International Symposium on the Theory of Elementary Particles, Ahrenshoop, DDR, October 17-21, 1988.

[15] Chan Hong-Mo and Tsou Sheung Tsun, hep-th/9904102, April 1999; Int. J. Mod. Phys. A14 (1999) 2139-2172.

[16] Jeffrey Mervis and Charles Seife, "News Focus: 10 years after the SSC. Lots of Reasons, But Few Lessons", Science 302 (2003) 38, doi:10.1126/science.302.5642.38.

[17] Lidiya Zivkovic, Weizmann Institute, "Measurements of Standard Model Higgs Parameters at ATLAS", July 20, 2003, Conferences/2003/Praha 2003 LidiyaHiggs-parameters.ppt.

[18] C. D. Froggatt and H. B. Nielsen, "Standard Model criticality prediction: top mass 173 ± 5 GeV and Higgs mass 135 ± 9 GeV", Phys. Lett. B368 (1996) 96-102.

[19] H. B. Nielsen and M. Ninomiya, paper in preparation.
