The Time Gate
If Cramer's experiments as described previously succeed, they will prove that there is a 'spooky action at a distance' associated with observer entanglement, but they will not measure the exact timing of the change in the light pattern caused by observer entanglement. To do that the experiment will have to be redesigned once more: the moveable detector in the upper light path will have to be replaced by something faster. Only when the changes measured in the two light paths can be timed exactly, to within microseconds, will we know for sure whether retrocausality is occurring and whether a Time Gate, as I will call it, has been built. I define a time gate as any practical device that can send information backwards through time. According to Cramer's own calculation, a 10 km path difference, produced by extending one path with a pair of fibre optic cables, will create a 50 microsecond inverted delay. If his argument that information is transmitted between the individual entangled pairs of photons is correct, then that is exactly what will be observed. Fifty microseconds does not sound like much of an inverted delay, but it is enough to build a time gate, as I am about to show.
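To see where the 50 microsecond figure comes from, here is a minimal back-of-the-envelope check, assuming an extra 10 km of fibre and a typical refractive index of about 1.5 for the glass (the exact value depends on the fibre used):

```python
# Back-of-the-envelope check of the 50 microsecond inverted delay.
# Assumptions: 10 km of extra fibre path, refractive index ~1.5 (typical silica fibre).
c = 3.0e8              # speed of light in vacuum, m/s
n_fibre = 1.5          # assumed refractive index of the fibre
extra_path = 10_000.0  # extra path length, metres (10 km)

v_fibre = c / n_fibre             # light speed inside the fibre, about 2e8 m/s
delay = extra_path / v_fibre      # extra travel time introduced by the fibre

print(f"delay = {delay * 1e6:.0f} microseconds")   # prints: delay = 50 microseconds
```

On these assumptions the numbers come out at Cramer's figure exactly; a different fibre would shift the delay slightly, but not the principle.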
The fictitious telechronic batteries described in the Asimov page were limited in how far back through time they could send an effect by the number of cells. The information they sent was only a single bit: either they had or had not been exposed to solvent at a later point in time. In the real world things will not be so limited. Even a 50 microsecond inverted delay, with appropriate instrumentation, becomes a very powerful discovery:
Given that it should be possible to receive a packet of information before it is sent, we can choose to re-send that same packet back through time again and again. By re-sending the information many times it can be sent back indefinitely. The information can never be sent back to an earlier time than when the experiment first becomes fully operational, but having once opened the 'Time Gate', there is no limit in principle to how far in the future information may come back to us from.
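Purely as a toy illustration of what 'again and again' would mean in practice (every number here is an assumption, and all engineering overheads are ignored), here is how many 50 microsecond relay cycles it would take for a message to work its way back across a single week:

```python
# Toy illustration only: relaying a message backwards in fixed 50-microsecond hops.
# Assumes each relay cycle buys exactly one inverted delay and nothing is lost in between.
hop = 50e-6                # assumed inverted delay per relay cycle, seconds
week = 7 * 24 * 3600       # one week, in seconds

hops_needed = week / hop
print(f"{hops_needed:.2e} relay cycles to span one week")   # about 1.21e+10 cycles
```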
Anyone wanting to know how big a packet of information will be will realise that this depends on how fast the light pattern can be modulated, and how quickly those modulations can be detected. This in turn will depend on quite a number of experimental factors. I will not go into this because it is all engineering (although I have certainly given the subject consideration); I am concerned here with the test and proof of a principle. If it works at all, we can be sure that people will soon improve the technique to the point where the packet contains a useful amount of information. (And heck, next week's lottery numbers are not a lot of information, but it would be very useful to know them a week before anyone else.)
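To make the scaling concrete, and with the modulation rate below being an arbitrary assumption rather than a measured capability, the packet size is simply the modulation rate multiplied by the usable window:

```python
# Illustrative scaling only: packet size = modulation rate x usable window.
# The 1 MHz modulation rate is an arbitrary assumption, not a measured figure.
window = 50e-6       # usable inverted-delay window, seconds
mod_rate = 1.0e6     # assumed modulation/detection rate, bits per second

packet_bits = mod_rate * window
print(f"{packet_bits:.0f} bits per packet")   # prints: 50 bits per packet
```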
The proof that the retrocausal transmission of information is possible leads to an entirely new world, but it will take a lot to convince most physicists, and probably at least one other group of researchers will have to repeat the experiment before it is taken seriously. The current culture in physics is surprisingly conservative; only a very few researchers would risk their time and reputation on this venture. This problem of the culture of physics is discussed by Lee Smolin in 'The Trouble with Physics'. Lee was kind enough to send a lengthy reply to my inquiries but did not directly discuss some of my questions. I realise that these are deep points and that it would be hasty to judge them, but I am aware that even Lee was cautious. I hope that he is giving them careful consideration and will respond when the time is ripe. The consensus is that we already know all the important ideas at the low energy levels associated with this kind of experiment. The first shake-up would be for physicists to admit that retrocausality is even hypothetically possible. There is also a very important principle under investigation: causality. It is generally believed that any form of time machine will lead to logical contradictions. Causality prevents this by requiring that a cause must always precede its effect. Where physical theory or experiment leads to a prediction that runs counter to causality, there is a very strong expectation that a mistake has been made. Causality is held to be fundamental: it cannot be violated without a breakdown in the orderly understanding of reality.
The assumption of the conventional point of view, that time and causation are inextricably linked, is not entirely supported by the already known laws of physics. There are a number of different ways to define time. At the microscopic level of interactions between individual particles, it is impossible to distinguish past from future. The problem of the 'arrow of time', as it is known, has long been discussed. The widely accepted conclusion is that the direction in which time flows can most meaningfully be defined at a scale much larger than individual quantum interactions, and on this larger scale entropy enforces increasing levels of disorder with the passage of time. There is no avoiding entropy; it can never decrease. But there are two problems with this approach: first, physics has to account for the extraordinarily low level of disorder at the moment of creation; second, since this direction of time does not apply to individual interactions, it leaves a loophole for supporters of time travel to question its authority. Any process that does not involve the degradation of energy to lower forms might in principle be time reversible. Only an increase in entropy prevents time reversal. Observer entanglement involves a transmission of information without energy transfer, and by implication there will be no entropy increase associated with this form of interaction.
Ludwig Boltzmann, back at the end of the 19th century, developed the idea that the arrow of time can best be understood in relation to entropy, but his arguments did not persuade physicists then, because the universe was thought to be eternal and should therefore already have undergone 'heat death': everything should have become equally lukewarm, in a state of thermal equilibrium. Boltzmann showed how arbitrarily large fluctuations from equilibrium were theoretically possible, but others were not impressed. Today Boltzmann's idea is accepted as correct, and the low level of disorder is considered to be a product of inflation. I remain unhappy with the Big Bang. Inflation has greatly improved its explanatory powers for times after the moment of creation, but the moment itself remains a complete mystery. Various theories concerning colliding branes and oscillatory universes have been mooted in recent years, but none appears to be likely. The most interesting idea that goes beyond the Big Bang I have found is Lee Smolin's concept of evolving universes. The subject of how to define time was taken up again by Roger Penrose, investigating conditions inside a black hole. It is known that as you approach the central singularity, space-time is increasingly distorted and finally destroyed. He invented spin networks in 1971 to show how something resembling space-time can be built from simpler elements. This subject continues to be investigated by Lee Smolin, Fotini Markopoulou and others.
More recently, non-temporal causality has come directly under study. It began with a mathematical structure called the causal set, first described by Bombelli and others in 1987 and developed further by Sorkin around 1990. A causal set has members p, q, r... These members can then be ordered according to rules: p ≤ q, and so on. The lowest order can be identified with the earliest point in time. From this simple beginning complex mathematical structures quickly grow. This is relevant to cosmologists trying to understand the Planck era, which lasts just 10 to the minus 43 of a second at the beginning of the universe and during which none of the existing measures of time can be used: space-time is scrambled into a foam and no known physical processes would exist. The approach adopted by most physicists is to reconstruct something that resembles conventional time, referred to as globally hyperbolic; however, it does not have to be as simple as that, since the mathematics allows for other possibilities. Computer theoreticians are exploring options with multi-path time. These mathematical structures could form the basis for future quantum computer architectures. They may also be a description of an interconnected Many Worlds reality. I do not know, because I have not researched further, but I suspect that there are ideas in emergence and complexity theory and the study of networks that are also consistent with a complex structure to time. Most physicists have been cautious about exploring non-globally-hyperbolic theories; there is an expectation that such theories lead to the breakdown of all order, but I suspect this will be shown to be simplistic: rather, they lead to the appearance of new and more richly structured orders.
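For readers who would like to see the idea in concrete form, here is a minimal sketch of a tiny causal set; the particular elements and relations are my own toy example, not taken from the causal set literature:

```python
# Minimal sketch of a causal set: a finite set of elements with a partial order.
# The elements and relations below are a toy example of my own, chosen for illustration.
relations = {
    ("p", "q"),   # p precedes q
    ("p", "r"),   # p precedes r
    ("q", "s"),   # q precedes s
    ("r", "s"),   # r precedes s (q and r themselves are causally unrelated)
}
elements = {x for pair in relations for x in pair}

def precedes(a, b):
    """True if a precedes b, either directly or through a chain of relations."""
    if (a, b) in relations:
        return True
    return any((a, c) in relations and precedes(c, b) for c in elements)

print(precedes("p", "s"))   # True: p lies to the causal past of s
print(precedes("q", "r"))   # False: q and r are causally unrelated
```

The 'lowest' element here, p, plays the role of the earliest point in time; everything else grows out of the order relation alone.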
Understanding time in a universe where information can flow in both directions will be hard. Neither Cramer's TI nor the Many Worlds interpretation will be sufficient, I suspect. A younger generation of scientists will take up these problems. Finally, if time travel is possible, the answers will come back to us from the future. The challenge will be to understand these new ideas.