8

The Church-Turing hypothesis says one cannot build a computing device which has more computing power (in terms of computability) than the abstract Turing machine model. So there is something in our laws of physics which prevents us from making devices more powerful than a Turing machine, and in this respect the hypothesis can be viewed as a law of physics.

What is the physicists' view of Church-Turing hypothesis?

Can the Church-Turing hypothesis be deduced from other fundamental laws of physics?

user774025
  • 1,431

4 Answers

11

You are asking two questions. I am only going to address one of them:

Can the Church-Turing hypothesis be deduced from other fundamental laws of physics?

There are two fundamental theories of physics that account for nearly all experiments and observations performed to date: general relativity and the Standard Model. If we could simulate these theories by Turing machines, then the outcome of any experiment could be deduced by a Turing machine, and so any physical computational device could be simulated by a Turing machine.

General Relativity: There was a breakthrough in numerical relativity in 2005, and we now have computer programs that do an excellent job of simulating general relativity. While we can't rigorously show that relativity satisfies the Church-Turing hypothesis, this is good evidence that it does.

The Standard Model: Lattice field theory seems to do a very good job of simulating the Standard Model (albeit with enormous computation times). Again, we can't rigorously show that the Standard Model satisfies the Church-Turing hypothesis, but this is good evidence that it does.

If you are talking about computing devices that can be built using any conceivable future technology, these two theories probably cover all of them.

Peter Shor
  • 11,536
5

This is an interesting question, but I think the description of the CT thesis given in the first paragraph is inaccurate. AFAIK the CT thesis is not a statement about what we can or can't build. It's more like an observation that a variety of superficially dissimilar mathematical (not physical) models of computation are all equivalent (ignoring complexity). In reality, we can't even build a Turing machine, since a Turing machine has an infinite memory (tape).

When mathematicians create a Turing machine, which has an infinite memory, they're indulging in abstraction. Similarly, when mathematicians talk about real numbers they're also doing an abstraction. The purpose of the abstractions is to make systems whose properties are simple, so that one can prove things about them easily. Physics works by taking finite-precision measurements, doing finite-memory computations on them, and using them to make testable predictions about other finite-precision measurements.

It's possible to take a physical theory, abstract it, and make it into something that's no longer physics but that one can easily write proofs about. An example would be Andreka 2011. In this flavor of work, which is mathematics and not physics, I think it's an open question as to what models of computation are realizable.

Andréka et al., Closed Timelike Curves in Relativistic Computation, http://arxiv.org/abs/1105.0047

1

Complement to the initial answer (below)
(added on September 4 2013)

My initial answer to this question (below) was based only on my knowledge of syntactic and semantic issues in computing, along with some knowledge of various computing techniques such as computing on reals with infinite precision, or more precisely with arbitrary precision.

Looking for a better understanding of the issue, I found that it is currently an active research topic. Though I have not explored it much, I found that my perception of the crucial role of denumerability is indeed justified; it can be traced to the fact that everything is expressed syntactically, hence with denumerable systems of symbols, or symbol combinations.

One approach to proving the Church-Turing thesis as a law of physics relies on assuming a specific property of the physical world, presented as the dual of the limitation on the speed of light and information: a limitation on the spatial density of information, the two limitations together ensuring a density limitation in space-time. Translating this new law into physical terms can actually be subtle, so as to account for various existing physical laws. It apparently excludes unregulated use of real numbers.

Bounding the amount of information to be found in a given volume of space-time seems to be the direct counterpart of the finiteness of what can be done by a computing process in finite time, and of the consequent denumerability of whatever may be considered without setting a time limitation.

I am also adding a note at the end to explain why, despite their infinite tape, Turing machines must be considered a physical model of computation, and have been so considered by most people, afaik.


Initial answer

Note: I try to give the best answer I can, but I am stretching my confidence in my own understanding. The distinction between mathematics and physics is a topic I find fascinating (though I do not have much to say about it), and it is at the heart of the question, imho.

I somewhat disagree with Ben Crowell's interpretation of the Church-Turing thesis. If it were merely the observation that a given variety of models of computation are all equivalent, it should be a mathematically provable statement, or at least we could hope for a proof. This is not the case, because the thesis states that a function is computable according to whatever model of computation only if it is computable by a Turing machine (hence by any Turing-complete model).

However, this thesis is indeed motivated by the fact that all of the many models of computation designed by mathematicians and logicians turned out to be equivalent (a few being a bit weaker).

It is true that the Turing machine has an infinite memory tape, which is not very physical. This is not really essential, since the interesting results are those computable in finite time, thus using only a finite part of the infinite tape. The tape is chosen to be infinite because it is not known in advance how much of it will be needed. The infinity of the tape becomes a problem in itself only when envisioning the possibility of infinite time, for example in closed timelike curves. Many "non-physical", or "not yet physical" (?), models of computation have been considered by mathematicians, which could become "physical" (that is, "operational") if ... (see hypercomputation). Oracle machines are another interesting example: what can we compute if we know how to solve such and such a problem (not Turing computable, of course)? If some physical breakthrough actually allowed us to solve that problem, the problems addressable by the oracle machine would become computable.
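To make the finite-tape point concrete, here is a small sketch of my own (the names `run_tm`, `flip`, etc. are invented for illustration): the tape is conceptually infinite, but storing it as a dictionary shows that any halting run ever touches only finitely many cells.

```python
# Minimal Turing machine simulator. The tape is conceptually infinite,
# but is stored as a dict holding only the cells actually visited:
# any computation that halts uses a finite portion of the tape.
def run_tm(transitions, tape_input, start="q0", halt="halt", max_steps=10_000):
    tape = {i: s for i, s in enumerate(tape_input)}
    state, head = start, 0
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = tape.get(head, "_")                  # "_" is the blank symbol
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return state, tape

# Example machine: flip every bit, halting at the first blank.
flip = {
    ("q0", "0"): ("q0", "1", "R"),
    ("q0", "1"): ("q0", "0", "R"),
    ("q0", "_"): ("halt", "_", "R"),
}
state, tape = run_tm(flip, "1011")
print(state, "".join(tape[i] for i in sorted(tape)))  # halt 0100_
```

Here the run on input `1011` halts after visiting just five cells, even though nothing bounds the tape in advance.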

The wording of the previous paragraph actually means that, despite its infinite tape, the Turing machine is actually viewed as a physical device (by computer scientists if not by physicists), even if somewhat idealized. This is probably because a computer scientist considers that an infinite object is computable, or definable, if it is the limit of a sequence of finite objects that are all computable. A machine with infinite tape may be seen as the limit of a sequence of finite tape machines.

In this sense, we can say that Turing computability is a consequence of the laws of physics, since these laws allow us to build computers, and we can in principle add memory as needed, for as long as we are willing to wait for the result. This is no worse an approximation than physical theories have proved to be so far, given enough time to find out that they only approximate physical reality. It is also no worse than using calculus to reason about phenomena that are ultimately discrete rather than continuous.

But this is not the question being asked. The actual question is whether the Church-Turing hypothesis can be derived from the laws of physics. The thesis asserts that there is no more powerful model of computation. If all phenomena describable by existing physical theories can be simulated by a Turing machine (a computer program), that does indicate that there is nothing in these theories that allows for more powerful computing models. But can they be simulated?

An important characteristic of Turing-Church computability is its discrete character (this is actually true of all human intellectual activities). It is mostly what people derisively call "symbol pushing". What can be computed is essentially discrete and denumerable. Physical theories are usually expressed as continuous structures. Are they really continuous, or is that just a convenient representation? If they are continuous, it is not obvious that they can be simulated by a computer, even though computers can to some extent deal with continuous entities, such as real numbers, finitely represented; but there are all kinds of limitations to these computations and their uses.
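As a small illustration of those limitations (my own example, not part of the original answer): exact arithmetic is possible on denumerable sets such as the rationals, while an irrational like the square root of 2 can only ever be approximated to some finite, if arbitrary, precision.

```python
from fractions import Fraction
from decimal import Decimal, getcontext

# Rationals have exact finite representations, so arithmetic is exact:
assert Fraction(1, 3) + Fraction(1, 6) == Fraction(1, 2)

# Binary floating point cannot represent 0.1 or 0.2 exactly:
print(0.1 + 0.2 == 0.3)        # False

# An irrational like sqrt(2) has no finite representation at all; we can
# only compute approximations to any requested (finite) precision:
getcontext().prec = 50
print(Decimal(2).sqrt())       # 50 significant digits, still approximate
```

So a computer manipulates only finitely (hence denumerably) many representations, however cleverly chosen, which is exactly the discreteness at issue.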

Simulating discrete approximations of these theories is probably not satisfactory, and seems (to me at least) a bit tautological: it says that we can simulate with a Turing machine that part of physics which is simulable by a Turing machine (because it is discretized). Hence I do not find the existence of computer programs that simulate known physical theories very convincing, and I do not much believe in the reducibility of physics to computing. But is that necessary to make the Church-Turing thesis a law of physics?

Furthermore, the Church-Turing thesis is only a hypothesis. We do not know whether it is actually true. But it is true as far as we know, like the laws of physics.

To push the problem further, there is a result called the Curry-Howard isomorphism, which shows that computer programs (read "Turing machines") and mathematical proofs are fundamentally the same, in the sense that they have identical axiomatisations. Hence, for any statement you wish to make about Church-Turing computability, there is an equivalent statement about mathematical proofs.
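A standard textbook illustration of the correspondence (my addition, not from the original answer): in a proof assistant such as Lean, the program that composes two functions is, literally and without any encoding, a proof of the corresponding propositional tautology.

```lean
-- Curry–Howard: this program (function composition) *is* a proof of the
-- propositional tautology (A → B) → (B → C) → (A → C).
theorem syllogism {A B C : Prop} : (A → B) → (B → C) → (A → C) :=
  fun hab hbc a => hbc (hab a)
```

Checking the proof and type-checking the program are the same act.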

So the question might also ask whether our very exclusive way of doing mathematics since Euclid's Elements can be viewed as a consequence of a law of physics, and it could restate the last point as: can it be deduced from fundamental laws of physics that mathematical theories must be constructed as we construct them? That naturally applies to physical theories as well. Now we get a bit into a loop that I will not try to sort out.

Note about the physicality of Turing machines despite their infinite tape.
(added on September 4 2013)

The fact that Turing Machines (TM) have an infinite tape cannot be construed as evidence that they are not a physical model of computation. The infinite tape is only a mathematical device to simplify the analysis of computability; whatever such a machine computes uses only a finite portion of the tape.

The whole theory could be built on Finite Turing Machines (FTM), i.e., Turing machines with a finite tape, which are actually just finite-state automata, the simplest kind of formal computing device there is. These are definitely easy to implement as physical devices. An FTM has a special state, called MemOverflow, which it enters when it runs out of tape.

Then we consider classes of FTMs, such that all FTMs differing only by the length of their finite tape belong to the same class.

Let C be such a class. We say that C halts on an input x if there is an FTM M in the class C whose tape is long enough to be initialized with x, and the computation of M on the tape thus initialized never loops (which is easy to detect on a finite-state device) and never enters the state MemOverflow. If there is no such machine M in C, then C is said not to halt on this input.

For all intents and purposes, there is just no difference between these classes of finite state machines and Turing machines with infinite tapes. Doing the theory is just a bit more awkward for no benefit.
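The construction above can be sketched in code (a hypothetical illustration of my own; the names `run_ftm`, `class_halts`, etc. are mine). Because a finite-tape machine has only finitely many configurations, looping is detected by simply waiting for a configuration to repeat:

```python
# A Finite Turing Machine (FTM): a Turing machine on a tape of fixed
# length n. It reports "MemOverflow" when the head runs off the tape,
# and "Loop" when a configuration (state, head, tape) repeats -- which
# is decidable because the configuration space is finite.
def run_ftm(transitions, tape_input, n, start="q0", halt="halt"):
    tape = list(tape_input) + ["_"] * (n - len(tape_input))
    state, head, seen = start, 0, set()
    while True:
        config = (state, head, tuple(tape))
        if config in seen:
            return "Loop"
        seen.add(config)
        state, write, move = transitions[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
        if state == halt:
            return "Halt"
        if not 0 <= head < n:
            return "MemOverflow"

# The class C of all FTMs differing only in tape length halts on x iff
# some member halts without looping or overflowing. A loop found on a
# short tape persists on any longer tape, since the same cells are used.
def class_halts(transitions, x, max_n=64):
    for n in range(max(1, len(x)), max_n):
        result = run_ftm(transitions, x, n)
        if result == "Halt":
            return True
        if result == "Loop":
            return False
    return None   # undecided within the search bound

flip = {("q0", "0"): ("q0", "1", "R"),
        ("q0", "1"): ("q0", "0", "R"),
        ("q0", "_"): ("halt", "_", "R")}
print(run_ftm(flip, "1011", 4))   # MemOverflow: tape too short
print(class_halts(flip, "1011"))  # True: a longer tape suffices
```

The bound `max_n` is of course the sketch's concession to running on a real machine; the class itself ranges over all tape lengths, which is exactly where the idealization re-enters.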

babou
  • 3,806
  • 3
  • 22
  • 37
0

I love Alonzo Church, but not for his contributions to Physics.

The connection between algorithms and the real physical world can never be more than approximate. Noise is essential to the concept of probability, and probability is now essential to Physics via Quantum Mechanics. But Wiener saw this even classically, so I will go classical.

No finite apparatus can exactly reproduce a string of bits, e.g., consider how one would either produce, or process, or detect a square wave. All algorithms are Boolean, but no machines are Boolean: they can, at best, be approximately, i.e., noisily, Boolean.

Now the division of a real phenomenon into signal + noise is subjective. For Nature, there is only signal, never noise.

It is essential to the concept of computation, algorithm, or Turing machine that somebody generates a square wave (a string of bits), someone else (possibly a machine) processes the string of bits, and then someone else detects the signal in the product of that process. There are never any perfect square-wave inputs, so the information-theoretic modelling of an input by a string of bits is always a subjective division into signal + noise; there can never be a perfect realisation of any of the algorithms of which a Turing machine is capable, so the transform of the input is never truly one of the algorithms studied. And detection is, as any of my signal-processing students know, a matter of compromises and errors and noise.

All the popular discussion of «Physics as Computation» is claptrap. The only truly scientific and physical discussion going on in this regard goes on among signal processing engineers.