42

Suppose I build a machine which will be given Rubik's cubes that have been scrambled to one of the $\sim 2^{65}$ possible positions of the cube, chosen uniformly at random. Is it possible for the machine to solve the cubes without giving off any heat?

Solving the cube might be thought to consist of destroying about 65 bits of information because it takes 65 bits to describe the cube's state before entering the machine, but zero bits to describe it afterwards (since it is known to be solved).

If the information stored on a Rubik's cube is equivalent to any other type of physically-stored information, then by Landauer's principle we might expect the machine to have to give off a heat of $T \mathrm{d}S \sim 65 T k_B \ln(2)$, but is it valid to apply Landauer's principle to the information stored in such a manner? What sort of argument does it take to say that a certain type of information is physically meaningful, such that destroying it necessitates paying an entropy cost somewhere else?
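As a quick numerical check, the Landauer bound quoted above can be evaluated directly; the room-temperature value of $T = 300\,\mathrm{K}$ below is an illustrative assumption, not part of the question:

```python
# Evaluate the Landauer bound 65 k_B T ln(2) at an assumed room temperature.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed temperature, K (illustrative choice)
bits = 65           # bits needed to describe the scrambled cube state

Q = bits * k_B * T * math.log(2)  # minimum heat released, in joules
print(f"Landauer bound: {Q:.3e} J per solved cube")
```

At room temperature this comes to roughly $2 \times 10^{-19}$ J per cube, far below anything measurable for a macroscopic machine.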

6 Answers

36

Let's suppose you have a Rubik's cube that's made of a small number of atoms at a low temperature, so that you can make moves without any frictional dissipation at all, and let's suppose that the cube is initialised to a random one of its $\sim 2^{65}$ possible states. Now if you want to solve this cube you will have to measure its state. In principle you can do this without dissipating any energy. Once you know the moves you need to make to solve the cube, these can also be made without dissipating any energy.

So now you decide to build a machine that will solve the cube without dissipating any energy. First it measures the state and stores it in some digital memory. Then it calculates the moves required to solve the cube from this position. (In principle this needn't generate any heat either.) Then it makes those moves, solving the cube.

None of these steps need to give off any heat in principle, but your machine ends in a different state than the state it starts in. At the end of the process, 65 bits' worth of the machine's state has effectively been randomised, because it still contains the information about the cube's initial state. If you want to reset the machine so that it can solve another cube, you will have to reset those bits of state back to their initial conditions, and that's what has to dissipate energy according to Landauer's principle.

In the end the answer is just that you have to pay an entropy cost to erase information in all cases where you actually need to erase that information. If you only wanted to solve a finite number of cubes then you can just make the memory big enough to store all the resulting information, so there's no need to erase it, and no heat needs to be generated. But if you want to build a finite-sized machine that can keep on solving cubes indefinitely then eventually dumping entropy into the environment will be a necessity.
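The bookkeeping in this answer can be sketched in a few lines. The machine below is a deliberate toy: solving relocates the 65 bits describing each cube's initial state into memory at no cost, and only the reset step, which erases that memory, is charged the Landauer heat (room temperature $T = 300\,\mathrm{K}$ is an assumption):

```python
# Toy model: a reversible solver whose memory grows by 65 bits per cube;
# only erasing the memory dissipates heat, per Landauer's principle.
import math
import random

k_B, T = 1.380649e-23, 300.0  # Boltzmann constant (J/K); assumed temperature (K)

class ReversibleSolver:
    def __init__(self):
        self.memory = []  # grows by one 65-bit record per cube solved

    def solve(self, scrambled_state):
        # Measuring the state and undoing it dissipates nothing in principle;
        # the initial state is merely moved into the machine's memory.
        self.memory.append(scrambled_state)
        return "solved"

    def reset(self):
        # Erasing the memory is the only step that must give off heat.
        bits = 65 * len(self.memory)
        self.memory.clear()
        return bits * k_B * T * math.log(2)  # minimum heat, in joules

machine = ReversibleSolver()
for _ in range(10):
    machine.solve(random.getrandbits(65))  # a random 65-bit integer stands in for a scramble
print(f"heat to reset after 10 cubes: {machine.reset():.3e} J")
```

The point the sketch makes is that the heat bill scales with the number of cubes solved since the last reset, not with solving itself.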

This is the case for Maxwell's demon as well: if the demon is allowed to have an infinite memory, all initialised to a known state, then it need not ever dissipate any energy. But giving it an infinite memory is very much the same thing as giving it an infinite source of energy; it's able to indefinitely reduce the thermodynamic entropy of its surroundings only by indefinitely increasing the information entropy of its own internal state.

N. Virgo
  • 35,274
3

In principle I agree with your analysis, but I don't agree with the conclusion. From an algorithmic point of view, you can solve the cube without expending any heat, as long as no information is lost. So in principle you can have an extra cube in a known state, which you then transform in tandem with the cube you are trying to solve. The initial state of the first cube is then encoded in the final state of the second cube. In the language of reversible computing, the second cube is an ancillary variable.
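This ancilla idea can be illustrated by modelling a cube configuration as a permutation (a deliberate simplification; the actual Rubik's cube group has more structure). Applying the solving moves to both cubes leaves the first solved and the second holding exactly the information needed to recover the original scramble:

```python
# Sketch: solving one "cube" while an ancilla cube records the moves,
# with cube configurations modelled as permutations of n labels.
import random

def compose(p, q):
    """Permutation that applies q first, then p: (p∘q)[i] = p[q[i]]."""
    return [p[i] for i in q]

def inverse(p):
    """Inverse permutation of p."""
    inv = [0] * len(p)
    for i, v in enumerate(p):
        inv[v] = i
    return inv

n = 8
identity = list(range(n))
scramble = random.sample(range(n), n)   # the unknown initial configuration
solution = inverse(scramble)            # the move sequence that solves it

cube = compose(solution, scramble)      # main cube: now solved
ancilla = compose(solution, identity)   # ancilla: started solved, recorded the moves

assert cube == identity                 # the first cube is solved...
assert inverse(ancilla) == scramble     # ...and its initial state survives in the ancilla
print("initial state recovered from ancilla:", inverse(ancilla) == scramble)
```

No information is destroyed at any step, so nothing forces a Landauer cost; the scramble has simply migrated into the ancilla.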

lionelbrits
  • 9,543
1

I actually read the title in a different way, so let me answer a different question: what is the minimum thermodynamic requirement to solve a cube? Now, if you analyze the starting position (which some algebraists have done), then you know how many moves it takes to solve. If you do a weighted sum over all starting states, i.e. weighted by the number of moves to solution from each state, you quickly find the expected energy (in "move units"), the standard deviation, etc.
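The weighted sum described here is straightforward to compute once you have the histogram of positions by move depth. The counts below are made-up placeholders, not the real distribution over the cube's states:

```python
# Expected number of moves and its spread, from a (hypothetical) histogram
# mapping move depth d -> number of starting positions needing d moves.
import math

counts = {18: 100, 19: 400, 20: 25}  # placeholder values, not real cube data

total = sum(counts.values())
mean = sum(d * n for d, n in counts.items()) / total
var = sum(n * (d - mean) ** 2 for d, n in counts.items()) / total
print(f"expected moves: {mean:.3f}, std. dev.: {math.sqrt(var):.3f}")
```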

I guess this is more boring than the question intended :-( .

Carl Witthoft
  • 11,106
1

Suppose I build a machine which will be given Rubik's cubes that have been scrambled to one of the $\sim 2^{65}$ possible positions of the cube, chosen uniformly at random. Is it possible for the machine to solve the cubes without giving off any heat?

If by "giving off heat" you mean the conversion of mechanical/electrical energy into internal energy, then in practice no: in real machines there is always some friction and dissipation of mechanical/electrical energy, and it is extremely hard to prevent entirely whenever motion is involved.

In theory, if we could build a machine that transforms the cube without dissipating energy (obeying reversible mechanics, in which no heat appears, or operating with a negligible amount of energy, i.e. slowly), then I think the answer is yes: there are algorithms for solving the Rubik's cube, and I see no reason why such a machine could not run them. I am not sure, though.

Solving the cube might be thought to consist of destroying about 65 bits of information because it takes 65 bits to describe the cube's state before entering the machine, but zero bits to describe it afterwards (since it is known to be solved).

If by "destroying information" you mean "restoring the cube to the solved state and resetting the machine to its ready state," then I agree, in the sense that after the cube is solved the information about its initial state can no longer be recovered from the cube itself.

However, let me elaborate on one point that is often confused: a physical state is not information. Using the phrase "information gets destroyed" muddles the analysis, because the process actually results in an increase of information about the cube: we did not know the initial state, but in the end we know the cube is solved.

That's why it is important to distinguish between the physical state of the cube and information about that state. What gets destroyed in the process is not information but the initial physical state; the information actually increases.

Of course, the information about the initial state may still be acquirable from the state of the machine or its environment.

...by Landauer's principle we might expect the machine to have to give off a heat of $T \mathrm{d}S \sim 65 T k_B \ln(2)$, but is it valid to apply Landauer's principle to the information stored in such a manner?

No.

When the machine is reset by the action of the environment, the information entropy of the machine plus cube decreases. If information entropy were the same thing as thermodynamic entropy, and if the whole process could reasonably be described as a reversible thermodynamic process, one might think this must be accompanied by the system giving off heat to the environment, since Clausius showed that in such a case $\Delta S_{\mathrm{thermodynamic}} = \int \mathrm{d}Q/T$.

But this is not the case at all. Even if we assume that the information entropy of the environment increases as a result of the process, this by itself is not sufficient to conclude that its thermodynamic entropy does the same. Thermodynamic entropy may not even be applicable to the environment; and even if it is, the whole process may still occur with arbitrarily small energy transfer, so no lower bound on the amount of heat can be inferred.

I do not understand why people put so much belief and enthusiasm in Landauer's principle. The concepts of temperature, heat transfer, and thermodynamic entropy have limited applicability; their proper domain is the thermodynamics of macroscopic systems. It makes little sense to complicate the description of computational processes by forcing them into the limited terms of thermodynamics or statistical physics.

What sort of argument does it take to say that a certain type of information is physically meaningful, such that destroying it necessitates paying an entropy cost somewhere else?

I am not sure why you use the expression "physically meaningful". Information is not a physical property of bodies; it is a non-physical concept. Originally, information resides in the mind. It may then be encoded into the physical state of another body, such as a book, a hard drive, or a Rubik's cube, but a mind is still needed to transform that state back into information.

However, an entropic cost is plausible, that is, an information-entropy cost. After the environment has interacted with a system in an unknown state (the Rubik's cube), the amount of information we have about the environment most probably decreases. This means the information entropy (our ignorance of the environment's state) increases; hence the cost.

However, I would like to stress again that none of this directly implies a change in thermodynamic entropy (or the generation of heat) in any of these systems.

Information entropy and thermodynamic entropy are very different concepts, and there is no universally valid correlation between their changes. Only in a thermodynamically reversible process do they correspond to each other, and it is not necessary that the environment undergoes such a process while the machine solves the Rubik's cube.

0

Depending on how broadly you interpret the idea of a Rubik's cube, a quantum-mechanical version requires no heat either to randomize or to solve. Suppose we have a virtual cube whose state is represented by 65 qubits. It is desirable for the different states of the system to have very low coupling, but in practice they must have some, so a system that begins in a basis state, with each bit having a definite value, will in the long run evolve into a superposition. To randomize the system we wait a very long (but random) time and then read the qubits. We then perform a series of unitary operations to return the qubits to the state $|0,0,0,\dots,0\rangle$, representing a solved cube. Since in principle neither operation requires energy, there is no thermodynamic cost.
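The unitary "solve" step above can be illustrated numerically with 3 qubits instead of 65 (so the state vector fits in memory), assuming NumPy is available. A random unitary stands in for the scrambling evolution; applying its inverse returns the state to $|0,0,0\rangle$ exactly, with no information lost:

```python
# Scramble a 3-qubit state with a random unitary, then undo it exactly
# by applying the inverse (conjugate transpose) unitary.
import numpy as np

rng = np.random.default_rng(0)

n = 3
dim = 2 ** n
solved = np.zeros(dim)
solved[0] = 1.0                      # |0,0,0> represents the solved cube

# QR decomposition of a complex Gaussian matrix yields a random unitary.
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
U, _ = np.linalg.qr(A)

scrambled = U @ solved               # the cube after its long random wait
restored = U.conj().T @ scrambled    # the inverse unitary "solves" it

print("max deviation from |000>:", np.abs(restored - solved).max())
```

Because the evolution is unitary (hence reversible), no bits are erased and Landauer's principle charges nothing.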

user27118
  • 793
-1

A Rubik's cube can store information, and that information can be changed; a Rubik's cube is a memory device. Changing one bit of information in a Rubik's cube takes at least an energy of $kT \ln 2$. That is Landauer's principle.

The energy spent changing a bit becomes heat energy of the Rubik's cube.

A virtual Rubik's cube in a computer's memory obeys the same law.

stuffu
  • 2,374