Suppose I build a machine which will be given Rubik's cubes that have been scrambled to one of the ~$2^{65}$ possible positions of the cube, chosen uniformly at random. Is it possible for the machine to solve the cubes without giving off any heat?
If by "giving off heat" you mean conversion of mechanical/electrical energy into internal energy, then in practice no: real machines always have some friction and dissipation of mechanical or electrical energy, and this is extremely hard to prevent entirely whenever motion is involved.
In theory, if we could build a machine that manipulates the cube without dissipation of energy (obeying reversible mechanics in which heat plays no role, or operating slowly with a negligible amount of energy), then I think the answer is yes: there are algorithms for solving the Rubik's cube, and I see no reason why such a machine could not run them. I am not sure, though.
Solving the cube might be thought to consist of destroying about 65 bits of information because it takes 65 bits to describe the cube's state before entering the machine, but zero bits to describe it afterwards (since it is known to be solved).
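The figure of ~65 bits can be checked directly from the standard count of reachable cube positions; a minimal Python sketch:

```python
import math

# Reachable positions of the 3x3x3 Rubik's cube:
# 8! corner permutations, 3^7 corner orientations,
# 12!/2 edge permutations (parity is linked to the corners),
# and 2^11 edge orientations.
positions = math.factorial(8) * 3**7 * (math.factorial(12) // 2) * 2**11
print(positions)             # 43252003274489856000
print(math.log2(positions))  # ~65.2 bits needed to name one position
```

So a uniformly random scramble carries about 65.2 bits of information, which is where the "about 65 bits" in the question comes from.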
If by "destroy information" you mean "restore the cube to the solved state and reset the machine to a ready state", then I agree, in the sense that after the cube is solved, information about its initial state can no longer be acquired from it.
However, let me elaborate on a point that is often confusing: physical state is not information. Saying "information gets destroyed" muddles the analysis, because the process actually results in an increase of information about the cube; we did not know its initial state, but in the end we know it is solved.
That is why it is important to distinguish between the physical state of the cube and information about that state. What gets destroyed in the process is not information but the initial physical state; the information actually increases.
Of course, the information about the initial state may still be acquirable from the state of the machine or its environment.
...by Landauer's principle we might expect the machine to have to give off a heat of $T\Delta S \approx 65\,k_B T\ln 2$, but is it valid to apply Landauer's principle to information stored in such a manner?
No.
When the machine is reset by the action of the environment, the information entropy of the machine + cube decreases. If information entropy were the same thing as thermodynamic entropy, and the whole process could reasonably be described as a reversible thermodynamic process, one might think this must be accompanied by the system giving off heat to the environment, since Clausius showed that in such a case $\Delta S_{thermodynamic} = \int dQ/T$.
But this is not the case at all. Even if we assume that the information entropy of the environment increases as a result of the process, that by itself is not sufficient to conclude that its thermodynamic entropy does the same; thermodynamic entropy may not even be applicable to the environment. Even if it is, the whole process may still occur with an arbitrarily small energy transfer, so no lower bound on the amount of heat is implied.
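For a sense of scale, even if the Landauer bound from the question did apply, the heat involved would be minuscule. A quick Python estimate at an assumed room temperature of 300 K:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # assumed room temperature, K
bits = 65           # bits in the question's estimate

heat = bits * k_B * T * math.log(2)
print(heat)  # ~1.87e-19 J, roughly the energy of a single infrared photon
```

About $2 \times 10^{-19}$ J would be far below anything measurable against the friction of a real cube-turning mechanism.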
I do not understand why people put so much belief and enthusiasm in the Landauer principle. The concepts of temperature, heat transfer and thermodynamic entropy are of limited applicability; their proper area of use is the thermodynamics of macroscopic systems. It makes little sense to complicate the description of computational processes with the limited terms of thermodynamics or statistical physics.
What sort of argument does it take to say that a certain type of information is physically meaningful, such that destroying it necessitates paying an entropy cost somewhere else?
I am not sure why you use the expression "physically meaningful". Information is not a physical property of bodies; it is a non-physical concept. Originally, information resides in the mind. It may then be encoded into the physical state of another body, such as a book, a hard drive or a Rubik's cube, but the mind is still needed to transform that state into information.
However, an entropic cost is plausible, namely an information-entropy cost. After the environment has interacted with a system in an unknown state (the Rubik's cube), the amount of information we have about the environment most probably decreases. This means the information entropy (our ignorance of the environment's state) increases; that is the cost.
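"Information entropy" here is Shannon entropy, i.e. ignorance measured in bits. A small sketch of the definition (the distributions are hypothetical, just to illustrate):

```python
import math

def shannon_entropy_bits(probs):
    """H = -sum(p * log2(p)): ignorance about a state, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy_bits([1.0]))       # 0.0 -- state known with certainty
print(shannon_entropy_bits([0.5, 0.5]))  # 1.0 -- one bit of ignorance
# Uniform ignorance over N states is log2(N) bits,
# e.g. ~65.2 bits for a uniformly scrambled cube.
```

Knowing the cube is solved corresponds to the zero-entropy case; the cost discussed above is this quantity increasing for the environment, not heat.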
However, let me say again that this has no direct implication for a change in thermodynamic entropy (or generation of heat) in any of these systems.
Information entropy and thermodynamic entropy are very different concepts, and there is no universally valid correlation between their changes. Only in a thermodynamically reversible process do their changes correspond to each other, and it is not necessary that the environment undergoes such a process as the machine solves the Rubik's cube.