
In chapter 9, Goldstein ($3^{rd}$ ed.) discusses a few "trivial special cases" of canonical transformations that keep the form of the Hamiltonian unchanged, and calls them identity transformations. Recently I've encountered another generating function, $$F_1(q,Q)= {1\over2}m\omega \cot\theta\, (Q^2+q^2)-{m \omega qQ \over \sin \theta} ,$$ which also generates an identity transformation for the Hamiltonian $$H={p^2 \over 2m}+{1 \over 2}m \omega^2 q^2.$$ I must add that $F_1$ merely generates a "rotation + some scaling" in phase space, $$Q=q\cos\theta-{p \over m\omega}\sin\theta,$$ $$P=m\omega q\sin\theta+p\cos\theta,$$ where $\theta$ is not a function of time. In any case, one has to go through a lengthy calculation (deriving the transformation equations from the generating function, expressing the old variables in terms of the new ones, substituting them into the old Hamiltonian, and reading off the new Hamiltonian) before discovering that it is just an identity transformation. This motivated me to ask the following questions:

  1. Is there a way to know beforehand whether a generating function produces an identity transformation or not?
  2. An identity transformation adds no benefit to the problem, since it leaves the form of the Hamiltonian unchanged and produces the same equations of motion. So why do we study such transformations?
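
For reference, the "lengthy calculation" can also be carried out symbolically; here is a minimal sympy sketch (the variable names are my own) that derives the transformation equations from $F_1$ and confirms the form invariance:

```python
import sympy as sp

q, p = sp.symbols('q p', real=True)
Q = sp.Symbol('Q', real=True)
m, w, th = sp.symbols('m omega theta', positive=True)

# Type-1 generating function F1(q, Q)
F1 = sp.Rational(1, 2)*m*w*sp.cot(th)*(Q**2 + q**2) - m*w*q*Q/sp.sin(th)

# Transformation equations: p = dF1/dq (solve for Q), P = -dF1/dQ
Q_new = sp.solve(sp.Eq(p, sp.diff(F1, q)), Q)[0]
P_new = sp.simplify((-sp.diff(F1, Q)).subs(Q, Q_new))

# Recover the rotation quoted above
print(sp.simplify(Q_new - (q*sp.cos(th) - p*sp.sin(th)/(m*w))))  # 0
print(sp.simplify(P_new - (m*w*q*sp.sin(th) + p*sp.cos(th))))    # 0

# Form invariance: the oscillator Hamiltonian written in the new
# variables reproduces the old one identically.
H = p**2/(2*m) + sp.Rational(1, 2)*m*w**2*q**2
print(sp.simplify(P_new**2/(2*m) + sp.Rational(1, 2)*m*w**2*Q_new**2 - H))  # 0
```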

1 Answer


First, the scaling you mention amounts to a failure to clean up your variables: define $$ \tilde p\equiv \frac{p}{m\omega}, $$ likewise $\tilde P\equiv \frac{P}{m\omega}$, and $$ \tilde H \equiv \frac{H}{m\omega^2}= \frac{1}{2} (\tilde p^2+ q^2). $$

It is then apparent that the transformation $$ Q=q \cos \theta -\tilde p \sin\theta\\ \tilde P= \tilde p \cos\theta + q \sin\theta, \tag{0} $$ the rotation you observed, is a continuous symmetry of your $\tilde H$, a rotational scalar.

It is also manifestly a canonical transformation, without even considering generating functions (which are wont to confuse you), since it preserves Poisson brackets, essentially by inspection: $$ \{ Q,\tilde P \}=1 $$ (and, of course, $\{ Q,Q \}=0, \quad \{ \tilde P,\tilde P \}=0$).
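
If you want to see both facts spelled out, here is a minimal symbolic sketch (sympy, with the bracket in the canonical pair $(q,\tilde p)$ written out by hand):

```python
import sympy as sp

q, pt, th = sp.symbols('q ptilde theta', real=True)

def pb(f, h):
    """Poisson bracket in the canonical pair (q, ptilde)."""
    return sp.diff(f, q)*sp.diff(h, pt) - sp.diff(f, pt)*sp.diff(h, q)

# The rotation (0)
Q = q*sp.cos(th) - pt*sp.sin(th)
Pt = pt*sp.cos(th) + q*sp.sin(th)

print(sp.simplify(pb(Q, Pt)))                           # 1 : canonical
print(sp.simplify(sp.Rational(1, 2)*(Q**2 + Pt**2)
                  - sp.Rational(1, 2)*(q**2 + pt**2)))  # 0 : Htilde invariant
```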

But, in your particular case, the generator which produces your canonical transformation by Lie Poisson commutation, $$ Q= e^{\{ G, \bullet \}} ~~ q = q +\{ G, q \} + \frac{1}{2} \{ G , \{ G, q \} \} +\frac{1}{3!}\{G,\{ G, \{ G, q\}\}\}+ ... \\ \tilde P = e^{\{ G, \bullet \}} ~~ \tilde p = \tilde p +\{ G, \tilde p \} + \frac{1}{2} \{ G , \{ G, \tilde p \} \} +... \tag{1} $$ happens to be $$ G= \theta \tilde H, $$ so, of course, being proportional to $\tilde H$, it Poisson-commutes with it, $\{ G, \tilde H \}=0$. (Do check that, since $\{G,q\}=-\theta \tilde p$, etc., the above series sum to the exact finite rotation (0).) Consequently, the transformation it generates leaves the Hamiltonian invariant. Your criterion is Poisson-commutation with the Hamiltonian.
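
One painless way to do that check is to truncate the series (1) at some finite order and compare with the Taylor expansion of the finite rotation (0); a minimal sympy sketch (the truncation order $N=8$ is arbitrary):

```python
import sympy as sp

q, pt, th = sp.symbols('q ptilde theta', real=True)

def pb(f, h):
    return sp.diff(f, q)*sp.diff(h, pt) - sp.diff(f, pt)*sp.diff(h, q)

G = th*sp.Rational(1, 2)*(pt**2 + q**2)           # G = theta * Htilde

def lie_series(f, order):
    """Truncation of (1): sum_n (1/n!) {G, {G, ... {G, f}...}}."""
    total, term = f, f
    for n in range(1, order + 1):
        term = pb(G, term)
        total += term/sp.factorial(n)
    return sp.expand(total)

N = 8
Q_exact = sp.series(q*sp.cos(th) - pt*sp.sin(th), th, 0, N + 1).removeO()
Pt_exact = sp.series(pt*sp.cos(th) + q*sp.sin(th), th, 0, N + 1).removeO()

print(sp.simplify(lie_series(q, N) - Q_exact))    # 0 through order theta^N
print(sp.simplify(lie_series(pt, N) - Pt_exact))  # 0 through order theta^N
```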

  • But this is exactly how this Hamiltonian, through Lie iteration of Hamilton's equations, acts to generate motion, here a mere phase-space rotation. (Coincidentally, here, you are replicating the general fact that motion is a canonical transformation, crucially important in QM.)

To sum up, in this language, given a generator $G(q,\tilde p)$ Poisson-commuting with the Hamiltonian, it will be a symmetry thereof, so the (provably canonical, below) transformations it generates will leave the Hamiltonian invariant.

I am not sure I understand your second point, since I cannot fathom your "no benefits to the problem". Canonical changes of variables allow you to simplify problems to the point where their solutions are virtually self-evident. Have you gotten to the action-angle variables part of your course? The fact that motion is a canonical transformation and preserves the local dynamics of the theory in question is also central.


  • Proof of the generic finite transformation (1) being canonical, as per comment. (Geeky)

For generic $G=\theta g(q,p)$, not just the above, is the following quantity =1? $$ Y(\theta)= \{Q,P \} \equiv \{ e^{\{\theta g , \bullet \}} ~q ~,~ e^{\{ \theta g, \bullet \}} ~p \} \\ \equiv 1+\theta y_1 +\theta^2 y_2+\theta^3 y_3+ ... $$

Consider, for any $\theta$-independent function $f(q,p)$, $$ \frac{\partial (e^{\{ \theta g, \bullet \}}~ f)} {\partial \theta}= \{ g ~,~ e^{\{ \theta g, \bullet \}} ~f \}, $$ implying, by the Jacobi identity, $$ \frac{\partial Y(\theta) } {\partial \theta}= \{\{ g,Q\},P\}+ \{Q,\{g,P\}\}=\{g,Y(\theta)\}. $$

So, then, $(n+1)\,y_{n+1}=\{ g,y_n\}$, constraining all higher coefficients to vanish, since $y_1=\{g,y_0\}=\{g,1\}=0$; hence $Y(\theta)=1$, independent of $\theta$.
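
A spot check of this argument, for an unspecified $g(q,p)$: truncate (1) at finite order $N$, so the coefficients $y_1,\dots,y_N$ of $Y(\theta)$ must vanish (a minimal sympy sketch; $N=4$ is arbitrary):

```python
import sympy as sp

q, p, th = sp.symbols('q p theta', real=True)
g = sp.Function('g')(q, p)          # unspecified generator g(q, p)

def pb(f, h):
    return sp.diff(f, q)*sp.diff(h, p) - sp.diff(f, p)*sp.diff(h, q)

def lie(f, order):
    """Truncation of (1) with generator theta*g."""
    total, term = f, f
    for n in range(1, order + 1):
        term = pb(g, term)
        total += th**n*term/sp.factorial(n)
    return total

N = 4
Y = sp.expand(pb(lie(q, N), lie(p, N)))
print([sp.simplify(Y.coeff(th, k)) for k in range(N + 1)])
# [1, 0, 0, 0, 0] : Y(theta) = 1 up to the truncation order
```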

In the same breath, you may prove the all-orders converse Noether's theorem, $\{g,H\}=0 \Longrightarrow H(Q,P)=H(q,p)$, paradoxically easier to prove in QM (!), but available in Arnold's classic text. Here, I will just all but remind you the lowest-order result is evident by inspection, $$ H(Q,P)=H(q,p)+ \theta ~ \left (-\frac{\partial H}{\partial q} \frac{\partial g}{\partial p}+\frac{\partial H}{\partial p}\frac{\partial g}{\partial q} \right )+ O(\theta^2)\\ = H(q,p)+ \theta ~ \{ g,H\} + O(\theta^2)=H(q,p)+ O(\theta^2). $$ The $O(\theta^2)$ vanishes as well, since, by $\{Q,P\}=1$, it does not matter which basis our PBs are in; and hence, likewise and by the above proof, $\partial H(Q,P)/\partial \theta= ... = \{g,H\}=0$, to all orders in $\theta$.
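
As a concrete illustration of the converse statement (not a proof), one may take a 2D oscillator in scaled units and the angular momentum $g=q_1p_2-q_2p_1$, which Poisson-commutes with it, and check order by order that the finite transformation it generates leaves $H$ invariant (a minimal sympy sketch):

```python
import sympy as sp

q1, q2, p1, p2, th = sp.symbols('q1 q2 p1 p2 theta', real=True)

def pb(f, h):
    """Poisson bracket for the two canonical pairs (q1,p1), (q2,p2)."""
    return sum(sp.diff(f, x)*sp.diff(h, y) - sp.diff(f, y)*sp.diff(h, x)
               for x, y in [(q1, p1), (q2, p2)])

def H_of(x1, x2, y1, y2):           # 2D oscillator, scaled units
    return sp.Rational(1, 2)*(x1**2 + x2**2 + y1**2 + y2**2)

H = H_of(q1, q2, p1, p2)
g = q1*p2 - q2*p1                   # angular momentum

print(sp.simplify(pb(g, H)))        # 0 : g Poisson-commutes with H

def lie(f, order):
    """Truncation of (1) with generator theta*g."""
    total, term = f, f
    for n in range(1, order + 1):
        term = pb(g, term)
        total += th**n*term/sp.factorial(n)
    return total

N = 6
diff = sp.expand(H_of(lie(q1, N), lie(q2, N), lie(p1, N), lie(p2, N)) - H)
print([sp.simplify(diff.coeff(th, k)) for k in range(N + 1)])   # all zeros through theta^N
```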

Cosmas Zachos