When liquid water is at 100°C under 1 atm pressure, additional heat does not increase the kinetic energy of the water molecules; instead it increases the potential energy between them, driving the phase transition.
However, at the same 1 atm pressure but with water at 60°C, the same amount of added heat primarily increases the kinetic energy, raising the temperature.
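To make the contrast concrete, here is a quick back-of-the-envelope comparison using the standard specific heat and latent heat of vaporization of water (roughly $c \approx 4.18\ \mathrm{kJ/(kg\cdot K)}$ and $L_v \approx 2260\ \mathrm{kJ/kg}$):

$$
Q_{\text{sensible}} = m\,c\,\Delta T = (1\ \mathrm{kg})(4.18\ \mathrm{kJ/(kg\cdot K)})(1\ \mathrm{K}) \approx 4.18\ \mathrm{kJ\ per\ degree}
$$

$$
Q_{\text{latent}} = m\,L_v = (1\ \mathrm{kg})(2260\ \mathrm{kJ/kg}) = 2260\ \mathrm{kJ\ to\ vaporize}
$$

So below 100°C, each joule shows up as a temperature rise, while at 100°C the same joules are absorbed at constant temperature until the phase change completes.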
How does the added heat "know" when to increase potential energy versus kinetic energy? What's the fundamental mechanism that dictates this behavior?