TL;DR: The reason some qubits need to operate at very low temperature is to avoid noise, which would degrade the fidelity of the quantum gates.
For this reason, a room-temperature superconductor would be a necessary but not sufficient condition to have room-temperature superconducting qubits.
Longer/more detailed answer with a concrete example:
Of course, to have a superconducting qubit, we need the material it is built from to be superconducting, which at ambient pressure means its temperature must be low. However, the temperature at which superconductivity is lost is usually much higher than the typical operating temperature of a transmon qubit (around $10\,\mathrm{mK}$). I explain why in what follows.
If you want to drive transmon qubits, you send microwave signals that interact with the qubit. If the waveguide in which these microwaves travel carries "undesired signal" (i.e. noise), the quantum gate you implement on your qubit will be noisy: you don't want that.
Putting the qubit at low temperature can suppress this noise.
In practice, a quantum treatment of the lines tells you that photons travel down the waveguide and interact with the qubit. There are then the "useful" photons (the ones in the signal needed to perform the quantum gates) and the "noisy" photons (the undesired ones).
The number of "noisy" photons in a waveguide at temperature $T$, at frequency $\omega/(2\pi)$, is given by the Bose-Einstein occupation: $n_B(\omega)=1/(e^{\hbar \omega/k_B T}-1)$, where $k_B$ is the Boltzmann constant. ( * )
What we want is that the photons at the qubit frequency $\omega_0/(2\pi)$ satisfy $n_B(\omega_0) \ll 1$, which in practice means $k_B T \ll \hbar \omega_0$. As $\omega_0/(2\pi) \approx 5\,\mathrm{GHz}$ for transmon qubits, a temperature $T$ in the range $[10\,\mathrm{mK},20\,\mathrm{mK}]$ is considered "low enough" by experimentalists. In principle, $T=0\,\mathrm{K}$ would be ideal, but cooling all the way to absolute zero would require an infinite amount of work: the Carnot coefficient of performance vanishes as the cold temperature goes to zero, so the work needed to extract a given amount of heat diverges. The range $\approx [10\,\mathrm{mK},20\,\mathrm{mK}]$ is therefore considered a good tradeoff by the community.
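To get a feel for the numbers, here is a minimal sketch (in Python, my addition rather than part of the original answer) that evaluates $n_B(\omega_0)$ at a $5\,\mathrm{GHz}$ qubit frequency for a few illustrative temperatures:

```python
# Minimal sketch: thermal photon occupation n_B(omega) = 1/(exp(hbar*omega/(k_B*T)) - 1)
# at a 5 GHz qubit frequency, for a few temperatures. The frequency and the list of
# temperatures are illustrative choices, not values taken from the original answer.
import numpy as np
from scipy.constants import hbar, k as k_B  # k is the Boltzmann constant

f0 = 5e9                      # qubit frequency in Hz (typical transmon value)
omega0 = 2 * np.pi * f0       # angular frequency

def n_B(T, omega=omega0):
    """Bose-Einstein occupation of the mode at angular frequency omega."""
    return 1.0 / np.expm1(hbar * omega / (k_B * T))

for T in [0.010, 0.020, 0.100, 4.0, 300.0]:   # temperatures in kelvin
    print(f"T = {T:7.3f} K  ->  n_B = {n_B(T):.3e}")
```

Running this shows that at $10$-$20\,\mathrm{mK}$ the thermal occupation at $5\,\mathrm{GHz}$ is many orders of magnitude below one, while already at $100\,\mathrm{mK}$ it is of order $0.1$.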
For superconducting qubits there are other mechanisms that can introduce noise if the qubits are not at low temperature (i.e., mechanisms that cannot be understood simply from the Bose-Einstein occupation). One example I am aware of is the presence of quasiparticles in the superconductor, which can corrupt the behavior of the qubit. Overall, one key reason why superconducting qubits have to be at low temperature is that we want to reduce the amount of noise felt by the qubits.
( * ) I oversimplified the story: in practice, because the signals are generated at room temperature, one end of your waveguide is at room temperature and the other at the qubit temperature (there is not a single temperature in the problem but rather a temperature gradient). However, by putting large attenuators on the waveguide (an attenuator is a kind of resistor), it is "in some sense" possible to treat the waveguide as if it were thermalized at the qubit's temperature (I skip these details as they are not important for the bulk of the question).
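One common way to model this (my addition, and an assumption rather than something stated in the original answer) is to treat each attenuator as a beam splitter: it attenuates the incoming noise and re-emits thermal noise at its own stage temperature. The stage temperatures and attenuation values below are typical of a dilution refrigerator but purely illustrative.

```python
# Sketch (assumed model, not from the original answer): cascaded attenuators
# treated as beam splitters. An attenuator with power attenuation A at
# temperature T transforms the photon occupation as
#   n_out = n_in / A + (1 - 1/A) * n_B(T),
# i.e. it suppresses the incoming noise and adds its own thermal noise.
import numpy as np
from scipy.constants import hbar, k as k_B

omega0 = 2 * np.pi * 5e9  # 5 GHz qubit frequency (illustrative)

def n_B(T, omega=omega0):
    return 1.0 / np.expm1(hbar * omega / (k_B * T))

def attenuate(n_in, A_dB, T_stage):
    A = 10 ** (A_dB / 10)                 # convert dB to a power ratio
    return n_in / A + (1 - 1 / A) * n_B(T_stage)

# Illustrative refrigerator stages: (attenuation in dB, temperature in K)
stages = [(20, 4.0), (20, 0.1), (20, 0.02)]

n = n_B(300.0)                            # start from room-temperature noise
for A_dB, T_stage in stages:
    n = attenuate(n, A_dB, T_stage)
print(f"effective photon number at the qubit: {n:.2e}")
```

With this assumed model and these illustrative values, the residual occupation reaching the qubit is of order $10^{-3}$, which is the sense in which heavy attenuation at the cold stages lets one treat the line as roughly thermalized at the qubit's temperature.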