In chemical kinetics, the rate constant for a first-order reaction depends on temperature according to the Arrhenius equation, $$ k=Ae^{\frac{-E_a}{RT}}, $$ due to the activation energy barrier. Radioactive decay also follows first-order kinetics, with a rate given by $$ \frac{\mathrm{d}N}{\mathrm{d}t}=- \lambda N, $$ where $\lambda$ is the decay constant. However, $\lambda$ is said to be independent of temperature, unlike chemical rate constants. Why doesn’t the decay constant exhibit a similar temperature dependence, given that both processes are first-order? Is there a fundamental difference in the underlying mechanisms, or does the nuclear energy scale simply render thermal effects negligible?
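To give a sense of what I mean by temperature sensitivity, here is a quick back-of-the-envelope sketch in Python (with an assumed activation energy of about 0.5 eV, which I picked as a typical chemical value just for illustration) showing how strongly the Boltzmann factor in the Arrhenius equation changes between 300 K and 350 K:

```python
import math

# Illustration only: evaluate the Boltzmann factor exp(-Ea / (kB*T)) for an
# assumed chemical activation energy (~0.5 eV) at two temperatures, to show
# why chemical rate constants are so sensitive to temperature.
kB = 8.617e-5          # Boltzmann constant, eV/K
Ea_chemical = 0.5      # assumed activation energy, eV (illustrative value)

for T in (300.0, 350.0):   # room temperature vs. mildly heated
    factor = math.exp(-Ea_chemical / (kB * T))
    print(f"T = {T:.0f} K: exp(-Ea/kT) = {factor:.3e}")
```

Even a modest 50 K increase changes that factor by more than an order of magnitude, yet the decay constant of a radioactive sample apparently does not budge.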
I'm new to radioactive decay, so I'm hoping for a high-school-level explanation of this topic.