A battery has a nominal charge capacity \$Q_N\$ on the order of 1 to 100 Ah for common consumer types (AA cell, phone battery, car battery, ...). The battery also has a nominal voltage \$U_N\$ determined by its chemistry and construction.
An oversimplified model of a battery gives a constant output voltage until its charge is depleted. If you apply a constant load to it (e.g. resistors and LEDs, no switching), the time you can run the load from this battery is
\$t = \frac{Q_N}{I} = Q_N\cdot\frac{R_{total}}{U_N}\$, where \$I\$ is the constant discharge current.
As can be seen, increasing the resistance \$R_{total}\$ increases the battery runtime. The question, however, is whether the circuit can still fulfill its purpose: an LED that receives less current because of a larger series resistor emits less light, possibly making it useless for room lighting or as a torch.
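To make the trade-off concrete, here is a minimal Python sketch of the ideal-model formula above. The battery values (a 9 V block with \$Q_N\$ = 0.5 Ah) and the resistor values are assumptions chosen only for illustration:

```python
# Ideal-battery runtime: t = Q_N / I = Q_N * R_total / U_N
# (example values are assumptions, not taken from a datasheet)

Q_N = 0.5   # nominal charge capacity in Ah (assumed)
U_N = 9.0   # nominal voltage in V (assumed)

for R_total in (100.0, 470.0, 1000.0):   # total load resistance in ohms
    I = U_N / R_total                    # constant discharge current in A
    t = Q_N / I                          # runtime in hours
    print(f"R_total = {R_total:6.0f} ohm -> I = {I*1000:5.1f} mA, runtime = {t:5.1f} h")
```

The larger resistor gives roughly ten times the runtime of the smallest one, but also only a tenth of the LED current.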
Real batteries, however, are far more complicated and have to be modeled with at least a series resistance. There are two effects in a battery which actually increase its usable charge \$Q_u\$ beyond the nominal capacity when you apply smaller discharge currents (i.e. a higher load resistance):
- The Peukert effect describes that more charge can be extracted from a battery if it is discharged with a low (constant) current (a numerical sketch follows below).
- The recovery effect says that during periods of low or no discharge current, the usable charge that was reduced by high-current loads is partially replenished.
The reason for both is the chemistry inside the battery. Battery runtime is therefore affected by far more factors than Ohm's Law alone. These effects do not create charge out of nowhere; they show that the discharge process is more efficient at lower currents.
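For a rough feel of the Peukert effect, here is a small Python sketch of the common empirical form \$t = H \cdot \left(\frac{C}{I\,H}\right)^k\$, where \$H\$ is the rated discharge time, \$C\$ the capacity at that rate, \$I\$ the actual discharge current and \$k\$ the Peukert constant. The numbers used (a 100 Ah lead-acid battery at the 20-hour rate, \$k \approx 1.2\$) are assumptions for illustration only:

```python
# Peukert's law, practical form: t = H * (C / (I * H)) ** k
# (battery rating and Peukert constant below are assumed example values)

def peukert_runtime(capacity_ah, rated_hours, peukert_k, current_a):
    """Runtime in hours for a constant discharge current per Peukert's law."""
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** peukert_k

C, H, k = 100.0, 20.0, 1.2          # assumed 100 Ah @ 20 h rate, k = 1.2
for current in (1.0, 5.0, 20.0):    # low, rated (C/H) and high discharge current
    t = peukert_runtime(C, H, k, current)
    print(f"I = {current:5.1f} A -> runtime {t:6.1f} h, usable charge {current * t:6.1f} Ah")
```

At the rated current \$C/H\$ the sketch returns exactly the nominal capacity; below that current the usable charge comes out higher than 100 Ah, above it noticeably lower.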
Parts of this answer are excerpts from a related, older answer of mine.