There is indeed a pretty fundamental relationship between data rate and range. It's embodied in the idea that, for a certain data purity (i.e. an acceptably small number of bit errors), there is a certain energy per bit required to "reliably" counter the background noise.
The background noise is related to the receiver bandwidth, and the higher the data rate, the wider the bandwidth needs to be. Thus wide-bandwidth transmission links need more power to reach a certain distance and overcome the noise the receiver picks up. It all comes down to the following equation: -
Power required by the receiver is \$-154\text{ dBm} + 10\log_{10}(\text{data rate in bit/s})\text{ dB}\$. The -154 dBm constant is the room-temperature thermal noise floor of -174 dBm/Hz plus roughly 20 dB to cover the receiver's noise figure and the energy-per-bit to noise-density ratio (\$E_b/N_0\$) needed for an acceptably low error rate.
So, a 1 Mbps link requires a receive signal level of -94 dBm, for example.
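To put numbers to that equation, here's a minimal Python sketch (the function name and the sample data rates are my own choices for illustration): -

```python
import math

def required_rx_power_dbm(data_rate_bps):
    """Minimum receive signal level in dBm for a given data rate,
    per the rule of thumb: -154 dBm + 10*log10(data rate)."""
    return -154 + 10 * math.log10(data_rate_bps)

for rate_bps in (1e6, 2e6, 11e6, 54e6):
    print(f"{rate_bps / 1e6:>4.0f} Mbps -> {required_rx_power_dbm(rate_bps):6.1f} dBm")
```

For 1 Mbps this gives the -94 dBm quoted above, and for 11 Mbps it gives the -83.6 dBm discussed below.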
Some WiFi units may specify a slightly higher figure, but it all comes down to the bit error rate their manufacturers judged to be acceptable. Here's a useful table: -
*(table image: 802.11 data rates vs. minimum receiver sensitivity and theoretical free-space range)*
Stolen from this website; please note the theoretical free-space ranges quoted have feet and meters mixed up.
Some of the figures in the table above assume different packet error rates than others; for example, the 11 Mbps data rate should theoretically require -83.6 dBm, but the table specifies -90 dBm. Different modulation schemes can also introduce variability in the value. See this page for a few more pointers: -
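If you want to sanity-check those theoretical free-space ranges yourself, here's a sketch that inverts the Friis free-space path-loss equation; the 20 dBm transmit power, 2.4 GHz carrier and 0 dBi antenna gains are assumptions of mine, not figures taken from the table: -

```python
import math

C = 3e8  # speed of light, m/s

def free_space_range_m(tx_power_dbm, rx_sensitivity_dbm, freq_hz):
    """Distance at which free-space path loss uses up the whole link
    budget, assuming 0 dBi antennas and no fading margin (Friis)."""
    allowed_path_loss_db = tx_power_dbm - rx_sensitivity_dbm
    return (C / (4 * math.pi * freq_hz)) * 10 ** (allowed_path_loss_db / 20)

# Assumed link: 20 dBm (100 mW) transmitter at 2.4 GHz
for sens_dbm in (-94.0, -90.0, -83.6):
    print(f"sensitivity {sens_dbm:6.1f} dBm -> "
          f"{free_space_range_m(20, sens_dbm, 2.4e9):7.0f} m")
```

Under those assumptions the 6.4 dB gap between -90 dBm and -83.6 dBm roughly doubles the theoretical range at 11 Mbps, which shows how sensitive the quoted ranges are to the assumed error rate.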
