Entropy
First, I'm just going to quickly explain what entropy is. Entropy is the number of ways a system could be rearranged without anyone noticing. Let's say Bob and Joe share an apartment, and Joe comes to Bob, obviously upset, saying "We've been ransacked!" Bob thinks this is nonsense, and as evidence points to his own room: "There are a couple of t-shirts on the floor, a couple of crushed soda cans, the sheets are a mess. Nothing's different!" But Joe says, "No, no, come to my room! See, the Shakespeare plays are out of alphabetical order, my music collection is all messed up, and the bed sheets are unmade instead of made. Ransacked!"
Now, think about Bob's room. There are probably a ton of different ways you could rearrange it without Bob noticing, right? You could throw an extra shirt onto the ground, you could leave three shirts inside out instead of two, you could shift one a little to the left. You could move the sheets on the bed back a few centimeters. The number of ways you could leave the room different without Bob noticing is a measure of its entropy. There are very few ways, maybe even zero, to rearrange Joe's room without him noticing, so his room is low entropy. But since there are many ways you could rearrange Bob's room so that it is different but he will not notice, his room is high entropy.
To translate this to real life, let's say you have a tea kettle, and it's giving off a bunch of steam. Pass your hand through the steam, stirring the molecules around, and it still looks the same, right? The system is high entropy. Now, stack some wooden blocks. Knock them over. You notice, right? The system is low entropy. Finally, imagine a box, mentally divided in half, and filled with gas. There are far fewer ways for the atoms of gas to arrange themselves on one side of the box than there are for them to spread out through the whole box. If the atoms are all on one side of the box, the gas is low entropy. If they are spread throughout, it is high entropy.
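To put actual numbers on the box example, here's a small sketch of my own (using a hypothetical 50 atoms, small enough to count exactly) that counts the microstates directly:

```python
from math import comb, log

N = 50  # hypothetical number of gas atoms, kept small enough to count exactly

# Each atom sits independently in the left or right half of the box,
# so there are 2**N microstates in total.
total = 2 ** N

# Exactly one of those microstates has every atom on the left side...
all_left = 1
# ...while the evenly spread arrangement (25 left, 25 right)
# can happen in "50 choose 25" different ways.
spread = comb(N, N // 2)

print(spread)  # 126410606437752 ways to be evenly spread, vs. 1 way to be all-left
# Boltzmann's entropy, in units of the Boltzmann constant, is S = ln(W):
print(log(all_left), log(spread))  # 0.0 for "all on one side", ~32.5 for "spread out"
```

So "all on one side" is a single arrangement out of about 10^15, which is exactly why it counts as low entropy.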
With this new understanding of entropy, Boltzmann was able to derive the Second Law of Thermodynamics in a statistical sense. Put simply, there are vastly more ways for a system to be high entropy than low entropy, so it is no wonder systems naturally drift toward higher entropy, while a spontaneous decrease in entropy is so overwhelmingly improbable that we never see it.
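You can watch this statistical drift in a toy simulation. The sketch below is my own illustration, using the Ehrenfest urn model (each step, one randomly chosen particle hops to the other half of the box) as a stand-in for gas mixing; it is not meant as a realistic gas simulation:

```python
import random
from math import comb, log

random.seed(1)  # fixed seed so the run is repeatable
N = 100
left = N  # start in the single lowest-entropy state: every particle on the left

def entropy(n_left):
    # Boltzmann entropy, in units of the Boltzmann constant,
    # of the macrostate "n_left particles in the left half"
    return log(comb(N, n_left))

start = entropy(left)  # log(1) = 0.0

# Each step, one randomly chosen particle hops to the other half of the box.
# A particle on the left is picked with probability left/N.
for _ in range(1000):
    if random.random() < left / N:
        left -= 1
    else:
        left += 1

# Entropy climbs from 0 toward its maximum of log(comb(100, 50)) ≈ 66.8,
# simply because almost all microstates belong to near-evenly-spread macrostates.
print(start, round(entropy(left), 1))
```

Nothing in the hopping rule prefers one direction; the increase comes purely from counting, which is Boltzmann's point.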
Arrow of time
Now for the arrow of time. There are several different arrows of time, like the psychological arrow of time (we remember the past, but not the future), the quantum arrow of time, and the thermodynamic arrow of time. The relevant one here is, of course, the thermodynamic arrow of time. To quickly explain it, let's say you have a video of a teacup dropping onto the ground and shattering. You can easily tell whether it's played backwards or forwards, right?
There's a reason we don't see broken teacups putting themselves together and leaping into our hands: entropy always increases. So, if entropy is low in the past but high in the future, we have an arrow of time, a direction things must go. Given two events, a broken teacup and a whole teacup, we know the broken teacup must come second, because entropy always increases. So the passage of time can be read off from the increase of entropy.
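The logic of the thermodynamic arrow boils down to a one-line rule: of two snapshots, the lower-entropy one must be the earlier one. A trivial sketch (the entropy numbers here are made up; only their ordering matters):

```python
# Made-up entropy values for two frames of the video; only their
# ordering matters for the thermodynamic arrow of time.
snapshots = {"whole teacup": 2.0, "broken teacup": 7.5}

earlier = min(snapshots, key=snapshots.get)  # lowest entropy -> earlier frame
later = max(snapshots, key=snapshots.get)    # highest entropy -> later frame
print(earlier, "->", later)  # whole teacup -> broken teacup
```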
Relation between heat exchange and entropy
Now, for heat transfer. Heat is always transferred from the warmer system to the colder system. Here's a good way to imagine this. Take two metal boxes. One is heated to 100 degrees Fahrenheit, the other to 60 degrees Fahrenheit, and they are set touching each other. Heat flows from the hotter box to the cooler box. This lowers the entropy of the originally hotter box and raises the entropy of the originally cooler box, but since the cooler box gains more entropy than the hotter box loses, the entropy of the system as a whole goes up.
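Here's a rough sketch of that bookkeeping. I'm picking an arbitrary 50 joules of heat transferred and using the idealized reservoir formula ΔS = Q/T, which pretends each box's temperature stays constant during the transfer:

```python
def f_to_kelvin(t_f):
    # Entropy formulas need absolute temperature, so convert Fahrenheit to Kelvin
    return (t_f - 32) * 5 / 9 + 273.15

T_hot = f_to_kelvin(100)   # ~310.9 K
T_cold = f_to_kelvin(60)   # ~288.7 K
Q = 50.0  # joules flowing from hot to cold, an arbitrary illustrative amount

dS_hot = -Q / T_hot    # hot box loses entropy: about -0.161 J/K
dS_cold = Q / T_cold   # cold box gains entropy: about +0.173 J/K
dS_total = dS_hot + dS_cold

print(round(dS_total, 4))  # 0.0124 J/K > 0: the whole system's entropy went up
```

Because the same Q is divided by a smaller temperature for the cold box, the gain always outweighs the loss, so total entropy rises, exactly as the Second Law demands.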
So what does this mean in terms of the arrow of time? Well, yes, you could use heat transfer as an arrow of time in situations where it is applicable (heat transfer does not occur everywhere in the universe) but entropy is a much better basis for an arrow of time, and here's why. Entropy is increasing all across the universe. Heat transfer is a specific case where entropy increases. In other words, entropy describes more than heat transfer does.
Let's go back to the teacup example at the beginning of the arrow of time section and apply the heat transfer arrow of time. Using just the heat transfer arrow, we have no clue which direction the video is being played (we are assuming the temperature of the teacup stays constant throughout)! But if we use entropy as the basis of our arrow of time, we do know what's going on. Your idea is a good one; it just doesn't describe as many situations.
Resources
My explanation of entropy is copied over from this answer I did. I used this website for a little bit of the heat transfer section (I think count_to_10 used the same website). A good book to read about entropy and the arrow of time is From Eternity to Here by Sean Carroll.
Hope this helps!