Jaynes wrote a paper called "Information Theory and Statistical Mechanics" (1957). I tried to read it but found it somewhat hard to follow, and I suspect that's because it's one of the earlier works on the subject; a modern text might be more refined, since we've had decades to work out the ideas. On the other hand, the physics textbooks I've read don't explicitly discuss information-theoretic concepts like Shannon entropy.
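For concreteness, by Shannon entropy I mean $H = -\sum_i p_i \log_2 p_i$ for a discrete distribution, e.g. (a minimal sketch; the function name is just mine):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    # Terms with p = 0 contribute nothing (the p*log p limit is 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty:
print(shannon_entropy([0.5, 0.5]))  # → 1.0
# A certain outcome carries none:
print(shannon_entropy([1.0]))       # → 0.0
```

That's the quantity I'd like to see connected to thermodynamic entropy.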
Is there a good modern introduction to the information theoretic view of entropy?