Entropy is a measure of the disorder of a system; in thermodynamics it is defined through the heat exchanged reversibly at a given temperature, not simply as a "variation of energy". From the point of view of statistical mechanics, entropy counts the number of microscopic configurations compatible with the macroscopic state of the system. In this sense, entropy measures how likely it is to find the system in a given state.

Now, imagine a text stream that arrives character by character on a screen. If the text is meaningless, every character is equally probable, and the entropy is maximal because the disorder is maximal. If you want to transfer information, you have to spend a little energy ordering the characters, because this ordering does not happen spontaneously. The final state of the system is more ordered than the initial one, so its entropy is lower than that of the random text. In other words, if you want to reduce entropy in order to transfer information, you must spend energy.
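To make the "meaningless text has maximal entropy" claim concrete, here is a minimal sketch (an illustration added here, not part of the physical argument) that computes the Shannon entropy H = -Σ p(c) log2 p(c) of two character streams: one drawn uniformly at random, and one made of ordinary English text. The sample strings are made up for the example; only the comparison between the two values matters.

```python
import math
import random
import string
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Shannon entropy in bits per character: H = -sum(p * log2(p))."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A meaningless stream: every lowercase letter roughly equally likely.
random_stream = "".join(random.choice(string.ascii_lowercase) for _ in range(10_000))

# An ordered stream: English text, where letter frequencies are skewed.
english_stream = ("entropy measures how disordered a message is "
                  "and ordering it costs energy " * 200).replace(" ", "")

print(f"random text : {shannon_entropy(random_stream):.2f} bits/char")  # close to log2(26) ~ 4.70
print(f"english text: {shannon_entropy(english_stream):.2f} bits/char")  # noticeably lower
```

The random stream sits near the theoretical maximum of log2(26) ≈ 4.70 bits per character, while the English stream comes out lower because its character distribution is far from uniform: the "ordered" message carries structure, and structure means lower entropy.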