Information: a informs about b iff a has some property that is relevant to some property that b may have or lack, where "relevant" is taken in the probabilistic sense.
Note that what is or is not relevant in this sense depends on what one's probabilities are, so what is informative for one person may not be so for another; likewise, what is important and unexpected information at one time may be only unimportant additional confirmation at a later time for the same person.
There is a technical usage of "information", deriving from information theory, which is basically concerned with the problem of sending messages along a channel, such as a telephone line, that may be disturbed (noisy, imperfect). The definition in this sense is that the information a message carries is the negative of the logarithm of its probability. The idea behind it is that the logarithm of the probability measures the number of alternative possibilities the message excludes: the less probable the message, the more it rules out. (One takes the negative because a probability between 0 and 1 has a negative logarithm, so negating it makes the result positive.)
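The definition can be sketched in a few lines of code. This is a minimal illustration, not part of the original text; the function name is my own, and I assume base-2 logarithms (the standard choice in information theory, giving the result in bits), which the passage above does not specify.

```python
import math

def self_information(p):
    """Information carried by a message of probability p, in bits.

    Defined as -log2(p): the less probable the message, the more
    alternative possibilities it excludes, and the more information
    it carries. A certain message (p = 1) carries no information.
    """
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    # Written as log2(1/p) rather than -log2(p) so that p = 1
    # yields 0.0 rather than -0.0.
    return math.log2(1.0 / p)

print(self_information(1.0))   # 0.0 bits: excludes nothing
print(self_information(0.5))   # 1.0 bit:  excludes half the possibilities
print(self_information(0.25))  # 2.0 bits: excludes three quarters
```

Halving the probability adds one bit, which matches the idea that each bit of information cuts the space of remaining possibilities in half.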
This technical usage is helpful where the premisses of information theory apply, and there it is fairly mathematical, but it is less helpful outside this class of cases.
