>Shannon identified information with surprise. He chose the negative of the log of the probability of an event as the amount of information you get when the event of probability p happens. For example, if I tell you it is smoggy in Los Angeles then p is near 1 and that is not much information, but if I tell you it is raining in Monterey in June then that is surprising and represents more information. Because log 1 = 0, the certain event contains no information.
>Let us pause and examine what has happened so far. First, we have not defined “information”; we merely gave a formula for measuring the amount. Second, the measure depends on surprise, and while it does match, to a reasonable degree, the situation with machines, say the telephone system, radio, television, computers, and such, it simply does not represent the normal human attitude towards information. Third, it is a relative measure; it depends on the state of your knowledge. If you are looking at a stream of “random numbers” from a random source then you think each number comes as a surprise, but if you know the formula for computing the “random numbers” then the next number contains no surprise at all, hence contains no information! Thus, while the definition Shannon made for information is appropriate in many respects for machines, it does not seem to fit the human use of the word. This is the reason it should have been called “Communication Theory”, and not “Information Theory”. It is too late to undo the definition (which produced so much of its initial popularity, and still makes people think it handles “information”), so we have to live with it, but you should clearly realize how much it distorts the common view of information and deals with something else, which Shannon took to be surprise.
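As a quick illustration of the formula in that quote, here is a minimal Python sketch of self-information, I(p) = -log2(p), applied to the two weather examples. The probabilities are made-up values chosen just to show the contrast, not figures from the book.

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information (surprise) in bits: -log2(p)."""
    return -math.log2(p)

# Assumed, illustrative probabilities:
print(self_information(0.95))  # "smoggy in Los Angeles": ~0.07 bits, little surprise
print(self_information(0.02))  # "raining in Monterey in June": ~5.6 bits, much more surprise
print(self_information(1.0))   # certain event: 0 bits, no information at all
```

The near-certain event contributes almost nothing, the unlikely one contributes several bits, and the certain event contributes exactly zero, which is the point Hamming is making.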
>In science if you know what you are doing you should not be doing it.
>In engineering if you do not know what you are doing you should not be doing it.
>with apparently only one life to live on this earth, you ought to try to make significant contributions to humanity rather than just get along through life comfortably — that the life of trying to achieve excellence in some area is in itself a worthy goal for your life.
This is a 227-page book by Richard W. Hamming from 1997/2005.
The link doesn't seem to work for me on mobile. I'm not sure why.
See also https://youtube.com/playlist?list=PL2FF649D0C4407B30