By Alfred Renyi
Similar probability & statistics books
This work is a fresh presentation of the Ahlfors-Weyl theory of holomorphic curves that takes into account some recent developments in Nevanlinna theory and several complex variables. The treatment is differential-geometric throughout, and assumes no prior acquaintance with the classical theory of Nevanlinna.
Bookseller's text: The author's successful work is supplemented by a volume on more specialized mathematical topics that are covered in advanced studies. Following the proven methodology and didactic approach, less emphasis is placed on mathematical rigor than on intuitive, application-oriented examples.
Extra info for A diary on information theory
Let's go deeper. What about the matter-like nature of information? Well, if energy is matter, then information must somehow be spirit-natured. But how can this be so? No, no, no ... information is matter because it can only be transmitted by means of matter (or energy), such as symbols on paper or electrical or chemical impulses.
It would be possible to measure unexpectedness in some new kind of unit and give it a name too, but there is no need for all that. Unexpectedness doesn't need a "dimension"; it is enough to treat it as a dimensionless number. And I thought about the lecturer's joke too, that from I(ξ, η) ≥ 0 it follows that no matter what we learn at the University, it cannot harm us, since in the worst case it will simply be of no use. Of course, one can say that one will get smarter by studying and thinking, but simple memorizing makes the mind dull, so I can't say that preparing for exams doesn't destroy one's intellectual capacities.
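The joke rests on the nonnegativity of mutual information, I(ξ, η) ≥ 0, with equality exactly when ξ and η are independent. A minimal numerical sketch (the function name and the example joint distributions are my own, not from the diary):

```python
import math

def mutual_information(joint):
    """I(xi, eta) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x) * p(y))).

    `joint` is a matrix of joint probabilities: joint[i][j] = P(xi = i, eta = j).
    """
    px = [sum(row) for row in joint]            # marginal of xi
    py = [sum(col) for col in zip(*joint)]      # marginal of eta
    info = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                info += p * math.log2(p / (px[i] * py[j]))
    return info

# Dependent pair: observing one variable tells us something, so I > 0.
print(mutual_information([[0.4, 0.1], [0.1, 0.4]]))
# Independent pair (joint equals the product of the marginals): I = 0,
# the "worst case" of the lecturer's joke, where learning is merely useless.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))
```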
The uncertainty of A is greatest when p = 1/2, while the unexpectedness of the event A is greater and greater as p gets smaller and smaller. Let us compare the decrease in uncertainty with a change in unexpectedness. The situation is like this: with the observation of a random variable, the uncertainty (entropy) relating to another random variable will always decrease or stay as it is, the latter occurring in the case of two independent random variables. On the other hand, with the observation of an event, the unexpectedness of another event can decrease, increase, or stay the same, the last possibility again occurring in the case of independent events.
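The contrast between the two quantities can be sketched numerically: the binary entropy H(p) = -p log₂ p - (1-p) log₂(1-p) peaks at p = 1/2, while the unexpectedness (surprisal) -log₂ p grows without bound as p shrinks. This is a minimal illustration assuming base-2 logarithms; the function names are my own:

```python
import math

def entropy(p):
    """Binary entropy H(p) in bits, with 0 * log 0 taken as 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def unexpectedness(p):
    """Surprisal -log2(p) of an event with probability p > 0."""
    return -math.log2(p)

# Entropy peaks at p = 1/2; surprisal keeps growing as p shrinks.
for p in (0.5, 0.25, 0.1, 0.01):
    print(f"p = {p}: H = {entropy(p):.3f} bits, surprisal = {unexpectedness(p):.3f} bits")
```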