A list of puns related to "Entropy (information theory)"
Hello,
I am not an expert in mathematics and so apologies if my language is not clear or I use terminology incorrectly. My question is this. Suppose you have a coin, which may or may not be biased. Suppose also that you do not even know what side the coin favors or to what degree it favors it. You begin to flip the coin.
Based on my understanding of coin flipping, future results do not depend on past ones. Therefore, if you tossed a coin of known fairness 50 times and got 50 heads, we would still assume the next flip comes up heads with p = .50. Based on my knowledge of entropy in information theory, this coin of known fairness would have maximal entropy. However, over large spans of time, we could say with relative certainty that flipping the coin will result in ~50% heads and ~50% tails. We can't make any bold statement of when, but we have to concede that the results will at some point approximate the coin's probability.
However, with the coin that I described earlier, we cannot even make such long-term predictions about the results. Wouldn't this add some new degree of entropy to the coin?
Just to make it more clear, the coin can represent any object with 2 possible states with an unknowable probability of occupying each state. Not sure if such an object exists but the coin is my idea of a real world approximation.
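For what it's worth, here is a minimal sketch of the "maximal entropy" claim for a coin of known bias p, using the standard binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p); it just illustrates that a known fair coin is the most unpredictable per flip:

    import math

    def binary_entropy(p):
        """Shannon entropy (in bits) of one flip of a coin with heads-probability p."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    # Entropy peaks at p = 0.5 (1 bit) and shrinks as the bias grows.
    for p in (0.5, 0.6, 0.75, 0.9, 0.99):
        print(f"p = {p:.2f}  ->  H = {binary_entropy(p):.3f} bits")

The unknown-bias case described above is, as far as I understand, usually handled by treating the bias p itself as uncertain and putting a probability distribution over it, which is a separate (Bayesian) layer on top of this per-flip entropy.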
I hope this isn't completely silly with obvious fallacies but if it is feel free to let me know haha.
In thermodynamics, entropy seems to be a measurement of stored energy per volume (or mass? or per system?), and in information theory entropy is a measurement of information density. Both formulas seem to be very similar (an integral/sum over all possible states), but I've never been able to make the connection in meaning. Thermodynamic entropy increases over time; can the same be said about informational entropy, or is there an analogy in information theory for this increase?
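For reference, the two formulas being compared are (Gibbs/Boltzmann on the thermodynamic side, Shannon on the information side):

    S = -k_B * Σ_i p_i * ln(p_i)     (thermodynamic entropy, in joules per kelvin)
    H = -Σ_i p_i * log2(p_i)         (Shannon entropy, in bits)

They differ only by Boltzmann's constant k_B and the base of the logarithm, which is also where the difference in units comes from.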
Consider that you have a legible English sentence. It has very low entropy compared to a random bit sequence of equal length. Now, imagine you encrypt it. It appears outwardly random. If you don't have the decryption key, it appears as random as any other sequence of bits and thus should have the same high entropy as a random sequence. If you have the key, the original can easily be reconstructed to its original, low-entropy state, which seems to contradict the notion that you can't go from a high-entropy to a low-entropy system, unless you assume data entropy is observer-dependent.
Is there an error somewhere in my reasoning?
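One crude way to see the "appears as random" point empirically is to estimate the single-byte frequency entropy of some English text versus the same number of random bytes; this only measures letter-frequency structure, so it is a rough proxy rather than true entropy:

    import math
    import os
    from collections import Counter

    def byte_entropy(data):
        """Empirical Shannon entropy in bits per byte, from single-byte frequencies."""
        counts = Counter(data)
        n = len(data)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    english = b"the quick brown fox jumps over the lazy dog " * 20
    random_bytes = os.urandom(len(english))

    print("English text:", round(byte_entropy(english), 2), "bits/byte")     # well below 8
    print("Random bytes:", round(byte_entropy(random_bytes), 2), "bits/byte") # close to 8

A good ciphertext of the English text would score like the random bytes under this kind of test, even though someone holding the key can map it back to the low-entropy original, which is exactly the observer-dependence being pointed at.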
The amount of information in a message and the entropy in a system are, in a way, the same quantity in different situations. But that got me thinking: what about Gibbs free energy (GFE)? What about enthalpy? What about heat capacity?
It seems to me like all thermodynamic quantities should have a meaning in the context of information theory. For example, I get the feeling that enthalpy tells you how hard a message is to 'send', and GFE tells you something about how easy it is to 'understand'... I think. I'm probably wrong though.
Or maybe I'm completely wrong, and only entropy makes sense in information theory while the other quantities do not; but then, can someone explain to me why?
The most basic theory related to information and intelligence is information theory. This lecture series explains how entropy and information theory relate to intelligence, along with the story of Claude Shannon. It is produced in Korean, but all the slides are in English and English subtitles are available.
Information Theory, Entropy and Intelligence (1)
For instance, diceware tells me that adding an additional diceware word to a password adds "12.9 bits of entropy."
On the other hand, physical entropy is measured in joules per kelvin.
Why the different units? And how are these concepts related?
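On the first number: each Diceware word is selected by five dice rolls from a list of 6^5 = 7776 equally likely words, so one word contributes

    H = log2(6^5) = 5 * log2(6) ≈ 12.92 bits.

The thermodynamic version has the same -Σ p log p shape but is multiplied by Boltzmann's constant k_B and uses the natural log, which is why it carries units of joules per kelvin while the information-theoretic version is a pure number of bits.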
For example, the set {AABBB} has an entropy of 0.97.
I could, for example, make a smart clustering {AA} & {BBB} with a total entropy of 0. Or, I could be a brute and have 5 sets {A}{A}{B}{B}{B}, also with entropy of 0.
I'm sure people are interested in minimizing entropy in balance with complexity. How can I find more information about this?
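To make the numbers above reproducible, here is a small sketch; the size-weighted average of per-cluster entropies is my assumption about what the "total entropy" of a clustering means here:

    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy (bits) of the label distribution in one set."""
        counts = Counter(labels)
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def clustering_entropy(clusters):
        """Size-weighted average entropy across clusters."""
        total = sum(len(c) for c in clusters)
        return sum(len(c) / total * entropy(c) for c in clusters)

    print(round(entropy("AABBB"), 2))                      # 0.97
    print(clustering_entropy(["AA", "BBB"]))               # 0.0
    print(clustering_entropy(["A", "A", "B", "B", "B"]))   # 0.0

The trade-off between low entropy and a small number of clusters is studied under names like the minimum description length principle and information-theoretic clustering, which might be useful search terms.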
Shannon replied that the theory was in excellent shape, except that he needed a good name for "missing information". "Why don't you call it entropy," Von Neumann suggested. "In the first place, a mathematical development very much like yours already exists in Boltzmann's statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage."
So I am totally confused now, because I just wasted a whole evening arguing over the meaning of the word 'entropy' in information theory. From my understanding, the more the entropy, the more the randomness, and so the less information you have about the system. In other words, entropy = anti-information.
But some people say more entropy means more information, or that entropy and information are exactly the same thing. How is this consistent with the 2nd law of thermodynamics? If the entropy of a closed system always increases, we would then always need more information to describe it, which does not seem intuitive.
Please explain where I am going wrong.
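As far as I can tell, both readings fall out of the same definition. Shannon entropy is the expected surprisal,

    H(X) = Σ_x p(x) * log2(1/p(x)),

which can be read either as the uncertainty you have before observing X (the "anti-information" reading) or as the average number of bits you gain, or need, when X is revealed (the "more entropy = more information to describe it" reading).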
My professors suggested checking Coursera and other MOOCs for at least an outline or framework as to how to go about the individual study, but I can't seem to find anything apart from a class that was offered a few years back.
Do you guys know of any sources like this?
(Even a good textbook could be used as a framework for the course)
Let me know if you have any suggestions!
Thanks!
This measure manifests itself in many different contexts and I would love to figure it out once and for all. Thanks!!!
I'm trying to calculate conditional entropy for two data sets. They both have 15 entries, with {0,1} values for survival and {0,1} values for sex.
The data sets are below. Male = 1, Survived = 1.
The probability of being male and surviving is p(S=1, M=1) = 2/15, and P(S=1) = 7/15.
If I want to calculate H(Survival | X=Male), my attempt is:
H(Survival | X=Male) = -(2/15) * log((2/15)/(7/15)) - (2/15) * log((2/15)/(8/15))
Is this correct?
And also, what are the similarities?
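For comparison, here is a minimal sketch of how H(Survival | Sex = male) is normally computed from joint counts. The counts below are made-up placeholders, since the actual data sets aren't reproduced here:

    import math

    # Hypothetical joint counts over 15 passengers, keyed by (sex, survived);
    # these are placeholder numbers, not the data sets from the post.
    counts = {(1, 1): 2, (1, 0): 6, (0, 1): 5, (0, 0): 2}

    male_total = sum(c for (sex, _), c in counts.items() if sex == 1)

    # H(Survival | Sex = male) = -sum_s p(s | male) * log2 p(s | male)
    h = 0.0
    for survived in (0, 1):
        p = counts[(1, survived)] / male_total
        if p > 0:
            h -= p * math.log2(p)

    print(round(h, 3))

Note that the conditioning uses p(s | male) = p(s, male) / P(Male), i.e. the joint probability divided by the probability of being male, not by P(S=1).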
I'm writing a research paper on Bayesian analysis and I'm trying to read the Wikipedia material on maximum entropy for Bayes nets (at https://en.wikipedia.org/wiki/Bayesian_network#Parameter_learning).
So how the heck can maximum noise represent an optimal information state? And how do you measure goodness-of-noise???? I can hear the whoosh of something goin' straight over my head.
Let me describe my confusion (best guess) then maybe you can help. Suppose you're trying to understand some process. So when you've factored out all the stuff you understand and can predict, you're left with something that is maximally noisy (?). Beyond here there be dragons. You suddenly and precisely know what you don't know.
This seems totally backwards. Why isn't that MINIMUM entropy? This all sounds terribly nihilistic. It's like I decide life is meaningless and I'm done! (joking).
Is this what Shannon and Jaynes et al. are getting at??? Please help.
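As far as I understand Jaynes, the principle is: among all distributions consistent with what you actually know, pick the one with the largest entropy, because any lower-entropy choice would smuggle in assumptions you cannot justify. Schematically:

    maximize    H(p) = -Σ_i p_i * log2(p_i)
    subject to  Σ_i p_i = 1  and any known constraints E_p[f_k] = c_k

With no constraints beyond normalization, the answer is the uniform distribution (H = log2 n), so maximum entropy is not claiming the world is noise; it marks the exact point where your knowledge runs out.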
Trying to read through some secondary sources before getting into Shannon's original paper, and I want to make sure my qualitative understanding is correct:
For general purposes, I believe the entropy is the sum of p*log2(1/p) over all possible messages. Now suppose you have two correlated random variables A and B, where A is a possible introduction to a message and B is the message itself. Given that you (a) have a value for A and (b) know the probability distribution of A, the conditional entropy of B would just be a modified measure of how much additional information the message gives you (or a limit on the maximum information it could give you), given that you already have an introduction of known information content and probability (a kind of Bayesian update, I guess?). Or, alternatively, is it supposed to be an indicator of how much I(A,B) overlaps with I(B)?
Maybe someone could formalize it for me and show a quick derivation if I've got the qualitative more or less right?
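If it helps, the standard identities (a sketch rather than a full derivation) are:

    H(B | A) = -Σ_{a,b} p(a,b) * log2 p(b|a) = H(A,B) - H(A)
    I(A;B)   = H(B) - H(B|A) = H(A) + H(B) - H(A,B) ≥ 0

So H(B|A) is exactly "what remains to be learned about B once A is known", and the mutual information I(A;B) is the overlap being described; conditioning never increases entropy on average.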
I want to know more about f-divergence and the Rényi, Shannon, KL, and Wasserstein divergences. I am not a math student; is there any book (or any other kind of resource) you would recommend for non-math students to get a more general, high-level understanding of these metrics/divergences?
So I am currently at university, taking a course on information theory as part of a Master's program, and I wanted to say a few things about the recent speculation about extracting knowledge from the library.
Information is, scientifically, measured in bits and described through the entropy measure. The formula for calculating how much information something contains is based on the probability distribution of the events.
For example: you would not be surprised if I tell you that you will breathe tomorrow. This message contains close to zero information because the probability that you will not breathe tomorrow is very low.
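Concretely, the information (surprisal) of an event x with probability p(x) is

    I(x) = log2(1 / p(x)) bits,

so an event with p = 0.999, like breathing tomorrow, carries log2(1/0.999) ≈ 0.0014 bits, i.e. practically nothing.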
As the library contains every possible arrangement of letters, the probability of finding any given arrangement is the same: 100%.
If we know for sure that we will find everything in it, there is no information to gain from it. In that sense, the library contains zero information.
True wisdom is knowing which arrangements of letters make sense. If the library contains every possible book it might as well contain none without any difference in effect.
Even though this might sound sad, it is still very fascinating. We can still find nice stories in there; actually, any story with any outcome we wish for. Even the Bible is in there somewhere, the whole freaking Bible.
If the universe is continuous, the position of each particle requires an infinite number of bits to be represented. Does this cause problems with information theory? I don't know much about information theory, but AFAIU, information has physical effects, so it would seem problematic if the universe has infinite information.
Somewhat of a random question, but I'm curious if any critical theorists (or ppl adjacent) have written on entropy. I'm working on some fiction involving the concept, and such texts might help me think new thoughts about it. Thank you :)
Y'know, something like a weird cult that worshiped VHS tapes or something like that. Really weird things, preferably not anything to do with things like hollow earth or moon nazis or anything like that, just not what I'm looking for at the moment. The weirder the better!
Thermodynamic entropy, Gibbs entropy, Shannon entropy, Von Neumann entropy, Holevo entropy, Rényi entropy, etc.
It's extremely confusing and makes it rather difficult to follow discussions on quantum information theory, which I'm trying to write a paper on for a class.