Entropy (information theory) of a coin of unknown bias

Hello,

I am not an expert in mathematics and so apologies if my language is not clear or I use terminology incorrectly. My question is this. Suppose you have a coin, which may or may not be biased. Suppose also that you do not even know what side the coin favors or to what degree it favors it. You begin to flip the coin.

Based on my understanding of coin flipping, future results are not dependent on the past. Therefore, if you tossed a coin of known fairness 50 times and achieved 50 heads, we would still assume that the probability of heads on the next toss was p = .50. Based on my knowledge of entropy in information theory, this coin of known fairness would have maximal entropy. However, over large spans of time, we could say with relative certainty that flipping the coin will result in ~50% heads and ~50% tails. We can't make any bold statement of when, but we have to concede that the results will at some point approximate the coin's probability.

However, with the coin that I described earlier, we cannot even make such long-term predictions about the results. Wouldn't this add some new degree of entropy to the coin?

Just to make it clearer: the coin can represent any object with two possible states and an unknowable probability of occupying each state. I'm not sure whether such an object exists, but the coin is my idea of a real-world approximation.

I hope this isn't completely silly with obvious fallacies but if it is feel free to let me know haha.
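For readers who want to see the "fair coin has maximal entropy" claim concretely, here is a minimal sketch (my own, not part of the original post) that evaluates the binary entropy function H(p) = -p log2(p) - (1-p) log2(1-p) for a few biases; it peaks at exactly 1 bit when p = 0.5.

```python
# Minimal sketch: Shannon entropy of a single coin flip as a function of the bias p.
import math

def binary_entropy(p: float) -> float:
    """Entropy in bits of a Bernoulli(p) variable; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in [0.1, 0.3, 0.5, 0.7, 0.9]:
    print(f"p = {p:.1f}  ->  H(p) = {binary_entropy(p):.3f} bits")
# p = 0.5 gives the maximum, 1 bit per flip; any known bias gives strictly less.
```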

👍 10
👤 u/mcgtank
📅 Aug 30 2021
How entropy can be seen as a volume - the geometric interpretation of information theory ruvi.blog/2021/03/31/entr…
👍 390
👤 u/adiabaticfrog
📅 Mar 31 2021
What's the relation between entropy in physics and entropy in information theory?

In thermodynamics, entropy seems to be a measurement of stored energy per volume (or mass? or per system?), and in information theory entropy is a measurement of information density. Both formulas seem to be very similar (an integral/sum over all possible states), but I've never been able to make the connection in meaning. Thermodynamic entropy increases over time; can the same be said about informational entropy, or is there an analogy in information theory for this increase?
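For reference, the two formulas being compared can be written side by side; for the same distribution {p_i} they differ only by the Boltzmann constant and the choice of logarithm base (a sketch of the standard correspondence, not a full derivation):

```latex
% Gibbs entropy (statistical mechanics) vs. Shannon entropy (information theory)
S = -k_B \sum_i p_i \ln p_i, \qquad H = -\sum_i p_i \log_2 p_i
% For the same probabilities p_i, these are related by S = (k_B \ln 2)\, H,
% so the thermodynamic version just carries an extra factor with units of J/K.
```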

👍 2k
👤 u/McMasilmof
📅 Nov 20 2019
The grave of American mathematician Claude Shannon, founder of information theory, with the formula for what is now called the "Shannon entropy".
👍 174
👤 u/vincentblt
📅 Nov 03 2020
Is information theory entropy dependent on the knowledge of the observer?

Consider that you have a legible English sentence. It has very low entropy compared to a random bit sequence of equal length. Now, imagine you encrypt it. It appears outwardly random. If you don't have the decryption key, it appears as random as any other sequence of bits and thus should have the same high entropy as a random sequence. If you do have the key, the message can easily be reconstructed to its original, low-entropy state, which seems to contradict the notion that you can't go from a high-entropy to a low-entropy system, unless you assume data entropy is observer-dependent.

Is there an error somewhere in my reasoning?
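One way to see the puzzle concretely is to estimate entropy from symbol frequencies; a rough sketch of my own, with os.urandom standing in for real ciphertext and single-byte frequencies serving only as a crude proxy for source entropy:

```python
# Rough sketch: empirical entropy from single-byte frequencies.
# Assumption: os.urandom stands in for ciphertext; frequency-based entropy is
# only a crude proxy for the true source entropy either way.
import math
import os
from collections import Counter

def bits_per_byte(data: bytes) -> float:
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

plaintext = b"the quick brown fox jumps over the lazy dog " * 20
cipher_like = os.urandom(len(plaintext))  # stand-in for the encrypted message

print("plaintext  :", round(bits_per_byte(plaintext), 2), "bits/byte")
print("cipher-like:", round(bits_per_byte(cipher_like), 2), "bits/byte")
# The English text measures well below 8 bits/byte, the random bytes close to 8,
# even though someone holding the key could map the ciphertext back to the
# low-entropy plaintext -- which is exactly the observer-dependence being asked about.
```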

👍 4
👤 u/GeneReddit123
📅 Apr 23 2021
How entropy can be seen as a volume - the geometric interpretation of information theory ruvi.blog/2021/03/31/entr…
👍 16
👤 u/KiddWantidd
📅 Apr 03 2021
How entropy can be seen as a volume - the geometric interpretation of information theory ruvi.blog/2021/03/31/entr…
👍 5
👤 u/Uroc327
📅 Mar 31 2021
[University | Information Theory: Entropy] Is my calculation wrong or is the answer on the lecture slide wrong?
👍 2
📅 Oct 20 2020
Entropy and information are deeply related, but what about other thermodynamic quantities? How are they important for information theory?

The amount of information in a message and the entropy of a system are, in a way, the same quantity in different situations, but that got me thinking: what about Gibbs free energy (GFE)? What about enthalpy? What about heat capacity?

It seems to me like all thermodynamic quantities have a meaning in the context of information theory. For example, I get the feeling that enthalpy tells you how hard a message is to 'send', and GFE tells you something about how easy it is to 'understand'... I think. I'm probably wrong, though.

Or maybe I'm completely wrong and only entropy makes sense in information theory and the other quantities do not, but then, can someone explain to me why?

👍 10
👤 u/Frigorifico
📅 Jun 21 2020
How does the physical concept of entropy relate to the information theory concept of entropy?
👍 2k
👤 u/Lichewitz
📅 Jan 14 2017
[P] Information Theory, Entropy and Intelligence Lecture Series

The most basic theory related to information and intelligence is information theory. This lecture series explains what entropy and information theory have to do with intelligence, and tells the story of Claude Shannon. It is produced in Korean, but all the slides are in English and English subtitles are available.

Information Theory, Entropy and Intelligence (1)

Information Theory, Entropy and Intelligence (2)

Information Theory, Entropy and Intelligence (3)

👍 51
👤 u/hiconcep
📅 Jan 29 2019
How is the concept of "entropy" in information theory related to the concept of "entropy" in thermodynamics?

For instance, diceware tells me that adding an additional diceware word to a password adds "12.9 bits of entropy."

On the other hand, physical entropy is measured in joules per kelvin.

Why the different units? And how are these concepts related?
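The 12.9-bit figure itself is easy to reproduce: a standard Diceware list has 6^5 = 7776 words, and a word drawn uniformly from it carries log2(7776) bits. A quick check, assuming the standard 7776-word list:

```python
# Quick check of the Diceware figure, assuming the standard 6^5-word list.
import math

words = 6 ** 5                  # 7776 words on a standard Diceware list
print(words, math.log2(words))  # 7776  ~12.925 bits of entropy per word
```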

👍 2
👤 u/anotheryoungerman
📅 Jun 14 2019
[Uni Information theory] Does Shannon entropy penalize complexity?

For example, the set {AABBB} has an entropy of 0.97.

I could, for example, make a smart clustering {AA} & {BBB} with a total entropy of 0. Or, I could be a brute and have 5 sets {A}{A}{B}{B}{B}, also with entropy of 0.

I'm sure people are interested in minimizing entropy in balance with complexity. How can I find more information about this?
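For what it's worth, the 0.97 figure and the zero-entropy clusterings can be reproduced directly; a short sketch of mine, under the usual convention that each cluster's entropy is computed from its label frequencies and clusters are weighted by size:

```python
# Short sketch: reproduce the 0.97 figure and the zero-entropy clusterings.
import math
from collections import Counter

def entropy(labels) -> float:
    """Shannon entropy (bits) of the label frequencies in `labels`."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

print(round(entropy("AABBB"), 3))          # 0.971 -- the ~0.97 in the post

for clustering in [["AA", "BBB"], ["A", "A", "B", "B", "B"]]:
    total = sum(len(c) for c in clustering)
    weighted = sum(len(c) / total * entropy(c) for c in clustering)
    print(clustering, weighted)            # both give 0.0: every cluster is pure
```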

👍 7
👤 u/BeatriceBernardo
📅 Jun 24 2018
Mind blown... if this is how the word 'entropy' came into information theory!

Shannon recalled that the theory was in excellent shape, except that he needed a good name for "missing information". "Why don't you call it entropy?" von Neumann suggested. "In the first place, a mathematical development very much like yours already exists in Boltzmann's statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage."

So I am totally confused now, because I just wasted a whole evening arguing over the meaning of the word 'entropy' in information theory. From my understanding, the more entropy, the more randomness, and so the less information you have about the system, which means entropy = anti-information.

But some people say more entropy means more information, or that entropy and information are exactly the same thing. How is this consistent with the 2nd law of thermodynamics? If the entropy of a closed system always increases, we would then always need more information to describe it, which does not seem intuitive.

Please explain where I am going wrong.

👍 79
👤 u/somnophobiac
📅 Oct 06 2011
My professors have given me the green light to engage in an individual study this spring semester. I'd like to study Shannon entropy and information theory. Do you guys have any resources to suggest?

My professors suggested checking Coursera and other MOOCs for at least an outline or framework as to how to go about the individual study, but I can't seem to find anything apart from a class that was offered a few years back.

Do you guys know of any sources like this?

(Even a good textbook could be used as a framework for the course)

Let me know if you have any suggestions!

Thanks!

👍 12
👤 u/NegativeGPA
📅 Oct 21 2016
[Uni Information theory] Does Shannon entropy penalize complexity? reddit.com/r/learnmath/co…
👍 7
👤 u/BeatriceBernardo
📅 Jun 24 2018
Entropy in the Context of Information Theory. This link is not meant to be the end of your investigation en.wikipedia.org/wiki/Ent…
👍 5
👤 u/NegativeGPA
📅 Feb 02 2017
Can anyone provide an intuitive explanation or point me somewhere that describes entropy in the context of information theory?

This measure manifests itself in many different contexts and I would love to figure it out once and for all. Thanks!!!

👍 19
👤 u/lpiloto
📅 Sep 29 2011
[Information Theory] Calculating conditional entropy.

I'm trying to calculate the conditional entropy for two variables. Both have 15 entries, with {0,1} values for survival and {0,1} values for sex.

The data sets are below. Male = 1, Survived = 1

  • Survive: 1 0 1 1 0 1 0 0 0 1 0 1 0 0 1
  • Sex : 0 1 0 1 1 0 0 1 1 0 1 0 0 1 1

If I want to calculate H(Survival | X = Male), I start from the probability of being male and surviving:

p(S = 1, M = 1) = 2/15 and P(S = 1) = 7/15

So H(Survival | X = Male) = -(2/15) * log((2/15)/(7/15)) + -(2/15) * log((2/15)/(8/15))

Is this correct?

Wikipedia Entry on Conditional Entropy.
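For comparison, here is a sketch of how the textbook definition plays out on the data above (my own calculation, not from the thread): H(Survival | Sex = male) is just the entropy of the survival labels restricted to the male rows.

```python
# Sketch (standard definition): H(Survival | Sex = male) on the data above.
import math
from collections import Counter

survive = [1, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1]
sex     = [0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # male = 1

def entropy(values) -> float:
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

male_survival = [s for s, m in zip(survive, sex) if m == 1]
print(len(male_survival), Counter(male_survival))   # 8 males, of whom 2 survived
print(round(entropy(male_survival), 3))             # H(Survival | Sex = male) ~ 0.811 bits

# The full H(Survival | Sex) would average this with the female case,
# weighted by p(male) = 8/15 and p(female) = 7/15.
```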

👍 5
👤 u/o_safadinho
📅 Oct 29 2015
What's the difference between entropy in science/physics and entropy in information theory?

And also, what are similarities?

👍 9
👤 u/Graebson
📅 Feb 21 2016
Information Theory: Principle of Maximum Entropy: I'm really confused

I'm writing a research paper on Bayesian analysis and I'm trying to read the Wikipedia material on maximum entropy for Bayes nets (at https://en.wikipedia.org/wiki/Bayesian_network#Parameter_learning).

So how the heck can maximum noise represent an optimal information state? And how do you measure goodness-of-noise? I can hear the whoosh of something going straight over my head.

Let me describe my confusion (best guess), then maybe you can help. Suppose you're trying to understand some process. When you've factored out all the stuff you understand and can predict, you're left with something that is maximally noisy (?). Beyond here there be dragons. You suddenly and precisely know what you don't know.

This seems totally backwards. Why isn't that MINIMUM entropy? This all sounds terribly nihilistic. It's like I decide life is meaningless and I'm done! (joking)

Is this what Shannon and Jaynes et al. are getting at? Please help.
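One way to make the principle concrete: among all distributions that satisfy a given constraint, maximum entropy picks the one that assumes nothing extra. A toy sketch (my own, not from the linked article) that grid-searches distributions on {0, 1, 2} with mean 1 and finds that the uniform distribution wins:

```python
# Toy sketch: among distributions on {0, 1, 2} with mean exactly 1,
# a crude grid search for the maximum-entropy one. The maximum-entropy
# principle says to pick that one, because it commits to nothing beyond
# the stated constraint.
import math

def entropy(ps):
    return -sum(p * math.log2(p) for p in ps if p > 0)

best = None
steps = 300
for i in range(steps + 1):
    p0 = i / steps
    p2 = p0            # mean = p1 + 2*p2 = 1 and p0 + p1 + p2 = 1  =>  p2 = p0
    p1 = 1 - p0 - p2
    if p1 < 0:
        continue
    candidate = (entropy((p0, p1, p2)), (round(p0, 3), round(p1, 3), round(p2, 3)))
    if best is None or candidate[0] > best[0]:
        best = candidate

print(best)  # ~(1.585, (0.333, 0.333, 0.333)): the uniform distribution wins
```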

👍 5
👤 u/equulz
📅 Jun 30 2014
[Information Theory] Conditional Entropy of a Message

Trying to read through some secondary sources before getting into Shannon's original paper, and I want to make sure my qualitative understanding is correct:

For general purposes, taking (I believe) the entropy to be the sum of p·log2(1/p) over all possible messages: if you have two correlated random variables A and B, where A is a possible introduction to a message and B is the message, and you (a) have a value for A and (b) know the probability distribution of A, then the conditional entropy of B would just be a modification of how much additional information the message gives you (or a limit on the maximum information it could give you), given that you have an introduction of known information content and probability (a kind of Bayesian update, I guess?). Or, put otherwise, is it supposed to be an indicator of how much I(A,B) overlaps I(B)?

Maybe someone could formalize it for me and show a quick derivation if I've got the qualitative picture more or less right?
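For what it's worth, the standard identities (textbook definitions, not specific to Shannon's paper) line up with that reading:

```latex
% Standard identities relating conditional entropy and mutual information.
H(B \mid A) = \sum_{a} p(a)\, H(B \mid A = a)
            = -\sum_{a,b} p(a,b)\, \log_2 p(b \mid a)
% Chain rule: the introduction plus the rest carries H(A) + H(B | A) bits in total.
H(A, B) = H(A) + H(B \mid A)
% Mutual information is the "overlap" being described:
I(A; B) = H(B) - H(B \mid A) = H(A) + H(B) - H(A, B)
```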

👍 2
👤 u/regula_et_vita
📅 Aug 30 2017
Entropy and Information Theory [pdf] www-ee.stanford.edu/~gray…
👍 36
👤 u/mstoehr
📅 Jun 23 2008
Entropy and Information Theory - Robert M. Gray ee.stanford.edu/~gray/it.…
👍 7
👤 u/mtrn
📅 May 17 2013
The Thermodynamic Theory of Ecology: how do you estimate the total number of species in an ecosystem based on limited information? A new physics-based theory called maximum entropy could provide the answer simonsfoundation.org/quan…
👍 8
👤 u/nastratin
📅 Sep 04 2014
[D] Any recommendation of books for "Information theory and Statistics"?

I want to know more about f-divergences and the Rényi, Shannon, KL, and Wasserstein divergences. I am not a math student; is there any book (or any other kind of resource) that you would recommend for non-math students to get a more general, high-level understanding of these metrics/divergences?

👍 33
👤 u/Blasphemer666
📅 Dec 16 2021
Gentle Intro to Information Theory: Guessing Entropy learningclojure.com/2011/…
👍 3
👤 u/gtani
📅 Jan 19 2011
Information theory

So I am currently at university taking a course on information theory, which is part of a Master's program, and I wanted to say a few things about the recent speculation about extracting knowledge from the library.

Information is, scientifically, measured in bits and described through the entropy measure. The formula for calculating how much information something contains is based on the probability distribution of the events.

For example: you would not be surprised if I tell you that you will breathe tomorrow. This message contains close to zero information because the probability that you will not breathe tomorrow is very low.

As the library contains every possible arrangement of letters, the probability of finding any given arrangement is the same, which is 100%.

If we know for sure that we will find everything in it, there is no information to gain from it. That said, the library contains zero information.
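The quantity behind that claim is the self-information of an event (standard definition, added here for reference):

```latex
% Self-information of an event x with probability p(x):
I(x) = -\log_2 p(x)
% A certain event (p = 1) carries -log2(1) = 0 bits, which is the sense in which
% a library guaranteed to contain every arrangement tells you nothing new by
% containing any particular one.
```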

True wisdom is knowing which arrangements of letters make sense. If the library contains every possible book it might as well contain none without any difference in effect.

Even though this might sound sad, it still is very fascinating. We can still find nice stories in there, actually any story with any outcome we wish for. Even the Bible is in there somewhere, the whole freaking Bible.

👍 50
👤 u/wedontknow_
📅 Oct 12 2021
Information theory in a continuous universe

If the universe is continuous, the position of each particle requires an infinite number of bits to be represented. Does this cause problems with information theory? I don't know much about information theory, but AFAIU, information has physical effects, so it would seem problematic if the universe has infinite information.

👍 5
👤 u/Logisk
📅 Sep 26 2021
I was banned by r/conspiracy theories for speaking the truth about covid. The information I posted came from various information I found from the subreddit. Some people can't handle the truth!
👍 367
👤 u/PlotDevious
📅 Jan 10 2022
Are there any good texts on entropy?

Somewhat of a random question, but I'm curious if any critical theorists (or ppl adjacent) have written on entropy. I'm working on some fiction involving the concept, and such texts might help me think new thoughts about it. Thank you :)

👍 16
👤 u/evilgiraffemonkey
📅 Dec 17 2021
I'm looking for REALLY obscure theories and pieces of information.

Y'know, something like a weird cult that worshiped VHS tapes or something like that. Really weird things, preferably not anything to do with things like hollow earth or moon nazis or anything like that, just not what I'm looking for at the moment. The weirder the better!

👍 132
📅 Dec 12 2021
We've released never before seen emails showing Dr. Fauci may have concealed information about #COVID19 originating from the Wuhan lab & intentionally downplayed the lab leak theory. @RepJamesComer & @Jim_Jordan want Fauci under oath. Time for answers. twitter.com/GOPoversight/…
👍 247
👤 u/sublimeinslime
📅 Jan 11 2022
Integrating information in the brain's EM field: the cemi field theory of consciousness | Neuroscience of Consciousness academic.oup.com/nc/artic…
👍 25
👤 u/MuuaadDib
📅 Jan 20 2022
Former Tulsan here. Is this information getting out, or is it just being ignored or, worse, wound into conspiracy theories?
👍 36
📅 Dec 17 2021
Using Information Theory to Find Critical Scout Timings qnt.gg/information-theory…
👍 18
👤 u/ZephyrBluu
📅 Apr 08 2021
Made a little thing because I think a lot of people are unaware. As the image says, if you want to discard this information and keep making theories, that's fine with me. Have fun!
👍 186
👤 u/A_BrightDawn
📅 Jan 02 2022
Why are there so many different kinds of entropy?

Thermodynamic entropy, Gibbs entropy, Shannon entropy, von Neumann entropy, Holevo entropy, Rényi entropy, etc.

It's extremely confusing and makes it rather difficult to follow discussions on quantum information theory, which I'm trying to write a paper on for a class.

👍 7
👤 u/LoganJFisher
📅 Nov 23 2021
ELI5: Entropy and the remainder in the context of information theory.
👍 4
👤 u/wille179
📅 Apr 07 2016
ELI5: Entropy in information theory
👍 3
👤 u/matholwch
📅 Jun 24 2013
