Kolmogorov's extension theorem, limits, and compactness

Kolmogorov's extension theorem tells us that for any prescribed set of finite dimensional distributions which could be the fdd's of a stochastic process, there does indeed exist a stochastic process with those fdd's. To be technical, we can formulate it as follows (copying it from a book by Le Gall on stochastic calculus):

Let E be some Polish space. Let [;\Omega^* = E^{\mathbb{R}^*};] be the space of all mappings [;\omega: \mathbb{R}^* \to E;], equipped with the sigma algebra [;\mathcal{F}^*;] generated by the coordinate mappings [;\omega\to\omega(t);] for [;t\in\mathbb{R}^*;]. Let [;F(\mathbb{R}^*);] be the set of finite subsets of [;\mathbb{R}^*;], and, for every [;U\in F(\mathbb{R}^*);], let [;\pi_U: \Omega^* \to E^U;] be the obvious projection that sends a function [;\omega: \mathbb{R}^* \to E;] to its restriction [;\omega|_U;]. Similarly, for [;U\subset V;], let [;\pi^V_U: E^V\to E^U;] be the obvious projection map.


Thm (Kolmogorov's extension theorem)
Assume that we have, for every [;U\in F(\mathbb{R}^*);], a specified finite dimensional distribution [;\mu_U;], that is, a probability measure [;\mu_U;] on [;E^U;]. Assume further that these finite dimensional distributions are consistent in the sense that if [;U\subset V;], then [;\mu_U;] is the image of [;\mu_V;] under [;\pi^V_U;].

Then there exists a probability measure [;\mu;] on [;(\Omega^*,\mathcal{F}^*);] such that [;\pi_U(\mu) = \mu_U;] for every [;U\in F(\mathbb{R}^*);].
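As a concrete instance (my own illustration, not from Le Gall): the fdd's of a centered Gaussian process with covariance kernel k are consistent essentially for free, since marginalizing a multivariate normal just deletes the corresponding rows and columns of its covariance matrix. A quick numerical sketch, assuming the Brownian kernel k(s,t) = min(s,t):

```python
import numpy as np

# Brownian-motion covariance kernel; any positive-definite kernel works here.
def k(s, t):
    return min(s, t)

V = [0.5, 1.0, 2.0]        # a finite index set V
U_pos = [0, 2]             # U = {0.5, 2.0}, as positions within V

# Covariance of mu_V; its submatrix is the pushforward of mu_V under pi^V_U.
cov_V = np.array([[k(s, t) for t in V] for s in V])
cov_U_projected = cov_V[np.ix_(U_pos, U_pos)]

# mu_U built directly from the kernel agrees with the projection: consistency.
U = [V[i] for i in U_pos]
cov_U_direct = np.array([[k(s, t) for t in U] for s in U])
assert np.allclose(cov_U_projected, cov_U_direct)
```

For Gaussian measures, equality of covariance matrices (and zero means) is equality of the marginal laws, so this is exactly the consistency hypothesis of the theorem.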


A nice enough theorem, though it does seem to give us a measure defined on a very stupid sigma algebra on the wrong space, forcing you to do other things to get any kind of regularity.

Either way, this formulation of the theorem, unlike others I've seen, actually reminds me of two things from other parts of mathematics:

  1. The collection [;F(\mathbb{R}^*);] obviously forms a directed poset under inclusion, and when phrased in this way, the conditions on our [;\mu_U;] and [;\pi^V_U;] look a whole lot like the conditions for an inverse limit.
  2. We specify an infinite collection of axioms we want our desired stochastic process to fulfil, and require that this collection be consistent in some sense. We also observe that for any finite subset of our axioms, there exists a process which fulfils them. From this, we draw the conclusion that there exists a process which fulfils all of them simultaneously. When I phrase it like this, it sounds similar to the compactness theorem from logic.

πŸ‘︎ 8
πŸ’¬︎
πŸ‘€︎ u/GLukacs_ClassWars
πŸ“…︎ Jan 08 2018
🚨︎ report
Since Kolmogorov's axioms laid the foundation for modern probability theory, how did Bayes' Theorem come about before that?

So, this is kind of a "history of mathematics" question, but it's one thing I've been curious about. I asked my Statistics professor (I'm a computer science undergraduate, so I'm limited on this subject) and she didn't know how to answer straight away. Sure, there's more to statistics before Kolmogorov than Bayes' Theorem, but let's limit our discussion to that for this particular question.

πŸ‘︎ 6
πŸ’¬︎
πŸ“…︎ Sep 19 2019
🚨︎ report
Kolmogorov theorem

If we have infinitely many random variables, we can still calculate the finite dimensional distributions of the associated stochastic process. This is trivial. But what if we only have the finite dimensional distributions: is there a way to find an associated stochastic process?

The answer is yes, and it is possible if the FDDs satisfy the conditions of the Kolmogorov theorem. Could anyone please explain the conditions in detail?

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/Aghi_06
πŸ“…︎ Aug 25 2020
🚨︎ report
Isn't the incomputability of Kolmogorov complexity a non-practical theorem?

I was reading about the Incomputability of Kolmogorov complexity, but I could not understand how the proof was relevant to our world.

The problem I saw was this: say you had a function Halts(P) that takes a program and returns true if it halts (ignoring the halting problem for now); then Kolmogorov complexity becomes computable. We create a program that loops through all possible programs until it finds one that halts and outputs the input string, then returns the size of that program.

KolmogorovComplexity(string s):
    # assumes a hypothetical halting oracle Halts(P); P() means "run P"
    for i = 1 to infinity:
        for each Program P of size i:
            if Halts(P) and s equals P():
                return sizeof(P)    # first hit = smallest program printing s

This would contradict the proof, since the proof says we can find a smaller program. But that's impossible: we already tried all the smaller programs. In fact, we already tried running it through the contradiction proof, but since it refers to itself, it creates an infinite loop and therefore doesn't halt.

Of course, this relies on us ignoring the halting problem, but since the halting problem doesn't arise for finite memory (like in our physical world), isn't it a lie to say we can't make an algorithm that finds the smallest compression of a string?

tl;dr: shouldn't it be possible to create an algorithm that returns the smallest string compression possible for a given amount of memory?

Edit: I realized there was a crucial flaw in my thinking; kahirsch gave a good explanation of why it still isn't going to be practical.

πŸ‘︎ 11
πŸ’¬︎
πŸ‘€︎ u/MestR
πŸ“…︎ Dec 05 2012
🚨︎ report
Question about Kolmogorov Complexity & Gödel's Incompleteness Theorem

Kolmogorov complexity, cast in terms of Gödel's First Incompleteness Theorem, states that there is a number c depending on T (where T is a consistent formal system that incorporates a "sufficient" amount of arithmetic) such that T does not prove any statement of the form "the complexity of string s is greater than c". Thus, unless T is inconsistent, there are statements like "the complexity of string s is greater than c" that are undecidable in T. Investigating the proof does not give rise to any such statements. I was wondering: what are examples of such statements?

πŸ‘︎ 7
πŸ’¬︎
πŸ‘€︎ u/coforce
πŸ“…︎ May 25 2012
🚨︎ report
Great exposition of Cybenko's and Kolmogorov's theorems for Universal Approximation with Neural Networks. cstheory.stackexchange.co…
πŸ‘︎ 20
πŸ’¬︎
πŸ‘€︎ u/DevFRus
πŸ“…︎ Jan 22 2014
🚨︎ report
Extension on the Submissive and Breedable theorem
πŸ‘︎ 62
πŸ’¬︎
πŸ‘€︎ u/TickoTicko-Nii
πŸ“…︎ Oct 14 2021
🚨︎ report
Decomposition into weight × level + jump, an extension of the fundamental theorem of arithmetic

Hi,

I would like to present to you the decomposition into weight × level + jump.

Definitions of the decomposition into weight × level + jump on the OeisWiki (en).

50 sequences decomposed into weight × level + jump in one GIF

It's a decomposition of positive integers. The weight is the smallest number such that, in the Euclidean division of a number by its weight, the remainder is the jump (first difference, gap). The quotient is then the level. So to decompose a(n), we need a(n+1) with a(n+1) > a(n) (strictly increasing sequence); the decomposition is possible if a(n+1) < (3/2) × a(n), and then we have the unique decomposition a(n) = weight × level + jump.
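My reading of that rule as a sketch (the function name and loop bounds are my own, not the author's):

```python
def decompose(a_n, a_next):
    """Decompose a(n) = weight * level + jump, with jump = a(n+1) - a(n).
    The weight is the smallest divisor whose Euclidean division of a(n)
    leaves the jump as remainder; the quotient is the level."""
    jump = a_next - a_n
    for w in range(jump + 1, a_n + 1):   # remainder must be smaller than divisor
        if a_n % w == jump:
            return w, a_n // w, jump     # (weight, level, jump)
    return None                          # no decomposition (a(n+1) too large)
```

For the natural numbers, for instance, decompose(9, 10) gives (2, 4, 1): the weight 2 is the smallest prime factor of 8 and the level 4 is its largest proper divisor.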

We see the fundamental theorem of arithmetic and the sieve of Eratosthenes in the decomposition into weight × level + jump of natural numbers. For natural numbers, the weight is the smallest prime factor of (n-1) and the level is the largest proper divisor of (n-1). Natural numbers classified by level are the (primes + 1) and natural numbers classified by weight are the (composites + 1).

Decomposition into weight × level + jump of natural numbers.

For prime numbers, this decomposition led to a new classification of primes. Primes classified by weight follow Legendre's conjecture, and I conjecture that primes classified by level rarefy. I think this conjecture is very important for the distribution of primes.

It's easy to see and prove that the lesser of twin primes (>3) has a weight of 3. So the twin primes conjecture can be rewritten: there are infinitely many primes that have a weight of 3.

Decomposition into weight × level + jump of prime numbers with OEIS sequences. Classification of primes

Here is the decomposition into weight × level + jump of prime numbers in 3D (three.js, WebGL).

I am not a mathematician, so I decompose sequences to promote my vision of numbers. By doing these decompositions, I apply a kind of sieve to each sequence.

There are 1000 sequences decomposed on my website, with 3D graphs (three.js - WebGL), 2D graphs, the first 500 terms, CSV


πŸ‘︎ 6
πŸ’¬︎
πŸ‘€︎ u/Nunki08
πŸ“…︎ Oct 29 2021
🚨︎ report
[Q] Extension of the Frisch-Waugh-Lovell theorem

Hello all,

I have a question regarding the Frisch-Waugh-Lovell theorem: Does the theorem work when the regression has more than two independent variables?

Say for example I have the following model: Y= \alpha + \beta_1 X + \beta_2 D + \gamma_2 K +e, (let each independent variable be a 1 by n vector)

If I decide to estimate the OLS coefficient for X using this method:

X= c + \gamma_1D + \beta_3 K + \mu

then

Y= \delta \mu+ \epsilon

Will \delta= \beta_1 ?
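For what it's worth, a quick simulation sketch suggests yes: residualize X on the other regressors, then regress Y on that residual, and the coefficient matches the full regression. (Variable names and simulated data below are mine, not from the question.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
X, D, K = rng.normal(size=(3, n))
Y = 1.0 + 2.0 * X + 3.0 * D + 4.0 * K + rng.normal(size=n)

def ols(y, *regressors):
    # OLS coefficients with an intercept prepended: [const, b1, b2, ...]
    Z = np.column_stack([np.ones(len(y)), *regressors])
    return np.linalg.lstsq(Z, y, rcond=None)[0]

beta_1 = ols(Y, X, D, K)[1]           # coefficient on X in the full model

coefs = ols(X, D, K)                  # regress X on D and K
mu = X - np.column_stack([np.ones(n), D, K]) @ coefs   # residual of X
delta = ols(Y, mu)[1]                 # regress Y on the residual

assert np.isclose(beta_1, delta)      # FWL: the two coefficients agree
```

This is the standard FWL identity: the coefficient of Y on the residualized X equals the full-regression coefficient, however many controls are partialled out.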

Thanks!

πŸ‘︎ 4
πŸ’¬︎
πŸ‘€︎ u/Lambdapie
πŸ“…︎ Sep 02 2021
🚨︎ report
Extension of the "extreme value theorem"

"In calculus, the extreme value theorem states that if a real-valued function f is continuous on the closed interval [a,b], then f must attain a maximum and a minimum, each at least once."

Source: https://en.m.wikipedia.org/wiki/Extreme_value_theorem

Is there also a theorem that says something along the lines of "an n-th degree polynomial must have at most n-1 (local) maxima or minima"?
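For the polynomial case, the bound falls out of counting roots of the derivative; a sketch (my own note, stated for polynomials only):

```latex
% A polynomial p of degree n has derivative p' of degree n-1, and a
% nonzero polynomial of degree n-1 has at most n-1 real roots; every
% local extremum of p occurs at a root of p'.
\deg p = n \;\Longrightarrow\; \deg p' = n-1
\;\Longrightarrow\; \#\{x \in \mathbb{R} : p'(x) = 0\} \le n-1
\;\Longrightarrow\; p \text{ has at most } n-1 \text{ local extrema.}
```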

Thanks

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/blueest
πŸ“…︎ May 26 2021
🚨︎ report
Extension of Tarski's Undefinability Theorem

Hi all - can anyone tell me if this line of logic holds, and if it has a name? Apologies if this is hackneyed.

  1. Any human explanation of the meaning of the universe must be describable in an interpreted language.
  2. That interpreted language itself must be describable in order to have meaning.
  3. By reference to Tarski, the language cannot describe itself, therefore a stronger language must be used.
  4. But the stronger language form can also only be described by a yet stronger language, which takes us back to point 3.
  5. This leads to an infinite regression, reminiscent of the ontological argument for the proof of God.
  6. Thus logic cannot be used to explain the universe.
πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/bmcollier
πŸ“…︎ Jun 13 2021
🚨︎ report
Caratheodory's Extension Theorem

I understand that in measure theoretic probability, this theorem is important in allowing us to assert the existence of measures on sigma algebras.

How many probability theorists know the proof by heart, and how many statisticians know this proof by heart?

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/A_N_Kolmogorov
πŸ“…︎ Mar 16 2021
🚨︎ report
Extension of Liouville's Theorem

Is this a valid extension of Liouville's Criterion for Liouville Numbers to Cantor Series?

$$0\le\sum_{k=n+1}^{\infty}\frac{a_k}{b_k!}\le\sum_{k=n+1}^{\infty}\frac{b_k-1}{b_k!}$$

$$a_k\ ,\ b_k\ \in\ \mathbb{N}$$

$$b_n!=b_1b_2b_3...b_n$$

$$a_k\neq a_{k+1}\ $$

$$b_k\neq b_{k+1}$$

$$\sum_{k=n+1}^{\infty}\frac{b_k-1}{b_k!}=\sum_{k=n}^{\infty}\frac{1}{b_k!}\ -\sum_{k=n+1}^{\infty}\frac{1}{b_k!}=\frac{1}{b_n!}$$

$$0\le\sum_{k=n+1}^{\infty}\frac{a_k}{b_k!}\le\frac{1}{b_n!}$$

$$\sum_{k=n+1}^{\infty}\frac{a_k}{b_k!}$$

$$a_k\ ,\ b_k\ \in\ \mathbb{Q}$$

$$b_n!=b_1b_2b_3...b_n$$

$$a_k\neq a_{k+1}\ $$

$$b_k\neq b_{k+1}$$

$$b_k=\frac{c_k}{d_k}$$

$$c_k,\ d_k\in\mathbb{N}$$

$$1\le d_k\le c_k$$

$$\sum_{k=n+1}^{\infty}\frac{a_k}{b_k!}=\sum_{k=n+1}^{\infty}\frac{d_k!a_k}{c_k!}$$

$$a_k=(b_k-1)=\frac{c_k}{d_k}-1$$

$$a_k=\frac{c_k-d_k}{d_k}$$

$$\sum_{k=n+1}^{\infty}\frac{a_k}{b_k!}=\sum_{k=n+1}^{\infty}\frac{d_k!a_k}{c_k!}=\sum_{k=n+1}^{\infty}\frac{d_k!(c_k-d_k)}{c_k!d_k}$$

$$\sum_{k=n+1}^{\infty}\frac{d_k!(c_k-d_k)}{c_k!d_k}=\sum_{k=n+1}^{\infty}\frac{d_{k-1}!}{c_{k-1}!}\ -\sum_{k=n+1}^{\infty}\frac{d_k!}{c_k!}=\frac{d_n!}{c_n!}$$

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/B_And_A1st
πŸ“…︎ Apr 10 2021
🚨︎ report
Extension/approximation of the binomial theorem


Hello everyone. In my last post on approximating pi, I alluded to a generalization of the binomial theorem to a non-integer power p in (x+a)^p via infinite sums using the falling factorial and other interesting tools. I mentioned I had managed to raise the number of sums up to 172. Here's a demonstration of how decent an approximation it gives: https://www.desmos.com/calculator/sy358svujz?lang=en I hope you enjoy.
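I don't have the author's 172-sum construction, but the basic falling-factorial series it builds on can be sketched like this (function and parameter names are mine; the series converges for |x| < |a|):

```python
from math import factorial, prod

def binom_series(x, a, p, terms=80):
    """Approximate (x + a)**p for non-integer p via the generalized
    binomial series sum_k [p]_k / k! * x**k * a**(p - k), for |x| < |a|.
    [p]_k = p*(p-1)*...*(p-k+1) is the falling factorial."""
    total = 0.0
    for k in range(terms):
        falling = prod(p - j for j in range(k))   # [p]_k; empty product is 1
        total += falling / factorial(k) * x ** k * a ** (p - k)
    return total

# Converges to sqrt(1.5), since (0.5 + 1.0)**0.5 = 1.5**0.5
approx = binom_series(0.5, 1.0, 0.5)
```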

πŸ‘︎ 4
πŸ’¬︎
πŸ‘€︎ u/WiwaxiaS
πŸ“…︎ Mar 20 2021
🚨︎ report
Kolmogorov's Boundary Question

Any time I have the artifact active, it works for a couple of minutes until I add a few more memory, and then it reverts to normal speed. What am I missing?

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/bombmachinist
πŸ“…︎ Jan 17 2022
🚨︎ report
Some fun extensions to the classic Monty Hall problem: what if Monty always opens the door to the left of the one you choose? And what if there are only 2 doors and Monty says at least one has a goat - what do you do now? The answer (of course) lies with Bayes' Theorem... tomrocksmaths.com/2021/02…
πŸ‘︎ 11
πŸ’¬︎
πŸ‘€︎ u/tomrocksmaths
πŸ“…︎ Feb 23 2021
🚨︎ report
[Request] What would the fundamental theorem of calculus look like if it were expressed purely in first-order logic? Is it even possible or would an extension to second- or higher-order logic be necessary?
πŸ‘︎ 4
πŸ’¬︎
πŸ‘€︎ u/Vanitas_Daemon
πŸ“…︎ Jan 20 2021
🚨︎ report
Discussing intensive, extensive and specific properties of fluids over a glass of whiskey. If nothing else, you will hear how to order a beer in a bar in the proper fluid mechanics style! This video is a precursor to the derivation of the great and powerful Reynolds transport theorem. youtu.be/3OUfe8Y5QdI
πŸ‘︎ 7
πŸ’¬︎
πŸ‘€︎ u/Hitman8Sekac
πŸ“…︎ Oct 29 2021
🚨︎ report
Dr. Stafford explains the Sill Plate Extension-Reduction Theorem streamable.com/jvveu
πŸ‘︎ 33
πŸ’¬︎
πŸ‘€︎ u/Patrick_Spens
πŸ“…︎ Oct 17 2019
🚨︎ report
Petersen's Theorem Extension for bridgeless 5-regular graphs

How would I do this? Been struggling for days.

Is it true that every 5-regular graph without bridges contains a 1-factor? If true, provide proof. If false, provide a counterexample, and explain using Tutte's theorem why your example does not contain a 1-factor.

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/ISeeThings404
πŸ“…︎ Nov 06 2020
🚨︎ report
Can a distribution follow all three of Kolmogorov's axioms and still be "just" a quasiprobability distribution?

Basically everything's in the title already (I guess).

πŸ‘︎ 176
πŸ’¬︎
πŸ‘€︎ u/renatomello
πŸ“…︎ May 07 2021
🚨︎ report
Kolmogorov Complexity for DNA Sequences Analysis in Python youtube.com/watch?v=QkwPf…
πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/research_pie
πŸ“…︎ Jan 04 2022
🚨︎ report
[Q] Wasserstein distance and Kolmogorov-Smirnov statistics as measures of effect size

So I've been dealing with two-sample hypothesis testing with very large samples (around 20,000 each). Whenever I test for equality of distributions I always reject the null hypothesis, even though the distributions aren't all that different. I completely understand why this happens with large samples.

The advice around here is to use measures of effect size to show that, although the data come from different distributions, the difference is small. The most recommended measure is Cohen's d:

$d = \frac{\bar{x}-\bar{y}}{s_{pooled}}$

I think this measure is not that good because it only compares the standardized difference of means.

I thought that maybe the Wasserstein distance or the Kolmogorov-Smirnov statistic could be a good measure of the effect size between the two distributions. Are they?
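They can at least be computed side by side with scipy; a sketch (the shifted-normal data below is made up to mimic the "significant but tiny difference" situation):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 20_000)
y = rng.normal(0.1, 1.0, 20_000)   # tiny shift: tests reject, effect is small

res = stats.ks_2samp(x, y)             # D = sup distance between the ECDFs
w = stats.wasserstein_distance(x, y)   # area between the ECDFs

# The two live on different scales: D is always in [0, 1], while the
# Wasserstein distance is in the units of the data (here ~ the mean shift).
```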

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/Jay31416
πŸ“…︎ Oct 21 2021
🚨︎ report
Are there any theorems that let one assume a holomorphic extension of a real function?

If f(x) is a smooth function on at least some subset of R (which can also be semi-infinite), are there conventional theorems that would let me automatically assume this function has a holomorphic extension? I understand there is a difference between real analytic and complex analytic, though I'm not sure what it is; still, it seems like there should be at least some theorems that can make this implication.

Like, maybe if the inverse of f(z) is holomorphic, then f(z) is also holomorphic; or if f(x) is analytic and contains no essential singularities, then f(z) is also holomorphic; or something like that.

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/RedditChenjesu
πŸ“…︎ Dec 29 2019
🚨︎ report
Kolmogorov Complicity and the Parable of Lightning (2017) slatestarcodex.com/2017/1…
πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/qznc_bot2
πŸ“…︎ Dec 19 2021
🚨︎ report
Kolmogorov Complicity and the Parable of Lightning (2017) slatestarcodex.com/2017/1…
πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/PatientModBot
πŸ“…︎ Dec 19 2021
🚨︎ report
211 BC: Archimedes struggles to solve the SIP roof extension-reduction theorem
πŸ‘︎ 21
πŸ’¬︎
πŸ‘€︎ u/Patrick_Spens
πŸ“…︎ Aug 28 2018
🚨︎ report
Who is the professor who essentially gave students a packet of math theorems and told them to prove them all themselves? Once a theorem was proved, they could use it to prove other theorems in the packet.

Thanks to anyone who is willing to help.

πŸ‘︎ 478
πŸ’¬︎
πŸ‘€︎ u/debased3
πŸ“…︎ Jan 21 2022
🚨︎ report
To every student that asks "When will I use this in real life?" I used the Pythagorean theorem at my bar to install new lights. And it looks perfect.

My boss asked what I was doing with my measurements and calculator - I proudly showed her. The new bulbs are a perfect "X" in the dining room. Thank you to the math teachers of Lake High School. 🤘

πŸ‘︎ 911
πŸ’¬︎
πŸ‘€︎ u/MattyBoomBlattyYo
πŸ“…︎ Jan 14 2022
🚨︎ report
[Q] Using the Kolmogorov-Smirnov test to compare 2 distributions - have I done it wrong?

Hi all,

I have 2 distributions that I've been working with, and I'm trying to quantify whether or not they have (statistically) significant differences. Let's call these array1 and array2.

Array1 has 500 values, ranging from -20 to 20, and array2 has 300 values, also ranging from -20 to 20. What I've done is histogram these data (from array1 and array2) separately, between -20 and 20, using bin sizes of 1, then divide by the total number of histogrammed data to get a probability density function (PDF) for each array.

I.e. so instead of the number of counts in each bin, it's a number between 0 and 1.

Using these PDFs from array1 and array2, I've calculated the CDFs. I've then calculated the D value by finding the index where the difference between the two CDFs is maximised.

So, this is how I've done it. However, the built-in packages for both IDL and Python are giving me a slightly different D statistic, and I don't understand where I'm going wrong. It's not clear what those functions take as input, though (should it be the raw array1/array2, or the PDFs, etc.?). If it's the raw data, how does it know about bin sizes, etc.?

Does it sound like I've done it right? Am I correct in assuming that I do want to be histogramming/binning the data (since that's how I can calculate a PDF/CDF, right?)
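For what it's worth, the built-in routines (e.g. scipy's ks_2samp) expect the raw samples, not binned PDFs: the empirical CDFs are step functions built directly from the sorted data, so no bin size is involved. A sketch of the equivalence (simulated uniform arrays stand in for array1/array2):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
array1 = rng.uniform(-20, 20, 500)
array2 = rng.uniform(-20, 20, 300)

# Empirical CDFs evaluated just after every observed point -- no binning:
grid = np.sort(np.concatenate([array1, array2]))
cdf1 = np.searchsorted(np.sort(array1), grid, side="right") / len(array1)
cdf2 = np.searchsorted(np.sort(array2), grid, side="right") / len(array2)
D_manual = np.max(np.abs(cdf1 - cdf2))

D_scipy = stats.ks_2samp(array1, array2).statistic
assert np.isclose(D_manual, D_scipy)
```

Binning first changes the ECDFs (and hence D), which would explain a small discrepancy against the library value.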

πŸ‘︎ 17
πŸ’¬︎
πŸ‘€︎ u/JLane1996
πŸ“…︎ Jun 30 2021
🚨︎ report
Does GΓΆdel's incompleteness theorem tell us anything about whether reality is finite or infinite, in its essence?
πŸ‘︎ 64
πŸ’¬︎
πŸ‘€︎ u/lepandas
πŸ“…︎ Jan 22 2022
🚨︎ report
[D] Importance of Kolmogorov's Probability Axioms

https://en.wikipedia.org/wiki/Probability_axioms

I read earlier today that Kolmogorov's probability axioms are some of the most important results in probability.

Can someone please explain why these are so important? What relevance and application do they have?

Thanks

πŸ‘︎ 22
πŸ’¬︎
πŸ‘€︎ u/SQL_beginner
πŸ“…︎ May 03 2021
🚨︎ report
Dusted off the ol' Pythagorean theorem for this accent wall my wife wanted.
πŸ‘︎ 1k
πŸ’¬︎
πŸ‘€︎ u/discoslimjim
πŸ“…︎ Jan 12 2022
🚨︎ report
An interdisciplinary and combinatorial analysis between the Cosmic Microwave Background, Kolmogorov Complexity, Vacuum Fluctuations, Instantons & de-Sitter Space. /r/quantumcosmology/comme…
πŸ‘︎ 11
πŸ’¬︎
πŸ‘€︎ u/chavisvonbradford
πŸ“…︎ Aug 11 2021
🚨︎ report
Differences between the Kolmogorov-Smirnov Test and the Shapiro-Wilk Test for Normality

Hello, I was curious about these two tests because they are apparently both used for checking the data before running a two-way ANOVA (or maybe it was just one of them and not both...?), so I was wondering what their differences are and when these tests are needed. Thanks!

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/Izuvivace
πŸ“…︎ Jul 29 2021
🚨︎ report
Inputs for a Kolmogorov-Smirnov test

Can you use a kstest to see how similar two sets of data are that do not come from a normalized probability distribution?

For example, if you have two sets of data that are similar and come from a sinusoidal function, would a kstest be valid even though a sine wave is not a valid probability distribution?

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/BeautifulParsley
πŸ“…︎ Jul 30 2021
🚨︎ report
One thing I don't understand about Kolmogorov complexity

What if there is a way to express something succinctly, but it is not obvious to anyone? Take any sequence that has a high Kolmogorov complexity: what if there is, for lack of a better scenario, an alien civilization that can describe the sequence perfectly with very few words?

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/mathjournal
πŸ“…︎ Jul 08 2021
🚨︎ report
[D] Kolmogorov-Chapman vs Kolmogorov Backwards/Forwards
  • https://en.wikipedia.org/wiki/Chapman%E2%80%93Kolmogorov_equation

  • https://en.wikipedia.org/wiki/Kolmogorov_backward_equations_(diffusion)

What is the main difference between the Kolmogorov-Chapman equation vs Kolmogorov Backwards/Forwards equations?

It seems like the Kolmogorov-Chapman equation can be used to calculate the vector containing the probabilities that a discrete-time Markov chain will be in any of its states.

Whereas the Kolmogorov Forwards/Backwards equations are used in continuous-time Markov chains to find the probability that the Markov chain was in a particular sequence of states (backwards: historical) or to predict that the Markov chain will be in a particular sequence of states (forwards: future)?

Is this correct?
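That reading of the discrete-time case matches how it's usually computed; a minimal sketch (the two-state chain below is made up):

```python
import numpy as np

# Chapman-Kolmogorov in discrete time: P^(m+n) = P^(m) @ P^(n),
# so n-step transition probabilities are just matrix powers.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

P2 = P @ P
assert np.allclose(P2, np.linalg.matrix_power(P, 2))

pi0 = np.array([1.0, 0.0])   # start in state 0
pi2 = pi0 @ P2               # state distribution after two steps
```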

Thanks!

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/SQL_beginner
πŸ“…︎ Sep 28 2021
🚨︎ report
What are some vastly misinterpreted math theorems?
πŸ‘︎ 472
πŸ’¬︎
πŸ‘€︎ u/ilya123456
πŸ“…︎ Dec 19 2021
🚨︎ report
What are some interesting "all but x" theorems?

Basically, a theorem that says "all but some number of cases" satisfy the theorem

πŸ‘︎ 442
πŸ’¬︎
πŸ‘€︎ u/SomeoneRandom5325
πŸ“…︎ Jan 03 2022
🚨︎ report
Exercise Solutions for "An Introduction to Kolmogorov Complexity and its Applications"

Does anyone have a resource for this? I am self-studying the book in grad school and don't have anyone to bounce my solutions off. I understand many of the exercises are basically open research questions, but it would still be helpful to get as many answers as possible.

πŸ‘︎ 7
πŸ’¬︎
πŸ‘€︎ u/drcopus
πŸ“…︎ Jul 18 2021
🚨︎ report
Help with thesis statistics: Kolmogorov-Smirnov 2-sample test

Hi! I'm wondering if anyone has any knowledge about this test and its applications.

I'm using it to compare the cumulative distributions of egg production between groups of tardigrades, but I'm not sure how to account for changing numbers of tardigrades in each group.

Ex: 2 tardigrades in 1 sample have died, leaving 8. These 8 produce 10 eggs, but a different group of 10 also produces 10 eggs.

To my understanding, this test only compares the eggs produced, i.e. comparing 10 to 10, without considering the differing sizes of the groups that produced these eggs. How should I account for these changing group sizes?

Thanks in advance, I'd be happy to explain further if this was unclear!

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/akb700
πŸ“…︎ Mar 25 2021
🚨︎ report
