Is it viable to prove a statement like "Let A, B, and C be sets. If the Cartesian product of A and C is a subset of the Cartesian product of B and C, then A is a subset of B" by assuming the antecedent and deriving the consequent? Wikipedia says that all the tools of propositional calculus remain valid in higher-order logic, but I just want to make sure that is indeed the case. For some reason conditional proofs aren't taught in my discrete math class beyond a single mention, but they seem invaluable to me in these kinds of proofs. I'm not familiar at all with how to format mathematical notation on Reddit, so forgive me for that.
Here is an outline of my proof, any notes would be much appreciated.
Assume (A x C) is a subset of (B x C)
Let x be an element of A and y an element of C; then (x, y) is an element of (A x C). Because we assumed (A x C) to be a subset of (B x C), (x, y) is also an element of (B x C), so x is an element of B. Since every element of A must therefore also be an element of B, A is a subset of B.
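EDIT: not a proof, of course, but to convince myself the statement is even true I brute-forced it over all small subsets of a tiny universe. Note that I had to skip the case where C is empty; the implication seems to need C to be nonempty.

```python
# Enumerate all subsets A, B, C of a small universe and confirm that
# whenever A x C is a subset of B x C (with C nonempty), A is a subset of B.
from itertools import chain, combinations, product

universe = [0, 1, 2]
subsets = [set(s) for s in chain.from_iterable(
    combinations(universe, r) for r in range(len(universe) + 1))]

for A, B, C in product(subsets, repeat=3):
    if not C:
        continue  # if C is empty, A x C is empty and the implication can fail
    if set(product(A, C)) <= set(product(B, C)):
        assert A <= B, (A, B, C)
print("no counterexamples found")
```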
This is a homework problem. I suppose in the interest of ethics don't just provide me with a solution instead point out any flaws in my logic. Thanks!
In the definition of conditional expectation: (Ω, F, P) a probability space and G a sub-σ-algebra of F; then (for G-measurable X):
E[X|G] = X
but why?? I haven't found any proof, just the proposition. In my mind it should work like E[X|G] = E[X], but I don't get why it equals X.
I can't stop thinking about it, and it makes me uncomfortable to use it when I really don't know why it's correct.
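EDIT: here is a minimal discrete sketch I built for myself (my own toy example, assuming the usual hypothesis that X is G-measurable): on a finite space, conditioning on G just averages X over each block of the partition generating G, and if X is already constant on each block, those averages give back X itself.

```python
# Toy example on a four-point space with uniform probability.
# G is generated by the partition {{0,1},{2,3}}; conditioning on G
# replaces X by its average over each block. If X is G-measurable
# (constant on each block), those averages reproduce X exactly.
omega = [0, 1, 2, 3]
partition = [{0, 1}, {2, 3}]
X = {0: 5.0, 1: 5.0, 2: -2.0, 3: -2.0}  # constant on each block

cond_exp = {}
for block in partition:
    avg = sum(X[w] for w in block) / len(block)  # uniform measure: plain mean
    for w in block:
        cond_exp[w] = avg

assert all(cond_exp[w] == X[w] for w in omega)
print("E[X|G] coincides with X pointwise")
```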
Hi, I need to write down several conditional proofs, following this format:
I need to replicate the lines and arrow for my homework.
Hello, I know that a conditional probability is a new probability space (Ω′, Σ′, P′) (where Ω′ = B) taken from some bigger probability space (Ω, Σ, P). If I'm wrong, correct me. I have a task to prove the Kolmogorov axioms for the function P′ of the new space (Ω′, Σ′, P′).
I just need to confirm that I did it well:
P′(Ω′) = P(B|B) = P(B ∩ B)/P(B) = P(B)/P(B) = 1
So, if A ⊆ C, I have to prove that
P′(A) ≤ P′(C)
P′(A) = P(A ∩ B)/P(B)
P′(C) = P(C ∩ B)/P(B)
Is it enough to just prove that P(A) ≤ P(C)?
A = (A ∩ C)
C = (C \ A) ∪ (A ∩ C)
so P(A ∩ C) ≤ P(A ∩ C) + P(C \ A)
P′(A) = P(A ∩ B)/P(B)
P′(C) = P(C ∩ B)/P(B)
P′(A ∩ C) = P((A ∩ C) ∩ B)/P(B)
P′(A ∪ C) = P((A ∪ C) ∩ B)/P(B) = P((A ∩ B) ∪ (C ∩ B))/P(B)
I've completely lost my mind and have no idea what to do next.
Can you confirm that I've proven the first and second axioms, and give some advice for the third one? Thank you.
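EDIT: to double-check myself I wrote a tiny finite example (my own toy numbers, nothing from the task itself) confirming that P′(·) = P(· ∩ B)/P(B) behaves like a probability measure:

```python
# Toy finite check that P'(A) = P(A ∩ B)/P(B) satisfies the axioms:
# normalization, monotonicity, and additivity over disjoint events.
prob = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}   # P on Omega = {1, 2, 3, 4}
B = {2, 3, 4}                              # conditioning event, P(B) > 0

def P(event):
    return sum(prob[w] for w in event)

def P_prime(event):
    return P(event & B) / P(B)

assert abs(P_prime(B) - 1.0) < 1e-12                   # P'(Omega') = 1
assert P_prime({2}) <= P_prime({2, 3})                 # A ⊆ C gives P'(A) <= P'(C)
disjoint_sum = P_prime({2}) + P_prime({3, 4})
assert abs(P_prime({2, 3, 4}) - disjoint_sum) < 1e-12  # additivity
print("all checks passed")
```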
1. (J v I) > S
2. A > J            / A > (S v T)
3. | A              ACP
4. | J              MP 2,3
I am thoroughly stuck on how to get to S, even though you can clearly see J, which proves J v I; I legitimately can't figure out what rule to use. I know I will use Add when I get to S, to get S v T.
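EDIT: not the derivation, but here's a quick brute-force script (my own) confirming the argument is truth-functionally valid, so a proof does exist:

```python
# Verify ((J v I) -> S) and (A -> J) together entail A -> (S v T)
# by checking every truth assignment.
from itertools import product

def implies(p, q):
    return (not p) or q

for J, I, S, A, T in product([True, False], repeat=5):
    if implies(J or I, S) and implies(A, J):   # both premises hold
        assert implies(A, S or T)              # so must the conclusion
print("argument is valid")
```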
Problem:
Prove or Disprove: (X ⫫ W | Z, Y) & (X ⫫ Y | Z) → (X ⫫ Y, W | Z)
The textbook (Pattern Recognition and Machine Learning by Bishop, pg 373) only gives a single, simple example of how this works, and does so using probability.
Ex. p(a,b|c) = p(a|b,c)p(b|c) = p(a|c)p(b|c); therefore, a ⫫ b | c
I don't know how to expand this to the larger problem.
I tried converting it to a probability expression along the lines of
p(x,w|y,z) & p(x,y|z) => p(x,y|z)p(x,w|z)
p(x|z)p(x|y)p(w|z)p(w|y) & p(x|z)p(y|z) => p(x|z)p(y|z)p(x|z)p(w|z)
but this seemed neither the correct way to do it, nor did it look like I had expanded the probabilities correctly. I know Bayes' Theorem is probably a key part, but I'm not sure where.
I'm just not sure how to start this problem. Searching for Conditional Independence doesn't give anything similar enough to make a guess.
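EDIT: adding what I eventually pieced together, in case it helps the next reader (this is the standard contraction argument in my own words, not Bishop's). Using the characterization from Bishop's example, a ⫫ b | c exactly when p(a|b,c) = p(a|c), the two premises chain together:

p(x | y, w, z) = p(x | y, z)    [from X ⫫ W | Z, Y]
               = p(x | z)       [from X ⫫ Y | Z]

and p(x | y, w, z) = p(x | z) is precisely X ⫫ Y, W | Z, so the statement is provable rather than disprovable.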
Use conditional proof to derive the conclusion of the following argument.
a) (N v D) → (M v C)
b) (M v Q) → C
Conclusion: N → C
United States
I have fewer than 10 employees.
Is it OK for me to offer a bonus of a few hundred dollars to all of my employees, where, to get it, they need to prove they had health insurance coverage for the prior year while employed?
The reason for this is to encourage financial and personal responsibility. I've learned that many of my employees do not have health insurance, even though it is often available to them for very little (for example, $5 a month after tax credits).
Hi All. As my title says, I'm trying to prove the addition rule of inference using a conditional proof. It's very easy to prove using an indirect proof, but much harder (imo) to prove using a conditional proof. I think I was successful though. Someone who saw my work implied that I'd made at least one mistake without elaborating. So I thought I'd check to see whether I'd solved it or not. And whether someone can come up with other ways of solving it. Please see my solution below:
1. P                       / P v Q
2. | P --> Q               ACP
3. | Q                     1, 2, MP
4. (P --> Q) --> Q         2-3, CP
5. (~P v Q) --> Q          4, MI
6. ~(~P v Q) v Q           5, MI
7. (~~P * ~Q) v Q          6, DM
8. (P * ~Q) v Q            7, DN
9. Q v (P * ~Q)            8, Comm
10. (Q v P) * (Q v ~Q)     9, Dist
11. Q v P                  10, Simp
12. P v Q                  11, Comm
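EDIT: for my own peace of mind I also checked the pivot of the proof numerically. A tiny script (mine, not from the book) confirms that (P --> Q) --> Q is truth-functionally equivalent to P v Q, which is what lines 5 through 12 unpack:

```python
# Check that (P -> Q) -> Q has the same truth table as P v Q,
# the equivalence the second half of the proof works through.
from itertools import product

def implies(p, q):
    return (not p) or q

for P, Q in product([True, False], repeat=2):
    assert implies(implies(P, Q), Q) == (P or Q)
print("(P -> Q) -> Q is equivalent to P v Q")
```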
I tried looking for an answer to this, but all I found were explanations of the difference between a proof by contraposition and a proof by contradiction.
Let's say I have a conditional statement, "if a, then b". I take the contrapositive of this statement, making the logically equivalent statement "if not b, then not a". From this contrapositive, I perform a proof by contradiction: assume b is true, thus a is true (basically the converse of "if a, then b"), then find a contradiction because a is not true. Therefore, if not b, then not a. Thus, by contraposition, if a, then b.
Been stuck at this for some time. Does anyone know how to do this?
(A → B) ∧ (D → A)
Goal/conclusion: B → (D ∧ A)
many thanks!
Hey r/logic, been struggling with this one all weekend. I'm sure it's easy but something isn't clicking.
I am working within basic sentential logic, from Klenk's "Understanding Symbolic Logic", with the usual rules of inference, replacement rules, conditional and indirect proof, the two universal and two existential quantifier rules, quantifier negation rules, and complex quantifier rules.
The task is to prove the following argument
Conclusion: ~(∃x)Fx ⊃ (∀x)~Mx
It doesn't look like any quantifier negations can be used here (right?), so I instantiated such that I can work with the argument and got
then I did a bunch of sentential rules trying to get to something I can do conditional proof on, and got
Now, even with all this, I can't seem to find a conditional proof that I can solve. I see that in the conclusion I will need to first achieve an instance of (∀x)~Fx ⊃ (∀x)~Mx, and I see that the conclusion could be written as (∀x)~Fx ⊃ ~(∃x)Mx, so maybe there is an indirect proof nested in the conditional proof... but I cannot for the life of me figure it out. I hope this is a good start, but any suggestions would be much appreciated! (First time posting here, so I hope I did alright.)
EDIT: Lots of formatting. The justifications make sense if you number the steps properly in the order they appear... I don't know why they don't show up that way. Sorry, I'm not great at reddit.
Given:
P(A) = P(B) = 1/2
P(C | A ∩ B) = 1/8
P(C | A ∩ (not B)) = 1/10
I know I have to show P(A ∩ B) = 1/4. How do I do this?
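EDIT: for anyone finding this later, the missing ingredient seems to be an independence assumption I left out of the post. If A and B are independent (which the target value suggests), then
P(A ∩ B) = P(A) · P(B) = (1/2)(1/2) = 1/4.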
You guys were really helpful the last time I posted. I thought I was good on these two kinds of proofs, but I'm completely at a loss. Here is the first conditional proof:
ACP
ACP
MP 1,3
MP 2,4
AIP?
MT 5,7
MP 6,8
Conj 8,9
7-10 IP?
3-12 CP?
I don't know what steps 7 and 11 would be called, or if they're even "legal", but I think it works. Thanks to /u/stua8992
I think you have to assume multiple premises, but still have no clue what to do.
Here is the indirect:
Here's my attempt at this proof; I think it might be OK.
AIP
MP 1,4
Simp. 5
Simp. 5
Add. 6
Comm. 8
M.P. 3,9
Conj. 6,10
DeMorgan 11
M.T. 2,12
DeMorgan 13
D.N. 7
D.S. 14,15
Conj. 4,16
IP 4-17
I'll update as I make any progress at all on these. Thanks in advance for any help at all. Edit: I'm getting downvoted, not sure why, but if this isn't following the rules, mods please remove it.
a = minimum(a,b) <=> a <= b, where minimum(a,b) = (a + b - abs(a-b)) / 2
I was given this problem during a quiz last week, and I lost some points because, I'm guessing, I wasn't rigorous enough.
I'd like an explanation for the backward part of this proof. a<=b => a = min(a,b);
This is what I did.
Case 1: a<b
then a - b is the negative difference between a and b.
then abs(a-b) is the positive difference between a and b.
then b - abs(a-b) is b minus the positive difference between a and b (b - abs(a-b) = b - (b - a)).
(a + b - (b - a))/2 = (a + a)/2 = 2a/2 = a
a = min(a,b)
Case 2: a = b
then (a + b - abs(a-b))/2 = (a + b)/2 = (a + a)/2 = a
a = minimum(a,b)
To do this, I had to think very hard about the term abs(a-b). I thought of a and b as vectors and came to the conclusion that a - b is the negative difference between the two when a < b. My friend said he did this problem using algebra, but I have no clue how that could be done.
What are the flaws in my way, and what other ways could this be solved?
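EDIT: here's a quick numeric check I wrote afterwards (not the algebraic route my friend meant, just a confidence test that the identity itself holds):

```python
# Numeric check of the identity min(a, b) == (a + b - |a - b|) / 2
# across many random pairs. Note the whole numerator is halved.
import random

def min_formula(a, b):
    return (a + b - abs(a - b)) / 2

for _ in range(10_000):
    a, b = random.uniform(-100, 100), random.uniform(-100, 100)
    assert abs(min_formula(a, b) - min(a, b)) < 1e-9
print("identity holds on all sampled pairs")
```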
I'm having a really difficult time solving this proof, but I think that has more to do with the countless hours of logic I have done today; my brain is worn out. Anywho, any sort of help on this proof would be greatly appreciated! http://imgur.com/a/hCr5i
I made an argument a while back that took the form:
I was accused of making a circular argument, yet, I don't think that complaint was correct. If A entails B, it doesn't seem to be circular to argue that "if A, then B".
So, I decided to take this to you all: Is there a situation in which a conditional proof can be circular?
The standard proof (the only one I've seen) for this uses the fact that (X - qY)^2 >= 0 for all rational q. Now, conditioning on a sigma-algebra G yields
E((X - qY)^2 | G) = E(X^2|G) - 2qE(XY|G) + q^2 E(Y^2|G),
and, with probability one, this is non-negative for all rational q. However, a quadratic polynomial with positive leading coefficient is non-negative everywhere iff its discriminant is non-positive. Now we should be able to write the formula as
4E(XY|G)^2 - 4E(X^2|G)E(Y^2|G) <= 0,
and this is where I get lost. All help is appreciated.
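EDIT (what finally made it click for me, in my own words, so treat with care): fix one almost-sure event on which E((X - qY)^2 | G) >= 0 holds simultaneously for every rational q; a countable intersection of probability-one events still has probability one. On that event the map
q -> E(Y^2|G) q^2 - 2q E(XY|G) + E(X^2|G)
is non-negative on the rationals and continuous in q, hence non-negative for all real q. A real quadratic aq^2 + bq + c with a > 0 that never goes negative must satisfy b^2 - 4ac <= 0, which is exactly
4E(XY|G)^2 - 4E(X^2|G)E(Y^2|G) <= 0, i.e. E(XY|G)^2 <= E(X^2|G)E(Y^2|G),
the conditional Cauchy-Schwarz inequality. (The degenerate case E(Y^2|G) = 0 forces E(XY|G) = 0, and the inequality holds trivially.)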
First and foremost, this is a proof-reading request. I'm going through Velleman's "How To Prove It" because I found that writing and understanding proofs is a prerequisite to serious study of mathematics that I did not meet. Unfortunately, the book is very light on answers to its exercises and there is no solution manual available for purchase. Also, I understand that the best (and only?) way of learning to write proofs is by getting feedback from actual human beings. For what it's worth, I have an engineering degree's curriculum worth of math.
> Exercise 3.2.2. This problem could be solved by using truth tables, but don't do it that way. Instead, use the methods for writing proofs discussed so far in this chapter. (See Example 3.2.4.)
>
> (a) Suppose [; P \rightarrow Q ;] and [; R \rightarrow \neg Q ;] are both true. Prove that [; P \rightarrow \neg R ;] is true.
>
> (b) Suppose that [; P ;] is true. Prove that [; Q \rightarrow \neg (Q \rightarrow \neg P) ;].
At this point Velleman has introduced sentential and quantificational logic. He has talked about a few techniques that can be used to tackle proofs involving conditionals and negations, including proof by contradiction and contrapositive, how one can use the givens of a problem to infer other givens via inference rules such as modus ponens and modus tollens, etc. For example, he says that to prove a statement of the form [;P \rightarrow Q;], one can assume that [; P ;] is true and then prove [; Q ;].
The first one is rather simple:
The second one I found at least 3 ways of proving, so I'm most interested in input on this one. Which one would be better? Are there other, better ways of proving it?
(By assuming the antecedent is true and proving the consequent.) Suppose [;Q;] is true. Then [; \neg (Q \rightarrow \neg P) ;] is also true, since it is equivalent to [; Q \wedge P ;] and we know [;P;] is true. Therefore, if [; Q ;], then [; \neg (Q \rightarrow \neg P);].
(By contrapositive.) We will prove [;Q \rightarrow \neg (Q \rightarrow \neg P);] by its contrapositive, [;(Q \rightarrow \neg P) \rightarrow \neg Q;]. Suppose [;Q \rightarrow \neg P;] is true. Since we know [;P;] is true, it cannot be that [;Q;] is true (again, because if [;Q;] were true, then [;\neg P;] would follow, contradicting [;P;]). Therefore [;\neg Q;], which completes the contrapositive proof.
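EDIT: even though Velleman forbids truth tables for the exercise itself, a short script (mine, purely as a safety net) confirms both statements are tautologies before polishing the prose proofs:

```python
# Truth-table verification that (a) and (b) are tautologies.
from itertools import product

def implies(p, q):
    return (not p) or q

# (a): ((P -> Q) and (R -> not Q)) -> (P -> not R)
assert all(
    implies(implies(P, Q) and implies(R, not Q), implies(P, not R))
    for P, Q, R in product([True, False], repeat=3)
)

# (b): P -> (Q -> not (Q -> not P))
assert all(
    implies(P, implies(Q, not implies(Q, not P)))
    for P, Q in product([True, False], repeat=2)
)
print("both statements are tautologies")
```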
Some context: I want to study economics and I am Romanian. I will be sitting the Romanian Baccalaureate in July. I have applied to Oxford University, LSE, UCL, Warwick, and Lancaster. Next week I will be required to sit the TSA for Oxford.
LSE has sent me an email to inform me that they will make a decision in the coming weeks.
Yesterday, I received an offer from Lancaster. In my UCAS letter, they asked me for further proof regarding my CAE results, so today I sent them a scanned copy of my diploma. My question is whether I just made them my firm choice by answering their query. I haven't received any other offers yet, but the deadline is still far off.
Did I act too fast? Should I have waited longer?
I've been doing fine with natural deduction with rules of replacement and implication, but the conditionals are tough. Help please?
C.1.
[~(N v ~A) -> ~X]
[A -> (N -> C)]          / [X -> (A -> C)]
C.2.
[J -> (T & L)]
[P -> (R & M)]           / [(L -> P) -> (J -> M)]
Here's the assignment:
I. Give a conditional proof of the following argument:
G > (E > N)
H > (~N > E) / G > (H > N)
II. Give an indirect proof of the following argument:
S > (R ^ ~T)
(S ^ R) > (T v E)
(Q v ~T) > ~E /~S
Any help would be awesome.
Prove this using a conditional proof??? Please help!!!
/ ~P --> (H --> ~B)