A list of puns related to "Curry-Howard Isomorphism"
After looking at the correspondence between types : programs and theorems : proofs, I am stuck at a point while studying lambda expressions. Consider the following functions:
\x -> x*x + 2*x + 1
\x -> (x + 1) * (x + 1)
I would like to arrive at a normal form in lambda calculus so that I can show the algebraic equivalence of the above functions (assume the fixed-point operator is omitted when checking equivalence).
But is arriving at a normal form via beta-reduction in lambda calculus equivalent to running the program itself?
Or is it just algebraic reduction, similar to what an SMT solver does (like SBV in Haskell, or Microsoft's Z3)?
And if so, is there an equivalent of program evaluation on the logic side, according to the Curry-Howard isomorphism?
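For what it's worth, both lambdas above are already distinct beta-normal forms, so beta-reduction alone will never identify them; their equality rests on the arithmetic laws of + and *, which is exactly the kind of fact an SMT solver discharges. A quick extensional check in Haskell (evidence on a finite sample, not a proof):

```haskell
-- Both definitions are already in beta-normal form; beta-reduction
-- cannot rewrite one into the other. Their agreement is a fact about
-- the arithmetic constants, not about the lambda calculus itself.
f, g :: Integer -> Integer
f x = x*x + 2*x + 1
g x = (x + 1) * (x + 1)

-- Extensional check on a finite sample of inputs (evidence, not proof):
agree :: Bool
agree = all (\x -> f x == g x) [-100 .. 100]
```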
I'm learning from this book: https://disi.unitn.it/~bernardi/RSISE11/Papers/curry-howard.pdf (Lectures on the Curry-Howard Isomorphism, 1998 version) for a project. Due to time constraints, I probably won't be able to cover all of the material in the book. Luckily, although it would be useful, I don't think I need to know everything in it, just selected topics. At the moment I have finished the first chapter and about a third of the second, and from what I've seen so far, the second chapter doesn't actually seem to require the first; the two chapters cover separate topics. Perhaps that's true for other chapters as well. Of course, some chapters will require knowledge of previous ones (I imagine the 4th chapter, on the Curry-Howard isomorphism, does), but even those might not require all of the previous chapters.
So, it would be very helpful if someone with experience with the topics covered in this book could list the prerequisites for each chapter, especially for chapters 4 and 11 (Heyting Arithmetic), which cover material I definitely need.
By the way, I asked this question on stack exchange math, so if you want to answer the question there, here is the link: https://math.stackexchange.com/questions/3489031/a-question-about-the-order-of-learning-from-the-book-lectures-on-the-curry-howa
Say I have a simply typed lambda calculus with two base types: Nat and Bool, containing the obvious constants.
If I write the function (λ (x : Nat) . True)
I have written something with the type Nat -> Bool
. Isn't this the CH-equivalent to proving a -> b
in natural deduction or the like? Something which is certainly not a valid theorem. Surely, having values belonging to types allows us to 'prove' that type whenever we want?
Obviously I'm wrong, but where is the gap in my understanding? Also, apologies if this isn't the most relevant subreddit, I wasn't sure where else to ask this. I am writing a type-checker for the STLC in Haskell, and this has confused me.
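For what it's worth, the gap is usually this: `Nat -> Bool` is one *specific* proposition, not the schema "a -> b for all a and b", and that specific proposition is a perfectly good intuitionistic theorem, because Bool is inhabited. A small Haskell illustration (using Int as a stand-in for Nat):

```haskell
{-# LANGUAGE RankNTypes #-}

-- `Nat -> Bool` is one specific proposition, and it happens to be a
-- theorem: \x -> True proves it, because Bool (the conclusion) is
-- inhabited. Using Int as a stand-in for Nat:
proof :: Int -> Bool
proof _ = True

-- The schematic claim "a -> b for ALL a and b" corresponds to the
-- polymorphic type below, which has no total inhabitant: given an
-- arbitrary `a`, there is no way to produce an arbitrary `b`.
type Unprovable = forall a b. a -> b
```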
Here's my understanding:
In classical logic, we define logical implication such that False -> P
is true regardless of what proposition P is (even if P is false), and we define P -> False
to be true if and only if P is false; P -> False
is false if P is true.
Therefore, in intuitionistic logic, we say that False -> P
is inhabited for all P, and P -> False
is only inhabited if P is uninhabited, and vice versa.
Therefore, according to the Curry-Howard isomorphism, there exists some function with the type Bottom -> P
regardless of what type P
is. There also exists some function of type Bottom -> Bottom
. However, there does not exist any function of type P -> Bottom
, unless P is the Bottom type.
And yet I can think of how to implement a function of type P -> Bottom
(I believe in Haskell, it'd be something like const undefined
; in imperative languages, you could just enter an infinite loop or throw an exception or something), but I can't see how to implement a function of type Bottom -> P
without knowing P ahead of time.
How do I reconcile this?
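A concrete way to line this up in Haskell, with the caveat that Haskell is non-total, so `undefined` and infinite loops inhabit every type and inhabitation does not soundly track provability:

```haskell
import Data.Void (Void, absurd)

-- Bottom -> P exists uniformly in P: Void has no constructors, so the
-- (empty) case analysis is exhaustive. `absurd` is exactly this function.
fromBottom :: Void -> p
fromBottom = absurd

-- Example use: refuting an impossible branch.
handle :: Either Void Int -> Int
handle (Left v)  = absurd v   -- this case can never actually be reached
handle (Right n) = n

-- By contrast, `const undefined :: p -> Void` type-checks only because
-- Haskell is non-total: undefined inhabits every type. In a total
-- language (Agda, Coq), p -> Void is inhabited exactly when p is
-- uninhabited, matching the intuitionistic reading.
```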
I started watching the NBA only after Dwight Howard left the spotlight. But I've heard a lot about how Dwight Howard would smile during games and he got criticized by the media for not taking the game seriously. I'm wondering why Steph Curry, who is probably the most expressive player currently in terms of smiling, laughing, and having fun while playing, has avoided similar criticism for these behaviors?
Magic fan here
Been years but my brain still remembers that 2008-2009 team.
As limited as Dwight was on offense beyond his post-ups and screen-and-rolls, he helped incite a revolutionary Magic squad that shot the lights out.
4-out-1-in, Dwight surrounded by shooters. Curry is obviously the greatest shooter, but I think this Magic team, because of Dwight, was a forgotten main influence on this trend.
Beck wrote a pretty interesting story for SI that gets into player-coach relationships, with a lot of interviews from Shaq, Hakeem, Steph, Kerr, Carlisle, etc. He basically wonders if we'll ever see a long-term player-coach relationship like Duncan-Pop, Phil-MJ/Kobe/Shaq, or Isiah-Chuck Daly again due to player empowerment, and if there's something missing in the NBA without that.
https://www.si.com/nba/2021/05/07/nba-coach-star-partnerships-daily-cover
Some of the more interesting quotes from it:
> You cannot tell the story of the Warriors' dynastic run without Curry and Kerr, though they just might be the final avatars of this once-standard archetype. The model is fast eroding in the NBA's Player Empowerment Era, undermined by superstar mobility, coaching instability, evolving power dynamics and shifting sensibilities on the notion of "loyalty" in professional sports. Simply put, today's superstars and coaches don't stay together long enough to forge those deeper bonds.... Coaching still matters, but coaches seem like an afterthought. Perhaps even a bit diminished. The great partnerships of the past were built on trust and communication, and forged over years. It's how winning cultures take root: the star and coach amplifying one another, setting expectations for everyone else. What happens when that bond disappears? Does culture go with it? Do dynasties disappear? Have today's stars lost something essential?
IT, Shaq, & Hakeem discussing this:
> More than a decade earlier, Isiah Thomas drew a similar hard line with the Pistons to save Chuck Daly from being fired in the midst of the 1985-86 season.
> The Pistons would go on to win titles in 1989 and 1990, cementing Thomas and Daly as legends. Daly was elected to the Hall of Fame in 1994, and six years later he stood as the presenter for Thomas's induction. "I never would have became the champion and player that I became, had Chuck not been my coach," Thomas says.
> It's a common theme. O'Neal and Bryant, by their own assessment, needed Jackson to summon the best versions of themselves, to become champions. So did Jordan, who was widely regarded as a selfish gunner until Jackson arrived and persuaded him to embrace the triangle, a system Jordan initially mocked as an "equal-opportunity offense."
> Hakeem Olajuwon was a perennial All-Star under Fitch and later Don Chaney. But his best years came with Rudy Tomjanovich, who after nearly a decade as a Rockets assistant was promoted to the h
I have been looking into k-uniform Euclidean tilings recently (https://en.wikipedia.org/wiki/List_of_k-uniform_tilings). As far as I know, their list is complete only to k=7.
I have made and implemented an algorithm (a variant of my previous tiling search approach) that can extend this list, and extend it significantly (I'm currently running it up to k=12, although this will take a few days to complete).
Here's the rub: I think that the algorithm is guaranteed to find every solution. (I haven't actually proven it, but the logic seems sound.) But the trouble is that the same solution can be (and usually is) found multiple times. Some solutions are actually found many times (particularly those that contain many similar vertex types such as the many, many solutions consisting of rows of squares and triangles alternating in some pattern).
I've been trying to go through the solutions by hand, but the potential for human error is too large. I managed to *almost* replicate the lists of 3-uniform and 4-uniform tilings from Wikipedia, but I always overlooked a few solutions (they were in the data set; I just missed them).
I need help with devising some sort of pruning algorithm that could go over the result file and specifically point out unique solutions.
Some details: This is how a typical output looks:
Number of polygons: 10
(6,6,6)F, (3,3,6,6)F, (3,3,3,3,3,3)A2, (3,3,6,6)F, (3,3,3,3,6)F, (3,3,3,3,3,3)A2, (3,3,3,3,6)F, (3,3,3,3,6)F, (3,3,3,3,3,3)A2, (3,3,3,3,6)A
(6,6,6)F, (3,3,6,6)Fx2, (3,3,3,3,6)A, (3,3,3,3,6)Fx3, (3,3,3,3,3,3)A2x3
TES file: 10\10_36\3g 4e2 5a 5b3 6i3\eu raw 3g 4e2 5a 5b3 6i3 11.tes
(0 1')[1](2)(0' 2''')[2'](3' 2'')(0'' 2@4)(1'' 3''')(0''' 1@4)[1'''](0@4 1@6)[3@4](4@4 2@5)(0@5 4@7)(1@5 2@6)[0@6](3@6 3@7)(4@6 0@8)[0@7](1@7 0@9)(2@7 1@8)[2@8 2@9](3@9)
0: 0/1(6)-*1/*0(6)-*1'/*0'(6)-*2'''/*1'''(6)-1'''/2'''(6)-0'/1'(6)
1: 1/2(6)-2/0(6)-1'/2'(6)-*2'/*1'(6)-*0/*2(6)-*2/*1(6)
2: 2'/3'(3)-2''/*2''(3)-*3'/*2'(3)
3/4: 3'/0'(3)-2'''/3'''(3)-1''/2''(3)
*0'/*3'(3)-*2''/*1''(3)-*3'''/*2'''(3)
5/6: 0''/1''(3)-3'''/0'''(3)-1@4/2@4(3)
*1''/*0''(3)-*2@4/*1@4(3)-*0'''/*3'''(3)
7: *0''/0''(3)-2@4/3@4(3)-*3@4/*2@4(3)
8: 0'''/1'''(6)-*1'''/*0'''(6)-*1@4/*0@4(6)-*1@6/*0@6(6)-0@6/1@6(6)-0@4/1@4(6)
9: 3@4/4@4(3)-2@5/*2@5(3)-*4@4/*3@4(3)
10/11: 4@4/0@4(3)-1@6/2@6(3)-1@5/2@5(3)
*0@4/*4@4(3)-*2@5/*1@5(3)-*2@6/*1@6(3)
`12/1
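A pragmatic first pass at the pruning (a sketch, assuming each solution can be reduced to its list of vertex-type strings, as in the second output line above): canonicalise each solution by sorting that list, then group solutions by canonical key and discard exact repeats. This catches transcripts of the same tiling found in different discovery orders; groups that still hold more than one member share vertex-type counts but may differ geometrically, so they need a finer invariant (e.g. comparing the adjacency data in the .tes output) or a manual check. The names `canonical` and `groupSolutions` are mine, not from the search tool.

```haskell
import qualified Data.Map.Strict as Map
import Data.List (sort, nub)

-- A solution reduced to its vertex-type strings, e.g.
-- ["(6,6,6)F", "(3,3,6,6)F", "(3,3,3,3,6)A", ...]
type Solution = [String]

-- Canonical key: the sorted multiset of vertex types, so transcripts
-- of the same tiling agree regardless of discovery order.
canonical :: Solution -> [String]
canonical = sort

-- Group solutions by canonical key, dropping exact (identical-list)
-- repeats. Any remaining group of size > 1 is a collision that needs
-- a finer invariant or manual inspection.
groupSolutions :: [Solution] -> Map.Map [String] [Solution]
groupSolutions ss =
  Map.map nub (Map.fromListWith (++) [ (canonical s, [s]) | s <- ss ])
```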
So, a problem I've created for myself at work.
I have two graph datasets in different systems that were created from the same underlying data, but I've lost the master key. (As always, the real lesson here is to practice good data hygiene instead of what I did, which was a billion different versions with various minor tweaks and no real version control.)
The original graph (G, E) is in GIS. Each vertex has between 0 and 10 neighbours, skewed to the low side (the median degree is 2).
The copy (G', E') is in Excel/SQL, and it wasn't important to retain more than 5 neighbours for each vertex. So it's the same vertex set, but the edge set is a strict subset (though a fairly large one).
Question: how can I best retrieve the mapping G <-> G'?
Obviously I don't care about isolated vertices with degree 0. My data structures aren't really meant to be graphs, so it's a pain to do graph traversal steps; I'd rather find a solution that's based solely on inspecting lists of neighbours.
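A starting point that needs only neighbour lists (a sketch; `signature` and `uniqueBySignature` are hypothetical names, and vertices are assumed keyed by Int ids): label each vertex by its degree plus the sorted degrees of its neighbours, then match vertices whose label is unique within their own graph; each matched pair then anchors its neighbours for the next round. One caveat: because E' keeps at most 5 neighbours per vertex, signatures of high-degree vertices in G' are truncated, so a robust version would test whether the G'-signature is a sub-multiset of the G-signature rather than plain equality; equality should suffice for the common low-degree vertices (median degree 2).

```haskell
import qualified Data.Map.Strict as Map
import Data.List (sort)

type Adj = Map.Map Int [Int]  -- vertex id -> list of neighbour ids

-- Degree of a vertex, plus the sorted degrees of its neighbours.
-- Computable purely from neighbour lists; no traversal needed.
signature :: Adj -> Int -> (Int, [Int])
signature adj v = (length ns, sort (map degree ns))
  where
    ns       = Map.findWithDefault [] v adj
    degree u = length (Map.findWithDefault [] u adj)

-- Vertices whose signature is unique within their own graph.
-- Computing this for both G and G' and intersecting on the signature
-- seeds the mapping; matched pairs then anchor their neighbours.
uniqueBySignature :: Adj -> [((Int, [Int]), Int)]
uniqueBySignature adj =
  [ (sig, v) | (sig, [v]) <- Map.toList groups ]
  where
    groups = Map.fromListWith (++) [ (signature adj u, [u]) | u <- Map.keys adj ]
```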
Can someone give me a short explanation of the Curry-Howard Correspondence?
Also, how important was the discovery of this correspondence and what are some other insights/theorems/fields that it led to?
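As a thumbnail: the correspondence reads logical connectives as type formers and proofs as programs, so checking a proof amounts to type-checking a term. A minimal Haskell illustration:

```haskell
-- Propositions-as-types in miniature: each connective is a type former
-- and each proof is a program of the corresponding type.
--   A and B  ~  (a, b)      A -> B  ~  a -> b      A or B  ~  Either a b

-- Modus ponens (from A -> B and A, conclude B) is function application:
modusPonens :: (a -> b, a) -> b
modusPonens (f, x) = f x

-- Commutativity of conjunction is swapping a pair:
andComm :: (a, b) -> (b, a)
andComm (x, y) = (y, x)
```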