A list of puns related to "Hindley–Milner type system"
I have some interest in trying to support this combination in my language, and I'm trying to see what progress has already been made. So far I haven't found anything more recent than 2009, but I don't know whether that's because nothing more recent exists.
The 2009 paper I found is Practical Variable-Arity Polymorphism, which actually works with Typed Scheme, which isn't Hindley-Milner based. Still, I found it helpful for understanding the scope of the problem. It cites a few other papers itself, which I mostly haven't looked at yet and which are all apparently more limited than what this paper implements.
Anything else I should be looking at?
(I don't necessarily expect to make significant progress on this, and there's a decent chance I decide it just looks too difficult and move on. But no harm in looking into it.)
I have just read a discussion about argument labels in Swift, and then remembered how strange that feature sometimes felt in OCaml. When we throw in default arguments, I wonder whether it is compatible at all with standard lambda-calculus functions and their type inference.
Is there such a system somewhere? I would really like to see how the authors solved some complications like:
I don't think such a system would be impossible, or even scientifically interesting, though. I am just curious whether anyone has ever made the effort to give a formal description of one.
Hello,
I am looking for alternatives to "Algorithm W" for type inference.
I saw another way to implement this in Stephen Diehl's book: http://dev.stephendiehl.com/fun/006_hindley_milner.html
I like the idea of splitting up the generation of constraints and solving these constraints.
But I don't know where this algorithm comes from, or whether there are better algorithms/implementations for type inference.
Can you please help me with that?
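To check that I understand the split, here is roughly how I picture the two phases for a tiny lambda calculus (no let-polymorphism). This is only my own sketch; the names (Expr, Ty, infer, solve, and so on) are mine and not from the book:

-- My own sketch of "generate constraints, then solve them", not Diehl's code.
-- No let-polymorphism here; the point is only the two-phase structure.
import qualified Data.Map as Map

type Name = String

data Expr = Var Name | IntLit Int | Lam Name Expr | App Expr Expr
  deriving Show

data Ty = TVar Name | TInt | TArr Ty Ty
  deriving (Eq, Show)

type Constraint = (Ty, Ty)          -- an equality the solver must make true
type Env        = Map.Map Name Ty   -- variables in scope and their types
type Subst      = Map.Map Name Ty

-- Phase 1: walk the term and collect constraints instead of unifying on the spot.
-- The Int argument is a counter for generating fresh type variables.
infer :: Env -> Int -> Expr -> (Ty, [Constraint], Int)
infer env n expr = case expr of
  IntLit _ -> (TInt, [], n)
  Var x    -> case Map.lookup x env of
                Just t  -> (t, [], n)
                Nothing -> error ("unbound variable: " ++ x)
  Lam x e  ->
    let tv              = TVar ("t" ++ show n)
        (tBody, cs, n') = infer (Map.insert x tv env) (n + 1) e
    in (TArr tv tBody, cs, n')
  App f a  ->
    let (tf, cs1, n1) = infer env n f
        (ta, cs2, n2) = infer env n1 a
        tr            = TVar ("t" ++ show n2)
    in (tr, cs1 ++ cs2 ++ [(tf, TArr ta tr)], n2 + 1)

-- Phase 2: solve the collected constraints by unification.
solve :: [Constraint] -> Subst
solve []            = Map.empty
solve ((a, b) : cs) =
  let s1 = unify a b
      s2 = solve [ (apply s1 x, apply s1 y) | (x, y) <- cs ]
  in compose s2 s1

unify :: Ty -> Ty -> Subst
unify (TVar v) t            = bind v t
unify t (TVar v)            = bind v t
unify TInt TInt             = Map.empty
unify (TArr a b) (TArr c d) =
  let s1 = unify a c
      s2 = unify (apply s1 b) (apply s1 d)
  in compose s2 s1
unify a b = error ("cannot unify " ++ show a ++ " with " ++ show b)

bind :: Name -> Ty -> Subst
bind v t
  | t == TVar v = Map.empty
  | occurs v t  = error "occurs check failed (infinite type)"
  | otherwise   = Map.singleton v t

occurs :: Name -> Ty -> Bool
occurs v (TVar w)   = v == w
occurs v (TArr a b) = occurs v a || occurs v b
occurs _ TInt       = False

apply :: Subst -> Ty -> Ty
apply s t@(TVar v) = Map.findWithDefault t v s
apply s (TArr a b) = TArr (apply s a) (apply s b)
apply _ TInt       = TInt

compose :: Subst -> Subst -> Subst
compose s2 s1 = Map.map (apply s2) s1 `Map.union` s2

-- Example: (\x -> x) 1 comes out as TInt once the constraints are solved.
example :: Ty
example =
  let (t, cs, _) = infer Map.empty 0 (App (Lam "x" (Var "x")) (IntLit 1))
  in apply (solve cs) t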
Haskell 2010 Report says
> Haskell uses a traditional Hindley-Milner polymorphic type system to provide a static type semantics [4, 6], but the type system has been extended with type classes (or just classes) that provide a structured way to introduce overloaded functions.
> [4] Luis Damas and Robin Milner. Principal type-schemes for functional programs. In Conference Record of the 9th Annual ACM Symposium on Principles of Programming Languages, pages 207–212, New York, 1982. ACM Press.
> [6] R. Hindley. The principal type scheme of an object in combinatory logic. Transactions of the American Mathematical Society, 146:29–60, December 1969.
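To make sure I'm reading the quote correctly, here is what I take "overloaded functions" introduced through a class to mean; a minimal example I wrote myself in plain Haskell, not something from the Report:

-- My own illustration of an overloaded function introduced via a class.
class MyEq a where
  eq :: a -> a -> Bool            -- one name, usable at many types: eq is overloaded

instance MyEq Int where
  eq = (==)                       -- reuse the built-in equality for Int

instance MyEq Bool where
  eq True  True  = True
  eq False False = True
  eq _     _     = False

-- The Hindley-Milner-style inferred type gets qualified by a class constraint:
member :: MyEq a => a -> [a] -> Bool
member _ []       = False
member x (y : ys) = eq x y || member x ys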
I am unfamiliar with PLT, and I am reading Pierce's Types and Programming Languages to catch up, trying to understand what the quote means.
I was wondering where in Pierce's book the following concepts are introduced:
a traditional Hindley-Milner polymorphic type system
type classes for introducing overloaded functions
Thanks.
Hi everyone, I'm trying to understand type inference by implementing the Hindley-Milner algorithm in a small language. I have found this guide, which so far has been very helpful; however, I've been stuck at the function type inference process, where the author introduces Context and Environment. Despite the explanation that the author gives, it's still unclear to me what the purpose of Context and Environment is. As far as I could understand, and please correct me if I'm wrong, the latter helps to keep track of variables in scope. Graphically:
let x = 3; <----- x lives in environment E1, being a global variable.
fn foo() { <--- a new environment E2 is introduced here, where x is a local variable.
    let x = 7;
}
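To put my current mental model into code: I picture the environment as a map from names to types that gets extended (not mutated) when entering a new scope, so the inner x shadows the outer one. This is only my own sketch; the names (TypeEnv, extend, lookupVar) are mine and not from the guide:

-- My own sketch of an environment; not code from the guide.
import qualified Data.Map as Map

data Type = TInt | TBool
  deriving Show

type TypeEnv = Map.Map String Type      -- variables currently in scope and their types

extend :: TypeEnv -> String -> Type -> TypeEnv
extend env x t = Map.insert x t env     -- returns a new, larger environment; the old one is untouched

lookupVar :: TypeEnv -> String -> Maybe Type
lookupVar env x = Map.lookup x env

demo :: (Maybe Type, Maybe Type)
demo =
  let e1 = extend Map.empty "x" TInt    -- global scope E1: x : Int
      e2 = extend e1 "x" TBool          -- inside foo, E2: a new binding for x shadows the outer one
  in (lookupVar e1 "x", lookupVar e2 "x")   -- (Just TInt, Just TBool)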
I was wondering, could anyone share an ELI5 explanation of these concepts? Any further resource or suggestion would be appreciated. Thanks in advance!
I have an HM typechecker mostly implemented, but there's a bug at the intersection of two features, user-supplied type signatures and patterns: when the user supplies a type declaration on part of a pattern, such as (translated into Haskell)
case (undefined :: Maybe Int) of
  Just (x :: a) -> ...
this should be a type error but currently isn't. I've been studying this for a few days to try to figure out how to fix it, but I haven't got anywhere yet.
Part of the problem is that the resources I based my implementation on don't touch this area. Write You a Haskell implements a language with no user-supplied type declarations. Typing Haskell in Haskell has them on variable bindings, but not on expressions or patterns. (Expression type signatures were part of Haskell 98, but I don't think pattern type signatures were.) The tutorial posted here a few days ago doesn't have them either.
So it wouldn't be too surprising if there's a thing I just fundamentally haven't implemented yet, or an idea I'm missing, or something like that. I can probably figure something out eventually by myself, but... if anyone knows any papers that cover this, or example implementations, or something, that would be helpful.
Resource Post
A lot of work has gone into type inference over the decades, and I believe the sweet-spot of Hindley-Milner (Damas-Milner?) type inference has been made even sweeter with some advances that I'm 95% sure play nicely together. I find myself motivated to share those resources here, since they've been incredibly helpful in designing and implementing my own language, and I'll include why I believe these extensions are worthwhile. A lot of modern languages have at least one or two, sometimes more.
Maybe this can start a small discussion about the state of type inference, but I'm also posting this as a collection for posterity. Y'know, in case anyone else is as possessed by principal types as I am and wants a path to get started adding advanced type inference in their own language.
Papers I rely on:
Hi,
I'm creating a toy programming language using Haskell. I adapted Hindley-Milner to my PL as described here (I used the first solution, not the Constraint Generation method). But in contrast to the language presented in the link, my PL has custom data types with parametric polymorphism. If I were working with a predefined set of data types, I could simply extend the Type data type like this to add a List type (and other types):
data Type
  = TVar TVar
  | TCon String
  | TArr Type Type
  | TList Type -- Added this
  deriving (Show, Eq, Ord)
And I can build upon this. But I have custom data types with an arbitrary number of type variables (they can be limited; I have no problem with that). What is the best strategy to use here? I want to keep things as simple as possible.
So far, I think I need a data type to represent kinds and I need to extend TCon with that data type. Like:
data Kind = Star | KArr Kind Kind -- KArr stands for "kind arrow" but I don't know if that's a real thing.
data TCon = TC String Kind
data Type
  = TVar TVar
  | TCon TCon -- Changed this
  | TArr Type Type
  deriving (Show, Eq, Ord)
So the List type can be represented as TCon (TC "List" (KArr Star Star)). But this does not give me a way to represent, for example, a list of ints. So probably I need to add another constructor to Type to represent this situation, like:
data Type
  = TVar TVar
  | TCon TCon -- Changed this
  | TArr Type Type
  | TTypApply Type Type
  deriving (Show, Eq, Ord)
So that a list of ints becomes TTypApply (TCon (TC "List" (KArr Star Star))) (TCon (TC "Int" Star)). These constructors are probably enough to represent data types with an arbitrary number of type variables.
But I really don't have any idea how I should modify the inference algorithm itself. Type variables with kinds and TTypApply confuse me. What changes do I need to make to the unification process? Are type schemes affected by this?
And it has just occurred to me that I can replace TArr with TCon (TC "->" (KArr Star (KArr Star Star))). So probably I need to generalize the bits of the algorithm that use TArr. But I'm still confused.
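To check my own encoding, here is how I think a concrete arrow type such as Int -> Bool would look, using the Type, TCon, and Kind declarations above (tArrCon, tArr, and the other helper names are just mine, for illustration):

-- Helper names below are my own; the Type, TCon, and Kind declarations are the ones above.
tArrCon :: Type
tArrCon = TCon (TC "->" (KArr Star (KArr Star Star)))   -- the arrow as an ordinary type constructor

tInt, tBool :: Type
tInt  = TCon (TC "Int" Star)
tBool = TCon (TC "Bool" Star)

-- A fully applied arrow is just two nested type applications.
tArr :: Type -> Type -> Type
tArr a b = TTypApply (TTypApply tArrCon a) b

intToBool :: Type
intToBool = tArr tInt tBool   -- represents Int -> Bool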
Any links or ideas regarding this would be greatly appreciated. Thanks. (Sorry for the long post. It kinda helped me to understand my problem better.)
I have read the wikipedia page for Hindley-Milner type inference several times at this point.
Each time I've read it, I've gotten a little bit more out of it. But there is still one section I don't understand at all:
> While the development above sort of misused the monotypes as "open" proof variables, the possibility that proper monotype variables might be harmed was sidestepped by introducing fresh variables and hoping for the best. But there's a catch: One of the promises made was that these fresh variables would be "kept in mind" as such. This promise is not fulfilled by the algorithm.
>
> ... meaning that the algorithm fails to detect all type errors. This omission can easily be fixed by more carefully distinguishing proof variables and monotype variables.
>
> The authors were well aware of the problem but decided not to fix it. One might assume a pragmatic reason behind this. While more properly implementing the type inference would have enabled the algorithm to deal with abstract monotypes, they were not needed for the intended application where none of the items in a preexisting context have free variables. In this light, the unneeded complication was dropped in favor of a simpler algorithm.
I think my confusion comes from not knowing the difference between proof variables and monotype variables, which probably comes from not knowing what proof variables are.
https://en.m.wikipedia.org/wiki/Hindley%E2%80%93Milner_type_system
I am failing at searching today. Does anyone know if there is an extension of Hindley-Milner that can deal with infinite types? Also, is there any significant simplification that can be made for languages that never have free variables?
https://github.com/leosbotelho/hm-def-light
Any feedback is welcome.
Please support.
:)