A list of puns related to "Lexical Functional Grammar"
On p. 18 of his book On Lisp, Paul Graham writes:
...suppose we want to write a function which takes a list of numbers and adds a
certain amount to each one. The function list+
(defun list+ (lst n)
  (mapcar #'(lambda (x) (+ x n))
          lst))
will do what we want:
> (list+ '(1 2 3) 10)
(11 12 13)
If we look closely at the function which is passed to mapcar within list+, it's
actually a closure. The instance of n is free, and its binding comes from the
surrounding environment. Under lexical scope, every such use of a mapping
function causes the creation of a closure.[1]
[1] Under dynamic scope the same idiom will work for a different reason, so long as neither of mapcar's parameters is called x.
The last phrase ("so long as neither of mapcar's parameters is called x") makes no sense to me. It seems like a vacuous condition (akin to "as long as 1 is not equal to 2"); how could "[either] of mapcar's parameters" in this case be "called x"? The names of these parameters are fixed by the definition of list+. And even if one were to redefine list+ (perversely enough) as
(defun list+ (x n)
  (mapcar #'(lambda (x) (+ x n))
          x))
...it would still behave as it did before.
If someone were kind enough to make sense of Graham's comment for me, I'd be most grateful.
EDIT: I just learned of an Errata page for On Lisp, which includes the following:
> p. 18. In the footnote, x should be n, ...
🤦🤦🤦
I'm writing my BA thesis. I want to analyze some titles on a video platform, but they involve many noun phrases and interrogative sentences, which puzzle me a lot. Many books about SFL use declarative sentences as examples.
I don't know whether this belongs among beginner's questions, because I have only read some books on Halliday's theory and stepped into this new area four weeks ago QAQ. I would appreciate it if you could recommend any relevant papers, books, or videos.
I have a strong memory of Adam spending part of one of his videos talking about lexical grammar, but I can't put my finger on which video it is, and honestly I don't have the patience to try and brute-force find it. Would any of you happen to know which video it is he mentions it in?
An example of what I mean would be something like a language that marks tense with tone or plurality with voicedness but never distinguishes words by contrasting those phonetic features.
function droids(arr) {
  let result = ''; // this variable is in the scope of the first function, but not in the scope of the second function, so it has a different lexical scope, correct?
  function iterate(item) {
    if (item === 'Droids') {
      result = "Found Droids!";
    } else {
      result = "These are not the droids you're looking for.";
    }
  }
  arr.forEach(iterate); // calling the second function, which exists within the lexical scope of the first function, to loop through a globally scoped array passed as an argument to the first function
  return result;
}
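For comparison, here is a minimal sketch of the rule at play (the function and variable names are made up for illustration): an inner function can read and assign names declared in its enclosing function, because name lookup walks outward through the lexical scopes, while names declared inside the inner function stay invisible to the outer one.

```javascript
function outer() {
  let result = '';               // declared in outer's scope
  function inner() {
    result = 'set by inner';     // resolved lexically: found in outer's scope
    let hidden = 'only inner';   // declared here, invisible to outer
    return hidden;
  }
  inner();
  // console.log(hidden);        // would throw ReferenceError: hidden is not defined
  return result;
}

console.log(outer()); // "set by inner"
```

So the variable is in a *different* scope than the inner function's own, but it is still *visible* from the inner function, which is exactly what lets `iterate` update `result`.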
Hello, everyone. I'm looking for a bit of help.
In the sentence "The government has declined to say how much has been spent on the new airport", is the process material, mental, or verbal?
My current thinking is that 'declined' is the process of this clause and 'to say...' is the start of an embedded clause. If that is the case, is 'declined' a material, mental, or verbal process? I don't think it fits neatly into any of them.
If, however, 'declined to say' is a verbal group complex, it would appear to be a verbal process, and the remainder of the sentence (how much has been spent on the new airport) would be the Verbiage.
I have been doing quite a bit of reading on the topic, but if you have any references that you feel might be of benefit, please do let me know.
Thank you.
Broadly speaking, I'm interested in entrenchment of root + inflection combinations. But I'd also like to know what people may or may not have said about entrenchment of lexical + functional morphemic / syllabemic combinations in languages like Vietnamese, Mandarin, etc.
For example, the Vietnamese string bαΊ‘n ΔΓ£ biαΊΏt[?], roughly "you + PST + know", occurs very often, with various appropriate translations: did you know, you knew, you had known, etc. I would be particularly interested in whether combinations of the ΔΓ£ biαΊΏt type might require higher frequency thresholds before entrenchment occurs than equivalent constructions in moderately and highly inflecting languages, due to the higher exponential workload they bear. (Since all this would ultimately be relevant to the words-and-rules debate, we would probably be more interested in mid- or low-frequency roots / inflected forms than high-frequency roots / inflected forms like know/knew, savoir/savait, etc.) I can expand if necessary, but hopefully those in the know have seen enough to weigh in. Thanks!
I am so excited that there is a name for something I experience! Ever since I can remember, every word I say, and sometimes hear, has a very distinct taste/texture in my mouth and in parts of my tongue. I avoid saying certain words because of the overwhelming taste or texture that they produce. I didn't realize that lexical-gustatory synesthesia is so rare, and as a kid I thought everyone else could taste what they said too. I'm just really excited to know I'm not crazy!
What is Transitivity, with regard to Systemic Functional Grammar?
Also, if possible, could you give some examples from the English language?
I'm reading The Language Instinct by Steven Pinker. He writes:
>The particular ways that languages do form questions [moving the auxiliary to the front of the sentence] are arbitrary, species-wide conventions.
Could it not be arbitrary, but a common functionality? For instance (but not limited to) the listener benefiting from knowing with the first word of a sentence whether it is a question or a statement.
>The universal plan underlying languages, with auxiliaries and inversion rules, nouns and verbs, subjects and objects, phrases and clauses, case and agreement, and so on, seems to suggest a commonality in the brains of speakers ...
Or a commonality in function...?
>...because many other plans would have been just as useful.
Are they really?
>It is as if isolated inventors miraculously came up with identical standards for typewriter keyboards or Morse code or traffic signals.
I am pretty sure certain tools have been invented by different cultures separately. See Multiple Discovery.
I'm quite convinced language, tool use, and plenty of other things are innate to humans, or human nature in a sense. I am, however, not convinced there has to be a "mental organ, a neural system, and a computational module" for language to explain the finding above...
Not to start a discussion whether Universal Grammar Theory is correct (even though I'm still happy about any input!). My main point is: Does my criticism make sense? Couldn't there be a common functionality in explaining overlapping grammar?
Some of the functions I'm developing have so many setqs in them, and I wonder whether that is proper.
I find having to structure the code so that all variables are bound in let forms rather awkward, so I simply use setqs and try not to use those variables in other programs.
Will setting -*- lexical-binding: t -*- make the setqs apply only within the file?
Context of Situation is about Tenor, Field and Mode.
Then what is context of culture about?
I thought I was starting to understand arrow functions and the this keyword. I keep coming back to the this keyword in JS, and it's very demotivating.
Anyways, here is what I wrote to test myself:
const parent = {
name: "Dad",
child: {
name: "Daughter",
getName: () => this.name,
},
};
parent.child.getName() // undefined
I could have sworn it would've printed 'Dad'. I thought the arrow function looked in the lexical scope and would've found name on the parent object and logged it to the console.
Where is my mental model incorrect? I'm thinking this is out of scope (pun not intended...) of the arrow-function this rules. It feels deeper than that.
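A minimal sketch of the relevant rule, assuming the object is defined at the top level of a Node script (the `getNameRegular` method and the guard on `this` are additions for contrast and safety, not part of the original snippet): an arrow function resolves `this` lexically, but it does not look up property names like `name` lexically, and an object literal does not create a scope of its own. So the enclosing scope here is the module/global one, not `parent`.

```javascript
// Arrow functions have no `this` of their own; they close over the
// `this` of the scope where they are WRITTEN. An object literal is
// not a scope, so that enclosing scope is the module/global one.
const parent = {
  name: 'Dad',
  child: {
    name: 'Daughter',
    // `this` is captured from the surrounding module/global scope,
    // never from `parent` or `child`. (Guard added in case `this`
    // is undefined, as at the top level of a strict-mode module.)
    getNameArrow: () => (this ? this.name : undefined),
    // A regular method gets `this` from the call site: parent.child.
    getNameRegular() {
      return this.name;
    },
  },
};

console.log(parent.child.getNameArrow());   // undefined (in Node; in a browser script `this` would be window)
console.log(parent.child.getNameRegular()); // "Daughter"
```

In other words, the arrow never "searches" enclosing objects for `name`; it only inherits whatever `this` already was where the literal appears, which is why neither 'Dad' nor 'Daughter' comes back.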
Thank you so much. I've been staring at this for what feels like forever, and I would really appreciate some clarity as to what is going on.
I was watching this talk. In it he explains the importance of running the lexer during initialisation. There is a slide at 34:25 saying "Can't run a goroutine to completion during initialisation".
I watched this part multiple times, but I am still not able to understand it. Can anyone explain?