generative grammar, innateness, &c.

For the discussion of language mechanics, grammar, vocabulary, trends, and other such linguistic topics, in English and other languages.


sdedeo
Posts: 36
Joined: Tue Apr 08, 2008 10:52 pm UTC
Contact:

generative grammar, innateness, &c.

Postby sdedeo » Wed Apr 09, 2008 4:41 am UTC

OK, so I did indeed search for "generative" and "generative grammar" on the boards, but couldn't find anything?

As I understand it, the "cocktail party" story starts with the Poverty of Stimulus argument; children learning a language have a finite number of instances to work from, but can produce an infinitude of grammatically correct sentences. Somehow, that finite set "triggers" some internal patterns in the brain that we can use to keep going. Further, however, it's not like our parents sit us down and explain how pronouns work -- we seem able to figure out those rules on our own.
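The finite-input/infinite-output point can be made concrete with a toy grammar. This is a minimal sketch (the rules and vocabulary are invented for illustration, not taken from any real acquisition model): a handful of rules, one of them recursive, already generates unboundedly many grammatical sentences.

```python
import random

# A tiny context-free grammar. The recursive S -> S "and" S rule means
# that finitely many rules yield infinitely many sentences.
GRAMMAR = {
    "S":  [["NP", "VP"], ["S", "and", "S"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"]],
    "N":  [["dog"], ["cat"], ["child"]],
    "V":  [["sees"], ["chases"]],
}

def generate(symbol="S", depth=0):
    """Expand a symbol into a list of words (depth-limited so the demo halts)."""
    if symbol not in GRAMMAR:          # terminal word
        return [symbol]
    options = GRAMMAR[symbol]
    if depth > 3:                      # past depth 3, drop the recursive option
        options = [o for o in options if "S" not in o] or options[:1]
    expansion = random.choice(options)
    return [w for part in expansion for w in generate(part, depth + 1)]

print(" ".join(generate()))
```

Every run prints a different grammatical sentence, even though the grammar itself is five rules long.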

And, further, we tend to settle on a small class of grammatical rules from the large set that would be compatible with the finite set of instances. Very often, we pick just the one rule that everyone else seems to pick, even if that rule is kind of "odd" -- not necessarily the simplest, for example. It's the equivalent of seeing "1, 2, 4, 8, 16" and saying the next in the series is 17 along with everybody else.
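The sequence example can be made precise: two perfectly reasonable rules agree on 1, 2, 4, 8, 16 and then diverge, so the data alone underdetermine the rule. A small sketch (using doubling versus lowest-degree polynomial extrapolation as the two candidate rules; the "17" above is just a whimsical third option):

```python
def next_by_doubling(seq):
    """Rule A: each term is twice the previous one."""
    return seq[-1] * 2

def next_by_differences(seq):
    """Rule B: continue the lowest-degree polynomial through the data,
    via Newton's forward differences."""
    rows = [list(seq)]
    while any(rows[-1]):               # build difference rows until all zero
        row = rows[-1]
        rows.append([b - a for a, b in zip(row, row[1:])])
    # the next term is the sum of the last entry of each row
    return sum(row[-1] for row in rows if row)

data = [1, 2, 4, 8, 16]
print(next_by_doubling(data), next_by_differences(data))  # 32 31
```

Both rules fit the five observed terms exactly, yet one predicts 32 and the other 31; a learner who sees only the data has no way to choose between them.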

OK, even further, totally disconnected language trees seem to have the same rules. Not necessarily on the "surface" level -- i.e., the particular way you form questions out of statements in French and English is not isomorphic -- but if you go "deep" enough, abstract out various structures that can have competing implementations, it turns out there is an isomorphism. This is the "surface grammar" / "depth grammar" difference. How you get from the depth to the surface may vary between languages, of course (this movement is called a "transformation").
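As a caricature of the surface/depth split: imagine one "deep" representation of a yes/no question, and per-language transformations that realize it on the surface. The two transformations below (auxiliary fronting for English, a clause-final question particle loosely modeled on Japanese "ka") are deliberately cartoonish sketches, not real linguistic analyses.

```python
# Deep representation: a questioned clause with labeled parts.
deep = {"force": "question", "subject": "you", "aux": "are", "pred": "happy"}

def surface_english(d):
    """English-style transformation: front the auxiliary."""
    words = [d["aux"], d["subject"], d["pred"]]
    return " ".join(words).capitalize() + "?"

def surface_particle(d):
    """Particle-style transformation: keep clause order, append a
    question particle."""
    words = [d["subject"], d["aux"], d["pred"], "ka"]
    return " ".join(words).capitalize() + "?"

print(surface_english(deep))   # Are you happy?
print(surface_particle(deep))  # You are happy ka?
```

The point of the caricature: the two surface strings look nothing alike, but they are produced from the identical deep object by different transformations.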

[I have heard it said that "depth grammar" is purely semantic -- but this cannot be the case because then the isomorphism would be trivial (i.e., if you are trying to show that two languages have deep similarities in how they express the same thought, the similarities must be in how the two thoughts are contingently represented.)]

So the conclusion is that this depth grammar (1) guides us in the construction of grammatically OK sentences, and (2) is shared by huge communities of people who not only are exposed to different "training sets", but indeed training sets from different languages. Then you say something like, oh man, that's so crazy, the only way it could happen is if there was a notion of "innate language," if the structure of the human mind -- and not simply contingent cultural forces -- put restrictions on the nature of language.

Did I get any of that right? Also, I have a friend who studied linguistics but she is possibly traumatized by either the study of it, or the fact that I care, and while she will neither confirm nor refute my story above, will say that all of this is massively out of fashion and that nobody believes the innateness story?

User avatar
targetpractice
Posts: 25
Joined: Wed Jan 23, 2008 7:26 pm UTC
Location: Ann Arbors
Contact:

Re: generative grammar, innateness, &c.

Postby targetpractice » Wed Apr 09, 2008 12:58 pm UTC

I think the reason you haven't gotten any replies is that you've pretty much summarised the situation. The idea of generative grammar is that we are born with an innate grammar set. The depth grammar is composed of many "hidden" components that, depending on the surface grammar, become visible or remain hidden. These appear in a multitude of ways, from languages that set up their grammar by word order to others that use markers to harken back to hidden sets by means of conjugated endings and other such tools.

I like to think of surface grammar as a big Rube Goldberg device: no matter how complicated everything is, it uses and reflects a very simple set of motions - say, a swinging hammer pressing a button. Chinese is a good language for transparent surface/depth grammar correlations: it puts pretty much every word in where there's a depth grammar point and is restricted by word order - something syntax loves. Languages like Russian or Latin, however, can be more complex, with each word on the surface carrying hidden markers that tell you what it refers to and where it should be put in the depth grammar.

I don't think the concept is that difficult to imagine - birds and other animals are born with songs; you can keep one in a cage for its whole life and it'll still sing similar (probably not identical) songs to its mates. Human language is so complex that it would be better just to genetically set a base system that can use any set of sounds. The mockingbird has something similar to this: all it does is copy sounds it hears, but at the end of the day it combines all those random sounds into one mating song, and it knows it's supposed to do that by genetics alone. I wish I could be more specific about the syntax, but it's been a few years and I've forgotten much of the specifics.

As to the "truth" of this setup - your friend, as far as I know, is on the wrong side of the tracks if she says this is "out of style" and "nobody believes it." Last I heard, every syntax class was teaching Chomskyan syntax, which is generative grammar, which is "innate grammar." Every other option besides generative grammar is frighteningly complicated and strange, and also doesn't always work - unlike Chomsky's. Linguists love Occam's razor, and Occam probably would have vomited at the Poverty of Stimulus argument without generative grammar to explain how it's possible. I'm a firm "believer" in generative grammar - whether or not Chomsky's setup happens to be the exact right way to go about it doesn't matter; it makes sense biologically and sociologically.
Science is facts; just as houses are made of stones, so is science made of facts; but a pile of stones is not a house and a collection of facts is not necessarily science.
- Henri Poincare

zenten
Posts: 3799
Joined: Fri Jun 22, 2007 7:42 am UTC
Location: Ottawa, Canada

Re: generative grammar, innateness, &c.

Postby zenten » Wed Apr 09, 2008 3:27 pm UTC

sdedeo wrote:OK, so I did indeed search for "generative" and "generative grammar" on the boards, but couldn't find anything?

As I understand it, the "cocktail party" story starts with the Poverty of Stimulus argument; children learning a language have a finite number of instances to work from, but can produce an infinitude of grammatically correct sentences. Somehow, that finite set "triggers" some internal patterns in the brain that we can use to keep going. Further, however, it's not like our parents sit us down and explain how pronouns work -- we seem able to figure out those rules on our own.

And, further, we tend to settle on a small class of grammatical rules from the large set that would be compatible with the finite set of instances. Very often, we pick just the one rule that everyone else seems to pick, even if that rule is kind of "odd" -- not necessarily the simplest, for example. It's the equivalent of seeing "1, 2, 4, 8, 16" and saying the next in the series is 17 along with everybody else.



I'm not sure on the rest of what you wrote, but I have to disagree on these points.

It's not like everyone just happens to say 17, it's that you say 17 after hearing many people say 17 at the end of that sequence thousands of times.

Sure kids are learning rules instead of specific examples, but the principle still holds. Just look at what happens to children who are raised without anyone talking for their formative years.

User avatar
targetpractice
Posts: 25
Joined: Wed Jan 23, 2008 7:26 pm UTC
Location: Ann Arbors
Contact:

Re: generative grammar, innateness, &c.

Postby targetpractice » Wed Apr 09, 2008 4:00 pm UTC

zenten wrote:

I'm not sure on the rest of what you wrote, but I have to disagree on these points.

It's not like everyone just happens to say 17, it's that you say 17 after hearing many people say 17 at the end of that sequence thousands of times.

Sure kids are learning rules instead of specific examples, but the principle still holds. Just look at what happens to children who are raised without anyone talking for their formative years.

The kids learn a surface grammar - they don't know how to express depth grammar without the taught phonetics and style. Children are inventors of language - in Nicaragua there's a group of deaf kids who made their own sign language, and the younger the kids get, the more "fluent" the language becomes - rather, it's evolving to become a full language that meets the requirements of the depth grammar. Pidgins, evolving into creoles, are filled out by children who apparently "need" certain bits of information, such as a full set of personal pronouns or future and past tenses. Kids who grow up without people to talk to won't make up their own language because they don't need one. Given any opportunity to make something up, they will, and as soon as they are exposed to language, they try to learn it - it's just that their language centers have atrophied from lack of use during formative years - the same way it's hard for older people to learn new languages, unless they already know a whole lot of them.

zenten
Posts: 3799
Joined: Fri Jun 22, 2007 7:42 am UTC
Location: Ottawa, Canada

Re: generative grammar, innateness, &c.

Postby zenten » Wed Apr 09, 2008 5:05 pm UTC

targetpractice wrote:
zenten wrote:
I'm not sure on the rest of what you wrote, but I have to disagree on these points.

It's not like everyone just happens to say 17, it's that you say 17 after hearing many people say 17 at the end of that sequence thousands of times.

Sure kids are learning rules instead of specific examples, but the principle still holds. Just look at what happens to children who are raised without anyone talking for their formative years.


The kids learn a surface grammar - they don't know how to express depth grammar without the taught phonetics and style. Children are inventors of language - in Nicaragua there's a group of deaf kids who made their own sign language, and the younger the kids get, the more "fluent" the language becomes - rather, it's evolving to become a full language that meets the requirements of the depth grammar. Pidgins, evolving into creoles, are filled out by children who apparently "need" certain bits of information, such as a full set of personal pronouns or future and past tenses. Kids who grow up without people to talk to won't make up their own language because they don't need one. Given any opportunity to make something up, they will, and as soon as they are exposed to language, they try to learn it - it's just that their language centers have atrophied from lack of use during formative years - the same way it's hard for older people to learn new languages, unless they already know a whole lot of them.


Ok, I think I just don't understand this "surface grammar"/"depth grammar" distinction. Could you give some examples?

sdedeo
Posts: 36
Joined: Tue Apr 08, 2008 10:52 pm UTC
Contact:

Re: generative grammar, innateness, &c.

Postby sdedeo » Wed Apr 09, 2008 5:07 pm UTC

zenten wrote:It's not like everyone just happens to say 17, it's that you say 17 after hearing many people say 17 at the end of that sequence thousands of times.


OK, so I know for a fact that's incorrect, at least, I know it as a wiki-fact -- see http://en.wikipedia.org/wiki/Poverty_of_stimulus#Evidence_for_the_argument. We understand that:

(1) Anyone who is interested can see me later. --> *Is anyone who interested can see me later?

is incorrect even though it follows one of the possible rules you can derive from the simpler:

(2) You are happy. --> Are you happy?

However, it's implausible that we heard the question form of (1) thousands of times and simply memorized the correct answer, "Can anyone who is interested see me later?"
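The contrast between (1) and (2) can be simulated directly. Below is a sketch (the clause span is marked by hand rather than found by a real parser): the linear rule "front the first auxiliary" produces exactly the starred sentence, while a structure-sensitive rule that skips the relative clause gets it right.

```python
AUX = {"is", "are", "can"}

def front_first_aux(words):
    """Linear rule: move the first auxiliary to the front."""
    i = next(i for i, w in enumerate(words) if w in AUX)
    return [words[i]] + words[:i] + words[i + 1:]

def front_main_aux(words, relative_clause):
    """Structure-sensitive rule: move the first auxiliary that is
    NOT inside the (pre-marked) relative-clause span."""
    lo, hi = relative_clause
    i = next(i for i, w in enumerate(words)
             if w in AUX and not (lo <= i < hi))
    return [words[i]] + words[:i] + words[i + 1:]

s = "anyone who is interested can see me later".split()
# "who is interested" occupies indices 1..3
print(" ".join(front_first_aux(s)))         # is anyone who interested can see me later
print(" ".join(front_main_aux(s, (1, 4))))  # can anyone who is interested see me later
```

Both rules are consistent with simple data like (2); only richer (or innate) information about structure distinguishes them, which is the Poverty of Stimulus point.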

zenten
Posts: 3799
Joined: Fri Jun 22, 2007 7:42 am UTC
Location: Ottawa, Canada

Re: generative grammar, innateness, &c.

Postby zenten » Wed Apr 09, 2008 5:12 pm UTC

sdedeo wrote:
zenten wrote:It's not like everyone just happens to say 17, it's that you say 17 after hearing many people say 17 at the end of that sequence thousands of times.


OK, so I know for a fact that's incorrect, at least, I know it as a wiki-fact -- see http://en.wikipedia.org/wiki/Poverty_of_stimulus#Evidence_for_the_argument. We understand that:

(1) Anyone who is interested can see me later. --> *Is anyone who interested can see me later?

is incorrect even though it follows one of the possible rules you can derive from the simpler:

(2) You are happy. --> Are you happy?

However, it's implausible that we heard the question form of (1) thousands of times and simply memorized the correct answer, "Can anyone who is interested see me later?"


Well, no, not just from hearing.

But young children and people learning new languages make all sorts of mistakes, which they are then corrected on. They learn from that sort of thing as well.

Also, that bit you cited is really lacking in the NPOV, but that's a side issue.
Last edited by zenten on Wed Apr 09, 2008 5:14 pm UTC, edited 1 time in total.

sdedeo
Posts: 36
Joined: Tue Apr 08, 2008 10:52 pm UTC
Contact:

Re: generative grammar, innateness, &c.

Postby sdedeo » Wed Apr 09, 2008 5:13 pm UTC

targetpractice, I take it you have more understanding than me on this issue, so let me ask you a question.

1. you have the depth grammar, OK. This is a way to represent thoughts. The way the depth grammar works -- e.g., how it combines two thoughts in a conjunction, or how it expresses relations between thoughts of different kinds, how it negates or whatever -- this is thought to be "universal" and innate.

2. you have the transformational rules, how to get from the depth grammar to the surface grammar you see on the page. Are these transformational rules entirely arbitrary? Or are there patterns (above and beyond those that would arise from shared ancestry perhaps) that persist across languages?

Also, can you recommend a book on this? I think it would be helpful for me to see some explicit examples and how the similarities arise. I tried reading Chomsky's Language and Mind a while ago, but I got stuck because it's a shade "not general" enough (there was some linguistics background required).

User avatar
targetpractice
Posts: 25
Joined: Wed Jan 23, 2008 7:26 pm UTC
Location: Ann Arbors
Contact:

Re: generative grammar, innateness, &c.

Postby targetpractice » Thu Apr 10, 2008 9:23 am UTC

The transformational rules are what cause the surface grammar, so they're all different based on the language. This is why children make mistakes. They're learning the transformational rules, which have to be taught, but some are easier to learn than others. To clarify, the transformational rules are not unique in and of themselves on a language-to-language basis, but it is the set of rules that differs per language. It is possible that the transformational rules themselves are also genetically ingrained, and children make mistakes because they are trying to sort out which ones exactly to use, rather than learning the rules slowly but surely. This would make sense, because children rather quickly figure out very basic things, such as whether their language is SVO, VSO, OSV, etc. These are transformational rules, but they learn them very quickly.

So these rules are not arbitrary, but it's not entirely clear right now how they evolve, are selected, or mutate. They follow language families. For instance, the hillbilly slang "That there <noun>" seems redundant, but the depth grammar actually requires both of those sets to have some sort of reference, because it's technically a conjunction. It's perfectly correct, but modern English doesn't use it. However, you can see the exact same thing happen in Bavarian slang. I can't reproduce it here, but it and the double "the" are fairly famous examples betwixt German and English.

As for books, I can't think of anything off the top of my head that would be a good lay read, but I'll ask around and try to find something. Honestly, a good syntax textbook may be all you really need - that's depth grammar. Also, a book about child language acquisition would be good to look for - you don't need to go the whole philosophical route. I'll try to find something.
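The word-order point can be sketched as a single parameter: the same subject/verb/object triple, linearized differently per language setting. (The function below is a toy parameterization for illustration, not an actual transformational rule.)

```python
def linearize(subject, verb, obj, order):
    """Lay out a clause according to a word-order parameter such as
    'SVO' or 'VSO'."""
    slots = {"S": subject, "V": verb, "O": obj}
    return " ".join(slots[role] for role in order)

clause = ("the child", "sees", "the dog")
for order in ["SVO", "VSO", "SOV"]:
    print(order, "->", linearize(*clause, order))
```

A child fixing this one parameter from a few examples explains why basic order is acquired so fast: it's a single three-way-ish choice, not a pile of independent rules.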

zenten
Posts: 3799
Joined: Fri Jun 22, 2007 7:42 am UTC
Location: Ottawa, Canada

Re: generative grammar, innateness, &c.

Postby zenten » Thu Apr 10, 2008 12:06 pm UTC

targetpractice wrote:The transformational rules are what cause the surface grammar, so they're all different based on the language. This is why children make mistakes. They're learning the transformational rules, which have to be taught, but some are easier to learn than others. To clarify, the transformational rules are not unique in and of themselves on a language-to-language basis, but it is the set of rules that differs per language. It is possible that the transformational rules themselves are also genetically ingrained, and children make mistakes because they are trying to sort out which ones exactly to use, rather than learning the rules slowly but surely. This would make sense, because children rather quickly figure out very basic things, such as whether their language is SVO, VSO, OSV, etc. These are transformational rules, but they learn them very quickly.

So these rules are not arbitrary, but it's not entirely clear right now how they evolve, are selected, or mutate. They follow language families. For instance, the hillbilly slang "That there <noun>" seems redundant, but the depth grammar actually requires both of those sets to have some sort of reference, because it's technically a conjunction. It's perfectly correct, but modern English doesn't use it. However, you can see the exact same thing happen in Bavarian slang. I can't reproduce it here, but it and the double "the" are fairly famous examples betwixt German and English.

As for books, I can't think of anything off the top of my head that would be a good lay read, but I'll ask around and try to find something. Honestly, a good syntax textbook may be all you really need - that's depth grammar. Also, a book about child language acquisition would be good to look for - you don't need to go the whole philosophical route. I'll try to find something.


What sorts of grammars could there be that would be just as usable as the real ones, but don't exist in any language?

User avatar
targetpractice
Posts: 25
Joined: Wed Jan 23, 2008 7:26 pm UTC
Location: Ann Arbors
Contact:

Re: generative grammar, innateness, &c.

Postby targetpractice » Thu Apr 10, 2008 6:34 pm UTC

zenten wrote:What sorts of grammars could there be that would be just as usable as the real ones, but don't exist in any language?


That depends on whether you're asking about depth or surface grammars. Depth grammars aren't very easy to change - there's really only one difference right now in the base syntax that we can see, and that's right- or left-headedness - Japanese is left headed; almost all the rest of us are right headed. Surface grammars can be of any kind - I don't have a full list of what kinds of grammars we use, but it's likely that we aren't using all the possible kinds. We use about a third of the sounds that the human voice is capable of making and the human ear is capable of hearing, so clearly we haven't reached our full linguistic potential - although it may not be necessary.

Honestly, I'm having a hard time thinking of some set of grammar that doesn't exist right now. We have plenty of polysynthetic languages where full sentences are crammed into one word, and isolating languages like Chinese (and English, sort of - it's in the middle, advancing towards isolating) where almost every depth grammar set is produced in the surface grammar. The world runs the gamut of grammatical oddities. The other problem is that our grammar doesn't contain all the information necessary - it provides the structure for meaning to be produced by phrases, but not single words or unspoken knowledge. Semantics takes care of the first one, and mostly the second, but the rest can be covered by visual cues - by the way, kids can learn sign language much earlier than spoken language. "Baby signing" is a rather popular yuppie educational technique.

I can imagine that there could be some sort of grammar that didn't need any individual semantic meaning per word, but instead used grammatical constructs to convey ALL meaning - something we don't do. But our language isn't really capable of that - maybe telepathic robot aliens can manage that shit, but we cannot. Perhaps structures that were capable of conveying two tenses at the exact same moment? Not using conjunctions, but in fact capable of having two tense markers per phrase? That just sounds like imperfective and perfective, however. It's a good question, but sort of an impossible one to answer, since I don't know every single possible and used human grammar. All of the sentence orders are used, although some are much rarer than others - OVS, VSO, VOS are some of the lesser used ones. Maybe we'll figure that shit out when we find out what those damn dolphins, whales and Humboldt squid are saying.

zenten
Posts: 3799
Joined: Fri Jun 22, 2007 7:42 am UTC
Location: Ottawa, Canada

Re: generative grammar, innateness, &c.

Postby zenten » Thu Apr 10, 2008 8:08 pm UTC

targetpractice wrote:
What sorts of grammars could there be that would be just as usable as the real ones, but don't exist in any language?


That depends on whether you're asking about depth or surface grammars. Depth grammars aren't very easy to change - there's really only one difference right now in the base syntax that we can see, and that's right- or left-headedness - Japanese is left headed; almost all the rest of us are right headed. Surface grammars can be of any kind - I don't have a full list of what kinds of grammars we use, but it's likely that we aren't using all the possible kinds. We use about a third of the sounds that the human voice is capable of making and the human ear is capable of hearing, so clearly we haven't reached our full linguistic potential - although it may not be necessary.

Honestly, I'm having a hard time thinking of some set of grammar that doesn't exist right now. We have plenty of polysynthetic languages where full sentences are crammed into one word, and isolating languages like Chinese (and English, sort of - it's in the middle, advancing towards isolating) where almost every depth grammar set is produced in the surface grammar. The world runs the gamut of grammatical oddities. The other problem is that our grammar doesn't contain all the information necessary - it provides the structure for meaning to be produced by phrases, but not single words or unspoken knowledge. Semantics takes care of the first one, and mostly the second, but the rest can be covered by visual cues - by the way, kids can learn sign language much earlier than spoken language. "Baby signing" is a rather popular yuppie educational technique.

I can imagine that there could be some sort of grammar that didn't need any individual semantic meaning per word, but instead used grammatical constructs to convey ALL meaning - something we don't do. But our language isn't really capable of that - maybe telepathic robot aliens can manage that shit, but we cannot. Perhaps structures that were capable of conveying two tenses at the exact same moment? Not using conjunctions, but in fact capable of having two tense markers per phrase? That just sounds like imperfective and perfective, however. It's a good question, but sort of an impossible one to answer, since I don't know every single possible and used human grammar. All of the sentence orders are used, although some are much rarer than others - OVS, VSO, VOS are some of the lesser used ones. Maybe we'll figure that shit out when we find out what those damn dolphins, whales and Humboldt squid are saying.


Well, on the baby sign thing, that's more about the difficulty in making most sounds for infants due to physical differences in their throat and whatnot.

As to the rest, part of making your case that depth grammar is inborn would be to come up with depth grammars that can be used to communicate, but humans don't use because they're not wired that way.

sdedeo
Posts: 36
Joined: Tue Apr 08, 2008 10:52 pm UTC
Contact:

Re: generative grammar, innateness, &c.

Postby sdedeo » Thu Apr 10, 2008 8:44 pm UTC

targetpractice wrote:Depth grammars aren't very easy to change - there's really only one difference right now in the base syntax that we can see, and that's right- or left-headedness - Japanese is left headed; almost all the rest of us are right headed.


This remark confused me. Surely, for the innate hypothesis to be non-trivial, there must be a huge range of possible depth grammars, only a tiny fraction of which appear in human languages. But you seem to be saying here that the "analytic", prior constraints on depth grammars are strong enough that you could not imagine any other form they might take than the ones they do?

User avatar
targetpractice
Posts: 25
Joined: Wed Jan 23, 2008 7:26 pm UTC
Location: Ann Arbors
Contact:

Re: generative grammar, innateness, &c.

Postby targetpractice » Fri Apr 11, 2008 2:41 am UTC

sdedeo wrote:This remark confused me. Surely, for the innate hypothesis to be non-trivial, there must be a huge range of possible depth grammars, only a tiny fraction of which appear in human languages. But you seem to be saying here that the "analytic", prior constraints on depth grammars are strong enough that you could not imagine any other form they might take than the ones they do?

Oh, I'm sorry - that's very true. There are a massive number of options that don't show up because the connections we make don't actually work that way. To apologise, I wrote that response at 2:30 in the morning, drunk.

Thinking about it more clearly now, there are plenty of depth grammar options. For example, right now you have to have all sorts of sets - CP, NP, VP, etc. - and some can only contain some others. For example, NPs can only lead to NPs, CPs, APs, and APs can only lead to NPs or other APs, but we could imagine a world where an N'-NP'-N string, which technically is a dead end, somehow allows N'-NP'-N-VP, making a rapid and direct association with a noun and its verb. There are untold options available that human grammar just doesn't do. Honestly, there's probably more possible variation at the depth level than there is at the surface level. I believe what I was thinking about is the only amount of variation that we can perform physically. Right now, it's impossible to do what I mentioned before - your mind just doesn't do it. To make such drastic changes to our depth grammar, we literally have to evolve physically; our brains have to be rewired to accept these different ways. Don't be deceived by simple sentences like "Susie loves John" being something as basic as just NP-VP-NP.

I'm a little confused as to how to put images up here without just making them an attachment, so I can't just draw one and put it up (yes, I know, bunch of noob crap). If you need any of these terms to be defined, let me know; otherwise, wikipedia should have fairly good answers - linguists are notorious sticklers on wikipedia.
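The idea that each phrase type can only lead to certain others can be written out directly. The expansion table below is a made-up miniature for illustration (not the real X-bar inventory): each phrase type lists the daughter sequences it licenses, and a checker rejects any tree that uses an unlicensed expansion.

```python
# Which daughter sequences each phrase type may take (illustrative only).
ALLOWED = {
    "S":  {("NP", "VP")},
    "NP": {("Det", "N"), ("N",)},
    "VP": {("V", "NP"), ("V",)},
}

def well_formed(node):
    """node = (label, child, ...); a leaf is (POS, word)."""
    label, *children = node
    if len(children) == 1 and isinstance(children[0], str):
        return True                    # lexical leaf like ("N", "Susie")
    daughters = tuple(child[0] for child in children)
    if daughters not in ALLOWED.get(label, set()):
        return False
    return all(well_formed(child) for child in children)

good = ("S", ("NP", ("N", "Susie")),
             ("VP", ("V", "loves"), ("NP", ("N", "John"))))
bad  = ("S", ("VP", ("V", "loves")), ("NP", ("N", "Susie")))  # VP before NP

print(well_formed(good), well_formed(bad))  # True False
```

A different depth grammar would just be a different ALLOWED table; the claim under discussion is that human brains only ever instantiate a narrow family of such tables.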

zenten
Posts: 3799
Joined: Fri Jun 22, 2007 7:42 am UTC
Location: Ottawa, Canada

Re: generative grammar, innateness, &c.

Postby zenten » Fri Apr 11, 2008 3:44 am UTC

targetpractice wrote:Oh, I'm sorry - that's very true. There are a massive number of options that don't show up because the connections we make don't actually work that way. To apologise, I wrote that response at 2:30 in the morning, drunk.

Thinking about it more clearly now, there are plenty of depth grammar options. For example, right now you have to have all sorts of sets - CP, NP, VP, etc. - and some can only contain some others. For example, NPs can only lead to NPs, CPs, APs, and APs can only lead to NPs or other APs, but we could imagine a world where an N'-NP'-N string, which technically is a dead end, somehow allows N'-NP'-N-VP, making a rapid and direct association with a noun and its verb. There are untold options available that human grammar just doesn't do. Honestly, there's probably more possible variation at the depth level than there is at the surface level. I believe what I was thinking about is the only amount of variation that we can perform physically. Right now, it's impossible to do what I mentioned before - your mind just doesn't do it. To make such drastic changes to our depth grammar, we literally have to evolve physically; our brains have to be rewired to accept these different ways. Don't be deceived by simple sentences like "Susie loves John" being something as basic as just NP-VP-NP.

I'm a little confused as to how to put images up here without just making them an attachment, so I can't just draw one and put it up (yes, I know, bunch of noob crap). If you need any of these terms to be defined, let me know; otherwise, wikipedia should have fairly good answers - linguists are notorious sticklers on wikipedia.


Yeah, some explanation might be good.

You can use the image tag (click on the "Img" button to see how) to link to a picture contained on another server, if you don't want to attach it here.

User avatar
targetpractice
Posts: 25
Joined: Wed Jan 23, 2008 7:26 pm UTC
Location: Ann Arbors
Contact:

Re: generative grammar, innateness, &c.

Postby targetpractice » Fri Apr 11, 2008 4:18 am UTC

Ok, here's a good simple sentence

[image: syntax tree of a simple sentence]

There's a few other parts that you don't see here, and this is a very simplified graph - just do a Google image search for "syntax tree" and you'll see what I mean. But this is a good representation of a deep syntax (grammar). The key here is: S - sentence, NP - noun phrase, Det (D) - determiner, N - noun, VP - verb phrase, V - verb. In this tree, whoever made it seems to have simplified further - making APs count as Ds. AP is "adjective phrase" and can be used for either adverbs or adjectives. Technically, the tree would branch VP(V-likes)-AP(A-dry)-NP(N-martinis), but the important thing here is the picture and you guys seeing what I've been working with.

Each phrase has two possible branches - you can do trees with three, but it's really ugly and doesn't seem to work very well. Each type of phrase can branch into other things, as I mentioned in an earlier post. I'm sorry I assumed that most people here would have a basic linguistics education - this is all stuff you learn in Intro. If you guys want, I'll find the most complicated, full deep syntax tree I can for a simple sentence, but really, just Google image search and you'll see what the hell I'm talking about. Computer science people should also be familiar with syntax trees. This image here is what every sentence in any language looks like on a skeletal level. Not, mind you, a molecular level. This isn't the bottom; it's just the framework.
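Since images are awkward to post here, the same kind of tree can be rendered as indented text. A quick sketch, encoding nodes as nested (label, children...) tuples (the encoding is ad hoc, chosen for the demo, not a standard format):

```python
def show(node, indent=0):
    """Print a (label, children...) tuple as an indented tree;
    a leaf is (POS, word)."""
    label, *children = node
    if len(children) == 1 and isinstance(children[0], str):
        print("  " * indent + f"{label} - {children[0]}")
        return
    print("  " * indent + label)
    for child in children:
        show(child, indent + 1)

tree = ("S",
        ("NP", ("N", "Susie")),
        ("VP", ("V", "loves"),
               ("NP", ("N", "John"))))
show(tree)
```

Running it prints S at the top with NP and VP indented beneath it, each phrase bottoming out in its word, which is the same skeleton the image shows.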
Science is facts; just as houses are made of stones, so is science made of facts; but a pile of stones is not a house and a collection of facts is not necessarily science.
- Henri Poincare

jimrandomh
Posts: 110
Joined: Sat Feb 09, 2008 7:03 am UTC
Contact:

Re: generative grammar, innateness, &c.

Postby jimrandomh » Fri Apr 11, 2008 2:05 pm UTC

If generative grammars seem unnatural to you, it may be because they're not the way we actually think. You might want to check out link grammars (http://www.cs.cmu.edu/afs/cs.cmu.edu/project/link/pub/www/papers/ps/tr91-196.pdf). Link grammars, as it turns out, are isomorphic to generative grammars, and to my mind they make a lot more sense as something that babies could learn by hearing. They also connect grammar and semantics in a fairly straightforward way, which generative grammars don't do.
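A rough sketch of the idea: a link-grammar parse is a set of labeled arcs drawn between words, and a parse is well-formed when the arcs connect every word and never cross. The link labels below (S for subject-verb, O for verb-object) are in the spirit of the linked paper, but this checker is an illustration I wrote, not the paper's algorithm.

```python
import itertools

def well_formed(n_words, links):
    """Check the two core link-grammar conditions on a set of arcs
    (left_index, right_index, label): planarity and connectivity."""
    # Planarity: no two arcs may cross when drawn above the sentence.
    for (a, b, _), (c, d, _) in itertools.combinations(links, 2):
        if a < c < b < d or c < a < d < b:
            return False
    # Connectivity: the arcs must join all words into one component.
    adj = {i: set() for i in range(n_words)}
    for a, b, _ in links:
        adj[a].add(b)
        adj[b].add(a)
    seen, stack = set(), [0]
    while stack:
        i = stack.pop()
        if i not in seen:
            seen.add(i)
            stack.extend(adj[i])
    return len(seen) == n_words

# "Susie loves John": Susie -S- loves, loves -O- John.
links = [(0, 1, "S"), (1, 2, "O")]
```

Here `well_formed(3, links)` holds, while two crossing arcs like `[(0, 2, "X"), (1, 3, "Y")]` are rejected - the planarity condition is what keeps the "flailing arms" from latching onto arbitrary distant words.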

targetpractice
Posts: 25
Joined: Wed Jan 23, 2008 7:26 pm UTC
Location: Ann Arbors
Contact:

Re: generative grammar, innateness, &c.

Postby targetpractice » Tue Apr 22, 2008 1:02 am UTC

Sorry for the long wait in the debate/lecture! I had midterms here and also happened to lose my internet. I still haven't gotten around to reading the link grammars paper - I just skimmed it, but it seems rather interesting; I actually hadn't heard of this before. While I sort of like the idea, it gives me visions of meaning-heavy words like "bank" covered with flailing arms desperate to latch onto anything that comes their way - which I suppose would account for ambiguity, but at too high a frequency if those arms just latch onto whatever fitting end comes nearby. I still prefer the skeletal structure, which doesn't seem as haphazard as a bunch of flailing links. Still, it looks very interesting, and I'll give it a chance when I have time to read all 93 pages of the paper :)
Science is facts; just as houses are made of stones, so is science made of facts; but a pile of stones is not a house and a collection of facts is not necessarily science.
- Henri Poincare

Robin S
Posts: 3579
Joined: Wed Jun 27, 2007 7:02 pm UTC
Location: London, UK
Contact:

Re: generative grammar, innateness, &c.

Postby Robin S » Tue Apr 22, 2008 1:21 am UTC

In case you haven't already followed the link from the top section of "Poverty of Stimulus", here's another obligatory wiki link.
This is a placeholder until I think of something more creative to put here.

vorpal
Posts: 9
Joined: Mon Apr 28, 2008 5:47 am UTC
Location: .au
Contact:

Re: generative grammar, innateness, &c.

Postby vorpal » Mon Apr 28, 2008 10:22 am UTC

sdedeo wrote:Also, I have a friend who studied linguistics but she is possibly traumatized by either the study of it, or the fact that I care, and while she will neither confirm nor refute my story above, will say that all of this is massively out of fashion and that nobody believes the innateness story?


Yeah, generative grammar had its time in the sun and is now well and truly being eclipsed by other theories that engage with actual evidence (i.e. what people actually say) rather than just made-up examples (like "colourless green ideas sleep furiously"). Certainly in sociolinguistics, psycholinguistics and language acquisition, generative grammar is at best one paradigm among many, and really not very important in terms of the really interesting research that's been done in the last couple of decades. In 'core' linguistics (syntax, morphology, phonology) generative linguistics is still an important influence, but I think you'd be hard pressed to find many researchers (especially outside the US) who are strong followers of Chomsky anymore. In semantics generativism is even less important, and in pragmatics, contact linguistics (with the exception of Myers-Scotton's work on code-switching), and typology generativism is stone dead.

Some theoretical perspectives that seem to be gaining traction are LFG, role and reference grammar, HPSG, various construction grammars and cognitive grammar (i.e. Langacker et al.). From a not strictly grammatical viewpoint, various forms of discourse analysis are also in the ascendant (though the more syntactic of these approaches, like systemic-functional grammar, seem to be giving way to interactional approaches like conversation analysis).

Also it is important to note that while for a time generative grammar was THE orthodoxy it was really only ever the orthodoxy in North America (and to a lesser extent the UK/Australia/NZ) and in the domains of syntax, morphology and phonology. And even then there were still loud and influential dissenting voices like Pike, Fillmore, Lakoff, etc. as well as others who stayed with good ol' fashioned American structuralism (out of which generativism grew).

Why is this so? I think there are five main reasons:
1. Chomsky takes a very impoverished view of language. Basically he makes the tail wag the dog. He takes grammar (not just syntax, but the 'rules' of language, including phonology) as his starting point and then builds language around that, rather than taking language and trying to see how it is structured. The problem is that there is no reason to think that humans evolved a rule system and then threw meaning and joint social action on top of it. Instead, humans used joint social actions to create meaning, and rule systems evolved to allow for the complexity that was demanded of them. This problem shows up in the poverty-of-stimulus argument. Kids aren't trying to learn a complex rule system; they are trying to engage in joint social actions. They don't have to choose from an infinite set of possible grammatical theories, because the pragmatics of the situation don't allow for an infinite number of interpretations. As Heidegger points out, we start in the midst of things - the bootstrapping problem is created out of a misunderstanding of social interaction.
2. Chomsky doesn't address many issues of interest to the study of language (or even tries to banish their study to outside of linguistics). As a very simple example, generativism (or structuralism, for that matter) falls apart when you start to look at idioms, which is why Fillmore came up with the concept of a construction grammar. Also, his division of language into competence and performance (in which he was following de Saussure - most of Chomsky's ideas are just old ideas rehashed in less clear language) banishes what people actually say to outside of linguistics. This has also unfortunately led to many socio- and psycholinguists disengaging from core linguistics.
3. The arguments that Chomsky bases his theories on are dubious at best. See the previous comments about the poverty of stimulus and the competence/performance divide.
4. There is just so much counter-evidence to what Chomsky argues, which Chomsky refuses to engage with. As an example, LFG (lexical-functional grammar), a major and important theory of grammar, has never even been acknowledged by Chomsky (as of the early 2000s). I doubt he has ever engaged with any other major competing theory either. He just doesn't engage with arguments or evidence from outside of his little empire.
5. Generativism keeps changing. First there was transformational grammar, then principles and parameters, and now minimalism. While each of these shares similarities with the others, they also have some fundamental differences. I think people figure: if generativism's previous theories have been abandoned, why bother learning the new one if it is just going to be rejected in a decade or so? As they say, once bitten, twice shy.

Finally, while generativism is dying, that isn't the same as saying the innateness hypothesis is dying. While generativism is based on the concept of a language acquisition device (a bit of the brain that has the rules for possible grammars already encoded), a theory of innate knowledge of certain aspects of language is shared by many different theories of language acquisition. Having said that, it seems to me that most of the big movers and shakers in the language acquisition debates (such as Snow, Bates and Tomasello) favour an understanding of innate language ability that is related to other, more general cognitive skills (drawing on the work of Piaget on the one hand and various cognitive linguists on the other). This view transcends the nature/nurture divide to some extent, and it seems like that's where people are heading; they are just so sick of the debate not getting anyone anywhere. Of course there are still influential people like Pinker who want to push a relatively strong innate line (though Pinker is from MIT, like Chomsky, and I get the feeling that a lot of people have simply stopped listening to the MIT cognitive scientists).

Disclaimer: I work in interactional and cognitive linguistics, both of which take a very hostile view to most of the fundamental assumptions of generativism, so I may be somewhat biased, though I have tried not to be.
Due to the death of a Pitjantjatjara man named Ngayunya, the singular first person pronoun "ngayu" was tabooed and replaced by "nganku"; a subsequent death made "nganku" taboo, and therefore "ngayu" was revived.

