Malka Rappaport Hovav
Words are sensitive to syntactic context. Argument realization is the study of the relation between argument-taking words, the syntactic contexts they appear in, and the interpretive properties that constrain the relation between them.
Bert Le Bruyn, Henriëtte de Swart, and Joost Zwarts
Bare nominals (also called “bare nouns”) are nominal structures without an overt article or other determiner. The distinction between a bare noun and a noun that is part of a larger nominal structure must be made in context: Milk is a bare nominal in I bought milk, but not in I bought the milk. Bare nouns have a limited distribution: In subject or object position, English allows bare mass nouns and bare plurals, but not bare singular count nouns (*I bought table). Bare singular count nouns only appear in special configurations, such as coordination (I bought table and chairs for £182).
From a semantic perspective, it is noteworthy that bare nouns achieve reference without the support of a determiner. A full noun phrase like the cookies refers to the maximal sum of cookies in the context, because of the definite article the. English bare plurals have two main interpretations: In generic sentences they refer to the kind (Cookies are sweet); in episodic sentences they refer to some exemplars of the kind (Cookies are in the cabinet). Bare nouns typically take narrow scope with respect to other scope-bearing operators like negation.
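The narrow-scope claim can be made explicit with a standard first-order paraphrase (an illustrative sketch; the predicate names and the example sentence are my own, not from the text):

```latex
% "Mary didn't buy cookies" -- the bare plural only scopes under negation:
%   available reading:    NOT > EXISTS
\neg \exists x\, [\mathit{cookies}(x) \wedge \mathit{buy}(m, x)]
%   unavailable reading:  EXISTS > NOT
% \exists x\, [\mathit{cookies}(x) \wedge \neg \mathit{buy}(m, x)]
```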
The typology of bare nouns reveals substantial variation, and bare nouns in languages other than English may have different distributions and meanings. But genericity and narrow scope are recurring features in the cross-linguistic study of bare nominals.
Blocking can be defined as the non-occurrence of some linguistic form, whose existence could be expected on general grounds, due to the existence of a rival form. *Oxes, for example, is blocked by oxen, *stealer by thief. Although blocking is closely associated with morphology, the competing “forms” need not be morphemes or words; they can also be syntactic units. In German, for example, the compound Rotwein ‘red wine’ blocks the phrasal unit *roter Wein (in the relevant sense), just as the phrasal unit rote Rübe ‘beetroot; lit. red beet’ blocks the compound *Rotrübe. In these examples, one crucial factor determining blocking is synonymy; speakers apparently have a deep-rooted presumption against synonyms. Whether homonymy can also lead to a similar avoidance strategy is still controversial. But even if homonymy blocking exists, it is certainly much less systematic than synonymy blocking.
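The logic of lexical blocking can be sketched as a simple lookup that pre-empts a productive rule (a minimal illustration with a toy lexicon; the function and data are my own, and English spelling adjustments are ignored):

```python
# Lexical blocking as pre-emption: a stored synonym in the mental
# lexicon wins over the output of the productive -er rule.

def agent_noun(verb: str, stored: dict) -> str:
    """Return the agent noun for a verb, letting stored forms block -er."""
    if verb in stored:       # a listed rival blocks the rule: *stealer
        return stored[verb]
    return verb + "er"       # productive default (spelling simplified)

STORED = {"steal": "thief"}

print(agent_noun("steal", STORED))  # thief  (not *stealer)
print(agent_noun("teach", STORED))  # teacher
```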
In all the examples mentioned above, it is a word stored in the mental lexicon that blocks a rival formation. However, besides such cases of lexical blocking, one can observe blocking among productive patterns. Dutch has three suffixes for deriving agent nouns from verbal bases, -er, -der, and -aar. Of these three suffixes, the first one is the default choice, while -der and -aar are chosen in very specific phonological environments: as Geert Booij describes in The Morphology of Dutch (2002), “the suffix -aar occurs after stems ending in a coronal sonorant consonant preceded by schwa, and -der occurs after stems ending in /r/” (p. 122). In contrast to lexical blocking, the effect of this kind of pattern blocking does not depend on words stored in the mental lexicon and their token frequency but on abstract features (in the case at hand, phonological features).
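Because pattern blocking is driven by abstract phonological features rather than listed words, Booij's distribution can be stated as a rule over stem shapes. The following is a rough sketch of that rule, using orthography as a crude stand-in for phonology (the function is my own; Dutch consonant doubling in spelling, as in bakken → bakker, is deliberately ignored):

```python
import re

def agent_suffix(stem: str) -> str:
    """Pick -er, -der, or -aar for a Dutch verbal stem (orthographic heuristic)."""
    # -aar after a schwa (spelled 'e') plus coronal sonorant (l, n, r)
    if re.search(r"[^aeiou]e[lnr]$", stem):
        return stem + "aar"
    # -der after stems ending in /r/
    if stem.endswith("r"):
        return stem + "der"
    # -er is the elsewhere (default) case
    return stem + "er"

for stem in ["wandel", "teken", "huur", "werk"]:
    print(stem, "->", agent_suffix(stem))
# wandel -> wandelaar, teken -> tekenaar, huur -> huurder, werk -> werker
```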
Blocking was first recognized by the Indian grammarian Pāṇini in the 5th or 4th century BCE.
Andrej L. Malchukov
Morphological case is conventionally defined as a system of marking dependent nominals for the type of relationship they bear to their heads. While most linguists would agree with this definition, in practice it is often a matter of controversy whether a certain marker X counts as case in language L, or how many case values language L features. First, the distinction between morphological cases and case particles/adpositions is fuzzy in a cross-linguistic perspective. Second, the distinctions between cases can be obscured by patterns of case syncretism, leading to different analyses of the underlying system. On the functional side, it is important to distinguish between syntactic (structural), semantic, and “pragmatic” cases, yet these distinctions are not clear-cut either, as syntactic cases historically arise from the latter two sources. Moreover, case paradigms of individual languages usually show a conflation of syntactic, semantic, and pragmatic cases (see the phenomenon of “focal ergativity,” where ergative case is used when the A argument is in focus). The composition of case paradigms can be shown to follow a certain typological pattern, which is captured by the case hierarchy, as proposed by Greenberg and Blake, among others. The case hierarchy constrains the way case systems evolve (or are reduced) across languages and derives from relative markedness and, ultimately, from the frequencies of individual cases. The (one-dimensional) case hierarchy is, however, incapable of capturing all recurrent polysemies of individual case markers; rather, such polysemies can be represented through a more complex two-dimensional hierarchy (semantic map), which can also be given a diachronic interpretation.
Jane Chandlee and Jeffrey Heinz
Computational phonology studies the nature of the computations necessary and sufficient for characterizing phonological knowledge. As a field it is informed by the theories of computation and phonology.
The computational nature of phonological knowledge is important because at a fundamental level it is about the psychological nature of memory as it pertains to phonological knowledge. Different types of phonological knowledge can be characterized as computational problems, and the solutions to these problems reveal their computational nature. In contrast to syntactic knowledge, there is clear evidence that phonological knowledge is computationally bounded by the so-called regular classes of sets and relations. These classes have multiple mathematical characterizations in terms of logic, automata, and algebra with significant implications for the nature of memory. In fact, there is evidence that phonological knowledge is bounded by particular subregular classes, with more restrictive logical, automata-theoretic, and algebraic characterizations, and thus by weaker models of memory.
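The link between subregular classes and weaker memory can be made concrete with a small sketch. The code below checks a strictly 2-local phonotactic constraint (the constraint and the data are toy examples of my own, loosely modeled on nasal place assimilation): deciding well-formedness only ever requires remembering the immediately preceding segment, a far weaker memory model than general regular computation.

```python
# A strictly 2-local (SL2) constraint: a grammar is a set of banned
# adjacent pairs, and membership is decided with a one-symbol memory.

def obeys_sl2(word, banned):
    """Return True iff no banned two-segment sequence occurs in word."""
    prev = "#"                      # word-boundary marker
    for seg in word:
        if (prev, seg) in banned:   # only the last segment is consulted
            return False
        prev = seg                  # memory = just the previous segment
    return True

BANNED = {("n", "b")}               # toy ban: *nb (cf. /inbut/ -> [imbut])

print(obeys_sl2("imbut", BANNED))   # True
print(obeys_sl2("inbut", BANNED))   # False
```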
Connectionism is an important theoretical framework for the study of human cognition and behavior. Also known as Parallel Distributed Processing (PDP) or Artificial Neural Networks (ANN), connectionism holds that the learning, representation, and processing of information in the mind are parallel, distributed, and interactive in nature. It argues that human cognition emerges as the outcome of large networks of interactive processing units operating simultaneously. Inspired by findings from neuroscience and artificial intelligence, connectionism is a powerful computational tool, and it has had a profound impact on many areas of research, including linguistics. Since the beginning of connectionism, many connectionist models have been developed to account for a wide range of important linguistic phenomena observed in monolingual research, such as speech perception, speech production, semantic representation, and early lexical development in children. Recently, the application of connectionism to bilingual research has also gathered momentum. Connectionist models are often precise in the specification of modeling parameters and flexible in the manipulation of relevant variables in the model to address theoretical questions; they can therefore provide significant advantages in testing the mechanisms underlying language processes.
Construction Morphology is a theory of word structure in which the complex words of a language are analyzed as constructions, that is, systematic pairings of form and meaning. These pairings are analyzed within a Tripartite Parallel Architecture conception of grammar. This presupposes a word-based approach to the analysis of morphological structure and a strong dependence on paradigmatic relations between words. The lexicon contains both words and the constructional schemas they are instantiations of. Words and schemas are organized in a hierarchical network, with intermediate layers of subschemas. These schemas have a motivating function with respect to existing complex words and specify how new complex words can be formed.
The consequence of this view of morphology is that there is no sharp boundary between lexicon and grammar. In addition, the use of morphological patterns may also depend on specific syntactic constructions (construction-dependent morphology).
This theory of lexical relatedness also provides insight into language change, such as the use of obsolete case markers as markers of specific constructions, the change of words into affixes, and the debonding of word constituents into independent words. Studies of language acquisition and word processing confirm this view of the lexicon and the nature of lexical knowledge.
Construction Morphology is also well equipped for dealing with inflection and the relationships between the cells of inflectional paradigms, because it can express how morphological schemas are related paradigmatically.
William F. Hanks
Deictic expressions, like English ‘this’, ‘that’, ‘here’, and ‘there’, occur in all known human languages. They are typically used to individuate objects in the immediate context in which they are uttered, by pointing at them so as to direct attention to them. The object, or demonstratum, is singled out as a focus, and a successful act of deictic reference is one that results in the Speaker (Spr) and Addressee (Adr) attending to the same referential object. Thus,
(1) A: Oh, there’s _that_ guy again. (pointing)
    B: Oh yeah, now I see him. (fixing gaze on the guy)
(2) A: I’ll have _that_ one over _there_. (pointing to a dessert on a tray)
    B: _This_? (touching pastry with tongs)
    A: Yeah, _that_ looks great.
    B: _Here_ ya’ go. (handing pastry to customer)
In an exchange like (1), A’s utterance spotlights the individual guy, directing B’s attention to him, and B’s response (both verbal and ocular) displays that he has recognized him. In (2), A’s utterance individuates one pastry among several; B’s response makes sure he is attending to the right one; A reconfirms; and B completes the exchange by presenting the pastry to him. If we compare the two examples, it is clear that the underscored deictics can pick out or present individuals without describing them. In a similar way, “I, you, he/she, we, now, (back) then,” and their analogues are all used to pick out individuals (persons, objects, or time frames), apparently without describing them. As a corollary of this semantic paucity, individual deictics vary extremely widely in the kinds of object they may properly denote: ‘here’ can denote anything from the tip of your nose to planet Earth, and ‘this’ can denote anything from a pastry to an upcoming day (this Tuesday). Under the same circumstances, ‘this’ and ‘that’ can refer appropriately to the same object, depending upon who is speaking, as in (2). How can forms that are so abstract and variable over contexts be so specific and rigid in a given context? On what parameters do deictics and deictic systems in human languages vary, and how do they relate to grammar and semantics more generally?
Displacement is a ubiquitous phenomenon in natural languages. Grammarians often speak of displacement in cases where the rules for the canonical word order of a language lead to the expectation of finding a word or phrase in a particular position in the sentence, but it surfaces instead in a different position and the canonical position remains empty: ‘Which book did you buy?’ is an example of displacement because the noun phrase ‘which book’, which acts as the grammatical object in the question, does not occur in the canonical object position, which in English is after the verb. Instead, it surfaces at the beginning of the sentence and the object position remains empty. Displacement is often used as a diagnostic for constituent structure because it affects only (but not all) constituents. In the clear cases, displaced constituents show properties associated with two distinct linear and hierarchical positions. Typically, one of these two positions c-commands the other, and the displaced element is pronounced in the c-commanding position. Displacement also shows strong interactions with the path between the empty canonical position and the position where the element is pronounced: one often encounters morphological changes along this path and evidence for structural placement of the displaced constituent, as well as constraints on displacement induced by the path.
The exact scope of displacement as an analytically unified phenomenon varies from theory to theory. If more than one type of syntactic displacement is recognized, the question of the interaction between movement types arises. Displacement phenomena are extensively studied by syntacticians. Their enduring interest derives from the fact that the complex interactions between displacement and other aspects of syntax offer a powerful probe into the inner workings and architecture of the human syntactic faculty.
Jonathan David Bobaljik
Distributed Morphology (DM) is a framework in theoretical morphology, characterized by two core tenets: (i) that the internal hierarchical structure of words is, in the first instance, syntactic (complex words are derived syntactically), and (ii) that the syntax operates on abstract morphemes, defined in terms of morphosyntactic features, and that the spell-out (realization, exponence) of these abstract morphemes occurs after the syntax. Distributing the functions of the classical morpheme in this way allows for analysis of mismatches between the minimal units of grammatical combination and the minimal units of sound. Much work within the framework is nevertheless guided by seeking to understand restrictions on such mismatches, balancing the need for the detailed description of complex morphological data in individual languages against an attempt to explain broad patterns in terms of restrictions imposed by grammatical principles.