COMMON FEATURES OF MODERN MODELS OF LANGUAGE

The modern models of language have turned out to possess several common features that are very important for the comprehension and use of these models. One of these models is the Meaning ⇔ Text Theory already mentioned. Another is the model based on Head-Driven Phrase Structure Grammar. The Chomskian approach within the Western linguistic tradition includes various other models different from HPSG.

Here are the main common features of all these models:

· Functionality of the model. The linguistic models try to reproduce the functions of language without directly reproducing the activity of the brain, which is the motor of human language.

· Opposition of the textual/phonetic form of language to its semantic representation. The manual [9], which depicts three different well-known syntactic theories (including one of the recent variants of the theory by N. Chomsky), notes: “Language ultimately expresses a relation between sound at one end of the linguistic spectrum and meaning at the other.” Once the diffuse notion of spectrum is somehow sharpened, this is the same definition of language as in the MTT. The outer, observable form of language activity is a text, i.e., strings of phonetic symbols or letters, whereas the inner, hidden form of the same information is the meaning of this text. Language relates these two forms of the same information.

· Generalizing character of language. Separate utterances, within a speech or a text, are considered not as the language itself, but as samples of its functioning. The language is a theoretical generalization of the open and hence infinite set of utterances. The generalization brings in features, types, structures, levels, rules, etc., which are not directly observable. Rather, these theoretical constructs are fruits of the linguist's intuition and are to be repeatedly tested on new utterances and against the intuition of other linguists. The generalization feature is connected with the opposition competence vs. performance in Chomskian theory and with the much earlier opposition language vs. speech in the theory of Ferdinand de Saussure.

· Dynamic character of the model. A functional model not only proposes a set of linguistic notions, but also shows (by means of rules) how these notions are used in the processing of utterances.

· Formal character of the model. A functional model is a system of rules sufficiently rigorous to be applied to any text by a person or an automaton quite formally, without intervention of the model’s author or anybody else. The application of the rules to a given text or a given meaning always produces the same result. Any part of a functional model can in principle be expressed in a strict mathematical form and thus algorithmized.[19] If no ready mathematical tool is available at present, a new tool should be created. The presupposed properties of recognizability and algorithmizability of natural language are very important for linguistic models aimed at computer implementation.

· Non-generative character of the model. Information does not arise or get generated within the model; it merely acquires a form corresponding to another linguistic level. We may thus call the correspondences between levels equative correspondences. By contrast, in the original generative grammars by Chomsky, the strings of symbols that can be interpreted as utterances are generated from an initial symbol, which has just the abstract sense of a sentence. As to Chomsky's transformations in their initial form, they could change the meaning of an utterance, and thus they were not equative correspondences.

· Independence of the model from direction of transformation. The description of a language is independent of the direction of linguistic processing. If the processing obeys some rules, these rules should be given in equative (i.e., meaning-preserving) bi-directional form, or else they should permit inversion in principle.
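The idea of meaning-preserving, bi-directional rules can be illustrated with a minimal sketch. The word forms and feature labels below are invented for illustration and do not come from the MTT itself; the point is only that one rule table, stated declaratively, serves both directions of processing.

```python
# One declarative table of equative correspondences:
# (lemma, grammatical features) <-> surface form.
SYNTHESIS_RULES = {
    ("work", ("verb", "past")): "worked",
    ("work", ("verb", "pres", "3sg")): "works",
    ("child", ("noun", "plural")): "children",
}

# The analysis direction is obtained mechanically by inverting the same
# table, so the two directions can never drift apart.
ANALYSIS_RULES = {form: key for key, form in SYNTHESIS_RULES.items()}

def synthesize(lemma, features):
    """Meaning -> text: produce a surface form from lemma and features."""
    return SYNTHESIS_RULES[(lemma, features)]

def analyze(form):
    """Text -> meaning: recover lemma and features from a surface form."""
    return ANALYSIS_RULES[form]

print(synthesize("child", ("noun", "plural")))  # children
print(analyze("worked"))  # ('work', ('verb', 'past'))
```

Because analysis is the literal inverse of synthesis, applying one after the other returns the original representation, which is exactly what "equative" demands.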

· Independence of algorithms from data. A description of language structures should be considered apart from the algorithms that use this description. Knowledge about language does not imply a specific type of algorithm. On the contrary, in many situations an algorithm implementing some rules can have numerous options. For example, the MTT describes the text level separately from the morphologic and syntactic levels of the representation of the same utterance. Nevertheless, one can imagine an algorithm of analysis that begins to construct the corresponding part of the syntactic representation as soon as the morphologic representation of the first word in the utterance is formed. When linguistic knowledge is presented in declarative form with the highest possible consistency, the implementing algorithms prove to be rather universal, i.e., equally applicable to several languages. (Such linguistic universality has something in common with the Universal Grammar that N. Chomsky has claimed to create.) The analogous distinction between algorithms and data is used with great success in modern compilers of programming languages (cf. compiler-compilers).
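The separation of declarative data from a universal algorithm can be sketched as follows. The suffix tables are deliberately crude toy rules, not a real grammar of either language; what matters is that one and the same algorithm serves both languages once it is handed the appropriate rule table.

```python
# Declarative linguistic data: ordered (suffix, replacement) rules.
# Toy rules for illustration only, not a complete morphology.
ENGLISH_SUFFIXES = [("ies", "y"), ("es", ""), ("s", "")]
SPANISH_SUFFIXES = [("ces", "z"), ("es", ""), ("s", "")]

def stem(word, suffix_rules):
    """Universal algorithm: strip the first matching suffix.

    Knows nothing about any particular language; all language-specific
    knowledge lives in the rule table passed in.
    """
    for suffix, replacement in suffix_rules:
        if word.endswith(suffix) and len(word) > len(suffix):
            return word[: -len(suffix)] + replacement
    return word

print(stem("cities", ENGLISH_SUFFIXES))  # city
print(stem("luces", SPANISH_SUFFIXES))   # luz
```

This mirrors the compiler-compiler analogy: the engine is fixed, and swapping the data (the grammar) retargets it to another language.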

· Emphasis on detailed dictionaries. The main part of the description of any language concerns the words of that language. Hence, dictionaries containing descriptions of separate words are considered the principal part of a rigorous language description. Only very general properties of vast classes and subclasses of lexemes are abstracted away from dictionaries, in order to constitute formal grammars.
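The division of labour between a detailed dictionary and a class-level grammar can be sketched as follows. The entries and feature names are invented for illustration: each lexeme carries its own detailed properties (e.g., valency), while the grammar mentions only the broad classes abstracted from those entries.

```python
# The dictionary: detailed, word-by-word descriptions (invented entries).
LEXICON = {
    "give": {"pos": "verb", "valency": ("subject", "object", "indirect-object")},
    "sleep": {"pos": "verb", "valency": ("subject",)},
    "book": {"pos": "noun", "countable": True},
}

# The grammar: only very general class-level statements, with no
# reference to individual words.
GRAMMAR = {"sentence": ("noun", "verb")}

def part_of_speech(word):
    """Look up the class of a word; the grammar operates on classes only."""
    return LEXICON[word]["pos"]

print(part_of_speech("give"))  # verb
```

Note how small the grammar is compared with the lexicon: adding a new word with its idiosyncratic properties changes only the dictionary, never the rules.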