WHEN THE "COLOURLESS GREEN IDEAS" WAKE UP

     (THE MEANING UNITS GRAMMAR FOR DECLARATIVE SENTENCES
                    IN THE INDICATIVE MOOD)

                       Vladimir Polyakov

                  Russian Academy of Sciences
                   Pushchino Research Center
         box 97, Pushchino, Moscow Region, 142292, Russia         
          E-mail: tan@ibpm.serpukhov.su (for Polyakov)
Moscow State Steel and Alloys Institute (Technological University)
              E-mail: polyakov@asu.misa.ac.ru


I.MOTIVATION
   Natural language (NL) is a universal tool for describing
reality in terms of models known to the agent. The ability
to reason, as well as to perform other cognitive functions,
is based on a knowledge representation; therefore the grammar
of the NL should coincide with the model representing the
agent's knowledge about the environment.
   The problem of synthesizing a model of language (=grammar)
with a knowledge representation is still far from a final
solution, despite the great efforts made in this field
[12,15,20,9,3,19] in recent years.
   The present article describes an NL grammar based on
meaning units. The Manyaspect Model of Meaning for Sentence
(MAMM-S) described in [26,27] forms the basis of the semantic
model for the Meaning Units Grammar (MUG). (Note 1).
   The results of the research [10,6,7,8,2,18,4,5,21,22,24]
influenced the development of MUG and MAMM-S.
The grammar provides a basis for developing a laboratory
computer system for modelling the Russian language ability,
"Nedorosl" [27,28].

II.BASIC CATEGORIES OF THE MEANING UNITS GRAMMAR

THE MEANING UNITS AND THE SRA-ELEMENTS
   The following properties of the NL are at the heart of the
MUG.
1.The natural language sentence (NLS) is a composition of
semantic primitives connected with one another, called the
meaning units (MUs). (Note 2).
2.The meaning unit is a part of the NLS having an independent
meaning value. A simple MU is a minimal part of the NLS having
a structural (not atomic) equivalent in the meaning model (MAMM-S).
3.It is possible to recognise three major properties of the MU:
   - indivisibility of the simple MUs, i.e. further
division of a simple MU into components makes those components
semantically independent and therefore leaves the sentence
uncertain as a semantic whole;
   - matchability, i.e. the ability of MUs to connect with
one another in a definite manner to create the NLS.
Since the main and only content of the NLS meaning is the
description of various objects of the agent's surrounding
reality in their interrelation, the third basic property
of the MU is
   - the ability to describe interrelations between the objects.
4.The structural equivalent of the MU in MAMM-S is an object-role
predicate of the form RELATION(SUBJECT,ATTRIBUTE), or the SRA-element.
Its properties are discussed in [26].
The notation of an object-role predicate
          p(R1,O1,R2,O2),                                (1)
used in MUG and MAMM-S, agrees with the standard
notation of predicate calculus [23]:
E(x)E(y) (p(x,y) & R1(x) & R2(y) & (x = O1) & (y = O2)), (2)
where
p is the name of the predicate expressing the RELATION between
the SUBJECT x and the ATTRIBUTE y;
R1 is the role of the SUBJECT x; R2 is the role of the ATTRIBUTE y;
O1 is the referential value of the SUBJECT x; O2 is the referential
value of the ATTRIBUTE y;
E is the existential quantifier. (Notes 3,4).
    From (2) it follows that the nature of the category ROLE
in MAMM-S and MUG is described by the properties of a set,
and the relation ROLE-OBJECT is equivalent to the mathematical
"set-element" relation.
5.The SRA-structure corresponding to the MU consists of a
constant part (the MU type) and variables, the substitution of
which into the model structure yields a specific meaning value
of the MU.
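The object-role predicate (1) and its predicate-calculus reading (2) can be sketched as a small data structure. This is only an illustrative sketch, not part of MUG itself; the class and method names are the author's notation rendered in Python:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SRAElement:
    """Object-role predicate p(R1,O1,R2,O2): a RELATION between a
    SUBJECT playing role R1 and an ATTRIBUTE playing role R2."""
    predicate: str   # p  - name of the RELATION
    subj_role: str   # R1 - role of the SUBJECT
    subj_value: str  # O1 - referential value of the SUBJECT
    attr_role: str   # R2 - role of the ATTRIBUTE
    attr_value: str  # O2 - referential value of the ATTRIBUTE

    def to_logic(self) -> str:
        # Render the element in the predicate-calculus form (2).
        return (f"Ex Ey ({self.predicate}(x,y) & {self.subj_role}(x) & "
                f"{self.attr_role}(y) & (x = {self.subj_value}) & "
                f"(y = {self.attr_value}))")

# The MU of Table 1, example 2: "sister's room".
mu = SRAElement("TO_HAVE_OWNER", "OBJECT", '"room"', "OWNER", '"sister"')
print(mu.to_logic())
```

The constant part of the MU corresponds to the field `predicate` together with the two roles; the variable part corresponds to the two referential values.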

STRUCTURE AND PROPERTIES OF THE MU
Table 1 gives examples (items 1-3) of NL phrases written
in MUG notation.

-----------------------------------------------------------
Table 1. Examples of MU interpretation in the NL.
-----------------------------------------------------------
Example 1. "Parisian"
MU type: root(PLACE)-suffix(i)-suffix(an)
         ----------  - - - - - ==========
SRA-equivalent:
PLACE_OF_THE_RESIDENCE(PLACE,"Paris",OBJECT,o1)
Notes: root(PLACE) - root with semantic meaning of the place 
or the geographical name. Compare with: Asian, European etc.
-----------------------------------------------------------
Example 2."sister's room"
MU type: noun(possessive case)-null-noun()
         ==================== - - ------
SRA-equivalent: TO_HAVE_OWNER(OBJECT,"room",OWNER,"sister")
-----------------------------------------------------------
Example 3."The train has arrived at the station."
MU types:1. noun(nominative case)...null...+verb("arrive")
            --------------------    - -     ==============
2. verb("arrive")...pronoun("at")...noun(objective case)
   --------------   - - - - - - -   ===================
3. verb(present perfect)-null-null
   --------------------  - - ===
SRA-equivalent: 
   BEFORE_PP_UM(POINT_SITUATION,S1,POINT_SITUATION,TUP) & 
   S1=ARRIVE_WHERE(OBJECT,"the train",PLACE,"the station")
Notes: TUP-Time utterance point.
-----------------------------------------------------------
Notes for table 1: 1.Underlining: solid line - supporting MUP,
dotted line - connecting MUP, double line - depending MUP.
       2.null - the MUP is absent.
       3.The features of a MUP are shown in parentheses.
       4.Dots "..." mean that other MUs may be included
between the MUP parts.
       5.Literals "+","*",":" signify agreement of MUPs in
number (+), case (*) and gender (:), respectively.


The examples illustrate the following theses of MUG.
1.It is possible to recognise the components of the MU: a
supporting part (SP); a connecting part (CP); a depending
part (DP).
2.In the NLS there are simple (Table 1, item 2) and complex MUs
(item 1).
3.The MU consists of MU parts (MUPs). The MUPs can also be
divided into simple and complex MUPs in accordance with
their structure (item 1).
   The following levels of representation of the MUP and
the MU in the NLS, depending on the language means used, can
be identified:
   morphemic (item 1), where morphemes are the MUPs;
   lexical (items 2,3), where word forms are the MUPs;
   structural (fig.1.3), where structural parts of the NLS are
the MUPs.
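Identification of a morphemic-level MU (Table 1, example 1) can be sketched as a lookup in a suffix-chain table. The table contents below are purely illustrative; MUG assumes a full derivation model with specific semantics:

```python
# Hypothetical derivation table: suffix chains carrying specific
# semantics (cf. Table 1, example 1). Entries are illustrative only.
SUFFIX_MODELS = {
    ("i", "an"): "PLACE_OF_THE_RESIDENCE",  # root(PLACE)-i-an, "Parisian"
}

def identify_morphemic_mu(word: str, root: str):
    """Split a word into root + suffixes and look up the MU type."""
    if not word.startswith(root):
        return None
    rest = word[len(root):]
    for suffixes, predicate in SUFFIX_MODELS.items():
        if rest == "".join(suffixes):
            return predicate, root, suffixes
    return None

print(identify_morphemic_mu("Parisian", "Paris"))
```

The same mechanism covers "Asian", "European", etc., given the corresponding root entries.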

--------------------------------------------------------------
                             |------------   ----------------
                             ||T1        |---|T2            |
                             ||schedules |and|      this(T1)|
                             || | \      |---|         |    |
                             || |  New   |   |     increased|
       I                     || |        |   |       /    | |
       |                     ||were      |   |considerably| |
      read                   ||introduced|   |            | |
     /    \ (S2)             |------------   |a productivity|
yesterday  paper             |               |        |     |
 (S1)       \ (S3)           |               |    of labour |
             your            |               ----------------
                             |         T1                   
                             |         |                    
                             |         T2                   
                             | 
Fig 1.1  MU tree for the NLS:|Fig 1.3  A tree of MU trees for 
"I read your paper           |the NLS:"New schedules were
yesterday."                  |introduced and this increased 
                             |a productivity of labour 
                             |considerably."
----------------------------------------------------------- 
 S3                          | T1    T2       TUP             
-----------------------      | +     +         +
  S2            S1           |============================>           
  +             +            |
===================+=====>   |
                  TUP        |                              
simult(S3,TUP)       TUP     |before(T1,TUP)     TUP           
simult(S3,S2)       / | \    |before(T2,TUP)    /   \          
simult(S3,S1)      S1-S2-S3  |before(T1,T2)   T1-----T2      
before(S2,S1)                |        
before(S1,TUP)               |                              
before(S2,TUP)               |                              
                             |                              
Fig 1.2 MU net and the       |Fig 1.4  A net of MU net and    
diagram for NLS:             |the diagram for NLS:           
"I read your paper           |"New schedules were introduced 
yesterday."                  |and this increased a productivity 
                             |of labour considerably ."
-----------------------------------------------------------

Each grammatical level of MUP representation has its own
set of MUP identification features. On the morphemic level the MUP
is identified by morphemes from a derivation model having
specific semantics (item 1). On the lexical level (item 2) MUPs are
defined by the following groups of features:
   primary (morphological) features, allowing identification of
lexemes and the "semantic links-properties" of the appropriate
concepts;
   secondary (syntactic) features, allowing identification of
"semantic links-relations" among the lexemes, i.e. the MUP type.
For some parts of speech these features are strongly interdependent.
Thus a verb, from the grammatical point of view, is a unique
phenomenon that unites several levels of MU representation
simultaneously:
   the paradigm of morphemic, morphological and lexical MUPs
with a great variety of situation-chronological relations
[13,17,1,16] (Table 3);
   the case-preposition noun government model [2,18] with
role-attribute semantics of relations (Table 2);
   the conjugation system with role-subject semantics of
referential relations.

Table 2.Case-preposition function for the verb "to buy".
                      (a fragment) 
----------------------------------------------------------- 
Case  :Question:Preposition: SRA-predicate / Example 
----------------------------------------------------------- 
Nomi- :Who ?   :     -     :RELATION?(BUYER,ATTRIBUTE_ROLE?) 1)
native:        :           :"Peter bought(a book)."
      :----------------------------------------------------
      :What?   :    -      : 2)                      - 
----------------------------------------------------------- 
Objec-:Who ?   :    -      : 3)                      - 
tive  :        :-------------------------------------------
      :        :  without  :TO_BUY_WITHOUT_WHO(BUYER, 
      :        :           :    ABSENT_ANIMATE_OBJECT)
      :        :           :"Peter bought a book without 
      :        :           :his  brother."
      :        :-------------------------------------------
      :        :  for      :TO_BUY_FOR_WHO(BUYER,
      :        :           : ANIMATE_DESTINATION_OBJECT)
      :        :           :"Peter bought a book for his brother." 
      :        :--------------------------------------------- 
      :        :  before   :TO_BUY_BEFORE_WHO(BUYER,
      :        :           :EVENT_CONNECTED_WITH_ANIMATE_OBJECT) 4) 
      :        :           :"Peter bought a book before the arrival 
      :        :           : of his brother." 
      :        :--------------------------------------------- 
      :        :because of :TO_BUY_BECAUSE_OF_WHO(BUYER,INITIATOR) 
      :        :           :"Peter bought a book because of 
      :        :           :his brother." 
      :        :--------------------------------------------- 
      :        :   from    :TO_BUY_FROM_WHO(BUYER,SELLER) 
      :        :           :"Peter bought a book from a
      :        :           :street merchant."
----------------------------------------------------------- 
Objec-:What ?  :     -     : 2)                      - 
tive  :        :-------------------------------------------
      :        :  without  :TO_BUY_WITHOUT_WHAT(BUYER,
      :        :           : ABSENT_INANIMATE_OBJECT)
      :        :           :"Peter bought a book without 
      :        :           : enthusiasm" 5)
      :        :------------------------------------------- 
      :        :  for      :TO_BUY_FOR_WHAT(BUYER,GOAL)
      :        :           :"Peter bought a book to study."
      :        :------------------------------------------- 
      :        :  before   :TO_BUY_BEFORE_WHAT(BUYER,EVENT)
      :        :           :"Peter bought a book before holiday."
      :        :------------------------------------------- 
      :        :because of :TO_BUY_BECAUSE_OF_WHAT(BUYER,CAUSE)
      :        :           :"Peter bought a book because of 
      :        :           : necessity."
      :        :------------------------------------------- 
      :        :    at     :TO_BUY_AT_WHAT(BUYER,PLACE)
      :        :           :"Peter bought a book at the shop."
----------------------------------------------------------- 
Notes for table 2:
1) The SUBJECT does not form its own predicate, as it is equally
used in all predicate forms.
2) Not used.
3) Not used for inanimate SUBJECTS. If an organisation or
institution is meant, in Russian it is treated as an animate
subject, i.e. it answers the question "who?".
4) The predicate has a temporal nuance.
5) Compare with: "Peter bought a book without a binding."
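The fragment of the government model in Table 2 can be sketched as a lookup from a (case, question, preposition) triple to an SRA predicate and the role of the attribute. The dictionary below reproduces only a few rows of the table; the function name is illustrative:

```python
# A fragment of Table 2 as a lookup: (case, question, preposition) ->
# (SRA predicate, role of the attribute). Illustrative subset only.
GOVERNMENT_MODEL = {
    ("objective", "who", "without"): ("TO_BUY_WITHOUT_WHO", "ABSENT_ANIMATE_OBJECT"),
    ("objective", "who", "for"):     ("TO_BUY_FOR_WHO", "ANIMATE_DESTINATION_OBJECT"),
    ("objective", "who", "from"):    ("TO_BUY_FROM_WHO", "SELLER"),
    ("objective", "what", "for"):    ("TO_BUY_FOR_WHAT", "GOAL"),
    ("objective", "what", "at"):     ("TO_BUY_AT_WHAT", "PLACE"),
}

def sra_for(case: str, question: str, preposition: str,
            buyer: str, attr: str):
    """Build the SRA-element for one case-preposition slot of 'to buy'."""
    entry = GOVERNMENT_MODEL.get((case, question, preposition))
    if entry is None:
        return None  # slot not used for this verb (cf. notes 2,3)
    predicate, role = entry
    return f'{predicate}(BUYER,"{buyer}",{role},"{attr}")'

# "Peter bought a book from a street merchant."
print(sra_for("objective", "who", "from", "Peter", "a street merchant"))
```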

Table 3. Example of temporal paradigm of the verb "to read".
-----------------------------------------------------------
N  : Predicate/ NL-example 1) :Temporal diagram 2) 
-----------------------------------------------------------
1. :HOLDS_IP_UM(S,TUP)        :    <-- S ---->
   : "I am reading."          :=========+==============>      
   :                          :        TUP
----------------------------------------------------------- 
2. :BEFORE_IP_UM(S,TUP)       :<--- S ---->
   : "I was reading."         :=====+=========+========>
   :                          :    Tx        TUP       
----------------------------------------------------------- 
3. : AFTER_IP_UM(S,TUP)       :             <--- S ---->
   :"I shall be reading."     :=========+========+=====>
   :                          :        TUP      Tx       
----------------------------------------------------------- 
4. : SIMULT_IP_UM(Ss,TUP)     :             +--- S ---->
   :                          :=============+==========>
   :"I am beginning to read." :            TUP=Tx
----------------------------------------------------------- 
5. :BEFORE_SP_UM(Ss,TUP)      :    +---- S --->
   : "I have begun to read."  :====+============+======>
   :                          :   Tx           TUP
----------------------------------------------------------- 
6. : AFTER_SP_UM(Ss,TUP)      :             +--- S ---->
   : "I shall begin to read." :======+======+==========>
   :                          :     TUP    Tx
----------------------------------------------------------- 
7. : see 1                    :    <-- S ---->
   : "I have been reading."   :=========+==============>             
   :                          :        TUP
----------------------------------------------------------- 
8. : see 2                    :<--- S ---->
   : "I had been reading."    :=====+=========+========>       
   :                          :    Tx        TUP       

----------------------------------------------------------- 
9. :BEFORE_EP_UM(S1e,TUP) &   :<--S1--+  +--S2-->        
   :BEFORE_SP_UM(S2s,TUP)     :========+===========+===>       
   :"I continued to read."    :       Tx          TUP
----------------------------------------------------------- 
10.: AFTER_SP_UM(S2s,TUP)     :       <--S1--+  +--S2->
   :"I shall continue to      :=====+==========+=======>
   : read."                   :    TUP        Tx
----------------------------------------------------------- 
11.:BEFORE_SP_UM(S1s,TUP) &   :  +-S1->  +-S2-> 
   :BEFORE_SP_UM(S2s,TUP)     :==+=======+========+======>
   :"I began to read."        : T1      T2       TUP
----------------------------------------------------------- 
12.: AFTER_SP_UM(S1s,TUP) &   :            +-S1->  +-S2->
   : AFTER_SP_UM(S2s,TUP)     :=====+======+=======+=====>
   :"I shall begin to         :    TUP     T1      T2
   : read."                   :
----------------------------------------------------------- 

Table 3. Example of temporal paradigm of verb "read"(cont.)
-----------------------------------------------------------
N  : Predicate/ NL-example    :Temporal diagram  
-----------------------------------------------------------
13.:BEFORE_EP_UM(Se,TUP)      :  <--S-+
   :"I have finished to read.":=======+=========+========>
   :                          :                TUP
----------------------------------------------------------- 
14.:AFTER_EP_UM(Se,TUP)       :          <--S---+
   :"I shall finish to read." :=======+=========+========>
   :                          :      TUP         
-----------------------------------------------------------
15.:SIMULT_EP_UM(Se,TUP)      :       <--- S ---+
   :"I am finishing to read." :=================+========>
   :                          :                TUP
----------------------------------------------------------- 
16.:BEFORE_EP_UM(S1e,TUP) &   :<-S1-+  <-S2-+
   :BEFORE_EP_UM(S2e,TUP)     :=====+=======+=======+====>
   :"I finished to read."     :    T1      T2      TUP
----------------------------------------------------------- 
17.:AFTER_EP_UM(S1e,TUP) &    :       <--S1--+  <-S2-+
   :AFTER_EP_UM(S2e,TUP) &    :======+=======+=======+===>
   :"I shall finish to read." :     TUP     T1      T2
----------------------------------------------------------- 
18.:BEFORE_PP_UM(S,TUP)       :    S + 
   : "I have read."           :======+==========+========>
   :                          :     Tx         TUP
----------------------------------------------------------- 
19.:AFTER_PP_UM(S,TUP)        :                S +
   :"I shall have read."      :=========+========+=======>
   :                          :        TUP      Tx
-----------------------------------------------------------
20.:PERIODIC_SIMULT_PI_M      :    S1+     S2+    S3+
   :(S,TUI,t,w)         3,4)  :==+===================+===>
   :"I read."                 :  <-------- TUI ------>
----------------------------------------------------------- 
21.:PERIODIC_AFTER_PP_M       :             S1+     S2+  
   :(S,TUP,t,w)     7)        :==+===========+=======+===>
   :"I shall read."           : TUP         T1      T2
----------------------------------------------------------- 
22.:PERIODIC_BEFORE_PP_M      :  S1+      S2+
   :(S,TUP,t,w)     8)        :====+========+========+===>
   :"I read."                 :   T1       T2       TUP
----------------------------------------------------------- 
Notes for table 3: 
1)As the role set in temporal (t-) relations is rather
limited, it includes only five values: 
       POINT_SITUATION(P), 
       INTERVAL_SITUATION(I), 
       START_OF_THE_INTERVAL(S), 
       END_OF_THE_INTERVAL(E), 
       CHAIN(C); 
the notation for t-RELATIONS includes the corresponding suffixes:
P, I, S, E, C. 
Besides, the notation for t-relations includes the suffix
of the corresponding scale:
      UM - unmetric, 
      M - metric, 
      F - fuzzy. 
See details in [13].
                          
2)Conventional signs on the temporal diagrams:
TUP - the time utterance point. 
TUI - the time utterance interval. 
Tx - time of an interval or point event. 
T1, T2 ... - times of periodic situations. 
+ - symbol of a point situation. 
Symbols of interval situations: 
+--+ - interval situation of a closed type; 
+--> - the start of an interval situation of an open type; 
<--+ - the end of an interval situation of an open type; 
<--> - interval situation of an open type without 
indication of the start or the end.
3) The point situation S repeats in the time utterance interval 
with a period t in units w. 
4) Example of a non-binary relation.
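The unmetric (UM) relations of Table 3 can be sketched over numeric time points, with the interval roles START and END mapped onto interval endpoints. A minimal sketch; the function names are modelled on the table's notation and the numeric values are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Interval:
    """An interval situation with a start and an end point."""
    start: float
    end: float

def before_sp_um(s: Interval, tup: float) -> bool:
    """BEFORE_SP_UM: the START of interval situation s precedes TUP."""
    return s.start < tup

def before_ep_um(s: Interval, tup: float) -> bool:
    """BEFORE_EP_UM: the END of interval situation s precedes TUP."""
    return s.end < tup

def after_sp_um(s: Interval, tup: float) -> bool:
    """AFTER_SP_UM: the START of s follows TUP."""
    return s.start > tup

# Row 5, "I have begun to read.": the start Tx lies before TUP,
# while the interval is still open at TUP.
TUP = 10.0
reading = Interval(start=4.0, end=12.0)
assert before_sp_um(reading, TUP) and not before_ep_um(reading, TUP)
```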

At the level of structural representation the role of
features is played by particular lexemes, punctuation marks
and specific syntactic forms in the structural MUP. The MU type
is described by the combination of MUP features. The problem
of uncertainty among homostructural MU types during MU selection
can be solved by using a role semantic filter.
4.To shorten the MU notation, MUPs assigned to different
parts of speech but serving similar functions in the MU
can be combined into MUP classes.
For example, nouns and personal pronouns serve similar
functions (compare item 3 with "He arrived at the
station.").

GRAPH REPRESENTATIONS AND THE MEANING ASPECTS
   To illustrate the links between MUPs, a sentence can
be shown as a set of graph representations (Fig.1). Each graph
representation corresponds to its own type of MAMM-S
meaning aspect in MUG.
   The simplest type of representation is the MU tree,
consisting of lexical and morphemic MUPs and describing
relations between the "external" concepts, i.e. the concepts
existing without reference to the NLS (Fig.1.1).
   The other types of graph representation are the MU net,
the tree of MU trees and the net of MU nets. They are typical
of the links between the "internal" concepts, i.e. the
concepts formed by the NLS itself. They are used for
describing temporal links, the links between clauses
of complex sentences, and modal, communicational
and similar relations [11] (Fig.1.2-1.4). (Note 5).
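A MU tree of the kind shown in Fig.1.1 can be sketched as a nested structure of supporting words with labelled dependents. The class and labels below are illustrative; the edge labels follow the S1-S3 marks of Fig.1.1:

```python
# A MU tree as (supporting word, labelled dependents), after Fig 1.1.
class Node:
    def __init__(self, word, children=None):
        self.word = word
        self.children = children or []   # list of (mu_label, Node)

    def walk(self, depth=0):
        """Pre-order traversal yielding (depth, word) pairs."""
        yield depth, self.word
        for _, child in self.children:
            yield from child.walk(depth + 1)

# "I read your paper yesterday."
tree = Node("read", [
    (None, Node("I")),                              # subject link
    ("S1", Node("yesterday")),                      # time circumstance
    ("S2", Node("paper", [("S3", Node("your"))])),  # object + possessive
])

for depth, word in tree.walk():
    print("  " * depth + word)
```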

III.OPERATIONS WITH MEANING UNITS
   When fulfilling the cognitive function "NLS analysis", the
MUG provides six basic operations on the meaning units
(Fig.2):
   - MU type identification;
   - substitution of concept variable values into the MU;
   - MU selection;
   - MU composition into the graph representation;
   - MU linking in a graph representation;
   - MU interpretation into the SRA-element.

-----------------------------------------------------------
            |  S - NLS.
            V
+---------------------------+            +----------------+
|   Complex traditional     |<===========|  Data base of  |
|   grammatical analysis    |            |       MUP      |    
+---------------------------+            +----------------+ 
            | MUP - parts of the meaning units.
            V
+---------------------------+            +----------------+
|   Operations with MU:     |<===========|  Data base of  |    
|                           |            |     the MU     | 
|     -identification;      |            +----------------+
|     -substitution;        |            +----------------+ 
|     -selection;           |<===========|  Data base of  |
|     -composition.         |       |====|    concepts    |
+---------------------------+       |    +----------------+
             | MU -the meaning units| 
             V     ,graphs.         |
+---------------------------+       |
|     Linking.              |<======|
|     Interpretation.       | 
+---------------------------+
             | L - logical form in MAMM-S.
             V

Fig.2. NLS analysis in the MU grammar
      (simplified scheme).
----------------------------------------------------------------
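The data flow of Fig.2 can be sketched as a pipeline of the six MU operations. All function bodies below are trivial stubs (the real operations consult the MUP, MU and concept data bases); only the order of the operations and the shape of the data flow are shown:

```python
# Skeleton of the NLS-analysis scheme of Fig. 2. Bodies are stubs.
def grammatical_analysis(s):   # complex traditional analysis: S -> MUPs
    return s.rstrip(".").split()

def identify(mups):            # MU type identification (DB of MUPs)
    return [("MU_TYPE?", mup) for mup in mups]

def substitute(mus):           # substitute concept values into the MU
    return mus

def select(mus):               # choose among homostructural MU types
    return mus

def compose(mus):              # compose MUs into a graph representation
    return {"tree": mus}

def link(graphs):              # link MUs inside the graph
    return graphs

def interpret(graphs):         # interpret MUs into SRA-elements
    return ["SRA(" + mup + ")" for _, mup in graphs["tree"]]

def analyse(sentence):
    """The pipeline S -> MUP -> MU -> graphs -> logical form L."""
    return interpret(link(compose(select(substitute(identify(
        grammatical_analysis(sentence)))))))

print(analyse("The train has arrived at the station."))
```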

IV.MODEL RESTRICTIONS
   Like any other model, MUG is only an approximate description
of the natural language and has a number of restrictions.
The author believes that these restrictions are not inherent
in the nature of MUG or MAMM-S, but are determined by the
great variety of forms and phenomena that compose the natural
language, and that they can be eliminated in further
development of the models.

V.DISCUSSION

THE "COLOURLESS GREEN IDEAS" AND THE OBJECT-ROLE FILTER

   In his classical work [6] Chomsky gave the phrase
 
            "colourless green ideas sleep furiously" 
 
as an example of a syntactically irreproachable but absolutely
meaningless sentence, which a transformational grammar must
consider correct.
   In contrast, the Meaning Units Grammar can use an object-role
semantic filter that allows meaningless combinations of words
to be discovered.
   None of the relations (marked by the numbers 1...4 in the MU
tree of Fig.3) is satisfied in this example, simply because the
non-material objects ("ideas") cannot play the corresponding
roles ("to be colourless", "to have colour", "to sleep",
etc.).

            i d e a s
           1/  2|  3\
    colourless  green sleep
                       4\
                      furiously

Fig.3 MU tree for Chomsky's example.
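The object-role filter can be sketched as a compatibility check between a relation's role requirements and the semantic classes of its subject. The role inventory and class names below are illustrative, not the actual MUG data base:

```python
# A minimal object-role semantic filter (names are illustrative).
# A relation is admitted only if its SUBJECT belongs to a semantic
# class that can play the required role.
ROLE_REQUIREMENTS = {
    "to_sleep":       {"animate"},    # only animate objects sleep
    "to_have_colour": {"material"},   # only material objects have colour
}

CONCEPT_CLASSES = {
    "ideas": {"abstract"},            # non-material, inanimate
    "cat":   {"animate", "material"},
    "grass": {"material"},
}

def admissible(relation: str, subject: str) -> bool:
    """True iff the subject's semantic classes licence the role."""
    required = ROLE_REQUIREMENTS[relation]
    return bool(required & CONCEPT_CLASSES[subject])

# Chomsky's example: every relation of Fig. 3 is filtered out.
assert not admissible("to_sleep", "ideas")        # *"ideas sleep"
assert not admissible("to_have_colour", "ideas")  # *"green ideas"
assert admissible("to_sleep", "cat")
```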

MUG AND OTHER MODELS
  The Meaning Units Grammar is an algorithmic model for NLS
processing; it does not substitute for well-known linguistic
models [18,8,11, etc.] that describe the phenomenology of
language. Rather, MUG is founded on them in an attempt to take
into account the basic properties of NL form. Table 4 shows the
set of properties that is reflected in the notation, structure
and properties of the meaning units.

Table 4. Basic properties of NL form reflected in the notation,
structure and properties of MUs.
--------------------------------------------------------------
PROPERTY OF NL FORM         PROPERTY OF MU    
--------------------------------------------------------------
1.It has semantics.         1.MU has a semantic equivalent 
                              in MAMM-S.
2.It has parameters.        2.MU has a constant component (=MU type)
                              and variable components (lexemes).
3.It has a structure.       3.MU consists of parts.

4.Word order can be         4.Conventional signs < > in notation.
  inverted.
5.Interruptions in the      5.Conventional signs ... in notation.
  word semantic chains.
6.Agreement.                6.Conventional signs +,*,: in notation.

7.Links.                    7.a)Connecting part of MU.
                              b)Two mechanisms of linking 
                                in MU trees. 
8.Asymmetry.                8.MU has a supporting part and 
                              a connecting part.
9.It has clusters.          9.Complex MU.

10.Tree of dependencies.    10.MU tree.

11.Morphology, lexicology,  11.They are represented in 
   syntax.                     the means of MU description.
12.Recognition.             12.Provided by the features of MU.  
--------------------------------------------------------------

   To describe the correlation of MUG and other models more
strictly, it is necessary to view the basic notions of MUG in
comparison with similar notions in other models.

1.Immediate constituents (IC) [6] vs. parts of meaning units (MUP)

  At first sight the MUP may be identified with the IC, but
there are a number of differences.
  SIMILARITY:
  - MUP and IC may be described at the lexical level;
  - the descriptions of MUP and IC both include lexical and
    syntactic components;
  - classes of MUPs and complex MUPs can have corresponding ICs.
  DIFFERENCE:
  - the MUP is a result of recognition, but the IC is a result of
    transformation (or generation), i.e. they have a different
    procedural nature;
  - the notion of MUP is generalised to the morphemic and structural
    levels (e.g., a clause in a complex sentence).

2.Meaning units (MU) vs. surface-syntactic relations (SSR) [18]

  The SSR is a component of the MU, but the MU corresponds to a
subject-relation-attribute equivalent in MAMM-S, whereas the SSR
is a purely syntactic structure.

3.MU vs. propositional form (in the Explanatory Combinatorial 
  Dictionary of Mel'chuk's Meaning<=>Text Model)

  One can say that the MU is a propositional form, but described
parametrically (not only lexically, but also syntactically
and structurally).

4.Tree of dependencies vs. tree of MUs

  The tree of MUs is a variety of the tree of dependencies in
which SSRs are connected into MUs.

5.Tree of MUs vs. derivation tree of ICs

  The basic difference between the tree of MUs and the tree of
ICs is in their procedural nature. Traditionally the tree of ICs
is viewed as the result of a process that is homogeneous in its
mathematical nature, whereas the tree of MUs is the result of a
complex co-ordination of several processes (the operations of
MUG) that have different mathematical natures.

  Therefore, the Meaning Units Grammar is related to the
Meaning<=>Text Model (MTM) in the area of surface syntax,
but differs from the surface syntax of MTM in that MUG is
semantically oriented.


NOTES (to the paper):
1.The term "aspect" in the title "The Manyaspect Model of
Meaning for Sentence" is used as "projection" or "plane"
of sentence meaning, not as the "grammatical aspect of the
verb" adopted in English grammar.
2.The term "semantic primitive" is used in MUG and MAMM-S
as a common concept naming the semantic constructions of the
language model and the meaning model. They have a minimal
predicate value, but they are not "atoms of meaning" in the
sense of Wierzbicka.
The term "meaning unit" serves to name the specific semantic
primitive of the language model (=MUG). Shalyapina proposed
another term for this purpose: "elementary predicate
construction".
3.The typology of referential values is described in detail
in the dissertation of Paducheva.
4.Here is an example of the representation of the propositional
part of the meaning (=MU tree) in MAMM-S for the sentence
"Peter bought a book.":

ExEy to_buy_what(x,y) & /* predicate */
     buyer(x) &      /* semantic role of subject x */
     purchase(y) &   /* semantic role of attribute y */
     x = Peter_1 &   /* referential value of subject x */
     book(y)         /* referential value of attribute y */

This example illustrates that form (2) does not exhaust all
types of SRA-elements.

5.Strictly speaking, the MU tree and the tree of MU trees are
structures both of the surface level (=level of the language
model) and of the representation level (=level of the meaning
model). On the contrary, the net of temporal MUs is a structure
belonging to the representation level only.


REFERENCES
1.Allen J.F. Towards a General Theory of Action and 
Time.//Artificial Intelligence.-1984,-N 2,-p.123-134
2.Apresyan Yu.D.,Mel'chuk I.A. and Zolkovsky A.K.
Semantics and Lexicography: Towards a New Type of Unilingual
Dictionary.//In Studies in Syntax and Semantics,
D.Reidel Publishing ,Dordrecht,Holland,1969,p.1-33.
3.Chang Hee Hwang and Lehnart K.Schubert.Meeting the 
interlocking needs of LF-computation,deindexing,and 
inference:An   organic   approach   to   general   NLU. // 
Proc.IJCAI-93,p.1297-1302 
4.Charniak E. The  case-slot  identity  theory. 
Cognitive Science, 1981, N 5, p.285-292. 
5.Charniak E. Passing  Markers:  a  theory  of  contextual 
influence in language comprehension.  Cognitive  Science.,  
N 7,p.171-190, 1983. 
6.Chomsky  N.  Three  Models  for  Description  of 
Language. //  IRE   Trans.Informat.Theory,   1956,   v.IT-2, 
p.113-124. 
7.Chomsky N. Knowledge of Language: Its Nature, Origin, and 
Use//Praeger Publishers,NY,1986.
8.Chomsky N. Lectures on Government and Binding//Foris,1981. 
9.Dahl V. What the Study of Language Can Contribute 
to AI.//European Journal on Artificial  Intelligence.,1993, 
N 6, p.92-106. 
10.Fillmore C.T. The case for case.// Universals 
in linguistic theory. New York, 1968. 
11.Halliday M.A.K. An Introduction to Functional Grammar.//
Edward Arnold (Publishers), 1985.
12.Herzog O.,Rollinger C.R. (Eds)  Text  Understanding 
in LILOG.,Springer-Verlag, Berlin, 1991. 
13.Kandrashina Eu.Yu.,Litvinceva L.V.,Pospelov D.A.
Knowledge Representation about Time and Space in Intelligent
Systems.,-Moscow,Nauka,1983,-325.(In Russian) 
14.Lehmann F.W. Semantic Networks//Computers & Mathematics 
with Applications,V.23,N.2-5,1992.
15.Lehnert  W.G.  Knowledge-based  Natural  Language 
Understanding. // Exploring Artificial Intelligence: Survey 
Talks   from   the   National   Conference   on  Artificial 
Intelligence., Morgan Kaufmann Publ., CA, 1988. 
16.Lluis Vila. A Survey on Temporal Reasoning in Artificial
Intelligence.//AI Communications,Vol 1,N.1,March,1994,p.4-28
17.McDermott D.V. A Temporal logic for reasoning about 
processes and plans.//Cognitive Science.,-1982,-N 6,
p.125-217.
18.Melchuk,I.A. The Russian Language in the Meaning-Text 
Perspective. Wiener Slawistischer Almanach, Sonderband 39,
Moskau-Wien, 1995.
19.Narin'yani A.S. The Problem of NL Query to Database is 
Solved//"Dialog-95", Proceedings of International Workshop on
Computational Linguistics and its Applications, Kazan,
May 31 - June 4, 1995, p.206-215.
(In Russian. See English translation in First Workshop 
on Applications of NATURAL LANGUAGE to DATA BASES (NLDB'95),
Versailles, France, June 28-29, 1995).
20.Sabah G. Knowledge Representation and Natural Language
Understanding.//AI Communications,V.6,Nrs 3/4 Sept./Dec.,
1993, p.155-185.
21.Schank  R.  Conceptual  Information   Processing. 
New York, American Elsevier., 1975. 
22.Sowa J. Conceptual Structures. Addison-Wesley, 
Reading, MA, 1984. 
23.Thayse A. et al. Approche logique de l'intelligence 
artificielle. 1.De la logique classique a la programmation 
logique., Bordas, Paris, 1988. 
24.Wilensky R. Knowledge  Representation  -  A  Critique,  
A proposal. In: Experience,Memory, and  Reasoning.  
Kolodner,J. and Riesbeck, C.ed.Lawrence Erlbaum Assoc.,1986. 
25.Zadeh, L.A. "Fuzzy Sets," Information and Control, 8, 
338-353,1965.
26.Polyakov V. "Meaning Representation of Message in the 
Manyaspect Model". // Dialog'96: Computational Linguistics 
and its Applications, Proc.of Int.Workshop, Pushchino, 
Russia, May 4-9,1996.(In Russian)
27.Polyakov V. "Algorithmic models for identification of 
semantic relations in the systems of natural language 
processing". PhD thesis.  Moscow State Steel and Alloys 
Institute (Technological University). Moscow, 1997, 20 p. 
(In Russian)
28.Polyakov V. "Language Ability Modelling Using Computer 
System "Nedorosl". In: Paper Collection "Text Processing
and Cognitive Technologies". N 1. (Edited by A.G. Dyachko).
Moscow-Pushchino, ONTI PNTS RAN, 1997, 47-60.(In Russian)