Language and Mind:
Current Thoughts on Ancient Problems
The study of language is one of the oldest branches of systematic inquiry, tracing back to classical India and Greece, with a rich and fruitful history of achievement. From a different point of view, it is quite young. The major research enterprises of today took shape only about 40 years ago, when some of the leading ideas of the tradition were revived and reconstructed, opening the way to what has proven to be very productive inquiry.
That language should have exercised such fascination over the years is not surprising. The human faculty of language seems to be a true "species property", varying little among humans and without significant analogue elsewhere. Probably the closest analogues are found in insects, at an evolutionary distance of a billion years. The communication system of bees, for example, shares with human language the property of "displaced reference", our ability to talk about something that is remote from us in space or time; bees use an intricate "dance" to communicate the direction, distance, and desirability of a remote source of honey. Nothing similar is known elsewhere in nature. Even in this case, the analogy is very weak. Vocal learning has evolved in birds, but in three unrelated groups, independently it is assumed; here the analogies to human language are even more superficial.
Human language appears to be biologically isolated in its essential properties, and a rather recent development from an evolutionary perspective. There is no serious reason today to challenge the Cartesian view that the ability to use linguistic signs to express freely-formed thoughts marks "the true distinction between man and animal" or machine, whether by "machine" we mean the automata that captured the imagination of the 17th and 18th centuries, or those that are providing a stimulus to thought and imagination today.
Furthermore, the faculty of language enters crucially into every aspect of human life, thought, and interaction. It is largely responsible for the fact that alone in the biological world, humans have a history, cultural evolution and diversity of any complexity and richness, even biological success in the technical sense that their numbers are huge. A Martian scientist observing the strange doings on Earth could hardly fail to be struck by the emergence and significance of this apparently unique form of intellectual organization. It is even more natural that the topic, with its many mysteries, should have stimulated the curiosity of those who seek to understand their own nature and their place within the wider world.
Human language is based on an elementary property that also seems to be biologically isolated: the property of discrete infinity, which is exhibited in its purest form by the natural numbers 1, 2, 3,... Children do not learn this property of the number system. Unless the mind already possesses the basic principles, no amount of evidence could provide them; and they are completely beyond the intellectual range of other organisms. Similarly, no child has to learn that there are three-word sentences and four-word sentences, but no three-and-a-half-word sentences, and that it is always possible to construct a more complex one, with a definite form and meaning. Such knowledge must come to us from "the original hand of nature", in David Hume's phrase, as part of our biological endowment.
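The point about discrete infinity can be illustrated with a minimal sketch (not from the lecture; the embedding rule and the function name `sentence` are invented for illustration): a single recursive rule yields expressions whose lengths are always whole numbers of words, with no longest expression.

```python
# Illustrative sketch of discrete infinity: one hypothetical recursive
# embedding rule. Every output has a whole number of words (2, 4, 6, ...),
# never three and a half, and a more complex sentence can always be
# formed by applying the rule once more.

def sentence(depth):
    """Build an expression by applying the embedding rule `depth` times."""
    s = "birds fly"
    for _ in range(depth):
        s = "Peter thinks " + s  # the (invented) recursive rule
    return s

for d in range(3):
    print(len(sentence(d).split()), "words:", sentence(d))
```

The sketch is only a cartoon of generative capacity: the rule is finite, yet the set of expressions it determines is unbounded and discrete.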
This property intrigued Galileo, who regarded the discovery of a means to communicate our "most secret thoughts to any other person with 24 little characters" as the greatest of all human inventions. The invention succeeds because it reflects the discrete infinity of the language that these characters are used to represent. Shortly after, the authors of the Port Royal Grammar were struck by the "marvellous invention" of a means to construct from a few dozen sounds an infinity of expressions that enable us to reveal to others what we think and imagine and feel - from a contemporary standpoint, not an "invention" but no less "marvellous" as a product of biological evolution, about which virtually nothing is known, in this case.
The faculty of language can reasonably be regarded as a "language organ" in the sense in which scientists speak of the visual system, or immune system, or circulatory system, as organs of the body. Understood in this way, an organ is not something that can be removed from the body, leaving the rest intact. It is a subsystem of a more complex structure. We hope to understand the full complexity by investigating parts that have distinctive characteristics, and their interactions. Study of the faculty of language proceeds in the same way.
We assume further that the language organ is like others in that its basic character is an expression of the genes. How that happens remains a distant prospect for inquiry, but we can investigate the genetically-determined "initial state" of the language faculty in other ways. Evidently, each language is the result of the interplay of two factors: the initial state and the course of experience. We can think of the initial state as a "language acquisition device" that takes experience as "input" and gives the language as an "output" - an "output" that is internally represented in the mind/brain. The input and the output are both open to examination: we can study the course of experience and the properties of the languages that are acquired.
What is learned in this way can tell us quite a lot about the initial state that mediates between them. Furthermore, there is strong reason to believe that the initial state is common to the species: if my children had grown up in Tokyo, they would speak Japanese. That means that evidence about Japanese bears directly on the assumptions concerning the initial state for English. The shared initial state must be rich enough to yield each language, given appropriate experience; but not so rich as to exclude any language that humans can attain. We can establish strong empirical conditions that the theory of the initial state must satisfy, and pose several problems for the biology of language: How do the genes determine the initial state, and what are the brain mechanisms involved in the states that the language organ assumes? These are hard problems, even for much simpler systems where direct experiment is possible, but some may be at the horizons of inquiry.
To proceed, we should be more clear about what we mean by "a language." There has been much impassioned controversy about the right answer to this question, and more generally, to the question of how languages should be studied. The controversy is pointless, because there is no right answer. If we are interested in how bees communicate, we will try to learn something about their internal nature, social arrangements, and physical environment. These approaches are not in conflict; they are mutually supportive. The same is true of the study of human language: it can be investigated from the biological point of view, and from numerous others. Each approach defines the object of its inquiry in the light of its special concerns; and each should try to learn what it can from other approaches. Why such matters arouse great emotion in the study of humans is perhaps an interesting question, but I will put it aside for now.
The purely internalist approach I have been outlining is concerned with the faculty of language: its initial state, and the states it assumes. Suppose that Peter's language organ is in state L. We can think of L as Peter's language; when I speak of a language here, that is what I mean. So understood, a language is something like "the way we speak and understand", one traditional conception of language. The theory of Peter's language is often called the "grammar" of his language, and the theory of the initial state of the faculty of language is called "universal grammar", adapting traditional terms to a different framework. Peter's language determines an infinite array of expressions, each with its sound and meaning. In technical terms, his language "generates" the expressions of his language. The theory of his language is therefore called a generative grammar. Each expression is a complex of properties, which provide "instructions" for Peter's performance systems: his articulatory apparatus, his modes of organizing his thoughts, and so on. With his language and the associated performance systems in place, Peter has a vast amount of knowledge about the sound and meaning of expressions, and a corresponding capacity to interpret what he hears, to express his thoughts, and to use his language in a variety of other ways.
Generative grammar arose in the context of what is often called "the cognitive revolution" of the 1950s, and was an important factor in its development. Whether the term "revolution" is appropriate or not can be questioned, but there was an important change of perspective: from the study of behavior and its products (such as texts), to the inner mechanisms that enter into human thought and action. The cognitive perspective regards behavior and its products not as the object of inquiry, but as data that may provide evidence about the inner mechanisms of mind and the ways these mechanisms operate in executing actions and interpreting experience. The properties and patterns that were the focus of attention in structural linguistics find their place, but as phenomena to be explained along with innumerable others, in terms of the inner mechanisms that generate expressions.
The "cognitive revolution" renewed and reshaped many of the insights, achievements, and quandaries of what we might call "the first cognitive revolution" of the 17th and 18th century, which was part of the scientific revolution that so radically modified our understanding of the world. It was recognized at the time that language involves "the infinite use of finite means", in von Humboldt's phrase; but the insight could be developed only in limited ways, because the basic ideas remained vague and obscure. By mid-20th century, advances in the formal sciences had provided appropriate concepts in a very sharp and clear form, making it possible to give a precise account of the computational principles that generate the expressions of a language. Other advances also opened the way to investigation of traditional questions with greater hope of success. The study of language change had registered major achievements. Anthropological linguistics provided a far richer understanding of the nature and variety of languages, also undermining many stereotypes. And certain topics, notably the study of sound systems, had been much advanced by the structural linguistics of the 20th century.
The last prominent inheritor of the tradition, before it was swept aside by structuralist and behaviorist currents, was the Danish linguist Otto Jespersen. He argued 75 years ago that the fundamental goal of linguistics is to discover the "notion of structure" that is in the mind of the speaker, enabling him to produce and understand "free expressions" that are new to speaker and hearer, or even in the history of the language, a regular occurrence of everyday life. Jespersen's "notion of structure" is similar in spirit to what I have called "a language." The goal of a theory of the language is to unearth some of the factors that enter into the ability to produce and understand "free expressions." Only SOME of the factors, however, just as the study of computational mechanisms falls considerably short of capturing the idea of "infinite use of finite means", or addressing the issues that were fundamental to the first cognitive revolution, a matter to which I will return.
The earliest attempts to carry out the program of generative grammar, about 40 years ago, quickly revealed that even in the best studied languages, elementary properties had passed unrecognized, and that the most comprehensive traditional grammars and dictionaries only skim the surface. The basic properties of particular languages and of the general faculty of language are unconsciously presupposed throughout, unrecognized and unexpressed. That is quite appropriate if the goal is to help people to learn a second language, to find the conventional meaning and pronunciation of words, or to have some general idea of how languages differ. But if our goal is to understand the language faculty and the states it can assume, we cannot tacitly presuppose "the intelligence of the reader." Rather, this is the object of inquiry.
The study of language acquisition leads to the same conclusion. A careful look at the interpretation of expressions reveals very quickly that from the earliest stages, the child knows vastly more than experience has provided. That is true even of simple words. Young children acquire words at a rate of about one every waking hour, with extremely limited exposure under highly ambiguous conditions. The words are understood in delicate and intricate ways that are far beyond the reach of any dictionary, and are only beginning to be investigated. When we move beyond single words, the conclusion becomes even more dramatic. Language acquisition seems much like the growth of organs generally; it is something that happens to a child, not that the child does. And while the environment plainly matters, the general course of development and the basic features of what emerges are predetermined by the initial state. But the initial state is a common human possession. It must be, then, that in their essential properties, languages are cast in the same mold. The Martian scientist might reasonably conclude that there is a single human language, with differences only at the margins.
For our lives, the slight differences are what matter, not the overwhelming similarities, which we unconsciously take for granted. No doubt frogs look at other frogs the same way. But if we want to understand what kind of creature we are, we have to adopt a very different point of view, basically that of the Martian studying humans.
That is, in fact, the point of view we adopt when we study other organisms, or even humans apart from their mental aspects - humans "below the neck", metaphorically speaking. There is every reason to study what is above the neck in the same manner.
As languages were more carefully investigated from the point of view of generative grammar, it became clear that their diversity had been underestimated as radically as their complexity. At the same time, we know that the diversity and complexity can be no more than superficial appearance.
The conclusions are paradoxical, but undeniable. They pose in a stark form what has become the central problem of the modern study of language: How can we show that all languages are variations on a single theme, while at the same time recording faithfully their intricate properties of sound and meaning, superficially diverse? A genuine theory of human language has to satisfy two conditions: "descriptive adequacy" and "explanatory adequacy." The condition of descriptive adequacy holds for a grammar of a particular language. The grammar satisfies the condition insofar as it gives a full and accurate account of the properties of the language, of what the speaker of the language knows. The condition of explanatory adequacy holds for the general theory of language, universal grammar. To satisfy the condition, universal grammar must show that each particular language is a specific instantiation of the uniform initial state, derived from it under the "boundary conditions" set by experience. We would then have an explanation of the properties of languages at a deeper level. To the extent that universal grammar satisfies the condition of explanatory adequacy, it offers a solution to what is sometimes called "the logical problem of language acquisition." It shows how that problem can be solved in principle, and thus provides a framework for the study of how the process actually takes place.
There is a serious tension between these two research tasks. The search for descriptive adequacy seems to lead to ever greater complexity and variety of rule systems, while the search for explanatory adequacy requires that language structure must be largely invariant. It is this tension that has largely set the guidelines for research. The natural way to resolve the tension is to challenge the traditional assumption, carried over to early generative grammar, that a language is a complex system of rules, each specific to particular languages and particular grammatical constructions: rules for forming relative clauses in Hindi, verb phrases in Bantu, passives in Japanese, and so on. Considerations of explanatory adequacy indicate that this cannot be correct.
The problem was addressed by attempts to find general properties of rule systems that can be attributed to the faculty of language itself, in the hope that the residue would prove to be simpler and more uniform.
About 15 years ago, these efforts crystallized in an approach to language that was a much more radical departure from the tradition than earlier generative grammar had been. This "Principles and Parameters" approach, as it has been called, rejected the concept of rule and grammatical construction entirely; there are no rules for forming relative clauses in Hindi, verb phrases in Bantu, passives in Japanese, and so on. The familiar grammatical constructions are taken to be taxonomic artifacts, useful for informal description perhaps but with no theoretical standing. They have something like the status of "terrestrial mammal" or "household pet." And the rules are decomposed into general principles of the faculty of language, which interact to yield the properties of expressions. We can think of the initial state of the faculty of language as a fixed network connected to a switch box; the network is constituted of the principles of language, while the switches are the options to be determined by experience. When the switches are set one way, we have Bantu; when they are set another way, we have Japanese. Each possible human language is identified as a particular setting of the switches - a setting of parameters, in technical terminology. If the research program succeeds, we should be able literally to deduce Bantu from one choice of settings, Japanese from another, and so on through the languages that humans can acquire. The empirical conditions of language acquisition require that the switches can be set on the basis of the very limited information that is available to the child. Notice that small changes in switch settings can lead to great apparent variety in output, as the effects proliferate through the system. These are the general properties of language that any genuine theory must capture somehow.
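The switch-box picture can be sketched in miniature (an invented toy, not part of the theory itself): one invariant principle, that a phrase combines a head with its complement, and a single binary parameter fixing their linear order, along the lines of the head-directionality parameter of the Principles and Parameters literature.

```python
# Illustrative sketch: a fixed principle plus one parameter "switch".
# The principle (combine head and complement) is invariant; the
# parameter setting alone decides the surface order, yielding
# English-like or Japanese-like phrases from the same network.

def phrase(head, complement, head_initial):
    """Invariant principle: a phrase is a head plus its complement.
    The head_initial parameter determines only the linear order."""
    return f"{head} {complement}" if head_initial else f"{complement} {head}"

# English-like setting: the verb precedes its object.
print(phrase("read", "the book", head_initial=True))   # read the book

# Japanese-like setting: the verb follows its object.
print(phrase("yonda", "hon-o", head_initial=False))    # hon-o yonda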
This is, of course, a program, far from a finished product. The conclusions tentatively reached are unlikely to stand in their present form; and, needless to say, one can have no certainty that the whole approach is on the right track. As a research program, however, it has been highly successful, leading to a real explosion of empirical inquiry into languages of a very broad typological range, to new questions that could never even have been formulated before, and to many intriguing answers. Questions of acquisition, processing, pathology, and others also took new forms, which have proven very productive as well. Furthermore, whatever its fate, the program suggests how the theory of language might satisfy the conflicting conditions of descriptive and explanatory adequacy. It gives at least an outline of a genuine theory of language, really for the first time.
Within this research program, the main task is to discover the principles and parameters. While a great deal remains obscure, there has been enough progress to consider some new and more far-reaching questions about the design of language. In particular, we can ask how good is the design. How close does language come to what some super-engineer would construct, given the conditions that the language faculty must satisfy? How "perfect" is language, to put it picturesquely?
This question carries us right to the borders of current inquiry, which has given some reason to believe that the answer is: "surprisingly perfect" - surprising, for several reasons to which I'll return. At this point it is hard to proceed without more technical apparatus. I will put that off until tomorrow, and turn now to some other topics of a more general nature, having to do with the ways the internalist study of language relates to the external world.
These questions fall into two categories: First, relations of mind and brain; second, questions of language use. Let's begin with the first.
The internalist study of language tries to discover the properties of the initial state of the faculty of language, and the states it assumes under the influence of experience. The initial and attained states are states of the brain primarily, but described abstractly, not in terms of cells but in terms of properties that the brain mechanisms must somehow satisfy.
It is commonly held that this picture is misguided in principle. The basic criticism has been presented most clearly by philosopher John Searle: The faculty of language is indeed "innate in human brains", he writes, but the evidence that has been used to attribute properties and principles to this innate faculty "is much more simply accounted for by the... hypothesis" that there is "a hardware level of explanation in terms of the structure of the device."
Exactly what is at stake?
The existence of the hardware level is not in question, if by that we mean that cells are involved in "the structure of the device" that is "innate in human brains." But it remains to discover the structure of the device, its properties and principles. The only question has to do with the status of the theory that expresses these properties. Searle says there would be "no further predictive or explanatory power by saying that there is a level of deep unconscious" principles of the faculty of language. That is quite true. Similarly, chemistry is uninteresting if it says only that there are deep structural properties of matter. But chemistry is not uninteresting at all if it puts forth theories about these properties, and the same is true of the study of language. And in both cases, one takes the entities and principles postulated to be real, because we have no other concept of reality. There is no issue, simply a serious confusion that is pervasive in discussion of mental aspects of the world.
An analogy to chemistry is instructive. Throughout its modern history, chemistry has tried to discover properties of complex objects in the world, offering an account in terms of chemical elements of the kind postulated by Lavoisier, atoms and molecules, valence, structural formulas for organic compounds, laws governing the combination of these objects, and so on. The entities and principles postulated were abstract, in the sense that there was no way to account for them in terms of known physical mechanisms. There was much debate over the centuries about the status of these hypothetical constructs: Are they real? Are they just calculating devices? Can they be reduced to physics? The debate continued until early in this century. It is now understood to have been completely senseless. It turned out that in fact, chemistry was not reducible to physics, because the assumptions of basic physics were wrong. With the quantum revolution, it was possible to proceed to unification of chemistry and physics, about 60 years ago. Now chemistry is considered to be part of physics, though it was not reduced to physics.
It would have been irrational to have claimed for centuries that chemistry is mistaken because its principles are "much more simply accounted for by a hardware level of explanation in terms of the entities and principles postulated by physicists"; and as we now know, the claim was not only irrational but false. For the same reason, it would be irrational to hold that a theory of language can be dispensed with in favor of an account in terms of atoms or neurons, even if there were much to say at this level. In fact, there is not, which should come as no surprise.
For the brain sciences, the abstract study of states of the brain provides guidelines for inquiry: they seek to discover what kinds of mechanisms might have these properties. The mechanisms might turn out to be quite different from anything contemplated today, as has been the case throughout the history of science. We do not advance the brain sciences by a proposal to stop trying to find the properties of states of the brain, or by assuming, dogmatically, that the little bit that is now known about the brain must provide the answers, or by saying that we can look for the properties, but we should not go on to attribute them to the brain and its states - "deep unconscious rules", if that is what the best theory concludes.
In the background lies what seems to be a deeper problem: the problem of dualism, of mind and body. The abstract study of language seems to fall on the mental side of the divide, hence to be highly problematic. It calls into question the "basic materialist premise" that "All reality is physical", to quote a recent study of "mental reality" by Galen Strawson, the most sophisticated and valuable account I know of the problem of materialism, which is widely held to be fundamental to contemporary thought.
Strawson points out that the problem "came to seem acute" in the 16th-17th centuries with the rise of "a scientific conception of the physical as nothing more than particles in motion." That is true, but the way this conception was formed raises some questions about the materialist premise and the quest for a "clear line between the mental and the nonmental" that Strawson and others consider critical for the philosophy of mind.
The "scientific conception" took shape as "the mechanical philosophy", based on the principle that matter is inert and interactions are through contact, with no "occult qualities" of the kind postulated by Scholastic doctrine. These were dismissed as "so great an Absurdity that I believe no Man who has in philosophical matters a competent Faculty of thinking, can ever fall into it." The words are Newton's, but they refer not to the occult qualities of Scholasticism that were in such disrepute, but to his own startling conclusion that gravity, though no less mystical, "does really exist." Historians of science point out that "Newton had no physical explanation of gravity at all", a deep problem for him and eminent contemporaries who correctly "accused him of reintroducing occult qualities", with no "physical, material substrate" that "human beings can understand." To the end of his life, Newton sought to escape the absurdity, as did Euler, D'Alembert, and many since, but in vain. Nothing has weakened the force of David Hume's judgment that by refuting the self-evident mechanical philosophy, Newton "restored (Nature's) ultimate secrets to that obscurity in which they ever did and ever will remain."
It is true that the "scientific conception of the physical" has incorporated "particles in motion", but without "human understanding" in the sense of the earlier enterprise; rather, with resort to Newtonian "absurdities" and worse, leaving us "ignorant of the nature of the physical in some fundamental way." I am quoting Strawson's reference to the core problems of mind, but they are not alone in this regard. The properties of particles in motion also surpass human understanding, although we "accustomed ourselves to the abstract notion of forces, or rather to a notion hovering in a mystic obscurity between abstraction and concrete comprehension", Friedrich Lange points out in his classic scholarly study of materialism, discussing this "turning point" in its history, which deprives the doctrine of much significance. The sciences came to accept the conclusion that "a purely materialistic or mechanistic physics" is "impossible" (Alexandre Koyré). From hard science to soft, inquiry can do no more than to seek the best theoretical account, hoping for unification if possible, though how, no one can tell in advance.
In terms of the mechanical philosophy, Descartes had been able to pose a fairly intelligible version of the mind-body problem, the problem of "the ghost in the machine", as it is sometimes called. But Newton showed that the machine does not exist, though he left the ghost intact. With Newton's demonstration that there are no bodies in anything like the sense assumed, the existing version of the mind-body problem collapses; or any other, until some new notion of body is proposed. But the sciences offer none: there is a world, with whatever strange properties it has, including its optical, chemical, organic, mental, and other aspects, which we try to discover. All are part of nature.
That seems to have been Newton's view. To his last days, he sought some "subtle spirit" that would account for a broad range of phenomena that appeared to be beyond explanation in terms truly comprehensible to humans, including interaction of bodies, electrical attraction and repulsion, light, sensation, and the way "members of animal bodies move at the command of the will." Chemist Joseph Black recommended that "chemical affinity be received as a first principle, which we cannot explain any more than Newton could explain gravitation, and let us defer accounting for the laws of affinity, till we have established such a body of doctrine as Newton has established concerning the laws of gravitation." Chemistry proceeded to establish a rich body of doctrine, achieving its "triumphs... in isolation from the newly emerging science of physics", a leading historian of chemistry points out. As I mentioned, unification was finally achieved, quite recently, though not by reduction.
Apart from its theological framework, there has been, since Newton, no reasonable alternative to John Locke's suggestion that God might have chosen to "superadd to matter a faculty of thinking" just as he "annexed effects to motion, which we can in no way conceive motion able to produce." As the 18th-century chemist Joseph Priestley later elaborated, we must regard the properties "termed mental" as the result of "such an organical structure as that of the brain", superadded to others, none of which need be comprehensible in the sense sought by earlier science. That includes the study of language, which tries to develop bodies of doctrine with constructs and principles that can properly be "termed mental", and assumed to be "the result of organical structure" - how, it remains to discover. The approach is "mentalistic", but in what should be an uncontroversial sense. It undertakes to study a real object in the natural world - the brain, its states and functions - and thus to move the study of the mind towards eventual integration with the biological sciences.
It might be mentioned that such problems are mostly unsolved even for much simpler systems where direct experiment is possible. One of the best studied cases is the nematode, a tiny worm with a three-day maturation period and a completely analyzed wiring diagram.
It is only very recently that some understanding has been gained of the neural basis of their behavior, and that remains limited and controversial.
Another question of the same category has to do with the way the genes express the properties of the initial state. That too is a very hard problem, barely understood even in far simpler cases. The "epigenetic laws" that transform genes into developed organisms are mostly unknown, a large gap in evolutionary theory as scientists have often pointed out, because the theory requires an understanding of genotype-phenotype correspondence, of the range of organisms that can develop from some complex of genes. I mention these facts only as a word of caution about strange conclusions that have been expressed, often with great passion again, about observations on the biological isolation of language and the richness of the initial state. There is much more to say about this topic, a very lively one today, but I will put it aside and turn to the second category of questions about how language engages the world: questions of language use.
For simplicity, let's keep to simple words. Suppose that "book" is a word in Peter's lexicon. The word is a complex of properties: in technical usage, phonetic and semantic features. The sensorimotor systems use the phonetic properties for articulation and perception, relating them to external events: motions of molecules, for example. Other systems of mind use the semantic properties of the word when Peter talks about the world and interprets what others say about it.
There is no far-reaching controversy about how to proceed on the sound side, but on the meaning side there are profound disagreements. Empirically-oriented studies seem to me to approach problems of meaning rather in the way they study sound. They try to find the phonetic properties of the word "book" that are used by articulatory and perceptual systems. And similarly, they try to find the semantic properties of the word "book" that are used by other systems of the mind/brain: that it is nominal not verbal, used to refer to an artifact not a substance like water or an abstraction like health, and so on. One might ask whether these properties are part of the meaning of the word "book" or of the concept associated with the word; it is not clear how to distinguish these proposals, but perhaps an empirical issue can be unearthed. Either way, some features of the lexical item "book" that are internal to it determine modes of interpretation of the kind just mentioned.
Investigating language use, we find that words are interpreted in terms of such factors as material constitution, design, intended and characteristic use, institutional role, and so on. The notions can be traced to Aristotelian origins, as the philosopher Julius Moravcsik has pointed out in very interesting work. Things are identified and assigned to categories in terms of such properties, which I am taking to be semantic features, on a par with the phonetic features that determine a word's sound. The use of language can attend in various ways to these semantic features.
Suppose the library has two copies of Tolstoy's WAR AND PEACE, Peter takes out one, and John the other. Did Peter and John take out the same book, or different books? If we attend to the material factor of the lexical item, they took out different books; if we focus on its abstract component, they took out the same book. We can attend to both material and abstract factors simultaneously, as when we say that his book is in every store in the country, or that the book he is planning will weigh at least five pounds if he ever writes it. Similarly, we can paint the door white and walk through it, using the pronoun "it" to refer ambiguously to figure and ground. We can report that the bank was blown up after it raised the interest rate, or that it raised the rate to keep from being blown up. Here the pronoun "it", and the "empty category" that is the subject of "being blown up", simultaneously adopt both the material and institutional factors.
The same is true if my house is destroyed and I re-build it, perhaps somewhere else; it is not the same house, even if I use the same materials, though I re-built it. The referential terms "re-" and "it" cross the boundary. Cities are still different. London could be destroyed by fire and IT could be rebuilt somewhere else, from completely different materials and looking quite different, but still be London. Carthage could be rebuilt today, and still be Carthage.
Consider the city that is regarded as holy by the faiths that trace to the Old Testament. The Islamic world calls it "Al-Quds", Israel uses a different name, as does the Christian world: in English, it is pronounced "Jerusalem." There is a good deal of conflict over this city. The NEW YORK TIMES has just offered what it calls a "promising solution." Israel should keep all of Jerusalem, but "Al-Quds" should be rebuilt outside the current boundaries of Jerusalem. The proposal is perfectly intelligible - which is why it arouses considerable outrage outside circles in which the doctrine of the powerful reigns unchallenged. And the plan could be implemented. What is the city to which we will then refer when we say that IT was left where it was while IT was moved somewhere else?
The meanings of words have other curious properties. Thus if I tell you that I painted my house brown, I mean you to understand that I placed the paint on the exterior surface, not the interior surface. If I want you to know that it was the interior surface, I have to say that I painted my house brown on the inside. In technical terminology, there is a marked and unmarked usage; without specific indications, we give the words their unmarked interpretation. These are properties of houses, not just of the word "paint." Thus if I see the house, I see its exterior surface, though if I am sitting inside I can see the interior walls. Although the unmarked interpretation selects the exterior surface, I surely do not regard the house as just a surface. If you and I are outside the house, you can be nearer to it than I am; but if we are both in the house, that cannot be the case, even if you are closer to the surface. Neither of us is near the house. So we regard the house as an exterior surface, but with an interior as well. If I decide to use my house to store my car, living somewhere else, it is no longer a house at all, rather a garage, though the material constitution hasn't changed. Such properties hold quite generally, even for invented objects, even impossible ones. If I paint my spherical cube brown, I painted the exterior surface brown.
Such properties are not limited to artifacts. We call England an island, but if the sea-level dropped enough, it would be a mountain, by virtue of the faculties of the mind. The prototypical simple substance is water. But even here, immaterial factors enter into individuation. Suppose a cup is filled with pure H2O and I dip a tea bag into it. It is then tea, not water. Suppose a second cup is filled from a river.
It could be chemically identical with the contents of the first cup - perhaps a ship dumped tea bags in the river. But it is water, not tea, and that is what I would call it even if I knew all the facts. What people call "water" is correlated with H2O content, but only weakly, experimental studies have shown. Doubtless in this extreme case, constitution is a major factor in deciding whether something is water, but even here, not the only one. As I mentioned, the observations extend to the simplest referential and referentially dependent elements; and to proper names, which have rich semantic-conceptual properties. Something is named as a person, a river, a city, with the complexity of understanding that goes along with these categories. Language has no logically proper names, stripped of such properties, as the Oxford philosopher Peter Strawson pointed out many years ago.
The facts about such matters are often clear, but not trivial. Such properties can be investigated in many ways: language acquisition, generality among languages, invented forms, etc. What we discover is surprisingly intricate; and not surprisingly, largely known in advance of any evidence, hence shared among languages. There is no a priori reason to expect that human language will have such properties; Martian could be different. The symbolic systems of science and mathematics surely are.
It is sometimes suggested that these are just things we know from experience with books, cities, houses, people, and so on. That is in part correct, but begs the question. We know all of this about parts of our experience that we construe as books, or cities, and so on, by virtue of the design of our languages and mental organization. To borrow the terminology of the cognitive revolution of the 17th century, what the senses convey gives the mind "an occasion to exercise its own activity" to construct "intelligible ideas and conceptions of things from within itself" as "rules", "patterns", "exemplars" and "anticipations" that yield Gestalt properties and others, and "one comprehensive idea of the whole." There is good reason to adopt Hume's principle that the "identity which we ascribe" to things is "only a fictitious one", established by the human understanding, a picture developed further by Kant, Schopenhauer, and others. People think and talk about the world in terms of the perspectives made available by the resources of the mind, including the meanings of the terms in which their thoughts are expressed. The comparison to phonetic interpretation is not unreasonable.
Much of contemporary philosophy of language and mind follows a different course. It asks to what a word refers, giving various answers. But the question has no clear meaning. It makes little sense to ask to what thing the expression "Tolstoy's WAR AND PEACE" refers. The answer depends on how the semantic features are used when we think and talk, one way or another. In general, a word, even of the simplest kind, does not pick out an entity of the world, or of our "belief space" - which is not to deny, of course, that there are books and banks, or that we are talking about something if we discuss the fate of the earth and conclude that IT is grim. But we should follow the good advice of the 18th century philosopher Thomas Reid and his modern successors Wittgenstein and others, and not draw unwarranted conclusions from common usage.
We can, if we like, say that the word "book" refers to books, "sky" to the sky, "health" to health, and so on. Such conventions basically express lack of interest in the semantic properties of words and how they are used to talk about things. We could avoid the issues of acoustic and articulatory phonetics the same way. To say this is not to criticize the decision; any inquiry focuses on certain questions and ignores others. There has been a great deal of exciting work on the aspects of language that relate to phonetic interpretation and to semantic interpretation, but it should properly be called syntax, in my opinion, a study of the operations of the faculty of language, part of the mind. The ways language is used to engage the world lie beyond.
In this connection, let us return to my comment that generative grammar has sought to address concerns that animated the tradition, in particular, the Cartesian idea that "the true distinction" between humans and other creatures or machines is the ability to act in the manner they took to be most clearly illustrated in the ordinary use of language: without finite limits, influenced but not determined by internal state, appropriate to situations but not caused by them, coherent and evoking thoughts that the hearer might have expressed, and so on. That is only partly correct. The goal of the work I have been discussing is to unearth some of the factors that enter into such normal practice. Only SOME of these, however.
Generative grammar seeks to discover the mechanisms that are used, thus contributing to the study of HOW they are used in the creative fashion of normal life. How they are used is the problem that intrigued the Cartesians, and it remains as mysterious to us as it was to them, even though far more is understood today about the mechanisms that are involved.
In this respect, the study of language is again much like that of other organs. Study of the visual and motor systems has uncovered mechanisms by which the brain interprets scattered stimuli as a cube and by which the arm reaches for a book on the table. But these branches of science do not raise the question of how people decide to do such things, and speculations about the use of the visual or motor systems, or others, amount to very little. It is these capacities, manifested most strikingly in language use, that are at the heart of traditional concerns: for Descartes, they are "the noblest thing we can have" and all that "truly belongs" to us. Half a century before Descartes, the Spanish philosopher-physician Juan Huarte observed that this "generative faculty" of ordinary human understanding and action, though foreign to "beasts and plants", is only a lower form of understanding. It falls short of true exercise of the creative imagination. Even the lower form lies beyond our theoretical reach, apart from the study of mechanisms that enter into it.
In a number of areas, language included, a lot has been learned in recent years about these mechanisms. The problems that can now be faced are hard and challenging, but many mysteries still lie beyond the reach of the form of human inquiry we call "science", a conclusion that we should not find surprising if we consider humans to be part of the organic world, and perhaps one we should not find distressing either.
Editorial note. Words are CAPITALIZED when I mean them to be underlined, to indicate italics. Also, I use ' for (French) acute accent, so the name "Koyre'" should have "e" with acute accent.