Nonverbal World

I am not yet so lost in lexicography as to forget that words are the daughters of the earth. . . . --Samuel Johnson (Dictionary)

Some natural tears they dropped, but wiped them soon; The world was all before them, where to choose Their place of rest, and Providence their guide. They, hand in hand, with wandering steps and slow, Through Eden took their solitary way. --John Milton (Paradise Lost, Book XII; 1667)

Concept. 1. A domain of ancient social, emotional, and cognitive signs, established millions of years before the advent of speech. 2. A usually hidden, sensory dimension apart from that which is defined by words. 3. An often unconscious medium, between reflex and reason, governed by the oldest parts of our vertebrate brain (see NONVERBAL BRAIN).

Good place. Nonverbal World is a landscape without language, billboards, or signposts, a realm without writing or symbols of any kind. It is a place where information consists of colors, shapes, aromas, and natural sounds--untouched by narration. This is the unspoken world we seek on mountaintops and island retreats, i.e., the good place apart from words.

Usage: We reside in a world of words, but still make many of our most important decisions about life and living as if we had never left Nonverbal World: 1. We do not need words, e.g., to define a kiss, decode an Armani suit, or decipher "new car smell"; these depend on ancient signals from the wordless past. 2. Even technical knowledge is transmitted through nonverbal apprenticeships, in which we watch and do rather than read a manual. 3. We choose our vehicles, homes, and mates, e.g., on nonverbal grounds, and select wardrobes based on clothing's look and feel. 4. Many scientists (the most notable being Albert Einstein) think in visual, spatial, and physical images rather than in mathematical terms and words. (N.B.: That the theoretical physicist Stephen Hawking used an arboreal term to picture the cosmos [i.e., affirming that the universe "could have different branches"] is a tribute to his [very visual] primate brain.)

Literature. "He went from the fields into a thick woods, as if resolved to bury himself. He wished to get out of hearing of the crackling shots which were to him like voices." --Stephen Crane (The Red Badge of Courage)

Origin. Nonverbal World originated ca. 3.5 billion years ago with the earliest known life forms, blue-green algae (i.e., cyanobacteria), living in shallow-water communities known as stromatolites. Voiceless, eyeless, unable to touch or hear, the first residents of Nonverbal World communicated chemically, through the medium of the molecule (see AROMA CUE).

Present day. Nonverbal World is the hidden place off the written transcript, where meaning lies not in vocabulary but in unspoken signals and cues. As anthropologists explore alien cultures and archaeologists unearth the past, we may seek our roots in a paleontology of gesture. Through spinal cord paleocircuits and cranial nerves, gestures recite an ancient wisdom which languages and literature fumble to explain today.

Observations. 1. To see Nonverbal World on TV, mute the sound (gestures and body movements become clearer). 2. To hear emotion on the phone, listen with your left ear (the right brain responds to feelings and moods). 3. To feel the smoothness of silk, flannel, and flesh, touch with your left hand (the right sensory strip is more emotional than the left [in right-handed people; the reverse is (partly) true in lefties]).

Evolution I. For ca. one-half-billion years, our vertebrate ancestors defined reality without uttering a phrase. The early residents of Nonverbal World dealt with each other and with great issues of the day apart from linguistic concepts or names. Though speechless, Nonverbal World was filled with whispering winds and flowing waters, rhetorical thunder, and the calls of wild things. It bustled with movement, percolated with aromas, and bristled with feathers and fur. Constant comment was heard eons before words arrived.

Evolution II. Late in Nonverbal World's prehistory, the first words were spoken, marking the birth of a new conceptual order based on language. Spoken language emerged ca. 200,000 years ago as the dominant verbal medium of our species, Homo sapiens. But a price was paid for speaking, as words and the knowledge for which they stand estranged human beings from Nonverbal World. As ever larger areas of our brain specialized for speaking and listening (see HUMAN BRAIN), attention shifted away from the sensory reality our ancestors knew to a separate reality based on speech.

Evolution III. In our mind's eye, words have more meaning than what they name. Indeed, it may not be an exaggeration to say that language has taken over our conscious brain. For not only does talk stimulate the brain's largest speech areas--Broca's and Wernicke's--it excites other regions of the neocortex (e.g., "wide areas in the frontal and parietal lobes" [Eccles 1989:89]), as well, and the brain stem (with its incredible tangle of cranial nerves). Thus, hearing, saying, or seeing a word dominates attention by neurologically engulfing our mind.

Primatology. "With regard to the vocalizations of these animals [wild baboons], it is notable that many hours of the day are spent in almost complete silence" (Hall and DeVore 1972:158).

Space. Nearing completion of their five-month mission in orbit (from March to August 2001), international-space-station residents Yuri Usachev and Jim Voss "are yearning for the smells and sounds of nature" (Anonymous 2001J).

Neuro-notes. Nonverbal World gradually came to be known as nerves evolved to grasp its features. The oldest chemical and tactile senses enabled early creatures to know the landscape--and to smell, feel, and "taste" one another's presence in Nonverbal World. (N.B.: A great deal of our nonverbal communication--from the colognes we buy to our footwear--is still about presence today.)

Copyright © 1998 - 2001 (David B. Givens/Center for Nonverbal Studies)



Out of the abundance of the heart the mouth speaketh. --Matthew, XII, 34

Talk on, my son; say anything that comes to your mind or to the tip of your tongue . . . --Miguel de Cervantes (Don Quixote, 1605:695)

Nixon: "But they were told to uh"

Haldeman: "uh and refused uh"

Nixon: [Expletive deleted.] --Excerpt from the Nixon Tape Transcripts (Lardner 1997)

Spoken language. 1. A verbal and vocal means of communicating emotions, perceptions, and thoughts by the articulation of words. 2. The organization of systems of sound into language, which has enabled Homo sapiens a. to transcend the limits of individual memory, and b. to store vast amounts of information.

Usage I: Speech (and manual sign language, e.g., ASL) has become the indispensable means for sharing ideas, observations, and feelings, and for conversing about the past and future. Speech so engages the brain in self-conscious deliberation, however, that we often overlook our place in Nonverbal World (see below, Neuro-notes V).

Usage II: "Earth's inhabitants speak some 6,000 different languages" (Raloff 1995).

Anatomy. To speak we produce complex sequences of body movements and articulations, not unlike the motions of gesture. Evolutionarily recent speech-production areas of the neocortex, basal ganglia, and cerebellum enable us to talk, while evolutionarily recent areas of the neocortex give heightened sensitivity a. to voice sounds (see AUDITORY CUE), and b. to positions of the fingers and hands.

Babble. 1. "Manual babbling has now been reported to occur in deaf children exposed to signed languages from birth" (Petitto and Marentette 1991:1493). 2. "Instead of babbling with their voices, deaf babies babble with their hands, repeating the same motions over and over again" (Fishman 1992:66). 3. Babies babble out of the right side of their mouths, according to a study presented at the 2001 Society for Neuroscience meeting in San Diego by University of Montreal researchers Siobhan Holowka and Laura Ann Petitto; non-speech cooing and laughter vocalizations are, on the other hand, symmetrical or emitted from the left (Travis 2001). "Past studies of adults speaking have established that people generally open the right side of the mouth more than the left side when talking, whereas nonlinguistic tasks requiring mouth opening are symmetric or left-centered" (Travis 2001:347).

Evolution I. Spoken language is considered to be between 200,000 (Lieberman 1991) and two million (Gibson 1993) years old. The likely precursor of speech is sign language (see HANDS, MIME CUE). Our ability a. to converse using manual signs and b. to manufacture artifacts (e.g., the Oldowan stone tools manufactured 2.4-to-1.5 m.y.a.) evolved in tandem on eastern Africa's savannah plains. Signing may not have evolved without artifacts, nor artifacts without signs. (N.B.: Anthropologists agree that some form of communication was needed to pass the knowledge of tool design on from one generation to the next.)

Evolution II. Handling, seeing, making, and carrying stone implements stimulated the creation of conceptual categories, available for word labels, which came in handy, e.g., for teaching the young. Through an intimate relationship with tools and artifacts, human beings became information-sharing primates of the highest order.

Evolution III. Preadaptations for vocal speech involved the human tongue. Before saying words, the tongue had been a humble manager of "food tossing." Through acrobatic maneuvers, chewed morsels were distributed to premolars and molars for finer grinding and pulping. (The trick was not getting bitten in the process.) As upright posture evolved, the throat grew in length, and the voice box was retrofitted lower in the windpipe. As a result the larynx, originally for mammalian calling, increased its vocal range as the dexterous tongue waited to speak.

Evolution IV. ". . . the earliest linguistic systems emerged out of vocalizations like those of the great apes. The earliest innovation was probably an increase in the number of distinctive calls" (Foley 1997:70; see TONE OF VOICE, Evolution).

Gestural origin. "[David B.] Givens has called our attention to matters too often ignored: the biological imperative to communicate, present along the whole evolutionary track; the persistence, out of awareness, of very ancient bodily signals and their penetration of all our social interaction; and the powerful neoteny--human gestures and sign language signs make use of some of the same actions to signal semantically related messages. These same powerful influences, it seems from the study of sign languages, are beneath and behind language as we know it today. Thus it should be easier to construct a theory of gesture turning into language, complete with duality of patterning and syntactic structures, and thence into spoken language, than to find spoken language springing full grown from a species but one step removed from the higher apes" (Stokoe 1986:180-81).

Gestures. 1. Speaking gestures aid memory and thought, research from the University of Chicago suggests. In a study of 40 children and 36 adults (published in the November 2001 issue of Psychological Science), subjects performed 20 percent better on a memory test when permitted to gesture with their hands while explaining how they had solved a math problem. Those asked to keep their hands still as they explained did not perform as well. Gesture and speech are integrally linked, according to Susan Goldin-Meadow, an author of the study. Goldin-Meadow noted that gestures make thinking easier because they enlist spatial and other nonverbal areas of the brain. 2. A growing body of evidence suggests that teaching babies ASL may improve their ability to speak. Again, this indicates a link between manual signing and vocal speech. Babies express cognitive abilities through certain hand gestures (e.g., by pointing with the index finger) earlier than they do through articulated words (the latter require more refined oral motor skills, which very young babies do not yet possess).

Law. According to the Federal Rules of Evidence (Article VIII. Hearsay), "A 'statement' is (1) an oral or written assertion or (2) nonverbal conduct of a person, if it is intended by the person as an assertion" (Rule 801. Definitions).

Media. 1. According to the CBS Evening News show (October 17, 1995), the earliest known recording of a human voice was made on a wax cylinder in 1888 by Thomas Edison. The voice says, "I'll take you around the world." 2. The world's second most-recorded human voice is that of singer Frank Sinatra; the most recorded is that of crooner Bing Crosby (Schwartz 1995).

Sex differences I. "During phonological tasks [i.e., the processing of afferent (incoming), rhyming, vocal sounds], brain activation in males is lateralized to the left inferior frontal gyrus regions; in females the pattern of activation is very different, engaging more diffuse neural systems that involve both the left and right inferior frontal gyrus" (Shaywitz et al. 1995:607).

Sex differences II: Recent finding. "Study: Women Listen More than Men [Associated Press, Copyright 2000]." Nov. 28, 2000 -- Score one for exasperated women: new research suggests men really do listen with just half their brains. "In a study of 20 men and 20 women, brain scans showed that men, when listening, mostly used the left sides of their brains, the region long associated with understanding language. Women in the study, however, used both sides. Other studies have suggested that women 'can handle listening to two conversations at once,' said Dr. Joseph T. Lurito, an assistant radiology professor at Indiana University School of Medicine. 'One of the reasons may be that they have more brain devoted to it.' Lurito's findings, presented Tuesday at the Radiological Society of North America's annual meeting, don't necessarily mean women are better listeners. It could be that 'it's harder for them,' Lurito suggested, since they apparently need to use more of their brains than men to do the same task. 'I don't want a battle of the sexes,' he said. 'I just want people to realize that men and women' may process language differently. In the study, functional magnetic resonance imaging--or fMRI--was used to measure brain activity by producing multidimensional images of blood flow to various parts of the brain. Inside an MRI scanner, study participants wore headphones and listened to taped excerpts from John Grisham's novel The Partner, while researchers watched blood-flow images of their brains, displayed on a nearby video screen. Listening resulted in increased blood flow in the left temporal lobes of the men's brains. In women, both temporal lobes showed activity" (Source: News, December 12, 2000).

Vocal recognition. In his EMOVOX project ("Voice variability related to speaker-emotional state in Automatic Speaker Verification"), Prof. Klaus Scherer (Department of Psychology, University of Geneva) and his colleagues are researching the effects of emotion on speech to improve the effectiveness of automatic speaker verification (as used, e.g., in security systems).

RESEARCH REPORTS: 1. "The general model encompassing both spoken and signed languages to be presented here assumes that the key lies in describing both with a single vocabulary, the vocabulary of neuromuscular activity--i.e. gesture" (Armstrong, Stokoe, and Wilcox 1995:6). 2. "With all due respect to my esteemed colleague [Iain Davidson], our disagreement doesn't really rest so much on whether or not I see a Broca's area on [fossil cranium] 1470, whichever Homo it turns out to be . . . . Our disagreement really stems from whether or not the manufacture of stone tools gives us any insights to previous cognitive behavioral patterns, and as I wrote back in 1969, 'Culture: A Human Domain,' in CA [Current Anthropology], I think there are more similarities than not between language behavior and stone tool making, and I haven't retreated from this position, because I haven't seen effective rebuttal, just denial" (Ralph L. Holloway, posting on Anthro-L, June 21, 1996, 4:04 PM). 3. "We tend to perceive speech sounds in terms of 'articulatory gestures,' whose boundaries and distinctions correspond to articulatory (i.e., somato-motor) features, not just sound features . . ." (Deacon 1997:359-60).

Neuro-notes I. Speaking is our most complex activity, requiring ca. 140,000 neuromuscular events per second to succeed. No animal on earth can match a human's extraordinary coordination of lips, jaws, tongue, larynx, pharynx, speech centers, basal ganglia, cerebellum, emotions, and memory, all required to utter a phrase.

Neuro-notes II. During the 1990-2000 Decade of the Brain, neuroscientists established that flaking a stone tool and uttering a word (e.g., handaxe) make use of the same--and closely related--brain areas. So nearly alike, in fact, are the neural pathways for manual dexterity and speech that a handaxe itself may be deciphered as though it were a paleolithic word or petrified phrase: because a. the word "handaxe" and b. the perception of the worked stone (for which it stands) both exist as mental concepts, the neural templates for each are linked in the brain.

Neuro-notes III. Speech rests on an incredibly simple ability to pair stored mental concepts with incoming data from the senses. Ivan Pavlov (1849-1936; the Russian physiologist who discovered the conditioned response), e.g., observed dogs in his laboratory as they paired the sound of human footsteps (incoming data) with memories of meat (stored mental concepts). Not only did the meat itself cause Pavlov's dogs to salivate, but the mental concept of meat--i.e., memories of mealtimes past--was also called up by the sound of human feet. (N.B.: Pairing one sensation with memories of another [a process known as sensitization or associative learning] is an ability given to sea slugs, as well.)

Neuro-notes IV. Tool use itself probably increased concept formation. MRI studies reveal that children who make early, skilled use of the digits of the right hand (e.g., in playing the violin) develop larger areas in the left sensory cortex devoted to fingering. Thus, Pleistocene youngsters who were precociously introduced to tool-making may have developed enhanced neural circuitry for the task.

Neuro-notes V. In an unpublished Carnegie Mellon University study, 18 volunteers were asked to do a language task and a visual task at the same time. Magnetic resonance imaging (MRI) measured the amount of brain tissue used by each task in voxels. Performed separately, the language and visual tasks each activated 37 voxels. Performed at the same time, however, the brain activated only 42 voxels rather than the expected 74. "'The brain can only be activated a limited amount and you have to decide where to use that activation,' says Marcel A. Just, PhD, from the Center for Cognitive Imaging at Carnegie Mellon. He plans a study in which subjects will be tested doing multiple tasks while in a driving simulator. One of those tasks will involve using a cell phone" (Lawrence 2001).
