48. Conscious and Unconscious Processes: A Neurophysiological and Neuropsychological Analysis
Karl H. Pribram
(This work was supported by NIMH Grant No. MH 12970-10 and Career Research Award No. MH 15214-14 to the author)
Stanford University, USA
Introduction
A patient has a tumor removed from the occipital lobe on one side of his brain. The surgery leaves him unable to report the sight of objects presented to him on the side opposite the removal, yet he can correctly point to the location of the objects and even correctly respond to differences in their shape (Weiskrantz and Warrington, 1974). Even when repeatedly told that he is responding well, he insists that he is not aware of seeing anything and only is guessing.
Another patient has the medial structures of the temporal lobes of his brain removed on both sides. He performs well on tests of immediate memory, such as recalling a telephone number just read out to him, but a few minutes later he is unable to recall not only the number but the fact that he had heard a number, or even that he had been examined. Even after twenty years of regular exposure to an examiner, the patient fails to recognize her as familiar (Scoville and Milner, 1957). Yet this same patient, when trained to respond skillfully to a complex task, or to discriminate between objects, etc., can be shown to maintain such performances over years despite his disclaimer that he was ever exposed to such a task (Sidman, Stoddard and Mohr, 1968).
Still another patient with a similar but more restricted bilateral lesion of her temporal lobe has gained over a hundred pounds of weight since surgery. She is a voracious eater, but when asked whether she is hungry or has any special appetites, she denies this even when apprehended in the midst of grabbing food from other patients (Pribram, 1965).
This is not all. A patient may have the major tracts connecting his cerebral hemispheres severed with the result that his responses to stimuli presented to him on opposite sides are treated independently of one another. His right side is unaware of what his left side is doing and vice-versa. The splitting of the brain has produced a split in awareness.
More common in the clinic are patients who are paralyzed on one side due to a lesion of the brain's motor system. But the paralysis is manifest especially when the patient attempts to follow instructions given to him or which he himself initiates. When highly motivated to perform well ingrained responses, as when a fire breaks out, or as part of a more general action, the paralysis disappears. Only intentional, volitional control is influenced by the lesion.
Observations such as these have set the problems that brain scientists need to answer. Not only do they demonstrate the intimate association that exists between brain and the human mind; they also make it necessary to take into account the dissociation between conscious awareness, feelings and intentions on the one hand and unconscious, automatic behavioral performances on the other.
Perhaps, therefore, it is not too surprising that a division in approach to the mind-brain problem has recently occurred. While philosophers and behavioral scientists have for the most part eschewed a Cartesian dualism in an attempt at rigorous operational and scientific understanding, thoughtful brain scientists have inveterately maintained that a dualism exists and must be taken into account. A brief review of my own struggles with the problem may be helpful in posing some of the issues involved.
Plans
The struggle began modestly with a recounting in the late nineteen fifties and early nineteen sixties of case histories such as those used in the Introduction to this paper. These were presented as an antidote to the radical behaviorism that then pervaded experimental psychology. The formal properties of a more encompassing view were presented in terms of a computer analogy in Plans and the Structure of Behavior under the rubric of a "Subjective Behaviorism". The analogy has since become a fruitful model or set of models known as "Cognitive Psychology" which, in contrast to radical behaviorism, has taken verbal reports of subjective conscious experience seriously into account as problem areas to be investigated and data to be utilized.
The computer has proved an excellent guide to understanding and experimental analysis. Further, it has become clear that a host of control engineering devices can serve as models for the brain scientist. Of special interest here is the distinction that can be made among such models between feedback and feedforward operations, a distinction which is critical to our understanding of the difference between automatic and voluntary control of behavior. Feedback organizations operate like thermostats, Cannon's (1927) familiar homeostatic brain processes that control the physiology of the organism. More recently it has become established that sensory processes also involve such feedback organizations (see Miller, Galanter & Pribram 1960 and Pribram 1971, Chaps. 3, 4 and 11 for review). Thus, feedback control is one fundamental of brain organization.
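The thermostat analogy can be sketched as a minimal error-correcting loop. This is only an illustrative sketch; the set point, gain, and update rule are assumptions of the example, not anything drawn from the physiology:

```python
# Minimal feedback (homeostatic) control: a thermostat-style loop.
# The controller is purely error-driven - it reacts to deviations from
# a set point, as Cannon's homeostats react to departures from
# physiological equilibrium, and acts to restore that equilibrium.

def feedback_step(state, set_point, gain=0.5):
    """Return the next state after one error-correcting step."""
    error = set_point - state      # measure the deviation
    return state + gain * error    # act so as to reduce it

state = 15.0                       # start well below the set point
for _ in range(20):
    state = feedback_step(state, set_point=20.0)
# the state converges toward the set point from any starting value
```

The essential point for the argument above is that such a loop does nothing unless perturbed: it processes error, not information.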
But another, somewhat less well understood fundamental has emerged in the analyses of brain function in the past few years. This fundamental goes by the name of feedforward or information processing (see e.g., McFarland 1971, Chap. 1). I have elsewhere (Pribram 1971b, Chap. 5; Pribram and Gill 1976, Chap. 1; Pribram 1971a) detailed my own understanding of feedforward mechanisms and their relation to feedback control. Briefly, I suggest that feedbacks are akin to the processes described in the first law of thermodynamics (the law of conservation of energy) in that they are error processing, reactive to magnitudes of change in the constraints that describe a system. They operate to restore the system to a state of equilibrium.
By contrast, feedforward organizations process "information", novelties that increase the degrees of freedom of the system. The manner by which this is accomplished is often portrayed in terms of Maxwell's demon and Szilard's solution to the problem posed by the "demon": how is energy conserved across a boundary (a system of constraints) that recognizes certain energy configurations and lets them pass while denying others (see Brillouin 1962 for review)? In such a system the energy consumed in the recognition process must be continually replenished or the "demon" in fact tends to disintegrate from the impact of random energy. Feedforward operations are thus akin to processes described by the second law of thermodynamics, which deals with the amount of organization of energy, not its conservation. Information has often been called neg-entropy (see e.g., Brillouin 1962), entropy being the measure of the amount of disorganization or randomness in a system. In the section on Consciousness and Volition we will return to these concepts and apply them to the issues at hand.
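For reference, Shannon's measure makes the neg-entropy notion precise (the formulas are an addition here; the text does not write them out):

```latex
H(X) = -\sum_i p_i \log_2 p_i
\qquad\text{and}\qquad
N = H_{\max} - H(X),
```

where $H$ measures the randomness (disorganization) of the system over the probabilities $p_i$ of its configurations, and $N$, the shortfall from maximal entropy, is the neg-entropy, i.e., the information embodied in the system's organization.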
Of especial interest is the fact that Freud (1895) anticipated this distinction between feedback and feedforward in his delineation of primary and secondary processes (Pribram and Gill 1976). Freud distinguished three types of neural mechanisms that constitute primary processes. One is muscular discharge; a second is discharge into the blood stream of chemical substances; and a third is discharge of a neuron onto its neighbors. All three of these neural mechanisms entail potential or actual feedback. Muscular discharge elicits a reaction from the environment and a sensory report of the discharge (kinesthetic) to the brain. The neurochemical discharge results, by way of stimulation of other body chemicals to which the brain is sensitive, in a positive feedback which Freud labels "the generation of unpleasure". (This is the origin of the unpleasure - later the pleasure - principle). Discharge of a neuron onto its neighbors is the basis of associative processes that lead to a reciprocal increase in neural excitation (cathexis) between neurons (a feedback) which is the basis for facilitation (a lowering of resistance) of their synapses (learning).
By contrast, secondary or cognitive processes are based on a host of complex neural mechanisms (e.g., defense, attention) that delay discharge through inhibition. These convert wishes (the sum of facilitations) to willed acts by allowing attention (a double feedback that matches-a comparison process - the wish to external input) to operate (reality testing).
For Freud and nineteenth century Viennese neurology in general, consciousness was a function of the cerebral cortex. Thus the greater portion of the behavior regulated by the brain - behavior of which we are not aware - is automatic and unconscious.
Images
As indicated by the case histories described in the introduction, today's neuroscientist shares with nineteenth century neurology the necessity to understand the special role of the brain cortex in the constructions that constitute consciousness. Freud tackled that problem by distinguishing the "qualitative imaging" properties of sensations from the more "quantitative" properties of association, memory and motivation. The distinction remains a valid one today: how then are "images" constructed by the brain cortex?
Images are produced by a brain mechanism characterized by a precisely arranged anatomical array which maintains a topographic isomorphism between receptor and cortex but which can be seriously damaged or destroyed (up to 90%) without impairing the capacity of the remainder to function in lieu of the whole. These characteristics led me to suggest in the mid-sixties (Pribram, 1966) that in addition to the digital computer, brain models need to take into account the type of processing performed by optical systems. Such optical information processing is called holography, and holograms display exactly the same sort of imaging properties observed for brain: i.e., a precisely aligned mechanism that distributes information. In the brain the anatomical array serves the function of paths of light in optical systems and horizontal networks of lateral inhibition perpendicular to the array serve the functions of lenses (Pribram, 1971; Pribram, Nuwer and Baron, 1974).
I have proposed a specific brain mechanism to be responsible for the organization of neural holograms (Pribram 1971, Chap. 1). This mechanism involves the slow graded potential changes that occur at junctions between neurons and in their dendrites. Inhibitory interactions (by hyperpolarizations) in horizontal networks of neurons that do not generate any nerve impulses are the critical elements. Such inhibitory networks are coming more and more into the focus of investigation in the neurosciences. For instance, in the retina they are responsible for the organization of visual processes - in fact, nerve impulses do not occur at all in the initial stages of retinal processing (for review see Pribram 1971, Chaps. 1 and 3). The proposal that image construction (a mental process) in man takes place by means of a neural holographic mechanism is thus spelled out in considerable detail and departs from classical neurophysiology only in its emphasis on the importance of computations achieved by the reciprocal influences among slow, graded local potentials which are well established neurophysiological entities. No new principles of mind-brain interaction need be considered.
For the mind-brain issue, the holographic model is also of special interest because the image which results from the holographic process is located separately from the hologram that produces it. We need therefore to be less puzzled by the fact that our own images are not referred to eye or brain but are projected into space beyond. Von Bekesy (1967) has performed an elegant series of experiments that detail the process (lateral inhibition - the analogue of lenses in optical systems - as noted above) by which such projection comes about. Essentially the process is similar to that which characterizes the placement of auditory images between two speakers in a stereophonic music system. From this fact, it can be seen how absurd it is to ask questions concerning the "locus" of consciousness. The mechanism is obviously in the brain - yet subjective experience is not of this brain mechanism per se but of the resultant of its function. One would no more find "consciousness" by dissecting the brain than one would find "gravity" by digging into the earth. Let us therefore look at the brain processes that make consciousness possible, the control programs that organize the distributed holographic process into one or another image. Ordinarily we speak of such control operations as governing "attention".
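The distributed character of holographic storage - the property that destroying most of the store degrades the whole image rather than deleting part of it - can be illustrated with a Fourier-transform toy model. The Fourier analogy, the array size, and the 10% survival fraction are assumptions of this sketch, not a model of actual cortical processing:

```python
import numpy as np

# Toy Fourier "hologram": the transform distributes every point of the
# image across the whole coefficient array, so deleting 90% of the
# store degrades the reconstruction everywhere rather than cutting
# out a piece of the image.

rng = np.random.default_rng(0)
image = rng.random((32, 32))              # stand-in for a sensory image
hologram = np.fft.fft2(image)             # distributed representation

keep = rng.random(hologram.shape) < 0.10  # "lesion": keep a random 10%
reconstruction = np.fft.ifft2(hologram * keep).real

# The whole image survives in degraded form: the reconstruction still
# correlates positively with the original, everywhere at once.
corr = np.corrcoef(image.ravel(), reconstruction.ravel())[0, 1]
```

Running the sketch yields a modest but clearly positive correlation between original and reconstruction, whereas masking 90% of the image array itself would simply delete most of the image.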
The digital computer and optical hologram thus provide models of mechanism which when tested against the actual functions of the primate brain go a long way toward explaining how human voluntary and imaging capabilities can become differentiated from unconscious processes by man's brain.
Consciousness and Attention
Just as did Freud, William James (1901) emphasized that most of the issues involved in delineating "consciousness" from unconscious processes devolve on the mechanism of attention. James, however, took the problem one step further by pointing out that attention sets the limits in capacity of the organism to process information from the external or internal environments. Gilbert Ryle (1949) has reminded us that in fact the term "mind" is derived from "minding", i.e., attending. Viewed from this vantage consciousness is a state that results from attentive processes - consciousness ceases to be cause but rather is itself caused. Two separate issues can therefore be discerned in relating consciousness to brain: description of the attentional processes, the control operations that determine consciousness, and description of the brain state(s) coordinate with consciousness. These two issues are, of course, the same as those delineated in the previous sections: the brain mechanisms responsible for the programming of psychological processes and behavior, and those involved in image construction. Let us turn once more, therefore, to the programming, the control operations performed by the brain that allocate attention and thus differentiate conscious from unconscious processes.
Over a decade and a half my laboratory (as well as those of many others) has been investigating the neural mechanisms involved in the control of attention. A comprehensive review of these data (Pribram and McGuinness, 1975) discerned three such mechanisms: one deals with short phasic response to an input (arousal); a second relates to prolonged tonic readiness of the organism to respond selectively (activation); and a third acts to coordinate the phasic (arousal) and tonic (activation) mechanisms. Separate neural and neurochemical (Pribram in press b) systems are involved in the phasic (arousal) and tonic (activation) mechanisms: the phasic centers on the amygdala, the tonic on the basal ganglia of the forebrain. The coordinating system critically involves the hippocampus, a phylogenetically ancient part of the neural apparatus.
The evidence suggests that the coordination of phasic (arousal) and tonic (activation) attentional processes demands "effort". Thus the relation of attention to intention, i.e., to volition and will, comes into focus. Again, William James had already pointed out that a good deal of what we call voluntary effort is the maintaining of attention or the repeated returning of attention to a problem until it yields a solution.
Consciousness and Volition
William James had apposed will to emotion and motivation (which he called instinct). Here, once again, brain scientists have had a great deal to say. Beginning with Walter Cannon's experimentally based critique of James (1927), followed by Lashley's critique of Cannon (1960), to the anatomically based suggestions of Papez (1937) and their more current versions by MacLean (1949), brain scientists have been deeply concerned with the mechanisms of emotional and motivational experience and expression. Two major discoveries have accelerated our ability to cope with the issues and placed the earlier more speculative accounts into better perspective. One of the discoveries has been the role of the reticular formation of the brain stem (Magoun, 1950) and its chemical systems of brain amines (see e.g., review by Barchas, 1972; Pribram, in press b) that regulate states of alertness and mood. Lindsley (1951) proposed an activation mechanism of emotion and motivation on the basis of the initial discovery and has more recently (Lindsley and Wilson, 1976) detailed the pathways by which such activation can exert control over other brain processes. The other discovery, by Olds and Milner (1954), is the system of brain tracts which, when electrically excited, results in reinforcement (i.e., an increase in the probability of occurrence of the behavior that has produced the electrical brain stimulation) or deterrence (i.e., a decrease in the probability that such behavior will recur).
In my attempts to organize these discoveries and other data that relate brain mechanisms to emotion, I found it necessary to distinguish clearly between those data that referred to emotional experience (feelings) and those that referred to expression, and, further to distinguish emotion from motivation (Pribram, 1971b). Thus feelings were found to encompass both emotional and motivational experience, emotional as affective and motivational as appetitive (Pribram, 1970b). The appetitive processes of motivation are centered on the readiness (activation) mechanisms already alluded to in the discussion of attention. Not surprisingly the affective processes of emotion were found to be based on the machinery of arousal, the ability to make phasic responses to input which "stop" the ongoing activity of the organism. Thus feelings were found to be based on neurochemical states of alertness and mood which become organized by appetitive (motivation, "go") and affective (emotional, "stop") processes.
The wealth of new data, and the insights obtained from them, made it fruitful to reexamine the Jamesian positions with regard to consciousness and unconscious processes and their relationship to emotion, motivation, and will (Pribram, 1976b and in press c). James was found to be in error in his emphasis on the visceral determination of emotional experience and in his failure to take into consideration the role of expectation (familiarity) in the organization of emotional experience and expression. On the other hand, James had rightly emphasized that emotional processes take place primarily within the organism while motivation and will reach beyond into the organism's environment. Further, James was apparently misinterpreted as holding a peripheral theory of emotion and mind. Throughout his writings he emphasizes the effect that peripheral stimuli (including those of visceral origin) exert on brain processes. The confusion comes about because of James' insistence that emotions concern bodily processes, that they stop short at the skin. Nowhere, however, does he identify emotions with these bodily processes. Emotion is always their resultant in brain. James is in fact explicit on this point when he discusses the nature of the input to the brain from the viscera. He comes to the conclusion, borne out by subsequent research (Pribram, 1961), that the visceral representation in the brain shares the representation of other body structures.
The distinction between the brain mechanisms of motivation and will is less clearly enunciated by James. He grapples with the problem and sets the questions that must be answered. As already noted, clarity did not come until the late 1960's when several theorists (e.g., MacKay, 1966; Mittelstaedt, 1968; Waddington, 1957; Ashby, personal communication; McFarland, 1971; Pribram, 1960, 1971b) began to point out the difference between feedback, homeostatic processes on the one hand and feedforward, homeorhetic processes on the other. Feedback mechanisms depend on error processing and are therefore sensitive to perturbations in their environment. Information processing systems, whether computer or optical, may incorporate feedbacks, but their overall organization is insensitive to external perturbations. Programs, unless completely stopped, run themselves off to completion irrespective of obstacles placed in their way.
Clinical neurology had classically distinguished the mechanisms involved in voluntary behavior from those involved in involuntary behavior. The distinction rests on the observation that lesions of the cerebellar hemispheres impair intentional behavior, while basal ganglia lesions result in disturbances of involuntary movements. Damage to the cerebellar circuits produces intention tremors, inability to prevent overshoot, etc. The classical experimental neurophysiology of the cerebellum was reviewed by Ruch (in 1951) with the conclusion that cerebellar circuits are involved in a feedforward rather than a feedback mechanism (although Ruch did not have the term feedforward available to him). I have extended this conclusion (Pribram, 1971b) on the basis of more recent microelectrode analyses by Eccles, Ito and Szentagothai (1967) to suggest that the cerebellar hemispheres perform calculations in fast-time, i.e., extrapolate where a particular movement would end were it to be continued, and send the results of such a calculation to the cerebral motor cortex where they can be compared with the aim to which the movement is directed. Experimental analysis of the functions of the motor cortex had shown that this aim is composed of an "Image of Achievement" constructed in part on the basis of past experience (Pribram, Kruger, Robinson and Berman, 1955-56; Pribram, 1971b, Chapters 13, 14 and 16).
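The fast-time extrapolation attributed to the cerebellar circuits can be caricatured in a few lines. The linear dynamics, the numbers, and the comparison rule here are invented for illustration under the assumptions of the hypothesis; they are not a model of real cerebellar circuitry:

```python
# Feedforward "fast-time" calculation: predict where the movement will
# end if continued unchanged, compare the prediction with the aim, and
# issue a correction BEFORE any error has actually occurred.
# (A feedback controller, by contrast, would wait for the error.)

def extrapolate_endpoint(position, velocity, remaining_time):
    """Fast-time extrapolation of the movement's endpoint."""
    return position + velocity * remaining_time

def corrective_command(position, velocity, remaining_time, target, gain=1.0):
    """Compare the predicted endpoint with the aim (the 'Image of
    Achievement') and return a corrective command."""
    predicted = extrapolate_endpoint(position, velocity, remaining_time)
    return gain * (target - predicted)

# A movement at velocity 2.0 with 0.4 s remaining is headed for 0.8;
# aiming at 1.0, the feedforward correction nudges it by +0.2.
cmd = corrective_command(position=0.0, velocity=2.0,
                         remaining_time=0.4, target=1.0)
```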
Just as the cerebellar circuit has been shown to serve intentional behavior, the basal ganglia have been shown to be important to involuntary processes. We have already noted the involvement of these structures in the control of activation, the readiness of organisms to respond. Lesions in the basal ganglia produce tremors at rest and markedly restricted expressions of emotion. Neurological theory has long held (see e.g., Bucy, 1944) that these disturbances are due to interference by the lesion with the normal feedback relationships between basal ganglia and cerebral cortex. In fact, surgical removals of motor cortex have been performed on patients with basal ganglia lesions in order to redress the imbalance produced by the initial lesions. Such resections have proved to be remarkably successful in alleviating the often distressing continuing disturbances of involuntary movement that characterize these basal ganglia diseases.
Self-Consciousness and Intentionality
A final observation is in order regarding William James' analysis of this set of related problems. James clearly distinguishes consciousness from self-consciousness and suggests that self-consciousness occurs when attention is paid (i.e., willed effort is made) to internal brain processes. Today we would perhaps call this meta-consciousness. James sees no special problem here, but his contemporary Brentano, Freud's teacher, identifies the issue of self-consciousness or intentionality as central to what makes man human.
Brentano derives his analysis from the scholastics and uses intentional inexistence (usually referred to as "intentionality") as the key concept to distinguish observed from observer, the subjective from the objective. I have elsewhere (Pribram 1976b) somewhat simplified the argument by tracing the steps from the distinction between intentions and their realization in action to perceptions and their realization as the objective world. Brentano is credited along with James as the source of current American realism, of which my own version, "constructional realism" (Pribram 1971a), can be considered a part.
How then is Brentano's dualism, the distinction between subject and object, related to that of Descartes? Cogito and intentionality are of course the same. Brain must always be a part of the objective world even if it is the organ critically responsible for the subjective - from which in turn the objective is constructed. Brentano is perfectly clear on this point, and suggests that only the study of intentional consciousness, i.e., self-consciousness, is the province of the philosopher-psychologist. Unconscious processes fall to the physiologist, especially the brain physiologist, to unravel. Of historical interest is the fact that a pupil of Brentano's, Sigmund Freud, later to become an outstanding neurologist, also became the champion of the importance of unconscious processes in determining everyday and pathological behavior (Pribram & Gill 1976).
However, Brentano places one reservation on his caveat: the philosopher-psychologist may have something to contribute to the analysis of unintentional, thus unconscious, processes should it turn out that Leibnitz is correct that these may be rooted in monadic structures (Leibnitz, 1898). Leibnitz undoubtedly derived the monadology from his mathematical invention, the integral calculus, much as Gabor (1969) derived holography from current developments in this branch of mathematics. I have thus taken Brentano's exception seriously by proposing a physiological mechanism for both conscious and unconscious processes and suggesting that the premise of this physiological mechanism, the neural hologram (a monadic organization), has consequences for psychology. This consequence is, as I discern it, the clear separation of intentional consciousness (i.e., self-consciousness) from unintentional consciousness, i.e., ordinary perception. The case histories presented at the outset of this paper make this point more strongly than any philosophical argument: minding is of two sorts, instrumental and intentional.
Contemporary observations sparked by Roger Sperry (1974) on patients in whom the corpus callosum has been severed, thus partially isolating the cerebral hemispheres from one another, are contributing further to our ability to ask precise questions regarding this issue. The hemispheres have been shown to predominate in different types of processing. In right-handed persons the left hemisphere processes information much as does the digital computer, while the right hemisphere functions more according to the principles of optical, holographic information processing systems. Do patients, then, with callosotomies have two minds in one head (as Sperry, 1969, would have it), or is there only one mind, that of the hemisphere endowed with the ability to process information linguistically (as Eccles, 1965, has suggested)? My view of the matter is that both Sperry and Eccles are partially correct, but that neither has provided a comprehensive answer. Sperry is right in his observations that both hemispheres display consciousness, i.e., the ability to attend, to "mind". Furthermore, whenever it is shown that minding by one hemisphere is isolated and different from that of the other, two minds may be considered to be present in one head. Eccles, on the other hand, is, I believe, primarily concerned with metaconsciousness, the ability of a brain mechanism to focus attention on its own processing. This capability may well come pari passu with the capability for linguistic construction and is therefore the province of the linguistic hemisphere. Intentionality and linguistics appear to go hand in hand.
Mind, Brain and Consciousness
Let us return now to the initial impetus to all of this inquiry, the relationship between conscious and unconscious processes. One would think that all of the clinical observations and experimental data that have accrued over the past two centuries would have resolved the issue in favor of some simple monism. But as noted in the introduction, paradoxically the investigators closest to the observations have seen no such easy way out of the problem. The case histories presented at the outset of this paper clearly make the point: verbal reports of introspection and instrumental behavior often become dissociated when patients with brain lesions are examined. Thus subjective report of conscious experience and objective automatic instrumental behavior of which the patient becomes only indirectly aware continue to be two separable dimensions of experience each of which must be taken into account. Behaviorism with all of its technical advantages has not resolved the issue.
Nonetheless additional clarity can be attained (see Pribram, 1965). While dualism cannot be ignored, it can be transcended by a systems analysis of the issues involved. Operational definition of the origins of automatic instrumental behavior and verbal reports of conscious processes suggests the following. Instrumental techniques are usually employed in the experimental analysis of behavior, i.e., in relating behavior to its organismic and environmental components. The investigator thus stands at the apex of the hierarchy of the subsystems he is investigating. In common with most of natural science (physical and biological), this use of behavior provides the "view of reality from the top" with which scientists are most comfortable. The approach is reductive, problems become solved or resolved, although room is ordinarily made for emergent properties when the reduction fails to completely explain all of the observations obtained at the more complex level of the hierarchy. For example, the wetness of water and its propensity for floating when frozen are properties that even today would be difficult to predict from the separate atomic properties of hydrogen and oxygen.
By contrast to these major reductive thrusts of the physical and biological sciences, the social sciences and humanities tend for the most part to look upward in an attempt to penetrate more complex organizations. This is not to deny that physical scientists also occasionally work in this mode. For instance, the general and special theories of relativity are prime examples of viewing the physical universe from a vantage lower than that to be comprehended. And, of course, physical science is then plagued with all the problems continuously faced by social science: relativity, defining frames of reference, contextual influences including those introduced by the observer come to play important roles. Biological scientists are just beginning to grapple on a large scale with the problems posed by this type of science, although evolutionary theory has for some time provided excellent tools for doing so.
In an earlier paper, using the chemical analogy of optical isomers, I called these "mirror image" views of knowledge, one descriptive, the other normative (Pribram, 1965). Perhaps even more apt, but also more risky in these days of women's liberation, would be the suggestion that reductive-descriptive science is earthily female in its attempts to resolve and is thus particularly attractive in a male dominant culture. By contrast, normative penetrations of frames are a male-like process that addresses its appeal to the feminine.
These observations are pertinent to the mind-brain issue because, in contrast to instrumental behavior, verbal behavior reflects our socially and culturally determined universe of discourse. Verbal behavior is primarily employed in communication (necessitating two or more brains). It provides, therefore, a "view of reality from below" - from beneath the more complex social, cultural universe that it describes. As such, verbal descriptions of subjective experience are heir to all the vicissitudes that sort of approach entails: relativity, dependence on frames of reference and context, and observer interference.
But advantages also accrue when it is realized to what extent the structure of subjective experience is determined by social-cultural, i.e., human enterprise as encoded in language. Aesthetic and ethical processes no longer need be eschewed by scientists as the exclusive domain of humanists. The aesthetic and ethical dimensions of experience can in fact be related to other aspects of what constitutes persons and the brains that organize them (Pribram, 1968, 1969, 1976).
Two classes of problems immediately arise when this systems view of the mind-brain issue is pursued. One is relatively easily disposed of, the other is not. The "easy" class of problems concerns the manner by which socio-cultural events interact with the organization of the brain. The answer is twofold: First, we have already become acquainted with the mechanism whereby brain representations become realized in the environment through action. Note that, just as in the case of image construction (where environment becomes represented in brain and consciousness), only classical neurophysiological processes are involved in such realization. Second, the brain contains elements that show a remarkable amount of plasticity. This plasticity allows a great deal of learning through experience to take place. Thus, the organization of memory as a brain representation of the environment (which can therefore construct images of the environment - i.e., produce "conscious awareness") continues to develop throughout life. The particulars of determining which brain elements display plasticity, of determining differences between brain systems involved in different aspects of learning, and of determining the elementary organization of memory are currently active areas of investigation in the brain and behavioral sciences. These problems are far from being solved, but the domain of questions can be clearly specified and techniques are available to pursue solutions.
In short, the interaction between brain and mind occurs by way of organizing influences. Brain structure is influenced by cultural events which in turn become structured by brains. The interaction can thus be measured by the amount of information that characterizes the interaction. There is no special mystery here. Information processes are akin to those described by the second law of thermodynamics, which deals with the organization of energy rather than its transfer and conservation. During information processing very small increments in the amount of energy transferred can result in major changes in structuring, although the support systems that make information processing possible may expend a good deal of energy. Again, control theory, as for instance embodied in computer systems, provides sophisticated models that help us understand problems that classically had no recourse to scientific analysis.
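The disproportion between informational and energetic quantities can be made concrete. The following sketch (an editorial illustration, not part of the original argument; it uses Landauer's bound, a later formalization of the thermodynamics of information) compares the minimum energy needed to change one bit of structure with the overall metabolic budget of the "support system":

```python
import math

# Landauer's bound: the minimum energy dissipated when one bit of
# structure is irreversibly changed at absolute temperature T.
K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, joules per kelvin

def landauer_bound(temperature_k: float) -> float:
    """Minimum energy (joules) to alter one bit at the given temperature."""
    return K_BOLTZMANN * temperature_k * math.log(2)

# At body temperature (~310 K) the per-bit cost is vanishingly small...
e_bit = landauer_bound(310.0)

# ...while the supporting "hardware" (a brain) dissipates on the order
# of 20 W (a rough metabolic figure, assumed here for illustration).
BRAIN_POWER_W = 20.0
max_bits_per_second = BRAIN_POWER_W / e_bit

print(f"energy per bit at 310 K: {e_bit:.3e} J")
print(f"bits/s if all 20 W went to structuring: {max_bits_per_second:.3e}")
```

The twenty-odd orders of magnitude between the two figures illustrate the point of the passage: the energy increment involved in structuring itself is negligible next to what the support systems expend.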
The other and at this time more difficult class of problems revolves around the issue of what it is that makes man peculiarly human. In linguistics the problem is often stated in terms of the analysis of the deep structure of language. But the issue extends equally to other cultural achievements and the subjective experiences (and thus the brain mechanisms) that make them possible. The case histories developed at the beginning of this paper suggest that this class of problems has a solution. However, experimental analysis is for the most part precluded because the observations must perforce be made on man. Nonetheless, continued ingenious use of clinical cases as they occur should, now that the issue is joined, slowly provide the substance for some definitive answers.
Conclusion - What About Unconscious Processes?
When I began my investigations into this fascinating realm of the relationship between brain and mind, I was convinced that brain research would ultimately do away with mentalism, just as biochemistry had doomed vitalism. A most rewarding aspect of working on the frontiers of knowledge is that almost daily some surprise is in store for the investigator. These increments of surprise have, in this instance, completely reversed my initial expectation. Brain research and clinical observation have made it mandatory to retain a concept of mind, to carefully analyze its origins and organization and its relationship to those systems that function to organize it. My initial focus had been on subsystems such as the endocrines and brain (which is in turn composed of anatomical subsystems, such as the visual and auditory, and neurochemical ones, e.g., the mood-determining catechol and indole amine pathways). But more recently this research has had to cope with the effect on brain organization of experience, and this has led to an upward look at the hierarchy of systems, toward the structuring of mind, via brain, by sociocultural organizations such as play (Reynolds and Pribram, submitted for publication), games (Pribram, 1959), and language (Pribram, 1973; 1976c, in press a).
With regard to the problem of differentiating conscious from unconscious processes, the following has become clear as a result of the brain research delineated in this essay. Two types of brain mechanisms can be distinguished, each leading to a form of conscious experience and behavior (Pribram, in press c). One type is automatic and leads to imaging and instrumental acts. The other is effortful and leads to intentional awareness and voluntary action. Automatic processing might be labelled preconscious or simple consciousness because the organisms displaying such processes are not unconscious in general and can be made aware of their automaticities (e.g., by biofeedback techniques which engage the effort mechanism). By contrast, the effortful or intentional processing might be labelled self-conscious because its characteristic is the distinction made by Brentano of the ability to differentiate the processor from what is being processed.
Given these two forms of consciousness, what then becomes of unconscious processes and what is their relationship to mind? It would be patently wrong to exclude such processes as long-term memory, arousal or activation from consideration as involved in mental processes. Still, these terms describe primarily brain mechanisms, although psychological processes are also referred to. Perhaps the clearest resolution of the issue is the one used in computer science: there the distinction is made between hardware and software. Hardware refers to the machinery of the computer, software to the programs that control the operations of the hardware. But of course, hardware and software are relative and interchangeable. One can hardwire an often-used program to facilitate its use; one can program a piece of special purpose hardware in order to apply a general purpose machine to the special purpose. Despite this interchangeability, computer scientists find it useful to distinguish between their machines and their programs - and we find it equally useful to distinguish between brain and mind.
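The interchangeability invoked here can be shown in a few lines of code (an editorial sketch; the names are invented for illustration): the same logical operation can exist as fixed "hardware" (logic wired into a function) or as "software" (a data table executed by a general-purpose interpreter), and either realization can stand in for the other.

```python
# "Hardwired" version: the logic is fixed in the function itself.
def xor_hardwired(a: int, b: int) -> int:
    return a ^ b

# "Software" version: the same logic expressed as data (a program),
# executed by a general-purpose interpreter.
XOR_PROGRAM = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def run_program(program: dict, a: int, b: int) -> int:
    """A general-purpose 'machine' that executes any truth-table program."""
    return program[(a, b)]

# The two realizations are behaviorally indistinguishable, which is why
# the hardware/software line is relative rather than absolute.
for a in (0, 1):
    for b in (0, 1):
        assert xor_hardwired(a, b) == run_program(XOR_PROGRAM, a, b)
```

As in the text, the usefulness of the distinction survives the interchangeability: one still speaks of the interpreter as machine and the table as program.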
In the frame of this distinction, then, unconscious mental operations are those whose structure leads neither to automatic preconscious nor to intentional self-conscious experience or behavior. This is a definition by exclusion and the question arises as to whether any "unconscious" remnants can be identified.
I will not attempt to answer this question - but hope that it will at least be addressed in the remainder of this volume. The issue appears to me to be this: are automaticities such as those demonstrated in psychomotor epilepsy, during hypnosis, in hypnagogic dreaming and other "altered" states of consciousness to be used as evidence for some further, deeper "unconscious" organization of brain and mind? There is little question but that these alternate states of consciousness have something to tell us about antecedent properties of brain organization. The question I am asking is whether these alternate states also indicate the existence of some more universal "software", dependent on the interactions among many similarly constituted brains - a collective unconscious along the lines proposed by Jung (1960) - much as culture can be conceived as the collective conscious software produced by man. And, if there is a collective unconscious, what is its structure?
References
Barchas, J. E., Ciaranello, R. D., Stolk, J. M. and Hamburg, D. A. Biogenic amines and behavior. In S. Levine (Ed.), Hormones and Behavior. New York: Academic Press, 1972, pp. 235-329.
Von Békésy, G. Sensory Inhibition. Princeton, New Jersey: Princeton University Press, 1967.
Brillouin, L. Science and Information Theory, 2nd ed. New York: Academic Press, Inc., 1962, 347 pp.
Bucy, P. C. The Precentral Motor Cortex. Chicago, Illinois: University of Illinois Press, 1944.
Cannon, W. B. The James-Lange theory of emotions: a critical examination and an alternative theory. Amer. J. Psychol., XXXIX: 106-124, 1927.
Eccles, J. The brain and the unity of conscious experience. The 19th Arthur Stanley Eddington Memorial Lecture. Cambridge: Cambridge University Press, 1965.
Eccles, J., Ito, M. and Szentagothai, J. The Cerebellum as a Neuronal Machine. New York: Springer-Verlag, 1967.
Freud, S. Project for a scientific psychology. In The Complete Psychological Works of Sigmund Freud, Vol. 1, p. 19.
Gabor, Dennis. Information processing with coherent light. Optica Acta, 1969, 16: 519-533.
James, W. The Principles of Psychology. London: MacMillan and Co., Ltd. (Vols. I and II), 1901.
Jung, C. G. Collected Works (Second edition). Princeton, New Jersey: Princeton University Press, 1960.
Lashley, K. S. The thalamus and emotion. In F. A. Beach, D. O. Hebb, C. T. Morgan and H. W. Nissen (Eds.). The Neuropsychology of Lashley. New York: McGraw-Hill, 1960, pp. 345-360.
Leibnitz, G. W. The Monadology and Other Philosophical Writings. Translated with an Introduction and Notes by Robert Latta. Oxford: Clarendon Press, 1898.
Lindsley, D. B. Emotion. In S. S. Stevens (Ed.). Handbook of Experimental Psychology. New York: John Wiley and Sons, 1951 (Chapter 14), pp. 473-516.
Lindsley, D. B. and Wilson, C. L. Brainstem-hypothalamic systems influencing hippocampal activity and behavior. In R. L. Isaacson and K. H. Pribram (Eds.). The Hippocampus (Part IV). New York: Plenum Publishing Co., in press.
Mackay, D. M. Cerebral organization and the conscious control of action. In J. C. Eccles (Ed.). Brain and Conscious Experience. New York: Springer-Verlag, 1966, pp. 422-445.
MacLean, P. D. Psychosomatic disease and the "visceral brain": recent developments bearing on the Papez theory of emotion. Psychosom. Med., 11: 338-353, 1949.
Magoun, H. W. Caudal and cephalic influences of the brain stem reticular formation. Physiol. Rev., 30: 459-474, 1950.
McFarland, D. J. Feedback Mechanisms in Animal Behavior. London: Academic Press, 1971.
Miller, G. A., Galanter, E. & Pribram, K. H. Plans and the Structure of Behavior. New York: Henry Holt and Co., 1960.
Mittelstaedt, H. Discussion. In D. P. Kimble (Ed.). Experience and Capacity. New York: The New York Academy of Sciences, Interdisciplinary Communications Program, 1968. pp. 46-49.
Olds, J. & Milner, P. Positive reinforcement produced by electrical stimulation of septal area and other regions of rat brain. J. Comp. Physiol. Psychol., 47: 419-427, 1954.
Papez, J. W. A proposed mechanism of emotion. Arch. Neurol. Psychiat., Chicago, 38: 725-743, 1937.
Pribram, K. H. On the neurology of thinking. Behav. Sci., 4: 265-287, 1959.
Pribram, K. H. A review of theory in physiological psychology. In: Annual Review of Psychology, Vol. 11. Palo Alto, California: Annual Reviews, Inc., 1960, pp. 1-40.
Pribram, K. H. Limbic system. In: D. E. Sheer (Ed.). Electrical Stimulation of the Brain. Austin, Texas: University of Texas Press, 1961, pp. 311-320.
Pribram, K. H. Interrelations of psychology and the neurological disciplines. In: S. Koch (Ed.). Psychology: A Study of a Science. Volume 4: Biologically oriented fields: Their place in psychology and in biological sciences. New York: McGraw-Hill, 1962, pp. 119-157.
Pribram, K. H. Proposal for a structural pragmatism: some neuro-psychological considerations of problems in philosophy. In: B. Wolman and E. Nagel (Eds.). Scientific Psychology: Principles and Approaches. New York: Basic Books, 1965, pp. 426-459.
Pribram, K. H. Some dimensions of remembering: steps toward a neuropsychological model of memory. In J. Gaito (Ed.). Macromolecules and Behavior. New York: Academic Press, 1966, pp. 165-187.
Pribram, K. H. Toward a neuropsychological theory of person. In: E. Norbeck, D. Price-Williams and W. M. McCord (Eds.). The Study of Personality: An Interdisciplinary Approach. New York: Holt, Rinehart and Winston, 1968, pp. 150-160.
Pribram, K. H. Neural servosystems and the structure of personality. J. Nerv. Ment. Dis. (Kubie Issue), 140: 30-39, 1969.
Pribram, K. H. The biology of mind: neurobehavioral foundations. In: A. R. Gilgen (Ed.). Scientific Psychology: Some Perspectives. New York: Academic Press, 1970a, pp. 45-70.
Pribram, K. H. Feelings as monitors. In: M. B. Arnold (Ed.) Feelings and Emotions. New York: Academic Press, Inc., 1970b, pp. 41-53.
Pribram, K. H. The realization of mind. Synthese, 22: 313-322, 1971a.
Pribram, K. H. Languages of the Brain: Experimental Paradoxes and Principles in Neuropsychology. Englewood Cliffs, New Jersey: Prentice-Hall, 1971b.
Pribram, K. H. The comparative psychology of communication: The issue of grammar and meaning. In: E. Tobach, H. E. Adler, and L. L. Adler (Eds.). Comparative Psychology at Issue. Annals of the New York Academy of Sciences, Vol. 223, 1973, pp. 135-143.
Pribram, K. H. Problems concerning the structure of consciousness. In: G. Globus, G. Maxwell and I. Savodnik (Eds.). Consciousness and Brain: A Scientific and Philosophical Inquiry. New York: Plenum Press, 1976a.
Pribram, K. H. Self-consciousness and intentionality. In: G. E. Schwartz and D. Shapiro (Eds.). Consciousness and Self-Regulation: Advances in Research. New York: Plenum Publishing Corp., 1976b.
Pribram, K. H. Neurolinguistics: the study of brain organization in grammar and meaning. Totus Homo, 1976c.
Pribram, K. H. Persistent problems in neurolinguistics: a review. Proceedings of the New York Academy of Sciences Conference on Origins and Evolution of Language and Speech. New York, September 22-25, 1975, in press a.
Pribram, K. H. Peptides and protocritic processes. In: The Neuropeptides. Sandman (Ed.), in press b.
Pribram, K. H. Modes of central processing in human learning and remembering. In: Brain and Learning. T. J. Teyler (Ed.), in press c.
Pribram, K. H. & Gill, M. Freud's "Project" Reassessed. Hutchinson, London and Basic Books, New York, 1976.
Pribram, K. H., Kruger, L., Robinson, F. and Berman, A. J. The effects of precentral lesions on the behavior of monkeys. Yale J. Biol. & Med., 28: 428-443, 1955-56.
Pribram, K. H. & McGuinness, D. Arousal, activation and effort in the control of attention. Psychological Review, 82 (2): 116-149, 1975.
Pribram, K. H., Nuwer, M. & Baron, R. The holographic hypothesis of memory structure in brain function and perception. In: R. C. Atkinson, D. H. Krantz, R. D. Luce and P. Suppes (Eds.). Contemporary Developments in Mathematical Psychology. San Francisco: W. H. Freeman and Co., 1974, pp. 416-467.
Reynolds, P. C. & Pribram, K. H. The structure of social encounters in young rhesus monkeys. Behavior (submitted for publication).
Ruch, T. C. Motor systems. In: S. S. Stevens (Ed.). Handbook of Experimental Psychology. New York: John Wiley and Sons, 1951, pp. 154-208.
Ryle, G. The Concept of Mind. New York: Barnes and Noble, 1949.
Scoville, W. B. & Milner, B. Loss of recent memory after bilateral hippocampal lesions. J. Neurol. Neurosurg. Psychiat., 20: 11-21, 1957.
Sidman, M., Stoddard, L. T. & Mohr, J. P. Some additional quantitative observations of immediate memory in a patient with bilateral hippocampal lesions. Neuropsychologia, 6: 245-254, 1968.
Sperry, R. W. A modified concept of consciousness. Psych. Rev., 76: 532-536, 1969.
Sperry, R. W. Lateral specialization in the surgically separated hemispheres. In: F. O. Schmitt and F. G. Worden (Eds.). The Neurosciences: Third Study Program. Cambridge, Mass.: The MIT Press, 1974, pp. 5-19.
Waddington, C. H. The Strategy of the Genes. London: George Allen and Unwin, Ltd., 1957.
Weiskrantz, L., Warrington, E. K., Sanders, M. D. & Marshall, J. Visual capacity in the hemianopic field following a restricted occipital ablation. Brain, 97 (4): 709-728, 1974.