it should be understood that 'consciousness' means not a stuff nor an entity by itself, but is short for conscious animal or agent, for something which is conscious.
(Dewey, 1906)
"Consciousness is neither a definite nor a usable concept. Belief in the existence of consciousness goes back to the ancient days of superstition and magic."
(Watson, 1924)
The concept of consciousness is a hybrid or better, a mongrel concept: the word 'consciousness' connotes a
number of different concepts and denotes a number of different phenomena
(Block, 1995)
consciousness, objective consciousness, sensorimotor consciousness,
Abstract: This paper argues that the many and various conceptions of consciousness
propounded by cognitive scientists can all be understood as constituted with reference to
four fundamental sorts of criterion: epistemic (concerned with kinds of consciousness),
semantic (dealing with orders of consciousness), physiological (reflecting states of
consciousness) and pragmatic (seeking to capture types of consciousness). The resulting
four-fold taxonomy, intended to be exhaustive, implies that all of the distinct varieties of
consciousness currently encountered in cognitive neuroscience, the philosophy of mind,
clinical psychology and other related fields ultimately refer to a single natural
phenomenon, analyzed under four general aspects. The proposed taxonomy will, it is
hoped, possess sufficient clarity to serve as a sound theoretical framework for further
scientific studies, and to count as a significant step in the direction of a properly
formulated unified concept of consciousness.
Introduction
Consciousness, once considered to be exclusively dependent for its explanatory grounding
on being firmly located inside the head, was, by the end of the last century, almost locked
therein. While the principle of 'the brain's causal priority' (Bickle, 2008, p. 272) continues to be upheld in explanatory models, anybody insisting today that only the brain is relevant to an understanding of consciousness is liable to be diagnosed with what might be called (to paraphrase a well-known state of impairment) 'methodological locked-in syndrome'. Consciousness having thus been radically 'embrained', the emphasis now falls, though, on thinking of it as embodied and situated, and this relative to a wider environment that includes the social realm (see Thompson & Varela, 2001; Chemero, 2009; Prinz, forthcoming). Extending the analogy just employed to characterize the previously prevalent paradigm, we might gloss radical versions of the new view as embodying a commitment to 'methodological out-of-body experience'.
The most striking consequence of endorsing this externalistic turn in
consciousness studies is the strong conviction that almost everything we know now has
potential relevance to the attempt to explain consciousness. Cognitive science, which has
always exhibited a markedly interdisciplinary character, has, as a result, evolved into a
multileveled and trans-disciplinary affair, involving modes of explanation that function at
almost all known levels of scientific description of the world (from the microphysical to the
social), and in terms that frequently involve close cooperation across a wide range of
scientific disciplines.
It seems reasonable to think that in uncovering such a rich network of connections
between areas, processes, and seemingly remote domains, we are making some sort of
definite progress in the science of consciousness. Unfortunately, however, an ultra-synoptic
vision of this sort may also prove problematic (see Sellars, 1962), since the concept of
consciousness informing the scientific investigations in question then appears to take on an
increasingly ambiguous character. Within contemporary debates, as well as in research
Vimal lists and describes 40 distinct meanings attributed to 'consciousness', classified according to a dual-aspect framework as functions or experiences, while Brook's list encompasses 50 different usages of the term.
Basic meanings
In everyday language, the words 'conscious' and 'consciousness' are usually used in one or other of two senses: to indicate a person's waking state, or to assert that they are aware of something. As regards the state of wakefulness, the question one typically asks is 'Is he or she conscious?'. Unconsciousness here implies that the subject has fainted or is in a coma. Regarding a relationship to an object of awareness, on the other hand, the sort of question one typically asks is 'What is he or she conscious or aware of?'. Lack of consciousness here implies that the subject is unaware of some thing. To be precise, used in reference to the state of wakefulness the term 'conscious' is a one-place predicate (i.e. true for subjects who are awake), whilst used in reference to a subject's awareness it is a two-place predicate (i.e. true when the relation between a given subject and some particular information obtains, that is, when the subject is aware of that information). These meanings
may seem trivial but, as will be shown in due course, they form a necessary basis for
subsequent distinctions. They will hereon be referred to as follows:
(M1) 'X is conscious', 'X has consciousness' (state of wakefulness)
(M2) 'X is conscious of Y', 'X has consciousness of Y' (relation of awareness)
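The arity point can be put schematically. The sketch below is my own illustration (the data layout and names are hypothetical, not the paper's): it treats M1 as a one-place predicate and M2 as a two-place one.

```python
# M1: 'conscious' as a one-place predicate (true of awake subjects).
# M2: 'conscious of' as a two-place predicate (a subject-information relation).

def conscious_m1(subject):
    """M1: X is conscious (state of wakefulness)."""
    return subject.get("awake", False)

def conscious_m2(subject, information):
    """M2: X is conscious of Y (relation of awareness)."""
    return information in subject.get("aware_of", set())

anna = {"awake": True, "aware_of": {"the door is open"}}

assert conscious_m1(anna)                          # M1 holds: Anna is awake
assert conscious_m2(anna, "the door is open")      # M2 holds for this Y
assert not conscious_m2(anna, "the stove is on")   # unaware of this Y
```

The point of the sketch is purely formal: satisfying M1 requires only a subject, whereas satisfying M2 requires both a subject and a content.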
In popular dictionaries, a few additional meanings are also attached to these terms.
For instance, when consciousness is understood as collective or individual mentality (in sentences like 'The year 1989 penetrated deeply into the Polish consciousness'), or when 'conscious' refers to intentional action ('it must have been a conscious act of violence').
However, those meanings are not very important in the current debate.2
In 1906, the philosopher Dewey was already in a position to distinguish as many as
six different senses, two of these corresponding, respectively, to our M1 and M2.
Interestingly, he considered wakefulness a newly coined meaning, parasitic on the then
emerging discipline of psychology, while at the same time judging the distinctively
philosophical use of the term to be a peculiar combination of several other meanings
(Dewey, 1906, pp. 40-41). The philosophical concept of consciousness surely remains such
a peculiar combination, but in fact it was the very distinction between psychological and
philosophical meanings that turned out later to be crucial, leading as it did (at the end of the
last century) to a fundamental split into two kinds of consciousness (described in the next
section).
Kinds of consciousness
Few if any would dispute the claim that any conscious subject must experience
something (distinct sensations, feelings, perceptions, etc.), and in this sense consciousness
may be said to be intertwined with experience. It is also claimed that conscious experience
is essentially private or subjective: i.e. directly accessible only to its subject, meaning that
nobody else knows, in the way that the subject does, what it is like to have it (see Nagel,
1974; Searle, 2000). At the same time, there are many objective characteristics of the
phenomenon of consciousness observed by science. Externally observed manifestations of
consciousness are usually correlated with sufficiently complex, non-random, goal-oriented
behavior, or with modes of action involving adaptation to changing conditions, non-standard problem solving, decision making, and so on. Today, it is also possible to detect consciousness objectively by monitoring the very low-level internal activity of the subject: i.e. metabolic and electrical activity in specific brain regions. This sophisticated form of micro-behavioral observation can furnish strong scientific evidence for the presence of consciousness in an observed subject.
2
See: Webster's Encyclopedic Unabridged Dictionary, Gramercy Books, New York, 1989, p. 311 (four meanings for 'consciousness', nine for 'conscious'), or the online dictionary at http://dictionary.reference.com/browse/consciousness (seven meanings) and at http://dictionary.reference.com/browse/conscious (nine meanings) [Online, 20.11.2010]. See also note 1.
In the sense described above, consciousness is cognizable from the inside or from the outside: from the subject's (first-person) or the observer's (third-person) perspective. This epistemic fact may serve as an initial criterion that will enable us to distinguish two kinds of consciousness, i.e. subjective and objective. The criterion itself, and the distinction that rides on it, will appear in our taxonomy as follows:
C.1 Epistemic criterion: kinds of consciousness3
1.1 Subjective (SKCs, cognized from the subject's perspective, experienced)
1.2 Objective (OKCs, cognized from the observer's perspective, observed)
Explaining the difference between these two kinds of consciousness has proved to
be one of the greatest challenges facing contemporary philosophy and science. Wittgenstein
(1953, §412, p. 131) had already noticed it over sixty years ago: 'The feeling of an unbridgeable gulf between consciousness and brain-process... This idea of a difference in kind is accompanied by slight giddiness...'4 That 'unbridgeable gulf' between subjective consciousness and objective brain processes has later come to be referred to as the 'explanatory gap' (see Levine 1983, 2001), and has played a major role in subsequent discussions.
3
The criterion is called 'epistemic', rather than 'epistemological', in order to emphasize that what we are concerned with here is just first-order cognition, in that the meta-cognitive reflection typical of epistemology is not regarded as relevant. For more on this, see Wolenski, 2005, pp. 83-4. The label 'epistemic' is also broad enough to encompass most similar sorts of division: e.g. Block's P-consciousness vs. A-consciousness, Chalmers' phenomenal vs. psychological, and Vimal's experiential vs. functional.
An epistemic criterion also underlies the famous distinction made by Block (1995),
between P-consciousness and A-consciousness (phenomenal and access, respectively).
Inasmuch as both kinds of consciousness are accessed, in a sense (one only from the inside, i.e. subjectively, the other also from the outside, i.e. objectively), the use of the terms 'phenomenal' and 'access' here can be questioned. A detailed discussion of whether or not
the distinction itself is theoretically sound (see Kriegel, 2006) or empirically valid (see
Mangan, 1997) lies beyond the scope of the present article. It is sufficient for our purposes,
however, to note that the basic idea underlying it would appear to be consistent with the
subjective-objective distinction.
Chalmers, who has strongly endorsed Block's idea, makes use of the notions of 'phenomenal' and 'psychological' consciousness in much the same sort of context (Chalmers, 1996, p. 23). Nevertheless, for him the idea has stronger consequences, inasmuch as he claims that these two notions effect a division of the entire mind-body problem into two distinct domains: one fairly easy, the other really hard, or even, perhaps, intractable for the sciences. 'The really hard problem of consciousness is the problem of experience. When we think and perceive, there is a whirl of information-processing, but there is also a subjective aspect. As Nagel has put it, there is something it is like to be a conscious organism. This subjective aspect is experience.' (1995, pp. 200-201). Although not particularly innovative, the hard problem has become extremely influential, surfacing as it did at just the right time (when raging reductionism was becoming increasingly unpopular in philosophy) and in just the right form (effecting as it did a synthesis of the many anti-scientistic arguments inherited from past discussions).
The notion of epistemic subjectivity, central to such topics as the 'hard problem', the 'explanatory gap' and 'phenomenal consciousness', has been around in the philosophical debate for much longer. Not only was it embroiled in the complicated problem of qualia (genetically not very far from atomistic associationism and the sense-data debate; see Crane, 2000), but it also formed a basis for such notions as secondary qualities, acquaintance, appearance and raw feels. Over the years, subjectivity has become the weapon of choice in the critique of the dominant paradigms in scientific (objective) explanations of mind, and one not easily disarmed, in that, understood as something almost primitive and indefinable, its meaning tends to become rather vague and elusive. Today, for example, subjectivity is sometimes identified with awareness of a point of view or of the self, or with a feeling of what it is like to be someone (see Kriegel, 2006, p. 3; Levine, 2001, pp. 6-7). Other authors extend it to the point of insisting that all consciousness is essentially subjective (see Searle, 1992, pp. 20-21; 2000, pp. 561-70), or even further, claiming the existence of unconscious subjectivity (see Neisser, 2006, pp. 1-6). Not surprisingly, eliminativism has cut little ice with such an elusive entity: it seems easier to disarm the implications of subjectivity by clarifying the meaning and reference of the concept (in relation to qualia and phenomenality; see Bayne, 2009) than by seeking to eliminate it. One positive result of the debate over subjectivity is that the latter has acquired significance as something to be explained within consciousness studies. Yet this probably does not correspond to a substantial shift in the foundations of science: we still expect it to be objective, even when targeting subjectivity itself.
Finally, it is worth noticing that when distinguishing two kinds of consciousness on
the basis of the epistemic criterion, both of the basic meanings (M1 and M2) are in play.
Being in a state of consciousness (M1), as well as being conscious of something (M2),
will invoke certain objective manifestations (at a micro- or a macro-behavioral level) and
may, at the same time, result in certain subjectively experienced qualities. Considered in
isolation, and especially when understood along Dennettian lines (Dennett, 1988) as 'intrinsic, ineffable, immediately accessible, private' feelings, qualia (essential for P-consciousness) may seem more like states (M1). For example, certain altered states of
consciousness (e.g. hypnosis, trance, meditation) quite often lack any distinct referential
content actually known to the subject, and so tend to be characterized in an essentially
qualitative way, in terms of what it is like to undergo them.5 Nonetheless, one might also
regard them as relational states holding between a subject and a phenomenal property, or
even between proper (internal) parts of a subject of consciousness.6 In A-consciousness, on
the other hand, meaning M2 seems to be more evident, as one must have access to
something if one is to be conscious of it.
Orders of consciousness
To build a materially adequate definition of truth, avoiding paradoxes, Tarski
(1933) made an important semantic distinction between, as he called it, object-language
(the language under discussion) and meta-language (the higher-order language used to talk
about an object-language). His famous sentence, "'snow is white' is true if and only if snow is white", is in the meta-language, as it concerns the first-order truth conditions for the object-language sentence about snow. Consequently, this commentary (about Tarski's meta-language sentence) must itself be in a meta-meta-language, or third-order language, which in turn brings this actual sentence to the level of fourth-order status, and so on.7 In practice, though, it is rare that we have cause to go beyond third-order constructions.
5
Altered and other states of consciousness will be discussed separately in this paper.
When one construes consciousness in the sense of M2, similar gradations of
semantic order are clearly in evidence. A subject X may be just conscious of Y, may be
aware of being conscious of Y, or may even be conscious of the fact that she was aware
of being conscious of Y, etc. Obviously, an increase of semantic order will reflect, directly
or indirectly, a variety of physiological, developmental and social factors: for example, the
possession of certain neurological structures, the instantiation of appropriate developmental
conditions, involvement in essential environmental or social interactions, and so on. This is
why we find that the various hierarchical models of consciousness presented in the
literature tend to feature a mix of all these elements, with the proportions reflecting the
chosen area of specialization of the individual researcher. (For a brief overview of theories
proposing multiple levels of consciousness, see Morin, 2006.)8
It is worth noticing, here, that the semantic hierarchy of conscious information is
linear, whereas biological systems operate nonlinearly, with the information used by
organisms being processed on many levels at the same time (parallel), and in different
structures of their nervous system (distributed). Naturally, the whole process is dynamic:
i.e. subordinated to the actual state of both the organism (aims, emotions, physiology) and
the environment (quantity and quality of available information, available reaction time,
7
Generalizing the idea, for any n-order sentence it will be possible to create a higher-order sentence (n+1) in
a richer meta-language where first-order sentences function as the bottom-level limiting cases that directly
refer to the world.
8
The expression 'levels of consciousness' is quite popular, but unfortunately also ambiguous, as it is used not only to denote semantic orders of reference but also levels of (behavioral or metabolic) arousal and of (individual or social) development. For the sake of clarity, I suggest using the expression 'levels of consciousness' just to denote neuronal or behavioral arousal, the phrase 'orders of consciousness' just to convey the idea of a semantic hierarchy, and 'degrees of consciousness' to refer just to gradations of a developmental kind.
The general aim of the process is to select efficient patterns of action out of those stored during development
(individual history or ontogenesis) while also belonging to a set of genetic possibilities (established during
phylogenesis).
been avoided or the avoidance behaviour itself, one may say that it exhibits first-order
consciousness of the environment (1stOC). As the outcome of basic sensory detection
processes, 1stOC enables online motor coordination, and thus may be called sensorimotor
consciousness.10 Consequently, information accessed at the level of second-order
consciousness (2ndOC) is about the first-order information, and so does not refer directly to
the environment itself. At this level, sensorimotor information comes to be integrated into
basic perceptual wholes or percepts, making perceptual consciousness an appropriate term
here. A creature whose perceptual information is thus integrated should be capable of proto-categorization: apart from obstacle avoidance, it should be able to execute choices, to respond differentially to what counts, for it, as being or not being safe, desirable, useful, edible, and so on. In a sense, it is at this point that the subject's cognitive system starts to answer the 'What is it I am perceiving?' question in addition to the 'Where is it?' question of the previous stage.
As a scientific justification for such a view, we may invoke the well-supported
hypothesis of the existence of two relatively independent pathways in the brain, known as
the dorsal and ventral streams, adapted respectively for motor actions and perception (see
Milner & Goodale, 1998). Recent findings suggest that these two paths are not as
independent as initially assumed (see Farivar, 2009): they are, in some way, coupled
together, functioning as consecutive or integrated steps in an overarching process of
cognition. This surely makes for a picture of what is going on in cognition that is
10
For efficient motor actions, the organism will also need basic (first-order) body-consciousness (based on
proprioception). For our purposes, body-consciousness will be located in the category types of consciousness.
significantly more consistent with the implications of the notion of semantic orders of
consciousness.11
Described like this, both 1stOC and 2ndOC are frequently seen as not amounting to
consciousness at all. These early stages of motor and perceptual responses are more often
labeled as preconscious, subliminal or even unconscious. But why assume that this process
of arriving at successive levels of semantic superstructure only acquires the special feature
we call 'consciousness' at a certain higher-order level? The very fact of information use by a given subject should, regardless of its order, be taken to constitute consciousness of that information, no more and no less, so that no special further ingredient is needed.12
Third-order consciousness (3rdOC), insofar as it transcends the basic perceptual
information present in 2ndOC to yield information about ones own perceptions themselves,
may be termed meta-perceptual consciousness.13 Being aware of the perceptual process, at
this stage, a subject will be able not only to categorize percepts, but also to make basic
choices and predictions within and upon perceptual modalities. For example, he or she will
now know that some particular acoustic and visual data come from the same object, and
may correct the relevant perceptions more efficiently (e.g. matching up the blurred image
of an animal with the distinctive sound it makes). At the next level of abstraction (4thOC),
which takes us beyond perceptual processes, the subject should begin to be aware of the
existence of their own agency and/or selfhood, thus acquiring self-consciousness.
11
The famous D.F. case, studied by Milner and Goodale, is sometimes interpreted as a case of 'seeing without consciousness', but should rather be understood as an example of 'acting without perception'. Indeed,
that was the position of the authors themselves, and remains compatible with the claim that D.F. possessed
first-order visual consciousness at least, but without higher orders of visual awareness.
12
Of course, the essentially pragmatic characterization of consciousness and the notion of information
implied here ought to be spelled out in more detail, but that task will have to be addressed elsewhere.
13
This level of consciousness is sometimes characterized as 'introspective' or 'reflexive' (see e.g. Van Gulick, 2004; Kriegel, 2007). However, because the very notion of introspection is itself ambiguous, while reflexivity may refer to either different orders or the same one, the label 'meta-perceptual' seems more neutral and appropriate.
Admittedly, certain informational elements pertaining to the self will have already had to
be present within consciousness at previous stages, but it is only here that the self as a
whole can become an object of consciousness.14 Although a proto-conception of selfhood
already emerges here, the formation of a coherent conception of self calls for yet another
stage. Fifth-order (5thOC) meta-self-consciousness requires a capacity to engage in symbolic thought about one's own self, the sort of capacity only made possible by something as distinctive as human language.15 Whereas previous semantic levels, up to and
including self-consciousness, are shared by us with other species, meta-self-consciousness
seems to be unique to the human brain.16 Recent studies also point to the evolutionary
immaturity of this highest form of consciousness in respect of what human brains are
potentially capable of (see Fleming et al., 2010).17
To sum up, then, the orders of consciousness, distinguished on the basis of a
semantic criterion, will be as follows:
C.2 Semantic criterion: orders of consciousness
2.1 Sensorimotor consciousness (1stOC, about the environment)
2.2 Perceptual consciousness (2ndOC, about percepts)
2.3 Meta-perceptual consciousness (3rdOC, about perception)
2.4 Self-consciousness (4thOC, about the subject's own self)
2.5 Meta-self-consciousness (5thOC, about the concept of one's self)
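The recursive shape of this hierarchy can be made explicit. The sketch below is my own encoding, not the author's formalism: the content of order n is defined as information about the content of order n-1, bottoming out in the environment at first order.

```python
def referent(order):
    """What consciousness of a given semantic order is about."""
    if order == 1:                 # 1stOC: sensorimotor, about the world
        return "the environment"
    # nth-order content is about (n-1)th-order information, not the world
    return f"order-{order - 1} information (about {referent(order - 1)})"

assert referent(1) == "the environment"
assert referent(2) == "order-1 information (about the environment)"
```

Only first-order consciousness refers directly to the environment; every higher order refers to the order beneath it, just as a meta-language refers to its object-language.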
14
Elements of proprioceptive body-consciousness and a certain social consciousness will have had to be present before the subject becomes self-aware. Empirical data suggest that animals capable of being self-aware learn this mainly from social relations, distinguishing their own bodies by observing others (thanks to mirror neurons and empathy). However, this is not a new thought: it was already endorsed a century ago by pragmatists in America and, in Europe, by Bergson, who was then followed by the phenomenologists (especially Merleau-Ponty).
15
The crucial role of language in higher-order consciousness processes is emphasized, among others, by
Clowes (2007), Stamenov (2003) and Morin (2005).
16
Animals that efficiently recognize themselves in a mirror (passing the so-called mark test) are generally
thought to be self-conscious in virtue of this fact. Indeed, it has been proved that apart from human beings and
great apes, elephants and some marine mammals, such as bottlenose dolphins and orcas, also do this. See
Smith (2009), Plotnik et al. (2006), Reiss & Marino (2001), Delfour & Marten (2001).
17
In an experiment, Fleming and others found that not everybody is able to introspect and evaluate their own conscious decisions with equal accuracy: those who performed best were found to possess a substantially greater volume of gray matter in the region of the anterior prefrontal cortex. The claim that their meta-self-consciousness was fully developed seems legitimate.
19
See Locke (1996/1689, pp. 33-39), Kant (1997/1781, p.153) and Leibniz (1996/1704, preface). Leibniz is
known for the many innovative observations he makes, contributing in essential ways even then to what we
would now call consciousness studies for example in areas such as attention, memory, motivation and
unconsciousness, to name but a few.
21
As an anonymous reviewer of this paper has perceptively pointed out, it is by no means easy to
accommodate so-called self-representational theories (see Kriegel 2006, 2007) within the model of a semantic
ordering of conscious experience, especially given the Tarskian preamble invoked here. Hence such self-representation or self-reference may, in all probability, need to be ruled out by stipulation. Although much
more detailed analysis would be required to properly resolve the issue, we may glimpse the beginnings of
where it might lead by taking due note of the following: while it is true that we distinguish semantic orders of
consciousness much as we do orders of language, by referential content, in the case of consciousness the
question of whether orders are numerically or logically independent from one another remains open.
Certainly, the most important semantic feature that consciousness shares with language is its referentiality, or
aboutness, but the main difference is that consciousness is not embedded in a language-like symbolic system.
Higher-order language has to be richer than a lower-order language system (to avoid self-reference), whereas
higher-order consciousness is embedded in the same cognitive system as lower-order consciousness only it
uses more complex cognitive structures contained within that system. From one perspective, then,
consciousness seems to be a complex but unified biological process, whilst from another it appears as a
multilevelled semantic structure. In that case, the discussions surrounding the nature of its semanticality seem
destined to continue well into the future.
any investigation of its functions redundant (e.g. Watson, 1924). Their attempts, while
apparently successful at first, subsequently provoked the rapid growth of consciousness
studies still visible today.
Contemporary psychologists also often see consciousness as a process exhibiting
gradational structure. For example, four out of six varieties of consciousness described by
Natsoulas (1983, 1997a, 1997b) may be understood as consecutive semantic orders, with that author's 'consciousness six' bearing a close resemblance to the idea of meta-self-consciousness described earlier. The five kinds of self-knowledge that Neisser (1988) distinguishes reveal an even more obvious gradational structure of a semantic nature, moving from the idea of an ecological self to that of a symbolic self-concept. As we have
already mentioned, Morin (2006) has juxtaposed and systematized many theories like this:
gradational orderings are discernible, for example, in both Zelazo's (2004) developmental
approach and the neuroscientific approach of Stuss and Anderson (2004).
In neuroscience, well-known contributions compatible with the semantic orders view include Damasio (1999), involving a distinction between protoself, core consciousness and extended consciousness, and Edelman (1992), analyzing primary consciousness not just in terms of a contrast with higher-order consciousness but also in terms of sublevels (conceptual categorization, scene formation and symbolic
States of consciousness
Whereas the term consciousness was deployed in the previous section in the sense of M2
(X is conscious of Y) when exploring semantics-based orderings, it will now be used in that
22
For instance Faw (2009, pp. 64-6) distinguishes a normal wakeful state of consciousness (NWS), dream-sleep consciousness, slow-wave-sleep consciousness and unconsciousness.
23
This initial selectivity for stimulus importance and familiarity at brainstem level is accomplished in
structures such as the dorsal raphe nucleus, pedunculopontine tegmental nucleus (PPTN) and locus
coeruleus.
24
For discussions of NCCs, see for example, Metzinger (2000), Noë & Thompson (2004), and Hohwy
(2009).
correlated with consciousness (e.g. the prefrontal areas, anterior cingulate gyrus) do not
process information in the early stages, and are involved in complex, far-reaching
connections.25 According to another finding, if activity in those areas occurs for
approximately half a second, with firing-potentials synchronized in time and at a certain
frequency (mainly at gamma-wave level), there is a strong likelihood that the subject will
be conscious.26 We definitely do know something here, but are still awaiting a detailed
specification of how these connections within the T-C complex could give rise to the whole
range of events involved in consciousness. (Dynamical core theory is one of the most
popular accounts of this: see Tononi & Edelman, 2000.)
Things become still more problematic when seeking to differentiate between a
minimally conscious state and one entirely bereft of consciousness (see Dehaene et al,
2006). This is not only theoretically complicated, as manifestations of consciousness are
not well defined, but also challenging in practice, especially when certain disorders and
impairments affect the normal state. In clinical practice, fixing the borders of consciousness is sometimes a life-or-death matter and, as such, is not something we can ever afford to get wrong. Unfortunately, as Giacino (2005) found, up to forty-one per cent (!) of cases may
be misevaluated. Patients with severe consciousness disorders are assessed, in mainly
quantitative terms, against certain scales. The diagnostic procedures are based on the
recording of certain behavioral responses to induced stimuli, with the score obtained on a
given scale crucial to the prognosis for recovery and the planning of treatment. The first
and most popular diagnostic tool, the Glasgow Coma Scale (GCS), is still used in refined
and revised versions; others used today are, inter alia, the CRS and CRS-R (Coma Recovery Scale and its revised form).27
25
However, scientific inquiries are mostly concerned with higher-order phenomenal consciousness of a certain type: e.g., the higher-order experience of vision (visual consciousness).
26
A time delay in higher-order consciousness is revealed for example in the famous ERPs P300 and N400, correlated, respectively, with new and semantically incoherent stimuli.
27
See Teasdale & Jennett (1974), as well as Schnakers et al. (2008). Giacino (2008), moreover, lists seventeen different scales used in consciousness disorder assessments.
21
28 This term, originally coined by Ludwig (1966), was popularized by Tart (1969). Detailed analysis of the many varieties of ASC may be found in Kokoszka (2007), who distinguishes between profoundly altered (PASC) and superficially altered (SASC) states of consciousness.
and psychologically impaired states (ISCs), as well as from altered states (ASCs). It
would, perhaps, be logically correct to group these states of consciousness into the
physiologically normal (WSCs, SlSCs) and abnormal (ISCs, ASCs), but such a grouping,
apart from being ethically questionable, would not be of much use, given that we can only
define normal states in relative terms. Hence, we may pass over it as we proceed to
summarize the distinctions outlined above:
C.3 Physiological criterion: states of consciousness
3.1 Wakeful states (WSCs, occurring in physiological wakefulness)
3.2 Sleep-states (SlSCs, occurring in physiological sleep)
3.3 Impaired states (ISCs, occurring in neuropsychological disorders)
3.4 Altered states (ASCs, occurring in non-standard conditions)
Types of consciousness
Where the semantic criterion was concerned, our focus was on information accessed in
consciousness: its order-of-reference, to be precise. Where the physiological criterion was
concerned, it was on the subject's states of consciousness. In the current section, though,
both factors show up as important. Further distinctions, this time between different types
of consciousness, are made according to a criterion that can be considered pragmatic, as it
concentrates on the following three problems:
(1) What is the major source of the information the subject is conscious of?
(2) For what purposes, and in what circumstances, may the information given be made use
of? Put another way, what is the aim and context-of-use for the consciousness?
(3) Who or what is the subject of consciousness? That is, what type of animal or cognitive
system can, and does, possess consciousness?
As far as (1) is concerned, it is not possible to discern more than a few source-defined types of consciousness (SoTCs). In the case of humans and many animals, what
may be distinguished are just visual and auditory, olfactory and gustatory, tactile and
proprioceptive (or bodily) types. These are listed in pairs, insofar as they stand in close
relations to one another structurally and functionally: e.g., the visual cortex lies close to
auditory areas and both senses may serve as a basis for spatial orientation. It is worth
noticing, however, that some sensations, like pain (important in consciousness studies) and
balance, rely on intertwined inputs from multiple sensory systems. At the same time, not all
sensory systems have distinct sensory organs: proprioception, for example, lacks a
particular organ or even a distinctive type of receptor: smell and taste both rely on
chemoreceptors, touch and hearing on mechanoreceptors. One feature that particularly
calls for further investigation here is the fact that not all types of sensory information count
for the higher orders of consciousness: in human beings, for example, visual consciousness
may certainly inform the symbolic order, but proprioceptive information seems only to
count for a significantly lower one.
In answer to (2), we may assert that there are, indeed, many use-defined types of
consciousness (UTCs), including social consciousness, emotional consciousness, body
consciousness, spatial consciousness, motor-skill consciousness, time consciousness, etc.29
Each of these refers to a certain type of information and an ability to use it in specific
situations. Roughly speaking, information accessed at a given moment (i.e. entering
consciousness) is an outcome of comparisons between external information (i.e. of an
environmental sort) and internal information (i.e. the kind stored in memory systems).
29 Social consciousness is understood here as individual skillfulness in making use of social information (gestures, signs, emotions, etc.), not as a collective mentality emerging within a closely interacting group setting (Pareira & Ricke, 2009, p. 40). The notion of social consciousness, in the latter sense especially, was developed in the early 20th century by Royce (1895), Cooley (1907), Mead (1910) and others.
These cognitive resources differ between individuals, even within a single species. For
example, it is simply not the case that every man or woman has identical social or motor
skills: effectiveness and competence with respect to consciousness of any particular type
will always be a function of individual history, habitus (i.e. ecological niche and social
group) and way of life (such as the amount and type of activities entailed), not to mention
genetic determinations of what is feasible.
An important point to make with respect to UTCs is that cognition embedded in
natural systems is, as far as we know, strongly adapted to use in certain environmental
contexts, fulfilling biologically specified needs (aims): in short, such cognition is always
situated. The organism, finding itself in a given situation, is always committed to making
certain cognitive assumptions, at one and the same time adjusting its sensory-systems
(sensitizing itself to this or that stimulus type) and reducing the set of possible modes of
action (active heuristics). Such a procedure is not only economically justified, but also
practically efficient, in spite of its susceptibility to error. Consciousness in nature is, then,
definitely designed to be of use in specific conditions: usability or situatedness should be
thought of as one of the most important factors when explaining the functioning and origins of
consciousness as a natural phenomenon. For this reason alone, it would be a mistake to
neglect UTCs in favour of other varieties of consciousness described in the article.
As regards (3), where the question concerns the type of animal or cognitive system
that may possess consciousness (system-defined type of consciousness, SyTC), a multitude
of theoretically and practically challenging problems have been raised. Scientists and
philosophers associate consciousness not only with naturally evolved systems, like
chimpanzees, bats, dolphins and fruit flies (see the studies of animal consciousness in
Griffin & Speck, 2004 and Edelman & Seth, 2009), but also with artificial systems (see the
investigations of machine consciousness in Holland, 2003 and Torrance et al., 2007), and
even with counterfactual or hypothetical systems such as Zombies, Martians, Mary the
neuroscientist, thermostats, the population of China, and so on. There are many
fundamental arguments about the form of consciousness possessed by those creatures (is it
phenomenal or not, self-consciousness or merely perceptual consciousness, normal or
somehow altered?) and about the very possibility of possessing it.
Relative to the pragmatic criterion described in this section, the term
'consciousness' may be said to function in both of its basic meanings (M1 and M2), since
both states and referents of consciousness count here as important. In sum, we have sought
to distinguish the following types of consciousness:
C.4 Pragmatic criterion: types of consciousness
4.1 Source-defined (SoTCs, according to type of sensor)
4.2 Use-defined (UTCs, according to type of situation)
4.3 System-defined (SyTCs, according to type of system)
Table: Summary of the four-fold taxonomy of consciousness. Basic meanings involved: M1 = "X is conscious (awake)"; M2 = "X is conscious of Y (aware)".
C.1 Epistemic criterion: kinds of consciousness (M1, M2)
1.1 Subjective consciousness (SKC): phenomenal, first-person, qualitative; for-me-ness, what-it's-like-ness
1.2 Objective consciousness (OKC): access, psychological, third-person, functional
C.2 Semantic criterion: orders of consciousness (M2)
2.1 Sensorimotor consciousness (1stOC): refers to environment; applied in basic motor actions. Examples: sensorimotor awareness, ecological self, proto-self
2.2 Perceptual consciousness (2ndOC): perceptual categorization. Examples: transitive consc., core-consc.
2.3 Meta-perceptual consciousness (3rdOC): inner sense, state consc., introspective, pre-reflexive
2.4 Self-consciousness (4thOC): refers to perceiving subjects; enables self-identification. Examples: self-consc., extended consc.
2.5 Meta-self-consciousness (5thOC): refers to self-conscious subject; enables abstract concept of self. Examples: symbolic self-concept, recursive self-consc.
C.3 Physiological criterion: states of consciousness (M1)
3.1 Wakeful states (WSCs): occur in physiologically normal wakefulness. Example: normal state of waking consc. (NWS)
3.2 Sleep-states (SlSCs): occur in physiologically normal sleep. Examples: REM-consc., NREM-consc.
3.3 Impaired states (ISCs): occur in neurological and psychological disorders of varied etiology. Examples: minimal, blurred, epileptic stupor, delirium, etc.
3.4 Altered states (ASCs): occur in non-standard conditions that cause qualitative changes. Examples: hypnosis, trance, meditation, drug intoxication, OOBE, NDE
C.4 Pragmatic criterion: types of consciousness (M1 and M2)
4.1 Source-defined types (SoTCs): distinguished according to originating receptor type or sensory system. Examples: visual, auditory, olfactory, gustatory, tactile, proprioceptive
4.2 Use-defined types (UTCs): distinguished according to type of situation (aim and context) in which it is used. Examples: emotional, social, face, language, motor-skill, etc.
4.3 System-defined types (SyTCs): distinguished according to type of cognitive system (subject) in which it occurs. Examples: animal, human, machine, artificial, Martian, etc.
Conclusion
The four-fold taxonomy set out here aims to serve as a theoretical framework for
further investigations into consciousness: consciously applied, it should make the concept
clearer and considerably more unified. It is hoped that even if consciousness turns out not
to be a fully unified phenomenon, the taxonomy proposed here will still be useful as a tool
for clarification, possibly enabling a range of philosophers and neurologists to specify their
subject of enquiry more exactly: that is, it could also serve to map out the relationships
between different phenomena (in the absence of an argument for them being unified) in a
theoretically useful way.30 On the other hand, the taxonomy itself makes a substantial case
for the notion that the distinct varieties of consciousness introduced by scientists in fact
refer to a single underlying natural phenomenon analyzed under four major aspects. If that
is so, then the basic criteria applicable within the science of consciousness studies are just
the following: epistemic access to consciousness, the semantics of conscious information,
the physiological underpinnings of the entire process involved in a subject's exhibiting
consciousness and, finally, the pragmatic relations holding between a given subject and
information of which he or she may be said to be conscious.
Is the taxonomy complete? Does it exhaust the concept of consciousness as it
appears in the philosophy of mind, cognitive neuroscience, clinical psychology, and other
related areas? It does seem that almost every conceivable example of consciousness (along
the lines of those mentioned in the introduction, for example) could be fitted into the
taxonomy somewhere. However, in fact there are a few conceptions that cannot be. One of
these is embodied consciousness. Even so, embodiment, unlike other relevant notions,
points to certain meta-theoretical assumptions of an explanatory-methodological character,
rather than to any particular aspects of the phenomenon of consciousness itself, such as
might count as distinct from those included in the taxonomy here.31 That is why embodied
consciousness does not, in truth, form another variety.32 One need not doubt that a few
more examples of this sort are, indeed, to be found: quantum consciousness, for instance,
could be another candidate. What is certain, though, is that issues like this, together with
other important consequences and questions raised by the taxonomy presented, call for a
separate discussion to be pursued on another occasion.33
31 The general aim of the embodiment movement, roughly speaking, is to pull explanations of consciousness out of the head, towards the body and environmental interactions.
32 It seems, however, that a unified concept of consciousness, such as might eventually turn out to be based on this four-fold taxonomy, would in fact be closely attuned to the general meta-theoretical assumptions informing the notion of embodied consciousness.
33 The issues in question are discussed in the present author's 'The four-fold concept of consciousness' (forthcoming).
References:
Armstrong, D. (1981) The Nature of Mind and Other Essays, Ithaca, NY: Cornell University Press.
Alvarez-Silva, S. et al. (2006) Epileptic consciousness: Concept and meaning of aura, Epilepsy & Behavior,
8, pp. 527-533.
Bayne, T. (2009) Consciousness, in J. Symons & P. Calvo (eds) Routledge Companion to the Philosophy of
Psychology, pp. 477-94.
Bickle, J. (2008) The molecules of social recognition memory: Implications for social cognition, extended
mind, and neuroethics, Consciousness and Cognition, 17, p. 472.
Block, N. (1995) On confusion about a function of consciousness, Behavioral and Brain Sciences, 18 (2), pp.
227-287.
Bogen, J. E. (1995). On the Neurophysiology of Consciousness, An Overview. Consciousness and Cognition,
4(1), pp. 52-62.
Bosinelli, M. (1995) Mind and consciousness during sleep, Behavioral and Brain Research, 69, pp. 195-201.
Brook, A. & Raymont, P. (2006) [Online], http://http-server.carleton.ca/~abrook/papers/2006-UnifiedConscPreface.pdf [20.11.2010], (preface to forthcoming book: Unified Theory of Consciousness).
Brook, A. (2008) Terminology in Consciousness Studies, [Online],
http://www.ym.edu.tw/assc12/tutorials.html#02 [20.11.2010], (ASSC conference, 2008).
Carruthers, P. (2005) Consciousness: Essays from a Higher-Order Perspective, Oxford: OUP.
Carruthers, P. (2009) Higher-Order Theories of Consciousness, The Stanford Encyclopedia of Philosophy
(Fall 2009 Edition), Edward N. Zalta (ed.), online: http://plato.stanford.edu/entries/consciousness-higher [08.
2011]
Chalmers, D. (1995) Facing Up to the Problem of Consciousness, Journal of Consciousness Studies, 2 (3), pp.
200-219.
Chalmers, D. (1996) The Conscious Mind: in Search of a Fundamental Theory, Oxford: OUP.
Chemero, A. (2009) Radical Embodied Cognitive Science, MIT Press.
Clowes, R. (2007) A Self-Regulation Model of Inner Speech and its Role in the Organisation of Human
Conscious Experience, Journal of Consciousness Studies, 14 (7), pp. 59-71.
Cooley, C. H. (1907) Social Consciousness, Proceedings of the American Sociological Society, 1, pp. 97-109.
[Online], http://www.brocku.ca/MeadProject/Cooley/Cooley_1907.html [20.11.2010].
Cologan, V. et al. (2010) Sleep in disorders of consciousness, Sleep Medicine Reviews, 14, pp. 97-105.
Crane, T. (2000) The origins of qualia, in Tim Crane & Sarah Patterson (eds) The History of the Mind-Body
Problem, London: Routledge.
Damasio, A. (1999) The Feeling of What Happens: Body, Emotion and the Making of Consciousness.
London: Vintage.
Dehaene, S. et al., (2006) Conscious, preconscious, and subliminal processing: a testable taxonomy, Trends in
Cognitive Sciences, 10 (5), pp. 204-211.
Delfour F. & Marten K. (2001) Mirror image processing in three marine mammal species: killer whales
(Orcinus orca) false killer whales (Pseudorca crassidens) and California sea lions (Zalophus
californianus), Behavioural Processes, 53 (3), pp.181-190.
Dennett, D.C. (1988) Quining Qualia, in Marcel, A. & Bisiach, E. (eds) Consciousness in Modern Science,
Oxford: OUP.
Dewey, J. (1906) The Terms Conscious and Consciousness, Journal of Philosophy, Psychology and
Scientific Method, 3, pp. 39-41. [Online], http://www.brocku.ca/MeadProject/Dewey/Dewey_1906.html
[20.11.2010].
Edelman, G. (1992) Bright Air, Brilliant Fire: On the Matter of the Mind, NY: Basic Books.
Edelman, G. & Tononi, G. (2000) Reentry and the dynamic core: Neural correlates of conscious experience,
in Metzinger, T. (ed), Neural Correlates of Consciousness, Cambridge, MA: MIT Press, pp. 139-151.
Edelman, G. (2003) Naturalizing consciousness: A theoretical framework, Proceedings of the National
Academy of Sciences, USA, 100 (9), pp. 5520-5524.
Edelman, D. & Seth, A. (2009) Animal consciousness: a synthetic approach, Trends in Neuroscience, 9, pp.
476-84.
Farivar, R. (2009) Dorsal-ventral integration in object recognition, Brain Research Reviews, 61 (2), pp. 144-153.
Faw, B. (2009) Cutting consciousness at its joints, Journal of Consciousness Studies, 16 (5), pp 54-67.
Fleming, S. M., Weil, R. S., Nagy, Z., Dolan, R. J. & Rees, G. (2010) Relating Introspective Accuracy to
Individual Differences in Brain Structure, Science, 329 (5998), pp. 1541-1543.
Gennaro, R. (2005) The HOT theory of consciousness: between a rock and a hard place, Journal of
Consciousness Studies, 12, pp. 3-21.
Giacino, J.T. (2005) The minimally conscious state: defining the borders of consciousness, Progress in Brain
Research, 150, pp. 381-95.
Giacino, J.T. (2008) [Online],
http://www.internationalbrain.org/pdf_public/lisbon/Giacino%20Lisbon,%20Advances%20in%20Neurob
ehavioral%20Assessment.pdf [20.11.2010].
Griffin, D. R. & Speck, G. B. (2004) New evidence of animal consciousness, Animal Cognition 7, pp. 5-18.
Hamlyn, D. W., (1968) Aristotle's De Anima Books II and III, Oxford: Clarendon Press.
Hicks, R.D. (1907) Aristotle's De Anima, Cambridge: Cambridge University Press.
Hohwy, J. (2009) The neural correlates of consciousness. New experimental approaches needed?,
Consciousness and Cognition, 18, pp. 428-38.
James, W. (1890/1999) The Principles of Psychology, Bristol: Thoemmes Press.
Holland O. (2003) (ed.) Machine Consciousness, Journal of Consciousness Studies, 10 (4-5).
Johanson, M. et al. (2003) Level and contents of consciousness in connection with partial epileptic seizures,
Epilepsy & Behavior, 4, pp. 279285.
Kant, I. (1781/1997) Critique of Pure Reason, transl. By P. Guyer & A. Wood, Cambridge: CUP
Kokoszka A. (2007) States of Consciousness. Models for Psychology and Psychotherapy. Springer, NY.
Kriegel, U. (2006) Consciousness: Phenomenal Consciousness, Access Consciousness, and Scientific
Practice, In P. Thagard (ed.), Handbook of Philosophy of Psychology and Cognitive Science, Amsterdam:
North-Holland, pp. 195-217.
Kriegel, U. (2007) The same-order monitoring theory of consciousness, Synthesis Philosophica, 2, pp. 361-384.
Kriegel, U. & Williford, K. (2006) Self-Representational Approaches to Consciousness. Cambridge, MA:
MIT Press.
Levine, J. (1983) Materialism and Qualia: the Explanatory Gap, Pacific Philosophical Quarterly, 64, pp.
354-361.
Levine, J. (2001) Purple Haze: The Puzzle of Consciousness, Oxford and New York: Oxford UP.
Levine, J. (2007) Two kinds of access, Behavioral and Brain Sciences, 30 (5/6), pp. 514515.
Leibniz, G. W. (1704/1996) New Essays on Human Understanding, transl. P. Remnant & J. Bennett,
Cambridge: CUP
Ludwig, A. M. (1966) Altered states of consciousness, Archives of General Psychiatry, 15 (3), pp. 225-234.
Lycan, W. G. (1996) Consciousness and Experience, Cambridge, MA: MIT Press.
Mangan, B. (1997) Empirical status of Block's phenomenal/access distinction, Behavioral and Brain
Sciences, 20 (1), pp. 153-154.
Mead, G. H. (1910) Social Consciousness and the Consciousness of Meaning, Psychological Bulletin, 7, pp.
397-405; [Online], http://www.brocku.ca/MeadProject/Mead/pubs/Mead_1910a.html [20.11.2010].
Mead, G.H. (1934) Mind, Self and Society from the Standpoint of a Social Behaviorist (Ed. Charles W.
Morris). Chicago: University of Chicago Press.
Metzinger, T. (2000) (ed.), Neural Correlates of Consciousness: Empirical and Conceptual Questions
(Cambridge, MA: The MIT Press/A Bradford Book).
Milner, D. & Goodale, M. (1998) Visual Brain In Action, Psyche, 4 (12)
Moller, H.J., Devins, G.M., Shen, J. & Shapiro, C.M. (2006) Sleepiness is not the inverse of alertness: evidence
from four sleep disorder patient groups, Experimental Brain Research, 173, pp. 258-266.
Morin, A. (2005) Possible links between self-awareness and inner speech, [Online],
http://cogprints.org/3784/1/IS.pdf [20.11.2010].
Morin, A. (2006) Levels of consciousness and self-awareness, Consciousness and Cognition, 15, pp. 358-371.
Nagel, T. (1974) What is it like to be a bat?, Philosophical Review, 83, pp. 435-451.
Natsoulas, T. (1983) Concepts of consciousness, The Journal of Mind and Behavior, 4 (1), pp. 13-59.
Natsoulas, T. (1997a) Consciousness and self-awareness: Part I. Consciousness1, consciousness2, and
consciousness3, Journal of Mind and Behavior, 18 (1), pp. 53-74.
Natsoulas, T. (1997b) Consciousness and self-awareness: Part II. Consciousness4, consciousness5, and
consciousness6, Journal of Mind and Behavior, 18 (1), pp. 53-74.
Neisser, U. (1988) Five Kinds of Self-Knowledge, Philosophical Psychology, 1, pp. 35-59.
Neisser, U. (2006) Unconscious Subjectivity, Psyche, 12 (3); http://www.theassc.org/files/assc/2642.pdf
[20.11.2010]
Noë, A. & Thompson, E. (2004) Are There Neural Correlates of Consciousness?, Journal of Consciousness
Studies, 11 (1), pp. 3-28; commentaries by other authors: pp. 29-86; response by Noë and Thompson: pp. 87-98.
Pareira, A. & Ricke, H. (2009) What is consciousness? Towards a preliminary definition, Journal of
Consciousness Studies, 16 (5), pp. 28-45.
Patterson, J.R. & Grabois, M. (1986) Locked-in syndrome: a review of 139 cases, Stroke, 17, pp. 758-764.
Pierre, J. (2003) Intentionality, The Stanford Encyclopedia of Philosophy, ed. E. N. Zalta, p. 9
http://plato.stanford.edu/entries/intentionality/#9 [20.07.2011]
Plotnik, J.M., de Waal, F.B.M. & Reiss, D. (2006) Self-recognition in an Asian elephant, Proceedings of the
National Academy of Sciences, USA, 103 (45), pp. 17053-17057; [Online],
http://www.pnas.org/content/103/45/17053.full.pdf [20.11.2010].
Plum, F. & Posner, J. B. (1982) The Diagnosis of Stupor and Coma, Oxford: OUP.
Prinz, J. (forthcoming) Is consciousness embodied?, [Online],
http://www.unc.edu/~prinz/IsConsciousnessEmbodiedPrinz.pdf [20.11.2010].
Reiss, D. & Marino, L. (2001) Self-recognition in the bottlenose dolphin: A case of cognitive convergence,
Proceedings of the National Academy of Sciences, USA, 98 (10), pp. 5937-5942; [Online],
http://www.pnas.org/content/98/10/5937.full [20.11.2010].
Rosenthal, D. (1986) Two concepts of consciousness, Philosophical Studies, 49, pp. 329-359.
Royce, J. (1895) Self-consciousness, Social Consciousness and Nature (II), Philosophical Review, 4, pp. 577-602. [Online], http://www.brocku.ca/MeadProject/Royce/Royce_1895b.html [20.11.2010].
Smith, J. D. (2009) The study of animal metacognition, Trends in Cognitive Sciences, 13 (9), pp. 389-396.
Schnakers, C. et al. (2008) A French validation study of the Coma Recovery Scale-Revised (CRS-R), Brain
Injury, 22 (10), pp. 786-792.
Searle, J. (1992) The Rediscovery of the Mind, MIT Press.
Searle, J. (2000) Consciousness, Annual Review of Neuroscience, 23, pp. 557-578.
Sellars W. (1962) Philosophy and the Scientific Image of Man, in: Robert Colodny, (ed), Frontiers of Science
and Philosophy, Pittsburgh: University of Pittsburgh Press, pp. 35-78.
Stamenov, M.I. (2003) Language and self-consciousness: Modes of self-presentation in language structure in:
The Self in Neuroscience and Psychiatry, Kircher T. & David, A. S. (eds) Cambridge: CUP.
Stuss, D. T. & Anderson, V. (2004). The frontal lobes and theory of mind: Developmental concepts from
adult focal lesion research. Brain and Cognition, 55(1), pp. 6983.
Tarski, A. (1933/1983) The Concept of Truth in Formalized Languages, in Logic, Semantics,
Metamathematics, (ed) J. Corcoran, transl. by J. H. Woodger, Indianapolis: Hackett, pp. 152278.
Teasdale, G. & Jennett, B. (1974) Assessment of coma and impaired consciousness. A practical scale, Lancet
II, 1974, pp. 81-86.
Tart, C. (1969) Altered States of Consciousness: a Book of Readings. New York: Wiley
Tart, C. (1972) States of consciousness and state-specific sciences, Science, 176, pp. 1203-1210.
Thompson, E. & Varela, F. (2001) Radical Embodiment: Neural Dynamics and Consciousness, Trends in
Cognitive Sciences, 5, pp. 418-425.
Torrance, S. Clowes, R. & Chrisley, R. (2007) (eds) Machine Consciousness Embodiment and Imagination,
Journal of Consciousness Studies, 14 (7).
Van Gulick, R. (2004) Higher-order global states HOGS: an alternative higher-order model of consciousness.
In Gennaro, R. (ed.) Higher-Order Theories of Consciousness. Amsterdam and Philadelphia: John
Benjamins.
Vimal, R. L. P. (2009) Meanings attributed to the term 'consciousness': an overview. Journal of
Consciousness Studies: Special Issue on Defining consciousness (Ed. Chris Nunn), 16(5), pp. 9-27.
Vimal, R. L. P. (2010). On the Quest of Defining Consciousness. Mind and Matter, 8(1), pp.93-121.
Wittgenstein, L. (1953) Philosophical Investigations, 4th edition, transl. by G. E. M. Anscombe, P. M. S.
Hacker & J. Schulte, Oxford: Blackwell.
Wolenski, J. (2005) Metateoretyczne problemy epistemologii [Metatheoretical problems of epistemology; available in Polish only], Diametros, 6, pp. 83-84.
Watson, J. (1924/1970) Behaviorism, New York: W. W. Norton & Company.
Zelazo, P. D. (2004) The development of conscious control in childhood, Trends in Cognitive Sciences, 8, pp. 12-17.