
Review

Cognition as a Mechanical Process


Robert Friedman

Retired from Department of Biological Sciences, University of South Carolina, Columbia, SC 29208, USA;
[email protected]

Abstract: Cognition is often defined as a dual process of physical and non-physical mechanisms.
This duality originated from past theory on the constituent parts of the natural world. Even though
material causation is not an explanation for all natural processes, phenomena at the cellular level
of life are modeled by physical causes. These phenomena include explanations for the function of
organ systems, including the nervous system and information processing in the cerebrum. This
review restricts the definition of cognition to a mechanistic process and enlists studies that support
an abstract set of proximate mechanisms. Specifically, this process is approached from a large-scale
perspective, the flow of information in a neural system. Study at this scale further constrains the
possible explanations for cognition since the information flow is amenable to theory, unlike a lower-
level approach where the problem becomes intractable. These possible hypotheses include stochastic
processes for explaining the processes of cognition along with principles that support an abstract
format for the encoded information.

Keywords: animal cognition; cognitive processes; physical processes; stochastic processes

Citation: Friedman, R. Cognition as a Mechanical Process. NeuroSci 2021, 2, 141–150. https://doi.org/10.3390/neurosci2020010
Academic Editor: Lucilla Parnetti
Received: 29 March 2021; Accepted: 21 April 2021; Published: 22 April 2021
Copyright: © 2021 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction

1.1. The Many Definitions of Cognition

Common definitions of cognition often include the phrase mental process or acquisition of knowledge. Reference to mental processing descends from an assignment of non-material substances to the act of thinking. Philosophers, such as the Cartesians and Platonists, have written on this topic, including the relationship between mind and matter. This perspective further involves concepts such as consciousness and intentionality. However, these ideas are based on metaphysical explanations and not on a modern scientific interpretation [1].

The metaphysical approach is exemplified by the philosopher Plato and his Theory of Forms, a hypothesis of how knowledge is acquired. The idea is that a person is aware of an object, such as a kitchen table, by comparison with an internal representation of that object's true form. The modern equivalent of this hypothesis is that our recognition of an object is by the similarity of its measurable properties with its true form. According to this theory, these true and perfect forms originate in the non-material world.

However, face recognition in primates shows that an object's measured attributes are not compared against a true form, but instead that recognition is from a comparison between stored memory and a set of linear metrics of the object [2]. These findings agree with studies of artificial neural networks, an analog of cerebral brain structure, where objects are recognized as belonging to a category without prior knowledge of the true categories [3].

The theory of true forms originates from a view of a perfectly designed world with deterministic processes, while a theory absent of true forms may instead depend on probabilistic processes. The rise of probabilistic thinking in natural science has coincided with modern statistical methods and explanations of natural phenomena at the atomic level [4].


A modern experimental biologist would approach a study of the mind from a material
perspective, such as by the study of the cells and tissue of brain matter. This approach is
dependent on reduction of the complexity of a problem. An example is from economics,
where an individual is generalized as a single type and consequently the broader theories
of population behavior are based on this assumption [5]. There is a similar approach in
Newtonian physics where an object’s spatial extent is simplified as a single point in space.
Since some natural phenomena are not tractable to mechanistic study, concepts exist
that are not solely based on material and physical causes. However, it is necessary to
base a scientific theory of brain function on natural mechanisms while disallowing mental
causation. There are exceptions where the physical world is visually indescribable and
solely dependent on mathematical description, but these occurrences are typically not
applicable to the investigation of life at the cellular level.

1.2. Mechanical Perspective of Cognition


Even though a mechanical perspective of neural systems is not controversial, there
remains a non-mechanical and metaphysical perspective concerning our sensory perception
of the world. An example is the philosophical conjecture about the relationship between
the human mind and any simulation of it [6]. This conjecture is based on assumptions
about intentionality and the act of thinking. However, others have presented scientific
evidence where these assumptions do not hold true [7]. One example is the mechanism
for an intent to move a body limb, such as in the act of walking. Whereas the traditional
perspective expects a mental process of thinking that leads to the generation of these body
movements, instead the mechanistic perspective is that a neuronal cell is the generator of
the intent of a body movement [8].
While a metaphysical explanation for phenomena is applicable to some areas of
knowledge, such as in the study of ethics, these explanations are not informative of nature
where the physical processes are expected. In the case of neural systems, the neurons,
their connections, and the neural processes are measurable by their properties, so their
phenomena are assignable to material causes instead of mental causes. Further, there is a
hierarchy of cellular organization that describes the brain where each level of this hierarchy
is associated with a particular scientific approach [9]. An example is at the cellular level
where the neurons are studied by the methods of cellular anatomy. This area of study also
includes the mechanisms for neuron formation and communication between neurons.
Neural systems may be studied at a higher-level perspective, such as at the level of
brain tissue or how information is communicated throughout the neural system [10]. The
information processing of the brain is particularly relevant since it has a close analog with
the artificial neural network architectures of computer science [11,12]. However, the lower
levels of biological organization are not as comparable, such as where an artificial neural
system is firmly based on an abstract and simplified concept of a neuronal cell and its
synaptic functions.
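The abstract and simplified concept of a neuronal cell mentioned above can be sketched in a few lines: synaptic inputs are weighted, summed, and passed through a nonlinear activation. This is a generic illustration, not a model from the cited studies; the function name, weights, and bias values are hypothetical.

```python
import math

def artificial_neuron(inputs, weights, bias):
    """A simplified model of a neuronal cell: synaptic inputs are
    weighted, summed, and passed through a nonlinear activation
    (here a logistic function standing in for a firing rate)."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# Strong net excitation drives the output toward 1 (a high firing
# rate); strong net inhibition drives it toward 0.
excited = artificial_neuron([1.0, 1.0], [4.0, 4.0], bias=-2.0)
inhibited = artificial_neuron([1.0, 1.0], [-4.0, -4.0], bias=-2.0)
```

Artificial neural systems are built by connecting many such units in layers, which is why they are only loosely comparable to the biological levels of organization described above.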

1.3. Purpose of This Review


This review is a search for a modern scientific definition of cognition. This mechanistic
perspective is ideally approached at the higher scale of a neural system—the flow of infor-
mation. Since cognition involves knowledge, the informational level is the most relevant.
The purpose here is to provide a solid foundation for building a theory on cognition that is
free of the constraint of metaphysics. This includes rejection of traditional terminology that
is not informative in explaining the cognitive processes. The language from metaphysics
detracts from the scientific questioning process and inhibits the construction of a language
for explaining cognitive mechanisms.
Common biological processes, including evolutionary theory, are also introduced here
as a guide for helping define cognition. This guide restricts the possible explanations for
the traits of cognition since these traits are constrained in their capacity for change. There
is also an emphasis here on a putative process of how information is encoded in a neural
system. Most of the examples are in the visual system since that is the better studied of the
sensory systems, and is supported by the theories of optics and information flow. Lastly,
there is a section on general cognition that approaches the problem from an evolutionary
perspective.

2. Mechanisms of Visual Cognition


2.1. Stochastic Processes in Biology
Vision is the better studied of the sensory systems in primates [13,14]. It is particularly
relevant since the visual processes occupy one-half of the cerebral cortex [15]. There is
theory from the cognitive sciences that both vision and language are the major drivers for
acquiring knowledge and perception of the world. It may seem daunting to imagine that
our vivid awareness of a scene is built upon levels of basic physical processes. However,
cellular life has generated a high degree of complexity by layering physical processes, such
as mutation and population exponentiality, over an evolutionary time scale.
This problem of causation of complex phenomena has occurred in explanations for the
origin of the camera eye. The formation of a camera eye that has transformed from a simpler
organ, such as an eye spot, requires a model with a very large number of advantageous
modifications over time [16,17]. A casual observer of the different forms of eyes, such
as for this case, would find it difficult to imagine a material process that could design
a functional camera eye from a simpler form. The experienced observer would instead
invoke biological processes, such as random morphological change [17] and selection for
those changes that favor an increase in the rate of offspring production. The result is the
potential for a complex adaptation.
Further evidence that the formation of a camera eye is within the reach of natural
processes is provided by the analogous camera eye in a lineage of invertebrate cephalopods.
This resulted from an adaptation that occurred independent of the origin of the vertebrate
camera eye. Yet, another case of Darwinian evolution is in the optimized refractive index
of the camera eye lens. This adaptation occurred by modifications that led to recruitment
of protein molecules from other uses to the lens of the eye [18].
There is another case of independent evolution as observed in the neural circuitry of
animals. The circuit for motion detection in the visual field has converged on a similar
design in two different eye forms, both the invertebrate compound eye and the mammalian
camera eye [19]. These examples show evolutionary convergence on a similar physical
design and evolution’s potential for forming complex biological systems. In addition,
the process of evolutionary convergence is dependent on developmental constraint on
the kinds of modifications, otherwise the chance of convergence on a single design is
expectedly low.
These are all examples of natural engineering of life forms by stochastic processes.
They are not deterministic processes since they are not directed toward a final goal, but instead the adaptations are continually undergoing change by genetic and phenotypic causes.
The neural system of the brain is a direct analog of the above processes. The organ is
considered highly complex and our perceptions are not easily translated to cellular level
mechanisms. However, by the same probabilistic processes, the neurons and their inter-
connections have evolved into a cognitive system that is capable of complex computation
with large amounts of sensory data. These cognitive processes include the identification
of visual objects, encoding of sensory data to an efficient format, and pattern matching of
visual objects to memory.

2.2. Abstract Encoding of Sensory Input


The biologically plausible proximate mechanism of cognition originates from the
receipt of high dimensional information from the outside world. In the case of vision, the
sensory data consist of reflected light rays that are absorbed across a two-dimensional
surface, the retinal cells of the eye. These light rays range across the electromagnetic spectrum, but the retinal cells are specific to a small subset of all possible light rays.
From an abstract perspective, the surface that receives the visual input is a two-dimensional sheet of cells where each cell has an activation value at a point in time (Figure 1). Over a length of time, the distribution of these activations is undergoing change, so the neural system is reporting from a dynamic state of activations. This view at the visual surface is representative of both the spatial and temporal components of the proximate cause of vision.

Figure 1. An abstract representation of data that are received by a sensory organ, such as light rays that are absorbed by cells along the surface of the retina of a camera eye. The drawing shows the spatial pattern, but there is also a temporal dimension since this sensory input data are changing over time.
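The abstraction in Figure 1, a two-dimensional sheet of cells whose activations change at every time step, can be sketched as a grid regenerated per step. This is a deliberately minimal stand-in: real retinal activations are driven by the scene, while here they are randomized only to show the dynamic, spatiotemporal character of the representation.

```python
import random

def sensory_sheet(rows, cols, steps, seed=0):
    """Model the receiving surface as a 2D grid of cells, each with an
    activation value in [0, 1) that changes at every time step, so the
    sheet reports a dynamic spatial pattern rather than a static image."""
    rng = random.Random(seed)
    frames = []
    for _ in range(steps):
        frames.append([[rng.random() for _ in range(cols)]
                       for _ in range(rows)])
    return frames

# Three successive "snapshots" of a 4x4 sheet of cells.
frames = sensory_sheet(rows=4, cols=4, steps=3)
```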

This representation of sensory data is similar to that received by artificial neural network systems. These artificial systems are capable of identifying objects in a visual scene and labeling them by their membership to a category of related objects. This also shows analogous function between the artificial process and natural cognition [20].
The open problem has been generalizing this knowledge (transfer learning) that is acquired from processing sensory input data. This is the essential problem for artificial systems in emulating cognition in animals. However, there is recent work that employs artificial models of transfer learning [21,22].
A related problem is in identifying an object where the viewpoint is variable. It is addressed by a model [3] that is designed for biological realism, along with a robust architecture for sampling the parts of an object. This approach includes the sampling of visual data which are then encoded in an abstract format, a vector of number values. Specifically, this sampling occurs across blocks of columns in a visual scene. Further, each column consists of a set of vectors where each vector is assigned to a discrete category by its level of representation of the input data (Figure 2). These processed data are then utilized for finding columns of similarity that correspond to the parts of an object, a consensus-based approach toward establishing a robust identification of an object.
Figure 2. A model for processing of visual objects. The first panel shows a visual scene. The next panel shows an open circle which represents a region with a potential object. The third panel is an enlargement of this region. The final panel contains three open diagonal shapes that are abstract representations of the information in the image. They are ordered from bottom to top by low to high level of abstraction.

Previous approaches to artificial systems have often overfit the network model to a training data set. Overfitting hinders the generalizability of the final model [23]; in this case, the model is a network of nodes interconnected with weight values. The overfitting problem leads to loss of transferability of the model to other applications. Nature solves this problem by a set of processes. One is the visual processing for spatial and temporal invariance of an object in a scene [24,25]. This leads to a more generalized form of the object than otherwise.

A second and complementary method is to neurally code the object by metrics that are abstract and generalizable. This reflects the example where a photograph of a cat is encoded so that it matches both another photograph and a pencil sketch of the cat. This generalizability in identifying objects is now possible in the case of artificial systems [26]. Additionally, this generalizability leads to corrections for the variability in an object's form, such as change in its orientation, deobfuscation against the background, or detection based on a partial view (Figure 3).
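The idea that an abstract encoding can match an object across a transformation such as rotation (compare the rotated digit in Figure 3b) can be shown with a toy descriptor that ignores orientation. This is a crude illustration, not the encoding used by the systems cited above: a 90-degree rotation only exchanges rows and columns, so a descriptor built from the multiset of row and column sums is unchanged.

```python
def rotate90(grid):
    """Rotate a 2D image 90 degrees clockwise."""
    return [list(row) for row in zip(*grid[::-1])]

def descriptor(grid):
    """A deliberately orientation-blind encoding: the sorted multiset
    of all row and column sums. Rotation permutes rows and columns
    among themselves, so the descriptor is invariant, and matching
    happens in this abstract space rather than on raw pixels."""
    rows = [sum(r) for r in grid]
    cols = [sum(c) for c in zip(*grid)]
    return sorted(rows + cols)

# A crude 3x3 binary "digit" and its rotation: the raw pixel arrays
# differ, but the abstract encodings match.
digit = [[1, 1, 1],
         [0, 0, 1],
         [1, 1, 1]]
rotated = rotate90(digit)
```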

(a) (b)
Figure 3. (a) Figure
The first3.panel
(a) The first panel
shows shows aofphotograph
a photograph of athat
a visual scene visual scene athat
contains contains
table a table
along with along
other with The second
objects.
other objects. The second panel in (a) (a)the same scene but transformed so that it appears
is (b)as a pen-
panel in (a) is the same scene but transformed so that it appears as a pencil sketch drawing; (b) The first panel is a visual
cil sketch drawing; (b)3.The first panel is a visual drawing of the of
digit nine (9), while the next panel
drawing of the digit nine (9),Figure
while the (a)next
Thepanel
first panel
is theshows a photograph
same digit a visual
but transformed scene
by rotationthat contains
of the image.a table along with
is the same digit but objects.
other transformed by rotation
The second panelofinthe
(a)image.
is the same scene but transformed so that it appears as a pen-
2.3.sketch
cil Perception
drawing; as a(b)
Mechanical Process
The first panel is a visual drawing of the digit nine (9), while the next panel
2.3. Perceptionisasthea Mechanical
same digit Process
but transformed by
There is an extensive amount of visual rotation of the image.
processing in the brain since it occupies one-
There is an
halfextensive amount
of the cerebral of visual
brain processing
tissue [15]. Further,inthe
thenumber
brain since it occupies
of neurons one- exponentially
increases
2.3.
fromPerception
half of the cerebral millions as
brain tissueinathe
Mechanical
[15]. Process
Further,
earlier thepathways
visual number oftoneurons
billions increases exponen-
in the higher layers of the cere-
tially from millions
brumTherein the
[15]. earlier
is an
This visualof
extensive
hierarchy pathways
amount
processes to billions
of visual
creates our in
processingthe in
visual higher layers
the brain
perception of the
since
of the it occupies
world, one-
but there
cerebrum [15].half
This
is noof hierarchy
the cerebral
evidence of processes
that abrain creates
tissue of
perception [15]. our visual
Further,
a scene perception
the number
is processed of the
by aofsingle world,
neurons but
increases
cognitive exponen-
path. Studies
there is no evidence
show from
tially thatan
that a perception
object in
millions ofearlier
is the a scene
identified is processed
independent
visual pathwaysofbythe
a single
to visualcognitive
billions scene
in theand path.
its attributes
higher layers of are
the
modified
Studies show cerebrum to
that an object disfavor
[15].isThis variability
identified ofin
independent
hierarchy its appearance,
processes of the so
ourthat
visual
creates sceneany
visualandtransformationof theofworld,
its attributes
perception the object
but
does not
there is nolead to misclassification
evidence error of
that a perception [27].
a scene is processed by a single cognitive path.
Studies show that an object is identified independent of the visual scene and its attributes
NeuroSci 2021, 2 146

Temporally, the advanced sensory processing occurs over a millisecond time scale [24],
so it is not expected that perceptions occur in real time. Instead, cognitive processes
create an internal representation, a facsimile, of the sensory data, and that construction
is the perception.
Studies have further divided perception and awareness into multiple types, but in
all cases these cognitive processes are a mechanical construction of the outside world [7].
These internal models that form our representation of the world are material processes,
including the perceived awareness of objects, a scene, and the occurrence of events. The
physical events that occur over time in a scene are also time delayed and the length of
that delay is subject to perception. Therefore, the representation of the time delay is not
calibrated with real time. Artificial neural networks show analogous processes with models
that are capable of predictive coding, such as completing a written sentence or the next
frame of a visual image [28].
Visual perception also includes other processes, such as the transformation of a
scene’s brightness and contrast levels [29]. This may help in identifying objects against a
background. Further, the cerebral processing in vision is more extensive than that of the
early steps along the visual pathway, so it is reasonable to assume that the perceptual image
is weakly correlated with the initial retinal input or the earlier-path internal representations
of the visual data.
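The transformation of a scene's brightness and contrast levels can be sketched as a normalization step; the following is an illustrative stand-in for such preprocessing, not the mechanism described in [29]. Rescaling intensities to zero mean and unit spread maps a uniformly brighter, lower-contrast version of a scene to the same internal representation.

```python
def normalize_contrast(values, eps=1e-9):
    """Rescale a list of pixel intensities to zero mean and unit
    spread, so that overall brightness (the mean) and contrast (the
    spread) no longer distinguish two views of the same scene."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    std = var ** 0.5
    return [(v - mean) / (std + eps) for v in values]

scene = [10.0, 20.0, 30.0, 40.0]
# The same scene, viewed brighter and at half the contrast.
brighter_low_contrast = [v * 0.5 + 100.0 for v in scene]
a = normalize_contrast(scene)
b = normalize_contrast(brighter_low_contrast)
```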
Last, the limit on the number of evolutionary and developmental outcomes restricts
the possible hypotheses about cognition. For example, the evolution of the camera eye
expectedly occurred by modifications of small effect, along with the accompanying adapta-
tions in cognition. This predicts that the artificial systems can emulate the visual cognitive
processes by a finite number of steps as represented by an algorithm. This has held true
since deep learning methods are competitive with our cognitive ability to identify objects
and process natural language.

2.4. Cognition as a Pattern Matching Process


To find a class of similar visual objects, a comparison to memory is necessary. The
informative comparisons occur at particular dimensions in the visual input data. This
pattern matching and sampling process is the expected model of cognition.
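The claim that informative comparisons occur at particular dimensions of the input can be sketched as a nearest-neighbor match against stored memory that is restricted to a chosen subset of dimensions. The stored patterns, labels, and choice of informative dimensions below are hypothetical, invented only to illustrate the pattern matching step.

```python
def match_to_memory(sample, memory, dims):
    """Compare a sampled input against stored memory patterns, but
    only at the informative dimensions (dims); return the label of
    the closest stored pattern. The comparison is a pattern match
    rather than a full comparison of the raw input."""
    def distance(label):
        return sum((sample[d] - memory[label][d]) ** 2 for d in dims)
    return min(memory, key=distance)

# Hypothetical stored patterns; dimensions 0 and 2 are treated as
# informative, dimension 1 as noise to be ignored.
memory = {"cup": [1.0, 0.0, 0.0], "ball": [0.0, 0.0, 1.0]}
observed = [0.9, 5.0, 0.1]  # a noisy view of a cup
label = match_to_memory(observed, memory, dims=[0, 2])
```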
Animal cognition expectedly handles these pattern matching problems at the lower di-
mensional levels of information. In contrast, artificial systems are often designed to encode
and process at a higher dimensional level, such as for the unmodified two-dimensional
pixel data in the case of vision. Another example is for a grid of values and rules of a
deterministic boardgame, such as chess. It is known that the high dimensional information
is transformed to a lower dimensional form in the layers of the neural network, but a naive
approach to these tasks has not been consistent with the goals of transfer learning.
A chess game among human players is mainly based on recognition of patterns of
chess pieces on the board, along with a limited capacity to predict future possibilities for
the state of the chess board [30]. Instead, the artificial systems are often designed by a
different approach. They typically compute a best move by heuristic searching through all
possible outcomes from all possible game moves, a method that is exhaustive in its search
of possible combinations of board states [31].
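The exhaustive style of game-tree search described above can be illustrated on a game far smaller than chess. The subtraction game below (take one to three stones; taking the last stone wins) is a stand-in chosen because its full tree is searchable in a few lines; the brute-force recursion over every line of play is the same in kind, though vastly smaller in scale, as a classical chess engine's search.

```python
def best_move(n, memo=None):
    """Exhaustively search every line of play in the subtraction game
    and return a winning number of stones to take for the player to
    move, or None if every line of play loses. Positions where n is a
    multiple of 4 are known losses, which the search rediscovers."""
    if memo is None:
        memo = {}
    if n in memo:
        return memo[n]
    result = None
    for take in (1, 2, 3):
        if take == n:
            result = take  # taking the last stone wins outright
            break
        if take < n and best_move(n - take, memo) is None:
            result = take  # opponent is left with no winning move
            break
    memo[n] = result
    return result
```

Even with memoization, this style of search grows with the size of the game tree, which is the contrast drawn above with human pattern-based play.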
A human player searches through a small set of possible outcomes in complex
boardgames. The alternative approach based on a low level representation of the board
state leads to a computational problem with complexity that is likely beyond the capacity
and energetics of the brain. Since a human player is mainly restricted to observing pat-
terns of pieces on the board, it is expected that natural cognition is mainly operating on
the information at a state of lower dimensionality. There is empirical support for these
ideas, too [32].
Similarly, transfer learning is likely occurring at a lower dimensionality than is present
in the unprocessed input source data. Natural cognition receives high dimensional sensory
input, applies a robust sampling process, and the input data are reformulated for constructing a perceptual model. This perception is an internal, high-level representation derived from the source data. In the case of vision, the sampling occurs
across a scene, across an object, and then it is possible to also sample across the internal
representation of that object. These are statistical processes that are expected in modeling
the variability of sensory objects (Figure 3). Without the robust sampling process, an
identification of an object is expectedly overfitted to a form not represented in memory,
therefore impeding any process of transfer learning.
These cognitive processes may also be described as a reduction of complexity in the
sensory input data, along with extraction of relevant information for downstream cognitive
processing. Likewise, it is already known that visual scenes are highly compressible [33]
and consequently both natural and artificially designed systems are capable of extracting
visual objects from a scene. This processing leads to an internal representation of objects and
their properties. This process is complemented by preprocessing pathways for efficiency
in cognition, such as internal correction of overall brightness and contrast levels in a
visual scene.
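The observation that visual scenes are highly compressible can be shown with the simplest of compression schemes. Run-length encoding is a stand-in here, not the compression discussed in [33]: a scene row dominated by uniform regions (sky, a wall) collapses to a handful of runs, so the raw pixel count greatly overstates the information content.

```python
def run_length_encode(pixels):
    """Compress a sequence of pixel values by collapsing each run of
    identical values into a (value, count) pair."""
    encoded = []
    for p in pixels:
        if encoded and encoded[-1][0] == p:
            encoded[-1] = (p, encoded[-1][1] + 1)
        else:
            encoded.append((p, 1))
    return encoded

def run_length_decode(encoded):
    """Invert the encoding back to the raw pixel sequence."""
    return [p for p, count in encoded for _ in range(count)]

# A 100-pixel scene row with large uniform regions reduces to 3 runs.
row = [0] * 40 + [7] * 15 + [0] * 45
packed = run_length_encode(row)
```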

3. General Cognition in Animals


3.1. Cognition and Essential Animal Behavior
A definition of general cognition includes the communication of abstract representa-
tions and functions related to pattern matching. This definition applies to both a natural
and artificial design. However, animal cognition has the component of general cognition
that is confounded with processes related to essential animal behavior. Insight into these
differences is available from knowledge of the evolution and development of animals.
For example, it is necessary for animal populations to consist of individuals with
a common set of behaviors. Examples of these include an adult form that survives to
reproductive age and that sufficient progeny are produced to maintain the population. If
a population does not maintain a sufficiently high birth rate to counteract the death rate,
then the population will become extinct over a number of generations. This concept is
a mathematical necessity. Since evolutionary time is very long, even slight changes in
animal behavior may lead to population extinction, a process that is highly frequent across
the history of life [34]. Therefore, cognition is not at all likely a standalone process, but
instead heavily influenced by behavior that ensures reproduction and survival at each
relevant life stage.
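The mathematical necessity referred to above, that a birth rate persistently below the death rate ends in extinction, can be made concrete with a simple geometric model. The per-generation rates below are illustrative values, not data from [34], and the model ignores stochastic and density-dependent effects.

```python
def generations_to_extinction(population, birth_rate, death_rate):
    """Iterate a per-generation model in which the population is
    multiplied by (1 + birth_rate - death_rate) each generation.
    When births do not offset deaths, the decline is geometric and
    the population falls below one individual (extinction) in a
    finite number of generations; otherwise return None."""
    growth = 1.0 + birth_rate - death_rate
    generations = 0
    while population >= 1.0:
        population *= growth
        generations += 1
        if generations > 10_000:  # safety cap for non-declining cases
            return None
    return generations

# Even a slight excess of deaths over births dooms a large population
# on an evolutionary time scale.
gens = generations_to_extinction(population=1_000_000.0,
                                 birth_rate=0.50, death_rate=0.51)
```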
It is possible to imagine animal cognition without essential animal behavior. This is
the presumed state of an artificial cognitive system that is not specifically programmed with
a set of behavioral characteristics. There are popular conjectures that models of artificial
cognition may lead to a metaphysical property of animal cognition, such as intentionality
or consciousness. However, these properties are unsupported from a mechanistic
perspective of brain computation. Instead, any artificial design of cognition is essentially
the same as any tool that is undirected by design [35]. History is a better judge of how
undirected tools are utilized than conjectures that confound a cognitive process with
non-material causes.

3.2. Cognition and Large-Scale Neuroanatomical Changes


In the case of mammals, adaptations may occur that require enhancement or reduction
to one or more of the functions of cognition. This leads to the prediction that there is not
a hierarchy of general intelligence by brain size, but instead that the cognitive capacity,
whether visual, auditory, or somatosensory, is a complex phenotype that is subject to
evolution at the different levels of cellular organization and in specific cerebral regions.
In the case of the human lineage, it is arguable that general cognition has expanded
to meet the requirements of advanced speech, speech perception, and representation of
abstract concepts [36,37]. This is one hypothesis for the proximate cause of the evolution
of cognition in recent hominins. However, the hypothesis for an expansion in cognitive
function is not necessarily a one-to-one relationship with brain size, as exemplified in other
mammals.

NeuroSci 2021, 2 148

It has been shown in the cerebral and cerebellar regions of whales that cognitive
capability is not simply described by a change in neuron count or density [38].
To reiterate, morphological changes of the brain and its regions do not necessarily
correlate simply with cognitive function. The addition and subtraction of cognitive
capabilities, such as observed by contrasting species of marine and terrestrial mammals,
are complex phenomena that are molded by evolution and development. Therefore, it is
problematic to oversimplify the relationship between molecular or anatomical characters
and a cognitive function.

3.3. Cognition as a Physiological Process


The brain is an organ with a physiology that is explainable at the different biological
scales. At the molecular level, the neurons are described by a vast number of cellular
processes, along with the electrochemical signaling that interconnects the system. There
is also a higher scale process that codes for the internal neural representations of sensory
data. Both these scales have an analog in other organs, such as the cellular composition of
the heart and its electrical system that controls its pumping action.
However, the brain is commonly separated from the other organs and assigned a
role that is both biological and metaphysical. One example is illustrated by the diverse
set of academic disciplines that study the brain, such as the cognitive sciences, sociology,
and cellular biology. These disciplines often approach the problem from different scales and
perspectives. Not all approaches are amenable to the study of proximate mechanisms, such
as for areas in clinical psychology or the philosophy of mind, and this is one reason for the
current retention of explanations based on mental processes [7].
Instead, it is more efficient to study cognition and brain computation as a product of
physical forces and the communication of information across the system. This scale provides
a better foundation for experiments, whether theoretical or empirical. Theoretical study
is possible through models from network science and information theory, both mature
areas of inquiry. Another approach comes from computer science and the use of artificial
neural networks. For example, a recent study of artificial neural networks demonstrated
that colors in a visual scene are efficiently expressed by language [39]. This is evidence
for efficiency in the neural coding of information and that artificial models
emulate this efficiency.
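The sense in which naming compresses color can be sketched with a toy quantization: a continuous hue is mapped to a small set of discrete names, trading precision for a shorter code. The category boundaries and names below are hypothetical illustrations, not drawn from the cited study.

```python
import math

# Hypothetical color categories: each hue (0-360 degrees) maps to one name.
COLOR_NAMES = ["red", "yellow", "green", "cyan", "blue", "magenta"]

def name_color(hue_degrees: float) -> str:
    """Quantize a continuous hue into one of six named categories.

    Six names cover the whole hue circle, so transmitting a name costs
    only log2(6) ~ 2.6 bits, versus the many bits needed to transmit an
    exact hue. This is the sense in which naming compresses color.
    """
    sector = int(hue_degrees % 360 // 60)  # 60-degree sectors
    return COLOR_NAMES[sector]

bits_per_name = math.log2(len(COLOR_NAMES))
print(name_color(200.0), round(bits_per_name, 2))  # cyan 2.58
```

An efficient naming system places its category boundaries so that the few available names carry the most useful distinctions for the listener, which is the kind of trade-off the cited study measured in communicating networks.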
In summary, at its essence, the study of cognition is the study of the physical processes of the brain.
As with the organs of most animals, the brain has evolved and acquired derived characters,
as documented across the history of life. At the lower biological scales, the brain is an
organ no more complex than the heart. Moreover, the proximate mechanisms of cognition,
the input and coding of sensory data, are not unimaginable in complexity. Instead, it is an
organ with a physiology that is tractable for study at the different scales. As the heart is an
organ with a physiology that includes the pumping of blood, the brain is an organ with a
physiology that includes the communication of information.

4. Suggestions for the Natural and Computer Sciences


The cognitive sciences are broad in scope and methods. It is important to continue to
integrate their findings with the other natural sciences and with computer science. Past
findings on perception and awareness have not permeated some of the other areas of
knowledge, but eventually any metaphysical basis will yield to a more material definition
of cognition [8]. This definition includes an expectation of probabilistic processes that
are an essential part of cognition, along with an encoding process that efficiently stores
information from the outside world. For the acquisition of knowledge of the world and to
form generalizations, it is expected that sensory information is internally represented at a
higher level of representation, one that is built from the lower levels.
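The idea of a higher level of representation built from lower levels can be sketched as a toy feature hierarchy. Every feature name here is a hypothetical placeholder; real neural representations are learned and distributed rather than hand-listed.

```python
# Hypothetical two-stage hierarchy: each mid-level representation is
# defined as a set of lower-level features detected in the sensory input.
MID_LEVEL = {
    "corner": {"edge_horizontal", "edge_vertical"},
    "curve": {"edge_diagonal", "edge_vertical"},
}

def detect_mid_level(low_level_features: set) -> set:
    """Return each mid-level representation whose component
    lower-level features are all present in the input."""
    return {name for name, parts in MID_LEVEL.items()
            if parts <= low_level_features}

# Horizontal and vertical edges compose into a corner, but not a curve,
# since no diagonal edge was detected in this input.
print(detect_mid_level({"edge_horizontal", "edge_vertical"}))  # {'corner'}
```

Generalization falls out of the composition: any input containing the right lower-level parts activates the higher-level representation, regardless of where or how those parts arose.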
Computer scientists should remain skeptical of assertions about their artificial systems.
The benchmark for these systems, such as the advanced deep learning approaches, is not
some process of thinking. These are metaphysical ideals that are not material in origin. A
neuron is the proximate cause of cognition and does not possess a special immeasurable
quality. If a deep learning approach is an effort to emulate a pathway by which we acquire
knowledge, then a valid and realistic model should be established beforehand. The problem
is not whether the artificial systems can emulate the metaphysics of human thinking, a
false proposition, but whether these systems are emulating a specific and measurable
cognitive process.

Funding: This research received no external funding.


Conflicts of Interest: The author declares no conflict of interest.

References
1. Vlastos, G. Parmenides’ theory of knowledge. In Transactions and Proceedings of the American Philological Association; The Johns
Hopkins University Press: Baltimore, MD, USA, 1946; pp. 66–77.
2. Chang, L.; Tsao, D.Y. The code for facial identity in the primate brain. Cell 2017, 169, 1013–1028. [CrossRef] [PubMed]
3. Hinton, G. How to represent part-whole hierarchies in a neural network. arXiv 2021, arXiv:2102.12627.
4. Jeans, J.H. Physics and Philosophy; Cambridge University Press: Cambridge, UK, 1942.
5. Smith, A. An Inquiry into the Nature and Causes of the Wealth of Nations, 1st ed.; A. Strahan and T. Cadell: London, UK, 1776.
6. Searle, J.R.; Willis, S. Intentionality: An Essay in the Philosophy of Mind; Cambridge University Press: Cambridge, UK, 1983.
7. Haggard, P. Sense of agency in the human brain. Nat. Rev. Neurosci. 2017, 18, 196–207. [CrossRef] [PubMed]
8. Huxley, T.H. Evidence as to Man’s Place in Nature; Williams and Norgate: London, UK, 1863.
9. Ramón y Cajal, S. Textura del Sistema Nervioso del Hombre y de los Vertebrados; Nicolás Moya: Madrid, Spain, 1904.
10. Kriegeskorte, N.; Kievit, R.A. Representational geometry: Integrating cognition, computation, and the brain. Trends Cognit. Sci.
2013, 17, 401–412. [CrossRef]
11. Hinton, G.E. Connectionist learning procedures. Artif. Intell. 1989, 40, 185–234. [CrossRef]
12. Schmidhuber, J. Deep learning in neural networks: An overview. Neural Netw. 2015, 61, 85–117. [CrossRef]
13. Yang, Z.; Purves, D. The statistical structure of natural light patterns determines perceived light intensity. Proc. Natl. Acad. Sci. USA
2004, 101, 8745–8750. [CrossRef]
14. Cichy, R.M.; Pantazis, D.; Oliva, A. Resolving human object recognition in space and time. Nat. Neurosci. 2014, 17,
455–462. [CrossRef]
15. Prasad, S.; Galetta, S.L. Anatomy and physiology of the afferent visual system. In Handbook of Clinical Neurology; Kennard, C.,
Leigh, R.J., Eds.; Elsevier: Amsterdam, The Netherlands, 2011; pp. 3–19.
16. Paley, W. Natural Theology: Or, Evidences of the Existence and Attributes of the Deity, 1st ed.; R. Faulder: London, UK, 1802.
17. Darwin, C. On the Origin of Species; John Murray: London, UK, 1859.
18. Tardieu, A.; Delaye, M. Eye lens proteins and transparency: From light transmission theory to solution X-ray structural analysis.
Annu. Rev. Biophys. Biophys. Chem. 1988, 17, 47–70. [CrossRef] [PubMed]
19. Borst, A.; Helmstaedter, M. Common circuit design in fly and mammalian motion vision. Nat. Neurosci. 2015, 18, 1067–1076.
[CrossRef] [PubMed]
20. DiCarlo, J.J.; Zoccolan, D.; Rust, N.C. How does the brain solve visual object recognition? Neuron 2012, 73, 415–434. [CrossRef]
21. Goyal, A.; Didolkar, A.; Ke, N.R.; Blundell, C.; Beaudoin, P.; Heess, N.; Mozer, M.; Bengio, Y. Neural Production Systems. arXiv
2021, arXiv:2103.01937.
22. Schölkopf, B.; Locatello, F.; Bauer, S.; Ke, N.R.; Kalchbrenner, N.; Goyal, A.; Bengio, Y. Toward Causal Representation Learning.
Proc. IEEE 2021, 1–22. [CrossRef]
23. Hawkins, D.M. The problem of overfitting. J. Chem. Inf. Comput. Sci. 2004, 44, 1–12. [CrossRef] [PubMed]
24. Yates, A.J. Delayed auditory feedback. Psychol. Bull. 1963, 60, 213–232. [CrossRef]
25. Wallis, G.; Rolls, E.T. Invariant face and object recognition in the visual system. Prog. Neurobiol. 1997, 51, 167–194. [CrossRef]
26. Goh, G.; Cammarata, N.; Voss, C.; Carter, S.; Petrov, M.; Schubert, L.; Radford, A.; Olah, C. Multimodal Neurons in Artificial
Neural Networks. Distill 2021. [CrossRef]
27. Garrigan, P.; Kellman, P.J. Perceptual learning depends on perceptual constancy. Proc. Natl. Acad. Sci. USA 2008, 105,
2248–2253. [CrossRef]
28. Liu, A.; Tucker, R.; Jampani, V.; Makadia, A.; Snavely, N.; Kanazawa, A. Infinite Nature: Perpetual View Generation of Natural
Scenes from a Single Image. arXiv 2020, arXiv:2012.09855.
29. Adelson, E.H. Lightness Perception and Lightness Illusions. In The New Cognitive Neurosciences, 2nd ed.; Gazzaniga, M., Ed.; The
MIT Press: Cambridge, MA, USA, 2000; pp. 339–351.
30. Chase, W.G.; Simon, H.A. Perception in chess. Cogn. Psychol. 1973, 4, 55–81. [CrossRef]
31. Silver, D.; Hubert, T.; Schrittwieser, J.; Antonoglou, I.; Lai, M.; Guez, A.; Lanctot, M.; Sifre, L.; Kumaran, D.; Graepel, T.; et al.
A general reinforcement learning algorithm that masters chess, shogi, and Go through self-play. Science 2018, 362, 1140–1144.
[CrossRef] [PubMed]
32. Kiani, R.; Esteky, H.; Mirpour, K.; Tanaka, K. Object category structure in response patterns of neuronal population in monkey
inferior temporal cortex. J. Neurophysiol. 2007, 97, 4296–4309. [CrossRef]
33. Pang, R.; Lansdell, B.J.; Fairhall, A.L. Dimensionality reduction in neuroscience. Curr. Biol. 2016, 26, R656–R660. [CrossRef]
34. Grant, P.R.; Grant, B.R. Adaptive radiation of Darwin’s finches: Recent data help explain how this famous group of Galapagos
birds evolved, although gaps in our understanding remain. Am. Sci. 2002, 90, 130–139. [CrossRef]
35. Bostrom, N. The superintelligent will: Motivation and instrumental rationality in advanced artificial agents. Minds Mach. 2012,
22, 71–85. [CrossRef]
36. Fitch, W.T. The Biology and Evolution of Speech: A Comparative Analysis. Annu. Rev. Linguist. 2018, 4, 255–279. [CrossRef]
37. Wang, X.; Wu, W.; Ling, Z.; Xu, Y.; Fang, Y.; Wang, X.; Binder, J.R.; Men, W.; Gao, J.H.; Bi, Y. Organizational principles of abstract
words in the human brain. Cereb. Cortex 2018, 28, 4305–4318. [CrossRef] [PubMed]
38. Muller, A.S.; Montgomery, S.H. Co-evolution of cerebral and cerebellar expansion in cetaceans. J. Evol. Biol. 2019, 32, 1418–1431.
[CrossRef] [PubMed]
39. Chaabouni, R.; Kharitonov, E.; Dupoux, E.; Baroni, M. Communicating artificial neural networks develop efficient color-naming
systems. Proc. Natl. Acad. Sci. USA 2021, 118, e2016569118. [CrossRef] [PubMed]
