

COMPUTER SCIENCE, TECHNOLOGY AND APPLICATIONS

LOGIC OF ANALOG AND DIGITAL MACHINES

PAOLO ROCCHI

New York

CONTENTS
Foreword
Introduction
Part 1
Chapter 1 A Rock Amidst the Route
Chapter 2 Analog and Digital: Two Courses of Action
Chapter 3 The Extravagant Realm
Part 2
Chapter 4 Digital System Architecture
Chapter 5 Networking
Chapter 6 Storage
Chapter 7 Efficient Strategy
Chapter 8 Adapt for Survival
Chapter 9 Galaxy of Programs
Part 3
Chapter 10 People Like to Communicate
Acronyms and Symbols
Index
Chapter 1

A ROCK AMIDST THE ROUTE

It is generally thought that the question of scientific method resolves itself into two parts:
the problem of discovery and the problem of justification. Scientists tackle the first problem
whenever they find out interesting phenomena, and set up innovative practical solutions.
Theorists and thinkers address the second problem i.e. they clarify the logic of observed
events, they investigate the root-causes, they define mathematical models to explain the
phenomena and so forth. Usually the latter follows the former in time, and the two appear somewhat distinct from one another; some people even mistake the problem of justification for a purely philosophical study. Science shares some elements of its necessity and universality with
philosophy, although what distinguishes science from pure philosophy is its mandate to
understand the world of empirical experience.
A large number of researchers are interested in gaining a deep insight into computer
technologies and hope that exposition of explanatory theories will be brought up to date.
Alvin Schrader (1986) has analyzed various conceptualizations that appeared in the literature
and has underlined the need for universal definitions and concepts in information science.
Edsger W. Dijkstra authored several papers on fundamental topics of programming from the
sixties up to the year of his death in 2002. Peter J. Denning is intensely concerned with
appropriate principles of Computing and is perhaps the most active advocate for the
conceptual development of this field. “Computing as a Discipline” – the preliminary report of
the ACM task force on the core of Computer Science (Denning et al 1989) – captured the
intensive interest of researchers and gave rise to ample debates within the scientific
community.
I actively share the feeling and the aims pursued by those scholars. My investigations
address topics close to the basic principles of Computer Science, but my starting point is rather new in comparison with the current literature.
As a physicist I am inclined to see the principle of causality as a solid assumption which
offers significant support to those who tackle the problem of justification and provides
modern science with a logical basis (Hübner et al 1983). Any material event has a practical
origin and the correspondence between causes and effects regulates the logic of natural
phenomena, as well as the logic of machines. The principle of causality sustains engineering,
and in particular this principle makes clear that the product w, turned out by the process S, is the logical cause of S, in that the outcome w determines the components of S and the entire logic of S. Manufacturers install the machine S and this in turn outputs w. First comes S and then comes w on the operational timetable; but things go the opposite way in the intellectual sphere due to the principle of causality: the examination of w precedes the scrutiny of S, since this product determines the features of the system S.
The principle of causality yields the natural method of study which may be found in many sectors: first one becomes aware of w and later of S. To facilitate this process, a
student of engineering takes lessons on Chemistry and Electrical Technology, and then can
comprehend the operations of a plant that produces caustic soda. He/she masters the
electrolytic chloralkali process only when he/she is familiar with hydroxides, acids, chlorides
and so forth.
The study of w draws the attention of researchers in a special manner when the process S
is spontaneous in Nature, and no human architect designed the process S. For example,
accurate inquiries into the components of petroleum were carried out for decades and finally
clarified the spontaneous activities which produced petroleum. Presently most experts accept the idea that oil derives from ancient fossilized organic materials, such as zooplankton and algae.
The study of w appears extremely interesting when the process S has been devised by
technicians who operated by trial and error. This method, examined by Ross Ashby (1960), is
often used by people who have little knowledge or are pioneers in the problem area. Most computer devices have been built up and optimized by trial and error. Pascal devised the earliest mechanical calculator and subsequent constructors improved this machine using the ‘generate and test’ method. Eventually the electronic manufacturers produced the modern computer systems on the basis of practical experience rather than on theoretical assumptions.
It is evident how our cognition of Computing should be enhanced with an approach using
insight and theory.
However, to the best of my knowledge, modern commentators describe analog and digital solutions simply as created artifacts; they are not inclined to discuss what makes those solutions happen, or to
explicate the great principles that guide – or should guide – computer experts. They overlook
the problem of justification and usually introduce the hardware components and the software
programs on an as-is basis; i.e. they disregard the scientific principles that govern this
advanced sector. The simplified illustration of technology seems effortless and effective.
Learning just what to do allures the novice in computing, but in the long run this method looks rather superficial to the expert, since one does not understand the reason for things. Secondly, engineers must make a great deal of effort to correct and improve technical solutions when they lack accurate notions. Lastly, simplified studies support a self-referential sciolism, as the links amongst the various technical areas appear obscure.
The reasons for this strange cultural tendency typical of the computer sector may be
easily assessed.
The computing machine S manipulates information, and one should examine information
first and computer technologies later. Formidable obstacles restrict or impede the efforts of
thinkers to clarify what information is. The course, which appears to be the most natural on paper, involves a lot of argument in practice. The analysis of technical solutions grounded on the concept of information is an open challenge, and I mean to proceed in this arduous task.
1. A CHAMELEON
Various scientists are unraveling the nature of information in numerous areas. Experts in
Neurosciences, Linguistics, Cognitive Sciences, Sociology, Education and Communication
besides Informatics search for a solid definition of information. Different scientific theories have been put forward to explain what information is, but none has gained universal consensus so far.
Ronald Aylmer Fisher, an English statistician, first presented a scientific definition of
information in (1922). Measurements are usually imperfect and Fisher meant to specify the
amount of information deriving from a measurement process affected by statistical
fluctuations. During the same years, electrical engineers began using the term ‘information’ to describe data transmission. Observations on electrical nets and circuits led the American Ralph Hartley to search for a quantitative measure whereby the capacities of various conduits to convey information could be compared. Hartley distinguished the physical transmission of information from ‘psychological factors’ in (1928) and opened the way to Claude Shannon,
who devised the most famous mathematical conceptualization of information in the
engineering field. His work stimulated investigations conducted from several perspectives but
the classification of those theories which mushroomed in the past decades is challenging too.
The ensuing partial list – time ordered – can give an idea about the variety of schools of
thought:

− The statistical theory of information by Fisher (1922);
− The transmission theory of information by Hartley (1928);
− The communication theory of information by Shannon (1949);
− The semantic theory of information by Carnap and Bar-Hillel (1953);
− The utility theory of information by Kharkevich (1960);
− The cybernetic theory of information by Wiener (1961);
− The algorithmic theory of information by Solomonoff, Kolmogorov (1965), and
Chaitin (1977);
− The descriptive information theory by MacKay (1969);
− The semiotic/cybernetic theory of information by Nauta jr. (1970);
− The economic theory of information by Marschak (1971);
− The pragmatic theory of information by von Weizsäcker (1974);
− The qualitative theory of information by Mazur (1974);
− The living system information theory by Miller (1978);
− The autopoietic theory on information by Maturana and Varela (1980);
− The hierarchical information theory by Brookes (1980);
− The common-sense information theory by Derr (1985);
− The dynamic theory of information by Chernavsky (1990);
− The systemic theory of information by Luhmann (1990);
− The general information theory by Klir (1991);
− The physical theory of information by Levitin (1992);
− The organizational information theory by Stonier (1994);
− The independent theory of information by Losee (1997);
− The social theory of information by Goguen (1997);
− The purpose-oriented theory of information by Janich (1998);
− The philosophy of information by Floridi (1999);
− The anthropological information theory by Bateson (2000);
− The biological information theory by Jablonka (2002);
− The sociological theory of information by Garfinkel (2008);
− The general theory of information by Burgin (2009);
− The unified theory of information by Hofkirchner (2010);
− The communicative information theory by Budd (2011).

In sum, it may be said that a circle of followers of Shannon – such as Marschak, Brookes,
and Miller – considers the master’s theory good but insufficient and refines it or enriches it
with new contributions. The rest of the cited writers propose a variety of more or less original
alternative definitions of information.
The contrast among the various approaches – semantic, algorithmic, autopoietic, etc. – is
evident. The above listed descriptive adjectives – defined by the same authors or by the
commentators in the field – can aid the reader’s intuition about the diverging intents and
purposes of the works. A group – e.g. Burgin, Hofkirchner and Klir – searches for a
comprehensive conceptualization of information and others focus on narrower specialist
issues. Carnap’s view revolves around Semantics; in contrast Shannon deliberately ignores
the aspects of Semantics. Kolmogorov reasons at the purely mathematical level, whereas
Bateson aims at unifying the view of the mind with the world out there. Engineers focus on
the model transmitter/channel/receiver which is nonsensical for Maturana and Varela who
deny the existence of information as external instruction. In a way one could call Maturana
and Varela ‘negationist authors’ in this domain. Most researchers investigate the relations between information and technology, whereas Richard Derr analyzes the term ‘information’ in ordinary discourse and conversational utterances. Norbert Wiener rejects the idea that
information is physical, and Tom Stonier sees information as much a part of the physical
universe as energy and matter. Whilst to Shannon information is inversely proportional to
probability, to Wiener it is directly proportional to probability: the one is simply the negative
of the other.
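A minimal numerical sketch in Python may make the contrast concrete (the four-symbol distribution is my own illustrative choice, not Shannon's): Shannon's self-information -log2 p grows as an outcome becomes rarer, and negating it yields the quantity that Wiener associates with probability.

    import math

    def shannon_self_information(p: float) -> float:
        """Shannon: information of an outcome with probability p, in bits.
        Rarer outcomes (smaller p) carry more information."""
        return -math.log2(p)

    # A hypothetical distribution over four symbols, chosen for illustration.
    probabilities = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

    for symbol, p in probabilities.items():
        i_shannon = shannon_self_information(p)
        i_wiener = -i_shannon  # Wiener's convention: the negative of Shannon's
        print(f"{symbol}: p={p}  Shannon={i_shannon:.1f} bits  Wiener={i_wiener:.1f}")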
Theorists do not concur even on the nature of the problem; a circle sees information as a
quantity to measure – e.g. Shannon, Kolmogorov, Fisher and Klir – other thinkers – e.g.
Floridi – are convinced of the prismatic constitution of information which one can scrutinize
only from the philosophical standpoint. The former are inclined to attack the problems using analytical methods; the latter reject any analytical approach and claim that pure philosophy can enlighten the argument.
In addition, the reader can find definitions of information which the authors have posited
outside any formal theory or have placed inside a rather small theoretical framework; I cite
randomly Ackoff (1989), Kullback and Leibler (1951), Loveland (1969), Gabor (1946). All this research has yielded a wealth of papers and books. Schrader (1986) has conducted an accurate and stirring survey and concludes:
and stirring survey and concludes:
“The proliferation of notions of what information means staggers the mind: 134 variant
notions have been documented in the present research. This proliferation demonstrates
the conceptual chaos issuing from the definitional literature of information science.”
There is no lack of scientific contributions on the table; however, those proposals have
not led to a generally recognized description of information. Ritchie (1986) observes:

“Confusion has also arisen from confounding the precise technical and statistical usage of
words such as ‘uncertainty’, ‘information’ and ‘communication’ with the more common,
everyday usage of these words”.

Several complaints about the misunderstandings and misuses of the very idea of
information are expressed in the literature (Sayre 1976). There are so many irreconcilable
issues that I am prone to conclude with René Thom (1975):

“Information is a semantic chameleon”

My attitude towards this situation is essentially positive and I am convinced that the
wealth of theories is a sign of vitality and liveliness of thought. The scientific community
begets ideas and projects, and advances even when the works seem to miss the intended
targets. In fact multiple views mature the minds of researchers who derive the definitive
solutions through debates and common conclusions.
Unfortunately this is not the case in the information domain. Authors have clashing
viewpoints and rarely show a collaborative attitude; they even indulge in squabbles and
personal arguments. They often refuse to acknowledge that the viewpoints of their opponents
have any validity at all and – worse – do not accept disagreement gracefully. For example
Shannon’s followers interpret this statement in a very radical mode:

“These semantic aspects of communication are irrelevant to the engineering problem.” (Shannon 1949)

And in practice they refuse to engage in a dialogue with semioticians and linguists who
methodically focus on the meanings of signs. On the other side a circle of humanists plunges
into literary genres and does not bother about the information and communication technology
(ICT). Obviously, this uncommunicative attitude of thinkers does not facilitate progress in the
field which is presently in a rather sorry state.
Strong divisions, incomplete and generic outcomes do not seem to support my cultural
project, that is, to penetrate deep into the computer technologies starting from the notion of
information. The lack of solid theoretical references looks like an immense rock amidst the
route that obstructs my progress. I have just taken to the road with the ambitious purpose of
proceeding along a challenging route and I cannot proceed onward.

A. Odd Opponents and True Friends

Some decades ago I became distressed by this barrier. I felt unhappy of course and had a strong desire to circumvent this obstacle. I searched throughout the literature in the hope of coming up with a better idea. At last this exploration paid its dividends and showed that my initial bleak intuition was partially incorrect.
When a reader explores works on the various aspects of information, or crosses the
borders erected amongst the numerous sectors dealing with communication, he/she makes a
pleasant discovery: the vast majority of scientists do not dissent from one another completely.
Authors are in conflict over the abstract idea of information, but agree on what a piece of
information consists of. Pieces of information are signs, also named signals, messages, news
items, reports etc. Signs take the form of words, images, numbers, music, sounds, gestures
and objects and the interpretation of this great variety of communication items surprisingly
proves to be rather uniform. Scholars coming from different areas are prone to believe that a
sign has a material basis and stands for something. A large group of scholars tends to
converge upon two fundamental notions established in Semiotics.
I briefly remind the reader that Semiotics is defined as the science of signs (Chandler
2007), and amongst its most eminent founders are Charles Sanders Peirce, Ferdinand de
Saussure and Charles W. Morris.
Semioticians who adopt the terminology deriving from De Saussure (1983) call the body of a sign the signifier (E), and the represented entity the signified (NE). The distinction between
signifier and signified has sometimes been equated to the familiar dualism of form and
content (Deely 1990). Small bodies of ink on paper, electrical impulses along wires, sonorous
waves in the air, and pixels on the display screen are familiar examples of signifiers in the
computer sector.
The signified is the thing, the event or the concept indicated by the signifier. It does not
need to be a real object but is some referent to which the signifier refers. By way of
illustration take:

New York

This film of ink on the paper is the signifier and the large American city considered the
centre of the world is the signified NE1. I can also use this item of ink to denote the idea of
New York that I have in my mind (= NE2) and thus I handle an abstract signified.
When we examine Shannon’s work closely, we find:

“By a discrete system we will mean one in which both the message and the signal are a
sequence of discrete symbols. A typical case is telegraphy where the message is a
sequence of letters and the signal a sequence of dots, dashes and spaces.” (Shannon 1949)

He calls the generic signifier ‘signal’, the elementary signifier ‘symbol’ and the signified
the ‘message’. He calculates signals conveyed in a channel which are physical quantities and
carry a message. Shannon adopts a special terminology but recognizes the existence of
signifiers and signified, and even distinguishes the overall signal from its components that are
‘symbols’. Søren Brier (2008) remarks:

The Shannon theory “presumes that signals are meaningful codes established in a system of signs, such as the Morse code for the alphabet that makes sense for humans”.

Even if modern followers of Shannon dislike arguing over Semantics, their master accepts that an electric impulse (= E) signifies something like a letter (= NE) or a figure.
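The telegraphy case lends itself to a minimal sketch in Python (the Morse fragment below is abridged, and the function name is mine): the letters form the message, while the dots and dashes are the elementary symbols composing the signal.

    # A fragment of the Morse code: each letter (an element of Shannon's
    # "message", the signified NE) maps to a string of dots and dashes
    # (the "signal", built from elementary "symbols", the signifier E).
    MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}

    def encode(message: str) -> str:
        """Turn a message (letters) into a signal (dot/dash symbols)."""
        return " ".join(MORSE[letter] for letter in message)

    print(encode("SOS"))  # prints: ... --- ...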

The abstract interpretation of information lies beyond the horizon. Nobody can say when the definitive solution will be established; in the meanwhile scientists subscribe to a common conceptualization of what a sign consists in. The notions coming from Semiotics can be
summed up in this way:

A sign has a physical origin and stands for something. [1.1]

B. In the Literature

The material and semantic properties of signs summed up in [1.1] did not follow the same path in the literature; they had very different provenances and fortunes during the centuries.

The semantics of signs has long been recognized. Very early on, the naïve idea of
meaning saw the light of day in Western culture. Aristotle, the genius involved in all the
known disciplines of his era, also found the time to bother about human communication. In
"De Interpretatione (On Interpretation)", a short treatise dealing with language and logic, he
writes:

“Spoken words are the symbols of mental experience and written words are the symbols
of spoken words. Just as all men have not the same writing, so all men have not the same
speech sounds, but the mental experiences, which these directly symbolize, are the same
for all, as also are those things of which our experiences are the images.” (Aristotle
2004).

Whilst the semantic nature of signs was evident early on and was developed through a lot of discussion, the physical side of information remained muddled until recent decades. In
particular, ideas, thoughts and other products of thinking proved to be the strongest obstacles
to the material view of information. A circle of philosophers inherited a certain spiritual conception of the human mind from Platonism and rejected the interpretation of ideas – and in turn the interpretation of signs – as concrete objects. Ancient writers marked the basis of
linguistic signs using the Latin term ‘vox (voice)’ which vaguely alludes to a psychological
constitution. Ferdinand De Saussure (1983) presumed that the form of a sign is immaterial
and added that the language itself is “a form, not a substance”.
Fortunately, in recent times neurologists have provided substantial evidence on how
mental functions are related to neuronal activities, and it is natural to assume that mental signifiers too have a chemical-electrical basis. Various experiments are inducing authors to converge toward the material interpretation of E. Even Gottlob Frege (1892), Tadeusz Kotarbinski (1968) and other followers of the logical schools, who maintain the most abstract stance, accept the concrete origin of information. Charles Morris (1964) and Doede Nauta coined the phrase ‘information carrier’. Some sociologists prefer the term ‘representational medium’ to name the physical substrate in which a representation is instantiated. We even find the term ‘sign vehicle’, which highlights the physicality of a symbol. The base E looks like a cart being loaded with goods: a very practical perspective indeed!
Regardless of which interpretation one gives to the tenet of information, the reader can
find general consensus on [1.1] which I sum up in the following two distinct points:

(1) A signifier has a physical basis,
(2) A signifier stands for something.

C. The Semantic Triangle

Pleasant discoveries in the literature did not come to an end for me.


Semioticians agree that a third element is necessary to complete the interpretation of
signs. The roles of E and NE do not have mechanical origins, and authors agree that an intellectual sense-maker or interpreter links the signifier to the signified.
A large number of thinkers made the three elements progressively intelligible over the
centuries. Sextus Empiricus, Severinus Boethius, Thomas Aquinas up to De Saussure
analyzed the components of a sign according to the fine reconstruction made by François
Rastier (1990). In the late nineteenth century the American philosopher and logician Charles
Sanders Peirce formulated the triadic model (Hartshorne et al, 1931-66). In 1923 C. K. Ogden
and I. A. Richards (1989) drew a geometrical diagram to clarify the semantic process and
fixed the thought as the sense-maker.
This notion has a reasonable degree of generality for the present book. The human
intellect recognizes/ interprets/ invents the meanings of signs and brings about the entire
semantic processes. Saussure wrote that E and NE are “intimately linked” “by an associative link” in the mind and that “each triggers the other”. This is enough for us to conclude that the
thought or the mind ME connects E with NE in harmony with the ideas of Ogden and
Richards.

Figure 1.1. The semantic triangle.

One cannot deny that the semantic triangle raises a lot of serious questions. Terms such as interpretant, sense, signification, semiosis, significate, thought and reference connote the vertices of the triangle, and this varied terminology mirrors the contrasting philosophical
perspectives and pinpoints the misaligned interpretations of the semantic processes. What
meanings exactly signify is rather mysterious. Bertrand Russell and Gottlob Frege believe
that meaning is a nominated subject (Wolfgang, 1994); Alfred Tarski assumes meaning is an
interrelation between a sign and a significate; Wittgenstein says that a meaning is a relation
between a sign and people’s behavior. Ogden and Richards mention 23 different definitions
of the term ‘meaning’ in their monograph (Ogden et al 1989).
The functions of the sense-maker raise debates and lie far from unified conclusions. For
example, the recourse to the public character of words begs the question of how the mind is
supposed to gain access to the extra-mental world. Some thinkers posit that there must be a
realm of concepts in the mind as bearers of meaning, but Wittgenstein and others cast suspicion on treating thoughts as ‘mental representations’. Communication and media
theorists stress the value of the mental processes, while others highlight the receptive aspect
of the intellect. The psychological and sociological features of significance cast light upon
human arbitrariness; on the other hand, logicians focus on the truth/falsity of reasoning, and
postulate the precision of the semantic relationships.
Alan Turing, John von Neumann and other modern thinkers on Artificial Intelligence (AI) are convinced that direct connections exist between the computer and the human mind: the brain can be seen as performing something like computational processing. One can
read the intriguing history of the efforts of scientists to develop a machine able to think like a
human being in (Pratt 1987). The bonds between natural intelligence and artificial devices make it even more difficult to interpret what significance is, and they enhance the attraction of the semantic triad for commentators.
The semantic triangle raises important points of discussion intimately bound up with
philosophical topics and even with unresolved questions concerning the logic and essence of
human life. The intellectual nucleus of Semiotics epitomizes the formation of human
knowledge and raises debates upon the origin of the mind, upon its unpredictable and free
nature, upon learning, memory, will, consciousness and many other knotty issues. At present
endless discussions are taking place and any synopsis of current works is at risk of being
reductive and incomplete. Far be it from me to talk about these complex arguments.
Stamper (1996) defines the following six levels regarding communication and related
issues:

1. Physical: This level is involved in the concrete appearance of signs, the media and
their preparation through technologies.
2. Empirical: The empirical level focuses on the variety and equivocation encountered, sometimes related to entropy and uncertainty.
3. Syntactic: A representation uses a defined set of symbols according to a set of rules.
If the syntax is formally defined then symbolic forms may be transformed into other
symbolic forms.
4. Semantic: This layer concerns the meaning and validity of what is expressed.
Meaning is the mapping of a symbol to a real world object or state and is different for
different people.
5. Pragmatic: The pragmatic level of representation concerns the usage of symbols. It
takes into account contextual issues including the intentions, responsibilities and
consequences behind the expressed statements.
6. Social: The social layer regards the understanding of the meaning of symbols, and takes into account the cultural, ethnic, geographical and political issues involved.

This book addresses the computer technologies, and the multiple layers of the semantic triangle lie beyond my competence. I restrict the discussion to the physical level (level 1); the diagram in Figure 1.1 will be good enough for completing the fundamental semiotic
assumption [1.1] in the following manner:

A sign is equipped with three elements in all: the signifier E, the signified NE
and the mental function ME which relates the former to the latter. [1.2]

All those components which appear at the corners of the semantic triangle are necessary
for creating cognition, information, and communication, and are sufficient for the analysis of
computing machines.
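For readers who think in terms of data structures, statement [1.2] can be restated as a sketch in Python; this is an illustration of mine, not part of any semiotic formalism, and the field names are merely hypothetical labels for E, NE and ME.

    from dataclasses import dataclass

    @dataclass
    class Sign:
        """The three corners of the semantic triangle of [1.2]."""
        signifier: str    # E: the physical token, e.g. ink marks on paper
        signified: str    # NE: the referent the token stands for
        interpreter: str  # ME: the mental function linking E to NE

    example = Sign(signifier="the ink marks 'New York'",
                   signified="the American city of New York",
                   interpreter="the reader's mind")
    print(example)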

2. PRACTICAL PERSPECTIVE
I found out that the semiotic ideas are very popular in the literature and in professional
practice, and I intend to provide the reader with evidence. In particular, a wealth of pragmatic inquiries has been made into the material nature of signs. Researchers and practitioners
coming from different fields and disciplines exploit the properties of signifiers even though
the term ‘signifier’ is not formally cited. My interest in signifiers is primarily from the
perspective of a scientist and a technician, and the next pages exhibit a concise survey which
focuses on the concrete aspects of signs.

A. Experts in Communication Sciences

The fundamental qualities of human communication derive from the material substance of signs, and linguists categorize languages on the basis of their material essence. The following classification, which is a prerequisite to further studies in communication and media, includes for instance:

– Textual languages i.e. signifiers are words printed on paper,
– Vocal languages i.e. signifiers are sonorous words,
– Gestural languages i.e. signifiers are motions of limbs,
– Pictorial languages i.e. signifiers are colored pigments,
– Musical languages i.e. signifiers are sounds.

Physical features determine the nature of each language and cannot be swapped: nobody can switch from one material medium to another without losses and modifications.
For the sake of reference, in 1725 Antonio Vivaldi published “The Four Seasons”, a violin
concerto that recreates scenes located in spring, summer, fall and winter respectively. Each of
the four parts is illustrated by a sonnet – presumably written by the composer – full of
allusions suitable for sonic depiction. Concertos and sonnets recount the same scenes; they
depict the same vistas. The parallelism is evident but the melodious music gives rise to
sensations that the sonnets cannot stir. Ink and music convey untranslatable contents. One
cannot exchange the feelings evoked by the poems with the sensations of the concertos
because of their different material nature.

Word order           Languages    %
SOV                        497    41
SVO                        436    35
VSO                         85     7
VOS                         26     2
OVS                          9    <1
OSV                          4    <1
No dominant order          171    14
TOTAL                     1228
Figure 1.2. Popularity of word order (Source: Dryer op.cit.).
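The percentages in Figure 1.2 follow directly from the raw counts; a short Python check (the dictionary simply transcribes the table above):

    # Counts of languages per dominant word order, from Figure 1.2 (Dryer).
    counts = {"SOV": 497, "SVO": 436, "VSO": 85, "VOS": 26,
              "OVS": 9, "OSV": 4, "no dominant order": 171}

    total = sum(counts.values())  # 1228
    for order, n in counts.items():
        print(f"{order:>17}: {n:4d}  {100 * n / total:4.1f}%")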

Typology, an entire and ever-increasing sub-field of linguistics, spells out the order of words (Whaley 1996). Authors recognize six basic types of declarative sentences in all: SOV,
SVO, VSO, VOS, OVS and OSV, where S marks the subject, O the object and V the verb. By
way of illustration:

– English is an SVO language,
e.g. “Tom (=subject) met (=verb) Sally (=object)”;
– Japanese is SOV
e.g. “Gakusei-ga hon-o yonda (= student book read)”;
– Welsh is VSO
e.g. “LLaddodd y ddraig y dyn (= killed the dragon the man)”.

The disposition of subject, verb and object is a physical property which leads to an important classification of languages. SOV (subject-object-verb) is preferred by the largest number of modern languages. SVO (subject-verb-object) is the second group, but has the greatest number of speakers, because this group includes the most popular languages such as English, Chinese, French, Spanish, Portuguese, Russian, the Germanic languages, and many languages of Africa and of Southeast Asia, including Khmer, Vietnamese, Thai, and Malay.
There are words in a language indicating the relation of the substantive to a verb, an
adjective, or another substantive. These words are members of closed classes called
prepositions and postpositions. A preposition is located before the intended substantive X; a postposition is located after X. For example, ‘with’ comes before ‘me’ in English sentences and is classified as a preposition. Take “Bob speaks with me”: the phrase ‘with me’ is translated in Turkish as ‘benim ile’ (literally ‘me with’), i.e. ‘with’ is a preposition and ‘ile’ is a postposition. It is evident how prepositions and postpositions regulate the time-space properties of languages.
Linguists go on with their materialistic approach and discuss whether the disposition adjective-noun (AN) or noun-adjective (NA) emerges as the prevalent order in a phrase. For instance the English expression “Good morning!” is AN; the Italian phrase “piogge sparse” (scattered rains) is an NA form.
Experts even investigate the displacement of the atomic parts in a word. They inquire into
prefixes, suffixes, and infixes and how these parts changed positions down the centuries and
influenced the rules of grammar.
In conclusion, linguists dedicate a fair amount of attention to the order of words, and
make comprehensible the material characteristic of textual signifiers. In fact, the order
regulates the spatial place of written words and the temporal priority of spoken words. The
order is a temporal and spatial property, and it is not an exaggeration to use the designation
‘physical linguistics’ for typology.
Marshall McLuhan (1965) masterfully explains how tiny physical details, which belong to a communication system and appear trivial at first glance, provoke broad and
astonishing social phenomena. The medium has a profound influence on human thought of
which the user may not always be conscious. McLuhan’s famous aphorism:

“The medium is the message”

summarizes how the construction of a medium adds contents to the mere message transmitted. The concrete constitution of a signifier and the way it is delivered have a profound effect on the human soul; they alter human perception and affect human consciousness. The make-up of a medium is associated with specific subject contents, so that communication media direct listeners’ attention and influence the behavior of social groups.
The physical structure of a medium leads McLuhan to the distinction between hot and cool media. A hot medium enhances one single sense and accordingly is rich in detail – note how the richness or poverty of detail refers not to the content but to the form in which this content is necessarily communicated, owing to the composition of the medium. The focus on one sense caused by a hot medium makes the recipient turn to his inwardness and thus separates him from the outside world. A cool medium, in contrast, lacks detail, whilst it demands active attendance and multi-sensory participation from the recipient.
Two examples illustrate this distinction: the radio is a hot medium and the television is cool. In actual fact, the radio only enhances the acoustic sense and stimulates the receiver’s self-suggestion. By the way, McLuhan remarks upon how the radio reinforced the excitement of the Nazi propaganda. Television stimulates visual, acoustic and also collective participation and should be considered a cool medium. The low-definition structure of TV provokes little involvement of each single sense and, for example, conditions children into becoming passive observers.
Nowadays the followers of McLuhan apply his method of investigation, and analyze
novel media such as the communicative structure of network systems which are
revolutionizing people’s lifestyle (De Kerckhove 1997).
Paleographers and librarians are aware that a vegetal ink runs the risk of bleaching as
time passes, and moreover, the paper on which a historical document is written may
deteriorate. Experts take care of and preserve old writings that become darker and unreadable
(Cunha et al 1967). The physical state of ancient writing attracts the paleographers’ attention
and one concludes that even those experts are extremely sensitive to the empirical form of
information.
B. Experts in Biological and Social Sciences

Our bodies are complex systems, and when something is wrong they inform us through special warning signs, e.g. pain, fever, sweating, vomiting, tremors, weight loss, shortness of breath, headache and so on. The physical essence of those special signifiers – called symptoms – is
evident to everybody and does not need further elucidation. Symptomatology, a branch of
medical science, teaches physicians how to recognize symptoms and how to diagnose a
disease correctly (Gray et al 2001).
Essential symptoms are those signs that are intrinsically related to the pathology and
necessarily follow from it. The essential symptoms of X provide the complete conception of
the disease X. Because of the complexity and the large number of the essential symptoms of
X, doctors focus on the pathognomonic symptoms, which are so characteristic of X that they are sufficient to make a diagnosis. For example, the yellowness of the skin, of the sclera and of the nails is the pathognomonic symptom of jaundice. On observing this, a doctor is sure
that the patient has jaundice without making any further inquiry. Essential and
pathognomonic symptoms are judged as the appropriate elements for a diagnosis but a doctor
takes into consideration any circumstance of the case to make the correct diagnosis, and also
analyses the accidental symptoms that are not at all specific to the disease. Frequently, the
observation of a feeble accidental sign determines the survival of a patient. In fact some diseases are asymptomatic: they do not create any signifier, and the patient is unaware of his state. Cancer is a well known silent illness, i.e. cancer does not show any
symptom for a long while and when the signals come to light, successful intervention
becomes more difficult.
Gestures and facial expressions are important vectors of communication which make up
the so-called language of signs. Through looks, behavior, facial expressions, and head movements, people convey information not only about a person’s emotional state but also about discursive and syntactic elements. The body organs moving with varying degrees of motion,
gentleness or force convey different meanings and make evident the concept of signifier
(Fernandez et al 1997).
The language of signs may be found even in the animal kingdom. This territory has been
explored by Thomas A. Sebeok, father of Zoosemiotics, by ethologists such as D. Griffin
(1985), C. Heyes and others. Researchers decipher the system of signs used by animals to
communicate: postures, gestures, cries, excrements, movements, which clearly are material
signifiers. Experts subdivide the communication between senders and receivers into two
classes:

(i) those belonging to the same species (intraspecies communication);
(ii) those belonging to different species (interspecies communication).

For example the signals for mating belong to the first class, and the signals exchanged
between predators and prey pertain to interspecies communication.

Efforts have been concentrated on finding out whether a common alphabet can be
constructed on the basis of animal behavior. E.g. a quadruped makes use of its tail to
communicate. It keeps the tail between its legs to denote fear. If it wags its tail, it signifies
happiness and devotion. The quadruped is antagonistic and aggressive when it raises its tail.
The waving tail may be seen as the basic element of a code shared by various species of mammals: horses, wolves, dogs and so forth (Hailman 2008).
In the nineteenth century, the scientific exploration of the processes of the brain began. In 1861, Paul Broca first described the section of the brain – in the frontal lobe of the left hemisphere – which is involved in speech production. Later experts discovered how perceptual information from the eyes, ears, and the rest of the body is sent to the appropriate zone of the brain. Other specialized areas were found in the course of later investigations (see also Paragraph 4 in the present chapter) (MacInteyer 2004). Modern techniques confirm the physical origin of mental ideas in a factual manner. I cite positron emission tomography (PET), a nuclear medicine technique which produces a three-dimensional image of the brain’s activities; magnetic resonance imaging (MRI) is an effective tool that visualizes the inside of the brain (Vlaardingerbroek et al 2003).
Chemical-electrical reactions embody intellectual activities, and the treatment of psychological diseases by means of drugs reinforces awareness of the material origin of information within the brain. Neurological and psychiatric studies corroborate the materialistic
perspective on mental signifiers.
A court of justice gives judgment against an accused or otherwise acquits him after the
accurate consideration of evidence. Scientists accept a theory if it is supported by
experimental evidence, or else reject it. A manager chooses to follow a strategy in accordance
with positive evidence. Crucial decisions are taken on the strength of evidence.
Evidence is a sign, or better still, evidence is a signifier which plays a fundamental role in
forming a conclusion or judgment.

The current literature concentrates attention on the particulars of evidence, on how to acquire evidence, on the methods to adopt and so on, because of the valuable role which a tiny detail can play. For example, the scientific method requires that evidence be obtained through rigorous, standard procedures (Wilson 1952). Likewise, legal doctrines determine how objects may be submitted to a jury as evidence. Even verbal testimony is regulated and is physically bound to a real individual, who is officially admitted as a witness (Blazey et al 1996).

C. Experts in Physical Sciences

An intriguing vein of studies sought out the minimal signifiers in classical physics. In
1929 Leo Szilard, inspired by the thermodynamical system elaborated by Maxwell,
envisioned a theoretical engine that consists of a cylinder with a single-molecule working
fluid. An operator observes a molecule in the chamber on the right and pushes the piston on
the left toward the center partition. The molecule in motion impinges on the piston, causing it
to slide back to the left. This creates useful work and the cycle can begin anew. Szilard found
that the amount of useful work per cycle is

W_C = k_B T ln(2)

where T is the absolute temperature of the system and k_B is Boltzmann's constant. The laws of thermodynamics hold that this work cannot be created from nothing; thus Szilard supposed that W_C is the energy absorbed in order to acquire information from the single molecule (Leff et al 1990). Eminent physicists, such as Bennett, Penrose, Zurek, Brillouin and Feynman,
discussed this argument and eventually converged toward the explanation of Rolf Landauer (1961), who showed that the Szilard engine cannot work unless the operator spends energy to erase the acquired information. In substance, a sole molecule is the tiniest signifier in classical physics and absorbs the energy W_C to be restored to the initial conditions.
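A one-line numerical check of the formula above is easy to carry out in Python; the room temperature T = 300 K below is my own illustrative choice.

    import math

    k_B = 1.380649e-23  # Boltzmann's constant, in J/K

    def szilard_work(T: float) -> float:
        """Useful work per cycle of the Szilard engine, W_C = k_B * T * ln 2;
        by Landauer's argument, also the minimum cost of erasing one bit."""
        return k_B * T * math.log(2)

    print(f"{szilard_work(300.0):.3e} J")  # about 2.87e-21 J at room temperature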
Information has a material origin, and demands physical energy to be collected, erased, thrown away, and so forth, even in the tiniest systems. We can but agree with the witticism of
E.T. Jaynes:

"The old adage 'knowledge is power' is a very cogent truth, both in human relations and
in thermodynamics."

Electronics supports continual evolution in the mechanisms of information rendering. We live amidst several information representations, and a variety of technologies have been
developed to record, to process and to exhibit various types of phenomena. Images, sound and
other forms of data can be stored by any number of methods: magnetically, optically,
electronically or even otherwise. Cathode ray tube (CRT), liquid crystal display (LCD), light-
emitting diode (LED) and gas plasma produce ever better images. This is just to say that the
technology of signifiers is ever progressing in ICT and is transforming the lifestyles of
people. I shall make further comments on this topic later.
Some philosophers define a signifier as the surface structure of a language; others mark a
signifier as a shape of information. The terms ‘surface’ and ‘shape’ make a signifier similar
to a suit whose appearance does not influence the personality of the man wearing it. This
terminology hints at the idea that signifiers play an ancillary role, but the contrary is true. This short survey is an attempt to illustrate the material side of information and to highlight
its relevance in a variety of environments. Linguists, paleographers, physicians, ethologists,
doctors, magistrates, experts in mass media and in cognitive sciences, computer engineers and
many others share the idea that information is tied to some substance and is not airy. They delve into the manifold world of signifiers and benefit from their physical properties.

3. SHARPNESS
The idea that a sign has a physical origin proves to be very popular in various fields.
However, experts rarely study Semiotics. Engineers show little interest in the science of signs
and normally use E and NE by intuition. Why does Semiotics, which establishes two
fundamental tenets, seem to be alien to scientists and, in a special manner, to engineers?

I am inclined to believe that this behavior originates from a precise cause; in particular, from the cultural gap between the humanities and the technologies. The former use verbal descriptions; the latter are grounded on measurements. Semioticians argue about the nature of
signs, whereas engineers need equations to calculate quantities concerning machines and
products. The signifier and the signified are popular ideas, but ICT practitioners put them in
the background due to methodological divergence. A special divide keeps Semiotics away
from the exact sciences and frustrates our initial, cultural project. The contrasting definitions
of information seemed to prevent me from advancing; now the generic notions of signifier
and signified build a second barrier against my initial purposes.

A. Toward a Principle

Actually engineers describe their objects using the mathematical idiom: a method that
does not tolerate any exception. Things are to be standardized, quantified, and measured using
rigorous equations; thus E and NE wait to be redefined in accordance with the rules of the
scientific method. The semiotic concepts could be appropriately used by technicians, and could provide epistemological support to computing, as long as one begins by providing formal definitions of the signifier and the signified.
Semiotic issue (1) “A signifier has a physical basis” – cited in the previous pages – sounds rather vague from the perspective of the exact sciences, and one wonders:

When may a generic body be a sign?
What are the properties that determine a signifier?
How could those properties be formalized?

Four different groups of experts, who seem to move in a precise direction, suggested an answer to me.
i. There are experimentalists who scrutinize information phenomena in biological and
mechanical systems.
• Physiologists bring evidence that an individual perceives reality through his
senses, which detect differences in stimuli, and in turn influence individual
cognition. When a sensory message to the brain is constantly repeated, the
sensitivity weakens and is finally suppressed. In short, there is no information when there are no differences. Lack of contrast does not make people
informed and the receptors are capable of reporting news to the brain only when
something changes. Nerve endings do not sense stimuli but differences of stimuli
(Somjen 1983).
• We learn from Neurology that a nerve impulse moves along the axon and
consists of a self-propagating series of polarizations and depolarizations. The
spike reaches the action potential of +40 mV (millivolts), a value far from the resting potential located at –70 mV. The nervous signifiers – about 110 mV apart – prove to be absolutely distinct (Partridge 2003).
• Electronic engineers calculate the distance that makes any two signals
distinguishable. They recognize this quality as essential for handling signals
(Smirnov 1999).
• Nature long ago learned to encode information about organisms in
Deoxyribonucleic Acid (DNA). When a cell receives a biological material with
different DNA, it rejects such material; this means that the genetic code X
contrasts with the genetic code Y and has the evident property of being
distinguishable (Loewenstein 1999).
ii. Ever since the classical age, philosophers have recognized that clear-headedness is a necessary requisite for humans; to understand what happens in the world, sense data and non-sense data alike must be neat. I mean to quote the reflections of Gottfried Wilhelm Leibniz, who argues about the indiscernibility of identicals and the identity of indiscernibles. He attaches great importance to the special property of being distinct, which renders things clear to the mind (O'Leary 1995). In substance,
the ideas developed by Man can be elaborated as long as those pieces of information
are neat. Fuzzy tenets impair correct reasoning and it is natural to conclude that even
clear ideas are to be definite signifiers.
iii. Linguists highlight the distinctiveness of forms and Saussure (1983) says in explicit
terms:

Sign’s “most precise characteristic is to be what the others are not”.

And John Arthur Passmore (1985) offers a charming aphorism:

“Languages differ by differentiating differently”.

A principle of communication and art refers to the arrangement of opposite elements in a piece so as to create visual interest, excitement and drama. Authors adopt a wide set of contrasting items to arouse strong feeling e.g. light vs. dark
adopt a wide set of contrasting items to arouse strong feeling e.g. light vs. dark
colors; rough vs. smooth textures; large vs. small shapes. An artist can employ
contrast as a tool to direct the viewer's attention to a particular point of interest within
the piece.
iv. Several theorists of information argue about the concept of dissimilarity. MacKay
writes in (1969):

“Information is a distinction that makes a difference.”

Mark Burgin (2009) places the concept of change at the base of his general
theory of information. He assumes that information causes changes either in the
whole system that receives a message or in a part of it. A measure of information is
some measure of provoked diversities.
At the base of his theory Gregory Bateson (2000) places the following:
The “elementary unit of information” is “a difference which makes a difference”.
Bateson feels the need to specify the notion of ‘difference’ that is central to his
definition, and notes that any object is characterized by a large number of special
features that are the ‘differences’ typical of that object. It is precisely because of this infinitude that an object as such cannot enter into a communication or mental process. People normally select or filter out a limited number of differences of the intended object, and these become information. Thus mental information is an abstract,
simplified entity. The object-hammer is material with several peculiar attributes; the
information-hammer has the reduced set of attributes accepted by the individual’s
mind.

Four groups of authors – experimentalists, philosophers, experts on communication and information theorists – spell out the idea that the ability to be distinct is the essential
characteristic of information. The act of distinguishing by comparing differences turns out to
be of universal use. Sharpness influences the existence of any piece of information and one
can reasonably conclude that a signifier must be neat in order to work properly.
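A minimal sketch in Python can quantify this requisite, under assumptions of my own: two ideal levels standing 110 units apart (echoing the neural resting and action potentials cited above), additive Gaussian noise, and a decision threshold midway between the levels. As the noise grows, the two signifiers lose their neatness and are confused ever more often.

    import math

    def confusion_probability(level_a: float, level_b: float, sigma: float) -> float:
        """Probability that Gaussian noise of standard deviation sigma pushes
        one level past the midpoint threshold toward the other:
        0.5 * erfc(d / (2 * sqrt(2) * sigma)), with d the separation."""
        d = abs(level_b - level_a)
        return 0.5 * math.erfc(d / (2 * math.sqrt(2) * sigma))

    for sigma in (10.0, 30.0, 60.0):
        p = confusion_probability(-70.0, +40.0, sigma)
        print(f"noise sigma = {sigma:5.1f}: P(confusion) = {p:.2e}")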

Figure 1.5. Four different deterioration processes (from left to right: blurring, noise, erosion,
granulation) cause progressive loss of information.

MORE PAGES ARE AVAILABLE AT: http://www.edscuola.it/archivio/software/bit/course/book.html
