
Protection in Operating Systems
Article in Communications of the ACM · August 1976
DOI: 10.1145/360303.360333

Michael A. Harrison, University of California, Berkeley
Walter L. Ruzzo, University of Washington
Jeffrey D. Ullman, Stanford University


Operating Systems Editor: R.S. Gaines

Protection in Operating Systems

Michael A. Harrison and Walter L. Ruzzo
University of California, Berkeley
Jeffrey D. Ullman
Princeton University

A model of protection mechanisms in computing systems is presented and its appropriateness is argued. The "safety" problem for protection systems under this model is to determine in a given situation whether a subject can acquire a particular right to an object. In restricted cases, it can be shown that this problem is decidable, i.e. there is an algorithm to determine whether a system in a particular configuration is safe. In general, and under surprisingly weak assumptions, it cannot be decided if a situation is safe. Various implications of this fact are discussed.

Key Words and Phrases: protection, protection system, operating system, decidability, Turing machine
CR Categories: 4.30, 4.31, 5.24

Copyright © 1976, Association for Computing Machinery, Inc. General permission to republish, but not for profit, all or part of this material is granted provided that ACM's copyright notice is given and that reference is made to the publication, to its date of issue, and to the fact that reprinting privileges were granted by permission of the Association for Computing Machinery.

Research was sponsored by NSF Grants GJ-474 and GJ-43332. Work was performed while the third author was on leave at the University of California at Berkeley.

Authors' addresses: M.A. Harrison and W.L. Ruzzo, Department of Computer Science, University of California, Berkeley, CA 94720; J.D. Ullman, Department of Electrical Engineering, Computer Science Laboratory, Princeton University, Princeton, NJ 08540.

1. Introduction

One of the key aspects of modern computing systems is the ability to allow many users to share the same facilities. These facilities may be memory, processors, databases, or software such as compilers or subroutines. When diverse users share common items, one is naturally concerned with protecting various objects from damage or from misappropriation by unauthorized users. In recent years, a great deal of attention has been focused on the problem. Papers [4-6, 8-13, 15] are but a sample of the work that has been done. In particular, Saltzer [15] has formulated a hierarchy of protection levels, and current systems are only halfway up the hierarchy.

The schemes which have been proposed to achieve these levels are quite diverse, involving a mixture of hardware and software. When such diversity exists, it is often fruitful to abstract the essential features of such systems and to create a formal model of protection systems.

The first attempts at modeling protection systems, such as [4, 6, 10], were really abstract formulations of the reference monitors and protected objects of particular protection systems. It was thus impossible to ask questions along the lines of "which protection system best suits my needs?" A more complete model of protection systems was created in [8], which could express a variety of policies and which contained the "models" of [4, 6, 10] as special cases. However, no attempt to prove global properties of protection systems was made in [8], and the model was not completely formalized.

On the other hand, there have been models in which attempts were made to prove results [2, 3, 13]. In [2], which is similar to [8] but independent of it, theorems are proven. However, the model is informal and it uses programs whose semantics (particularly side effects, traps, etc.) are not specified formally.

In the present paper, we shall offer a model of protection systems. The model will be sufficiently formal that one can rigorously prove meaningful theorems. Only the protection aspects of the system will be considered, so it will not be necessary to deal with the semantics of programs or with general models of computation. Our model is similar to that of [6, 10], where it was argued that the model is capable of describing most protection systems currently in use.

Section 2 describes the motivation for looking at decidability issues in protection systems. Section 3 presents the formal model with examples. In Section 4 we introduce the question of safety in protection systems. Basically, safety means that an unreliable subject cannot pass a right to someone who did not already have it. We then consider a restricted family of protection systems and show that safety can be decided for these systems.

461    Communications of the ACM, August 1976, Volume 19, Number 8
In Section 5 we obtain a surprising result: that there is no algorithm which can decide the safety question for arbitrary protection systems. The proof uses simple ideas, so it can be extended directly to more elaborate protection models.

2. Significance of the Results

To see what the significance for the operating system designer of our results might be, let us consider an analogy with the known fact that ambiguity of a context free grammar is undecidable (see [7], e.g.). The implication of the latter undecidability result is that proving a particular grammar unambiguous might be difficult, although it is possible to write down a particular grammar, for Algol, say, and prove that it is unambiguous. By analogy, one might desire to show that in a particular protection system a particular situation is safe, in the sense that a certain right cannot be given to an unreliable subject. Our result on general undecidability does not rule out the possibility that one could decide safety for a particular situation in a particular protection system. Indeed, we have not ruled out the possibility of giving algorithms to decide safety for all possible situations of a given protection system, or even for whole classes of systems. In fact we provide an algorithm of this nature.

By analogy with context free grammars, once again, if we grant that it is desirable to be able to tell whether a grammar is ambiguous, then it makes sense to look for algorithms that decide the question for large and useful classes of grammars, even though we can never find one algorithm to work for all grammars. A good example of such an algorithm is the LR(k) test (see [7], e.g.). There, one tests a grammar for LR(k)-ness, and if it is found to possess the property, we know the grammar is unambiguous. If it is not LR(k) for a fixed k, it still may be unambiguous, but we are not sure. It is quite fortunate that most programming languages have LR(k) grammars, so we can prove their grammars unambiguous.

It would be nice if we could provide for protection systems an algorithm which decided safety for a wide class of systems, especially if it included all or most of the systems that people seriously contemplate. Unfortunately, our one result along these lines involves a class of systems called "mono-operational," which are not terribly realistic. Our attempts to extend these results have not succeeded, and the problem of giving a decision algorithm for a class of protection systems as useful as the LR(k) class is to grammar theory appears very difficult.

3. A Formal Model of Protection Systems

We are about to introduce a formal protection system model. Because protection is but one small part of a modern computing system, our model will be quite primitive. No general purpose computation is included, as we are only concerned with protection; that is, who has what access to which objects.

Definition. A protection system consists of the following parts:
(1) a finite set of generic rights R,
(2) a finite set C of commands of the form:

    command α(X1, X2, ..., Xk)
    if r1 in (Xs1, Xo1) and
       r2 in (Xs2, Xo2) and
       ...
       rm in (Xsm, Xom)
    then
       op1
       op2
       ...
       opn
    end

or, if m is zero, simply

    command α(X1, ..., Xk)
       op1
       ...
       opn
    end

Here, α is a name, and X1, ..., Xk are formal parameters. Each opi is one of the primitive operations

    enter r into (Xs, Xo)
    delete r from (Xs, Xo)
    create subject Xs
    create object Xo
    destroy subject Xs
    destroy object Xo

Also, r, r1, ..., rm are generic rights, and s, s1, ..., sm and o, o1, ..., om are integers between 1 and k. We may call the predicate following "if" the conditions of α, and the sequence of operations op1, ..., opn the body of α.

Before explaining the significance of the commands we need to define a configuration, or instantaneous description, of a protection system.

Definition. A configuration of a protection system is a triple (S, O, P), where S is the set of current subjects, O is the set of current objects, S ⊆ O, and P is an access matrix, with a row for every subject in S and a column for every object in O. P[s, o] is a subset of R, the generic rights. P[s, o] gives the rights to object o possessed by subject s. The access matrix can be pictured as in Figure 1. Note that row s of the matrix in Figure 1 is like a "capability list" [4] for subject s, while column o is similar to an "access list" for object o.

Now let us interpret the parts of a protection system. In practice, typical subjects might be processes [4], and typical objects (other than those objects which are subjects) might be files. A common generic right is read, i.e. a process has the right to read a certain file. The commands mentioned in item (2) above are meant to be formal procedures.
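Before giving the commands a semantics, it may help to see the definitions above transcribed into code. The following sketch is ours, not the paper's: the class, the dictionary layout of the access matrix, and the index-based encoding of a CONFER-style command (in which the owner of a file grants read) are all illustrative assumptions.

```python
# A configuration (S, O, P): S is the set of subjects, O the set of objects
# (with S a subset of O), and P maps (subject, object) pairs to sets of
# generic rights; a missing entry stands for the empty set.

class Config:
    def __init__(self):
        self.subjects = set()   # S
        self.objects = set()    # O
        self.P = {}             # access matrix: (s, o) -> set of rights

    def rights(self, s, o):
        return self.P.get((s, o), set())

# A command: a name, a number k of formal parameters, a list of conditions
# "r in (X_si, X_oi)", and a body of primitive operations. Conditions and
# body refer to formal parameters by position 1..k, mirroring the integers
# s_i, o_i of the definition.
CONFER_READ = {
    "name": "CONFERread",
    "params": 3,                        # (owner, friend, file)
    "conditions": [("own", 1, 3)],      # own in (X1, X3)
    "body": [("enter", "read", 2, 3)],  # enter read into (X2, X3)
}

def conditions_hold(cfg, cmd, actuals):
    """Check every clause r in (X_si, X_oi) against the access matrix."""
    return all(r in cfg.rights(actuals[s - 1], actuals[o - 1])
               for (r, s, o) in cmd["conditions"])
```

With two subjects Sam and Joe and Sam owning a file Code, `conditions_hold(cfg, CONFER_READ, ["Sam", "Joe", "Code"])` holds, while the same command with Joe as the owner parameter does not.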
Since we wish to model only the protection aspects of an operating system, we wish to avoid embedding into the model unrestricted computing power. The commands, therefore, are required to have a very simple structure. Each command may specify a test for the presence of certain rights in certain positions of the current access matrix. These conditions can be used to verify that the action to be performed by the command is authorized. For example, "if r in (Xs, Xo) then ..." indicates that the subject xs needs right r to object xo, where xs and xo are the actual parameters corresponding to formal parameters Xs and Xo. If the conditions are not satisfied, the body of the command is not executed. The command body is simply straight line code, a sequence of primitive operations containing no conditional or unconditional branches, no loops, and no procedure calls.

Each primitive operation specifies some modification which is to be made to the access matrix. For example, enter r into (Xs, Xo) will enter right r into the matrix at position (xs, xo), where xs and xo are the actual parameters corresponding to the formals Xs and Xo. That is, subject xs is granted generic right r to object xo. The effect of a command will be defined more formally after an example.

Example 1. Let us consider what is perhaps the simplest discipline under which sharing is possible. We assume that each subject is a process and that the objects other than subjects are files. Each file is owned by a process, and we shall model this notion by saying that the owner of a file has generic right own to that file. The other generic rights are read, write, and execute, although the exact nature of the generic rights other than own is unimportant here. The actions affecting the access matrix which processes may perform are as follows.

(1) A process may create a new file. The process creating the file has ownership of that file. We represent this action by

    command CREATE(process, file)
       create object file
       enter own into (process, file)
    end

(2) The owner of a file may confer any right to that file, other than own, on any subject (including the owner himself). We thus have three commands of the form

    command CONFERr(owner, friend, file)
    if own in (owner, file)
    then enter r into (friend, file)
    end

where r is read, write, or execute. Technically the r here is not a parameter (our model allows only objects as parameters). Rather, this is an abbreviation for the three commands CONFERread, etc.

(3) Similarly, we have three commands by which the owner of a file may revoke another subject's access rights to the file.

    command REMOVEr(owner, exfriend, file)
    if own in (owner, file) and
       r in (exfriend, file)¹
    then delete r from (exfriend, file)
    end

where r is read, write, or execute.

This completes the specification of most of the example protection system. We shall expand this example after learning how such systems "compute."

To formally describe the effects of commands, we must give rules for changing the state of the access matrix.

Definition. The six primitive operations mean exactly what their names imply. Formally, we state their effect on access matrices as follows. Let (S, O, P) and (S', O', P') be configurations of a protection system, and let op be a primitive operation. We say that

    (S, O, P) ⊢op (S', O', P')

[read: (S, O, P) yields (S', O', P') under op] if either:

(1) op = enter r into (s, o) and S = S', O = O', s ∈ S, o ∈ O, P'[s1, o1] = P[s1, o1] if (s1, o1) ≠ (s, o), and P'[s, o] = P[s, o] ∪ {r}, or
(2) op = delete r from (s, o) and S = S', O = O', s ∈ S, o ∈ O, P'[s1, o1] = P[s1, o1] if (s1, o1) ≠ (s, o), and P'[s, o] = P[s, o] − {r}, or
(3) op = create subject s', where s' is a new symbol not in O, S' = S ∪ {s'}, O' = O ∪ {s'}, P'[s, o] = P[s, o] for all (s, o) in S × O, P'[s', o] = ∅² for all o ∈ O', and P'[s, s'] = ∅ for all s ∈ S', or
(4) op = create object o', where o' is a new symbol not in O, S' = S, O' = O ∪ {o'}, P'[s, o] = P[s, o] for all (s, o) in S × O, and P'[s, o'] = ∅ for all s ∈ S, or
(5) op = destroy subject s', where s' ∈ S, S' = S − {s'}, O' = O − {s'}, and P'[s, o] = P[s, o] for all (s, o) ∈ S' × O', or
(6) op = destroy object o', where o' ∈ O − S, S' = S, O' = O − {o'}, and P'[s, o] = P[s, o] for all (s, o) ∈ S' × O'.

Fig. 1. Access matrix. [The figure pictures P as a matrix whose rows are labeled by the subjects and whose columns are labeled by the objects, the subjects appearing among the objects; the entry in row s and column o is P[s, o], the rights of subject s to object o.]

¹ This condition need not be present, since delete r from (exfriend, file) will have no effect if r is not there.
² ∅ denotes the empty set.
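The six primitive operations above are simple enough to write out directly. The following is a sketch in our own notation (the paper defines the operations only mathematically): each function maps one configuration to the next in place, and does nothing when the side conditions of the definition fail.

```python
# S: set of subjects; O: set of objects (S a subset of O);
# P: dict from (subject, object) to a set of rights, missing entries = empty set.

def enter(r, s, o, S, O, P):
    if s in S and o in O:                     # the quantification matters: s and o must exist
        P.setdefault((s, o), set()).add(r)

def delete(r, s, o, S, O, P):
    if s in S and o in O:
        P.get((s, o), set()).discard(r)

def create_subject(s1, S, O, P):
    if s1 not in O:                           # names must be new; no duplicate object names
        S.add(s1)
        O.add(s1)                             # the new row and column are empty, i.e. absent from P

def create_object(o1, S, O, P):
    if o1 not in O:
        O.add(o1)

def destroy_subject(s1, S, O, P):
    if s1 in S:
        S.discard(s1)
        O.discard(s1)
        for key in [k for k in P if s1 in k]:   # drop the row and the column of s1
            del P[key]

def destroy_object(o1, S, O, P):
    if o1 in O and o1 not in S:
        O.discard(o1)
        for key in [k for k in P if k[1] == o1]:
            del P[key]
```

Note how the existence tests make an operation a no-op when its preconditions fail, matching the remark in the text that an unsatisfied primitive operation "is not executed."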
The quantification in the previous definition is quite important. For example, a primitive operation

    enter r into (s, o)

requires that s be the name of a subject which now exists, and similarly for o. If these conditions are not satisfied, then the primitive operation is not executed. The primitive operation

    create subject s'

requires that s' is not a current object name. Thus there can never be duplicate names of objects.

Next we see how a protection system executes a command.

Definition. Let Q = (S, O, P) be a configuration of a protection system containing:

    command α(X1, ..., Xk)
    if r1 in (Xs1, Xo1) and
       ...
       rm in (Xsm, Xom)
    then op1, ..., opn
    end

Then we say

    Q ⊢α(x1, ..., xk) Q'

where Q' is the configuration defined as follows:
(1) If α's conditions are not satisfied, i.e. if there is some 1 ≤ i ≤ m such that ri is not in P[xsi, xoi], then Q' = Q.
(2) Otherwise, i.e. if for all i between 1 and m, ri ∈ P[xsi, xoi], then let there exist configurations Q0, Q1, ..., Qn such that

    Q = Q0 ⊢op1* Q1 ⊢op2* ... ⊢opn* Qn

where opi* denotes the primitive operation opi with the actual parameters x1, ..., xk replacing all occurrences of the formal parameters X1, ..., Xk, respectively. Then Q' is Qn.

We say that Q ⊢α Q' if there exist parameters x1, ..., xk such that Q ⊢α(x1, ..., xk) Q'; we say Q ⊢ Q' if there exists a command α such that Q ⊢α Q'.

It is also convenient to write Q ⊢* Q', where ⊢* is the reflexive and transitive closure of ⊢. That is, ⊢* represents zero or more applications of ⊢.

There are a number of points involved in our use of parameters which should be emphasized. Note that every command (except the empty one) has parameters. Each command is given in terms of formal parameters. At execution time, the formal parameters are replaced by actual parameters which are object names. Although the same symbols are often used in this exposition for formal and actual parameters, this should not cause confusion. The "type checking" involved in determining that a command may be executed takes place with respect to actual parameters. For example, consider

    command α(X, Y, Z)
       enter r1 into (X, X)
       destroy subject X
       enter r2 into (Y, Z)
    end

There can never be a pair of configurations Q and Q' such that

    Q ⊢α(x, y, z) Q'

since the third primitive operation, enter r2 into (x, z), will occur at a point where no subject named x exists.

Example 2. Let us consider the protection system whose commands were outlined in Example 1. Suppose initially there are two processes Sam and Joe, and no files created. Suppose that neither process has any rights to itself or to the other process (there is nothing in the model that prohibits a process from having rights to itself). The initial access matrix is:

          Sam   Joe
    Sam   ∅     ∅
    Joe   ∅     ∅

Now, Sam creates two files named Code and Data, and gives Joe the right to execute Code and read Data. The sequence of commands whereby this takes place is:

    CREATE(Sam, Code)
    CREATE(Sam, Data)
    CONFERexecute(Sam, Joe, Code)
    CONFERread(Sam, Joe, Data)

To see the effect of these commands on configurations, note that the configuration (S, O, P) can be represented by drawing P, and labeling its rows by elements of S and its columns by elements of O, as we have done for the initial configuration. The first command, CREATE(Sam, Code), may certainly be executed in the initial configuration, since CREATE has no conditions. Its body consists of two primitive operations, create object Code and enter own into (Sam, Code). Then, using the ⊢ notation, we may show the effect of the two primitive operations as:

          Sam   Joe
    Sam   ∅     ∅       ⊢ create object Code
    Joe   ∅     ∅

          Sam   Joe   Code
    Sam   ∅     ∅     ∅       ⊢ enter own into (Sam, Code)
    Joe   ∅     ∅     ∅

          Sam   Joe   Code
    Sam   ∅     ∅     {own}
    Joe   ∅     ∅     ∅
Thus, using the ⊢ notation for complete commands, we can say that:

          Sam   Joe
    Sam   ∅     ∅       ⊢ CREATE(Sam, Code)
    Joe   ∅     ∅

          Sam   Joe   Code
    Sam   ∅     ∅     {own}
    Joe   ∅     ∅     ∅

The effect on the initial configuration of the four commands listed above is:

          Sam   Joe
    Sam   ∅     ∅
    Joe   ∅     ∅

          Sam   Joe   Code
    Sam   ∅     ∅     {own}
    Joe   ∅     ∅     ∅

          Sam   Joe   Code    Data
    Sam   ∅     ∅     {own}   {own}
    Joe   ∅     ∅     ∅       ∅

          Sam   Joe   Code       Data
    Sam   ∅     ∅     {own}      {own}
    Joe   ∅     ∅     {execute}  ∅

          Sam   Joe   Code       Data
    Sam   ∅     ∅     {own}      {own}
    Joe   ∅     ∅     {execute}  {read}

We may thus say:

          Sam   Joe
    Sam   ∅     ∅       ⊢*
    Joe   ∅     ∅

          Sam   Joe   Code       Data
    Sam   ∅     ∅     {own}      {own}
    Joe   ∅     ∅     {execute}  {read}

It should be clear that in protection systems, the order in which commands are executed is not prescribed in advance. The nondeterminacy is important in modeling real systems in which accesses and changes in privileges occur in an unpredictable order.

It is our contention that the model we have presented has sufficient generality to allow one to specify almost all of the protection schemes that have been proposed: cf. [6] for many examples of this flexibility. It is of interest to note that it is immaterial whether hardware or software is used to implement the primitive operations of our model. The important issue is what one can say about systems we are able to model. In the next two sections, we shall develop the theory of protection systems using our model. We close this section with two additional examples of the power of our model to reflect common protection ideas.

Example 3. A mode of access called "indirect" is discussed in [6]. Subject s1 may access object o indirectly if there is some subject s2 with that access right to o, and s1 has the "indirect" right to s2. Formally, we could model an indirect read by postulating the generic rights read and iread, and

    command IREAD(s1, s2, o)
    if read in (s2, o) and
       iread in (s1, s2)
    then
       enter read into (s1, o)
       delete read from (s1, o)
    end

It should be noted that the command in Example 3 has both multiple conditions and a body consisting of more than one primitive operation, the first example we have seen of such a situation. In fact, since the REMOVE commands of Example 1 did not really need two conditions, we have our first example where multiple conditions are needed at all.

We should also point out that the interpretation of IREAD in Example 3 should not be taken to be null, even though the command actually has no net effect on the access matrix. The reason for this will become clearer when we discuss the safety issue in the next section. Intuitively, we want to show that s1 temporarily has the read right to o, even though it must give up the right.

Example 4. The UNIX operating system [14] uses a simple protection mechanism, where each file has one owner. The owner may specify his own privileges (read, write, and execute) and the privileges of all other users, as a group.³ Thus the system makes no distinction between subjects except for the owner-nonowner distinction.

This situation cannot be modeled in our formalism as easily as could the situations of the previous examples. It is clear that a generic right own is needed, and that the rights of a user u to a file f which u owns could be placed in the (u, f) entry of the access matrix. However, when we create a file f, it is not possible in our formalism to express a command such as "give all subjects the right to read f," since there is no a priori bound on the number of subjects.

The solution we propose actually reflects the software implementation of protection in UNIX quite well. We associate the rights to a file f with the (f, f) entry in the access matrix. This decision means that files must be treated as special kinds of subjects, but there is no logical reason why we cannot do so. Then a user u can read (or write or execute) a file f if either:
(1) own is in (u, f), i.e., u owns f, and the entry "owner can read" is in (f, f), or
(2) the entry "anyone can read" is in (f, f).

³ We ignore the role of the "superuser" in the following discussion.
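The encoding just described, a file f made a subject whose permission bits live in the diagonal entry (f, f), can be sketched directly. The names and data layout below are our own illustrations, not part of the formalism:

```python
# Access matrix as a dict from (subject, object) to a set of rights;
# a file f is itself a subject, and its permission bits sit in (f, f).
P = {}

def createfile(u, f):
    P[(u, f)] = {"own"}     # u owns f
    P[(f, f)] = set()       # f's diagonal entry holds its permission bits

def can_read(u, f):
    # (1) u owns f and "owner may read" (oread) is in (f, f), or
    # (2) "anyone may read" (aread) is in (f, f).
    return ("own" in P.get((u, f), set()) and "oread" in P.get((f, f), set())) \
        or "aread" in P.get((f, f), set())

createfile("u1", "notes")
P[("notes", "notes")].add("oread")
assert can_read("u1", "notes") and not can_read("u2", "notes")
P[("notes", "notes")].add("aread")
assert can_read("u2", "notes")   # now anyone may read
```

The point of the trick is visible in `can_read`: no per-subject entries are needed for non-owners, so an unbounded population of users is handled with a bounded number of matrix cells per file.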
Now we see one more problem. The conditions under which a read may occur are not the logical conjunction of rights, but rather the disjunction of two such conjuncts, namely
(1) own ∈ P[u, f] and oread ∈ P[f, f], or
(2) aread ∈ P[f, f],
where oread stands for "owner may read," and aread for "anyone may read." For simplicity we did not allow disjunctions in conditions. However, we can simulate a condition consisting of several lists of rights, where all rights in some one list must be satisfied in order for execution to be permissible. We simply use several commands whose interpretations are identical. That is, for each list of rights there will be one command with that list as its condition. Thus any set of commands with the more general, disjunctive kind of condition is equivalent to one in which all conditions are as we defined them originally. We shall, in this example, use commands with two lists of rights as a condition.

We can now model these aspects of UNIX protection as follows. Since write and execute are handled exactly as read, we shall treat only read. The set of generic rights is thus own, oread, aread, and read. The first three of these have already been explained. read is symbolic only, and it will be entered temporarily into (u, f) by a READ command, representing the fact that u can actually read f. read will never appear in the access matrix between commands and in fact is not reflected directly in the protection mechanism of UNIX. The list of commands is shown in Figure 2.

Fig. 2. UNIX type protection mechanism.

    command CREATEFILE(u, f)
       create subject f
       enter own into (u, f)
    end

    command LETOREAD(u, f)
    if own in (u, f)
    then enter oread into (f, f)
    end

    command LETAREAD(u, f)
    if own in (u, f)
    then enter aread into (f, f)
    end

    command READ(u, f)
    if either
       own in (u, f) and
       oread in (f, f)
    or
       aread in (f, f)
    then
       enter read into (u, f)
       delete read from (u, f)
    end

4. Safety

We shall now consider one important family of questions that could be asked about a protection system, those concerning safety. When we say a specific protection system is "safe," we undoubtedly mean that access to files without the concurrence of the owner is impossible. However, protection mechanisms are often used in such a way that the owner gives away certain rights to his objects. Example 4 illustrates this phenomenon. In that sense, no protection system is "safe," so we must consider a weaker condition that says, in effect, that a particular system enables one to keep one's own objects "under control."

Since we cannot expect that a given system will be safe in the strictest sense, we suggest that the minimum tolerable situation is that the user should be able to tell whether what he is about to do (give away a right, presumably) can lead to the further leakage of that right to truly unauthorized subjects. As we shall see, there are protection systems under our model for which even that property is too much to expect. That is, it is in general undecidable whether, given an initial access matrix, there is some sequence of commands in which a particular generic right is entered at some place in the matrix where it did not exist before. Furthermore, in some restricted cases where safety is decidable, the decision procedures are probably too slow to be of practical utility.

This question, whether a generic right can be "leaked," is itself insufficiently general. For example, suppose subject s plans to give subject s' generic right r to object o. The natural question is whether the current access matrix, with r entered into (s', o), is such that generic right r could subsequently be entered somewhere new. To avoid a trivial "unsafe" answer because s himself can confer generic right r, we should in most circumstances delete s itself from the matrix. It might also make sense to delete from the matrix any other "reliable" subjects who could grant r, but whom s "trusts" will not do so. It is only by using the hypothetical safety test in this manner, with "reliable" subjects deleted, that the ability to test whether a right can be leaked has a useful meaning in terms of whether it is safe to grant a right to a subject.

Another common notion of the term "safety" is that one be assured it is impossible to leak right r to a particular object o1. We can use our more general definition of safety to simulate this one. To test whether in some situation right r to object o1 can be leaked, create two new generic rights, r' and r''. Put r' in (o1, o1), but do nothing yet with r''. Then add

    command DUMMY(s, o)
    if r in (s, o) and
       r' in (o, o)
    then
       enter r'' into (o, o)
    end
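The effect of this reduction can be checked mechanically. A sketch, with the matrix layout and the ASCII names for r, r', r'' being our own illustrative choices:

```python
# r' is planted only in the diagonal entry of the target object o1, so the
# DUMMY command below can fire only when r has reached some cell (s, o1).
P = {("o1", "o1"): {"rprime"}}   # access matrix: (subject, object) -> rights

def dummy(s, o):
    # command DUMMY(s, o): if r in (s, o) and r' in (o, o)
    #                      then enter r'' into (o, o)
    if "r" in P.get((s, o), set()) and "rprime" in P.get((o, o), set()):
        P.setdefault((o, o), set()).add("rdoubleprime")

dummy("s", "o2")                  # r' absent at (o2, o2): no effect
P[("s", "o1")] = {"r"}            # suppose some computation leaks r to o1
dummy("s", "o1")                  # now r'' appears, witnessing the leak
```

So r'' can be entered anywhere at all exactly when r can reach o1, which is what lets the general leak test answer the object-specific question.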
o must be ol in command D U M M Y . Thus, leaking r" Example 2. Suppose a were the only command in the
to anybody is equivalent to leaking generic right r system. If the initial configuration has exactly one
to object ol specifically. subject and no other objects, then it is safe for r2 but
We shall now give a formal definition of the safety not for r l .
question for protection systems. There is a special case for which we can show it is
Definition. Given a protection system, we say com- decidable whether a given right is potentially leaked in
mand a(X1, . . . , Xk) leaks generic right r from configu- any given initial configuration. Decidability in this
ration Q = (S, O, P) if a, when run on Q, can execute special case is not significant in itself, since it is much
a primitive operation which enters r into a cell of the too restricted to model interesting systems. However,
access matrix which did not previously contain r. it is suggestive of stronger results that might be p r o v e d - -
More formally, there is some assignment of actual results which would enable the designer of a protection
parameters x l , • • •, xk such that system to be sure that an algorithm to decide safety,
(I) a ( x l , . . . , xk) has its conditions satisfied in Q, i.e. in the sense we have used the term here, existed for
for each clause "r in (X~, X~.)" in a's conditions his system.
we have r ~ P[x~ , xy], and Definition. A protection system is mono-operational
(2) if a's body is opl, • . . , o p , , then there exists an m, if each command's interpretation is a single primitive
1 _< m < n, and configurations Q = Q0, Q ~ , . . . operation.
!
Q,,_x = (S', o , P'), and Qm = (S", O , P"),
t,'
Example 4, based on UNIX, is not mono-operational
such that because the interpretation of C R E A T E F I L E has
length two.
Q0 ~o,1. Q1 ~o~2 . . . . Q,,-1 ~o~,~. Q,, THEOREM 1. There is an algorithm which decides
where op*_i denotes op_i after substitution of x1, ..., xk for X1, ..., Xk, and moreover there exist some s and o such that

    r ∉ P'[s, o]  but  r ∈ P''[s, o].

(Of course, op_m must be enter r into (s, o).) Notice that given Q, α and r, it is easy to check whether α leaks r from Q. Also note that α leaks r from Q even if α deletes r after entering it. Commands IREAD in Example 3 and READ in Example 4 are typical of commands which enter a right and then immediately delete it. In a real system we would expect a procedure called "READ" to contain code between the enter and delete operations which passes data from the file read to some other file or process. Although we do not model directly the presence of such code, the temporary presence of the read right in the access matrix pinpoints this data transfer, thus identifying the potential leak.

We should emphasize that "leaks" are not necessarily "bad." Any interesting system will have commands which can enter some rights (i.e. be able to leak those rights). The term assumes its usual negative significance only when applied to some configuration, most likely modified to eliminate "reliable" subjects as discussed in the beginning of this section, and to some right which we hope cannot be passed around.

Definition. Given a particular protection system and generic right r, we say that the initial configuration Q0 is unsafe for r (or leaks r) if there is a configuration Q and a command α such that
(1) Q0 ⊢* Q, and
(2) α leaks r from Q.
We say Q0 is safe for r if Q0 is not unsafe for r.

Example 5. Let us reconsider the simple example of a command α(X, Y, Z) which immediately precedes ...

THEOREM 1. There is an algorithm which decides whether or not a given mono-operational protection system and initial configuration is unsafe for a given generic right r.

PROOF. The proof hinges on two simple observations. First, commands can test for the presence of rights, but not for the absence of rights or objects. This allows delete and destroy commands⁴ to be removed from computations leading to a leak. Second, a command can only identify objects by the rights in their row and column of the access matrix. No mono-operational command can both create an object and enter rights, so multiple creates can be removed from computations, leaving the creation of only one subject. This allows the length of the shortest "leaky" computation to be bounded.

Suppose

    (*)  Q0 ⊢_C1 Q1 ⊢_C2 ⋯ ⊢_Cm Qm

is a minimal length computation reaching some configuration Qm for which there is a command α leaking r. Let Qi = (Si, Oi, Pi). Now we claim that Ci, 2 ≤ i ≤ m, is an enter command, and C1 is either an enter or create subject command. Suppose not, and let Cn be the last non-enter command in the sequence (*). Then we could form a shorter computation

    Q0 ⊢_C1 Q1 ⊢ ⋯ Q_{n-1} ⊢_{C'_{n+1}} Q'_{n+1} ⊢ ⋯ ⊢_{C'_m} Q'_m

as follows.
(a) If Cn is a delete or destroy command, let C'_i = C_i and Q'_i = Q_i plus the right, subject or object which would have been deleted or destroyed by Cn. By the first observation above, C_i cannot distinguish Q'_{i-1} from Q_{i-1}, so Q'_{i-1} ⊢_{C'_i} Q'_i holds. Likewise, α leaks r from Q'_m since it did so from Q_m.

⁴ Since the system is mono-operational, we can identify the command by the type of primitive operation.

467 Communications of the ACM, August 1976, Volume 19, Number 8
(b) Suppose Cn is a create subject command and |S_{n-1}|⁵ ≥ 1, or Cn is a create object command. Note that α leaks r from Qm by assumption, so α is an enter command. Further, we must have |Sm| ≥ 1 and

    |Sm| = |S_{m-1}| = ⋯ = |Sn| ≥ 1

(Cm, ..., C_{n+1} are enter commands by assumption). Thus |S_{n-1}| ≥ 1 even if Cn is a create object command. Let s ∈ S_{n-1}. Let o be the name of the object created by Cn. Now we can let C'_i = C_i with s replacing all occurrences of o, and Q'_i = Q_i with s and o merged. For example, if o ∈ O_i − S_i we would have

    S'_i = S_i,
    O'_i = O_i − {o},
    P'_i[x, y] = P_i[x, y]               if y ≠ s,
    P'_i[x, y] = P_i[x, s] ∪ P_i[x, o]   if y = s.

Clearly,

    P_i[x, o] ⊆ P'_i[x, s],

so for any condition in C_i satisfied by o, the corresponding condition in C'_i is satisfied by s. Likewise for the conditions of α.

(c) Otherwise, we have |S_{n-1}| = 0, Cn is a create subject command, and n ≥ 2. The construction in this case is slightly different: the create subject command cannot be deleted (subsequent "enters" would have no place to enter into). However, the commands preceding Cn can be skipped (provided that the names of objects created by them are replaced), giving

    Q0 ⊢_{Cn} Q'_n ⊢_{C'_{n+1}} Q'_{n+1} ⊢ ⋯ ⊢_{C'_m} Q'_m

where, if Sn = {s}, we have C'_i is C_i with s replacing the names of all objects in O_{n-1}, and Q'_i is Q_i with s merged with all o ∈ O_{n-1}.

In each of these cases we have created a shorter "leaky" computation, contradicting the supposed minimality of (*). Now we note that no C_i enters a right r into a cell of the access matrix already containing r, else we could get a shorter sequence by deleting C_i. Thus we have an upper bound on m:

    m ≤ g(|S0| + 1)(|O0| + 1) + 1,

where g is the number of generic rights.

An obvious decision procedure now presents itself: try all sequences of enter commands, optionally starting with a create subject command, of length up to the bound given above. This algorithm is exponential in the matrix size. However, by using the technique of "dynamic programming" (see [1], e.g.), an algorithm polynomial in the size of the initial matrix can easily be devised for any given protection system.

It is worth noting that if we wish a decision procedure for all mono-operational systems, where the commands are a parameter of the problem, then the decision problem is "NP-complete." To say that a problem is NP-complete intuitively means that if the problem could be shown to be solvable in polynomial time, this would be a major result, in that a large number of other problems could then be solved efficiently. The best known such problem is probably the "traveling salesperson problem." Thus the above problem is almost certainly of exponential time complexity in the size of the matrix. Cf. [1] for a more thorough discussion of these matters.

For those familiar with the technical definitions needed, the argument will be sketched. (All these definitions may be found in [1].) We can prove the result by reducing the k-clique problem to the problem: given a mono-operational system, a right r, and an initial access matrix, determine if that matrix is safe for r. Given a graph and an integer k, produce a protection system whose initial access matrix is the adjacency matrix for the graph and having one command. This command's conditions test its k parameters to see if they form a k-clique, and its body enters some right r somewhere. The matrix will be unsafe for r in this system if and only if the graph has a k-clique. The above is a polynomial reduction of the known NP-complete clique problem to our problem, so our problem is at best NP-complete. It is easy to find a nondeterministic polynomial time algorithm to test safety, so our problem is in fact NP-complete and no worse.

One obvious corollary of the above is that any family of protection systems which includes the mono-operational systems must have a general decision problem which is at least as difficult as the NP-complete problems, although individual members of the family could have easier decision problems.

Another unproven but probably true characteristic of NP-complete problems has interesting implications concerning proofs of safety. We can give a "short," i.e. polynomial length, proof that a given matrix for a mono-operational system is not safe [just list the computation (*)], although such a proof may be difficult to find. However, it is probable that there is no proof system in which we can guarantee the existence of, let alone find, short proofs that an initial matrix for an arbitrary mono-operational system is safe.

⁵ |A| stands for the number of members in set A.
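The polynomial-time procedure for a fixed mono-operational system can be made concrete. The sketch below is our own illustration, not part of the paper: following the proof of Theorem 1, deletes, destroys, and all but one create can be dropped from any leaky computation, so a command is modeled as a single enter operation guarded by presence conditions, one fresh subject is created up front, and the matrix is then closed under the enter commands. Since rights only grow, the fixpoint is reached in time polynomial in the matrix size for a fixed set of commands. (For simplicity the sketch elides the subject/object distinction and lets every parameter range over all objects.)

```python
from itertools import product

# Illustrative sketch, not from the paper.  A mono-operational command
# is a single "enter" guarded by presence conditions; by Theorem 1,
# deletes, destroys, and all but one create can be dropped from any
# leaky computation, so only enters (plus one fresh subject) matter.
class Command:
    def __init__(self, nparams, conditions, enters):
        self.nparams = nparams
        self.conditions = conditions  # [(right, i, j)]: right in (x_i, x_j)
        self.enters = enters          # (right, i, j) entered by the body

def is_unsafe(objects, P, commands, r):
    """Decide whether the initial matrix P leaks generic right r.

    P: dict (x, y) -> set of rights.  One fresh subject is added, then
    the matrix is closed under the enter commands; rights only grow,
    so this fixpoint computation terminates.
    """
    objects = objects + ["s_new"]     # the single "create subject"
    P = {(x, y): set(P.get((x, y), ())) for x in objects for y in objects}
    changed = True
    while changed:
        changed = False
        for cmd in commands:
            for args in product(objects, repeat=cmd.nparams):
                if all(right in P[(args[i], args[j])]
                       for right, i, j in cmd.conditions):
                    right, i, j = cmd.enters
                    if right not in P[(args[i], args[j])]:
                        if right == r:
                            return True  # r enters a cell lacking it: a leak
                        P[(args[i], args[j])].add(right)
                        changed = True
    return False
```

For instance, a one-parameter command that enters r into (x, x) whenever some right a is present there makes any matrix containing a unsafe for r, and leaves a matrix without a safe.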
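The clique reduction of Section 4 is equally easy to exhibit concretely. The following is our own illustration and its names (the right "edge", the helper functions) are ours, not the paper's: the initial access matrix encodes the adjacency relation, and unsafety holds exactly when some assignment of vertices to the single command's k parameters satisfies every pairwise-adjacency condition.

```python
from itertools import combinations, permutations

# Illustrative sketch of the Section 4 reduction (our names).  Right
# "edge" in cell (u, v) encodes adjacency; the single command's
# conditions require its k parameters to be pairwise adjacent, and its
# body would then enter r somewhere.

def clique_system(vertices, edges, k):
    P = {(u, v): set() for u in vertices for v in vertices}
    for u, v in edges:
        P[(u, v)].add("edge")
        P[(v, u)].add("edge")
    conditions = [("edge", i, j) for i, j in combinations(range(k), 2)]
    return P, conditions

def is_unsafe_for_r(vertices, P, conditions, k):
    # The matrix leaks r iff some assignment of vertices to the k
    # parameters satisfies every condition, i.e. iff a k-clique exists.
    return any(
        all(right in P[(args[i], args[j])] for right, i, j in conditions)
        for args in permutations(vertices, k))
```

A triangle is unsafe for r with k = 3, while a three-vertex path is safe, mirroring the "unsafe iff k-clique" equivalence.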
5. Undecidability of the Safety Problem

We are now going to prove that the general safety problem is not decidable. We assume the reader is familiar with the notion of a Turing machine (see [7], e.g.). Each Turing machine T consists of a finite set of states K and a distinct finite set of tape symbols Γ. One of the tape symbols is the blank B, which initially appears on each cell of a tape which is infinite to the right only (that is, the tape cells are numbered 1, 2, ..., i, ...). There is a tape head which is always scanning (located at) some cell of the tape.

The moves of T are specified by a function δ from K × Γ to K × Γ × {L, R}. If δ(q, X) = (p, Y, R) for states p and q and tape symbols X and Y, then should the Turing machine T find itself in state q, with its tape head scanning a cell holding symbol X, then T enters state p, erases X, prints Y on the tape cell scanned, and moves its tape head one cell to the right. If δ(q, X) = (p, Y, L), the same thing happens, but the tape head moves one cell left (but never off the left end of the tape at cell 1).

Initially, T is in state q0, the initial state, with its head at cell 1. Each tape cell holds the blank. There is a particular state qf, known as the final state, and it is a fact that it is undecidable whether, started as above, an arbitrary Turing machine T will eventually enter state qf.

THEOREM 2. It is undecidable whether a given configuration of a given protection system is safe for a given generic right.

PROOF. We shall show that safety is undecidable by showing that a protection system, as we have defined the term, can simulate the behavior of an arbitrary Turing machine, with leakage of a right corresponding to the Turing machine entering a final state, a condition we know to be undecidable. The set of generic rights of our protection system will include the states and tape symbols of the Turing machine. At any time, the Turing machine will have some finite initial prefix of its tape cells, say 1, 2, ..., k, which it has ever scanned. This situation will be represented by a sequence of k subjects, s1, s2, ..., sk, such that si "owns" si+1 for 1 ≤ i < k. Thus we use the ownership relation to order subjects into a linear list representing the tape of the Turing machine. Subject si represents cell i, and the fact that cell i now holds tape symbol X is represented by giving si generic right X to itself. The fact that q is the current state and that the tape head is scanning the jth cell is represented by giving sj generic right q to itself. Note that we have assumed the states distinct from the tape symbols, so no confusion can result.

There is a special generic right end, which marks the last subject, sk. That is, sk has generic right end to itself, indicating that we have not yet created the subject sk+1 which sk is to own. The generic right own completes the set of generic rights. An example showing how a tape whose first four cells hold WXYZ, with the tape head at the second cell and the machine in state q, is shown in Figure 3.

Fig. 3. Representing a tape.

        s1       s2        s3       s4
  s1   {W}     {own}
  s2           {X, q}    {own}
  s3                     {Y}      {own}
  s4                              {Z, end}

The moves of the Turing machine are reflected in commands as follows. First, if

    δ(q, X) = (p, Y, L),

then there is

command C_qX(s, s')
if
    own in (s, s') and
    q in (s', s') and
    X in (s', s')
then
    delete q from (s', s')
    delete X from (s', s')
    enter p into (s, s)
    enter Y into (s', s')
end

That is, s and s' must represent two consecutive cells of the tape, with the machine in state q, scanning the cell represented by s', and with the symbol X written in s'. The body of the command changes X to Y and moves the head left, changing the state from q to p. For example, Figure 3 becomes Figure 4 when command C_qX is applied.

If

    δ(q, X) = (p, Y, R),

that is, the tape head moves right, then we have two commands, depending on whether or not the head passes the current end of the tape, that is, the end right. There is

command C_qX(s, s')
if
    own in (s, s') and
    q in (s, s) and
    X in (s, s)
then
    delete q from (s, s)
    delete X from (s, s)
    enter p into (s', s')
    enter Y into (s, s)
end

To handle the case where the Turing machine moves into new territory, there is also

command D_qX(s, s')
if
    end in (s, s) and
    q in (s, s) and
    X in (s, s)
then
    delete q from (s, s)
    delete X from (s, s)
    create subject s'
    enter B into (s', s')
    enter p into (s', s')
    enter Y into (s, s)
    delete end from (s, s)
    enter end into (s', s')
    enter own into (s, s')
end
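The commands C_qX and D_qX can be transcribed almost mechanically. The sketch below is our own illustration, not the paper's notation, and its function names are assumptions: the access matrix is a dictionary from (subject, object) pairs to sets of rights, the rights are the machine's states and tape symbols plus own and end, and one step of the simulation fires whichever command is enabled, mirroring the three command bodies above.

```python
# Illustrative sketch (not from the paper): the access matrix P is a
# dict mapping (subject, object) pairs to sets of generic rights.

def initial_matrix(q0, blank="B"):
    # One subject s1 holding the initial state, a blank, and "end".
    return {("s1", "s1"): {q0, blank, "end"}}, ["s1"]

def step(P, subjects, delta, blank="B"):
    """Apply the single enabled command C_qX or D_qX, if any.

    delta maps (state, symbol) -> (state, symbol, "L" or "R").
    Returns True if a command fired, False if the machine has halted.
    """
    for (q, X), (p, Y, move) in delta.items():
        for i, s in enumerate(subjects):
            cell = P.get((s, s), set())
            if q not in cell or X not in cell:
                continue
            if move == "L":
                # C_qX, left move: the owner of s receives the new state.
                if i > 0 and "own" in P.get((subjects[i - 1], s), set()):
                    owner = subjects[i - 1]
                    cell -= {q, X}
                    cell.add(Y)
                    P[(owner, owner)].add(p)
                    return True
            elif "end" not in cell:
                # C_qX, right move within the already-created tape.
                nxt = subjects[i + 1]
                if "own" in P.get((s, nxt), set()):
                    cell -= {q, X}
                    cell.add(Y)
                    P[(nxt, nxt)].add(p)
                    return True
            else:
                # D_qX: create a fresh subject to extend the tape.
                new = "s%d" % (len(subjects) + 1)
                subjects.append(new)
                cell -= {q, X, "end"}
                cell.add(Y)
                P[(new, new)] = {blank, p, "end"}
                P[(s, new)] = {"own"}
                return True
    return False
```

Running a machine whose second move enters a final state qf leaves qf in the matrix, i.e. the corresponding protection system leaks qf, exactly as in the proof.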

If we begin with the initial matrix having one subject s1, with rights q0, B (blank) and end to itself, then the access matrix will always have exactly one generic right that is a state. This follows because each command deletes a state known by the conditions of that command to exist. Each command also enters one state into the matrix. Also, no entry in the access matrix can have more than one generic right that is a tape symbol, by a similar argument. Likewise, end appears in only one entry of the matrix, the diagonal entry for the last created subject.

Thus, in each configuration of the protection system reachable from the initial configuration, there is at most one command applicable. This follows from the fact that the Turing machine has at most one applicable move in any situation, and the fact that C_qX and D_qX can never be simultaneously applicable. The protection system must therefore exactly simulate the Turing machine using the representation we have described. If the Turing machine enters state qf, then the protection system can leak generic right qf; otherwise, it is safe for qf. Since it is undecidable whether the Turing machine enters qf, it must be undecidable whether the protection system is safe for qf.

We can prove a result similar to Theorem 2 which is in a sense a strengthening of it. Theorem 2 says that there is no single algorithm which can decide safety for all protection systems. One might hope that for each protection system, one could find a particular algorithm to decide safety. We can easily show that this is not possible. By simulating a universal Turing machine [7] on an arbitrary input, we can exhibit a particular protection system for which it is undecidable whether a given initial configuration is safe for a given right. Thus, although we can give different algorithms to decide safety for different classes of systems, we can never hope even to cover all systems with a finite, or even infinite, collection of algorithms.

Two other facts are easily seen. First, since we know that there are arbitrarily complex computable functions, there must be special cases of protection systems where safety is decidable but arbitrarily difficult. Second, although any real system must place a bound on the number of objects which can be created, this bound will not make the decision of the safety question easy. While the finiteness of real resources does make safety decidable, we can show the following.

THEOREM 3. The question of safety for protection systems without create commands is complete in polynomial space.⁶

PROOF. A construction similar to that of Theorem 2 proves that any polynomial space bounded Turing machine can be reduced in polynomial time to an initial access matrix whose size is polynomial in the length of the Turing machine input.

6. Conclusions and Open Questions

A very simple model for protection systems has been presented in which most protection issues can be represented. In this model, it has been shown that no algorithm can decide the safety of an arbitrary configuration of an arbitrary protection system. To avoid misunderstanding of this result, we shall list some implications of the result explicitly.

First, there is no hope of finding an algorithm which can certify the safety of an arbitrary configuration of an arbitrary protection system, or of all configurations for a given system. This result should not dampen the spirits of those working on operating systems verification. It only means they must consider restricted cases (or individual cases), and undoubtedly they have realized this already.

In a similar vein, the positive result of Section 4 should not be a cause for celebration. In particular, the result is of no use unless it can be strengthened along the lines of the models in [8].

Our model offers a natural classification of certain features of protection systems and provides an interesting framework for investigating the following questions: Which features cause a system to slip over the line and have an undecidable safety problem? Are there natural restrictions to place on a protection system which make it have a solvable safety question?

Fig. 4. After one move.

        s1        s2      s3       s4
  s1   {W, p}   {own}
  s2            {Y}     {own}
  s3                    {Y}      {own}
  s4                             {Z, end}

Acknowledgment. The authors thank one of the referees for simplifying the proof of Theorem 2.

Received November 1974; revised December 1975

⁶ This probably implies that the decision problem requires exponential time; cf. [1].

References
1. Aho, A.V., Hopcroft, J.E., and Ullman, J.D. The Design and Analysis of Computer Algorithms. Addison-Wesley, Reading, Mass., 1974.
2. Andrews, G.R. COPS: A protection mechanism for computer systems. Ph.D. Th. and Tech. Rep. 74-07-12, Computer Sci. Program, U. of Washington, Seattle, Wash., July 1974.
3. Bell, D.E., and LaPadula, L.J. Secure Computer Systems, Vol. I: Mathematical Foundations, and Vol. II: A Mathematical Model. MITRE Corp. Tech. Rep. MTR-2547, 1973.
4. Dennis, J.B., and Van Horn, E.C. Programming semantics for multiprogrammed computations. Comm. ACM 9, 3 (March 1966), 143-155.
5. Graham, R.M. Protection in an information processing utility. Comm. ACM 11, 5 (May 1968), 365-369.
6. Graham, G.S., and Denning, P.J. Protection: principles and practice. AFIPS Conf. Proc., 1972 SJCC, Vol. 40, AFIPS Press, Montvale, N.J., 1972, pp. 417-429.
7. Hopcroft, J.E., and Ullman, J.D. Formal Languages and Their Relation to Automata. Addison-Wesley, Reading, Mass., 1969.
8. Jones, A.K. Protection in programmed systems. Ph.D. Th., Dep. of Computer Sci., Carnegie-Mellon U., Pittsburgh, Pa., June 1973.
9. Jones, A.K., and Wulf, W. Towards the design of secure systems. In Protection in Operating Systems, Colloques IRIA, Rocquencourt, France, 1974, pp. 121-136.
10. Lampson, B.W. Protection. Proc. Fifth Princeton Symp. on Information Sciences and Systems, Princeton University, March 1971, pp. 437-443. Reprinted in Operating Systems Rev. 8, 1 (Jan. 1974), 18-24.
11. Lampson, B.W. A note on the confinement problem. Comm. ACM 16, 10 (Oct. 1973), 613-615.
12. Needham, R.M. Protection systems and protection implementations. AFIPS Conf. Proc., 1972 FJCC, Vol. 41, AFIPS Press, Montvale, N.J., 1972, pp. 571-578.
13. Popek, G.J. Correctness in access control. Proc. ACM Nat. Computer Conf., 1974, pp. 236-241.
14. Ritchie, D.M., and Thompson, K. The UNIX time-sharing system. Comm. ACM 17, 7 (July 1974), 365-375.
15. Saltzer, J.H. Protection and the control of information sharing in MULTICS. Comm. ACM 17, 7 (July 1974), 388-402.
