05: The Distributed System Model
Ethan Blanton
Department of Computer Science and Engineering
University at Buffalo
Distributed Systems
Early on, we defined a distributed system as: computer programs communicating with one another by passing messages.
What are:
computer programs?
communication?
messages?
Computer Programs
“What is a computer program?” is a hard question.
Message Passing
There are many avenues for message passing:
Shared memory
Files
Sockets
Pipes
Go channels (sketched below)
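As a concrete illustration (a minimal Go sketch, not taken from the slides; all names are illustrative), a channel lets one goroutine pass a message to another:

package main

import "fmt"

func main() {
    // The channel is the message-passing medium.
    messages := make(chan string)

    // The sender goroutine passes a message to the receiver.
    go func() {
        messages <- "hello from the sender"
    }()

    // The receiver blocks until the message arrives.
    fmt.Println(<-messages)
}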
Concurrency
Synchronous Systems
In a synchronous system, all actions take predictable time:
A message sent from P to Q always arrives within some bounded time.
The relative rate of progress in P and Q is known.
Asynchronous Systems
In an asynchronous system, no such bounds exist:
A message sent from P to Q may take arbitrarily long to arrive.
The relative rates of progress in P and Q are unknown.
Implications of Asynchrony
Asynchronous systems present challenges.
Suppose that:
P sends a message to Q and expects a response.
No response arrives within the expected time.
What happened?
Did Q fail?
Is Q much slower than P expects, and still working?
Was the request message delayed in the network?
Was the request message lost?
Was the response delayed or lost?
From P’s perspective, the expected response is either:
late, or
never going to arrive.
Loss as Delay
In particular:
Loss at a lower layer may look like delay at a higher layer.
Consider:
Process P sends a message to Q and expects a reply
P never receives a reply
P can’t tell whether the request or the reply was lost or merely delayed, or whether Q failed. The sketch below illustrates why.
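A hedged sketch of this situation in Go (assuming P waits on a channel for Q’s reply; the function and channel names are hypothetical): when the timeout fires, P still cannot distinguish loss from delay.

package main

import (
    "fmt"
    "time"
)

// awaitReply waits for a reply from Q, giving up after a timeout.
func awaitReply(reply <-chan string, timeout time.Duration) {
    select {
    case r := <-reply:
        fmt.Println("got reply:", r)
    case <-time.After(timeout):
        // The request may be lost, the reply may be lost or delayed,
        // or Q may simply be slow. P cannot tell which.
        fmt.Println("no reply: lost, or merely late?")
    }
}

func main() {
    reply := make(chan string)
    // Nothing ever sends on reply, simulating a lost message.
    awaitReply(reply, 500*time.Millisecond)
}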
Example:
Suppose we have a shared variable x = 0 visible to both P and Q.
P: x = x + 1
Q: x = x - 1
Race Conditions
This is a race, or race condition:
Two or more events are dependent upon each other
Some of the events may happen in more than one order, or even simultaneously
There exists some ordering of the events that is incorrect
For example:
Some state will be updated multiple times
Output will be produced based on the state
Example Race
P: x = x + 1
Q: x = x - 1
Why?
Atomicity
These statements are not atomic: they can be interrupted.
P reads x (0)
Q reads x (0)
P computes x + 1 (1)
Q computes x - 1 (-1)
P stores x = 1
Q stores x = -1
Q’s store overwrites P’s update: x ends at -1, not 0. A Go sketch of this interleaving follows.
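A minimal Go sketch of this race (assuming P and Q are goroutines sharing x; not code from the slides): running it under the race detector (go run -race) reports the conflicting accesses, and the final value of x is frequently not 0.

package main

import (
    "fmt"
    "sync"
)

func main() {
    x := 0
    var wg sync.WaitGroup
    wg.Add(2)

    // P: read x, compute x + 1, store x; the three steps are not atomic.
    go func() {
        defer wg.Done()
        for i := 0; i < 100000; i++ {
            x = x + 1
        }
    }()

    // Q: read x, compute x - 1, store x; also not atomic.
    go func() {
        defer wg.Done()
        for i := 0; i < 100000; i++ {
            x = x - 1
        }
    }()

    wg.Wait()
    fmt.Println(x) // Frequently nonzero: interleaved updates were lost.
}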
Happens Before
If event a happens before event b, then the effects of a are visible to b; if neither happens before the other, the events are concurrent. The Go memory model [5] specifies its synchronization guarantees in these terms.
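For instance (a sketch of one of Go’s happens-before rules [5], not from the slides): a send on a channel happens before the corresponding receive completes, so a write made before the send is guaranteed to be visible after the receive.

package main

import "fmt"

var msg string

func main() {
    done := make(chan struct{})

    go func() {
        msg = "written before the send" // (1)
        done <- struct{}{}              // (2) the send happens before ...
    }()

    <-done           // (3) ... this receive completes.
    fmt.Println(msg) // Guaranteed to observe the write at (1).
}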
Mutexes
A mutex (mutual exclusion lock) ensures that only one process at a time executes the critical section that touches the shared state.
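A minimal sketch of fixing x with a mutex (assuming Go’s sync.Mutex; not code from the slides):

package main

import (
    "fmt"
    "sync"
)

func main() {
    x := 0
    var mu sync.Mutex
    var wg sync.WaitGroup
    wg.Add(2)

    // P: the lock makes the read-modify-write atomic with respect to Q.
    go func() {
        defer wg.Done()
        mu.Lock()
        x = x + 1
        mu.Unlock()
    }()

    // Q: protected by the same mutex.
    go func() {
        defer wg.Done()
        mu.Lock()
        x = x - 1
        mu.Unlock()
    }()

    wg.Wait()
    fmt.Println(x) // Always 0.
}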
Messages
Alternatively, all access to shared state can be serialized by sending messages to a single process that owns it; this is the CSP-style fix shown next.
Fixing x
With CSP, we can ask a single process to manipulate x:
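The slide’s own code is not reproduced here; what follows is a hedged Go sketch of the idea, with illustrative channel names: only the owning goroutine ever touches x, and P and Q send it their updates as messages.

package main

import "fmt"

func main() {
    ops := make(chan int)   // increment/decrement requests for x
    value := make(chan int) // reports the final value of x

    // The single owner of x; no other goroutine reads or writes it.
    go func() {
        x := 0
        for i := 0; i < 2; i++ {
            x += <-ops
        }
        value <- x
    }()

    go func() { ops <- +1 }() // P: x = x + 1
    go func() { ops <- -1 }() // Q: x = x - 1

    fmt.Println(<-value) // Always 0: the owner applies both updates in some order.
}

Because a single process serializes every update to x, no ordering of the messages can lose an update.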
Summary
Next Time …
References I
Required Readings
[3] Ajay D. Kshemkalyani and Mukesh Singhal. Distributed Computing: Principles, Algorithms, and Systems. Chapter 1: 1.1–1.3, 1.5–1.8. Cambridge University Press, 2008. ISBN: 978-0-521-18984-2.
Optional Readings
[1] C. A. R. Hoare. “Communicating Sequential Processes”. In: Communications of the ACM 21.8 (Aug. 1978), pp. 666–677. URL: https://search.lib.buffalo.edu/permalink/01SUNY_BUF/12pkqkt/cdi_crossref_primary_10_1145_359576_359585.
References II
[2] C. A. R. Hoare. Communicating Sequential Processes. Prentice Hall International, 1985. URL: http://www.usingcsp.com/.
[4] Rob Pike. Concurrency is not Parallelism. Jan. 2012. URL: https://go.dev/blog/waza-talk.
[5] The Go Memory Model. May 2014. URL: https://go.dev/ref/mem.