
PARALLEL PROGRAMMING
WITH
MICROSOFT .NET®

Design Patterns for
Decomposition and Coordination
on Multicore Architectures

Colin Campbell
Ralph Johnson
Ade Miller
Stephen Toub

Foreword by
Tony Hey

a guide to parallel programming
Parallel Programming
with Microsoft .NET®

Design Patterns for Decomposition and
Coordination on Multicore Architectures

Colin Campbell
Ralph Johnson
Ade Miller
Stephen Toub
ISBN 9780735640603

This document is provided “as-is.” Information and views expressed in this
document, including URL and other Internet website references, may change
without notice. You bear the risk of using it. Unless otherwise noted, the
companies, organizations, products, domain names, email addresses, logos,
people, places, and events depicted in examples herein are fictitious. No
association with any real company, organization, product, domain name,
email address, logo, person, place, or event is intended or should be inferred.
Complying with all applicable copyright laws is the responsibility of the user.
Without limiting the rights under copyright, no part of this document may be
reproduced, stored in or introduced into a retrieval system, or transmitted in
any form or by any means (electronic, mechanical, photocopying, recording,
or otherwise), or for any purpose, without the express written permission of
Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or
other intellectual property rights covering subject matter in this document.
Except as expressly provided in any written license agreement from Microsoft,
the furnishing of this document does not give you any license to these patents,
trademarks, copyrights, or other intellectual property.

© 2010 Microsoft Corporation. All rights reserved.

Microsoft, MSDN, Visual Basic, Visual C#, Visual Studio, Windows, Windows
Live, Windows Server, and Windows Vista are trademarks of the Microsoft
group of companies.

All other trademarks are property of their respective owners.


Contents

Foreword xi
Tony Hey

Preface xiii
Who This Book Is For xiii
Why This Book Is Pertinent Now xiv
What You Need to Use the Code xiv
How to Use This Book xv
Introduction xvi
Parallelism with Control Dependencies Only xvi
Parallelism with Control and Data Dependencies xvi
Dynamic Task Parallelism and Pipelines xvi
Supporting Material xvii
What Is Not Covered xviii
Goals xviii

Acknowledgments xix

1 Introduction 1
The Importance of Potential Parallelism 2
Decomposition, Coordination,
and Scalable Sharing 3
Understanding Tasks 3
Coordinating Tasks 4
Scalable Sharing of Data 5
Design Approaches 6
Selecting the Right Pattern 7
A Word About Terminology 7
The Limits of Parallelism 8
A Few Tips 10
Exercises 11
For More Information 11

2 Parallel Loops 13
The Basics 14
Parallel for Loops 14
Parallel ForEach 15
Parallel LINQ (PLINQ) 16
What to Expect 16
An Example 18
Sequential Credit Review Example 19
Credit Review Example Using
Parallel.ForEach 19
Credit Review Example with PLINQ 20
Performance Comparison 21
Variations 21
Breaking Out of Loops Early 21
Parallel Break 21
Parallel Stop 23
External Loop Cancellation 24
Exception Handling 26
Special Handling of Small Loop Bodies 26
Controlling the Degree of Parallelism 28
Using Task-Local State in a Loop Body 29
Using a Custom Task Scheduler
For a Parallel Loop 31
Anti-Patterns 32
Step Size Other than One 32
Hidden Loop Body Dependencies 32
Small Loop Bodies with Few Iterations 32
Processor Oversubscription
And Undersubscription 33
Mixing the Parallel Class and PLINQ 33
Duplicates in the Input Enumeration 34
Design Notes 34
Adaptive Partitioning 34
Adaptive Concurrency 34
Support for Nested Loops and Server Applications 35
Related Patterns 35
Exercises 35
Further Reading 37

3 Parallel Tasks 39
The Basics 40
An Example 41

Variations 43
Canceling a Task 43
Handling Exceptions 44
Ways to Observe an Unhandled Task Exception 45
Aggregate Exceptions 45
The Handle Method 46
The Flatten Method 47
Waiting for the First Task to Complete 48
Speculative Execution 49
Creating Tasks with Custom Scheduling 50
Anti-Patterns 51
Variables Captured by Closures 51
Disposing a Resource Needed by a Task 52
Avoid Thread Abort 53
Design Notes 53
Tasks and Threads 53
Task Life Cycle 53
Writing a Custom Task Scheduler 54
Unobserved Task Exceptions 55
Relationship Between Data Parallelism
and Task Parallelism 56
The Default Task Scheduler 56
The Thread Pool 57
Decentralized Scheduling Techniques 58
Work Stealing 59
Top-Level Tasks in the Global Queue 60
Subtasks in a Local Queue 60
Inlined Execution of Subtasks 60
Thread Injection 61
Bypassing the Thread Pool 63
Exercises 64
Further Reading 65

4 Parallel Aggregation 67
The Basics 68
An Example 69
Variations 73
Using Parallel Loops for Aggregation 73
Using a Range Partitioner for Aggregation 76
Using PLINQ Aggregation with Range Selection 77
Design Notes 80
Related Patterns 82
Exercises 82
Further Reading 83

5 Futures 85
The Basics 86
Futures 86
Continuation Tasks 88
Example: The Adatum Financial Dashboard 89
The Business Objects 91
The Analysis Engine 92
Loading External Data 95
Merging 95
Normalizing 96
Analysis and Model Creation 96
Processing Historical Data 96
Comparing Models 96
View And View Model 97
Variations 97
Canceling Futures and Continuation Tasks 97
Continue When “At Least One” Antecedent Completes 97
Using .NET Asynchronous Calls with Futures 97
Removing Bottlenecks 98
Modifying the Graph at Run Time 98
Design Notes 99
Decomposition into Futures
And Continuation Tasks 99
Functional Style 99
Related Patterns 100
Pipeline Pattern 100
Master/Worker Pattern 100
Dynamic Task Parallelism Pattern 100
Discrete Event Pattern 100
Exercises 101
Further Reading 101

6 Dynamic Task Parallelism 103


The Basics 103
An Example 105
Variations 107
Parallel While-Not-Empty 107
Task Chaining with Parent/Child Tasks 108
Design Notes 109
Exercises 110
Further Reading 110

7 Pipelines 113
The Basics 113
An Example 117
Sequential Image Processing 117
The Image Pipeline 119
Performance Characteristics 120
Variations 122
Canceling a Pipeline 122
Handling Pipeline Exceptions 124
Load Balancing Using Multiple Producers 126
Pipelines and Streams 129
Asynchronous Pipelines 129
Anti-Patterns 129
Thread Starvation 129
Infinite Blocking Collection Waits 130
Forgetting GetConsumingEnumerable() 130
Using Other Producer/Consumer
Collections 130
Design Notes 131
Related Patterns 131
Exercises 132
Further Reading 132
Appendices
a Adapting Object-Oriented Patterns 133
Structural Patterns 133
Façade 134
Example 134
Guidelines 134
Decorators 134
Example 135
Guidelines 136
Adapters 136
Example 137
Guidelines 138
Repositories And Parallel Data Access 138
Example 139
Guidelines 139
Singletons and Service Locators 139
Implementing a Singleton with the Lazy<T> Class 140
Notes 141
Guidelines 141

Model-View-ViewModel 142
Example 143
The Dashboard’s User Interface 144
Guidelines 147
Immutable Types 148
Example 149
Immutable Types as Value Types 150
Compound Values 152
Guidelines 152
Shared Data Classes 153
Guidelines 153
Iterators 154
Example 154
Lists and Enumerables 155
Further Reading 156
Structural Patterns 156
Singleton 156
Model-View-ViewModel 157
Immutable Types 158

b Debugging and Profiling Parallel Applications 159
The Parallel Tasks and Parallel Stacks Windows 159
The Concurrency Visualizer 162
Visual Patterns 167
Oversubscription 167
Lock Contention and Serialization 168
Load Imbalance 169
Further Reading 172

c Technology Overview 173


Further Reading 175

Glossary 177

References 187
Other Online Sources 189

Index 191
Foreword

At its inception some 40 or so years ago, parallel computing was the
province of experts who applied it to exotic fields, such as high en-
ergy physics, and to engineering applications, such as computational
fluid dynamics. We’ve come a long way since those early days.
This change is being driven by hardware trends. The days of per-
petually increasing processor clock speeds are now at an end. Instead,
the increased chip densities that Moore’s Law predicts are being used
to create multicore processors, or single chips with multiple processor
cores. Quad-core processors are now common, and this trend will
continue, with tens of cores available on the hardware in the not-too-
distant future.
In the last five years, Microsoft has taken advantage of this tech-
nological shift to create a variety of parallel implementations. These
include the Windows High Performance Computing (HPC) technology
for message-passing interface (MPI) programs, Dryad, which offers a
Map-Reduce style of parallel data processing, the Windows Azure
platform, which can supply compute cores on demand, the Parallel
Patterns Library (PPL) for native code, and the parallel extensions of
the .NET Framework 4.
Multicore computation affects the whole spectrum of applica-
tions, from complex scientific and design problems to consumer
applications and new human/computer interfaces. We used to joke
that “parallel computing is the future, and always will be,” but the
pessimists have been proven wrong. Parallel computing has at last
moved from being a niche technology to being center stage for both
application developers and the IT industry.
But, there is a catch. To obtain any speed-up of an application,
programmers now have to divide the computational work to make
efficient use of the power of multicore processors, a skill that still
belongs to experts. Parallel programming presents a massive challenge
for the majority of developers, many of whom are encountering it for
the first time. There is an urgent need to educate them in practical
ways so that they can incorporate parallelism into their applications.
Two possible approaches are popular with some of my computer
science colleagues: either design a new parallel programming language
or develop a “heroic” parallelizing compiler. While both are certainly
interesting academically, neither has had much success in popularizing
and simplifying the task of parallel programming for non-experts. In
contrast, a more pragmatic approach is to provide programmers with
a library that hides much of parallel programming’s complexity and to
teach programmers how to use it.
To that end, the Microsoft .NET Framework parallel extensions
present a higher-level programming model than earlier APIs. Program-
mers can, for example, think in terms of tasks rather than threads and
can avoid the complexities of managing threads. Parallel Programming
with Microsoft .NET teaches programmers how to use these libraries
by putting them in the context of design patterns. As a result, applica-
tion developers can quickly learn to write parallel programs and gain
immediate performance benefits.
I believe that this book, with its emphasis on parallel design pat-
terns and an up-to-date programming model, represents an important
first step in moving parallel programming into the mainstream.

Tony Hey
Corporate Vice President, Microsoft Research
Preface

This book describes patterns for parallel programming, with code
examples, that use the new parallel programming support in the
Microsoft® .NET Framework 4. This support is commonly referred to
as the Parallel Extensions. You can use the patterns described in this
book to improve your application’s performance on multicore com-
puters. Adopting the patterns in your code makes your application run
faster today and also helps prepare for future hardware environments,
which are expected to have an increasingly parallel computing
architecture.

Who This Book Is For


The book is intended for programmers who write managed code for
the .NET Framework on the Microsoft Windows® operating system.
This includes programmers who write in Microsoft Visual C#®
development tool, Microsoft Visual Basic® development system, and
Microsoft Visual F#. No prior knowledge of parallel programming
techniques is assumed. However, readers need to be familiar with
features of C# such as delegates, lambda expressions, generic types,
and Language Integrated Query (LINQ) expressions. Readers should
also have at least a basic familiarity with the concepts of processes
and threads of execution.
Note: The examples in this book are written in C# and use the
features of the .NET Framework 4, including the Task Parallel
Library (TPL) and Parallel LINQ (PLINQ). However, you can use
the concepts presented here with other frameworks and libraries
and with other languages.
Complete code solutions are posted on CodePlex. See
http://parallelpatterns.codeplex.com/. There is a C# version
for every example. In addition to the C# example code, there
are also versions of the examples in Visual Basic and F#.


Why This Book Is Pertinent Now


The advanced parallel programming features that are delivered with
Visual Studio® 2010 development system make it easier than ever to
get started with parallel programming.
The Task Parallel Library (TPL) is for .NET programmers who
want to write parallel programs. It simplifies the process of adding
parallelism and concurrency to applications. The TPL dynamically
scales the degree of parallelism to most efficiently use all the proces-
sors that are available. In addition, the TPL assists in the partitioning
of work and the scheduling of tasks in the .NET thread pool. The
library provides cancellation support, state management, and other
services.
Parallel LINQ (PLINQ) is a parallel implementation of LINQ to
Objects. PLINQ implements the full set of LINQ standard query
operators as extension methods for the System.Linq namespace and
has additional operators for parallel operations. PLINQ is a declara-
tive, high-level interface with query capabilities for operations such as
filtering, projection, and aggregation.
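For instance, a query written with LINQ to Objects can often be opted into
PLINQ simply by adding a call to AsParallel. The fragment below is a minimal
sketch rather than one of the book's samples; the data and the filtering
condition are arbitrary.

using System.Linq;

static class PlinqSketch
{
    static double[] FilterAndScale(double[] values)
    {
        // AsParallel opts the rest of the query into PLINQ, so the
        // filtering and projection below can run on multiple cores.
        return values.AsParallel()
                     .Where(v => v > 0.0)
                     .Select(v => v * 2.0)
                     .ToArray();
    }
}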
Visual Studio 2010 includes tools for debugging parallel applica-
tions. The Parallel Stacks window shows call stack information for
all the threads in your application. It lets you navigate between
threads and stack frames on those threads. The Parallel Tasks window
resembles the Threads window, except that it shows information
about each task instead of each thread. The Concurrency Visualizer
views in the Visual Studio profiler enable you to see how your applica-
tion interacts with the hardware, the operating system, and other
processes on the computer. You can use the Concurrency Visualizer
to locate performance bottlenecks, processor underutilization, thread
contention, cross-core thread migration, synchronization delays, areas
of overlapped I/O, and other information.
For a complete overview of the parallel technologies available
from Microsoft, see Appendix C, “Technology Overview.”

What You Need to Use the Code


The code that is used as examples in this book is at
http://parallelpatterns.codeplex.com/. These are the system requirements:
• Microsoft Windows Vista® SP1, Windows 7, Microsoft
Windows Server® 2008, or Windows XP SP3 (32-bit or 64-bit)
operating system
• Microsoft Visual Studio 2010 (Ultimate or Premium edition
is required for the Concurrency Visualizer, which allows
you to analyze the performance of your application); this
includes the .NET Framework 4, which is required to run
the samples

How to Use This Book


This book presents parallel programming techniques in terms of
particular patterns. Figure 1 shows the different patterns and their
relationships to each other. The numbers refer to the chapters in this
book where the patterns are described.

[Figure 1: Parallel programming patterns. Chapter 1, "Introduction," leads to
two branches, data parallelism and task parallelism. Coordinated by control
flow only: Parallel Loops (Chapter 2) and Parallel Tasks (Chapter 3).
Coordinated by control flow and data flow: Parallel Aggregation (Chapter 4),
Futures (Chapter 5), Pipelines (Chapter 7), and Dynamic Task Parallelism
(Chapter 6).]

After the introduction, the book has one branch that discusses data
parallelism and another that discusses task parallelism.
Both parallel loops and parallel tasks use only the program’s
control flow as the means to coordinate and order tasks. The other
patterns use both control flow and data flow for coordination.
Control flow refers to the steps of an algorithm. Data flow refers to
the availability of inputs and outputs.

introduction
Chapter 1 introduces the common problems faced by developers
who want to use parallelism to make their applications run faster. It
explains basic concepts and prepares you for the remaining chapters.
There is a table in the “Design Approaches” section of Chapter 1 that
can help you select the right patterns for your application.

parallelism with control dependencies only


Chapters 2 and 3 deal with cases where asynchronous operations are
ordered only by control flow constraints:
• Chapter 2, “Parallel Loops.” Use parallel loops when you want
to perform the same calculation on each member of a collection
or for a range of indices, and where there are no dependencies
between the members of the collection. For loops with depen-
dencies, see Chapter 4, “Parallel Aggregation.”
• Chapter 3, “Parallel Tasks.” Use parallel tasks when you have
several distinct asynchronous operations to perform. This chap-
ter explains why tasks and threads serve two distinct purposes.

parallelism with control and data dependencies
Chapters 4 and 5 show patterns for concurrent operations that are
constrained by both control flow and data flow:
• Chapter 4, “Parallel Aggregation.” Patterns for parallel aggre-
gation are appropriate when the body of a parallel loop includes
data dependencies, such as when calculating a sum or searching
a collection for a maximum value.
• Chapter 5, “Futures.” The Futures pattern occurs when opera-
tions produce some outputs that are needed as inputs to other
operations. The order of operations is constrained by a directed
graph of data dependencies. Some operations are performed in
parallel and some serially, depending on when inputs become
available.

dynamic task parallelism and pipelines


Chapters 6 and 7 discuss some more advanced scenarios:
• Chapter 6, “Dynamic Task Parallelism.” In some cases,
operations are dynamically added to the backlog of work
as the computation proceeds. This pattern applies to several
domains, including graph algorithms and sorting.
• Chapter 7, “Pipelines.” Use pipelines to feed successive
outputs of one component to the input queue of another
component, in the style of an assembly line. Parallelism
results when the pipeline fills, and when more than one
component is simultaneously active.

supporting material
In addition to the patterns, there are several appendices:
• Appendix A, “Adapting Object-Oriented Patterns.”
This appendix gives tips for adapting some of the common
object-oriented patterns, such as facades, decorators, and
repositories, to multicore architectures.
• Appendix B, “Debugging and Profiling Parallel Applications.”
This appendix gives you an overview of how to debug and
profile parallel applications in Visual Studio 2010.
• Appendix C, "Technology Overview." This appendix describes
the various Microsoft technologies and frameworks for parallel
programming.
• Glossary. The glossary contains definitions of the terms used
in this book.
• References. The references cite the works mentioned in this
book.
Everyone should read Chapters 1, 2, and 3 for an introduction and
overview of the basic principles. Although the succeeding material is
presented in a logical order, each chapter, from Chapter 4 on, can be
read independently.
Callouts in a distinctive style, such as the one shown in the margin,
alert you to things you should watch out for.
[Margin note: Don't apply the patterns in this book blindly to your
applications.]
It’s very tempting to take a new tool or technology and try and
use it to solve whatever problem is confronting you, regardless of the
tool’s applicability. As the saying goes, “when all you have is a hammer,
everything looks like a nail.” The “everything’s a nail” mentality can
lead to very unfortunate results, which one hopes the bunny in Figure
2 will be able to avoid.
You also want to avoid unfortunate results in your parallel pro-
grams. Adding parallelism to your application costs time and adds
complexity. For good results, you should only parallelize the parts of
your application where the benefits outweigh the costs.

figure 2
“When all you have is a hammer, everything looks like a nail.”

What Is Not Covered


This book focuses more on processor-bound workloads than on
I/O-bound workloads. The goal is to make computationally intensive
applications run faster by making better use of the computer’s avail-
able cores. As a result, the book does not focus as much on the issue
of I/O latency. Nonetheless, there is some discussion of balanced
workloads that are both processor intensive and have large amounts
of I/O (see Chapter 7, “Pipelines”). There is also an important example
for user interfaces in Chapter 5, “Futures,” that illustrates concurrency
for tasks with I/O.
The book describes parallelism within a single multicore node
with shared memory instead of the cluster, High Performance
Computing (HPC) Server approach that uses networked nodes with
distributed memory. However, cluster programmers who want to take
advantage of parallelism within a node may find the examples in
this book helpful, because each node of a cluster can have multiple
processing units.

Goals
After reading this book, you should be able to:
• Answer the questions at the end of each chapter.
• Figure out if your application fits one of the book’s patterns
and, if it does, know if there’s a good chance of implementing
a straightforward parallel implementation.
• Understand when your application doesn’t fit one of these
patterns. At that point, you either have to do more reading
and research, or enlist the help of an expert.
• Have an idea of the likely causes, such as conflicting
dependencies or erroneously sharing data between tasks,
if your implementation of a pattern doesn’t work.
• Use the “Further Reading” sections to find more material.
Acknowledgments

Writing a technical book is a communal effort. The patterns & prac-
tices group always involves both experts and the broader community
in its projects. Although this makes the writing process lengthier and
more complex, the end result is always more relevant. The authors
drove this book’s direction and developed its content, but they want
to acknowledge the other people who contributed in various ways.
The following subject matter experts were key contributors:
Nicholas Chen, Daniel Dig, Munawar Hafiz, Fredrik Berg Kjolstad and
Samira Tasharofi (University of Illinois at Urbana Champaign), Reed
Copsey, Jr. (C Tech Development Corporation), and Daan Leijen
(Microsoft Research). Judith Bishop (Microsoft Research) reviewed
the text and also gave us her valuable perspective as an author. Our
schedule was aggressive, but the reviewers worked extra hard to help
us meet it. Thank you.
Jon Jacky (Modeled Computation LLC) created many of the
programming samples and contributed to the text. Rick Carr (DCB
Software Testing, Inc) tested the samples and content.
Many other people reviewed sections of the book or gave us
feedback on early outlines and drafts. They include Chris Tavares,
Niklas Gustafson, Dana Groff, Wenming Ye, and David Callahan
(Microsoft), Justin Bozonier (MG-ALFA / Milliman, Inc.), Tim Mattson
(Intel), Kurt Keutzer (UC Berkeley), Joe Hummel, Ian Griffiths and
Mike Woodring (Pluralsight, LLC).
There were a great many people who spoke to us about the book
and provided feedback. They include the attendees at the ParaPLoP
2010 workshop and TechEd 2010 conference, as well as contributors
to discussions on the book’s CodePlex site. The work at UC Berkeley
and University of Illinois at Urbana Champaign was supported in part
by the Universal Parallel Computing Research Center initiative.
Tiberiu Covaci (Many-core.se) also deserves special mention for
generating interest in the book during his numerous speaking engage-
ments on “Patterns for Parallel Programming” in the U.S. and Europe.


A team of technical writers and editors worked to make the prose
readable and interesting. They include Roberta Leibovitz (Modeled
Computation LLC), Tina Burden (TinaTech Inc.), and RoAnn Corbisier
(Microsoft).
The innovative visual design concept used for this guide was
developed by Roberta Leibovitz and Colin Campbell (Modeled
Computation LLC) who worked with a group of talented designers
and illustrators. The book design was created by John Hubbard (Eson).
The cartoons that face the chapters were drawn by the award-winning
Seattle-based cartoonist Ellen Forney. The technical illustrations were
done by Katie Niemer (TinaTech Inc.).
1 Introduction

The CPU meter shows the problem. One core is running at 100 per-
cent, but all the other cores are idle. Your application is CPU-bound,
but you are using only a fraction of the computing power of your
multicore system. What next?
[Margin note: Parallel programming uses multiple cores at the same time to
improve your application's speed.]
The answer, in a nutshell, is parallel programming. Where you once
would have written the kind of sequential code that is familiar to all
programmers, you now find that this no longer meets your perfor-
mance goals. To use your system's CPU resources efficiently, you need
to split your application into pieces that can run at the same time.
This is easier said than done. Parallel programming has a
reputation for being the domain of experts and a minefield of subtle,
hard-to-reproduce software defects. Everyone seems to have a favor-
ite story about a parallel program that did not behave as expected
because of a mysterious bug.
[Margin note: Writing parallel programs has the reputation of being hard,
but help has arrived.]
These stories should inspire a healthy respect for the difficulty
of the problems you face in writing your own parallel programs.
Fortunately, help has arrived. The Microsoft® .NET Framework 4 in-
troduces a new programming model for parallelism that significantly
simplifies the job. Behind the scenes are supporting libraries with
sophisticated algorithms that dynamically distribute computations on
multicore architectures. In addition, Microsoft Visual Studio® 2010
development system includes debugging and analysis tools to support
the new parallel programming model.
Proven design patterns are another source of help. This guide
introduces you to the most important and frequently used patterns
of parallel programming and gives executable code samples for them,
using the Task Parallel Library (TPL) and Parallel LINQ (PLINQ). When
thinking about where to begin, a good place to start is to review the
patterns in this book. See if your problem has any attributes that
match the six patterns presented in the following chapters. If it does,
delve more deeply into the relevant pattern or patterns and study the
sample code.

Most parallel programs conform to these patterns, and it's
very likely you’ll be successful in finding a match to your particular
problem. If you can’t use these patterns, you’ve probably encountered
one of the more difficult cases, and you’ll need to hire an expert or
consult the academic literature.
The code examples for this guide are online at
http://parallelpatterns.codeplex.com.

The Importance of Potential Parallelism


[Margin note: Declaring the potential parallelism of your program allows the
execution environment to run it on all available cores, whether one or many.]
The patterns in this book are ways to express potential parallelism. This
means that your program is written so that it runs faster when parallel
hardware is available and roughly the same as an equivalent sequential
program when it's not. If you correctly structure your code, the
run-time environment can automatically adapt to the workload on a
particular computer. This is why the patterns in this book only express
potential parallelism. They do not guarantee parallel execution in
every situation. Expressing potential parallelism is a central organizing
principle behind the programming model of .NET. It deserves some
explanation.
Some parallel applications can be written for specific hardware.
For example, creators of programs for a console gaming platform have
detailed knowledge about the hardware resources that will be
available at run time. They know the number of cores and the details
of the memory architecture in advance. The game can be written to
exploit the exact level of parallelism provided by the platform. Com-
plete knowledge of the hardware environment is also a characteristic
of some embedded applications, such as industrial control. The life
cycle of such programs matches the life cycle of the specific hardware
they were designed to use.
[Margin note: Don't hard code the degree of parallelism in an application.
You can't always predict how many cores will be available at run time.]
In contrast, when you write programs that run on general-purpose
computing platforms, such as desktop workstations and servers, there
is less predictability about the hardware features. You may not always
know how many cores will be available. You also may be unable to
predict what other software could be running at the same time as
your application.
Even if you initially know your application’s environment, it can
change over time. In the past, programmers assumed that their
applications would automatically run faster on later generations of
hardware. You could rely on this assumption because processor clock
speeds kept increasing. With multicore processors, clock speeds are
not increasing with newer hardware as much as in the past. Instead,
the trend in processor design is toward more cores. If you want your
application to benefit from hardware advances in the multicore world,
you need to adapt your programming model. You should expect that
the programs you write today will run on computers with many more
cores within a few years. Focusing on potential parallelism helps to
“future proof” your program.
[Margin note: Hardware trends predict more cores instead of faster clock
speeds.]
Finally, you must plan for these contingencies in a way that does
not penalize users who might not have access to the latest hardware.
You want your parallel application to run as fast on a single-core com-
puter as an application that was written using only sequential code. In
other words, you want scalable performance from one to many cores.
Allowing your application to adapt to varying hardware capabilities,
both now and in the future, is the motivation for potential parallelism.
[Margin note: A well-written parallel program runs at approximately the same
speed as a sequential program when there is only one core available.]
An example of potential parallelism is the parallel loop pattern
described in Chapter 2, "Parallel Loops." If you have a for loop that
performs a million independent iterations, it makes sense to divide
those iterations among the available cores and do the work in parallel.
It’s easy to see that how you divide the work should depend on the
number of cores. For many common scenarios, the speed of the loop
will be approximately proportional to the number of cores.
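As a rough sketch of that idea (the Parallel Loop pattern itself, and the
Parallel.For method, are covered in Chapter 2), such a loop might look like
the following; DoIndependentWork is a hypothetical placeholder for an
iteration body that shares no state with other iterations.

using System.Threading.Tasks;

static class PotentialParallelismSketch
{
    static void DoIndependentWork(int i)
    {
        // Placeholder for work that depends only on the index i.
    }

    static void Run()
    {
        const int n = 1000000;

        // The run-time environment decides how many cores to use;
        // the code does not hard code the degree of parallelism.
        Parallel.For(0, n, i => DoIndependentWork(i));
    }
}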

Decomposition, Coordination,
and Scalable Sharing
The patterns in this book contain some common themes. You’ll see
that the process of designing and implementing a parallel application
involves three aspects: methods for decomposing the work into dis-
crete units known as tasks, ways of coordinating these tasks as they
run in parallel, and scalable techniques for sharing the data needed to
perform the tasks.
The patterns described in this guide are design patterns. You can
apply them when you design and implement your algorithms and
when you think about the overall structure of your application.
Although the example applications are small, the principles they dem-
onstrate apply equally well to the architectures of large applications.

understanding tasks
Tasks are sequential operations that work together to perform a
larger operation. When you think about how to structure a parallel
program, it’s important to identify tasks at a level of granularity that
results in efficient use of hardware resources. If the chosen granular-
ity is too fine, the overhead of managing tasks will dominate. If it’s too
coarse, opportunities for parallelism may be lost because cores that
could otherwise be used remain idle. In general, tasks should be
as large as possible, but they should remain independent of each
other, and there should be enough tasks to keep the cores busy. You
may also need to consider the heuristics that will be used for task
scheduling. Meeting all these goals sometimes involves design
tradeoffs. Decomposing a problem into tasks requires a good under-
standing of the algorithmic and structural aspects of your application.
[Margin note: Tasks are sequential units of work. Tasks should be large,
independent, and numerous enough to keep all cores busy.]
An example of these guidelines is a parallel ray tracing application.
A ray tracer constructs a synthetic image by simulating the path of
each ray of light in a scene. The individual ray simulations are a good
level of granularity for parallelism. Breaking the tasks into smaller
units, for example, by trying to decompose the ray simulation itself
into independent tasks, only adds overhead, because the number of
ray simulations is already large enough to keep all cores occupied. If
your tasks vary greatly in size, you generally want more of them in
order to fill in the gaps.
[Margin note: Keep in mind that tasks are not threads. Tasks and threads take
very different approaches to scheduling. Tasks are much more compatible with
the concept of potential parallelism than threads are. While a new thread
immediately introduces additional concurrency to your application, a new task
introduces only the potential for additional concurrency. A task's potential
for additional concurrency will be realized only when there are enough
available cores.]
Another advantage to grouping work into larger and fewer tasks
is that such tasks are often more independent of each other than
smaller but more numerous tasks. Larger tasks are less likely than
smaller tasks to share local variables or fields. Unfortunately, in
applications that rely on large mutable object graphs, such as applica-
tions that expose a large object model with many public classes,
methods, and properties, the opposite may be true. In these cases, the
larger the task, the more chance there is for unexpected sharing of
data or other side effects.
The overall goal is to decompose the problem into independent
tasks that do not share data, while providing sufficient tasks to
occupy the number of cores available. When considering the number
of cores, you should take into account that future generations of
hardware will have more cores.

coordinating tasks
It’s often possible that more than one task can run at the same time.
Tasks that are independent of one another can run in parallel, while
some tasks can begin only after other tasks complete. The order of
execution and the degree of parallelism are constrained by the appli-
cation’s underlying algorithms. Constraints can arise from control
flow (the steps of the algorithm) or data flow (the availability of inputs
and outputs).
Various mechanisms for coordinating tasks are possible. The way
tasks are coordinated depends on which parallel pattern you use. For
example, the pipeline pattern described in Chapter 7, “Pipelines,” is
distinguished by its use of concurrent queues to coordinate tasks.
Regardless of the mechanism you choose for coordinating tasks, in
order to have a successful design, you must understand the dependen-
cies between tasks.

scalable sharing of data


Tasks often need to share data. The problem is that when a program
is running in parallel, different parts of the program may be racing
against each other to perform updates on the same location of
memory. The result of such unintended data races can be catastroph-
ic. The solution to the problem of data races includes techniques for
synchronizing threads.
You may already be familiar with techniques that synchronize
concurrent threads by blocking their execution in certain circum-
stances. Examples include locks, atomic compare-and-swap opera-
tions, and semaphores. All of these techniques have the effect of
serializing access to shared resources. Although your first impulse for
data sharing might be to add locks or other kinds of synchronization,
adding synchronization reduces the parallelism of your application.
Every form of synchronization is a form of serialization. Your tasks
can end up contending over the locks instead of doing the work you
want them to do. Programming with locks is also error-prone.
Fortunately, there are a number of techniques that allow data to
be shared that don't degrade performance or make your program
prone to error. These techniques include the use of immutable, read-
only data, limiting your program's reliance on shared variables, and
introducing new steps in your algorithm that merge local versions of
mutable state at appropriate checkpoints. Techniques for scalable
sharing may involve changes to an existing algorithm.
[Margin note: For more about the importance of immutable types in parallel
programs, see the section, "Immutable Types," in Appendix A.]
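As a minimal sketch of that last technique, the loop below (which uses the
overload of Parallel.For that carries task-local state, discussed further in
Chapters 2 and 4) sums an array by accumulating per-task subtotals and takes
a lock only at the final merge step; the array argument and the method name
are illustrative.

using System.Threading.Tasks;

static class LocalStateSketch
{
    static double SumInParallel(double[] data)
    {
        double total = 0;
        object mergeLock = new object();

        Parallel.For(0, data.Length,
            () => 0.0,                   // localInit: each task starts a subtotal
            (i, loopState, subtotal) =>  // body: writes only to its own subtotal
            {
                return subtotal + data[i];
            },
            subtotal =>                  // localFinally: the merge checkpoint
            {
                lock (mergeLock) { total += subtotal; }
            });

        return total;
    }
}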
[Margin note: Scalable sharing may involve changes to your algorithm.]
Conventional object-oriented designs can have complex and
highly interconnected in-memory graphs of object references. As a
result, traditional object-oriented programming styles can be very
difficult to adapt to scalable parallel execution. Your first impulse
might be to consider all fields of a large, interconnected object graph
as mutable shared state, and to wrap access to these fields in serial-
izing locks whenever there is the possibility that they may be shared
by multiple tasks. Unfortunately, this is not a scalable approach to
sharing. Locks can often negatively affect the performance of all
cores. Locks force cores to pause and communicate, which takes time,
and they introduce serial regions in the code, which reduces the
potential for parallelism. As the number of cores gets larger, the cost
of lock contention can increase. As more and more tasks are added
that share the same data, the overhead associated with locks can
dominate the computation.
[Margin note: Adding synchronization (locks) can reduce the scalability of
your application.]
In addition to performance problems, programs that rely on com-
plex synchronization are prone to a variety of problems, including
deadlock. This occurs when two or more tasks are waiting for each
other to release a lock. Most of the horror stories about parallel
programming are actually about the incorrect use of shared mutable
state or locking protocols.
Nonetheless, synchronizing elements in an object graph plays a
legitimate, if limited, role in scalable parallel programs. This book uses
synchronization sparingly. You should, too. Locks can be thought of
as the goto statements of parallel programming: they are error prone
but necessary in certain situations, and they are best left, when
possible, to compilers and libraries.
No one is advocating the removal, in the name of performance, of
synchronization that’s necessary for correctness. First and foremost,
the code still needs to be correct. However, it’s important to incorpo-
rate design principles into the design process that limit the need for
synchronization. Don’t add synchronization to your application as an
afterthought.

design approaches
It’s common for developers to identify one problem area, parallelize
the code to improve performance, and then repeat the process for the
next bottleneck. This is a particularly tempting approach when you
parallelize an existing sequential application. Although this may give
you some initial improvements in performance, it has many pitfalls,
and it may not produce the best results. A far better approach is to
understand your problem or application and look for potential
parallelism across the entire application as a whole. What you dis-
cover may lead you to adopt a different architecture or algorithm that
better exposes the areas of potential parallelism in your application.
Don't simply identify bottlenecks and parallelize them. Instead, pre-
pare your program for parallel execution by making structural changes.
[Margin note: Think in terms of data structures and algorithms; don't just
identify bottlenecks.]
Techniques for decomposition, coordination, and scalable sharing
are interrelated. There’s a circular dependency. You need to consider
all of these aspects together when choosing your approach for a
particular application.
After reading the preceding description, you might complain that
it all seems vague. How specifically do you divide your problem into
tasks? Exactly what kinds of coordination techniques should you use?
[Margin note: Use patterns.]
Questions like these are best answered by the patterns described
in this book. Patterns are a true shortcut to understanding. As you
begin to see the design motivations behind the patterns, you will also
develop your intuition about how the patterns and their variations can
be applied to your own applications. The following section gives more
details about how to select the right pattern.

Selecting the Right Pattern


To select the relevant pattern, use the following table.

Application characteristic / Relevant pattern

• Do you have sequential loops where there's no communication among the
  steps of each iteration?
  The Parallel Loop pattern (Chapter 2). Parallel loops apply an independent
  operation to multiple inputs simultaneously.

• Do you have distinct operations with well-defined control dependencies?
  Are these operations largely free of serializing dependencies?
  The Parallel Task pattern (Chapter 3). Parallel tasks allow you to establish
  parallel control flow in the style of fork and join.

• Do you need to summarize data by applying some kind of combination
  operator? Do you have loops with steps that are not fully independent?
  The Parallel Aggregation pattern (Chapter 4). Parallel aggregation introduces
  special steps in the algorithm for merging partial results. This pattern
  expresses a reduction operation and includes map/reduce as one of its
  variations.

• Does the ordering of steps in your algorithm depend on data flow
  constraints?
  The Futures pattern (Chapter 5). Futures make the data flow dependencies
  between tasks explicit. This pattern is also referred to as the Task Graph
  pattern.

• Does your algorithm divide the problem domain dynamically during the run?
  Do you operate on recursive data structures such as graphs?
  The Dynamic Task Parallelism pattern (Chapter 6). This pattern takes a
  divide-and-conquer approach and spawns new tasks on demand.

• Does your application perform a sequence of operations repetitively? Does
  the input data have streaming characteristics? Does the order of processing
  matter?
  The Pipelines pattern (Chapter 7). Pipelines consist of components that are
  connected by queues, in the style of producers and consumers. All the
  components run in parallel even though the order of inputs is respected.

One way to become familiar with the possibilities of the six patterns
is to read the first page or two of each chapter. This gives you an
overview of approaches that have been proven to work in a wide va-
riety of applications. Then go back and more deeply explore patterns
that may apply in your situation.

A Word About Terminology


You’ll often hear the words parallelism and concurrency used as syn-
onyms. This book makes a distinction between the two terms.
Concurrency is a concept related to multitasking and asynchro-
nous input-output (I/O). It usually refers to the existence of multiple
threads of execution that may each get a slice of time to execute
before being preempted by another thread, which also gets a slice of
time. Concurrency is necessary in order for a program to react to
external stimuli such as user input, devices, and sensors. Operating
systems and games, by their very nature, are concurrent, even on
one core.
With parallelism, concurrent threads execute at the same time on
multiple cores. Parallel programming focuses on improving the perfor-
mance of applications that use a lot of processor power and are not
constantly interrupted when multiple cores are available.
The goals of concurrency and parallelism are distinct. The main
goal of concurrency is to reduce latency by never allowing long peri-
ods of time to go by without at least some computation being
performed by each unblocked thread. In other words, the goal of
concurrency is to prevent thread starvation.
Concurrency is required operationally. For example, an operating
system with a graphical user interface must support concurrency if
more than one window at a time can update its display area on a sin-
gle-core computer. Parallelism, on the other hand, is only about
throughput. It’s an optimization, not a functional requirement. Its goal
is to maximize processor usage across all available cores; to do this, it
uses scheduling algorithms that are not preemptive, such as algorithms
that process queues or stacks of work to be done.

The Limits of Parallelism


A theoretical result known as Amdahl’s law says that the amount of
performance improvement that parallelism provides is limited by the
amount of sequential processing in your application. This may, at first,
seem counterintuitive.
Amdahl’s law says that no matter how many cores you have, the
maximum speedup you can ever achieve is (1 / percent of time spent
in sequential processing). Figure 1 illustrates this.
[Figure 1: Amdahl's law for an application with 25 percent sequential
processing. The graph plots execution speed (0 to 4) against the number of
processors (0 to 16).]
For example, with 11 processors, the application runs slightly more
than three times faster than it would if it were entirely sequential.
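The arithmetic behind these numbers can be checked with a few lines of code.
The following sketch evaluates Amdahl's formula, speedup = 1 / (s + (1 - s) / n),
where s is the fraction of time spent in sequential processing and n is the
number of processors; the 0.25 value comes from the example above.

using System;

static class AmdahlSketch
{
    // Amdahl's law: the best possible speedup on n processors for a
    // program whose sequential fraction is s.
    static double Speedup(double s, int n)
    {
        return 1.0 / (s + (1.0 - s) / n);
    }

    static void Main()
    {
        // 25 percent sequential processing, 11 processors: about 3.14.
        Console.WriteLine(Speedup(0.25, 11));

        // As n grows, the speedup approaches 1 / 0.25 = 4.
        Console.WriteLine(Speedup(0.25, 1000000));
    }
}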
Even with fewer cores, you can see that the expected speedup is
not linear. Figure 2 illustrates this.
[Figure 2: Per-core performance improvement for a 25 percent sequential
application. The graph plots speedup against the number of cores (1 to 5),
showing the parallel and sequential percentages of the elapsed time for each
core count.]

Figure 2 shows that as the number of cores (and overall application
speed) increases, the percentage of time spent in the sequential part
of the application increases. (The elapsed time spent in sequential
processing is constant.) The illustration also shows why you might be
satisfied with a 2x speedup on a four-core computer for actual ap-
plications, as opposed to sample programs. The important question is
always how scalable the application is. Scalability depends on the
amount of time spent doing work that is inherently sequential in na-
ture.
Another implication of Amdahl’s law is that for some problems,
you may want to create additional features in the parts of an applica-
tion that are amenable to parallel execution. For example, a developer
of a computer game might find that it’s possible to make increasingly
sophisticated graphics for newer multicore computers by using the
parallel hardware, even if it’s not as feasible to make the game logic
(the artificial intelligence engine) run in parallel. Performance can in-
fluence the mix of application features.
The speedup you can achieve in practice is usually somewhat
worse than Amdahl’s law would predict. As the number of cores
increases, the overhead incurred by accessing shared memory also
increases. Also, parallel algorithms may include overhead for coordina-
tion that would not be necessary for the sequential case. Profiling
tools, such as the Visual Studio Concurrency Visualizer, can help you
understand how effective your use of parallelism is.
In summary, because an application consists of parts that must
run sequentially as well as parts that can run in parallel, the application
overall will rarely see a linear increase in performance with a linear
increase in the number of cores, even if certain parts of the applica-
tion see a near linear speedup. Understanding the structure of your
application, and its algorithms—that is, which parts of your applica-
tion are suitable for parallel execution—is a step that can’t be skipped
when analyzing performance.

A Few Tips
Always try for the simplest approach. Here are some basic precepts:
• Whenever possible, stay at the highest possible level of abstrac-
tion and use constructs or a library that does the parallel work
for you.
• Use your application server’s inherent parallelism; for example,
use the parallelism that is incorporated into a web server or
database.
• Use an API to encapsulate parallelism, such as Microsoft Parallel
Extensions for .NET (TPL and PLINQ). These libraries were
written by experts and have been thoroughly tested; they help
you to avoid many of the common problems that arise in parallel
programming.
• Consider the overall architecture of your application when
thinking about how to parallelize it. It’s tempting to simply look
for the performance hotspots and focus on improving them.
While this may improve things, it does not necessarily give you
the best results.
• Use patterns, such as the ones described in this book.
• Often, restructuring your algorithm (for example, to eliminate
the need for shared data) is better than making low-level
improvements to code that was originally designed to run
serially.
• Don’t share data among concurrent tasks unless absolutely
necessary. If you do share data, use one of the containers
provided by the API you are using, such as a shared queue.
• Use low-level primitives, such as threads and locks, only as
a last resort. Raise the level of abstraction from threads to
tasks in your applications.
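As a small illustration of the shared-container tip in the list above, the
following sketch passes work items between two tasks through a
BlockingCollection from System.Collections.Concurrent instead of a hand-locked
list; the capacity of 8 and the item type are arbitrary choices, and Chapter 7,
"Pipelines," shows this approach in full.

using System.Collections.Concurrent;
using System.Threading.Tasks;

static class SharedQueueSketch
{
    static void Run()
    {
        // A thread-safe queue shared by the producer and the consumer.
        var queue = new BlockingCollection<int>(boundedCapacity: 8);

        var producer = Task.Factory.StartNew(() =>
        {
            for (int i = 0; i < 100; i++)
            {
                queue.Add(i);           // blocks if the queue is full
            }
            queue.CompleteAdding();     // signals that no more items will arrive
        });

        var consumer = Task.Factory.StartNew(() =>
        {
            foreach (int item in queue.GetConsumingEnumerable())
            {
                // Placeholder for processing the item.
            }
        });

        Task.WaitAll(producer, consumer);
    }
}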

Exercises
1. What are some of the tradeoffs between decomposing
a problem into many small tasks versus decomposing it
into larger tasks?
2. What is the maximum potential speedup of a program
that spends 10 percent of its time in sequential processing
when you move it from one to four cores?
3. What is the difference between parallelism and
concurrency?

For More Information


If you are interested in better understanding the terminology used in
the text, refer to the glossary at the end of this book.
The design patterns presented in this book are consistent with
classifications of parallel patterns developed by groups in both indus-
try and academia. In the terminology of these groups, the patterns in
this book would be considered to be algorithm or implementation
patterns. Classification approaches for parallel patterns can be found
in the book by Mattson, et al. and at the Our Pattern Language (OPL)
web site. This book attempts to be consistent with the terminology
of these sources. In cases where this is not possible, an explanation
appears in the text.
For a detailed discussion of parallelism on the Windows platform,
see the book by Duffy. An overview of threading and synchronization
in .NET can be found in Albahari.
J. Albahari and B. Albahari. C# 4 in a Nutshell. O’Reilly, fourth
edition, 2010.
J. Duffy. Concurrent Programming on Windows. Addison-Wesley,
2008.
T. G. Mattson, B. A. Sanders, and B. L. Massingill. Patterns for
Parallel Programming. Addison-Wesley, 2004.
“Our Pattern Language for Parallel Programming Ver 2.0.”
http://parlab.eecs.berkeley.edu/wiki/patterns
2 Parallel Loops

Use the Parallel Loop pattern when you need to perform the same
independent operation for each element of a collection or for a fixed
number of iterations. The steps of a loop are independent if they
don’t write to memory locations or files that are read by other steps.
The syntax of a parallel loop is very similar to the for and foreach
loops you already know, but the parallel loop runs faster on a com-
puter that has available cores. Another difference is that, unlike a se-
quential loop, the order of execution isn’t defined for a parallel loop.
Steps often take place at the same time, in parallel. Sometimes, two
steps take place in the opposite order than they would if the loop
were sequential. The only guarantee is that all of the loop’s iterations
will have run by the time the loop finishes.
It’s easy to change a sequential loop into a parallel loop. However,
it’s also easy to use a parallel loop when you shouldn’t. This is because
it can be hard to tell if the steps are actually independent of each
other. It takes practice to learn how to recognize when one step is
dependent on another step. Sometimes, using this pattern on a loop
with dependent steps causes the program to behave in a completely
unexpected way, and perhaps to stop responding. Other times, it
introduces a subtle bug that only appears once in a million runs. In
other words, the word “independent” is a key part of the definition of
this pattern, and one that this chapter explains in detail.

The Parallel Loop pattern independently applies an operation to
multiple data elements. It’s an example of data parallelism.
For parallel loops, the degree of parallelism doesn’t need to be
specified by your code. Instead, the run-time environment executes
the steps of the loop at the same time on as many cores as it can. The
loop works correctly no matter how many cores are available. If there
is only one core, the performance is close to (perhaps within a few
percentage points of) the sequential equivalent. If there are multiple
cores, performance improves; in many cases, performance improves
proportionately with the number of cores.


The Basics
The .NET Framework includes both parallel For and parallel ForEach
loops; the pattern is also implemented in the Parallel LINQ (PLINQ)
query language. Use the Parallel.For method to iterate over a range
of integer indices and the Parallel.ForEach method to iterate over
user-provided values. Use PLINQ if you prefer a high-level, declarative
style for describing loops or if you want to take advantage of PLINQ’s
convenience and flexibility.

To make for and foreach loops with independent iterations run faster
on multicore computers, use their parallel counterparts.

Parallel For Loops


Here’s an example of a sequential for loop in C#.

int n = ...
for (int i = 0; i < n; i++)
{
  // ...
}

Don’t forget that the steps of the loop body must be independent of
one another if you want to use a parallel loop. The steps must not
communicate by writing to shared variables.

To take advantage of multiple cores, replace the for keyword with a
call to the Parallel.For method and convert the body of the loop into
a lambda expression.

int n = ...
Parallel.For(0, n, i =>
{
  // ...
});

Parallel.For uses multiple cores to operate over an index range.

Parallel.For is a static method with overloaded versions. Here’s the
signature of the version of Parallel.For that’s used in the example.

Parallel.For(int fromInclusive,
             int toExclusive,
             Action<int> body);

In the example, the first two arguments specify the iteration limits.
The first argument is the lowest index of the loop. The second
argument is the exclusive upper bound, or the largest index plus one.
The third argument is an action that’s invoked once per iteration. The
action takes the iteration’s index as its argument and executes the
loop body once for each index.

The Parallel.For method does not guarantee any particular order of
execution. Unlike a sequential loop, some higher-valued indices may be
processed before some lower-valued indices.

The Parallel.For method has additional overloaded versions. These are
covered in the section, “Variations,” later in this chapter and in
Chapter 4, “Parallel Aggregation.”
The example includes a lambda expression in the form args => body as
the third argument to the Parallel.For invocation. Lambda expressions
are unnamed methods that can capture variables from their enclosing
scope. Of course, the body parameter could also be an instance of a
delegate type, an anonymous method (using the delegate keyword) or an
ordinary named method. In other words, you don’t have to use lambda
expressions if you don’t want to. Examples in this book use lambda
expressions because they keep the code within the body of the loop,
and they are easier to read when the number of lines of code is small.

If you’re unfamiliar with the syntax for lambda expressions, see
“Further Reading” at the end of this chapter. After you use lambda
expressions, you’ll wonder how you ever lived without them.
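For example, here is a minimal sketch (not taken from the book’s online
samples, and assuming the usual using directives for System and
System.Threading.Tasks) that passes an ordinary named method instead of
a lambda expression. ProcessIndex is a hypothetical method whose
signature matches Action<int>.

static void ProcessIndex(int i)
{
  // ... the work for one iteration ...
}

// Elsewhere in the program, the method group converts to Action<int>:
Parallel.For(0, n, ProcessIndex);

The call is equivalent to the lambda version shown earlier; the only
difference is where the body of the loop is written.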

Parallel ForEach


Here’s an example of a sequential foreach loop in C#.
IEnumerable<MyObject> myEnumerable = ...

foreach (var obj in myEnumerable)


{
// ...
}

To take advantage of multiple cores, replace the foreach keyword


with a call to the Parallel.ForEach method.
IEnumerable<MyObject> myEnumerable = ...

Parallel.ForEach(myEnumerable, obj =>


{
// ...
});

Parallel.ForEach runs the loop body for each element in a collection.

Parallel.ForEach is a static method with overloaded versions. Here’s
the signature of the version of Parallel.ForEach that was used in the
example.

ForEach<TSource>(IEnumerable<TSource> source,
                 Action<TSource> body);

In the example, the first argument is an object that implements the
IEnumerable<MyObject> interface. The second argument is a method
that’s invoked for each element of the input collection.

Don’t forget that iterations need to be independent. The loop body
must only make updates to fields of the particular instance that’s
passed to it.

The Parallel.ForEach method does not guarantee the order of execution.
Unlike a sequential foreach loop, the incoming values aren’t always
processed in order.

The Parallel.ForEach method has additional overloaded versions. These
are covered in the section, “Variations,” later in this chapter and in
Chapter 4, “Parallel Aggregation.”

Parallel LINQ (PLINQ)


You can convert LINQ expressions to parallel code with the AsParallel
extension method.

The Language Integrated Query (LINQ) feature of the .NET Framework
includes a parallel version named PLINQ (Parallel LINQ). There are many
options and variations for expressing PLINQ queries, but almost all
LINQ-to-Objects expressions can easily be converted to their parallel
counterpart by adding a call to the AsParallel extension method. Here’s
an example that shows both the LINQ and PLINQ versions.
IEnumerable<MyObject> source = ...

// LINQ
var query1 = from i in source select Normalize(i);

// PLINQ
var query2 = from i in source.AsParallel()
select Normalize(i);

This code example creates two queries that transform values of the
enumerable object source. The PLINQ version uses multiple cores if
they’re available.

You can also use PLINQ’s ForAll extension method in cases where you
want to iterate over the input values but you don’t want to select
output values to return. This is shown in the following code.

IEnumerable<MyObject> myEnumerable = ...

myEnumerable.AsParallel().ForAll(obj => DoWork(obj));

The ForAll extension method is the PLINQ equivalent of
Parallel.ForEach.

It’s important to use PLINQ’s ForAll extension method instead of giving
a PLINQ query as an argument to the Parallel.ForEach method. For more
information, see the section, “Mixing the Parallel Class and PLINQ,”
later in this chapter.

What to Expect

By default, the degree of parallelism (that is, how many iterations run
at the same time in hardware) depends on the number of available cores.
In typical scenarios, the more cores you have, the faster your loop
executes, until you reach the point of diminishing returns that
Amdahl’s Law predicts. How much faster depends on the kind of work your
loop does.

Adding cores makes your loop run faster; however, there’s always an
upper limit.
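As a rough illustration (a sketch for this discussion, not part of the
book’s samples), Amdahl’s Law can be written as a one-line helper: if a
fraction s of the work is inherently sequential, then N cores can speed
the program up by at most 1 / (s + (1 - s) / N).

static double MaxSpeedup(double sequentialFraction, int coreCount)
{
  // Amdahl’s Law: the sequential fraction limits the achievable speedup.
  return 1.0 / (sequentialFraction + (1.0 - sequentialFraction) / coreCount);
}

For example, MaxSpeedup(0.1, 4) is roughly 3.1, which corresponds to
Exercise 2 at the end of Chapter 1.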
You must choose the correct granularity. Too many small parallel loops
can reach a point of over-decomposition where the multicore speedup is
more than offset by the parallel loop’s overhead.

The .NET implementation of the Parallel Loop pattern ensures that
exceptions that are thrown during the execution of a loop body are not
lost. For both the Parallel.For and Parallel.ForEach methods as well as
for PLINQ, exceptions are collected into an AggregateException object
and rethrown in the context of the calling thread. All exceptions are
propagated back to you. To learn more about exception handling for
parallel loops, see the section, “Variations,” later in this chapter.
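The following sketch (not from the book’s samples; DoWork stands in for
any loop body that might throw) shows one way to observe those
exceptions on the calling thread.

try
{
  Parallel.For(0, n, i =>
  {
    DoWork(i);   // any exception thrown here is captured by the loop
  });
}
catch (AggregateException ae)
{
  foreach (var ex in ae.InnerExceptions)
  {
    // Examine or log each exception that occurred in a loop body.
    Console.WriteLine(ex.Message);
  }
}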
Robust exception handling is an important aspect of parallel loop
processing.

Parallel loops have many variations. There are 12 overloaded methods
for Parallel.For and 20 overloaded methods for Parallel.ForEach. PLINQ
has close to 200 extension methods. Although there are many overloaded
versions of For and ForEach, you can think of the overloads as
providing optional configuration options. Two examples are a maximum
degree of parallelism and hooks for external cancellation. These
options allow the loop body to monitor the progress of other steps (for
example, to see if exceptions are pending) and to manage task-local
state. They are sometimes needed in advanced scenarios. To learn about
the most important cases, see the section, “Variations,” later in this
chapter.
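As one illustration of these configuration options, the following
sketch (not from the book’s samples) uses the overload of Parallel.For
that accepts a ParallelOptions argument to cap the degree of
parallelism.

var options = new ParallelOptions { MaxDegreeOfParallelism = 2 };

Parallel.For(0, n, options, i =>
{
  // ... loop body; at most two iterations run at the same time ...
});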
Check carefully for dependencies between loop iterations! Not noticing
dependencies between steps is by far the most common mistake you’ll
make with parallel loops.

If you convert a sequential loop to a parallel loop and then find that
your program does not behave as expected, the most likely problem is
that the loop’s steps are not independent. Here are some common
examples of dependent loop bodies:
• Writing to shared variables. If the body of a loop writes to
  a shared variable, there is a loop body dependency. This is a
  common case that occurs when you are aggregating values.
  Here is an example, where total is shared across iterations.

  for(int i = 1; i < n; i++)
    total += data[i];

If you encounter this situation, see Chapter 4, “Parallel Aggregation.”


Shared variables come in many flavors. Any variable that is
declared outside of the scope of the loop body is a shared
variable. Shared references to types such as classes or arrays
will implicitly allow all fields or array elements to be shared.
Parameters that are declared using the keyword ref result in
shared variables. Even reading and writing files can have the
same effect as shared variables.
• Using properties of an object model. If the object being
processed by a loop body exposes properties, you need to
know whether those properties refer to shared state or state
that’s local to the object itself. For example, a property named
Parent is likely to refer to global state. Here’s an example.
for(int i = 0; i < n; i++)
SomeObject[i].Parent.Update();

In this example, it’s likely that the loop iterations are not independent.
For all values of i, SomeObject[i].Parent is a reference to a single
shared object.
• Referencing data types that are not thread safe. If the body of
  the parallel loop uses a data type that is not thread safe, the
  loop body is not independent (there is an implicit dependency
  on the thread context). An example of this case, along with a
  solution, is shown in “Using Task-Local State in a Loop Body” in
  the section, “Variations,” later in this chapter; a minimal
  sketch of the problem also appears after this list.

  You must be extremely cautious when getting data from properties
  and methods. Large object models are known for sharing mutable
  state in unbelievably devious ways.
• Loop-carried dependence. If the body of a parallel for loop
  performs arithmetic on the loop index, there is likely to be a
  dependency that is known as loop-carried dependence. This is
  shown in the following code example. The loop body references
  data[i] and data[i - 1]. If Parallel.For is used here, there’s no
  guarantee that the loop body that updates data[i - 1] has
  executed before the loop body for data[i].

  for(int i = 1; i < N; i++)
    data[i] = data[i] + data[i - 1];

  Arithmetic on loop index variables, especially addition or
  subtraction, usually indicates loop-carried dependence.

  Sometimes, it’s possible to use a parallel algorithm in cases of
  loop-carried dependence, but this is outside the scope of this
  book. Your best bet is to look elsewhere in your program for
  opportunities for parallelism or to analyze your algorithm and
  see if it matches some of the advanced parallel patterns that
  occur in scientific computing. Parallel scan and parallel dynamic
  programming are examples of these patterns.
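Here is the minimal sketch promised above (not from the book’s samples):
a single System.Random instance is not thread safe, so sharing it across
iterations makes the loop body dependent on shared mutable state.

var random = new Random();
var data = new double[n];

Parallel.For(0, n, i =>
{
  // Not safe: concurrent calls can corrupt the shared Random instance.
  data[i] = random.NextDouble();
});

Giving each task its own instance, as described in the task-local state
variation, avoids the problem.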
When you look for opportunities for parallelism, profiling your
application is a way to deepen your understanding of where your
application spends its time; however, profiling is not a substitute for
understanding your application’s structure and algorithms. For example,
profiling doesn’t tell you whether loop bodies are independent.

Don’t expect miracles from profiling; it can’t analyze your algorithms
for you. Only you can do that.

An Example
Here’s an example of when to use a parallel loop. Fabrikam Shipping
extends credit to its commercial accounts. It uses customer credit
trends to identify accounts that might pose a credit risk. Each cus-
tomer account includes a history of past balance-due amounts. Fabri-
kam has noticed that customers who don’t pay their bills often have
histories of steadily increasing balances over a period of several
months before they default.
To identify at-risk accounts, Fabrikam uses statistical trend analy-
sis to calculate a projected credit balance for each account. If the
analysis predicts that a customer account will exceed its credit limit
within three months, the account is flagged for manual review by one
of Fabrikam’s credit analysts.

In the application, a top-level loop iterates over customers in the


account repository. The body of the loop fits a trend line to the bal-
ance history, extrapolates the projected balance, compares it to the
credit limit, and assigns the warning flag if necessary.
An important aspect of this application is that each customer’s
credit status can be independently calculated. The credit status of one
customer doesn’t depend on the credit status of any other customer.
Because the operations are independent, making the credit analysis
application run faster is simply a matter of replacing a sequential
foreach loop with a parallel loop.
The complete source code for this example is online at http://
parallelpatterns.codeplex.com in the Chapter2\CreditReview project.

Sequential Credit Review Example


Here’s the sequential version of the credit analysis operation.
static void UpdatePredictionsSequential(
AccountRepository accounts)
{
foreach (Account account in accounts.AllAccounts)
{
Trend trend = SampleUtilities.Fit(account.Balance);
double prediction = trend.Predict(
account.Balance.Length + NumberOfMonths);
account.SeqPrediction = prediction;
account.SeqWarning = prediction < account.Overdraft;
}
}

The UpdatePredictionsSequential method processes each account


from the application’s account repository. The Fit method is a utility
function that uses the statistical least squares method to create a
trend line from an array of numbers. The Fit method is a pure func-
tion. This means that it doesn’t modify any state.
The prediction is a three-month projection based on the trend. If
a prediction is more negative than the overdraft limit (credit balances
are negative numbers in the accounting system), the account is flagged
for review.

Credit Review Example Using Parallel.ForEach
The parallel version of the credit scoring analysis is very similar to the
sequential version.

static void UpdatePredictionsParallel(AccountRepository accounts)


{
Parallel.ForEach(accounts.AllAccounts, account =>
{
Trend trend = SampleUtilities.Fit(account.Balance);
double prediction = trend.Predict(
account.Balance.Length + NumberOfMonths);
account.ParPrediction = prediction;
account.ParWarning = prediction < account.Overdraft;
});
}

The UpdatePredictionsParallel method is identical to the
UpdatePredictionsSequential method, except that the Parallel.ForEach
method replaces the foreach operator.

Credit Review Example with PLINQ


You can also use PLINQ to express a parallel loop. Here’s an example.
static void UpdatePredictionsPlinq(AccountRepository accounts)
{
accounts.AllAccounts
.AsParallel()
.ForAll(account =>
{
Trend trend = SampleUtilities.Fit(account.Balance);
double prediction = trend.Predict(
account.Balance.Length + NumberOfMonths);
account.PlinqPrediction = prediction;
account.PlinqWarning = prediction < account.Overdraft;
});
}

Using PLINQ is almost exactly like using LINQ-to-Objects. PLINQ


provides a ParallelEnumerable class that defines extension methods
for various types in a manner very similar to LINQ’s Enumerable class.
One of the methods of ParallelEnumerable is the AsParallel exten-
sion method.
The AsParallel extension method allows you to convert a se-
quential collection of type IEnumerable<T> into a ParallelQuery<T>
object. Applying AsParallel to the accounts.AllAccounts collection
returns an object of type ParallelQuery<AccountRecord>.
PLINQ’s ParallelEnumerable class has close to 200 extension
methods that provide parallel queries for ParallelQuery<T> objects.
In addition to parallel implementations of LINQ methods, such as
Select and Where, PLINQ provides a ForAll extension method that


invokes a delegate method in parallel for every element.
In the PLINQ prediction example, the argument to ForAll is a
lambda expression that performs the credit analysis for a specified
account. The body is the same as in the sequential version.

Performance Comparison
Running the credit review example on a quad-core computer shows
that the Parallel.ForEach and PLINQ versions run slightly less than
four times as fast as the sequential version. Timing numbers vary; you
may want to run the online samples on your own computer.

Variations
The credit analysis example shows a typical way to use parallel loops,
but there can be variations. This section introduces some of the most
important ones. You won’t always need to use these variations, but
you should be aware that they are available.

Breaking Out of Loops Early


Breaking out of loops is a familiar part of sequential iteration. It’s less
common in parallel loops, but you’ll sometimes need to do it. Here’s
an example of the sequential case.
int n = ...
for (int i = 0; i < n; i++)
{
// ...
if (/* stopping condition is true */)
break;
}

The situation is more complicated with parallel loops because more


than one step may be active at the same time, and steps of a parallel
loop are not necessarily executed in any predetermined order. Conse-
quently, parallel loops have two ways to break or stop a loop instead
of just one. Parallel break allows all steps with indices lower than the
break index to run before terminating the loop. Parallel stop termi-
nates the loop without allowing any new steps to begin.

Parallel Break
Use Break to exit a loop early while ensuring that lower-indexed steps
complete.

The Parallel.For method has an overload that provides a
ParallelLoopState object as a second argument to the loop body. You can
ask the loop to break by calling the Break method of the
ParallelLoopState object. Here’s an example.

int n = ...
Parallel.For(0, n, (i, loopState) =>
{
// ...
if (/* stopping condition is true */)
{
loopState.Break();
return;
}
});

This example uses an overloaded version of Parallel.For that passes a


“loop state” object to each step. Here’s the signature of the version of
the Parallel.For method that was used in the example.
Parallel.For(int fromInclusive,
int toExclusive,
Action<int, ParallelLoopState> body);

The object that’s passed to the loopState argument is an instance of


the ParallelLoopState class that was created by the parallel loop for
use within the loop body.
Calling the Break method of the ParallelLoopState object begins an
orderly shutdown of the loop processing. Any steps that are running as
of the call to Break will run to completion.

Calling Break doesn’t stop other steps that might have already started
running.
You may want to check for a break condition in long-running loop
bodies and exit that step immediately if a break was requested. If you
don’t do this, the step will continue to run until it finishes. To see if
another step running in parallel has requested a break, retrieve the
value of the parallel loop state’s LowestBreakIteration property. If
this returns a nullable long integer whose HasValue property is true,
you know that a break has been requested. You can also read the
ShouldExitCurrentIteration property of the loop state object, which
checks for breaks as well as other stopping conditions.
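The following sketch (not from the book’s samples; GetChunks and
ProcessChunk are hypothetical helpers that represent long-running work)
polls the loop state so that a step can exit promptly after another
step requests a break.

Parallel.For(0, n, (i, loopState) =>
{
  foreach (var chunk in GetChunks(i))
  {
    if (loopState.ShouldExitCurrentIteration)
      return;                  // another step asked to break or stop
    ProcessChunk(chunk);
  }
});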
During the processing of a call to the Break method, iterations with an
index value less than the current index will be allowed to start (if
they have not already started), but iterations with an index value
greater than the current index will not be started. This ensures that
all iterations below the break point will complete.

Don’t forget that all steps with an index value less than the step that
invoked the Break method will be allowed to run normally, even after
you call Break.
Because of parallel execution, it’s possible that more than one
step may call Break. In that case, the lowest index will be used
to determine which steps will be allowed to start after the break
occurred.
The Parallel.For and Parallel.ForEach methods return an object of type
ParallelLoopResult. You can find out if a loop terminated with a break
by examining the values of two of the loop result properties. If the
IsCompleted property is false and the LowestBreakIteration property
returns an object whose HasValue property is true, you know that the
loop terminated by a call to the Break method. You can query for the
specific index with the loop result’s LowestBreakIteration property.
Here’s an example.
int n = ...
var result = new double[n];

var loopResult = Parallel.For(0, n, (i, loopState) =>


{
if (/* break condition is true */)
{
loopState.Break();
return;
}
result[i] = DoWork(i);
});

if (!loopResult.IsCompleted &&
    loopResult.LowestBreakIteration.HasValue)
{
  Console.WriteLine("Loop encountered a break at {0}",
    loopResult.LowestBreakIteration.Value);
}

Be aware that some steps with index values higher than the step that
called the Break method might be run. There’s no way of predicting when
or if this might happen.

The Break method ensures that data up to a particular iteration index
value will be processed. Depending on how the iterations are scheduled,
it may be possible that some steps with a higher index value than the
one that called the Break method may have been started before the call
to Break occurs.
The Parallel.ForEach method also supports the loop state Break method.
The parallel loop assigns items a sequence number, starting from zero,
as it pulls them from the enumerable input. This sequence number is
used as the iteration index for the LowestBreakIteration property.
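A minimal sketch (not from the book’s samples) of the Parallel.ForEach
overload that provides a ParallelLoopState looks like this; the stopping
condition is a placeholder, as in the earlier examples.

Parallel.ForEach(myEnumerable, (obj, loopState) =>
{
  if (/* stopping condition is true */)
  {
    loopState.Break();
    return;
  }
  DoWork(obj);
});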

Parallel Stop
Use Stop to exit a loop early when you don’t need all lower-indexed
iterations to run before terminating the loop.

There are also situations, such as unordered searches, where you want
the loop to stop as quickly as possible after the stopping condition is
met. The difference between “break” and “stop” is that, with stop, no
attempt is made to execute loop iterations less than the stopping index
if they have not already run. To stop a loop in this way, call the
ParallelLoopState class’s Stop method instead of the Break method.
Here is an example of parallel stop.

var n = ...
var loopResult = Parallel.For(0, n, (i, loopState) =>
{
if (/* stopping condition is true */)
{
loopState.Stop();
return;
}
result[i] = DoWork(i);
});

if (!loopResult.IsCompleted &&
!loopResult.LowestBreakIteration.HasValue)
{
  Console.WriteLine("Loop was stopped");
}

When the Stop method is called, the index value of the iteration
that caused the stop isn’t available.
You cannot call both Break and Stop during the same parallel
loop. You have to choose which of the two loop exit behaviors you
want to use. If you call both Break and Stop in the same parallel loop,
an exception will be thrown.
Parallel programs use Stop more often than Break. Processing all
iterations with indices less than the stopping iteration is usually not
necessary when the loop bodies are independent of each other. It’s also
true that Stop shuts down a loop more quickly than Break.

You’ll probably use Stop more often than Break.
There’s no Stop method for a PLINQ query, but you can use the
WithCancellation extension method and then use cancellation as a
way to stop PLINQ execution. For more information, see the next
section, “External Loop Cancellation.”
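A sketch of this approach (not from the book’s samples; Normalize is
the same placeholder used earlier, and the token would typically be
canceled from another thread) looks like the following.

var cts = new CancellationTokenSource();

try
{
  var results = source.AsParallel()
                      .WithCancellation(cts.Token)
                      .Select(i => Normalize(i))
                      .ToList();
}
catch (OperationCanceledException)
{
  // The query stopped early because cts.Cancel() was called.
}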

External Loop Cancellation


In some scenarios, you may want to cancel a parallel loop because of
an external request. For example, you may need to respond to a re-
quest from a user interface to stop what you’re doing.
In .NET, you use the CancellationTokenSource class to signal
cancellation and the CancellationToken structure to detect and re-
spond to a cancellation request. The structure allows you to find out
if there is a pending cancellation request. The class lets you signal that
cancellation should occur.
The Parallel.For and Parallel.ForEach methods include over-
loaded versions that accept parallel loop options as one of the argu-
ments. You can specify a cancellation token as one of these options.
Exploring the Variety of Random
Documents with Different Content
And smote the wingèd coursers till they flew
Unchecked thro’ opening vistas of the heaven.
His father, mounted on a blazing star,
Rode after, warning him: “Drive thither, boy!”
“Wheel yonder!”
The messenger seems to have continued with a picture of
Phaethon’s fall. The body, still giving off the smoke of destruction, is
next brought in, and we possess part of Clymene’s frantic speech.
Her grief is mingled with terror: the strange manner of her son’s
death may provoke her husband Merops to inquiry and reflexion and
so her long-past union with the Sun-god may come to light. She bids
them hide the body in the treasure-chamber, of which she alone
holds the keys. Soon the king enters amid lyric strains celebrating
the marriage-day of Phaethon. He is giving orders for merry-making
when a servant hurries out to inform him that the treasure-chamber
is giving forth clouds of smoke. Merops hastens within, and the
chorus bewail the disclosure which is imminent. In a moment the
stricken father is heard returning with lamentation. The course of
the last scene is not certain, but probably a god reconciled the king
and his wife, giving directions for the disposal of Phaethon’s body; a
beautiful but obscure fragment,[802] redolent with the charm of
breezes and murmuring boughs after all this blaze and splendour,
seems to point to the story of Phaethon’s sisters, who mourned him
beside the western waters and were transformed into poplars. This
god was probably Oceanus,[803] the father of Clymene. He alone
(deity of the world-encircling water) could give unity to these two
pictures, the radiant eastern land of Phaethon’s youthful enterprise,
and the distant western river where his sorrows and his end are
bathed in dim beauty.
This sketch allows us to realize how much we have lost in the
Phaethon. The romantic events and setting recall the Andromeda.
Clymene’s sorrow and shame mingle strangely with the gallant
enterprise and bright charm of the whole, somewhat as Creusa’s
story is contrasted with the fresh cheerfulness of Ion. Above all, the
noble simplicity and high-hearted adventurousness of Phaethon,
inspired by his new-found kinship with a god and chafing at the
placid programme of domestic honour and luxury which his
supposed father sets before him—this is a concept of boundless
promise.
The Hypsipyle,[804] which was produced late[805] in Euripides’ life,
is specially interesting through the discovery in 1906 of extensive
fragments at Oxyrhynchus in Egypt. Previously it was known by
scanty quotations of no great interest, though apparently much
prized in ancient times.[806] The plot is now in the main clear.
Hypsipyle, grand-daughter of the god Dionysus and daughter of
Thoas, King of Lemnos, was exiled because she refused to join in
the massacre of the Lemnian men by their women. Previously she
had borne twin sons to Jason. These she lost when expelled from
her home. She is now slave to Eurydice, Queen of Nemea in the
north of the Peloponnese, and nurse to her infant son Opheltes. Her
own sons come in quest of her, and without recognizing their mother
are entertained in the palace. Hypsipyle is quieting the child with a
song and a rattle when the chorus of Nemean women enter. Next
certain soldiers arrive from the host which the seven chieftains are
leading against Thebes. Their commander, the prince Amphiaraus,
explains that the army is in need of water, and Hypsipyle consents to
show them a spring. Later she returns in anguish: during her
absence the child has been killed by a great serpent. Eurydice is
about to slay her, when she appeals to Amphiaraus, who pleads her
cause and promises Eurydice that the Greeks shall found a festival in
honour of the child. (This festival is that of the famous Nemean
Games.) He sees that this fatal accident is a bad omen for the
enterprise of the Seven, and names the child Archemorus[807]
instead of Opheltes. Eurydice is appeased. Later we find Hypsipyle
and her sons made known to one another, and the god Dionysus
appears, apparently to arrange future events.
Though there is one difficulty as to the plot, namely, that we do
not know what function was assigned to Hypsipyle’s sons—they
cannot have been introduced merely for the recognition-scene—the
whole conception strikes one as simple and masterly. It has been
well remarked[808] that while a modern dramatist would have
omitted the Theban expedition, “nothing seemed to the Greeks
worthy of contemplation in the theatre by a great people, unless it
had some connexion with the exploits and the history of nations....
On the same canvas the death of one little child and the doom of the
seven chieftains with their crowding battalions are depicted in a
perspective which sets the former fatality in the foreground.”
The captive princess, even through the ruins of the text, shines
forth with great charm. Her whole life centres round her lost children
and the brief magical time of her union with Jason. The chorus
reproach her with her indifference to the exciting presence of
Adrastus’ great army—she will think of nothing save Argo and the
Fleece. When at point to die her spirit flashes back to those old days
in a few words of amazing poignancy:—
ὦ πρῷρα καὶ λευκαῖνον ἐξ ἅλμης ὕδωρ
Ἀργοῦς, ἰὼ παῖδ’....

“Ah, prow of Argo and the brine that flashed into whiteness! ah,
my two sons!” Her talk with them towards the end is a pathetic and
lovely passage equal to anything Euripides ever wrote in this kind.
Melanippe the Wise[809] appears to have been a drama of unusual
personal interest. Æolus espoused Hippe, whose daughter Melanippe
became by Poseidon mother of twin sons. The god bade her hide
them from Æolus, and they were discovered by grooms in the care
of a bull and a cow. They, supposing the children miraculous
offspring of these animals, reported their discovery to Æolus, who
decided to expiate the portent by burning the infants alive.
Melanippe was instructed to shroud them for death. In order to save
her children without revealing her own secret she denied the
possibility of such portentous births, but seems to have found herself
forced at length to confess in order to prove the natural origin of the
infants. Æolus condemned her to be blinded and imprisoned, her
offspring to be exposed. Her mother Hippe appeared as dea ex
machina[810] and saved her kin.
The great feature of this play was the heroine’s speech in which
she sought to convince her father that such a portent was
impossible. Lines from the opening of this argument are preserved:
“The story is not mine—from my mother have I learned how Heaven
and earth were once mingled in substance; when they separated
into twain they engendered and brought into the light of day all
creatures, the trees, birds, beasts, nurslings of the sea, and the race
of men”. The speech was an elaborate scientific sermon to disprove
the possibility of miracles. Similarly, according to a famous story, the
drama opened originally with the line: “Zeus, whoever Zeus may be,
for only by stories do I know of him ...”; but this open agnosticism
gave such offence that Euripides produced the play again with the
words: “Zeus, as Truth relates....” A different but closely-connected
source of interest is the fact that here Euripides veiled his own
personality less thinly than usual. That Melanippe was only his
mouthpiece appears to have been a recognized fact. Dionysius of
Halicarnassus[811] observes that it presents a double character, that
of the poet, and that of Melanippe; and Lucian[812] selects the
remark on Zeus in the prologue as a case where the poet is speaking
his own views. The “mother” from whom “Melanippe” learned her
philosophy has been identified with the great metaphysician and
scientist Anaxagoras, who was banished from Athens in 430 b.c.; and
it is natural to suppose that this Melanippe is not much later than
that year, perhaps much earlier[813] in view of the strongly didactic
manner.[814] Hartung refers to this play the splendid fragment:—

ὄλβιος ὅστις τῆς ἱστορίας


ἔσχε μάθησιν, μήτε πολιτῶν
ἐπὶ πημοσύνῃ μήτ’ εἰς ἀδίκους
πράξεις ὁρμῶν,
ἀλλ’ ἀθανάτου καθορῶν φύσεως
κόσμον ἀγήρω, πῇ τε συνέστη
καὶ ὅπῃ καὶ ὅπως.
τοῖς δὲ τοιούτοις οὐδέποτ’ αἰσχρῶν
ἔργων μελέτημα προσίζει.

“Happy is he who hath won deep learning. He setteth himself


neither to hurt his fellow-citizens nor towards works of iniquity, but
fixeth his gaze upon the ageless order of immortal Nature, the laws
and methods of its creation. Unto such a man never doth there cling
the plotting of base deeds.” If these lines point at Anaxagoras and
belong to our play, the two significant clauses which defend the
moral character of the philosopher in question indicate the year 430
itself.
The Cresphontes had immense success as a powerful melodrama.
Polyphontes, having slain his brother Cresphontes, King of Messenia,
seized his throne and married his widow Merope, who sent her
infant son Cresphontes away to safe keeping in Ætolia. When he
grew up he returned to avenge his father. At this point the action
begins. Cresphontes seems to have delivered the prologue; since
Polyphontes fearing his return has offered a reward to whoever shall
slay him, he has determined to win the usurper’s confidence by
claiming to have destroyed his enemy. Meanwhile, Merope, alarmed
by the proclamation of the king, has sent an aged slave to find
whether Cresphontes is well; he returns with tidings that the prince
has disappeared from Ætolia. Merope gives her son over for lost,
and observing the youthful stranger who is received with joy by the
king, she becomes convinced that he is the murderer of her son.
While he lies asleep she steals upon him with an axe, when the old
slave recognizes the stranger and stops her arm. Mother and son are
united, and at once plot to slay Polyphontes. Merope pretends to be
reconciled to the king, who in his joy goes to sacrifice, accompanied
by the youth, who takes advantage of a suitable moment to slay his
enemy.
Plutarch, nearly six centuries later, testifies[815] to the sensation
which the Recognition caused in the audience. Merope herself seems
to have been a figure ranking with Hecuba in the Troades. The
tidings of her son’s death draw from her words which in their quiet
dignity of grief have something of Wordsworth:—

Children have died ere now, not mine alone,


And wives been widow’d. Yea, this cup of life
Unnumber’d women have drain’d it, as do I....
... Insistent Fate,
Taking in fee the lives of all I lov’d,
Hath made me wise.

Probably it was Merope again who uttered the famous lines which
advise lament over the newly-born and a glad procession to
accompany the dead. The recognition-scene is singled out for
especial praise by Aristotle.[816]
The fragments of this tragedy include a perfect jewel of lyric
poetry, a prayer to Peace:—
Εἰρήνα βαθύπλουτε καὶ
καλλίστα μακάρων θεῶν,
ζῆλός μοι σέθεν, ὡς χρονίζεις.
δέδοικα δὲ μὴ πρὶν πόνοις
ὑπερβάλῃ με γῆρας,
πρὶν σὰν χαρίεσσαν ὥραν προσιδεῖν
καὶ καλλιχόρους ἀοιδὰς
φιλοστεφάνους τε κώμους.
ἴθι μοι, πότνα, πόλιν.
τὰν δ’ ἐχθρὰν στάσιν εἶργ’ ἀπ’ οἴ-
κων τὰν μαινομέναν τ’ ἔριν
θηκτῷ τερπομέναν σιδάρῳ.

A paraphrase might run thus:—

O Peace, thou givest plenty as from a deep spring: there is


no beauty like unto thine, no, not even among the blessed
gods.
My heart yearneth within me, for thou tarriest; I grow old
and thou returnest not.
Shall weariness overcome mine eyes before they see thy
bloom and thy comeliness? When the lovely songs of the
dancers are heard again, and the thronging feet of them that
wear garlands, shall grey hairs and sorrow have destroyed me
utterly?
Return, thou Holy One, to our city: abide not far from us,
thou that quenchest wrath.
Strife and bitterness shall depart, if thou art with us:
madness and the edge of the sword shall flee away from our
doors.

Matthew Arnold’s Merope has the same plot and includes a


recognition-scene which probably resembles the lost original closely.
His conception of Polyphontes is thoroughly Euripidean.
Of the other lost plays little can be said here. Still amid this faint
glow of star-dust many marvellous things are to be discerned—
words of tremulous tenderness from the Danae describing the charm
of infancy; a line from Ino which in its powerful grimness recalls
Æschylus, “like a lone beast, he lurks in caves unlit”;[817] out of the
Polyidus the celebrated query,

Who knows of life that it is aught but death,


And death aught else than life beyond the grave?[818]

From an unknown drama comes a line which owes its preservation


to St. Paul[819]:

φθείρουσιν ἤθη χρήσθ’ ὁμιλίαι κακαί,

“evil communications corrupt good manners”. Euripides’


cosmopolitan sympathy nowhere finds finer expression than in the
distich

Where’er spreads Heaven the eagle cleaves his path;


Where’er lies earth the righteous are at home.[820]

But the student must at his leisure explore the marvels of these
rock-pools left by the retiring ocean. One majestic passage[821] from
the Cretans shall suffice to close this survey. The lines are from a
march sung by the Curetes or priests of the Cretan Zeus, and show
that even in the Hellenic world the monastic spirit was not unknown:

Thou whom the Tyrian princess bare


To mighty Jove, thou Lord of Crete,
To whom her hundred cities bow,
Lo, I draw near thy judgment-seat,

Quitting my home, yon hallowed place


Where beams of cypress roof the shrine,
By far-brought axes lopped and hewn,
Close knit by oxen’s blood divine.

Pure is my life’s unbroken calm


Since Zeus to bliss these eyes unsealed;
The feast of quivering flesh I shared
While through the dark strange thunder pealed.

The Mountain-Mother heard my vows,


And saw my torch the darkness ride;
The Hunter named me for his priest,
A mail-clad Bacchant sanctified.

Now robed in white I keep me pure


From food that e’er has throbbed with breath;
I shun the new-born infant’s cry,
And gaze not on the couch of death.
It now remains for us to attempt a synthesis—to set before
ourselves as clearly as may be the whole personality of Euripides.
We are studying not the programme of a politician, but the spirit and
method of a great artist, the inspiration of a great teacher. An artist
has other things to heed than a superficial consistency of
presentation; and a teacher of permanent value shows his followers
not what to think, but how to think—not opinions, but the reasoned
basis of opinion. Euripides is a man not of dogmas, nor indeed of
negations; he is the apostle of a spirit which blows whither it lists,
setting up a healthful circulation of tingling life throughout regions
which have languished in the heavy air of convention. His work
forces us to think and feel for ourselves, not necessarily to think and
feel with him.
The briefest description of his special quality is that he is in the
same moment a great artist and a great rationalist—a man
profoundly conscious of the beauty and value of all life, all existence,
all energy, and yet an uncompromising critic of the vesture which
man throws around those parts of the Universe which are subjected
to him. No man has ever loved and expressed beauty with a mind
less swayed by illusion. These two instincts, the instinct to study life
in all its unforced manifestations, and the instinct to question all
conventions, lie at the root of his work. It is in virtue of these that
he has been called enigmatic. Like Renan he was ἀνὴρ δίψυχος, a
man of two souls[822]; but he is no more an enigma than others. His
peculiarity lies herein, that the duality of nature often found in
ordinary men was by him exhibited at the heights of genius. That is
why he so often seems labouring to destroy the effect he has
created; he is “inconsistent” because he is equally at home in the
two worlds of feeling and of thought. Precisely for this reason he
created a new type of drama. Horace Walpole wrote that “Life is a
comedy to those who think, a tragedy to those who feel”; thus,
when a genius of Euripides’ type addressed itself to the theatre, the
result was drama which could not but shock people who, bred in the
school of Æschylus, had no conception of “tragedy” which could be
witty, light, modern, destructive. Menander is the successor of
Euripides, not of Aristophanes.
Anyone who follows out these two strands of instinct will
understand much that might seem strange, much that gave offence,
in his work. It will be well therefore to bring together the faults
which have been found with him in ancient and in later times.
Leaving on one side, since it is by no means certainly a reproach,
the celebrated remark[823] of Sophocles, “I represent people as they
should be, Euripides as they are,” we find our chief material in
Aristophanes and Aristotle. The Frogs contains an elaborate attack
upon the tragedian which, whether fair or not, has a prima facie
reasonableness. Euripides is twitted with moral and literary offences.
In the first place, his predilection for depicting the power of love,
especially the adulterous or incestuous passions of women[824] and
the sophistical restlessness of mind which he inculcates,[825] mark
him as a corrupter of Athens. On the technical side, his music[826] is
affected and decadent, the libretto[827] of his choruses is both
elaborate and jejune, the style of his iambics[828] lacks weight and
dignity, his prologues[829] are tiresome and written in a mechanical
fashion. Aristotle in his turn objects to certain weaknesses of
characterization: Menelaus in the Orestes is particularly bad, the
speech of Melanippe—no doubt that celebrated oration on miracles—
is indecorous and out of character; in the Aulid Iphigenia the heroine
is inconsistent.[830] He gives two examples[831] of the irrational,
Ægeus in the Medea and Menelaus once more in the Orestes.
Euripides’ use of the deus ex machina is also often bad; he instances
Medea’s miraculous chariot. Lastly there is the famous mixture[832]
of praise and blame: “Euripides, faulty as he is in the general
management of his subject, is yet felt to be the most tragic of the
poets.” If we pass now to modern detractors, we find one fault
overshadowing all the rest—bad construction, what Aristotle calls
“episodic” plots, namely, plays the several scenes of which are more
or less accidentally combined and form no organic whole.
There is truth in some of this fault-finding; whether we are to
regard such features as actually blemishes is another matter. Two
certainly are defects of the gravest possible description—“episodic”
plots and the deus ex machina. If a man produces plays which have
no organic unity, or which at the close of the action are in such a
tangle that a being of superhuman information and power is
necessary to “cut the knot,” he is no “unskilful dramatist” but merely
a blockhead, for he can always fling his rubbish into the fire. So
hopelessly damaging are these two accusations that one really
cannot believe Euripides obnoxious to them. One might as well
allege that Alexander did not understand tactics, or that Pericles
believed Byzantium was in Sicily. The charge of faulty construction
has been considered earlier in connexion with the plays which are
supposed examples thereof. But the deus ex machina needs a few
words. “The god out of the machine” is a phrase of two applications.
It may mean a deity brought in to round off the play by giving
information about the future history of the personages. Or the god
may be introduced when the plot, owing to the human limitations of
the characters, has become knotted and progress is impossible; then
a being who miraculously knows all the facts appears and “cuts” the
knot. In the first case the epiphany is practically outside the drama;
in the second it is only too vital to it. Of the first case there are
five[833] instances in the extant plays: to these, of course, our grave
objection cannot apply. Of the second type there are seven[834]
examples if we regard the miraculous car of Medea as a “deus”.
Granted the story which is known to the audience, such
interventions are necessary. Medea cannot escape the vengeance of
Corinth, Orestes the verdict of the Argive State, without supernatural
aid; Theseus would, it might seem, never have been persuaded by
mortal witness that Hippolytus is innocent; in the Tauric Iphigenia
and the Helena[835] nothing but a miracle can save from death the
fugitives who as a matter of “history” reached home in safety: the
Supplices would end without the formal compact between rescuers
and rescued if the goddess did not intervene; as for the Ion,
Euripides’ contemporaries knew that Delphi still flourished, so that
the annihilating investigation of Ion must, it appeared, have been
somehow arrested. For these seven plays, then, we can choose
between two theories of the deus ex machina (in that second sense
of a pseudo-dramatic expedient). The first theory is that the poet
wishes to end with “historical” truth, but in the course of his action
has so blundered that he cannot naturally do so; therefore he puts
forward a god who asserts that the action shall continue as “history”
asserts that it did; so might a competitor in a match of archery
employ a confederate who, whenever his arrow missed the target,
should pick it up and plant it in the white. The other theory is that
Euripides intended to work out an interesting situation of legend as a
study in natural psychology and social development. The situation
according to story came to a certain end; according to Euripides that
was not the natural end. And he emphasizes this legendary
distortion by pointing out clearly that to square nature and the story
nothing less than a miracle is required. To assert that he needed the
supernatural intervention to save his play is absolutely to reverse the
facts. Can we doubt which of these theories is sound?
Two further questions at once arise. Why did he select situations
from misleading legends? And, is there then no pseudo-dramatic
deus ex machina at all? The first question is of vital importance. It is
incorrect to say that he was bound by convention to the traditional
stories; Phrynichus, Agathon, and Moschion all defied this
“convention”. Euripides was a student of human thought, of the
development of belief, as well as a dramatist. Convinced that his
contemporaries held false beliefs about the gods and that the myths
were largely responsible for this, hypnotizing thought by their beauty
and paralyzing logic by their authority, he sets himself to show, not
only that they are untrue, but also how, though untrue, they ever
won credence. As for the deus ex machina the truth is that he does
not exist (save, of course, in the rôle of a non-dramatic narrator). He
is, like the three unities, a figment based on uncritical and hasty
reading. Outside this poet the only possible case is that of the
Philoctetes, which has been shown no genuine instance.
We may now return to the objections raised by Aristophanes and
Aristotle. They are all due to the two instincts we have described—
his interest in every manifestation of life, and his stern rationalism.
Most of the technical flaws, for instance, alleged against him are
proofs that he was attracted by the possibilities of his own art; he is
constantly testing the limits to which development can go. The
iambics of the Orestes, for example, are extraordinarily full of
resolved feet; after that play he restrains himself more. In music too
he appears to have been an explorer; at any rate the fault found
with the words of his choruses points to a development like the
modern, in which libretto was becoming subservient to music. The
comic poet, again, fastens eagerly upon the prologues, and puts into
the mouth of Æschylus a famous jest:—[836]

Æsch.: And now, by Jove, I’ll not smash each phrase word
by word, but with heaven’s aid I’ll ruin your prologues with—a
little oil-flask.
Eur.: An oil-flask? You ... my prologues?
Æsch.: Just one little flask. You write so that anything will
fit into your iambics—a little fleece, a little flask, a little bag.
I’ll show you on the spot.
Eur.: Oh! you will?
Æsch.: Yes.
Dion.: Now you must recite something.
Eur.:

“Ægyptus, as the far-spread story tells,


With fifty sons in voyage o’er the deep
Landing at Argos....”

Æsch.: (interrupting) ... “lost his flask of oil”.

Several other absurd instances follow.


This celebrated jest means (i) that Euripides constructs the early
sentences of his prologue in such a way that a subordinate clause
(usually containing a participle) leads up to a short main clause at
the end of the sentence; (ii) that his prologues descend to trivial
details; (iii) that the cæsura occurs always in the third foot; (iv) that
he is viciously addicted to resolved feet. The tragedian can be
defended from these charges, such as they are, but the idea at the
back of Aristophanes’ mind is true, namely, that these prologues are
often dull performances. Probably the poet did not intend much
more. He wishes to put his hearers au fait with the precise legend
and the precise point with which he is concerned;[837] as is often
said, these passages take the place of a modern play-bill.
Later in the Frogs Dionysus produces a huge pair of scales; each
is to utter a line into his scale-pan, and the heavier line wins.
Euripides declaims into his pan the opening line of the Medea, εἴθ’
ὤφελ’ Ἀργοῦς μὴ διαπτάσθαι σκάφος, and his rival Σπερχειὲ ποταμὲ
βουνομοί τ’ ἐπιστροφαί. Dionysus absurdly explains that the latter
wins because he has put in water like a fraudulent woollen-
merchant, while Euripides has offered a “word with wings”.
Underlying this nonsense is the truth that the Æschylean line is
ponderous and slow, that of Euripides light and rapid; it is like
contrasting Marlowe and Fletcher. The difference is not between
good and bad, but between old and new. Æschylus’ iambic style is
fitted most admirably for his purpose. But Euripides has not the
same purpose—that is all. It is one of his most remarkable
innovations that he practically invented the prose-drama. A very
great deal of his “verse” is simply prose which can be scanned. To
compare such a passage[838] as:

ἥξει γὰρ αὐτὸς σὴν δάμαρτα καὶ τέκνα,


ἕλξων φονεύσων κἄμ’ ἐπισφάξων ἄναξ·
μένοντι δ’ αὐτοῦ πάντα σοὶ γενήσεται,
τῇ τ’ ἀσφαλείᾳ κερδανεῖς· πόλιν δὲ σὴν
μὴ πρὶν ταράξῃς πρὶν τόδ’ εὖ θέσθαι, τέκνον,

or a hundred others, with the beacon-speech in Agamemnon or


Athena’s charge to the Areopagite court, is to ignore the whole point
of a literary revolution. Who would set a page of Hedda Gabler’s
conversation against an extract from Macbeth, and affirm that Ibsen
could not write dialogue?
Ibsen, indeed, it is particularly instructive to bear in mind here.
According to him “the golden rule is that there is no golden rule”.
[839] Dr. Stockman’s nobility consists in telling the truth at all costs.
Gregers Werle insists on that course, and is seen to be a
meddlesome prig who ruins his friend’s home. Here the Greek and
the Norwegian agree heartily; for the “sophistry” with which many at
Athens were disgusted is only Euripides’ way of putting his
conviction that there is no fixed rule of conduct, still less any fixed
rule for our self-satisfied attempts to praise or blame the abnormal.
An impulse of pity ruins Creon in the Medea; Lycus in the Heracles
turns his back on mercy, and is destroyed also. The pride of glorious
birth nerves Macaria to heroism; of Achilles it makes merely a
pathetic sham. Consciousness of sin wrecks and tortures Phædra,
while to Helen in Orestes it means little more than a picturesque
melancholy. Hermione in Andromache and Creusa both go to all
lengths in their passionate yearning for domestic happiness; one
destroys her husband and her own future, the other reaps deeper
bliss than she dared to hope. Iphigenia and Hippolytus serve the
same goddess, but amid what different atmospheres and diverse
destinies! This consciousness that effort brings about results
different from its aims, that chance, whatever chance may be, is too
potent to allow any faith in orthodox deities, only in moods of
despair wrings from the poet such outcry as Hecuba’s, that Fate is “a
capering idiot”.[840] But it has planted surely in his mind the
conviction that there is no golden rule of conduct. And hence that
“love of forensic rhetoric” of which we hear so much—each case
must be considered on its own merits.
To this agnosticism we owe not only that treatment of religious
legend which we have already studied but the poet’s greatest
achievement. Socrates, because, as he said, he could not
understand metaphysics or astronomy, gave his attention to man.
His friend because he despaired of a satisfying theology threw his
genius into psychological drama. The centre of his interest is the
human heart. Only one fact about destiny can be stated as
consistently held by him, namely, that the spring of action and the
chief factor in happiness or misery is, not the will of Heaven or
dogmatic belief, but the nature (φύσις) of the individual.[841]
Because he studies sin, not to condemn but to understand, he has
earned that reproach of Aristophanes who rages at his predilection
for Phædras and Sthenebœas. What attracted him was not a desire
to gloat or even to pardon; it was the fact that the sinners he
depicts are so intensely alive. A being dead in virtue engaged his
interest less than one who, however evilly, existed with vigour. To
this passionate interest in human life can be referred as basis all the
other themes on which he spent study. Religion, as we have found,
only attracts him because it guides or misleads conduct. His political
studies have little concern with ethnology or economics; they are
only an expansion to a wider field of this same interest in sheer
humanity. Philosophy and natural science are of value for him, as for
Lucretius, in that they provide an escape from paralyzing
superstition. If they are presented as a refuge from the facts of life,
he will have none of them. When Electra[842] seeks in her
knowledge of astronomy a far-fetched consolation for self-fostered
misery, she strikes us not as heroic but as own kin to the febrile
“intellectuals” of Tchekov’s Cherry Orchard or the novels of
Dostoevsky.
His dislike of convention in morals is answered by his originality in
portraiture as well as in dramatic situations. Nothing is more thrilling
than to observe how in the hands of a great realist whole masses of
human beings come to life. What was the background of one
novelist suddenly begins in the pages of another to stir, to articulate
itself, to move forward and discover a language. “The men”
commanded by Captain Osborne in Vanity Fair become Private
Ortheris or Corporal Mulvaney in the pages of Kipling. So in Euripides
the dim and familiar background of “barbarians” who existed merely
to give colour and outline to Achilles and Odysseus, the women who
bore the necessary children and ground the needed flour, the
henchmen without whom horses would not be groomed or trees
felled, suddenly awake and reveal passions of love and hatred,
pathetic histories, opinions about marriage and the grave. In every
age the man who points to the disregarded, the dormant, hitherto
supposed securely neutral and plastic, who cries “it is alive, watching
you and reflecting, waiting its time”—such a man is met in his
degree with the reception given to Euripides by the elder generation
of Athenians. The clamour of “crank!” “faddist!” “this is the thin end
of the wedge,” and kindred watchwords, may be found translated
into brilliant Attic by Aristophanes. But in virtue of these same
interests Euripides became the Bible of later Greek civilization. He
would have passed into a fetish had it not been that the
destructively critical side of his genius prevented the most narrow-
minded from reducing him to a system. To the last he remains
inconclusive, provocative, refreshing.
On the other side his sensitiveness to all aspects of life—his
“feeling for Beauty” to use the familiar phrase—held him back from
mere cynicism. The Hippolytus remains as perhaps the most glorious
support in literature for unflinching facing of facts—it shows
triumphantly how a man may feel all the sorrow and waste which
wreck happiness, yet declare the endless value and loveliness of life.
We may detect two aspects in which this joy in life shows itself most
markedly—his romance and his wit.
Romance is not improperly contrasted with “classicism,” but as few
Greek or Roman writers are classical in the rigid sense it is not
surprising to find romantic features outcropping at every period of
their literature. Euripides himself is the most romantic author
between Homer and Appuleius, whatever our definition of romance
may be. R. L. Stevenson’s remark that “romance is consciousness of
background,” Hegel’s doctrine that “romantic art is the straining of
art to go beyond itself,”[843] and a more recent dictum that
“romance is only the passion which is in the face of all realism,”[844]
each of them definitely recalls some feature of Euripides’ work
already discussed. A modern writer with whom he can be fruitfully
compared, at this point especially, is Mr Bernard Shaw. In many
characteristics these two dramatists are notably alike: their ruthless
insistence upon questioning all established reputations, whether of
individuals, nations, or institutions; their conviction that there is no
absolute standard of conduct; their blazing zeal for justice; their
mastery of brilliant lithe idiom. But in their feeling about romance
they diverge violently. Perhaps the largest ingredient in Mr. Shaw’s
strength is his hatred and distrust of emotion and of that spirit,
called romance, which organizes emotion and sees in it a basic part
of life. But Euripides appreciates it all the more highly that he is not
enslaved by it. Even in such ruthless dramas as the Medea and the
Iphigenia in Tauris one remarks how the thrill and beauty of life
gleams out, if only as a bitter memory or a present pain of contrast
—the magic fire-breathing bulls and the heapy coils of the glaring
dragon in the remote land where Jason won his quest, the strange
seas, deserted beaches, and grim savages among whom Iphigenia
cherishes her thoughts of childhood in Argos. The same sense of
glamour which inspires early in his life such a marvellous flash as the
description of Rhesus’ steeds:

στίλβουσι δ’ ὥστε ποταμίου κύκνου πτερόν,[845]

and indeed the whole dashing buoyant drama—this passion survives
the shames and disillusionment wrought by twenty-five years of
tyranny and war; it persists even in those black but glorious hours
when he wrote the Troades and at the close of his life culminates in
the splendours of the Bacchæ. No attentive student of his work can
ignore this effect, but if we possessed all his plays we should be in
no danger of accepting the idea that Euripides is beyond all other
things a bitter realist. The Andromeda and the Phaethon would have
redressed the balance.
The wit of Euripides cannot easily be discussed; it often depends
upon idiomatic subtlety, and must almost disappear in translation.
But frequently, again, it consists in the method of handling a
situation. Just as the playwright often makes of his drama, among
other things, an elaborate reductio ad absurdum of myth, so is he
capable of writing a whole scene with a twinkle in his eye. The
clearest example is the Helena; Menelaus’ stupefaction at learning
that Egypt contains an Helen, daughter of Zeus, is indeed definite
comedy:

Διὸς δ’ ἔλεξε παῖδά νιν πεφυκέναι.


ἀλλ’ ἦ τις ἔστι Ζηνὸς ὄνομ’ ἔχων ἀνήρ
Νείλου παρ’ ὄχθας; εἷς γὰρ ὅ γε κατ’ οὐρανόν.[846]

“And she told me that the lady was a daughter of Zeus! What! is
there some person called Zeus living beside the Nile? There’s one in
Heaven, to be sure, but that’s another story.” Such a translation
gives perhaps the intention of the words and colloquial rhythm of
the last sentence. Here is comedy, but that of Congreve, not of
Aristophanes. The distinction is important. Euripides is less comic
than witty. As we turn his pages we rarely laugh, but a thousand
times we break into the slight smile of intellectual enjoyment; one
delight in reading an Euripidean play—tragedy though it be—is the
same as that aroused by the work of Meredith. Euripides’ sense of
the ludicrous is a part of his restlessness in conception. Again and
again he startles us by placing at some tragic moment a little
episode which passes the pathetic and becomes absurd. When
Clytæmnestra and Achilles bring each other into awkward perplexity
over the espousal of Iphigenia the effect is amusing, and the
intervention of the old slave who puts his head out of the tent-door
must provoke a smile, even though we realize that he has misery
and death on his lips.[847] After Creusa has given her instructions for
the assassination of Ion, it is, though natural, yet quaint for the
prospective murderer to reply: “Now do you retire to your hotel”.
[848] In the Medea the whole episode of Ægeus, to which Aristotle
objected as “irrational,” is tinged with the grotesque. That the
horrible story of Medea’s revenge must hang upon a slow-witted
amiable person like Ægeus is natural to the topsy-turviness of life as
the dramatist saw it. In fact, just as Euripides on the linguistic side
practically invents the prose-drama, so in the strictly dramatic sphere
he invents tragicomedy. Nothing can induce him to keep tears and
laughter altogether apart. The world is not made like that, and he
studies facts, depicting the phases of great happenings not as they
“ought to be” but “as they are”. He would have read with amused
delight that quaint sentence of Dostoevsky: “All these choruses sing
about something very indefinite, for the most part about somebody’s
curse, but with a tinge of the higher humour”.[849] It is indeed
significant that sparkles of incidental mirth are (so far as a modern
student can tell) commonest in that most heartbreaking play
Orestes. One dialogue between Orestes and Menelaus, to take a
single passage, is a blaze of wit—it exemplifies every possible grade
of witticism, from the downright pun[850] to subtle varieties of
iambic rhythm. Perhaps the most light-hearted and entertaining
example[851] is provided by Helen who (of all casuists!) evolves a
theory of sin as a method of putting her tigerish niece into good
humour and so inducing her to perform for Helen an awkward task.
Even more skilful, but ghastly in its half-farcical horror, is the
dialogue between Orestes and the escaped Phrygian slave.
Later ages of Greek civilization looked upon Euripides as a mighty
leader of thought, a great voice expressing all the wisdom of their
fathers, all the pains and perplexities familiar to themselves. After
generations had passed it was easy to dwell upon one side only of
his genius, and for Plutarch or Stobæus to regard him as the poet of
sad wisdom:—

Amongst us one,
Who most has suffer’d, takes dejectedly
His seat upon the intellectual throne;
And all his store of sad experience he
Lays bare of wretched days![852]

But his own contemporaries, living in the days before Ægospotami
and knowing the many facets of his spirit, could not so well accept a
man of such contradictions, who was in strange earnest about things
they felt to be indifferent, and who smiled at such odd moments.
Euripides must often have felt himself very lonely in Athens. “My
soul,” he cries, “lay not hold upon words of subtlety. Why admit
these strange high thoughts, if thou hast no peers for audience to
thy serious musings?”[853] And again:—

Though far beyond my ken a wise man dwell,
Across the earth I greet him for a friend.[854]

It may be that Europeans of our own day are better fitted to
estimate him aright than enthusiasts under the Empire or his
companions who saw him too close at hand. During the last half-
century we have witnessed great changes which have their
counterpart in the Athens for which he wrote. Hopes have been
realized only to prove disappointments and the source of fresh
perplexities. In England the spread of knowledge has resulted not in
a cultivated, but in a mentally restless people. Universal ability to
read has for its most obvious fruit not wider knowledge of literature,
but more newspapers and a rank jungle of “popular” writing.
Similarly at Athens the sophists had produced mental avidity where
there was no quickening of spiritual vigour to correspond. Another
fact of vital import has been the rise of our working-class to
solidarity and political power: it probably resembles that “demos”
which Cleon led more closely than “the masses” with which Peel or
Russell had to deal. Again, experience of war has shown how small
is the effect which settled government, social reform, and education
have exercised upon the raw, primitive, human instincts, both base
and noble. In Greece, the empire of Athens, with its tyranny and
selfishness, and the Peloponnesian war which had produced a
frightful corruption of conduct and ideals,[855] tainted society with
that cynicism (ἀναίδεια) of which Euripides so often speaks. Just as
we are severed by a wide gulf from the crude but not ignoble
certainty, the superficial worship of progress which marked the
Victorian era, so was Euripides severed from the “men of Marathon”
for whom Æschylus wrote.
So it is that we can judge the poet of “the Greek
enlightenment”[856]—or rather of the Athenian disillusionment—
better than most generations of his readers. To aid us, there have
naturally arisen writers to voice, in a manner often like his, our own
disappointment and our renewed interest in parts of life and the
world which we had ignored as unmeaning or barren. The
disinherited are coming into their own. Mr. Thomas Hardy has
written of the English peasant with a richness and profundity
unknown since Shakespeare. He offers indeed another interesting
analogy with Euripides: while the critics are concerned with his
“pessimism” he remains for an unsophisticated reader a splendid
witness to the majesty and charm of the immense slow curves of
life, the deep preciousness which glows from the gradual processes
of nature and that dignity of mere existence which survives all sin
and effort. Tess of the D’Urbervilles is the best modern parallel to
Hippolytus. Meanwhile M. Anatole France has given us many an
example of that ironical wit of which the Greek poet is so
consummate a master. Another Frenchman, Flaubert, has set as the
climax to his dazzling phantasy, La Tentation de St. Antoine, an
expression in un-Attic vehemence and elaboration of that passionate
sympathy with all existence which blazes in the lyrics of the Bacchæ
—a yearning which Arnold in the Scholar-Gipsy has uttered in milder
and still more haunting language.
There is no final synthesis of Euripides. Throughout his life he held
true to those two principles, the worship of beauty, and loyalty to
the dry light of intelligence. Glamour never blinded him to sin and
folly; misery and coarse tyranny never taught his lips to forswear the
glory of existence. One of his own noblest songs sets this
triumphantly before us[857]:—

οὐ παύσομαι τὰς Χάριτας
Μούσαις συγκαταμειγνύς,
ἁδίσταν συζυγίαν.
μὴ ζῴην μετ’ ἀμουσίας,
αἰεὶ δ’ ἐν στεφάνοισιν εἴην.

“I will not cease to mingle the Graces with the Muses—the sweetest
of fellowships. When the Muses desert me, let me die; may the
flower-garlands never fail me.” The Graces and the Muses—such is
his better way of invoking Beauty and Truth, the two fixed stars of
his life-long allegiance.
CHAPTER VI
METRE AND RHYTHM IN GREEK TRAGEDY

§ I. Introduction
Poetry is illuminating utterance consisting of words the successive
sounds of which are arranged according to a recurrent pattern. The
soul of poetry is this illumination, its body this recurrent pattern of
sounds; and it is with the body that we are now to deal. At the
outset we must distinguish carefully between rhythm and metre.
Rhythm is the recurrence just mentioned—the structure; metre is
the gathering together of sounds into masses upon which rhythm
shall do its work. Metre, so to put it, makes the bricks, while rhythm
makes the arch.
Greek metre is based, not upon stress-accent,[858] but upon
quantity—the length of time needed for the pronunciation of a
syllable. In English the line

My bosom’s lord sits lightly in his throne

is “scanned” (that is to say, marked off into “feet”—the metrical
units) as a series of five iambi; the iambus being a foot which
consists of an unaccented, followed by an accented, syllable. The
word “bosom’s” can stand where it does because the stress of the
voice naturally falls upon the first syllable of “bosom”; to begin a line
with “my seréne bosom” would clearly be wrong. The length of the
syllables has no effect on the scansion. That “sits” needs as long a
time for its utterance as the first syllable of “lightly” does not alter
the fact that “sits light-” is an accentual iambus.
Greek words, on the other hand, as metrical material, are
considered only from the quantitative point of view, not the
accentual. The voice-stress in the word λόγους rests upon the first
syllable, but the word is an iambus, a “short” followed by a “long”
(marked respectively thus ⏑–). Whereas an English blank verse
consists of five accentual iambi, e.g.
To ént|ertaín | divíne | Zenócr|até,
the corresponding verse of all the Greek dramatists is composed of
six feet each of which is theoretically a quantitative iambus, and
most of which actually are such. Thus Andromache, v. 241 is to be
scanned

⏑ – ⏑ – ⏑ – ⏑ – ⏑ – ⏑ –
τι δ ου | γυναιξ|ι ταυτ|α πρωτ|α παντ|αχου.

When is a syllable long and when short? A few rules will settle all
but a minority. All syllables are long—
(i) Which contain a necessarily long vowel (η or ω), e.g. μη̄ν, τω̄ν.
(ii) Which contain a diphthong or iota subscript, e.g. ο̅ι̅νος,
α̅ι̅νο̅υ̅μεν, ρᾳ̅διως; save that the first syllable of ποιῶ and τοιοῦτος
(and their parts) is often short.
(iii) Which end with a double consonant (ζ, ξ, ψ), e.g. ο̄ζος, ε̄ξω,
ε̄ψαυσα.
(iv) Which have the circumflex accent, e.g. υμῖ̅ν, μῦ̅ς.
Most syllables are long the vowel of which is followed by two
consonants. But there is some difficulty about this very frequent
case. It can arise in three ways:—
(a) Both consonants may be in the same word as the vowel. Then
the syllable is long, save when the consonants are (i) a voiced stop
(β, γ, δ) followed by ρ; or (ii) a voiceless stop or spirant (κ, π, τ; θ,
φ, χ) followed by a liquid or nasal (λ, ρ, μ, ν)—in both of which
cases the syllable can be counted long or short at pleasure. Thus
ε̄σμεν, μο̄ρφη, ᾱνδρος; but the first syllables of ιδρις, τεκνον, ποτμος
are “doubtful”—they can be either long or short as suits the poet.
(b) One of the consonants may end its word and the other begin
the next. Such syllables are all long. Thus, τηκτο̄ς μολυβδος, ανδρε̄ς
σοφοι, although both these long syllables are “short by nature” (see
below).
(c) Both consonants may occur at the beginning of the second
word. If the vowel is naturally short, the syllable is almost always
short, though such scansions as σε̄ κτενω are occasionally found.
But if the second word begins with a double consonant or σ followed
by another consonant, the syllable is always long. Thus ο̄ ξενος, τῑ
ζητεις, ταυτᾱ σκοπουμεν.
A vowel, naturally short, when thus lengthened is said to be
“lengthened by position.”
The following types of syllable are always short:—
(i) Those containing a naturally short vowel (ε or ο) not
lengthened by position, e.g. ε̆κων, ο̆λος.
(ii) Final α of the third declension neuter singular (σωμᾰ), third
declension accusative singular (ελπιδᾰ, δρασαντᾰ), and all neuters
plural (τᾰ, σωματᾰ, τοιαυτᾰ).
(iii) Final ι (e.g. εστῐ, τῐ), save, of course, when it is part of a
diphthong.
(iv) The accusative -ας of the third declension (ανδρᾰς,
πονουντᾰς). But μουσᾱς (first declension). The quantity in both
cases is that of the corresponding nominative.
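The following lines, offered purely by way of illustration and in no sense part of the ancient doctrine, put these rules into a short Python sketch; the Latin transliteration (e_ for η, o_ for ω; th, ph, kh for the aspirates), the name quantity, and the restriction to consonants lying within the same word (case (a) above) are all assumptions of the sketch, which also leaves accent, iota subscript, and the word-boundary cases out of account.

LONG_VOWELS = {"e_", "o_"}                      # eta, omega (rule i)
DIPHTHONGS = {"ai", "ei", "oi", "ui", "au", "eu", "ou"}   # rule ii
SHORT_VOWELS = {"e", "o"}                       # epsilon, omicron
DOUBLE_CONSONANTS = {"z", "x", "ps"}            # zeta, xi, psi (rule iii)
VOICED_STOPS = {"b", "g", "d"}
VOICELESS_OR_ASPIRATE = {"k", "p", "t", "th", "ph", "kh"}
LIQUID_OR_NASAL = {"l", "r", "m", "n"}

def quantity(vowel, following):
    """Quantity of a syllable: `vowel` is its (transliterated) vowel,
    `following` the consonants that follow it within the same word.
    Accent, iota subscript, and word-boundary cases are ignored."""
    if vowel in LONG_VOWELS or vowel in DIPHTHONGS:
        return "long"
    if following and following[0] in DOUBLE_CONSONANTS:
        return "long"                           # e.g. the first syllable of οζος
    if len(following) >= 2:
        first, second = following[0], following[1]
        if first in VOICED_STOPS and second == "r":
            return "doubtful"                   # e.g. the first syllable of ιδρις
        if first in VOICELESS_OR_ASPIRATE and second in LIQUID_OR_NASAL:
            return "doubtful"                   # e.g. τεκνον, ποτμος
        return "long"                           # lengthened by position, e.g. εσμεν
    if vowel in SHORT_VOWELS:
        return "short"                          # e.g. the first syllable of ολος
    return "doubtful"                           # α, ι, υ vary with the word

# quantity("e", ["s", "m"]) -> "long"; quantity("e", ["k", "n"]) -> "doubtful";
# quantity("o", [])         -> "short".
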
Hiatus is practically unknown. That is, a word ending in a vowel is
not to be followed by a word beginning with a vowel, unless one
vowel or the other disappears. Almost always it is the first vowel
which is thus cut off, the process being called “elision.” In verse one
would not write πάντα εἶπε, but πάντ’ εἶπε; not ἔτι εἶναι, but ἔτ’ εἶναι.
When the first vowel is long and the second short, the latter is cut
off by “prodelision,” a much rarer occurrence. Thus τούτῳ ἀνεῖπε
would become τούτῳ ’νεῖπε. Two long vowels, as in καλὴ ἡμέρα, are
not used together at all. But the rule as to hiatus does not normally
apply at the end of a verse; usually one can end a verse with an
unelided vowel and begin the next with a vowel. If in any metrical
scheme this liberty is not allowed, it is said that “synapheia[859]
prevails.”
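The commonest of these adjustments, elision of a short final vowel, can be put in the same illustrative fashion; every name in the little sketch below is an assumption of the illustration, it works on transliterated words, and long final vowels, diphthongal endings, and prodelision are left untouched.

ELIDABLE_FINALS = ("a", "e", "o", "i")      # short final vowels liable to elision

def join(first_word, second_word):
    """Elide the final short vowel of `first_word` before an initial vowel,
    as in παντα ειπε -> παντ' ειπε; otherwise return the pair unchanged.
    Long finals, diphthongal endings, and prodelision are not handled."""
    if first_word.endswith(ELIDABLE_FINALS) and second_word[0] in "aeiou":
        return first_word[:-1] + "'", second_word
    return first_word, second_word

# join("panta", "eipe") -> ("pant'", "eipe")
# join("eti", "einai")  -> ("et'", "einai")
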
We are now in a position to discuss the various metres to be found
in Greek Tragedy.

§ II. The Iambic Metre


Practically all the dialogue and speeches are written in this metre.
The student would do well to grow thoroughly accustomed to
reading these aloud with correct quantities before he attempts the
others.
The iambic line consists of six feet, any one of which may be an
iambus. But a “pure” iambic line, one in which every foot is an
iambus, as in Andromache, v. 241 (see above), is very rare. A
speech written solely in such feet would be highly monotonous and
far too rapid. Other feet are therefore allowed, under restrictions, to
take the place of the iambus.
By far the commonest of these is the spondee, which consists of
two long syllables (λο̄γχη̄, πᾱντω̄ν). This can occur in the first, third,
or fifth places—one, two, or all three. Thus:—

– – ⏑– ⏑ – ⏑ – – – ⏑ –
δησαι | βιᾳ | φαραγγ|ι προς | δυσχειμ|ερῳ (Prom.
Vinctus, 15).

– – ⏑ – – – ⏑ – ⏑– ⏑ –
ω τεκν|α Καδμ|ου του | παλαι | νεα | τροφη (Œd.
Tyr., 1).
Next, the lightness and variety is often greatly increased by the
use of “resolved”[860] (or broken-up) feet. Each long syllable being
regarded as equal to two “shorts,” it follows that the iambus can be
“resolved” into ⏑⏑⏑, the spondee into –⏑⏑, ⏑⏑– (and ⏑⏑⏑⏑, but this
last is not employed in iambics).
Of these three the tribrach (⏑⏑⏑) is much the most frequent. As it
corresponds to the iambus, it can occur in any place, save the sixth;
it is exceedingly rare in the fifth place:—

– – ⏑ ⏑ ⏑ – – ⏑ – – – ⏑ –
φαιδρωπ|ον εδιδ|ου τοισ|ιν Αιγ|ισθου | φιλοις
(Orestes, 894).

⏑ – ⏑ – ⏑ – ⏑ ⏑ ⏑ – – ⏑–
περιξ | εγω | καλυψ|α βοτρυ|ωδει | χλοῃ (Bacchæ,
12).

The dactyl (–⏑⏑) is allowed in those places to which the spondee is
admitted, save the fifth (just as the tribrach is excluded from the
sixth). Thus:—

– – ⏑ – – ⏑ ⏑ ⏑ – – – ⏑ –
ου φασ|ι πρωτ|ον Δανα|ον Αιγ|υπτῳ | δικας
(Orestes, 872).

⏑ – ⏑ – – ⏑ ⏑ ⏑ – – – ⏑ –
λογους | ελισσ|ων οτι | καθιστ|αιη | νομους
(Ibid., 892).

It is rare in the first foot.


Least common of all is the anapæst (⏑⏑–), which appears only in
the first foot, unless it is contained entirely in a proper name, when
it can occur in any place save the sixth. This license is due to
necessity: such a name as Ἀντῐγόνη could not otherwise be
introduced into iambics at all. Examples:—
⏑ ⏑ – ⏑– ⏑ – ⏑ – – – ⏑ –
στεφανους | δρυος | τε μιλ|ακος τ|ανθεσφ|ορου
(Bacchæ, 703).

– – ⏑ – ⏑ – ⏑ – ⏑⏑ – ⏑ –
δεσποιν|α γαρ | κατ οικ|ον Ερμ|ιονην | λεγω
(Androm., 804).

Occasionally a line is to be found with two or even three resolved
feet:—

– – ⏑ ⏑ ⏑ – ⏑ ⏑ ⏑ – ⏑ – ⏑ –
λουτροισ|ιν αλοχ|ου περι|πεσων | πανυστ|ατοις
(Orestes, 367).

– ⏑ ⏑ ⏑ – – ⏑ ⏑ ⏑ – ⏑ – ⏑ –
μητερα | το σωφρ|ον τ ελαβ|εν αντ|ι συμφ|ορας
(Ibid., 502).

⏑ ⏑ – ⏑ ⏑ ⏑ – ⏑ ⏑ ⏑ – ⏑ – ⏑ –
αναδελφ|ος απατ|ωρ αφιλ|ος ει | δε σοι | δοκει
(Ibid., 310).
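
Before passing to two further licenses it may be convenient to gather these placements into one view. The sketch below is again an illustration only: the notation (u for a short, - for a long) and the name is_tragic_trimeter are assumptions, the proper-name anapæst and the rules of cæsura are ignored, and the sixth place admits u u because, as noted immediately below, the final syllable may stand short.

IAMBUS, SPONDEE, TRIBRACH, DACTYL, ANAPAEST = "u-", "--", "uuu", "-uu", "uu-"

ALLOWED = {
    1: {IAMBUS, SPONDEE, TRIBRACH, DACTYL, ANAPAEST},
    2: {IAMBUS, TRIBRACH},
    3: {IAMBUS, SPONDEE, TRIBRACH, DACTYL},
    4: {IAMBUS, TRIBRACH},
    5: {IAMBUS, SPONDEE, TRIBRACH},      # the tribrach here is exceedingly rare
    6: {IAMBUS, "uu"},                   # the final syllable may stand short
}

def is_tragic_trimeter(feet):
    """True if `feet` (six strings such as "--" or "uuu") respect the
    placements allowed above; the proper-name anapaest and the rules of
    caesura are not checked."""
    return len(feet) == 6 and all(
        foot in ALLOWED[place] for place, foot in enumerate(feet, start=1)
    )

# Prometheus Vinctus 15, as scanned above:
# is_tragic_trimeter(["--", "u-", "u-", "u-", "--", "u-"])   -> True
# A spondee in the second place is rejected:
# is_tragic_trimeter(["--", "--", "u-", "u-", "--", "u-"])   -> False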

Two licenses should be noted. The last syllable of the line may be
short; no doubt the pause[861] at the end was felt to help it out.
Lines of this kind are innumerable, e.g.:—

⏑⏑
Κρατος Βια τε σφῳν μεν εντολη | Διος (Prom.
Vinctus, 12)

(which is followed by a vowel—ἔχει). It matters little whether such
syllables are marked as short, as long, or with the sign of doubtful
quantity (⏓). Next, synizesis (συνίζησις, “collapse”) occurs now and
then—two syllables coalesce and are scanned as one, e.g. μ̅η̅ ̅ ο̅υ̅,
πολε̅ω̅ς:—
– – ⏑ – – – ⏑ – – – ⏑ –
αλλ εα | με και | την εξ | εμου | δυσβουλ|ιαν
(Antigone, 95).

– ⏑ – – ⏑ ⏑ ⏑ – ⏑ – ⏑ –
ως μ̅η̅ ̅ε̅ι̅δ̅|οθ ητ|ις μ ετεκ|εν εξ | οτου τ|
εφυν (Ion, 313).

– – ⏑ – – – ⏑ – ⏑ – ⏑ ⏑
σφαζ αιμ|ατου | θεας βωμ|ον η | μετεισ|ι σε
(Andromache, 260).

(Synizesis is specially common in the various cases of θεός and θεά.)


Finally, two important rules of rhythm remain to be stated.
First, there must be a “cæsura”[862] in either the third or the
fourth foot. A cæsura is a gap between words in the middle of a
foot. Either the third foot, then, or the fourth must consist partly of
one word, partly of another. It is indicated in scansion by the sign ‖.
Many verses have this necessary cæsura in the third foot only, e.g.:

⏑ – ⏑ – – – ⏑ – – – ⏑ ⏑
απανθ | ο μακρ|ος ‖ καν|αριθμ|ητος | χρονος (Ajax,
646).

Many show it in the fourth only:—

– – ⏑ – ⏑ – ⏑ – – – ⏑ ⏑
προς τησδ|ε της | γυναικ|ος ‖ οικτ|ειρω | δε νιν
(Ibid., 652).

A still larger number have cæsura in both places:—

– – ⏑ – – – ⏑ – – – ⏑ ⏑
φρουρας | ετει|ας ‖ μηκ|ος ‖ ην | κοιμωμ|ενος
