Bayesian Optimization
Theory and Practice Using Python

Peng Liu
Bayesian Optimization: Theory and Practice Using Python
Peng Liu
Singapore, Singapore

ISBN-13 (pbk): 978-1-4842-9062-0
ISBN-13 (electronic): 978-1-4842-9063-7


https://doi.org/10.1007/978-1-4842-9063-7

Copyright © 2023 by Peng Liu


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the
material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation,
broadcasting, reproduction on microfilms or in any other physical way, and transmission or information
storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now
known or hereafter developed.
Trademarked names, logos, and images may appear in this book. Rather than use a trademark symbol with
every occurrence of a trademarked name, logo, or image we use the names, logos, and images only in an
editorial fashion and to the benefit of the trademark owner, with no intention of infringement of the
trademark.
The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not
identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to
proprietary rights.
While the advice and information in this book are believed to be true and accurate at the date of publication,
neither the authors nor the editors nor the publisher can accept any legal responsibility for any errors or
omissions that may be made. The publisher makes no warranty, express or implied, with respect to the
material contained herein.
Managing Director, Apress Media LLC: Welmoed Spahr
Acquisitions Editor: Celestin Suresh John
Development Editor: Laura Berendson
Coordinating Editor: Mark Powers
Cover designed by eStudioCalamar
Cover image by Luemen Rutkowski on Unsplash (www.unsplash.com)
Distributed to the book trade worldwide by Apress Media, LLC, 1 New York Plaza, New York, NY 10004,
U.S.A. Phone 1-800-SPRINGER, fax (201) 348-4505, e-mail [email protected], or visit
www.springeronline.com. Apress Media, LLC is a California LLC and the sole member (owner) is Springer
Science + Business Media Finance Inc (SSBM Finance Inc). SSBM Finance Inc is a Delaware corporation.
For information on translations, please e-mail [email protected]; for reprint,
paperback, or audio rights, please e-mail [email protected].
Apress titles may be purchased in bulk for academic, corporate, or promotional use. eBook versions and
licenses are also available for most titles. For more information, reference our Print and eBook Bulk Sales
web page at http://www.apress.com/bulk-sales.
Any source code or other supplementary material referenced by the author in this book is available to
readers on GitHub (https://github.com/Apress). For more detailed information, please visit
http://www.apress.com/source-code.
Printed on acid-free paper
For my wife Zheng and children Jiaxin, Jiaran, and Jiayu.
Table of Contents

About the Author ......... ix
About the Technical Reviewer ......... xi
Acknowledgments ......... xiii
Introduction ......... xv

Chapter 1: Bayesian Optimization Overview ......... 1
    Global Optimization ......... 2
        The Objective Function ......... 4
        The Observation Model ......... 8
    Bayesian Statistics ......... 11
        Bayesian Inference ......... 11
        Frequentist vs. Bayesian Approach ......... 14
        Joint, Conditional, and Marginal Probabilities ......... 15
        Independence ......... 18
        Prior and Posterior Predictive Distributions ......... 19
        Bayesian Inference: An Example ......... 23
    Bayesian Optimization Workflow ......... 26
        Gaussian Process ......... 26
        Acquisition Function ......... 29
        The Full Bayesian Optimization Loop ......... 30
    Summary ......... 31

Chapter 2: Gaussian Processes ......... 33
    Reviewing the Gaussian Basics ......... 36
        Understanding the Covariance Matrix ......... 37
        Marginal and Conditional Distribution of Multivariate Gaussian ......... 39
        Sampling from a Gaussian Distribution ......... 40
    Gaussian Process Regression ......... 43
        The Kernel Function ......... 43
        Extending to Other Variables ......... 46
        Learning from Noisy Observations ......... 49
    Gaussian Process in Practice ......... 50
        Drawing from GP Prior ......... 50
        Obtaining GP Posterior with Noise-Free Observations ......... 55
        Working with Noisy Observations ......... 57
        Experimenting with Different Kernel Parameters ......... 59
        Hyperparameter Tuning ......... 61
    Summary ......... 66

Chapter 3: Bayesian Decision Theory and Expected Improvement ......... 69
    Optimization via the Sequential Decision-Making ......... 70
        Seeking the Optimal Policy ......... 72
        Utility-Driven Optimization ......... 74
        Multi-step Lookahead Policy ......... 76
        Bellman’s Principle of Optimality ......... 79
    Expected Improvement ......... 82
        Deriving the Closed-Form Expression ......... 83
        Implementing the Expected Improvement ......... 86
        Using Bayesian Optimization Libraries ......... 96
    Summary ......... 98

Chapter 4: Gaussian Process Regression with GPyTorch ......... 101
    Introducing GPyTorch ......... 101
        The Basics of PyTorch ......... 102
        Revisiting GP Regression ......... 104
        Building a GP Regression Model ......... 105
        Fine-Tuning the Length Scale of the Kernel Function ......... 111
        Fine-Tuning the Noise Variance ......... 117
    Delving into Kernel Functions ......... 119
        Combining Kernel Functions ......... 122
        Predicting Airline Passenger Counts ......... 124
    Summary ......... 129

Chapter 5: Monte Carlo Acquisition Function with Sobol Sequences and Random Restart ......... 131
    Analytic Expected Improvement Using BoTorch ......... 131
        Introducing Hartmann Function ......... 132
        GP Surrogate with Optimized Hyperparameters ......... 134
        Introducing the Analytic EI ......... 135
        Optimization Using Analytic EI ......... 138
        Grokking the Inner Optimization Routine ......... 140
    Using MC Acquisition Function ......... 148
        Using Monte Carlo Expected Improvement ......... 150
    Summary ......... 153

Chapter 6: Knowledge Gradient: Nested Optimization vs. One-Shot Learning ......... 155
    Introducing Knowledge Gradient ......... 156
        Monte Carlo Estimation ......... 158
        Optimizing Using Knowledge Gradient ......... 161
    One-Shot Knowledge Gradient ......... 167
        Sample Average Approximation ......... 167
        One-Shot Formulation of KG Using SAA ......... 169
        One-Shot KG in Practice ......... 171
        Optimizing the OKG Acquisition Function ......... 178
    Summary ......... 184

Chapter 7: Case Study: Tuning CNN Learning Rate with BoTorch ......... 185
    Seeking Global Optimum of Hartmann ......... 186
        Generating Initial Conditions ......... 187
        Updating GP Posterior ......... 188
        Creating a Monte Carlo Acquisition Function ......... 190
        The Full BO Loop ......... 193
    Hyperparameter Optimization for Convolutional Neural Network ......... 198
        Using MNIST ......... 199
        Defining CNN Architecture ......... 203
        Training CNN ......... 209
        Optimizing the Learning Rate ......... 212
        Entering the Full BO Loop ......... 215
    Summary ......... 222

Index ......... 225
About the Author
Peng Liu is an assistant professor of quantitative finance
(practice) at Singapore Management University and an
adjunct researcher at the National University of Singapore.
He holds a Ph.D. in Statistics from the National University
of Singapore and has ten years of working experience as a
data scientist across the banking, technology, and hospitality
industries.

About the Technical Reviewer
Jason Whitehorn is an experienced entrepreneur and
software developer and has helped many companies
automate and enhance their business solutions through data
synchronization, SaaS architecture, and machine learning.
Jason obtained his Bachelor of Science in Computer Science
from Arkansas State University, but he traces his passion
for development back many years before then, having first
taught himself to program BASIC on his family’s computer
while in middle school. When he’s not mentoring and
helping his team at work, writing, or pursuing one of his
many side-projects, Jason enjoys spending time with his wife and four children and
living in the Tulsa, Oklahoma, region. More information about Jason can be found on his
website: https://jason.whitehorn.us.

Acknowledgments
This book summarizes my learning journey in Bayesian optimization during my
(part-time) Ph.D. study. It started as a personal interest in exploring this area and
gradually grew into a book combining theory and practice. For that, I thank my
supervisors, Teo Chung Piaw and Chen Ying, for their continued support in my
academic career.

Introduction
Bayesian optimization provides a unified framework that solves the problem of
sequential decision-making under uncertainty. It includes two key components: a
surrogate model approximating the unknown black-box function with uncertainty
estimates and an acquisition function that guides the sequential search. This book
reviews both components, covering both theoretical introduction and practical
implementation in Python, building on top of popular libraries such as GPyTorch and
BoTorch. In addition, the book provides case studies on using Bayesian optimization
to seek a simulated function's global optimum or to locate the best hyperparameters (e.g.,
the learning rate) when training deep neural networks. The book assumes only a
minimal understanding of model development and machine learning and targets the
following audiences:

• Students in the field of data science, machine learning, or optimization-related fields

• Practitioners, such as data scientists both early and midway in their careers, who build machine learning models with well-performing hyperparameters

• Hobbyists who are interested in Bayesian optimization as a global optimization technique for seeking the optimal solution as fast as possible

All source code used in this book can be downloaded from github.com/apress/Bayesian-optimization.

CHAPTER 1

Bayesian Optimization Overview
As the name suggests, Bayesian optimization is an area that studies optimization
problems using the Bayesian approach. Optimization aims at locating the optimal
objective value (i.e., the global maximum or minimum) among all possible values, or the
corresponding location of that optimum within the search domain (the environment). The
search process starts at a specific initial location and follows a particular policy to
iteratively determine the next sampling location, collect new observations, and refresh
the guiding policy.
As shown in Figure 1-1, the overall optimization process consists of repeated
interactions between the policy and the environment. The policy is a mapping function
that takes in a new input observation (plus historical ones) and outputs the following
sampling location in a principled way. Here, we are constantly learning and improving
the policy, since a good policy guides our search toward the global optimum more
efficiently and effectively; put differently, a good policy spends the limited sampling
budget only on promising candidate locations. The environment, on the other hand, contains
the unknown objective function to be learned by the policy within a specific boundary.
When probing the functional value as requested by the policy, the actual observation
revealed by the environment to the policy is often corrupted by noise, making learning
even more challenging. Thus, Bayesian optimization, a specific approach for global
optimization, would like to learn a policy that can help us efficiently and effectively
navigate to the global optimum of an unknown, noise-corrupted environment as quickly
as possible.


Figure 1-1. The overall Bayesian optimization process. The policy digests the
historical observations and proposes the new sampling location. The environment
governs how the (possibly noise-corrupted) observation at the newly proposed
location is revealed to the policy. Our goal is to learn an efficient and effective
policy that could navigate toward the global optimum as quickly as possible

Global Optimization
Optimization aims to locate the optimal set of parameters of interest across the whole
domain through carefully allocating limited resources. For example, when searching
for the car key at home before leaving for work in two minutes, we would naturally start
with the most promising place, where we usually put the key. If it is not there, we think
for a moment about other possible locations and move on to the next most promising
place. This process iterates until the key is found. In this example, the policy digests
the available information from previous searches and proposes the next promising
location. The environment is the house itself, revealing whether the key is at the
proposed location upon each sampling.
This is considered an easy example, since we are familiar with the environment's
structural design. However, imagine locating an item in a totally new environment. The
policy would need to account for the uncertainty arising from unfamiliarity with the
environment while sequentially determining the next sampling location. When the
sampling budget is limited, as is often the case in real-life searches in terms of time
and resources, the policy needs to reason carefully about the utility of each candidate
sampling location.
Let us formalize the sequential global optimization using mathematical terms. We
are dealing with an unknown scalar-valued objective function f based on a specific
domain Α. In other words, the unknown subject of interest f is a function that maps a
certain sample in Α to a real number in ℝ, that is, f : Α → ℝ. We typically place no specific
assumption about the nature of the domain Α other than that it should be a bounded,
compact, and convex set.
Unless otherwise specified, we focus on the maximization setting instead of
minimization since maximizing the objective function is equivalent to minimizing the
negated objective, and vice versa. The optimization procedure thus aims at locating
the global maximum f ∗ or its corresponding location x∗ in a principled and systematic
manner. Mathematically, we wish to locate f ∗ where

f∗ = max_{x ∈ Α} f(x) = f(x∗)

Or equivalently, we are interested in its location x∗ where

x∗ = argmax_{x ∈ Α} f(x)

Figure 1-2 provides an example one-dimensional objective function with its global
maximum f ∗ and its location x∗ highlighted. The goal of global optimization is thus to
systematically reason about a series of sampling decisions within the total search space
Α, so as to locate the global maximum as fast as possible, that is, sampling as few times
as possible.
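To make the notation concrete, here is a minimal Python sketch on an assumed one-dimensional objective; the function and the domain Α = [0, 2] are illustrative choices for this example, not taken from the book. The exhaustive grid below is only feasible because this toy evaluation is cheap and exact:

```python
import numpy as np

# Illustrative objective f and domain A = [0, 2]; both are assumptions
# made purely for this sketch.
def f(x):
    return np.sin(3 * x) + 0.5 * np.cos(5 * x)

xs = np.linspace(0.0, 2.0, 10_001)  # dense grid over the domain A
ys = f(xs)

x_star = xs[np.argmax(ys)]  # x* = argmax_{x in A} f(x)
f_star = ys.max()           # f* = max_{x in A} f(x) = f(x*)
print(x_star, f_star)
```

In a real problem, each evaluation of f is expensive and noisy, so this kind of exhaustive probing is exactly what a sequential sampling policy is meant to avoid.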


Figure 1-2. An example objective function with the global maximum and its
location marked with star. The goal of global optimization is to systematically
reason about a series of sampling decisions so as to locate the global maximum as
fast as possible

Note that this is a nonconvex function, as is often the case for real-life functions we
are optimizing. Nonconvexity means we cannot rely on first-order gradient-based
methods to reliably search for the global optimum, since they will likely converge to
a local optimum. Handling such functions is one of the advantages of Bayesian
optimization compared with gradient-based optimization procedures.
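As a hedged illustration of that failure mode, consider the made-up "double-well" quartic below (not a function from the book): plain gradient ascent started in the wrong basin settles on an inferior local maximum and never finds the global one.

```python
import numpy as np

# Invented double-well objective, maximized. It has a local maximum near
# x = -0.93 and a higher global maximum near x = 1.06.
def f(x):
    return -(x**2 - 1)**2 + 0.5 * x

def grad_f(x):
    return -4 * x**3 + 4 * x + 0.5  # derivative of f

x = -1.5  # starting point inside the suboptimal basin
for _ in range(500):
    x += 0.01 * grad_f(x)  # plain first-order gradient ascent

# A dense grid reveals the better optimum the ascent never reaches.
xs = np.linspace(-2.0, 2.0, 10_001)
print(f(x), f(xs).max())  # value at the local maximum vs. the global value
```

Gradient ascent converges to the stationary point of whichever basin it starts in; only a global method can reason about the rest of the domain.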

The Objective Function


There are different types of objective functions. For example, some functions are wiggly
shaped, while others are smooth; some are convex, while others are nonconvex. An
objective function is an unknown object to us; the problem would be considered solved
if we could access its underlying mathematical form. Many complex functions are almost
impossible to express in an explicit form. For Bayesian optimization, the objective
function typically bears the following attributes:

• We do not have access to the explicit expression of the objective function, making it a “black-box” function. This means that we can only interact with the environment, that is, the objective function, to perform a functional evaluation by sampling at a specific location.

• The value returned by probing at a specific location is often corrupted by noise and does not represent the exact true value of the objective function at that location. Because of this indirect evaluation, we need to account for the noise embedded in the actual observations from the environment.

• Each functional evaluation is costly, ruling out exhaustive probing. We need a sample-efficient method that minimizes the number of evaluations of the environment while trying to locate its global optimum. In other words, the optimizer needs to fully utilize the existing observations and systematically reason about the next sampling decision so that the limited resource is spent on promising locations.

• We do not have access to the gradient. When functional evaluation is relatively cheap and the functional form is smooth, it is convenient to compute the gradient and optimize using a first-order procedure such as gradient descent. Access to the gradient lets us understand the curvature adjacent to a particular evaluation point; with gradient evaluations, the follow-up direction of travel is easier to determine.

The “black-box” function is challenging to optimize for the preceding reasons. To
further illustrate the possible functional forms of the objective, Figure 1-3 lists three
examples. On the left is a convex function with a single global minimum; this is
considered easy for global optimization. In the middle is a nonconvex function with
multiple local optima, where it is difficult to ascertain whether the current local
optimum is also globally optimal. On the right is a nonconvex function with a wide
flat region full of saddle points, where it is difficult to tell a flat region apart from a
local optimum. All three scenarios are in a minimization setting.


Figure 1-3. Three possible functional forms. On the left is a convex function whose
optimization is easy. In the middle is a nonconvex function with multiple local
minima, and on the right is also a nonconvex function with a wide flat region full
of saddle points. Optimization for the latter two cases takes a lot more work than
for the first case

Let us look at one example of hyperparameter tuning when training machine


learning models. A machine learning model is a function that involves a set of
parameters to be optimized given the input data. These parameters are automatically
tuned via a specific optimization procedure, typically governed by a set of corresponding
meta parameters called hyperparameters, which are fixed before the model training
starts. For example, when training deep neural networks using the gradient descent
algorithm, a learning rate that determines the step size of each parameter update needs
to be manually selected in advance. If the learning rate is too large, the model may
diverge and eventually fails to learn. If the learning rate is too small, the model may
converge very slowly as the weights are updated by only a small margin in this iteration.
See Figure 1-4 for a visual illustration.


Figure 1-4. Slow convergence due to a small learning rate on the left and
divergence due to a large learning rate on the right

Choosing a reasonable learning rate as a preset hyperparameter thus plays a critical
role in training a good machine learning model. Locating the best learning rate and
other hyperparameters is an optimization problem that fits Bayesian optimization. In
the case of hyperparameter tuning, evaluating each learning rate is a time-consuming
exercise. The objective function would generally be the model’s final test set loss (in
a minimization setting) upon model convergence. A model needs to be fully trained
to obtain a single evaluation, which typically involves hundreds of epochs to reach
stable convergence. Here, one epoch is a complete pass of the entire training dataset.
The book’s last chapter covers a case study on tuning the learning rate using Bayesian
optimization.
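The notion of a functional evaluation above can be made concrete: the objective is a function that trains a model from scratch and returns its test-set loss. The dataset, network size, and iteration count in this sketch are our own illustrative choices (scikit-learn is used purely for brevity):

```python
from sklearn.datasets import load_digits
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

def objective(learning_rate):
    """One functional evaluation = one full training run; returns test-set loss."""
    model = MLPClassifier(hidden_layer_sizes=(32,),
                          learning_rate_init=learning_rate,
                          max_iter=50, random_state=0)
    model.fit(X_train, y_train)  # the expensive part: a complete training run
    return log_loss(y_test, model.predict_proba(X_test), labels=model.classes_)

print(objective(1e-3))  # each call retrains the model from scratch
```

Every call to `objective` pays the full training cost, which is exactly why the number of evaluations must be kept small.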
The functional form of the test set loss or accuracy may also be highly nonconvex
and multimodal in the hyperparameters. Upon convergence, it is not easy to know
whether we have landed in a local optimum, a saddle point, or the global optimum. Besides,
some hyperparameters may be discrete, such as the number of nodes and layers when
training a deep neural network. We cannot calculate gradients with respect to such
hyperparameters, since differentiation requires continuous support in the domain.
The Bayesian optimization approach is designed to tackle all these challenges. It
has been shown to deliver good performance in locating the best hyperparameters
under a limited budget (i.e., the number of evaluations allowed). It is also widely and
successfully used in other fields, such as chemical engineering.
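To make the overall loop concrete, here is a minimal, hand-rolled sketch of Bayesian optimization on a toy 1-D function, using a Gaussian process surrogate and the expected improvement acquisition function maximized over a fixed grid. The test function, kernel, and budget are our own choices; production code would typically rely on a dedicated library instead:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def f(x):
    """Toy black-box objective (to be minimized)."""
    return np.sin(3 * x) + 0.1 * x ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(3, 1))        # small initial design
y = f(X).ravel()
grid = np.linspace(-3, 3, 200).reshape(-1, 1)
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5),
                              alpha=1e-6, normalize_y=True)

for _ in range(10):                         # limited evaluation budget
    gp.fit(X, y)                            # surrogate model of f
    mu, sigma = gp.predict(grid, return_std=True)
    best = y.min()
    z = (best - mu) / np.maximum(sigma, 1e-9)
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement
    x_next = grid[np.argmax(ei)]            # maximize the acquisition function
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next[0]))          # one expensive evaluation

print(X[np.argmin(y)][0], y.min())          # best input and value found
```

Note that the loop never touches gradients of f: it only needs pointwise evaluations, which is what makes the approach applicable to noisy, nonconvex, or discrete settings.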


Next, we will delve into the various components of a typical Bayesian optimization
setup, including the observation model, the optimization policy, and Bayesian
inference.

The Observation Model


Earlier, we mentioned that a functional evaluation gives an observation of
the true objective function, and the observation will likely differ from the true
objective value due to noise. The observations gathered for policy learning are
thus inexact, corrupted by an additional noise term that is often assumed to be
additive. The observation model is a way to formalize the relationship between
the true objective function, the actual observation, and the noise. It governs how
observations are revealed from the environment to the policy.
Figure 1-5 illustrates a set of observations of the underlying objective function. These
observations are displaced from the objective function by additive random noise,
which manifests as vertical shifts between the actual observations and
the underlying objective function. Because of these noise-induced deviations, we
need to account for such uncertainty in the observation model. When
learning a policy based on the actual observations, the policy also needs to be robust
enough to focus on the objective function’s underlying pattern and not be distracted by
the noise. The model we use to approximate the objective function, while accounting
for uncertainty due to the additive noise, is typically a Gaussian process. We will cover it
briefly in this chapter and in more detail in the next chapter.
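A minimal sketch of such an additive-noise observation model follows; the sine objective and the noise level σ = 0.2 are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
SIGMA = 0.2                      # standard deviation of the additive noise

def f(x):
    """Underlying objective, unknown to the optimizer."""
    return np.sin(x)

def observe(x, n=1):
    """Observation model: y = f(x) + eps, with eps ~ N(0, SIGMA**2)."""
    return f(x) + rng.normal(0.0, SIGMA, size=n)

x0 = 1.0
print(f(x0))                         # true objective value
print(observe(x0)[0])                # a single noisy observation
print(observe(x0, n=10_000).mean())  # averaging many observations recovers f(x0)
```

A single observation is shifted vertically away from f(x) by the noise draw, while the average of many repeated observations concentrates around the true value, which is the behavior a Gaussian process model with a noise term is built to capture.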
