Lei Cheng · Zhongtao Chen · Yik-Chung Wu

Bayesian Tensor Decomposition for Signal Processing and Machine Learning

Modeling, Tuning-Free Algorithms, and Applications

Lei Cheng
College of Information Science and Electronic Engineering
Zhejiang University
Hangzhou, China

Zhongtao Chen
Department of Electrical and Electronic Engineering
The University of Hong Kong
Hong Kong, China

Yik-Chung Wu
Department of Electrical and Electronic Engineering
The University of Hong Kong
Hong Kong, China
© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature
Switzerland AG 2023
This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether
the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse
of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and
transmission or information storage and retrieval, electronic adaptation, computer software, or by similar
or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors, and the editors are safe to assume that the advice and information in this book
are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or
the editors give a warranty, expressed or implied, with respect to the material contained herein or for any
errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.
This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface
Our world is full of data, and these data often appear in high-dimensional structures, with each dimension describing a unique attribute. Examples include data in the social sciences, medicine, pharmacology, and environmental monitoring, to name just a few. To make sense of multi-dimensional data, advanced computational tools, which work directly with tensors rather than first converting a tensor to a matrix, are needed to unveil the hidden patterns of the data. This is where tensor decomposition models come into play. Due to their remarkable representation capability, tensor decomposition models have led to state-of-the-art performance in many domains, including social network mining, image processing, array signal processing, and wireless communications.
Previous research on tensor decompositions has mainly approached them from an optimization perspective, which unfortunately offers no built-in capability for tensor rank learning and requires heavy hyper-parameter tuning. While these two tasks are important for complexity control and avoiding overfitting, they are often overlooked or downplayed in current research, assumed to be achievable by trivial operations, or somehow obtainable from other methods. In reality, estimating the tensor rank and a proper set of hyper-parameters usually involves an exhaustive search. This requires running the same algorithm many times, effectively increasing the computational cost of actual model deployment.
Another path for model learning is Bayesian methods. They provide a natural recipe for integrating tensor rank learning, automatic hyper-parameter determination, and tensor decomposition. Owing to this unique capability, Bayesian models and inference have triggered recent interest in tensor decompositions for signal processing and machine learning. These recent works show that Bayesian models achieve comparable or even better performance than their optimization-based counterparts.
However, Bayesian methods are very different from optimization methods: the former learn distributions of the unknown parameters, while the latter learn a point estimate. The processes of building the models and deriving the inference algorithms are fundamentally different as well. This creates a barrier between two groups of researchers working on similar problems but starting from different
perspectives. This book aims to distill the essentials of Bayesian modeling and infer-
ence in tensor research, and present a unified view of various models. The book
addresses the needs of postgraduate students, researchers, and practicing engineers
whose interests lie in tensor signal processing and machine learning. It can be used
as a textbook for short courses on specific topics, e.g., tensor learning methods,
Bayesian learning, and multi-dimensional data analytics. Demo codes can be downloaded from https://github.com/leicheng-tensor/Reproducible-Bayesian-Tensor-Matrix-Machine-Learning-SOTA. It is our hope that by lowering the barrier to under-
standing and entering the Bayesian landscape, more ideas and novel algorithms can
be stimulated and facilitated in the research community.
This book starts by reviewing the basics and classical algorithms for tensor
decompositions, and then introduces their common challenge on rank determination
(Chap. 1). To overcome this challenge, this book develops models and algorithms
under the Bayesian sparsity-aware learning framework, with the philosophy and
key results elaborated in Chap. 2. In Chaps. 3 and 4, we use the most basic tensor
decomposition format, Canonical Polyadic Decomposition (CPD), as an example
to elucidate the fundamental Bayesian modeling and inference that can achieve
automatic rank determination and hyper-parameter learning. Both parametric and
non-parametric modeling and inference are introduced and analyzed. In Chap. 5,
we demonstrate how Bayesian CPD is connected with stochastic optimization in
order to fit large-scale data. In Chap. 6, we show how the basic model can incorpo-
rate additional nonnegative structures to achieve enhanced performances in various
signal processing and machine learning tasks. Chapter 7 discusses the extension
of Bayesian methods to complex-valued data, handling orthogonal constraints and
outliers. Chapter 8 uses the direction-of-arrival estimation, which has been one of
the focuses of array signal processing for decades, as a case study to introduce the
Bayesian tensor decomposition under missing data. Finally, Chap. 9 extends the
modeling idea presented in previous chapters to other tensor decomposition formats,
including tensor Tucker decomposition, tensor-train decomposition, PARAFAC2
decomposition, and tensor SVD.
The authors sincerely thank the group members, Le Xu, Xueke Tong, and Yangge
Chen, at The University of Hong Kong for working on this topic together over the
years. This project is supported in part by the NSFC under Grant 62001309, and in
part by the General Research Fund from the Hong Kong Research Grant Council
under Grant 17207018.
Abstract In this chapter, we first introduce the preliminaries on tensors, including terminologies and the associated notations, related multi-linear algebra, and, more importantly, widely used tensor decomposition formats. Then, we link tensor decompositions to recent representation learning for multi-dimensional data, showing the paramount role of tensors in modern signal processing and machine learning. Finally, we review recent algorithms for tensor decompositions and analyze their common challenge in rank determination.
Plain letters (e.g., $x$) denote scalars. Boldface lowercase letters (e.g., $\mathbf{x}$) and boldface uppercase letters (e.g., $\mathbf{X}$) denote vectors and matrices, respectively. Tensors are denoted by boldface calligraphic letters (e.g., $\mathcal{X}$).
In multi-linear algebra, the term order measures the number of indices used to access each data element (in scalar form). Specifically, a vector $\mathbf{x} \in \mathbb{R}^{I}$ is of order 1, since its element $x_i$ can be accessed via only one index. A matrix $\mathbf{X} \in \mathbb{R}^{I \times J}$ is of order 2, because two indices are enough to traverse all of its elements $X_{i,j}$. As a generalization, tensors are of order three or higher. An $N$th-order tensor $\mathcal{X} \in \mathbb{R}^{I_1 \times \cdots \times I_N}$ uses $N$ indices to address its elements $\mathcal{X}_{i_1,\dots,i_N}$. For illustration, we depict a scalar, a vector, a matrix, and a tensor in Fig. 1.1.
For an $N$th-order tensor $\mathcal{X}$, addressing each element requires $N$ indices, and each index corresponds to a mode, which generalizes the concepts of rows and columns in matrices. For example, for a third-order tensor $\mathcal{X} \in \mathbb{R}^{I_1 \times I_2 \times I_3}$, given indices $i_2$ and $i_3$, the vectors $\mathcal{X}_{:,i_2,i_3}$ are termed mode-1 fibers.
From (1.2), it is easy to see that the Khatri–Rao product performs the column-wise
Kronecker product between two matrices {A, B}. The Khatri–Rao product is one of
the most critical operators in tensor canonical polyadic decomposition, which will
be elucidated in later sections.
The Hadamard product, which performs the element-wise product between two matrices $\{\mathbf{A}, \mathbf{B}\}$ of the same size, is defined as $(\mathbf{A} \circledast \mathbf{B})_{i,j} = A_{i,j} B_{i,j}$.
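As an illustrative sketch (in NumPy, which is our own choice and not taken from the book's demo code), the Khatri–Rao product can be built column by column from Kronecker products, while the Hadamard product is plain element-wise multiplication:

```python
import numpy as np

def khatri_rao(A, B):
    # Column-wise Kronecker product: A (I x R), B (J x R) -> (I*J x R)
    I, R = A.shape
    J, R2 = B.shape
    assert R == R2, "Khatri-Rao requires matching column counts"
    return np.einsum('ir,jr->ijr', A, B).reshape(I * J, R)

A = np.array([[1., 2.], [3., 4.], [5., 6.]])   # 3 x 2
B = np.array([[1., 0.], [0., 1.]])             # 2 x 2

KR = khatri_rao(A, B)
print(KR.shape)            # (6, 2): each column is kron(A[:, r], B[:, r])

# Hadamard product: element-wise, so both matrices must share a shape
C = np.array([[1., 2.], [3., 4.]])
H = C * C                  # element-wise squares: [[1, 4], [9, 16]]
```

As a quick check, `khatri_rao(A, B)[:, r]` matches `np.kron(A[:, r], B[:, r])` for every column `r`.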
$$(\mathcal{X} \times_n \mathbf{M})_{i_1,\dots,i_{n-1},r,i_{n+1},\dots,i_N} = \sum_{i_n=1}^{I_n} M_{r,i_n}\, \mathcal{X}_{i_1,\dots,i_N}. \qquad (1.4)$$
If the tensor $\mathcal{X} \in \mathbb{R}^{I_1 \times I_2}$ is a matrix (i.e., a 2D tensor), its 1-mode product with $\mathbf{M} \in \mathbb{R}^{R \times I_1}$ reduces to a matrix product, i.e., $\mathcal{X} \times_1 \mathbf{M} = \mathbf{M}\mathbf{X}$. Similarly, $\mathcal{X} \times_2 \mathbf{M} = \mathbf{X}\mathbf{M}^{T}$, where $\mathcal{X} \in \mathbb{R}^{I_1 \times I_2}$, $\mathbf{M} \in \mathbb{R}^{R \times I_2}$. Another generalization from vector/matrix algebra is the generalized inner product.
$$\langle \mathcal{X}, \mathcal{Y} \rangle = \sum_{i_1=1}^{I_1} \sum_{i_2=1}^{I_2} \cdots \sum_{i_N=1}^{I_N} \mathcal{X}_{i_1,\dots,i_N}\, \mathcal{Y}_{i_1,\dots,i_N}. \qquad (1.6)$$
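Both operations above admit a minimal NumPy sketch (our own helper names, not from the book's demo code): the mode-$n$ product contracts one tensor dimension with a matrix, and the generalized inner product sums all element-wise products.

```python
import numpy as np

def mode_n_product(X, M, n):
    # Contract mode n of tensor X (I_1 x ... x I_N) with M (R x I_n),
    # replacing dimension I_n by R, as in (1.4)
    return np.moveaxis(np.tensordot(M, X, axes=(1, n)), 0, n)

rng = np.random.default_rng(0)
X = rng.standard_normal((2, 3, 4))
M = rng.standard_normal((5, 3))

Y = mode_n_product(X, M, 1)
print(Y.shape)                                   # (2, 5, 4)

# For a 2D tensor (a matrix), the mode products reduce to matrix products
A = rng.standard_normal((3, 4))
M1 = rng.standard_normal((2, 3))
assert np.allclose(mode_n_product(A, M1, 0), M1 @ A)

# Generalized inner product (1.6): sum over all element-wise products
Z = rng.standard_normal((2, 3, 4))
ip = np.sum(X * Z)
```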
In data analytic tasks, the $\ell_p$ norm, which was defined for vectors and matrices, frequently appears in the design of cost functions and regularizations. For tensors, we can generalize its definition as follows.
Definition 1.7 ($\ell_p$ tensor norm) For a tensor $\mathcal{X} \in \mathbb{R}^{I_1 \times \cdots \times I_N}$, its $\ell_p$ norm is:
$$\|\mathcal{X}\|_p = \Big( \sum_{i_1,\dots,i_N} |\mathcal{X}_{i_1,\dots,i_N}|^p \Big)^{1/p}. \qquad (1.7)$$
For $p = 0$, the $\ell_0$ norm $\|\mathcal{X}\|_0$ gives the number of non-zero elements (strictly speaking, $\ell_0$ does not satisfy the usual norm properties), and thus acts as a measure of sparsity. As its tightest convex surrogate, the $\ell_1$ norm $\|\mathcal{X}\|_1$ computes the sum of absolute values of tensor $\mathcal{X}$, and can also be treated as a convenient measure of sparsity. The most widely used is the $\ell_2$ norm $\|\mathcal{X}\|_2$, which is also called the Frobenius norm and denoted by $\|\mathcal{X}\|_F$.
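Concretely, the three norms can be evaluated in a few NumPy lines (an illustrative sketch with a made-up tensor):

```python
import numpy as np

X = np.array([[[0., 3.], [-4., 0.]],
              [[0., 0.], [1., 2.]]])     # a 2 x 2 x 2 tensor

l0 = np.count_nonzero(X)            # number of non-zeros (sparsity measure)
l1 = np.sum(np.abs(X))              # sum of absolute values
fro = np.sqrt(np.sum(X ** 2))       # l2 / Frobenius norm

print(l0, l1)                       # 4 10.0
```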
As illustrated in Fig. 1.5, the CPD, also known as PARAFAC [3], decomposes tensor data $\mathcal{X} \in \mathbb{R}^{I_1 \times \cdots \times I_N}$ into a summation of $R$ rank-1 tensors [3]:
$$\mathcal{X} = \sum_{r=1}^{R} \underbrace{\mathbf{u}_r^{(1)} \circ \cdots \circ \mathbf{u}_r^{(N)}}_{\text{rank-1 tensor}}, \qquad (1.8)$$
where $\circ$ denotes the vector outer product. Equation (1.8) states that the tensor $\mathcal{X}$ consists of $R$ rank-1 component tensors. If we collect the vectors $\mathbf{u}_1^{(n)}, \dots, \mathbf{u}_R^{(n)}$ into a factor matrix $\mathbf{U}^{(n)} \in \mathbb{R}^{I_n \times R}$ defined as:
$$\mathbf{U}^{(n)} = \big[ \mathbf{u}_1^{(n)}, \dots, \mathbf{u}_R^{(n)} \big], \qquad (1.9)$$
Equation (1.8) can be expressed in the equivalent form $\mathcal{X} = \sum_{r=1}^{R} \mathbf{U}_{:,r}^{(1)} \circ \cdots \circ \mathbf{U}_{:,r}^{(N)} := [\![ \mathbf{U}^{(1)}, \dots, \mathbf{U}^{(N)} ]\!]$, where $[\![ \cdots ]\!]$ is known as the Kruskal operator. Notice that the minimum number $R$ that makes (1.8) hold is termed the tensor rank, which generalizes the notion of matrix rank to higher-order tensors.
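To make (1.8) concrete, the following sketch (our own NumPy illustration, not the book's demo code) rebuilds a tensor from its factor matrices by summing $R$ outer products:

```python
import numpy as np

def cpd_reconstruct(factors):
    # Sum of R rank-1 tensors, as in (1.8); factors[n] has shape I_n x R
    R = factors[0].shape[1]
    X = np.zeros(tuple(U.shape[0] for U in factors))
    for r in range(R):
        term = factors[0][:, r]
        for U in factors[1:]:
            term = np.multiply.outer(term, U[:, r])   # chain of outer products
        X += term
    return X

rng = np.random.default_rng(0)
U1 = rng.standard_normal((4, 2))
U2 = rng.standard_normal((5, 2))
U3 = rng.standard_normal((6, 2))

X = cpd_reconstruct([U1, U2, U3])
print(X.shape)     # (4, 5, 6)
```

For the third-order case, the same reconstruction can be written in one line as `np.einsum('ir,jr,kr->ijk', U1, U2, U3)`.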
Tensor CPD has found use in various data analytic tasks due to its appealing uniqueness property. Here, we present one of the most widely used sufficient conditions for CPD uniqueness. For other conditions that take additional structures (e.g., nonnegativity, orthogonality) into account, interested readers can refer to [1, 3].
In Property 1.1, the k-rank of matrix A is defined as the maximum value k such
that any k columns are linearly independent [1]. Property 1.1 states that under mild
conditions, tensor CPD is unique up to trivial scaling and permutation ambiguities.
This is one of the major differences between tensor CPD and low-rank matrix decom-
position, which is, in general, not unique unless some constraints are imposed. This
nice property has made CPD an important tool in the blind source separation and
data clustering-related tasks, as will be demonstrated in the following chapters.
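The k-rank in Property 1.1 can be computed by brute force for small matrices; the sketch below (our own illustration, with a hypothetical `k_rank` helper) checks every subset of k columns for linear independence:

```python
import numpy as np
from itertools import combinations

def k_rank(A, tol=1e-10):
    # Largest k such that EVERY set of k columns of A is linearly independent
    n_cols = A.shape[1]
    for k in range(1, n_cols + 1):
        for cols in combinations(range(n_cols), k):
            if np.linalg.matrix_rank(A[:, cols], tol=tol) < k:
                return k - 1        # some k-subset of columns is dependent
    return n_cols

A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [0., 0., 0.]])
print(k_rank(A))        # 2: any 2 columns are independent, but all 3 are not
```

This enumeration is exponential in the number of columns, so it is only a didactic check, not a practical routine.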
The CPD disregards interactions among the columns of factor matrices and requires
the factor matrices to have the same number of columns. To achieve a more flex-
ible tensor representation, tensor TuckerD was introduced to generalize CPD by
allowing different column numbers of factor matrices and introducing a core tensor
$\mathcal{G} \in \mathbb{R}^{R_1 \times \cdots \times R_N}$. Particularly, tensor TuckerD is defined as [1, 4]:
$$\mathcal{X} = \mathcal{G} \times_1 \mathbf{U}^{(1)} \times_2 \mathbf{U}^{(2)} \cdots \times_N \mathbf{U}^{(N)},$$
where each factor matrix $\mathbf{U}^{(n)} \in \mathbb{R}^{I_n \times R_n}, \forall n$. The tuple $(R_1, \dots, R_N)$ is known as
multi-linear rank. An illustration of TuckerD is provided in Fig. 1.6. Note that when
the core tensor G is super-diagonal and R1 = · · · = R N , TuckerD reduces to CPD.
Using the Kruskal operator, TuckerD can be compactly denoted by $\mathcal{X} = [\![ \mathcal{G};\, \mathbf{U}^{(1)}, \dots, \mathbf{U}^{(N)} ]\!]$.
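A TuckerD reconstruction is simply a chain of mode-n products of the core with the factor matrices; a minimal NumPy sketch (our own helper names, not the book's demo code):

```python
import numpy as np

def tucker_reconstruct(G, factors):
    # X = G x_1 U^(1) x_2 ... x_N U^(N); factors[n] has shape I_n x R_n
    X = G
    for n, U in enumerate(factors):
        # mode-n product: contract X's n-th axis (size R_n) with U (I_n x R_n)
        X = np.moveaxis(np.tensordot(U, X, axes=(1, n)), 0, n)
    return X

rng = np.random.default_rng(0)
G = rng.standard_normal((2, 3, 4))          # core, multi-linear rank (2, 3, 4)
Us = [rng.standard_normal((5, 2)),
      rng.standard_normal((6, 3)),
      rng.standard_normal((7, 4))]

X = tucker_reconstruct(G, Us)
print(X.shape)      # (5, 6, 7)
```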
The TTD decomposes tensor data $\mathcal{X} \in \mathbb{R}^{I_1 \times \cdots \times I_N}$ into a set of core tensors $\{\mathcal{G}^{(n)} \in \mathbb{R}^{R_n \times I_n \times R_{n+1}}\}$ with boundary ranks $R_1 = R_{N+1} = 1$, such that [5]
$$\mathcal{X}_{i_1,\dots,i_N} = \mathbf{G}^{(1)}_{:,i_1,:}\, \mathbf{G}^{(2)}_{:,i_2,:} \cdots \mathbf{G}^{(N)}_{:,i_N,:}.$$
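In the same spirit, a TT tensor can be rebuilt by chaining the cores along their shared rank dimensions; the sketch below (our own NumPy illustration, assuming boundary ranks of 1) contracts them left to right:

```python
import numpy as np

def tt_reconstruct(cores):
    # cores[n] has shape (R_n, I_n, R_{n+1}) with R_1 = R_{N+1} = 1; element
    # (i_1,...,i_N) is the product of slices G^(1)[:, i_1, :] ... G^(N)[:, i_N, :]
    X = cores[0]                               # (1, I_1, R_2)
    for G in cores[1:]:
        X = np.tensordot(X, G, axes=(-1, 0))   # contract the shared rank index
    return X[0, ..., 0]                        # drop the two boundary ranks

rng = np.random.default_rng(0)
cores = [rng.standard_normal((1, 4, 3)),
         rng.standard_normal((3, 5, 2)),
         rng.standard_normal((2, 6, 1))]

X = tt_reconstruct(cores)
print(X.shape)      # (4, 5, 6)
```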
Given the tensor decomposition models introduced in the last section, the next task
is to estimate the model parameters and hyper-parameters from the observed ten-
sor data. One straightforward approach is to formulate the learning problem as an
optimization problem (see Fig. 1.8). Specifically, in different application contexts,
cost functions can be designed to encode our knowledge of the tensor data and the
tensor model. Constraints on model parameters can be further added to embed the
side information. The problem formulation generally appears in the form:
“I’m so glad you liked it,” replied February, much pleased. “Now
I’ll trouble you for my thumb-nail and left ear-tip.”
The can was brought, and Max carefully measured out what
was wanted. February kissed Thekla’s hand (the tip of his nose felt
very cold), made a clumsy bow to both, and went away.
The children hugged each other. “If they’re all like that,” cried
they, “how jolly it will be!”
Greedy.
CHAPTER III.
LITTLE TOT.
It was a dark night, and very cold. As they sat by the fire
waiting, they could hear the frost cracking and snapping the tree-
boughs. Now and then a crash like thunder came. It was a limb,
overloaded with ice, breaking off, and falling to the ground. And by
and by, among the other noises, a strange, wild voice began to
mingle, making them all more fearful. It was March, who, as he
came through the forest, was talking to himself.
“Not I,” said Max, who was plucking up courage, “not if I know
it!—Of course you are to tell a story,” he continued aloud: “you
promised, and you ought to be a Month of your word. Thekla, put
away that broom. Now we’re all ready, sir.”
“Don’t you think, that Tot, the biggest one, was putting a stick
of wood on the fire when I looked in? Stick as big as she was,
almost! How she did it was a mystery. Little apron blew into the
flame, but I flew up the chimney and blew it the other way. ’Tisn’t
often I do a good turn, but I couldn’t help it then.”
“That was right,” said Thekla.
“Then she put a shawl over the other tot. Patted the corners
down just like an old woman, and put one on herself. Hind side
before, but no matter for that. Then she got into bed, and sang,
‘Hush by, Budda,—hus’ by, Budda,’ till the baby went to sleep. Then
she went to sleep too. I thought I’d like to see what would happen
when they woke up, so I sent the snow-storm on and stayed behind
with my eye to the chink.
“There was only one stick of wood left, and that was a big one.
Tot couldn’t move it. Pussy got on the table, and lapped up all the
milk in the pan. Then Tot cried hard, and said, ‘Mamma, come! oh
do come!’ over and over. She put all the clothes there were on the
bed. When the baby cried, she patted him with her little hand, and
cried too. When morning came, they were both still. I could see
them through the window. Away off on the prairie I heard the slow
jingle of a bell.
“They lay in the bed; but no little voices answered. The mother
gave a loud scream. ‘Oh, they are dead!’ she shrieked, and flung
herself over them.
“The men ran in. There were four of them. They built a fire and
warmed blankets, and put hot milk into the mouths of the little ones.
“‘This little fellow isn’t dead,’ said one of them. He wasn’t. Pretty
soon he opened his eyes, and when he saw his mother he began to
cry. Tot had wrapped him up so warm that the cold didn’t kill him,—
only made him dull.
“It took longer to bring her round, but at last they did. And the
first thing she said was, ‘Mamie didn’t mean to spill the milk.’
“I declare,” said March with a frog in his throat, “I never did see
the beat of that child.”
“And is that the end?” asked Thekla, who had been quietly
crying for some time past over little Tot’s troubles.
“Of course it’s the end,” replied March. “What did you expect?
And a very nice story it is, though I say it as shouldn’t.
“And now I’m off,” shouted he, and made a rush for the door.
SUCH colds! Never was any thing like them. Day after day Max sat
by the fire with a splitting headache, cold chills running down his
back; while night after night Thekla awoke, coughing and choking
from a spot in her throat which burned like a live coal. I can tell you,
when March gives a present he does it in real earnest.
“One day in an old garret I found the doll,
who, as I said, was living in a closet.”
The night was still. The noisy winds had fallen asleep, so that
you could hear the least sounds far away in the forest. By and by
light footsteps became audible, drawing nearer; and Max had time
to run for a chair and place it in the cosiest corner, before a soft tap
fell upon the door.
“May I come in?” said a voice, very gently and politely. How
different from rude March!
This was April. She looked very young and small; and, as Thekla
went forward to greet her, she felt as if it were some little visitor of
her own age come to tea. It was with a sense of protection and
hospitality that she took from her hand a great bundle, which
seemed heavy. April sat down, and then she put her arm round
Thekla’s waist and pulled her nearer, bundle and all. She had an odd,
pretty face when you came to look at it. The lips laughed of
themselves; but the eyes, which were blue and misty, seemed to
have tears behind them all ready to fall. Or if, as sometimes
happened, the lips took a fancy to pout, then the eyes had their
turn, and brightened and twinkled so that you could not help
smiling. It would have puzzled anybody whether to call the
countenance most sad or most merry. April’s hair was all wavy and
blowsy, as if she had been out in a gale of wind. Two or three violets
were stuck in it; and the voice with which she spoke sounded like
the tinkle of rain-drops on the leaves.
April laughed. She parted the flowers, and there were two little
new-born chicks, as yellow as the yolk of an egg. They were soft
and downy; and their cunning black eyes and little beaks gave them
a knowing look, which was astonishing, when you recollected how
short a time they had been in the world. “Cheep! cheep!” they cried,
and one ran directly into Thekla’s outstretched hands. The warm
fingers felt to it like a nest; and the little creature cuddled down
contentedly, with a soft note which expressed comfort. The other,
April handed to Max.
“They are for you,” she said. “If you like them and take care of
them, you may have a whole poultry-yard some day. My broods are
not always lucky; but these will be.”
“Like them,” indeed! You should have seen the happy fuss which
went on over the new pets. Max ran for a basket; Thekla brought
flannel to line it, and meal and water; and the chicks were kissed,
fed, and tucked away as if they had been babies. By and by they fell
fast asleep under their warm coverlet; and then the children went
back to the fire, and, while Max made ringlets of the dandelion-
stalks and stuck them in Thekla’s hair, April began:—
“My story isn’t much,” she said. “I’ve told so many in the course
of my life that I’m quite exhausted, for I make it a rule never to tell
the same twice. Some are so sad that it makes me cry merely to
think of them,”—and as she said this April’s tears suddenly rained
down her face,—“and others so jolly that I should split my sides if I
tried.” Here April giggled like a school-girl, and her eyes seemed to
send out rays of sun which danced on the wet tear-stains. “So it
must always be new,” she went on; “and, ever since I saw you, I’ve
been trying to decide what it should be. There was a delightful one
about ducklings which I thought of,—but no!” and she solemnly
shook her head.
“Why, what a rude boy you are!” cried April, beginning to sob. “I
declare, I ne—never was t—treated so before.”
“Very well then,” said April, pacified. “If you feel that way, I’ll
proceed. This doll lived in a closet. I should never have come across
her probably if it hadn’t been for the house-cleaning.
“You must know that there are countries in the world where
every spring and fall the houses are all turned upside down and
inside out, and then downside up and outside in, all for the sake of
being clean. The women do it. What becomes of the men I don’t
know: they climb trees or something to be out of the way, I
suppose. I like these times, of all things. I like to swing the heavy
carpets to and fro on the lines, and flap the maids’ aprons into their
faces as they stand on the ledge outside to wash the windows. It is
great fun. And I love to creep into holes and corners, and rummage
and poke about to see what folks have got. And one day, when
doing this in an old garret, I found the doll, who, as I said, was
living in a closet. They had put her there to be out of the way of the
cleaning.
“Her name was Maria. She was big, but not very beautiful. Her
head was dented, and there were marks of finger-nails on her
cheeks, which were faded and of a purplish-pink. But her arms and
legs were bran new, and white as snow, and her body was round
and full of sawdust. I couldn’t understand this at all until she
explained it. Her head, it seemed, was twenty-five years old; and her
body had only been in the world six weeks!
“Once, she said, she had possessed a body just the same age
as her head, and then she belonged to a person she called ‘Baby
May.’ Baby May used to bump her on the floor, and dig the soft wax
out of her cheeks with her nails. This treatment soon ruined her
good looks; and when she mentioned this, Maria almost cried,—but
not quite, because, as she said, years had taught her self-command.
I don’t know what she meant,” added April, reflectively. “I’m sure
years never taught me any thing of the sort. However, that is neither
here nor there! If she hadn’t had a fine constitution, Maria never
could have endured all this cruelty. Her body didn’t. It soon sank
under its sufferings; and, after spitting sawdust for some months,
wasted away so much that May’s mother said it must go into the
ragbag. People make a great fuss about having their heads cut off,
but Maria said it was quite easy if the scissors were sharp. Snip,
snip, rip, rip, and there you are. The head was put carefully away in
a wardrobe because it was so handsome, and May’s mamma
promised to buy a new body for it; but somehow she forgot, and by
and by May grew so big that she didn’t care to play with dolls any
more. So Maria’s head went on living in the wardrobe. Having no
longer any cares of the body to disturb it, it gave itself up to the
cultivation of the intellect. A wardrobe is a capital place for study, it
appears. People keep their best things there, and rarely come to
disturb them. At night, when the house is asleep, they wake up and
talk together, and tell secrets. The silk gowns converse about the
fine parties they have gone to, and the sights they have seen. There
were several silk gowns in the wardrobe. One of them had a large
spot of ice-cream on its front breadth. She used to let the other
things smell it, that they might know what luxury was like; and once
Maria got a chance, and licked it with her tongue, but she said it
didn’t taste as she expected. There was an India shawl, too, which
would lift the lid of its box, and relate stories—oh, so interesting!—
about black faces and white turbans and hot sunshine. The laces in
the drawer came from Belgium. That was a place to learn
geography! And the Roman pearls had a history too. They were
devout Catholics, and would tell their beads all night if nobody
seemed to be listening. But the Coral in the drawer below was Red
Republican in its opinions, and made no attempt to hide it. Both
hailed from Italy, but they were always quarrelling! Oh, Maria knew
a deal! As she grew wise, she ceased to care for tea-parties, and
being taken out to walk as formerly. All she wanted was to gain
information, and strengthen her mind. At least so she said; but for
all that,” remarked April, with a sly smile, “she had some lingering
regard for looks still, for she complained bitterly of the change in her
complexion. Perhaps it was putting so much inside her head made
the outside so dull and shabby!
“Who was it took her down?” asked Max, quite forgetful of his
original scorn about Maria’s history.
“It was Baby May. Not the same May, but another as like her as
two peas. In fact, the first May was grown up; and this was her little
girl. Grandmamma had bought a beautiful new body, and now
Maria’s head had to be sewed on to it. Her feelings when the
stitches were put in, she said, she could never describe. They were
like those of a poor old soldier, who, after living fifty years on his
pension, finds himself dragged from pipe and chimney-corner, and
obliged to begin again as a drummer-boy.”
“Yes,” said April; “but you haven’t heard the worst. Think of
being suddenly united to such a young body! There was Maria,
elderly and dignified, full of wisdom and experience, longing for
nothing so much as to be left alone to think over the facts she had
learned. And there were her arms and legs always wanting to be in
motion. New, impulsive, full of sawdust, it was misery to them to be
still. They wanted to dance and frisk all the time, to wear fine
clothes, to have other dolls come on visits, to drink tea out of the
baby-house tea-set, and have a good time generally. When Maria
assured them that she was tired of these things, and had seen the
vanity of them, they said they wanted to see the vanity too! And if
ever she got a quiet chance, and had fallen into a reverie about old
times and friends,—the silk stockings in the wardrobe, for instance,
and the touching story they had told her; or the shoe-buckles, who
were exiles from their country,—all of a sudden her obstreperous
limbs would assert themselves, out would flourish her legs, up fly
her hands and hit her in the eye, and the first thing she knew she
would be tumbled out on to the floor. Just think what a trial to a lady
of fine education and manners! It was enough to vex a saint. She
assured me she had lost at least three scruples of wax. But nobody
cared in the least about her scruples.”
“And what became of the poor thing in the end?” asked Thekla.
“That I can’t say,” replied April: “I had to come away, you know;
and I left her there. One of two things, she told me, was pretty sure
to happen: either her arms and legs would sober with time, or she
would get so hideous from unhappiness that May’s mamma would
buy a new head to match them. ‘Then, ah then!’ said she, ‘I may
perhaps be allowed to go back to my beloved top-shelf in the
wardrobe. Never, never will I quit it again so long as I live!’ She
ended with a sigh. I bade her farewell, but on the way downstairs I
met a little girl coming up and calling out, ‘Where dolly? me want
dolly!’ And I fear poor Maria was not left any longer in peace in the
attic closet.”
April closed her story. She took her moments from the can,
poured the dandelions into Thekla’s lap, and rose to go.
“I am late,” she said: “all my violets must be made before
midnight. I have none but these few in my hair.”
“Ah, no!” said April: “I must go. You won’t miss me long: May is
coming, my sister May. Everybody loves her better than they do me,”
and she wiped her eyes dolefully as she shut the door.
Oh, how cheerful the kitchen seemed now! Where were the
colds and the disconsolate looks? All gone; and Max and Thekla
laughed gayly into each other’s faces.
“I’ll tell you what,” said Max, “if April didn’t cry so easily, she’d
be one of the jolliest girls in the world.”
“Good-by, dears!”
CHAPTER V.
MAY’S GARDEN.
THE chicks throve. Day by day their legs grew strong, their yellow
bodies round and full, and their calls for food more clamorous. As
the snow melted, and the sun made warm spots on the earth, they
began to run from the cottage-door, and poke and scratch about
with their bills. But they always came back to the basket to sleep;
and Thekla prepared their food, and watched over them as well as
any old hen could have done.