Lei Cheng
Zhongtao Chen
Yik-Chung Wu

Bayesian Tensor
Decomposition
for Signal
Processing and
Machine Learning
Modeling, Tuning-Free Algorithms, and
Applications
Bayesian Tensor Decomposition for Signal
Processing and Machine Learning
Lei Cheng · Zhongtao Chen · Yik-Chung Wu

Bayesian Tensor
Decomposition for Signal
Processing and Machine
Learning
Modeling, Tuning-Free Algorithms,
and Applications
Lei Cheng
College of Information Science and Electronic Engineering
Zhejiang University
Hangzhou, China

Zhongtao Chen
Department of Electrical and Electronic Engineering
The University of Hong Kong
Hong Kong, China

Yik-Chung Wu
Department of Electrical and Electronic Engineering
The University of Hong Kong
Hong Kong, China

ISBN 978-3-031-22437-9
ISBN 978-3-031-22438-6 (eBook)


https://doi.org/10.1007/978-3-031-22438-6

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature
Switzerland AG 2023
This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether
the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse
of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and
transmission or information storage and retrieval, electronic adaptation, computer software, or by similar
or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors, and the editors are safe to assume that the advice and information in this book
are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or
the editors give a warranty, expressed or implied, with respect to the material contained herein or for any
errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface

Our world is full of data, and these data often appear in high-dimensional structures,
with each dimension describing a unique attribute. Examples include data in social
sciences, medicine, pharmacology, and environmental monitoring, just to name a
few. To make sense of multi-dimensional data, advanced computational tools, which
work directly with tensors rather than first converting a tensor to a matrix, are
needed to unveil the hidden patterns of the data. This is where tensor decomposition
models come into play. Due to their remarkable representation capability, tensor
decomposition models have led to state-of-the-art performances in many domains,
including social network mining, image processing, array signal processing, and
wireless communications.
Previous research on tensor decompositions mainly approached the problem from an
optimization perspective, which unfortunately does not come with the capability of tensor
rank learning and requires heavy hyper-parameter tuning. While these two tasks are
important for complexity control and avoiding overfitting, they are often overlooked
or downplayed in current research, and assumed to be achievable by trivial operations
or obtainable from other methods. In reality, estimating the tensor rank and a proper
set of hyper-parameters usually involves an exhaustive search. This requires running
the same algorithm many times, effectively increasing the computational complexity
in actual model deployment.
Another path for model learning is Bayesian methods. They provide a natural
recipe for the integration of tensor rank learning, automatic hyper-parameter deter-
mination, and tensor decomposition. Due to this unique capability, Bayesian models
and inference have triggered recent interest in tensor decompositions for signal
processing and machine learning. These recent works show that Bayesian models
deliver comparable or even better performance than their optimization-based counterparts.
However, Bayesian methods are very different from optimization methods, with
the former learning distributions of the unknown parameters, and the latter learning
a point estimate. The processes of building the models and deriving the inference
algorithms are fundamentally different as well. This creates a barrier between the
two groups of researchers working on similar problems but starting from different
perspectives. This book aims to distill the essentials of Bayesian modeling and infer-
ence in tensor research, and present a unified view of various models. The book
addresses the needs of postgraduate students, researchers, and practicing engineers
whose interests lie in tensor signal processing and machine learning. It can be used
as a textbook for short courses on specific topics, e.g., tensor learning methods,
Bayesian learning, and multi-dimensional data analytics. Demo codes can be down-
loaded from https://github.com/leicheng-tensor/Reproducible-Bayesian-Tensor-Matrix-Machine-Learning-SOTA. It is our hope that by lowering the barrier to under-
standing and entering the Bayesian landscape, more ideas and novel algorithms can
be stimulated and facilitated in the research community.
This book starts by reviewing the basics and classical algorithms for tensor
decompositions, and then introduces their common challenge on rank determination
(Chap. 1). To overcome this challenge, this book develops models and algorithms
under the Bayesian sparsity-aware learning framework, with the philosophy and
key results elaborated in Chap. 2. In Chaps. 3 and 4, we use the most basic tensor
decomposition format, Canonical Polyadic Decomposition (CPD), as an example
to elucidate the fundamental Bayesian modeling and inference that can achieve
automatic rank determination and hyper-parameter learning. Both parametric and
non-parametric modeling and inference are introduced and analyzed. In Chap. 5,
we demonstrate how Bayesian CPD is connected with stochastic optimization in
order to fit large-scale data. In Chap. 6, we show how the basic model can incorpo-
rate additional nonnegative structures to achieve enhanced performances in various
signal processing and machine learning tasks. Chapter 7 discusses the extension
of Bayesian methods to complex-valued data, handling orthogonal constraints and
outliers. Chapter 8 uses the direction-of-arrival estimation, which has been one of
the focuses of array signal processing for decades, as a case study to introduce the
Bayesian tensor decomposition under missing data. Finally, Chap. 9 extends the
modeling idea presented in previous chapters to other tensor decomposition formats,
including tensor Tucker decomposition, tensor-train decomposition, PARAFAC2
decomposition, and tensor SVD.
The authors sincerely thank the group members, Le Xu, Xueke Tong, and Yangge
Chen, at The University of Hong Kong for working on this topic together over the
years. This project is supported in part by the NSFC under Grant 62001309, and in
part by the General Research Fund from the Hong Kong Research Grant Council
under Grant 17207018.

Hangzhou, China Lei Cheng
Hong Kong, China Zhongtao Chen
Hong Kong, China Yik-Chung Wu
August 2022
Contents

1 Tensor Decomposition: Basics, Algorithms, and Recent
Advances . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Terminologies and Notations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1.1 Scalar, Vector, Matrix, and Tensor . . . . . . . . . . . . . . . . . . . . . . 1
1.1.2 Tensor Unfolding/Matricization . . . . . . . . . . . . . . . . . . . . . . . . 2
1.1.3 Tensor Products and Norms . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.2 Representation Learning via Tensors . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.2.1 Canonical Polyadic Decomposition (CPD) . . . . . . . . . . . . . . . 6
1.2.2 Tucker Decomposition (TuckerD) . . . . . . . . . . . . . . . . . . . . . . 7
1.2.3 Tensor Train Decomposition (TTD) . . . . . . . . . . . . . . . . . . . . . 8
1.3 Model Fitting and Challenges Ahead . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.3.1 Example: Tensor CPD . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.3.2 Challenges in Rank Determination . . . . . . . . . . . . . . . . . . . . . . 13
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2 Bayesian Learning for Sparsity-Aware Modeling . . . . . . . . . . . . . . . . . . 15
2.1 Bayes’ Theorem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.2 Bayesian Learning and Sparsity-Aware Learning . . . . . . . . . . . . . . . . 16
2.3 Prior Design for Sparsity-Aware Modeling . . . . . . . . . . . . . . . . . . . . . 17
2.4 Inference Algorithm Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
2.5 Mean-Field Variational Inference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
2.5.1 General Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
2.5.2 Tractability of MF-VI . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
2.5.3 Definition of MPCEF Model . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
2.5.4 Optimal Variational Pdfs for MPCEF Model . . . . . . . . . . . . . 31
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
3 Bayesian Tensor CPD: Modeling and Inference . . . . . . . . . . . . . . . . . . . 35
3.1 A Unified Probabilistic Modeling Using GSM Prior . . . . . . . . . . . . . 35
3.2 PCPD-GG: Probabilistic Modeling . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
3.3 PCPD-GH: Probabilistic Modeling . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
3.4 PCPD-GH, PCPD-GG: Inference Algorithm . . . . . . . . . . . . . . . . . . . 44


3.4.1 Optimal Variational Pdfs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45


3.4.2 Setting the Hyper-parameters . . . . . . . . . . . . . . . . . . . . . . . . . . 47
3.5 Algorithm Summary and Insights . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
3.5.1 Convergence Property . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
3.5.2 Automatic Tensor Rank Learning . . . . . . . . . . . . . . . . . . . . . . . 48
3.5.3 Computational Complexity . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
3.5.4 Reducing to PCPD-GG . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
3.6 Non-parametric Modeling: PCPD-MGP . . . . . . . . . . . . . . . . . . . . . . . 51
3.7 PCPD-MGP: Inference Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
4 Bayesian Tensor CPD: Performance and Real-World
Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
4.1 Numerical Results on Synthetic Data . . . . . . . . . . . . . . . . . . . . . . . . . . 59
4.1.1 Simulation Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
4.1.2 PCPD-GH Versus PCPD-GG . . . . . . . . . . . . . . . . . . . . . . . . . . 60
4.1.3 Comparisons with Non-parametric PCPD-MGP . . . . . . . . . . 65
4.2 Real-World Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
4.2.1 Fluorescence Data Analytics . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
4.2.2 Hyperspectral Images Denoising . . . . . . . . . . . . . . . . . . . . . . . 73
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
5 When Stochastic Optimization Meets VI: Scaling Bayesian
CPD to Massive Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
5.1 CPD Problem Reformulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
5.1.1 Probabilistic Model and Inference
for the Reformulated Problem . . . . . . . . . . . . . . . . . . . . . . . . . 78
5.2 Interpreting VI Update from Natural Gradient Descent
Perspective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
5.2.1 Optimal Variational Pdfs in Exponential Family Form . . . . . 81
5.2.2 VI Updates as Natural Gradient Descent . . . . . . . . . . . . . . . . . 83
5.3 Scalable VI Algorithm for Tensor CPD . . . . . . . . . . . . . . . . . . . . . . . . 86
5.3.1 Summary of Iterative Algorithm . . . . . . . . . . . . . . . . . . . . . . . . 87
5.3.2 Further Discussions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
5.4 Numerical Examples . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
5.4.1 Convergence Performance on Synthetic Data . . . . . . . . . . . . . 90
5.4.2 Tensor Rank Estimation on Synthetic Data . . . . . . . . . . . . . . . 93
5.4.3 Video Background Modeling . . . . . . . . . . . . . . . . . . . . . . . . . . 97
5.4.4 Image Feature Extraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
6 Bayesian Tensor CPD with Nonnegative Factors . . . . . . . . . . . . . . . . . . . 103
6.1 Tensor CPD with Nonnegative Factors . . . . . . . . . . . . . . . . . . . . . . . . . 103
6.1.1 Motivating Example—Social Group Clustering . . . . . . . . . . . 103
6.1.2 General Problem and Challenges . . . . . . . . . . . . . . . . . . . . . . . 105
6.2 Probabilistic Modeling for CPD with Nonnegative Factors . . . . . . . . 106

6.2.1 Properties of Nonnegative Gaussian-Gamma Prior . . . . . . . . 106


6.2.2 Probabilistic Modeling of CPD with Nonnegative
Factors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
6.3 Inference Algorithm for Tensor CPD with Nonnegative
Factors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
6.3.1 Derivation for Variational Pdfs . . . . . . . . . . . . . . . . . . . . . . . . . 112
6.3.2 Summary of the Inference Algorithm . . . . . . . . . . . . . . . . . . . 114
6.3.3 Discussions and Insights . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
6.4 Algorithm Accelerations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
6.5 Numerical Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
6.5.1 Validation on Synthetic Data . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
6.5.2 Fluorescence Data Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
6.5.3 ENRON E-mail Data Mining . . . . . . . . . . . . . . . . . . . . . . . . . . 129
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 133
7 Complex-Valued CPD, Orthogonality Constraint, and Beyond
Gaussian Noises . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
7.1 Problem Formulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
7.2 Probabilistic Modeling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136
7.3 Inference Algorithm Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . 138
7.3.1 Derivation for Q(U(k)), 1 ≤ k ≤ P . . . . . . . . . . . . . . . . . . . . . 140
7.3.2 Derivation for Q(U(k)), P + 1 ≤ k ≤ N . . . . . . . . . . . . . . . . 141
7.3.3 Derivation for Q(E) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142
7.3.4 Derivations for Q(γl), Q(ζi1,...,iN), and Q(β) . . . . . . . . . . . . . 143
7.3.5 Summary of the Iterative Algorithm . . . . . . . . . . . . . . . . . . . . 144
7.3.6 Further Discussions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 144
7.4 Simulation Results and Discussions . . . . . . . . . . . . . . . . . . . . . . . . . . . 146
7.4.1 Validation on Synthetic Data . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
7.4.2 Blind Data Detection for DS-CDMA Systems . . . . . . . . . . . . 150
7.4.3 Linear Image Coding for a Collection of Images . . . . . . . . . . 151
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 154
8 Handling Missing Value: A Case Study in Direction-of-Arrival
Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155
8.1 Linking DOA Subspace Estimation to Tensor Completion . . . . . . . . 155
8.2 Probabilistic Modeling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159
8.3 MPCEF Model Checking and Optimal Variational Pdfs
Derivations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 160
8.3.1 MPCEF Model Checking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 160
8.3.2 Optimal Variational Pdfs Derivations . . . . . . . . . . . . . . . . . . . 163
8.4 Algorithm Summary and Remarks . . . . . . . . . . . . . . . . . . . . . . . . . . . . 164
8.5 Simulation Results and Discussions . . . . . . . . . . . . . . . . . . . . . . . . . . . 165
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167

9 From CPD to Other Tensor Decompositions . . . . . . . . . . . . . . . . . . . . . . 169


9.1 Tucker Decomposition (TuckerD) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
9.2 Tensor Train Decomposition (TTD) . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
9.3 PARAFAC2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
9.4 Tensor-SVD (T-SVD) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 182
Chapter 1
Tensor Decomposition: Basics,
Algorithms, and Recent Advances

Abstract In this chapter, we will first introduce the preliminaries on tensors, includ-
ing terminologies and the associated notations, related multi-linear algebra, and more
importantly, widely used tensor decomposition formats. Then, we link the tensor
decompositions to the recent representation learning for multi-dimensional data,
showing the paramount role of tensors in modern signal processing and machine
learning. Finally, we review the recent algorithms for tensor decompositions, and
further analyze their common challenge in rank determination.

1.1 Terminologies and Notations

1.1.1 Scalar, Vector, Matrix, and Tensor

Plain letters (e.g., x) are used to denote scalars. Boldface lowercase letters (e.g., x)
and uppercase letters (e.g., X) are used for vectors and matrices, respectively. Tensors
are denoted by boldface calligraphic letters X.
In multi-linear algebra, the term order measures the number of indices used to
access each data element (in scalar form). Specifically, a vector x ∈ R^{I} is of order 1,
since its element x_i can be accessed via only one index. A matrix X ∈ R^{I×J} is of order 2,
because two indices are enough to traverse all of its elements X_{i,j}. As a generalization,
tensors are of order three or higher. An N-th order tensor X ∈ R^{I_1×···×I_N} utilizes N
indices to address its elements X_{i_1,...,i_N}. For illustration, we depict scalar, vector,
matrix, and tensor in Fig. 1.1.
For an N-th order tensor X, addressing each element requires N indices, and
each index corresponds to a mode, which generalizes the concepts of rows
and columns in matrices. For example, for a third-order tensor X ∈ R^{I_1×I_2×I_3}, given
indices i_2 and i_3, the vector X_{:,i_2,i_3} is termed a mode-1 fiber.


Fig. 1.1 Illustration of scalar, vector, matrix, and tensor

1.1.2 Tensor Unfolding/Matricization

Tensor unfolding/matricization aims to re-organize the fibers in one mode into a
matrix. For an N-th order tensor X ∈ R^{I_1×···×I_N}, since it has N modes, there are N
types of unfolding, each termed mode-n unfolding. We formally define it as follows
and illustrate it in Fig. 1.2.

Definition 1.1 (Mode-n Unfolding) Given a tensor X ∈ R^{I_1×···×I_N}, its mode-n
unfolding gives a matrix X_{(n)} ∈ R^{I_n × ∏_{k≠n} I_k}. Each tensor element
X_{i_1,...,i_N} is mapped to the matrix element X^{(n)}_{i_n,j} with
$$
j = i_1 + (i_2 - 1)I_1 + \cdots + (i_{n-1} - 1)I_1 \cdots I_{n-2} + (i_{n+1} - 1)I_1 \cdots I_{n-1} + \cdots + (i_N - 1)I_1 \cdots I_{n-1} I_{n+1} \cdots I_{N-1}.
$$

Tensor unfolding/matricization is one of the most important operators in tensor-
based data analytics, since it gives a "matrix" view to describe an N-th order tensor
data, such that fruitful results in linear algebra can be utilized. As will be seen in
the later sections, most tensor algorithms involve basic operations on the matrices
provided by "unfolding/matricization", and the special tensor/matrix products intro-
duced in the following subsection.
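
As a concrete companion to Definition 1.1, the following minimal NumPy sketch (our illustration; the book's own demo codes are in Matlab) implements mode-n unfolding by moving the requested mode to the front and reshaping in column-major order, which matches the common convention where earlier remaining modes vary fastest along the columns:

```python
import numpy as np

def unfold(X, n):
    """Mode-n unfolding (0-indexed n): returns an I_n x (prod of other dims) matrix.

    Mode n becomes the rows; the column-major reshape makes the earlier
    remaining modes vary fastest along the columns.
    """
    return np.reshape(np.moveaxis(X, n, 0), (X.shape[n], -1), order="F")

# A 3 x 4 x 2 tensor has mode-1 unfolding of size 4 x 6.
X = np.arange(24).reshape(3, 4, 2)
print(unfold(X, 1).shape)  # (4, 6)
```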

1.1.3 Tensor Products and Norms

Tensor decompositions and products are essentially built on matrix products. We
introduce the most widely used ones in this subsection. For a full list of matrix/tensor
products, readers can refer to [1]. All the tensor operations in this subsection have
been implemented in the Matlab tensor toolbox [2].

Fig. 1.2 Illustration of tensor unfolding/matricization for different modes

Definition 1.2 (Kronecker Product) Given two matrices A ∈ R^{I_1×I_2} and
B ∈ R^{J_1×J_2}, their Kronecker product is defined as
$$
\mathbf{A} \otimes \mathbf{B} = \underbrace{\begin{bmatrix} a_{11}\mathbf{B} & \cdots & a_{1I_2}\mathbf{B} \\ \vdots & \ddots & \vdots \\ a_{I_1 1}\mathbf{B} & \cdots & a_{I_1 I_2}\mathbf{B} \end{bmatrix}}_{\mathbf{C}} \in \mathbb{R}^{I_1 J_1 \times I_2 J_2}. \tag{1.1}
$$

As seen in (1.1), the Kronecker product between A and B results in a matrix
C with enlarged dimensions. From another angle, the Kronecker product provides
an effective way to represent a large matrix C (if it satisfies (1.1)) by two smaller
matrices {A, B}. This product will be useful for tensor Tucker decompositions, as
will be elaborated in later sections.
Next, using the Kronecker product, we define another important matrix product.

Definition 1.3 (Khatri–Rao Product) Given two matrices A ∈ R^{I×R} and
B ∈ R^{J×R}, their Khatri–Rao product is defined as:
$$
\mathbf{A} \odot \mathbf{B} = \left[ \mathbf{A}_{:,1} \otimes \mathbf{B}_{:,1},\; \mathbf{A}_{:,2} \otimes \mathbf{B}_{:,2},\; \ldots,\; \mathbf{A}_{:,R} \otimes \mathbf{B}_{:,R} \right] \in \mathbb{R}^{IJ \times R}. \tag{1.2}
$$

Fig. 1.3 Illustration of 1-mode product

From (1.2), it is easy to see that the Khatri–Rao product performs the column-wise
Kronecker product between two matrices {A, B}. The Khatri–Rao product is one of
the most critical operators in tensor canonical polyadic decomposition, which will
be elucidated in later sections.
The Hadamard product, which performs the element-wise product between two
matrices {A, B}, is defined as follows.

Definition 1.4 (Hadamard Product) Given two matrices A ∈ R^{I_1×I_2} and
B ∈ R^{I_1×I_2}, their Hadamard product is:
$$
\mathbf{A} \circledast \mathbf{B} = \begin{bmatrix} a_{11}b_{11} & \cdots & a_{1I_2}b_{1I_2} \\ \vdots & \ddots & \vdots \\ a_{I_1 1}b_{I_1 1} & \cdots & a_{I_1 I_2}b_{I_1 I_2} \end{bmatrix} \in \mathbb{R}^{I_1 \times I_2}. \tag{1.3}
$$
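
The three matrix products above map directly onto NumPy. The short sketch below (our illustration, not code from the book's repository) builds the Khatri–Rao product from column-wise Kronecker products, exactly as in (1.2):

```python
import numpy as np

def khatri_rao(A, B):
    """Khatri-Rao product: column-wise Kronecker product of A (I x R) and B (J x R)."""
    assert A.shape[1] == B.shape[1], "A and B must have the same number of columns"
    return np.column_stack([np.kron(A[:, r], B[:, r]) for r in range(A.shape[1])])

A, B = np.random.randn(3, 2), np.random.randn(4, 2)
print(np.kron(A, B).shape)     # Kronecker product (1.1): shape (12, 4)
print(khatri_rao(A, B).shape)  # Khatri-Rao product (1.2): shape (12, 2)
print((A * A).shape)           # Hadamard product (1.3) is elementwise *: shape (3, 2)
```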

Then we define several tensor products.

Definition 1.5 (n-mode Product) The n-mode product between a tensor
X ∈ R^{I_1×···×I_N} and a matrix M ∈ R^{R×I_n} results in a tensor (X ×_n M)
∈ R^{I_1×···×I_{n−1}×R×I_{n+1}×···×I_N}, with each element being:
$$
(\mathcal{X} \times_n \mathbf{M})_{i_1,\ldots,i_{n-1},r,i_{n+1},\ldots,i_N} = \sum_{i_n=1}^{I_n} \mathbf{M}_{r,i_n}\, \mathcal{X}_{i_1,\ldots,i_N}. \tag{1.4}
$$

An illustration of the 1-mode product between a 3D tensor X and a matrix M is
given in Fig. 1.3. Furthermore, the n-mode product can also be expressed as a matrix
product in terms of the mode-n unfolding,
$$
(\mathcal{X} \times_n \mathbf{M})_{(n)} = \mathbf{M}\,\mathbf{X}_{(n)}. \tag{1.5}
$$
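
Identity (1.5) gives a direct recipe for computing the n-mode product: unfold, left-multiply, and fold the result back. Below is a sketch of ours, reusing the `unfold` function from Sect. 1.1.2 together with a matching `fold` inverse:

```python
import numpy as np

def fold(M, n, shape):
    """Inverse of unfold: rebuild a tensor from its mode-n unfolding M (rows = new mode-n size)."""
    rest = [s for i, s in enumerate(shape) if i != n]
    return np.moveaxis(np.reshape(M, [M.shape[0]] + rest, order="F"), 0, n)

def mode_n_product(X, M, n):
    """n-mode product X x_n M via identity (1.5): (X x_n M)_(n) = M @ X_(n)."""
    return fold(M @ unfold(X, n), n, X.shape)

X, M = np.random.randn(3, 4, 2), np.random.randn(5, 4)  # M acts on mode 1 (size 4)
print(mode_n_product(X, M, 1).shape)  # (3, 5, 2)
```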



If the tensor X ∈ R^{I_1×I_2} is a matrix (or a 2D tensor), its 1-mode product with M
reduces to a matrix product, i.e., X ×_1 M = MX, where M ∈ R^{R×I_1}. Similarly,
X ×_2 M = XM^T, where X ∈ R^{I_1×I_2} and M ∈ R^{R×I_2}. Another generalization from
vector/matrix algebra is the generalized inner product.

Definition 1.6 (Generalized Inner Product) For a tensor X ∈ R^{I_1×···×I_N} and
a tensor Y ∈ R^{I_1×···×I_N}, their generalized inner product is defined as:
$$
\langle \mathcal{X}, \mathcal{Y} \rangle = \sum_{i_1=1}^{I_1} \sum_{i_2=1}^{I_2} \cdots \sum_{i_N=1}^{I_N} \mathcal{X}_{i_1,\ldots,i_N}\, \mathcal{Y}_{i_1,\ldots,i_N}. \tag{1.6}
$$

In data analytic tasks, the l p norm, which was defined for vectors and matrices,
frequently appears in the designs of cost functions and regularizations. For tensors,
we can generalize its definition as follows.

Definition 1.7 (l_p tensor norm) For a tensor X ∈ R^{I_1×···×I_N}, its l_p norm is:
$$
\|\mathcal{X}\|_p = \Bigg( \sum_{i_1,\ldots,i_N} |\mathcal{X}_{i_1,\ldots,i_N}|^p \Bigg)^{1/p}. \tag{1.7}
$$

For p = 0, the l_0 norm ||X||_0 gives the number of non-zero elements (strictly
speaking, l_0 does not satisfy the usual norm properties), and thus acts as a measure
of sparsity. As its tightest convex surrogate, the l_1 norm ||X||_1 computes the sum
of absolute values of tensor X, and can also be treated as a convenient measure of
sparsity. The most widely used one is the l_2 norm ||X||_2, which is also called the
Frobenius norm and denoted by ||X||_F.
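
Both the generalized inner product and these norms reduce to simple elementwise NumPy reductions; a quick sketch of ours:

```python
import numpy as np

X, Y = np.random.randn(3, 4, 2), np.random.randn(3, 4, 2)

inner = np.sum(X * Y)          # generalized inner product <X, Y>, cf. (1.6)
l0 = np.count_nonzero(X)       # l0 "norm": number of non-zero entries
l1 = np.sum(np.abs(X))         # l1 norm: sum of absolute values
fro = np.sqrt(np.sum(X ** 2))  # l2 / Frobenius norm ||X||_F
print(inner, l0, l1, fro)
```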

1.2 Representation Learning via Tensors

Multi-dimensional data from various applications can be naturally represented as
tensors. To understand these data, representation learning aims at extracting low-
dimensional yet informative parameters (in terms of smaller tensors, matrices, and
vectors) from the tensor data. It is hoped that the extracted parameters can preserve the
structures endowed by the physical phenomenon and reveal hidden interpretations.
To achieve this goal, tensor decompositions with various structural constraints are
developed, as illustrated in Fig. 1.4. In the following subsections, we introduce three
widely used tensor decomposition formats with increasing complexity in modeling,
namely canonical polyadic decomposition (CPD), Tucker decomposition (TuckerD),
and tensor train decomposition (TTD).

Fig. 1.4 Illustration of representation learning via tensors

1.2.1 Canonical Polyadic Decomposition (CPD)

As illustrated in Fig. 1.5, the CPD, also known as PARAFAC [3], decomposes tensor
data X ∈ R^{I_1×···×I_N} into a summation of R rank-1 tensors [3]:
$$
\mathcal{X} = \sum_{r=1}^{R} \underbrace{\mathbf{u}_r^{(1)} \circ \cdots \circ \mathbf{u}_r^{(N)}}_{\text{rank-1 tensor}}, \tag{1.8}
$$
where ◦ denotes the vector outer product. Equation (1.8) states that the tensor X
consists of R rank-1 component tensors. If we collect the vectors u_1^{(n)}, ..., u_R^{(n)} into a
factor matrix U^{(n)} ∈ R^{I_n×R} defined as:
$$
\mathbf{U}^{(n)} = \left[ \mathbf{u}_1^{(n)}, \ldots, \mathbf{u}_R^{(n)} \right], \tag{1.9}
$$
Equation (1.8) can be expressed in the equivalent form
$$
\mathcal{X} = \sum_{r=1}^{R} \mathbf{U}^{(1)}_{:,r} \circ \cdots \circ \mathbf{U}^{(N)}_{:,r} := [\![\mathbf{U}^{(1)}, \ldots, \mathbf{U}^{(N)}]\!],
$$
where [[ ··· ]] is known as the Kruskal operator. Notice that the minimum number R
that makes (1.8) hold is termed the tensor rank, which
generalizes the notion of matrix rank to high-order tensors.
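
The Kruskal operator is straightforward to realize: each rank-1 term in (1.8) is an outer product of one column from every factor matrix. A minimal NumPy sketch of ours:

```python
import numpy as np

def cpd_reconstruct(factors):
    """Kruskal operator [[U(1), ..., U(N)]]: sum of R outer products of factor columns."""
    R = factors[0].shape[1]
    X = np.zeros(tuple(U.shape[0] for U in factors))
    for r in range(R):
        term = factors[0][:, r]
        for U in factors[1:]:
            term = np.multiply.outer(term, U[:, r])  # grow the outer product mode by mode
        X += term
    return X

U1, U2, U3 = (np.random.randn(I, 3) for I in (4, 5, 6))  # tensor rank R = 3
print(cpd_reconstruct([U1, U2, U3]).shape)  # (4, 5, 6)
```

Equivalently, the mode-n unfolding of a CPD tensor factors as X_(n) = U^{(n)} (U^{(N)} ⊙ ··· ⊙ U^{(n+1)} ⊙ U^{(n−1)} ⊙ ··· ⊙ U^{(1)})^T, which is where the Khatri–Rao product of Sect. 1.1.3 enters most fitting algorithms.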
Tensor CPD has found use in various data analytic tasks due to its appealing
uniqueness property. Here, we present one of the most widely used sufficient condi-
tions for CPD uniqueness. For other conditions that take additional structures (e.g.,
nonnegativity, orthogonality) into account, interested readers can refer to [1, 3].

Fig. 1.5 Illustration of a CPD for a third-order tensor

Property 1.1 (Uniqueness condition for CPD [1]) Suppose
$$
[\![\mathbf{U}^{(1)}, \mathbf{U}^{(2)}, \ldots, \mathbf{U}^{(N)}]\!] = [\![\bar{\mathbf{U}}^{(1)}, \bar{\mathbf{U}}^{(2)}, \ldots, \bar{\mathbf{U}}^{(N)}]\!] \quad \text{and} \quad \sum_{n=1}^{N} k_n \geq 2R + (N-1),
$$
where k_n denotes the k-rank of matrix U^{(n)} and R is the tensor rank. Then
the following equations hold: Ū^{(1)} = U^{(1)}ΠΔ^{(1)}, Ū^{(2)} = U^{(2)}ΠΔ^{(2)}, ...,
Ū^{(N)} = U^{(N)}ΠΔ^{(N)}, where Π is a permutation matrix and the diagonal
matrices Δ^{(n)} satisfy ∏_{n=1}^{N} Δ^{(n)} = I_R.

In Property 1.1, the k-rank of matrix A is defined as the maximum value k such
that any k columns are linearly independent [1]. Property 1.1 states that under mild
conditions, tensor CPD is unique up to trivial scaling and permutation ambiguities.
This is one of the major differences between tensor CPD and low-rank matrix decom-
position, which is, in general, not unique unless some constraints are imposed. This
nice property has made CPD an important tool in the blind source separation and
data clustering-related tasks, as will be demonstrated in the following chapters.

1.2.2 Tucker Decomposition (TuckerD)

The CPD disregards interactions among the columns of factor matrices and requires
the factor matrices to have the same number of columns. To achieve a more flex-
ible tensor representation, tensor TuckerD was introduced to generalize CPD by
allowing different column numbers in the factor matrices and introducing a core tensor
G ∈ R^{R_1×···×R_N}. Particularly, tensor TuckerD is defined as [1, 4]:
$$
\mathcal{X} = \mathcal{G} \times_1 \mathbf{U}^{(1)} \times_2 \mathbf{U}^{(2)} \times_3 \cdots \times_N \mathbf{U}^{(N)}, \tag{1.10}
$$



Fig. 1.6 Illustration of a TuckerD for a third-order tensor

where each factor matrix U^{(n)} ∈ R^{I_n×R_n}, ∀n. The tuple (R_1, ..., R_N) is known as
the multi-linear rank. An illustration of TuckerD is provided in Fig. 1.6. Note that when
the core tensor G is super-diagonal and R_1 = ··· = R_N, TuckerD reduces to CPD.
Using the Kruskal operator, TuckerD can be compactly denoted by:
$$
\mathcal{X} = [\![\mathcal{G};\, \mathbf{U}^{(1)}, \ldots, \mathbf{U}^{(N)}]\!]. \tag{1.11}
$$

Although TuckerD provides flexibilities for data representation, it is not unique
in general [1, 4]. Therefore, it is frequently used in the data compression, basis
function learning, and feature extraction-related tasks, where uniqueness is not the
most important consideration.
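
Since (1.10) is just a chain of n-mode products, TuckerD reconstruction can reuse the `mode_n_product` sketch from Sect. 1.1.3 (again an illustration of ours, not the book's code):

```python
import numpy as np

def tucker_reconstruct(G, factors):
    """TuckerD reconstruction X = G x_1 U(1) x_2 U(2) ... x_N U(N), cf. (1.10)."""
    X = G
    for n, U in enumerate(factors):
        X = mode_n_product(X, U, n)  # apply each factor matrix along its own mode
    return X

G = np.random.randn(2, 3, 4)  # core tensor with multi-linear rank (2, 3, 4)
factors = [np.random.randn(I, R) for I, R in zip((5, 6, 7), G.shape)]
print(tucker_reconstruct(G, factors).shape)  # (5, 6, 7)
```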

1.2.3 Tensor Train Decomposition (TTD)

The TTD decomposes tensor data X ∈ R^{I_1×···×I_N} into a set of core tensors {G^{(n)} ∈
R^{R_n×I_n×R_{n+1}}} such that [5]
$$
\mathcal{X}_{i_1,\ldots,i_N} = \mathcal{G}^{(1)}_{:,i_1,:} \times \cdots \times \mathcal{G}^{(N)}_{:,i_N,:}. \tag{1.12}
$$
In (1.12), each core tensor slice G^{(n)}_{:,i_n,:} ∈ R^{R_n×R_{n+1}}. Since each X_{i_1,...,i_N} is a scalar, R_1
and R_{N+1} are both required to be 1. The tuple (R_1, ..., R_{N+1}) is termed the TT-rank.
In quantum physics, TTD is known as a matrix-product state [5]. The TTD for a
third-order tensor is illustrated in Fig. 1.7.
Due to its flexibility, TTD with appropriately chosen TT-rank has shown supe-
rior performance in a variety of data analytic tasks, including image completion,
classification, and neural network compression.

Fig. 1.7 Illustration of a TTD for a third-order tensor
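
Equation (1.12) evaluates one tensor entry as a product of core slices; the boundary condition R_1 = R_{N+1} = 1 makes the final product a 1 × 1 matrix. A short NumPy sketch of ours:

```python
import numpy as np

def tt_entry(cores, index):
    """Evaluate X[i_1, ..., i_N] from TT cores G(n) of shape (R_n, I_n, R_{n+1}), cf. (1.12)."""
    prod = np.eye(cores[0].shape[0])  # R_1 = 1, so this starts as the 1 x 1 identity
    for G, i in zip(cores, index):
        prod = prod @ G[:, i, :]      # multiply in the slice G(n)[:, i_n, :]
    return prod.item()                # 1 x 1 result since R_{N+1} = 1

# Cores of a 4 x 5 x 6 tensor with TT-rank (1, 2, 3, 1).
ranks, dims = (1, 2, 3, 1), (4, 5, 6)
cores = [np.random.randn(ranks[n], dims[n], ranks[n + 1]) for n in range(3)]
print(tt_entry(cores, (0, 1, 2)))
```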

1.3 Model Fitting and Challenges Ahead

Given the tensor decomposition models introduced in the last section, the next task
is to estimate the model parameters and hyper-parameters from the observed ten-
sor data. One straightforward approach is to formulate the learning problem as an
optimization problem (see Fig. 1.8). Specifically, in different application contexts,
cost functions can be designed to encode our knowledge of the tensor data and the
tensor model. Constraints on model parameters can be further added to embed the
side information. The problem formulation generally appears in the form:

Fig. 1.8 Tensor-based representation learning from an optimization perspective


Random documents with unrelated
content Scribd suggests to you:
“Out of her sight, however, his temper revived. He got into a
great huff. ‘Leave the den?’ Of course he would, and very glad to see
the last of it. So he went and chose a hole for himself to live in. It
was quite close to the village,—a great deal too close for safety. But
the silly creature had lost all his instinct by living with human beings.
And whenever the bells rang or any thing seemed to be going on, he
would rush out to peep, and find what it was. I only wonder they
didn’t catch him long ago.”

“Did they catch him, then?” asked Max.

“You shall hear. Only yesterday it was that a caravan with a


band of music came into the village. Greedy heard the sounds, and it
seemed as if he would go wild. He dodged among the bushes, and
looked on as long as he could stand it, and then, seized with a
desire to distinguish himself, out he came. The circus people couldn’t
believe their eyes when they saw him prancing after them, his head
on one side, and taking steps like a dancing-master. Of course such
a prize was not to be resisted. They lost no time; and, when I
caught sight of them, poor Greedy had already a muzzle on his jaws
and a rope round his neck. A boy was banging his sides with a stick,
his tail was between his legs, and I must say,” ended February,
laughing heartily, “he didn’t look particularly happy at being taken
back into fashionable life after this manner.”

“That’s first-rate,” cried Max, in fits of amusement.

“I’m so glad you liked it,” replied February, much pleased. “Now
I’ll trouble you for my thumb-nail and left ear-tip.”

The can was brought, and Max carefully measured out what
was wanted. February kissed Thekla’s hand (the tip of his nose felt
very cold), made a clumsy bow to both, and went away.

The children hugged each other. “If they’re all like that,” cried
they, “how jolly it will be!”
Greedy.
CHAPTER III.
LITTLE TOT.

FEBRUARY went by like a flash, or the children thought so. It was


really a short month: but, besides, they were very busy; and work,
you know, makes time fly. Thekla, who had just learned to spin, had
a job on hand of which she was proud. It was no less than spinning
and carding the wool for a bran-new suit of clothes which Max was
to wear next year. Dyed brown, and woven by Mother Gretel the
cunning weaver, they were to be something grand. As for Max, his
work was wood-carving. Nearly all the German boys can carve; and
he and Thekla thought the spoon over which he was so busy, and
which had grape leaves and tendrils on the handle, most beautiful. It
would go to the great Spring Fair, and fetch a large price, perhaps as
much as a silver dollar. Altogether, they could hardly believe the
calendar when it showed them a month had gone by, and that
evening they must look for another visitor.
“Then the Tot said, ‘Budda hundry.’”

It was a dark night, and very cold. As they sat by the fire
waiting, they could hear the frost cracking and snapping the tree-
boughs. Now and then a crash like thunder came. It was a limb,
overloaded with ice, breaking off, and falling to the ground. And by
and by, among the other noises, a strange, wild voice began to
mingle, making them all more fearful. It was March, who, as he
came through the forest, was talking to himself.

“Blow, blow!” he was saying. “I’m coming on to blow. Rock,


rock! There’d better be no babies in my tree-tops. To and fro, to and
fro, roots and trunks alike, and the very stones must laugh and roll if
I choose to tickle them.” And then he gave a loud thump at the door,
and, without waiting answer, banged it open and marched in. He
looked so big and fierce and stormy that Thekla shrank back,
without daring to push forward a stool for him to sit upon; and even
Max, who had pluck enough for ten boys, felt afraid.
“Won’t you sit down, sir?” he said at last very meekly, and went
to shut the door, which March had left open. Quite a little heap of
dead leaves and snow had collected on the sill; and Thekla, who was
a born housewife, ran to brush them up. March twirled round on his
stool, and watched her proceedings with great scorn.

“Sweep!” he said in a voice like a big wind. “You call that


sweeping? You should see me when I get at it. I scoop up all the
leaves in the world at once, and send them spinning. Whole snow-
storms go into my dust-pan. Ho! ho!”

“But I am so little,” replied Thekla, in her bird’s voice; “and,


beside, I have brushed up all there are.”

“All there are? Nonsense,” cried March; “but no matter. Am I, or


am I not to tell a story? If not, let me know at once; for I have an
engagement with a couple of hurricanes, and want to be off. A
pretty business,” he went on, glaring fiercely, “to sit here by this
melting fire to amuse a couple of thieving brats, when I have so
much to do. Ho! ho!”

“Oh!” whispered Thekla to Max, “let’s give him his moments,


and let him go: he makes me afraid.”

“Not I,” said Max, who was plucking up courage, “not if I know
it!—Of course you are to tell a story,” he continued aloud: “you
promised, and you ought to be a Month of your word. Thekla, put
away that broom. Now we’re all ready, sir.”

March scowled, but made no resistance. As Max had said, he


was a Month of his word; and he began in a queer voice, which was
now loud and then soft, now dying away to a murmur and then
bellowing out again in a way that made you jump.

“Once upon a time, as I was driving across a prairie, I saw a


house.”
“I don’t know what a prairie is,” said Thekla, gently.

“I don’t suppose you do,” growled March: “that’s one of the


things you don’t know, and there are a good many more of ’em. A
prairie’s a big field without any fences, and several thousand miles
square. People live there,—some people do: I spend a good deal of
time there myself. First-rate place for a promenade,—no corners to
turn, plenty of room. As I said, I saw a house.

“There was a snow-storm along with me. We had nine hundred


billion horses, all white as wool; and we went fast. Killing pace.
Horses kept dropping down dead, lay in heaps wherever we went;
and we left ’em there. About four million dashed up against the
house I was telling you about. They ’most covered it up, for it wasn’t
a big house. There were two little windows and a door. Windows had
curtains; but one was slipped aside, and the fire looked out like a
red eye. I didn’t like that; so I put my eye to the other side, to see if
I couldn’t look him down.

“Funniest thing I ever saw!” said March, giving a hoarse


chuckle. “Such tots! Biggest only four years old; t’other not a year.
There was a pussy too. They three—true, on my word—were the
only creatures in the house that night.”

“Where could their father and mother be?” asked Max,


excessively interested.

“Oh! went off that morning to the town,—like fools,—and


couldn’t get back. We saw to that. Stuck in ten drifts, most frozen to
death. Wife half-crazy about the babies; husband just managed to
get to shelter. Ho! ho!” cried March. “Served ’em right, I say. Ho! ho!

“Don’t you think, that Tot, the biggest one, was putting a stick
of wood on the fire when I looked in? Stick as big as she was,
almost! How she did it was a mystery. Little apron blew into the
flame, but I flew up the chimney and blew it the other way. ’Tisn’t
often I do a good turn, but I couldn’t help it then.”
“That was right,” said Thekla.

“Hold your tongue!” cried March, rudely. “What do you know


about it? Two sticks that little thing got on. I never did! How she
managed it, and such a baby!

“Then she put a shawl over the other tot. Patted the corners
down just like an old woman, and put one on herself. Hind side
before, but no matter for that. Then she got into bed, and sang,
‘Hush by, Budda,—hus’ by, Budda,’ till the baby went to sleep. Then
she went to sleep too. I thought I’d like to see what would happen
when they woke up, so I sent the snow-storm on and stayed behind
with my eye to the chink.

“I’m not a tender-hearted person myself,” said March, modestly,


“but really I couldn’t bear to disturb those children. Several times I
wanted to roar dreadfully,—roaring is one of my greatest pleasures,
—but I didn’t. I never quite knew why, but so it was. The snow isn’t
noisy, so it was as still all night about the little house as if it had
been mid-summer.

“I watched, and the children slept. By and by when morning


came, the baby woke up and began to cry. The Tot patted him and
said, ‘Hush-a-by, Budda,’ a great many times; but he wouldn’t stop.
Babies don’t stop,” added March, reflectively, “as a general thing.
Then the Tot said, ‘Budda hundry;’ and she got up, and tugged and
tugged to put a stick on the fire, and fetched a tin cup and spoon,
and set them on a chair by the table where there was a milk-pan.
She had to tip it with her little hands, and a great deal spilled on the
floor and a great deal on her apron, but some went in the cup. She
began to cry at first; then she said, ‘Mamie didn’t mean to,’ and
brightened up again. And she warmed the milk and fed that baby
like a woman,” cried March, giving his knee a great slap. “I never
did! Baby ate it all, and went to sleep again. Tot drank some too, but
not much. Wanted to save it for the baby, I guess.
“It was a very cold day. I kept in a long time; but at last I had
to howl or I should have burst. Tot got frightened. She said her little
prayers, and hid her head under the pillow; but when the other
cried, she stopped, and gave him some milk, and sang, ‘Hush by,
Budda,’ till he went off again. I tell you what,” said March, “I did feel
sorry for that child.

“There was only one stick of wood left, and that was a big one.
Tot couldn’t move it. Pussy got on the table, and lapped up all the
milk in the pan. Then Tot cried hard, and said, ‘Mamma, come! oh
do come!’ over and over. She put all the clothes there were on the
bed. When the baby cried, she patted him with her little hand, and
cried too. When morning came, they were both still. I could see
them through the window. Away off on the prairie I heard the slow
jingle of a bell.

“‘Hurry! hurry!’ I roared, ‘or you’ll be too late.’ Then I scooped


up the snow, and blew open a path. The sleigh got nearer. The
woman couldn’t wait. She held out her arms to the cottage. At last
she jumped into the snow (it was up to her waist), and floundered
to the door. She beat upon it, threw it open, and cried out, ‘Mary!
baby! O my baby!’

“They lay in the bed; but no little voices answered. The mother
gave a loud scream. ‘Oh, they are dead!’ she shrieked, and flung
herself over them.

“The men ran in. There were four of them. They built a fire and
warmed blankets, and put hot milk into the mouths of the little ones.

“‘This little fellow isn’t dead,’ said one of them. He wasn’t. Pretty
soon he opened his eyes, and when he saw his mother he began to
cry. Tot had wrapped him up so warm that the cold didn’t kill him,—
only made him dull.

“It took longer to bring her round, but at last they did. And the
first thing she said was, ‘Mamie didn’t mean to spill the milk.’
“I declare,” said March with a frog in his throat, “I never did see
the beat of that child.”

“And is that the end?” asked Thekla, who had been quietly
crying for some time past over little Tot’s troubles.

“Of course it’s the end,” replied March. “What did you expect?
And a very nice story it is, though I say it as shouldn’t.

“And now I’m off,” shouted he, and made a rush for the door.

“One minute!” cried Max: “you’ve forgotten something. Here’s


your moments, you know. And then there is the present you were to
give us: don’t leave that out.”

“I’m glad you reminded me,” said March,—“very glad indeed.”


His wild eyes sparkled with a fierce light which was ugly to see. With
one hand he seized his “moments,” the other was fumbling in his
pocket.

“Here it is!” he cried, and flung something in their faces.


Another instant he had banged the door and was gone. They could
hear him roaring and whooping as he went.

The poor children—all red in the face, sneezing, coughing—


looked at each other.

“Ow! ow!” cried Max.

“Thzs! thzs!” responded Thekla.

March’s present was a bad cold in the head!


Little Tot and the Baby asleep.
CHAPTER IV.
“MARIA.”

SUCH colds! Never was any thing like them. Day after day Max sat
by the fire with a splitting headache, cold chills running down his
back; while night after night Thekla awoke, coughing and choking
from a spot in her throat which burned like a live coal. I can tell you,
when March gives a present he does it in real earnest.
“One day in an old garret I found the doll,
who, as I said, was living in a closet.”

They were so miserable you might have thought that even


March must pity them a little. But he didn’t,—not a bit. As he told
the children, he was any thing but a “tender-hearted person.” When
they were at the very worst, they could hear him astride the roof,
roaring and whooping down the chimney in the most unfeeling way;
and he regularly banged the door open on cold nights to let the wind
in; so that, at last, Max never thought of sitting down to supper
without first putting a heavy chair against it to keep it shut. So
blustering and ill-tempered a Month was never known. But at last his
turn came to go; and, by that time, what with patience and catnip
tea the children had begun to get better.

There is a great difference, however, between being better and


being well. Thekla’s hands were still too weak and thin to twirl the
spindle, and for many a day the wood-carving had lain untouched in
the cupboard. It seemed as if they were too languid to enjoy any
thing; and, when the evening came for April’s visit, Max would hardly
take the trouble to rise and fetch the can, though Thekla reminded
him. After it was brought out, however, and the fire poked into a
blaze, they felt a little brighter. Poor things, it was a long time since
any thing pleasant had happened to them!

The night was still. The noisy winds had fallen asleep, so that
you could hear the least sounds far away in the forest. By and by
light footsteps became audible, drawing nearer; and Max had time
to run for a chair and place it in the cosiest corner, before a soft tap
fell upon the door.

“May I come in?” said a voice, very gently and politely. How
different from rude March!

This was April. She looked very young and small; and, as Thekla
went forward to greet her, she felt as if it were some little visitor of
her own age come to tea. It was with a sense of protection and
hospitality that she took from her hand a great bundle, which
seemed heavy. April sat down, and then she put her arm round
Thekla’s waist and pulled her nearer, bundle and all. She had an odd,
pretty face when you came to look at it. The lips laughed of
themselves; but the eyes, which were blue and misty, seemed to
have tears behind them all ready to fall. Or if, as sometimes
happened, the lips took a fancy to pout, then the eyes had their
turn, and brightened and twinkled so that you could not help
smiling. It would have puzzled anybody whether to call the
countenance most sad or most merry. April’s hair was all wavy and
blowsy, as if she had been out in a gale of wind. Two or three violets
were stuck in it; and the voice with which she spoke sounded like
the tinkle of rain-drops on the leaves.

“Look,” she said, “what I have brought you!” and she


unfastened the bundle, which was pinned together with a long red
thorn.

O mercy! It seemed as if the sun, which went to bed three


hours ago, had got up again, and was pouring over April’s lap on to
the kitchen floor. For there lay a great heap of dandelions, golden
and splendid, which perked up their heads, and laughed and winked
on all around. The whole room seemed to brighten from their
glorious color. And, what was funny, these dandelions had voices, as
it seemed; for out of the middle of the heap came queer sounds of
peeping and chirping, which the children could not at all understand.

April laughed. She parted the flowers, and there were two little
new-born chicks, as yellow as the yolk of an egg. They were soft
and downy; and their cunning black eyes and little beaks gave them
a knowing look, which was astonishing, when you recollected how
short a time they had been in the world. “Cheep! cheep!” they cried,
and one ran directly into Thekla’s outstretched hands. The warm
fingers felt to it like a nest; and the little creature cuddled down
contentedly, with a soft note which expressed comfort. The other,
April handed to Max.

“They are for you,” she said. “If you like them and take care of
them, you may have a whole poultry-yard some day. My broods are
not always lucky; but these will be.”

“Like them,” indeed! You should have seen the happy fuss which
went on over the new pets. Max ran for a basket; Thekla brought
flannel to line it, and meal and water; and the chicks were kissed,
fed, and tucked away as if they had been babies. By and by they fell
fast asleep under their warm coverlet; and then the children went
back to the fire, and, while Max made ringlets of the dandelion-
stalks and stuck them in Thekla’s hair, April began:—

“My story isn’t much,” she said. “I’ve told so many in the course
of my life that I’m quite exhausted, for I make it a rule never to tell
the same twice. Some are so sad that it makes me cry merely to
think of them,”—and as she said this April’s tears suddenly rained
down her face,—“and others so jolly that I should split my sides if I
tried.” Here April giggled like a school-girl, and her eyes seemed to
send out rays of sun which danced on the wet tear-stains. “So it
must always be new,” she went on; “and, ever since I saw you, I’ve
been trying to decide what it should be. There was a delightful one
about ducklings which I thought of,—but no!” and she solemnly
shook her head.

“Oh, why not? Do, pray do!” cried Max.

“Couldn’t,” said April. “That story—the first half of it at least—I


told to a little girl in England last year. I didn’t finish because
something came along and set me crying, but half is just as bad as
the whole. I couldn’t tell that again. Don’t look so disappointed,
though! I’ve got one for you; and, though it isn’t one of my best, I
dare say you’ll like it well enough. It’s about a doll.”

“A doll! Pshaw!” said Max, impolitely.

“Why, what a rude boy you are!” cried April, beginning to sob. “I
declare, I ne—never was t—treated so before.”

“Max!” exclaimed Thekla, “how could you? You’ve hurt her


feelings. Don’t cry any more, dear,” she went on,—for somehow
Thekla felt older and bigger than this fascinating little maiden who
laughed and cried by turns,—“he didn’t mean to. He is a real kind
boy, only sometimes he speaks before he thinks. And I like dolls—
oh, so much!”
“Do you?” said April, brightening. “Then it’s all right. As for you,”
she added, turning sharply round on Max, “you can go out and sit on
the steps, if you don’t want to hear it.”

“Oh!” stammered Max, dreadfully ashamed of himself, “I do. I’d


just as lief hear it as not. And I beg your pardon, if I spoke rudely.”

“Very well then,” said April, pacified. “If you feel that way, I’ll
proceed. This doll lived in a closet. I should never have come across
her probably if it hadn’t been for the house-cleaning.

“You must know that there are countries in the world where
every spring and fall the houses are all turned upside down and
inside out, and then downside up and outside in, all for the sake of
being clean. The women do it. What becomes of the men I don’t
know: they climb trees or something to be out of the way, I
suppose. I like these times, of all things. I like to swing the heavy
carpets to and fro on the lines, and flap the maids’ aprons into their
faces as they stand on the ledge outside to wash the windows. It is
great fun. And I love to creep into holes and corners, and rummage
and poke about to see what folks have got. And one day, when
doing this in an old garret, I found the doll, who, as I said, was
living in a closet. They had put her there to be out of the way of the
cleaning.

“Her name was Maria. She was big, but not very beautiful. Her
head was dented, and there were marks of finger-nails on her
cheeks, which were faded and of a purplish-pink. But her arms and
legs were bran new, and white as snow, and her body was round
and full of sawdust. I couldn’t understand this at all until she
explained it. Her head, it seemed, was twenty-five years old; and her
body had only been in the world six weeks!

“Once, she said, she had possessed a body just the same age
as her head, and then she belonged to a person she called ‘Baby
May.’ Baby May used to bump her on the floor, and dig the soft wax
out of her cheeks with her nails. This treatment soon ruined her
good looks; and when she mentioned this, Maria almost cried,—but
not quite, because, as she said, years had taught her self-command.
I don’t know what she meant,” added April, reflectively. “I’m sure
years never taught me any thing of the sort. However, that is neither
here nor there! If she hadn’t had a fine constitution, Maria never
could have endured all this cruelty. Her body didn’t. It soon sank
under its sufferings; and, after spitting sawdust for some months,
wasted away so much that May’s mother said it must go into the
ragbag. People make a great fuss about having their heads cut off,
but Maria said it was quite easy if the scissors were sharp. Snip,
snip, rip, rip, and there you are. The head was put carefully away in
a wardrobe because it was so handsome, and May’s mamma
promised to buy a new body for it; but somehow she forgot, and by
and by May grew so big that she didn’t care to play with dolls any
more. So Maria’s head went on living in the wardrobe. Having no
longer any cares of the body to disturb it, it gave itself up to the
cultivation of the intellect. A wardrobe is a capital place for study, it
appears. People keep their best things there, and rarely come to
disturb them. At night, when the house is asleep, they wake up and
talk together, and tell secrets. The silk gowns converse about the
fine parties they have gone to, and the sights they have seen. There
were several silk gowns in the wardrobe. One of them had a large
spot of ice-cream on its front breadth. She used to let the other
things smell it, that they might know what luxury was like; and once
Maria got a chance, and licked it with her tongue, but she said it
didn’t taste as she expected. There was an India shawl, too, which
would lift the lid of its box, and relate stories—oh, so interesting!—
about black faces and white turbans and hot sunshine. The laces in
the drawer came from Belgium. That was a place to learn
geography! And the Roman pearls had a history too. They were
devout Catholics, and would tell their beads all night if nobody
seemed to be listening. But the Coral in the drawer below was Red
Republican in its opinions, and made no attempt to hide it. Both
hailed from Italy, but they were always quarrelling! Oh, Maria knew
a deal! As she grew wise, she ceased to care for tea-parties, and
being taken out to walk as formerly. All she wanted was to gain
information, and strengthen her mind. At least so she said; but for
all that,” remarked April, with a sly smile, “she had some lingering
regard for looks still, for she complained bitterly of the change in her
complexion. Perhaps it was putting so much inside her head made
the outside so dull and shabby!

“Well, for twenty-three long years Maria lived in the wardrobe at
the head of polite society. She was treated with great respect. The
dresses always bowed to her when they went in and out. When their
time came for being ripped up and pieced into bedquilts, they said
farewell with many tears. All this gratified her feelings, of course. So
you can imagine what a shock it was when, one day, the wardrobe
door was suddenly opened, and she was lifted down and laid in a
pair of little clutching hands, which grasped her eagerly. A small
thumb-nail pierced her left cheek. ‘I could have screamed,’ said
Maria; ‘but where would have been the use? Dolls have positively no
rights.’”

“Who was it took her down?” asked Max, quite forgetful of his
original scorn about Maria’s history.

“It was Baby May. Not the same May, but another as like her as
two peas. In fact, the first May was grown up; and this was her little
girl. Grandmamma had bought a beautiful new body, and now
Maria’s head had to be sewed on to it. Her feelings when the
stitches were put in, she said, she could never describe. They were
like those of a poor old soldier, who, after living fifty years on his
pension, finds himself dragged from pipe and chimney-corner, and
obliged to begin again as a drummer-boy.”

“It was really cruel, I think,” said Thekla, indignantly.

“Yes,” said April; “but you haven’t heard the worst. Think of
being suddenly united to such a young body! There was Maria,
elderly and dignified, full of wisdom and experience, longing for
nothing so much as to be left alone to think over the facts she had
learned. And there were her arms and legs always wanting to be in
motion. New, impulsive, full of sawdust, it was misery to them to be
still. They wanted to dance and frisk all the time, to wear fine
clothes, to have other dolls come on visits, to drink tea out of the
baby-house tea-set, and have a good time generally. When Maria
assured them that she was tired of these things, and had seen the
vanity of them, they said they wanted to see the vanity too! And if
ever she got a quiet chance, and had fallen into a reverie about old
times and friends,—the silk stockings in the wardrobe, for instance,
and the touching story they had told her; or the shoe-buckles, who
were exiles from their country,—all of a sudden her obstreperous
limbs would assert themselves, out would flourish her legs, up fly
her hands and hit her in the eye, and the first thing she knew she
would be tumbled out on to the floor. Just think what a trial to a lady
of fine education and manners! It was enough to vex a saint. She
assured me she had lost at least three scruples of wax. But nobody
cared in the least about her scruples.”

“And what became of the poor thing in the end?” asked Thekla.

“That I can’t say,” replied April: “I had to come away, you know;
and I left her there. One of two things, she told me, was pretty sure
to happen: either her arms and legs would sober with time, or she
would get so hideous from unhappiness that May’s mamma would
buy a new head to match them. ‘Then, ah then!’ said she, ‘I may
perhaps be allowed to go back to my beloved top-shelf in the
wardrobe. Never, never will I quit it again so long as I live!’ She
ended with a sigh. I bade her farewell, but on the way downstairs I
met a little girl coming up and calling out, ‘Where dolly? me want
dolly!’ And I fear poor Maria was not left any longer in peace in the
attic closet.”

April closed her story. She took her can, poured the dandelions
into Thekla’s lap, and rose to go.

“I am late,” she said: “all my violets must be made before
midnight. I have none but these few in my hair.”

“Not yet,—stay a little longer!” pleaded the children.

“Ah, no!” said April: “I must go. You won’t miss me long: May is
coming, my sister May. Everybody loves her better than they do me,”
and she wiped her eyes dolefully as she shut the door.

“What a goose I am!” she cried, flinging it open again, with a
merry laugh. “Don’t mind my nonsense. Good-by, dears,—good-by!”

Oh, how cheerful the kitchen seemed now! Where were the
colds and the disconsolate looks? All gone; and Max and Thekla
laughed gayly into each other’s faces.

“I’ll tell you what,” said Max, “if April didn’t cry so easily, she’d
be one of the jolliest girls in the world.”

CHAPTER V.
MAY’S GARDEN.

THE chicks throve. Day by day their legs grew strong, their yellow
bodies round and full, and their calls for food more clamorous. As
the snow melted, and the sun made warm spots on the earth, they
began to run from the cottage-door, and poke and scratch about
with their bills. But they always came back to the basket to sleep;
and Thekla prepared their food, and watched over them as well as
any old hen could have done.