Reinforcement Learning for Finance
Solve Problems in Finance with CNN and RNN Using the TensorFlow Library

Samit Ahlawat
Irvington, NJ, USA

ISBN-13 (pbk): 978-1-4842-8834-4
ISBN-13 (electronic): 978-1-4842-8835-1
https://doi.org/10.1007/978-1-4842-8835-1

Copyright © 2023 by Samit Ahlawat


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or
part of the material is concerned, specifically the rights of translation, reprinting, reuse of
illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way,
and transmission or information storage and retrieval, electronic adaptation, computer software,
or by similar or dissimilar methodology now known or hereafter developed.
Trademarked names, logos, and images may appear in this book. Rather than use a trademark
symbol with every occurrence of a trademarked name, logo, or image we use the names, logos,
and images only in an editorial fashion and to the benefit of the trademark owner, with no
intention of infringement of the trademark.
The use in this publication of trade names, trademarks, service marks, and similar terms, even if
they are not identified as such, is not to be taken as an expression of opinion as to whether or not
they are subject to proprietary rights.
While the advice and information in this book are believed to be true and accurate at the date of
publication, neither the authors nor the editors nor the publisher can accept any legal
responsibility for any errors or omissions that may be made. The publisher makes no warranty,
express or implied, with respect to the material contained herein.
Managing Director, Apress Media LLC: Welmoed Spahr
Acquisitions Editor: Celestin Suresh John
Development Editor: Laura Berendson
Coordinating Editor: Mark Powers
Cover designed by eStudioCalamar
Cover image by Joel Filipe on Unsplash (www.unsplash.com)
Distributed to the book trade worldwide by Apress Media, LLC, 1 New York Plaza, New York, NY
10004, U.S.A. Phone 1-800-SPRINGER, fax (201) 348-4505, e-mail [email protected],
or visit www.springeronline.com. Apress Media, LLC is a California LLC and the sole member
(owner) is Springer Science + Business Media Finance Inc (SSBM Finance Inc). SSBM Finance
Inc is a Delaware corporation.
For information on translations, please e-mail [email protected]; for
reprint, paperback, or audio rights, please e-mail [email protected].
Apress titles may be purchased in bulk for academic, corporate, or promotional use. eBook versions and licenses are also available for most titles. For more information, reference our Print and eBook Bulk Sales web page at http://www.apress.com/bulk-sales.
Any source code or other supplementary material referenced by the author in this book is available to readers on GitHub (https://github.com/Apress). For more detailed information, please visit http://www.apress.com/source-code.
Printed on acid-free paper
To my family and friends without whose support this book
would not have been possible.
Table of Contents

About the Author  ix
Acknowledgments  xi
Preface  xiii
Introduction  xv

Chapter 1: Overview  1
  1.1 Methods for Training Neural Networks  2
  1.2 Machine Learning in Finance  3
  1.3 Structure of the Book  4

Chapter 2: Introduction to TensorFlow  5
  2.1 Tensors and Variables  5
  2.2 Graphs, Operations, and Functions  11
  2.3 Modules  14
  2.4 Layers  17
  2.5 Models  25
  2.6 Activation Functions  33
  2.7 Loss Functions  37
  2.8 Metrics  46
  2.9 Optimizers  77
  2.10 Regularizers  96
  2.11 TensorBoard  120
  2.12 Dataset Manipulation  122
  2.13 Gradient Tape  126

Chapter 3: Convolutional Neural Networks  139
  3.1 A Simple CNN  140
  3.2 Neural Network Layers Used in CNNs  148
  3.3 Output Shapes and Trainable Parameters of CNNs  150
  3.4 Classifying Fashion MNIST Images  152
  3.5 Identifying Technical Patterns in Security Prices  159
  3.6 Using CNNs for Recognizing Handwritten Digits  172

Chapter 4: Recurrent Neural Networks  177
  4.1 Simple RNN Layer  178
  4.2 LSTM Layer  182
  4.3 GRU Layer  186
  4.4 Customized RNN Layers  188
  4.5 Stock Price Prediction  190
  4.6 Correlation in Asset Returns  207

Chapter 5: Reinforcement Learning Theory  233
  5.1 Basics  234
  5.2 Methods for Estimating the Markov Decision Problem  240
  5.3 Value Estimation Methods  241
    5.3.1 Dynamic Programming  242
    5.3.2 Generalized Policy Iteration  265
    5.3.3 Monte Carlo Method  277
    5.3.4 Temporal Difference (TD) Learning  284
    5.3.5 Cartpole Balancing  305
  5.4 Policy Learning  319
    5.4.1 Policy Gradient Theorem  319
    5.4.2 REINFORCE Algorithm  321
    5.4.3 Policy Gradient with State-Action Value Function Approximation  323
    5.4.4 Policy Learning Using Cross Entropy  325
  5.5 Actor-Critic Algorithms  326
    5.5.1 Stochastic Gradient–Based Actor-Critic Algorithms  329
    5.5.2 Building a Trading Strategy  330
    5.5.3 Natural Actor-Critic Algorithms  346
    5.5.4 Cross Entropy–Based Actor-Critic Algorithms  347

Chapter 6: Recent RL Algorithms  349
  6.1 Double Deep Q-Network: DDQN  349
  6.2 Balancing a Cartpole Using DDQN  353
  6.3 Dueling Double Deep Q-Network  356
  6.4 Noisy Networks  357
  6.5 Deterministic Policy Gradient  359
    6.5.1 Off-Policy Actor-Critic Algorithm  360
    6.5.2 Deterministic Policy Gradient Theorem  361
  6.6 Trust Region Policy Optimization: TRPO  362
  6.7 Natural Actor-Critic Algorithm: NAC  368
  6.8 Proximal Policy Optimization: PPO  369
  6.9 Deep Deterministic Policy Gradient: DDPG  370
  6.10 D4PG  373
  6.11 TD3PG  376
  6.12 Soft Actor-Critic: SAC  379
  6.13 Variational Autoencoder  384
  6.14 VAE for Dimensionality Reduction  389
  6.15 Generative Adversarial Networks  399

Bibliography  403
Index  411
About the Author
Samit Ahlawat is Senior Vice President in Quantitative Research, Capital Modeling, at JPMorgan Chase in New York, USA. In his current role, he is responsible for building trading strategies for asset management and for building risk management models. His research interests include artificial intelligence, risk management, and algorithmic trading strategies. He has given CQF Institute talks on artificial intelligence, has authored several research papers in finance, and holds a patent for facial recognition technology. In his spare time, he contributes to open source code.
Acknowledgments
I would like to express my heartfelt appreciation for my friends and coworkers, in academia and the workplace, who encouraged me to write this book.
Preface
When I began using artificial intelligence tools in quantitative financial research, I could not find a comprehensive introductory text focusing on financial applications. Neural network libraries like TensorFlow, PyTorch, and Caffe had made tremendous contributions to the rapid development, testing, and deployment of deep neural networks, but I found most applications restricted to computer science, computer vision, and robotics. Having to use reinforcement learning algorithms in finance served as another reminder of the paucity of texts in this field. Furthermore, I found myself referring to scholarly articles and papers for mathematical proofs of new reinforcement learning algorithms. This led me to write this book to provide a one-stop resource for Python programmers to learn the theory behind reinforcement learning, augmented with practical examples drawn from the field of finance.

In practical applications, reinforcement learning draws upon deep neural networks. To facilitate exposition of topics in reinforcement learning and for continuity, this book also provides an introduction to TensorFlow and covers neural network topics like convolutional neural networks (CNNs) and recurrent neural networks (RNNs).

Finally, this book also introduces readers to writing modular, reusable, and extensible reinforcement learning code. Having worked on developing trading strategies using reinforcement learning and publishing papers, I felt that existing reinforcement learning libraries like TF-Agents are tightly coupled with the underlying implementation framework and do not express central concepts in reinforcement learning in a manner modular enough for someone conversant with those concepts to pick up the TF-Agents library or extend its algorithms for specific applications. The code samples covered in this book provide examples of how to write modular code for reinforcement learning.
Introduction
Reinforcement learning is a rapidly growing area of artificial intelligence in which an agent learns from past experience of rewards gained by taking specific actions in certain states. The agent seeks to learn a policy prescribing the optimum action in each state, with the objective of maximizing expected discounted future rewards. Unlike supervised learning, it does not rely on labeled ground-truth outputs: the agent learns the optimum policy through its past interactions with the environment. Supervised learning, by contrast, seeks to learn the pattern of output corresponding to each input in training data, training the model parameters to obtain a close correspondence between predicted and actual output for a given set of inputs.

This book outlines the theory behind reinforcement learning and illustrates it with examples of implementations using TensorFlow. The examples demonstrate the theory and implementation details of the algorithms, supplemented with a discussion of corresponding APIs from TensorFlow and examples drawn from quantitative finance. The book guides a reader familiar with Python programming from a basic to an advanced understanding of reinforcement learning algorithms, coupled with a comprehensive discussion of how to use state-of-the-art software libraries to implement advanced reinforcement learning algorithms.

Most applications of reinforcement learning have focused on robotics or computer science tasks. By focusing on examples drawn from finance, this book illustrates a spectrum of financial applications that can benefit from reinforcement learning.
CHAPTER 1

Overview
Deep neural networks have transformed virtually every scientific human endeavor – from image recognition, medical imaging, robotics, and self-driving cars to space exploration. The extent of transformation heralded by neural networks is unrivaled in contemporary human history, judging by the range of new products that leverage them. Smartphones, smartwatches, and digital assistants – to name a few – demonstrate the promise of neural networks and signal their emergence as a mainstream technology. The rapid development of artificial intelligence and machine learning algorithms has coincided with increasing computational power, enabling them to run rapidly. Keeping pace with new developments in this field, various open source libraries implementing neural networks have blossomed. Python has emerged as the lingua franca of the artificial intelligence programming community. This book aims to equip Python-proficient programmers with comprehensive knowledge of how to use the TensorFlow library for coding deep neural networks and reinforcement learning algorithms effectively. It achieves this by providing detailed mathematical proofs of key theorems, supplemented by implementations of those algorithms to solve real-life problems.

Finance has been an early adopter of artificial intelligence algorithms, with neural networks applied to designing trading strategies as early as the 1980s. For example, White (1988) applied a simple neural network to find nonlinear patterns in IBM stock prices. However, recent cutting-edge research on reinforcement learning has focused predominantly on robotics, computer science, or interactive game-playing. The lack of financial applications has led many to question the applicability of deep neural networks in finance, where traditional quantitative models are ubiquitous. Finance practitioners feel that the lack of rigorous mathematical proofs and transparency about how neural networks work has restricted their wider adoption within finance. This book aims to address both of these concerns by focusing on real-life financial applications of neural networks.

1.1 Methods for Training Neural Networks


Neural networks can be trained using one of the following three methods:

1. Supervised learning involves using a training dataset with known output, also called ground truth values. For a classification task, this would be the true labels, while for a regression task, it would be the actual output value. A loss function is formulated that measures the deviation of the model output from the true output. This function is minimized with respect to the model parameters using stochastic gradient descent (see the sketch after this list).

2. Unsupervised learning methods use a training dataset made up of input features without any knowledge of the true output values. The objective is to classify inputs into clusters for clustering or dimension reduction applications or for identifying outliers.

3. Reinforcement learning involves an agent that learns an optimal policy within the framework of a Markov decision problem (MDP). The training dataset consists of a set of actions taken in different states by an agent, followed by rewards earned and the next state to which the agent transitions. Using the history of rewards, reinforcement learning attempts to learn an optimal policy to maximize the expected sum of discounted future rewards. This book focuses on reinforcement learning.
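As a minimal sketch of the first method (the synthetic dataset, layer size, and learning rate below are illustrative assumptions, not taken from the book), the following code fits a single-neuron linear model by minimizing a mean squared error loss with stochastic gradient descent:

import numpy as np
import tensorflow as tf

# Toy regression dataset: y = 3x + 1 plus noise (the known ground truth values)
x = np.random.random((256, 1)).astype(np.float32)
y = 3.0 * x + 1.0 + 0.1 * np.random.randn(256, 1).astype(np.float32)

# Single-neuron linear model; mean squared error measures the deviation
# of the model output from the true output
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1),
              loss=tf.keras.losses.MeanSquaredError())

# Stochastic gradient descent updates the weights over several epochs
model.fit(x, y, batch_size=32, epochs=50, verbose=0)
print(model.get_weights())  # weights should approach [[3.0]] and [1.0]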

1.2 Machine Learning in Finance


Machine learning applications in finance date back to the 1980s with the
use of neural networks in stock price prediction (White, 1988). Within
finance, automated trading strategies and portfolio management have
been early adopters of artificial intelligence and machine learning tools.
Allen and Karjalainen (1999) applied genetic algorithms to combine
simple trading rules to form more complex ones. More recent applications
of machine learning in finance can be seen in the works of Savin et al.
(2007), who used the pattern recognition method presented by Lo et al.
(2000) to test if the head-and-shoulders pattern had predictive power;
Chavarnakul and Enke (2008), who employed a generalized regression
neural network (GRNN) to construct two trading strategies based on
equivolume charting that predicted the next day’s price using volume-
and price-based technical indicators; and Ahlawat (2016), who applied
probabilistic neural networks to predict technical patterns in stock
prices. Other works include Enke and Thawornwong (2005), Li and Kuo
(2008), and Leigh et al. (2005). Chenoweth et al. (1996) have studied the
application of neural networks in finance. Enke and Thawornwong (2005)
tested the hypothesis that neural networks can provide superior prediction
of future returns based on their ability to identify nonlinear relationships.
They employed only fundamental measures and did not consider
technical ones. Their neural network provided higher returns than the
buy-and-hold strategy, but they did not consider transaction costs.

There are many other applications of machine learning in finance besides trading strategies, perhaps less glamorous but equally significant in business impact. This book gives a comprehensive exposition of several machine learning applications in finance that are at the cutting edge of research and practical use.

1.3 Structure of the Book


This book begins with an introduction to the TensorFlow library in
Chapter 2 and illustrates the concepts with financial applications that
involve building models to solve practical problems. The datasets for
problems are publicly available. Relevant concepts are illustrated with
mathematical equations and concise explanations.
Chapter 3 introduces readers to convolutional neural networks
(CNNs), and Chapter 4 follows up with a similar treatment of recurrent
neural networks (RNNs). These networks are frequently used in building
value function models and policies in reinforcement learning, and a
comprehensive understanding of CNN and RNN is indispensable for
using reinforcement learning effectively on practical problems. As before,
all foundational concepts are illustrated with mathematical theory,
explanation, and practical implementation examples.
Chapter 5 introduces reinforcement learning concepts: from Markov
decision problem (MDP) formulation to defining value function and
policies, followed by a comprehensive discussion of reinforcement
learning algorithms illustrated with examples and mathematical proofs.
Finally, Chapter 6 provides a discussion of recent, groundbreaking
advances in reinforcement learning by discussing technical papers and
applying those algorithms to practical applications.

CHAPTER 2

Introduction to TensorFlow

TensorFlow is an open source, high-performance machine learning library developed by Google and released for public use in 2015. It has interfaces for the Python, C++, and Java programming languages and has the option of running on multiple CPUs or GPUs. TensorFlow offers two modes of execution: eager mode, which executes instructions immediately, and graph mode, which creates a dependency graph and executes nodes in that graph only when needed.

This book uses TensorFlow 2.9.1. Older TensorFlow constructs from version 1 of the library, such as Session and placeholder, are not covered here; their use has been rendered obsolete in TensorFlow version 2.0 and higher. Output shown in the code listings has been generated using the PyCharm IDE's interactive shell.

2.1 Tensors and Variables


Tensors are n-dimensional arrays, similar in functionality to the numpy library's ndarray object. They are instances of the tf.Tensor class. A three-dimensional tensor of 32-bit floating-point numbers can be created using the code in Listing 2-1. A tensor has attributes shape and dtype that give the shape and data type of the tensor. Once created, tensors retain their shape.

Listing 2-1. Creating a Three-Dimensional Tensor

import tensorflow as tf

tensor = tf.constant([[list(range(3))],
                      [list(range(1, 4))],
                      [list(range(2, 5))]], dtype=tf.float32)

print(tensor)

tf.Tensor(
[[[0. 1. 2.]]
 [[1. 2. 3.]]
 [[2. 3. 4.]]], shape=(3, 1, 3), dtype=float32)

Most numpy functions for creating ndarrays have analogs in TensorFlow, for example, tf.ones, tf.zeros, tf.eye, tf.ones_like, etc. Tensors support the usual mathematical operations like + and −, in addition to matrix operations like transpose, matmul, and einsum, as shown in Listing 2-2.

Listing 2-2. Mathematical Operations on Tensors

import tensorflow as tf

ar = tf.constant([[1, 2], [2, 2]], dtype=tf.float32)

print(ar)
<tf.Tensor: id=1, shape=(2, 2), dtype=float32, numpy=
array([[1., 2.],
       [2., 2.]], dtype=float32)>

# elementwise multiplication
print(ar * ar)
<tf.Tensor: id=2, shape=(2, 2), dtype=float32, numpy=
array([[1., 4.],
       [4., 4.]], dtype=float32)>

# matrix multiplication C = tf.matmul(A, B) => cij = sum_k (aik * bkj)
print(tf.matmul(ar, tf.transpose(ar)))
<tf.Tensor: id=5, shape=(2, 2), dtype=float32, numpy=
array([[5., 6.],
       [6., 8.]], dtype=float32)>

# generic way of matrix multiplication
print(tf.einsum("ij,kj->ik", ar, ar))
<tf.Tensor: id=23, shape=(2, 2), dtype=float32, numpy=
array([[5., 6.],
       [6., 8.]], dtype=float32)>

# cross product
print(tf.einsum("ij,kl->ijkl", ar, ar))
<tf.Tensor: id=32, shape=(2, 2, 2, 2), dtype=float32, numpy=
array([[[[1., 2.],
         [2., 2.]],
        [[2., 4.],
         [4., 4.]]],
       [[[2., 4.],
         [4., 4.]],
        [[2., 4.],
         [4., 4.]]]], dtype=float32)>

Tensors can be sliced using the usual Python notation with a colon. For advanced slicing, use tf.slice, which accepts a begin index and the number of elements to take along each axis. tf.strided_slice can be used to add a stride. To gather specific indices from a tensor, use tf.gather. To extract specific elements of a multidimensional tensor specified by a list of indices, use tf.gather_nd. These APIs are illustrated with examples in Listing 2-3.

Listing 2-3. Tensor Slicing Operations

import tensorflow as tf

tensor = tf.constant([[1, 2], [2, 2]], dtype=tf.float32)

print(tensor[1:, :])
<tf.Tensor: id=37, shape=(1, 2), dtype=float32, numpy=array([[2., 2.]], dtype=float32)>

print(tf.slice(tensor, begin=[0, 1], size=[2, 1]))
tf.Tensor(
[[2.]
 [2.]], shape=(2, 1), dtype=float32)

print(tf.gather_nd(tensor, indices=[[0, 1], [1, 0]]))
<tf.Tensor: id=42, shape=(2,), dtype=float32, numpy=array([2., 2.], dtype=float32)>
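Listing 2-3 does not exercise tf.gather or tf.strided_slice mentioned above; the short sketch below (not from the book) applies both to the same 2×2 tensor:

import tensorflow as tf

tensor = tf.constant([[1, 2], [2, 2]], dtype=tf.float32)

# gather rows 1 and 0, in that order
print(tf.gather(tensor, indices=[1, 0]))

# slice rows [0, 2) with stride 1 and columns [0, 2) with stride 2,
# keeping only the first column
print(tf.strided_slice(tensor, begin=[0, 0], end=[2, 2], strides=[1, 2]))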

Ragged tensors are tensors with a nonuniform shape along an axis, as illustrated in Listing 2-4.

Listing 2-4. Ragged Tensors

import tensorflow as tf

jagged = tf.ragged.constant([[1, 2], [2]])
print(jagged)
<tf.RaggedTensor [[1, 2], [2]]>

TensorFlow allows space-efficient storage of sparse arrays, that is, arrays with most elements equal to 0. The tf.sparse.SparseTensor API takes the indices of non-zero elements, their values, and the dense shape of the sparse array. This is shown in Listing 2-5.

Listing 2-5. Sparse Tensors

import tensorflow as tf

tensor = tf.sparse.SparseTensor(indices=[[1, 0], [2, 2]], values=[1, 2], dense_shape=[3, 4])
print(tensor)
SparseTensor(indices=tf.Tensor(
[[1 0]
 [2 2]], shape=(2, 2), dtype=int64), values=tf.Tensor([1 2], shape=(2,), dtype=int32), dense_shape=tf.Tensor([3 4], shape=(2,), dtype=int64))

print(tf.sparse.to_dense(tensor))
tf.Tensor(
[[0 0 0 0]
 [1 0 0 0]
 [0 0 2 0]], shape=(3, 4), dtype=int32)

In contrast to tf.Tensor, which is immutable after creation, a TensorFlow variable can be changed. A variable is an instance of the tf.Variable class and can be created by initializing it with a tensor. Variables can be converted to tensors using tf.convert_to_tensor. Variables cannot be reshaped after creation, only modified; calling tf.reshape on a variable returns a new tensor. Variables can also be created from another variable, but the operation copies the underlying tensor: variables do not share underlying data. assign can be used to update a variable by changing its data tensor. assign_add is another useful method of a variable that replicates the functionality of the += operator. Operations on tensors like matmul or einsum can also be applied to variables or to a combination of tensors and variables. A variable has a Boolean attribute called trainable that signifies whether the variable is to be trained during backpropagation. Operations on variables are shown in Listing 2-6.

Listing 2-6. Variables

import tensorflow as tf

tensor = tf.constant([[1, 2], [3, 4]])
variable = tf.Variable(tensor)
print(variable)
<tf.Variable 'Variable:0' shape=(2, 2) dtype=int32, numpy=
array([[1, 2],
       [3, 4]])>

# return the index of the highest element
print(tf.math.argmax(variable))
tf.Tensor([1 1], shape=(2,), dtype=int64)

print(tf.convert_to_tensor(variable))
tf.Tensor(
[[1 2]
 [3 4]], shape=(2, 2), dtype=int32)

print(variable.assign([[1, 2], [1, 1]]))
<tf.Variable 'UnreadVariable' shape=(2, 2) dtype=int32, numpy=
array([[1, 2],
       [1, 1]])>
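Listing 2-6 does not show assign_add or the trainable attribute mentioned above; a minimal sketch (not from the book):

import tensorflow as tf

counter = tf.Variable([1.0, 2.0], trainable=False)  # excluded from gradient updates
counter.assign_add([0.5, 0.5])                       # in-place equivalent of +=
print(counter.numpy())       # [1.5 2.5]
print(counter.trainable)     # False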

2.2 Graphs, Operations, and Functions


There are two modes of execution within TensorFlow: eager execution
and graph execution. Eager mode of execution processes instructions as
they occur in the code, while graph execution is delayed. Graph mode
builds a dependency graph connecting the data represented as tensors
(or variables) using operations and functions. After the graph is built, it is
executed. Graph execution offers a few advantages over eager execution:

1. Graphs can be exported to files or executed in non-Python environments such as mobile devices.

2. Graphs can be compiled to speed up execution.

3. Nodes with static data and operations on those nodes can be precomputed.

4. Node values that are used multiple times can be cached.

5. Branches of the graph can be identified for parallel execution.

Operations in TensorFlow are represented using the tf.Operation class and can be used as nodes in a graph. Operation nodes are created using one of the predefined operations such as tf.matmul, tf.reduce_sum, etc. To create a new operation, use the tf.Operation class. A few important operations are enumerated in the following. The tf.math operations listed can also be accessed directly using the tf.operation_name syntax (for example, tf.abs for tf.math.abs).

1. Operations defined in the tf.math library:

   • tf.abs: Calculates the absolute value of a tensor.
   • tf.divide: Divides two tensors.
   • tf.maximum: Returns the element-wise maximum of two tensors.
   • tf.reduce_sum: Calculates the sum of all tensor elements. It takes an optional axis argument to calculate the sum along that axis.

2. Operations defined in the tf.linalg library (illustrated in the sketch after this list):

   (a) tf.linalg.det: Calculates the determinant of a square matrix.
   (b) tf.linalg.svd: Calculates the SVD decomposition of a rectangular matrix provided as a tensor.
   (c) tf.linalg.trace: Returns the trace of a tensor.
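A minimal sketch of these tf.linalg operations (the matrix values are illustrative, not from the book):

import tensorflow as tf

m = tf.constant([[4.0, 1.0], [2.0, 3.0]])

print(tf.linalg.det(m))    # determinant: 4*3 - 1*2 = 10
print(tf.linalg.trace(m))  # trace: 4 + 3 = 7

# singular value decomposition: singular values s and unitary matrices u, v
s, u, v = tf.linalg.svd(m)
print(s)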


Functions are defined using the tf.function method, passing the Python function as an argument. tf.function is a decorator that augments a Python function with attributes necessary for running it in a TensorFlow graph. A few examples of TensorFlow operations and functions are illustrated in Listing 2-7. Each TensorFlow function generates an internal graph from its arguments. By default, a TensorFlow function uses the graph execution model. To switch to eager execution mode, call tf.config.run_functions_eagerly(True). Please note that the following output may not match output from another run because of the random numbers used.

Listing 2-7. TensorFlow Operations and Functions

import tensorflow as tf
import numpy as np

tensor = tf.constant(np.ones((3, 3), dtype=np.int32))

print(tensor)
<tf.Tensor: id=0, shape=(3, 3), dtype=int32, numpy=
array([[1, 1, 1],
       [1, 1, 1],
       [1, 1, 1]])>

print(tf.reduce_sum(tensor))
<tf.Tensor: id=2, shape=(), dtype=int32, numpy=9>

print(tf.reduce_sum(tensor, axis=1))
<tf.Tensor: id=4, shape=(3,), dtype=int32, numpy=array([3, 3, 3])>

@tf.function
def sigmoid_activation(inputs, weights, bias):
    x = tf.matmul(inputs, weights) + bias
    return tf.divide(1.0, 1 + tf.exp(-x))

inputs = tf.constant(np.ones((1, 3), dtype=np.float64))
weights = tf.Variable(np.random.random((3, 1)))
bias = tf.ones((1, 3), dtype=tf.float64)

print(sigmoid_activation(inputs, weights, bias))
<tf.Tensor: id=195, shape=(1, 3), dtype=float64, numpy=array([[0.89564016, 0.89564016, 0.89564016]])>

Code shown in Listing 2-8 sets the default execution mode to graph mode.

Listing 2-8. Running TensorFlow Operations in Graph (Non-eager) Mode

import timeit

tf.config.experimental_run_functions_eagerly(False)
t1 = timeit.timeit(lambda: sigmoid_activation(inputs, weights,
                                              tf.constant(np.random.random((1, 3)))),
                   number=1000)
print(t1)
0.7758807

2.3 Modules
TensorFlow uses the base class tf.Module to build layers and models. A
module is a class that keeps track of its state using instance variables and can
be called as a function. To achieve this, it must provide an implementation
for the method __call__. This is illustrated in Listing 2-9. Due to the use of
random numbers, output values may vary from those shown.

Listing 2-9. Custom Module

import tensorflow as tf
import numpy as np


class ExampleModule(tf.Module):
    def __init__(self, name=None):
        super(ExampleModule, self).__init__(name=name)
        self.weights = tf.Variable(np.random.random(5), name="weights")
        self.const = tf.Variable(np.array([1.0]), dtype=tf.float64,
                                 trainable=False, name="constant")

    def __call__(self, x, *args, **kwargs):
        return tf.matmul(x, self.weights[:, tf.newaxis]) + self.const[tf.newaxis, :]


em = ExampleModule()
x = tf.constant(np.ones((1, 5)), dtype=tf.float64)
print(em(x))

<tf.Tensor: id=24631, shape=(1, 1), dtype=float64, numpy=array([[2.45019464]])>

Module is the base class for both layers and models. It can be used as
a model, serving as a collection of layers. Module shown in Listing 2-10
defers the creation of weights for the first layer until inputs are provided.
Once input shape is known, it creates the tensors to store the weights.
Decorator tf.function can be added to the __call__ method to convert it to
a graph.


Listing 2-10. Module

import tensorflow as tf


class InferInputSizeModule(tf.Module):
    def __init__(self, noutput, name=None):
        super().__init__(name=name)
        self.weights = None
        self.noutput = noutput
        self.bias = tf.Variable(tf.zeros([noutput]), name="bias")

    def __call__(self, x, *args, **kwargs):
        if self.weights is None:
            self.weights = tf.Variable(tf.random.normal([x.shape[-1], self.noutput]))

        output = tf.matmul(x, self.weights) + self.bias
        return tf.nn.sigmoid(output)


class SimpleModel(tf.Module):
    def __init__(self, name=None):
        super().__init__(name=name)

        self.layer1 = InferInputSizeModule(noutput=4)
        self.layer2 = InferInputSizeModule(noutput=1)

    @tf.function
    def __call__(self, x, *args, **kwargs):
        x = self.layer1(x)
        return self.layer2(x)


model = SimpleModel()
print(model(tf.ones((1, 10))))

<tf.Tensor: id=24700, shape=(1, 1), dtype=float32, numpy=array([[0.632286]], dtype=float32)>

Objects of type tf.Module can be saved to checkpoint files. Creating a checkpoint creates two files: one with module data and another containing metadata with the extension .index. Saving a module to a checkpoint and loading it back from a checkpoint is illustrated in Listing 2-11.

Listing 2-11. Checkpoint a Model

import tensorflow as tf

path = r"C:\temp\simplemodel"
checkpoint = tf.train.Checkpoint(model=model)
checkpoint.write(path)

model2 = SimpleModel()
model_orig = tf.train.Checkpoint(model=model2)
model_orig.restore(path)

2.4 Layers
Layers are objects with tf.keras.layers.Layer as the base class. The Keras library is used in TensorFlow for implementing layers and models. The tf.keras.layers.Layer class derives from the tf.Module class and has a method call in place of the __call__ method of tf.Module. There are several advantages to using Keras instead of tf.Module. For instance, trainable variables of nested Keras layers are automatically collected for training during backpropagation, whereas with tf.Module, variables have to be collected explicitly by the programmer. Additionally, one can provide an optional build method that gets called the first time the layer is invoked through the call method, to initialize layer weights or other state variables based on input shape.

According to TensorFlow convention, input is always a two-dimensional or higher tensor. The first dimension indexes the batch. For example, if we have a set of N inputs, with each input comprising one feature, the input shape will be (N, 1). Notice how TensorFlow requires the first dimension to correspond to the number of samples in the batch. Similarly, the first dimension of the output is the batch size.
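As a minimal illustration of this convention (the layer size and data below are illustrative, not from the book), a dense layer applied to a batch of N = 4 single-feature samples preserves the batch dimension:

import tensorflow as tf

# 4 samples in the batch, each with 1 feature: input shape (N, 1) = (4, 1)
x = tf.constant([[0.0], [1.0], [2.0], [3.0]])

layer = tf.keras.layers.Dense(3)
y = layer(x)
print(y.shape)  # (4, 3): the first dimension is still the batch size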
TensorFlow layers are derived from base class tf.keras.layers.Layer.
A layer has the following noteworthy methods. For a full list, please check
the TensorFlow API reference:

1. The __init__(self) method initializes layer weights or other instance variables.

2. The build(self, input_shape) method is optional. When provided, it gets called with the input_shape parameter the first time the layer is called.

3. The call(self, inputs, *args, **kwargs) method takes the input and produces the output. This method takes two optional arguments:

   • training: A Boolean argument indicating whether the call to the layer is made during training or prediction. This argument may be used if the layer needs to do special work during training or prediction calls.

   • mask: A Boolean tensor indicating a mask. For example, a layer could apply special logic to inputs whose batch number is present in the mask, or a recurrent neural network layer can use it to flag special timesteps.

4. The get_config(self) method returns a dictionary with layer configurations that need to be serialized when saving a checkpoint.

5. weights is a property of the Layer class and cannot be set in derived classes. Variables, that is, instances of type tf.Variable, that are assigned as instance attributes become constituents of the weights property.

6. trainable_weights is also a property of the Layer class that contains the trainable weights of the layer.

7. add_loss: Adds additional losses, like a regularization loss, to the loss function.

8. add_metric: Adds additional metrics for tracking training performance.

9. get_weights: Gets all the weights – both trainable and non-trainable – of a layer as a list of numpy arrays.

10. set_weights: Sets the weights of this layer to those provided in a list of numpy arrays. The structure of this list must be identical to the list returned by get_weights.

Sample code shown in Listing 2-12 creates a custom Keras layer that
applies an upper bound of 0.9 on all its inputs. Before the first call to Layer,
the build method has not been called, and weights is empty. After the
first call to Layer, weights and trainable_weights properties have been
initialized as seen from the output.


Listing 2-12. Writing a Customized Layer

import tensorflow as tf
from tensorflow.keras.layers import Layer


class CustomDenseLayer(Layer):
    def __init__(self, neurons):
        super().__init__()
        self.neurons = neurons

    def build(self, input_shape):
        # input_shape[-1] is the number of features for this layer
        self.wt = tf.Variable(tf.random.normal((input_shape[-1], self.neurons), dtype=tf.float32),
                              trainable=True)
        self.bias = tf.Variable(tf.zeros((self.neurons,), dtype=tf.float32),
                                trainable=True)
        self.upperBound = tf.constant(0.9, dtype=tf.float32, shape=(input_shape[-1],))

    def call(self, inputs):
        return tf.matmul(tf.minimum(self.upperBound, inputs), self.wt) + self.bias


layer = CustomDenseLayer(5)
print(layer.weights)
print(layer.trainable_weights)

[]
[]

input = tf.random_normal_initializer(mean=0.5)(shape=(2, 5), dtype=tf.float32)
print(layer(inputs=input))

<tf.Tensor: id=171, shape=(2, 5), dtype=float32, numpy=
array([[-1.1098292 , -0.2773003 ,  0.24687909,  1.0952137 ,  1.221024  ],
       [-1.116677  , -0.4057744 ,  0.18726291,  1.0598873 ,  1.3692323 ]],
      dtype=float32)>

print(layer.weights)

[<tf.Variable 'custom_dense_layer_4/Variable:0' shape=(5, 5) dtype=float32, numpy=
array([[-1.3313855 , -0.7012864 , -1.003786  , -0.6224709 ,  3.0700085 ],
       [-0.1896328 ,  1.156029  ,  0.5904321 ,  0.20901136, -0.6205104 ],
       [-0.13661204, -1.201732  , -0.08776241,  0.64640564, -0.9309348 ],
       [-0.6379096 ,  0.43822217, -0.13019271,  0.4309327 ,  0.8983831 ],
       [ 0.03697195, -0.30708486,  1.1169728 ,  1.5509295 ,  0.3927749 ]],
      dtype=float32)>, <tf.Variable 'custom_dense_layer_4/Variable:0' shape=(5,) dtype=float32, numpy=array([0., 0., 0., 0., 0.], dtype=float32)>]

print(layer.trainable_weights)

[<tf.Variable 'custom_dense_layer_4/Variable:0' shape=(5, 5) dtype=float32, numpy=
array([[-1.3313855 , -0.7012864 , -1.003786  , -0.6224709 ,  3.0700085 ],
       [-0.1896328 ,  1.156029  ,  0.5904321 ,  0.20901136, -0.6205104 ],
       [-0.13661204, -1.201732  , -0.08776241,  0.64640564, -0.9309348 ],
       [-0.6379096 ,  0.43822217, -0.13019271,  0.4309327 ,  0.8983831 ],
       [ 0.03697195, -0.30708486,  1.1169728 ,  1.5509295 ,  0.3927749 ]],
      dtype=float32)>, <tf.Variable 'custom_dense_layer_4/Variable:0' shape=(5,) dtype=float32, numpy=array([0., 0., 0., 0., 0.], dtype=float32)>]

Keras layers also provide the ability to add loss functions like a
regularization loss to the overall loss function and to track additional metrics.

Listing 2-13. Creating a Custom Layer for Lasso (L1) Regularization

import tensorflow as tf
from tensorflow.keras.layers import Layer


class LassoLossLayer(Layer):
    def __init__(self, features, neurons):
        super().__init__()
        self.wt = tf.Variable(tf.random.normal((features, neurons), dtype=tf.float32),
                              trainable=True)
        self.bias = tf.Variable(tf.zeros((neurons,), dtype=tf.float32),
                                trainable=True)
        self.meanMetric = tf.keras.metrics.Mean()

    def call(self, inputs):
        # LASSO regularization loss
        self.add_loss(tf.reduce_sum(tf.abs(self.wt)))
        self.add_loss(tf.reduce_sum(tf.abs(self.bias)))
        # metric to calculate mean of inputs
        self.add_metric(self.meanMetric(inputs))
        return tf.matmul(inputs, self.wt) + self.bias

In practice, one rarely needs to create custom layers. TensorFlow provides a range of layers useful in different neural networks. A few of them are described in the following; for a complete list, refer to the TensorFlow API:
• Average: Takes the average of its inputs.

• AveragePooling1D: One-dimensional pooling layer used in convolutional neural networks. It takes pooling size and stride arguments. AveragePooling2D and AveragePooling3D layers are also available.

• BatchNormalization: Normalizes the input by subtracting the batch mean and dividing by the batch standard deviation during training. During prediction, when the training argument is False, it uses a moving average of the mean and standard deviation computed from the values seen during the training phase and the current batch mean and standard deviation.

• Conv1D: One-dimensional convolution layer with a provided number of filters (or number of channels), kernel size, and stride. Two- and three-dimensional convolution layers are also available.

• Conv1DTranspose: Deconvolution layer that produces the inverse of a convolution layer.

• Dense: A layer that connects all neurons in the layer to the features (layer inputs).

• Dropout: Randomly sets the rate proportion of inputs to zero during training while scaling up the remaining inputs by 1/(1 − rate) so that the sum of inputs is unchanged. This is helpful for preventing overfitting. During prediction, this layer is a pass-through, sending the inputs as outputs.

• Embedding: This layer takes an input of dimension input_dim and returns a corresponding embedding of dimension output_dim. input_dim and output_dim are constructor arguments for this layer.

• MaxPool1D: Pools the inputs within the kernel, selecting the maximum value of the input. This layer is useful in convolutional neural networks. Two- and three-dimensional max pooling layers are also available.

• Softmax: Softmax layer that computes pi = 1/(1 + e^(-xi)) and returns the normalized probability pi / Σj pj for a vector of inputs xi. This layer has no trainable weights.

A layer's activation function can be provided as a constructor argument, as shown in the sketch below. If the activation function is omitted, the unit activation function is applied by default, that is, y = W ⋅ X.
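A brief sketch of supplying an activation function through the constructor (the layer sizes are illustrative, not from the book):

import tensorflow as tf

# Dense layer with ReLU activation supplied via the constructor
relu_layer = tf.keras.layers.Dense(4, activation=tf.keras.activations.relu)

# Dense layer with no activation argument: output is the linear map W . X + b
linear_layer = tf.keras.layers.Dense(4)

x = tf.random.normal((2, 3))
print(relu_layer(x))    # negative pre-activations are clipped to 0
print(linear_layer(x))  # raw linear outputs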


2.5 Models
TensorFlow models have tf.keras.Model as the base class, which in
turn derives from the tf.keras.layers.Layer class. Models can serve as
a collection of layers. For example, a sequential model is a collection of
layers that applies the input to the first layer, passing its output to the
second layer as input, and so on. Because models have Layer as a base
class, all functionality of layers is available in models. Models can be saved
as a checkpoint, deriving this functionality from the tf.Module base class.
Models also have a method save to serialize the model to a file. A serialized
model can be loaded using the tf.keras.models.load_model command.
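A minimal sketch of saving and reloading a model (the directory name is an illustrative assumption, not from the book):

import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(3,))])
model.compile(optimizer="adam", loss="mse")

model.save("saved_model_dir")                              # serialize the model to disk
restored = tf.keras.models.load_model("saved_model_dir")   # load it back
restored.summary()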
An example of a customized sequential model is shown in Listing 2-14. The model has two layers: a dense layer with ReLU (rectified linear unit) activation and a softmax layer. As can be seen, the outputs from the softmax layer add to 1 for each row. Due to the use of random numbers, output values may vary from those shown.

Listing 2-14. Writing a Customized Model

import tensorflow as tf
from tensorflow.keras import Model


class CustomSequentialModel(Model):
    def __init__(self, name=None, **kwargs):
        super().__init__(name, **kwargs)
        self.layer2 = tf.keras.layers.Softmax()
        self.layer1 = tf.keras.layers.Dense(10, activation=tf.keras.activations.relu)

    def call(self, inputs, training=None, mask=None):
        x = self.layer1(inputs)
        return self.layer2(x)


model = CustomSequentialModel()
output = model(tf.random.normal((2, 10), dtype=tf.float32))
print(output)
tf.Tensor(
[[0.07642513 0.25438178 0.06848245 0.0847797  0.06848245 0.10721327
  0.06848245 0.07157873 0.10768385 0.09249022]
 [0.0404469  0.0404469  0.0404469  0.0404469  0.0404469  0.06400955
  0.0404469  0.60652715 0.0404469  0.04633499]], shape=(2, 10), dtype=float32)

print(tf.reduce_sum(output, axis=1))
tf.Tensor([1.         0.99999994], shape=(2,), dtype=float32)

TensorFlow provides a sequential model, tf.keras.Sequential. Layers are added to a sequential model using the add method. The first layer of a sequential model takes an optional argument input_shape specifying the number of features. If the input shape for the first layer is not specified, the model must be built before compiling it; the build method of the model class takes the input shape as an argument. Before a model can be fitted to training data, it must be compiled, specifying the optimizer and loss function. Once fitted, the model can be used for making predictions. Usage of a sequential model is illustrated with the example shown in Listing 2-15. The code creates a sequential model comprised of three dense layers. It is then compiled and fitted to data using backpropagation. Once trained, it can be used for predicting.


A few important methods of the tf.keras.Sequential model class are listed in the following:

1. add: Add a layer to the sequential model.

2. compile: Compile the model. This step is required before the model can be trained. It specifies the optimizer, loss function, metrics, and whether the model should run eagerly or in graph mode.

3. compute_loss: Calculates the loss given the inputs, the true outputs, and the predicted outputs, using the loss function supplied to the model. If predicted outputs are not provided, the method first predicts the output from the inputs and then calculates the loss between the predicted and true outputs.

4. evaluate: Evaluate the model in prediction mode. Since this is not training mode, layers such as dropout layers behave accordingly.

5. fit: Fit the model to the provided inputs and outputs using backpropagation. It accepts optional arguments such as batch_size, which specifies the number of samples used in each stochastic gradient step, and epochs, which specifies the number of optimization iterations. Returns a history object that can be used to track the evolution of loss and metrics over training epochs.

6. predict: Predict the output from the model.

7. get_layer: Retrieve a layer from the model using an index or name.

8. save: Saves the model to a file.

9. summary: Prints a summary of input and output shapes and trainable parameters in each layer.

10. to_json: Returns a JSON string containing the model configuration.

Use of these APIs is illustrated using the example shown in Listing 2-15. In this code, data is generated by adding Gaussian white noise to the function 4x + 2.5. A model is fitted to the dataset using no regularization first, followed by L2 regularization. Predicted results are plotted.

Listing 2-15. Sequential Model

import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
sns.set_theme(style="whitegrid")

# generate data
x = np.linspace(0, 5, 400, dtype=np.float32)  # 400 points spaced from 0 to 5
x = tf.constant(x)
y = 4*x + 2.5 + tf.random.truncated_normal((400,), dtype=tf.float32)
sns.scatterplot(x.numpy(), y.numpy())
plt.ylabel("y = 4x + 2.5 + noise")
plt.xlabel("x")
plt.show()

# create test and training data
x_train, y_train = x[0:350], y[0:350]
x_test, y_test = x[350:], y[350:]

# create the model
seq_model = tf.keras.Sequential()
seq_model.add(tf.keras.layers.Dense(5, input_shape=(1,)))
seq_model.add(tf.keras.layers.Dense(10, activation=tf.keras.activations.relu))
seq_model.add(tf.keras.layers.Dense(1))
print(seq_model.summary())

# Custom loss function with optional regularization
class Loss(tf.keras.losses.Loss):
    def __init__(self, beta, weights):
        super().__init__()
        self.weights = weights
        self.beta = beta

    def call(self, y_true, y_pred):
        reg_loss = 0
        for i in range(len(self.weights)):
            reg_loss += tf.reduce_mean(tf.square(self.weights[i]))
        return tf.reduce_mean(tf.square(y_pred - y_true)) + self.beta * reg_loss

my_loss = Loss(0, seq_model.get_weights())

# compile the model
seq_model.compile(optimizer=tf.keras.optimizers.Adam(),
                  loss=my_loss,
                  metrics=[tf.keras.metrics.MeanSquaredError()])

# fit the model to training data
history = seq_model.fit(x_train, y_train, batch_size=10, epochs=10)

# plot the history
plt.plot(history.history["mean_squared_error"], label="mean_squared_error")
plt.ylabel("Mean Square Error")
plt.xlabel("Epoch")
plt.show()

# predict unseen test data
y_pred = seq_model.predict(x_test)
plt.plot(x_test, y_test, '.', label="Test Data")
plt.plot(x_test, 4*x_test+2.5, label="Underlying Data")
plt.plot(x_test, y_pred.squeeze(), label="Predicted Values")
plt.legend()
plt.show()


Model: "sequential"
__________________________________________________________
Layer (type)            Output Shape             Param #
__________________________________________________________
dense (Dense)           (None, 5)                10
__________________________________________________________
dense_1 (Dense)         (None, 10)               60
__________________________________________________________
dense_2 (Dense)         (None, 1)                11
__________________________________________________________
Total params: 81
Trainable params: 81
Non-trainable params: 0
__________________________________________________________
None

The model is first fitted without regularization by setting β = 0 in the
argument to the loss function. Prediction results on the testing data are
shown in Figure 2-1. As can be observed, the predicted values are very close
to the underlying data-generating function, indicating good performance on
the testing data. Figure 2-2 shows the history of mean square error over the
training epochs; as can be seen, the mean square error has converged.
Next, L2 regularization is introduced by setting β = 0.05 in the argument to
the loss function. Prediction results are plotted in Figure 2-3. The
regularization loss penalizes large weight values, forcing them down; as a
result, the predicted values lie below the underlying data-generating
function. Regularization is helpful when fitting a model to data with
outliers; the testing data in this example has no outliers. A short sketch
of the regularized refit is shown below.
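
The refit with β = 0.05 can be sketched as a continuation of Listing 2-15,
reusing the Loss class, the model, and the data defined there. This is one
possible way to do it, not the book's exact code: here the penalty is
computed on the model's trainable_weights (tf.Variable objects) so that the
regularization term remains differentiable with respect to the weights being
trained.

# Rebuild the loss with beta = 0.05; trainable_weights are tf.Variable
# objects, so the L2 penalty changes as the weights are updated.
reg_loss = Loss(0.05, seq_model.trainable_weights)
seq_model.compile(optimizer=tf.keras.optimizers.Adam(),
                  loss=reg_loss,
                  metrics=[tf.keras.metrics.MeanSquaredError()])
history_reg = seq_model.fit(x_train, y_train, batch_size=10, epochs=10)

y_pred_reg = seq_model.predict(x_test)
plt.plot(x_test, y_test, '.', label="Test Data")
plt.plot(x_test, 4*x_test + 2.5, label="Underlying Data")
plt.plot(x_test, y_pred_reg.squeeze(), label="Predicted Values (L2)")
plt.legend()
plt.show()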

Figure 2-1. Predictions of the Model with No Regularization Against Underlying Data


Figure 2-2. History of Mean Square Error over Training Epochs

Figure 2-3. Predictions of the Model with L2 Regularization Against Underlying Data


2.6 Activation Functions
An activation function specifies the function applied to the dot product
of neuron weights and inputs to determine the neuron's output. In
equation 2.1, g represents the activation:

$y = g(W \cdot X + b)$    (2.1)

TensorFlow has a number of predefined activation functions in the module
tf.keras.activations. A few of them are described in the following:

1. ELU: This is the exponential linear unit, defined in
tf.keras.activations.elu. Its activation function is shown in equation 2.2,
with α > 0. For a large negative value of x, ELU saturates to a small
negative value, −α. ELUs help address the vanishing gradient problem
because they do not saturate for large x:

$y = \begin{cases} x & \text{if } x \ge 0 \\ \alpha (e^{x} - 1) & \text{if } x < 0 \end{cases}$    (2.2)

2. exponential: Takes the natural exponent $e^{x}$ of the input.

3. GELU: Gaussian error linear unit that uses the standard normal Gaussian
CDF to calculate its output, as shown in equation 2.3:

$y = x \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}} e^{-v^{2}/2} \, dv$    (2.3)

4. ReLU: Rectified linear unit activation produces max(x, 0) as the output.
It cuts off negative values at 0.

5. LeakyReLU: The LeakyReLU activation function gives the output shown in
equation 2.4. For positive values of x, it is identical to ReLU. Unlike
ReLU, the output is not cut off to 0 for negative values of x. This helps
avoid zero activations and zero gradients for negative values:

$y = \begin{cases} x & \text{if } x \ge 0 \\ \alpha x & \text{if } x < 0 \end{cases}$    (2.4)

6. SELU: Scaled exponential linear unit activation scales the output of the
ELU activation by a scaling parameter β. Its output is shown in
equation 2.5:

$y = \begin{cases} \beta x & \text{if } x \ge 0 \\ \beta \alpha (e^{x} - 1) & \text{if } x < 0 \end{cases}$    (2.5)

7. sigmoid: $y = \frac{1}{1 + e^{-x}}$. This activation function saturates
for large and small values of the input x, giving rise to the vanishing
gradient problem in deep neural networks and recurrent neural networks.

8. softmax: Produces a probability distribution from its inputs, as shown in
equation 2.6. Being a probability distribution, $\sum_i y_i = 1$:

$y_i = \frac{e^{x_i}}{\sum_j e^{x_j}}$    (2.6)

9. tanh: Applies the hyperbolic tangent function shown in equation 2.7 to
produce its output. Like the sigmoid function, it saturates for high and low
values of the input x:

$y = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}$    (2.7)

An activation function is provided as an argument to the layer object's
constructor. Either the activation function object or its string name can
be used; TensorFlow keeps a mapping of strings to predefined activation
functions. The two methods of specifying an activation function are shown
in Listing 2-16. The advantage of using the fully qualified object name is
that default arguments of the activation function can be changed, as
sketched after the listing.

Listing 2-16. Specifying an Activation Function

import tensorflow as tf

input = tf.random.normal((1, 5), dtype=tf.float32)
(input < 0).numpy().sum()
layer = tf.keras.layers.Dense(10, activation="relu", input_shape=(5,))
output = layer(input)
assert (output < 0).numpy().sum() == 0

layer2 = tf.keras.layers.Dense(10, activation=tf.keras.activations.relu, input_shape=(5,))
output2 = layer2(input)
assert (output2 < 0).numpy().sum() == 0
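
As noted above, passing the activation function object (or a thin wrapper
around it) makes it possible to override its default arguments, which a
string name cannot do. A minimal sketch using the alpha and max_value
parameters of tf.keras.activations.relu; the chosen values are illustrative:

import tensorflow as tf

def leaky_capped_relu(x):
    # relu with a leaky slope (alpha) for negative inputs and a cap (max_value)
    return tf.keras.activations.relu(x, alpha=0.1, max_value=6.0)

layer = tf.keras.layers.Dense(10, activation=leaky_capped_relu, input_shape=(5,))
output = layer(tf.random.normal((1, 5), dtype=tf.float32))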


New activation functions can be added by defining a functor, that is, a
class whose instances can be called using the () operator, as shown in
Listing 2-17. This example defines a new activation function y = max(α, x),
where α is a configurable parameter set to 0.5. The inputs are all set to
zero, giving x = 0 and output y = α.

Listing 2-17. Customizing an Activation Function

import tensorflow as tf

class MyActivation(object):
    def __init__(self, alpha):
        self.alpha = alpha

    def __call__(self, x):
        return tf.where(x < self.alpha, self.alpha, x)

layer = tf.keras.layers.Dense(1, activation=MyActivation(0.5), input_shape=(2,))
input = tf.constant([[0, 0]], dtype=tf.float32)
output = layer(input)
print(output)

tf.Tensor([[0.5]], shape=(1, 1), dtype=float32)


2.7 Loss Functions
A loss function defines a measure of the difference between the true output
and the predicted output. Training a model involves adjusting the model's
parameters to minimize the loss over a training dataset.
Loss functions have tf.keras.losses.Loss as their base class and override
the method call(y_true, y_pred). Predefined loss functions in TensorFlow
can be found in the module tf.keras.losses. A few loss functions from that
module are described in the following:

1. BinaryCrossentropy: Calculates the loss between predicted labels and true
labels in a binary (two-class) classification problem. The definition of the
loss function is shown in equation 2.8. The constructor of this loss takes
an argument from_logits indicating whether the predicted outputs are
un-normalized scores (logits) or true probabilities. The default value of
from_logits is False; in that case the predictions are probabilities and
p_class0(i) + p_class1(i) = 1.0 must hold. In equation 2.8, I() is the
indicator function, and p_class0(i) denotes the predicted probability of
observation i belonging to class 0:

$L = -\sum_i \left[ I(y_{\text{true}}(i) = 0) \log(p_{\text{class0}}(i)) + I(y_{\text{true}}(i) = 1) \log(p_{\text{class1}}(i)) \right]$    (2.8)

2. CategoricalCrossentropy: Calculates the loss between predicted labels and
true labels in a multiclass classification problem. Like its two-class
cousin BinaryCrossentropy, it takes a from_logits argument indicating
whether the predicted outputs are true probabilities or un-normalized
scores. The definition of this loss is shown in equation 2.9, where C is the
number of classes and p_c(i) is the predicted probability of observation i
belonging to class c:

$L = -\sum_i \sum_{c=1}^{C} I(y_{\text{true}}(i) = c) \log(p_c(i))$    (2.9)
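
A minimal sketch of calling these two losses directly; the labels,
probabilities, and logits below are illustrative:

import tensorflow as tf

# Binary case: labels in {0, 1}; predictions are probabilities (from_logits=False).
bce = tf.keras.losses.BinaryCrossentropy(from_logits=False)
y_true = tf.constant([0.0, 1.0, 1.0])
y_prob = tf.constant([0.1, 0.8, 0.6])
print(bce(y_true, y_prob).numpy())

# Multiclass case: one-hot labels; predictions are un-normalized scores (from_logits=True).
cce = tf.keras.losses.CategoricalCrossentropy(from_logits=True)
y_true_onehot = tf.constant([[0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])
logits = tf.constant([[0.5, 2.0, 0.3], [1.5, -0.2, 0.1]])
print(cce(y_true_onehot, logits).numpy())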
