Deep Learning for
Remote Sensing
Images with Open
Source Software
Signal and Image Processing of
Earth Observations Series
Series Editor
C.H. Chen

Published Titles

Deep Learning for Remote Sensing Images with Open Source Software
Rémi Cresson

Sea Ice Image Processing with MATLAB®


Qin Zhang and Roger Skjetne

Compressive Sensing of Earth Observations


C.H. Chen

Radar Imaging for Maritime Observation


Fabrizio Berizzi, Marco Martorella, and Elisa Giusti

Principles of Synthetic Aperture Radar Imaging


A System Simulation Approach
Kun-Shan Chen

Remote Sensing Image Fusion


Luciano Alparone, Bruno Aiazzi, Stefano Baronti, and Andrea Garzelli
Deep Learning for
Remote Sensing
Images with Open
Source Software

Rémi Cresson
CRC Press
Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487-2742

© 2020 by Taylor & Francis Group, LLC


CRC Press is an imprint of Taylor & Francis Group, an Informa business

No claim to original U.S. Government works

Printed on acid-free paper

International Standard Book Number-13: 978-0-367-85848-3 (Hardback)

This book contains information obtained from authentic and highly regarded sources. Rea-
sonable efforts have been made to publish reliable data and information, but the author
and publisher cannot assume responsibility for the validity of all materials or the conse-
quences of their use. The authors and publishers have attempted to trace the copyright
holders of all material reproduced in this publication and apologize to copyright holders if
permission to publish in this form has not been obtained. If any copyright material has not
been acknowledged please write and let us know so we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted,
reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other
means, now known or hereafter invented, including photocopying, microfilming, and record-
ing, or in any information storage or retrieval system, without written permission from the
publishers.

For permission to photocopy or use material electronically from this work, please access
www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Cen-
ter, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-
for-profit organization that provides licenses and registration for a variety of users. For
organizations that have been granted a photocopy license by the CCC, a separate system
of payment has been arranged.

Trademark Notice: Product or corporate names may be trademarks or registered trade-


marks, and are used only for identification and explanation without intent to infringe.

Visit the Taylor & Francis Web site at


http://www.taylorandfrancis.com

and the CRC Press Web site at


http://www.crcpress.com
Contents

Preface ix

Author xi

I Backgrounds 1
1 Deep learning background 3
1.1 What is deep learning? . . . . . . . . . . . . . . . . . . . . . 3
1.2 Convolution . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.3 Pooling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
1.4 Activation functions . . . . . . . . . . . . . . . . . . . . . . . 7
1.5 Challenges ahead for deep learning with remote sensing images 8

2 Software 9
2.1 Orfeo ToolBox . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.1.1 Applications . . . . . . . . . . . . . . . . . . . . . . . 10
2.1.2 Streaming mechanism . . . . . . . . . . . . . . . . . . 10
2.1.3 Remote modules . . . . . . . . . . . . . . . . . . . . . 10
2.1.4 The Python API . . . . . . . . . . . . . . . . . . . . . 10
2.2 TensorFlow . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.1 APIs . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.2 Computations . . . . . . . . . . . . . . . . . . . . . . . 11
2.2.3 Graphs . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3 Orfeo ToolBox + TensorFlow = OTBTF . . . . . . . . . . . 12
2.3.1 Installation . . . . . . . . . . . . . . . . . . . . . . . . 13
2.3.2 Featured applications . . . . . . . . . . . . . . . . . . 14
2.3.3 Principle . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.3.4 Multiple input sources and outputs . . . . . . . . . . . 16
2.4 QGIS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

II Patch-based classification 19
3 Introduction 21


4 Data used: the Tokyo dataset 23


4.1 Description . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
4.2 Remote sensing imagery . . . . . . . . . . . . . . . . . . . . . 23
4.3 Terrain truth . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

5 A simple convolutional neural network 27


5.1 Normalization . . . . . . . . . . . . . . . . . . . . . . . . . . 27
5.2 Sampling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
5.2.1 Selection . . . . . . . . . . . . . . . . . . . . . . . . . 29
5.2.2 Extraction . . . . . . . . . . . . . . . . . . . . . . . . . 30
5.3 Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
5.3.1 Principle . . . . . . . . . . . . . . . . . . . . . . . . . 32
5.3.2 Model architecture . . . . . . . . . . . . . . . . . . . . 33
5.3.2.1 Input . . . . . . . . . . . . . . . . . . . . . . 34
5.3.2.2 Layers . . . . . . . . . . . . . . . . . . . . . . 34
5.3.2.3 Estimated class . . . . . . . . . . . . . . . . 34
5.3.2.4 Loss function . . . . . . . . . . . . . . . . . . 34
5.3.2.5 Optimizer . . . . . . . . . . . . . . . . . . . . 35
5.4 Generate the model . . . . . . . . . . . . . . . . . . . . . . . 35
5.5 Train the model from scratch . . . . . . . . . . . . . . . . . . 37
5.6 Comparison with Random Forest . . . . . . . . . . . . . . . 40
5.7 Inference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41

6 Fully Convolutional Neural Network 45


6.1 Using the existing model as an FCN . . . . . . . . . . . . . . 45
6.2 Pixel-wise fully convolutional model . . . . . . . . . . . . . . 46
6.3 Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
6.4 Inference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50

7 Classifiers on deep features 51


7.1 Principle . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
7.2 Overview of composite applications in OTB . . . . . . . . . . 52
7.3 Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
7.4 Inference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54

8 Dealing with multiple sources 55


8.1 More sources? . . . . . . . . . . . . . . . . . . . . . . . . . . 55
8.2 Model with multiple inputs . . . . . . . . . . . . . . . . . . . 56
8.3 Normalization . . . . . . . . . . . . . . . . . . . . . . . . . . 59
8.4 Sampling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
8.5 Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
8.5.1 Inference . . . . . . . . . . . . . . . . . . . . . . . . . 62
8.5.1.1 Patch-based mode . . . . . . . . . . . . . . . 62
8.5.1.2 Fully convolutional mode . . . . . . . . . . . 62

9 Discussion 65

III Semantic segmentation 67


10 Semantic segmentation of optical imagery 69
10.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
10.2 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69

11 Data used: the Amsterdam dataset 73


11.1 Description . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
11.2 Spot-7 image . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
11.3 OpenStreetMap data . . . . . . . . . . . . . . . . . . . . . . 74
11.3.1 OSM downloader plugin . . . . . . . . . . . . . . . . . 75
11.3.2 Download OSM data . . . . . . . . . . . . . . . . . . . 75
11.3.3 Prepare the vector layer . . . . . . . . . . . . . . . . . 77

12 Mapping buildings 83
12.1 Input data pre-processing . . . . . . . . . . . . . . . . . . . . 83
12.1.1 Satellite image pansharpening . . . . . . . . . . . . . . 83
12.1.2 Image normalization . . . . . . . . . . . . . . . . . . . 84
12.1.3 Sample selection . . . . . . . . . . . . . . . . . . . . . 84
12.1.3.1 Patch position seeding . . . . . . . . . . . . . 84
12.1.3.2 Patch position selection . . . . . . . . . . . . 86
12.1.3.3 Patches split . . . . . . . . . . . . . . . . . . 87
12.1.4 Rasterization . . . . . . . . . . . . . . . . . . . . . . . 88
12.1.5 Patch extraction . . . . . . . . . . . . . . . . . . . . . 89
12.2 Building the model . . . . . . . . . . . . . . . . . . . . . . . 90
12.2.1 Architecture . . . . . . . . . . . . . . . . . . . . . . . 91
12.2.2 Implementation . . . . . . . . . . . . . . . . . . . . . . 92
12.2.2.1 Exact output . . . . . . . . . . . . . . . . . . 94
12.2.2.2 Expression field . . . . . . . . . . . . . . . . 95
12.2.3 Generate the SavedModel . . . . . . . . . . . . . . . . 95
12.3 Training the model . . . . . . . . . . . . . . . . . . . . . . . 96
12.4 Inference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96

13 Discussion 99

IV Image restoration 101


14 Gapfilling of optical images: principle 103
14.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
14.2 Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
14.3 Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
14.3.1 Encoder . . . . . . . . . . . . . . . . . . . . . . . . . . 106
14.3.2 Decoder . . . . . . . . . . . . . . . . . . . . . . . . . . 106
14.3.3 Loss . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107

15 The Marmande dataset 109


15.1 Description . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
15.2 Sentinel-2 images . . . . . . . . . . . . . . . . . . . . . . . . 109
15.3 Sentinel-1 image . . . . . . . . . . . . . . . . . . . . . . . . . 112

16 Pre-processing 115
16.1 Sentinel images . . . . . . . . . . . . . . . . . . . . . . . . . 115
16.1.1 Optical images . . . . . . . . . . . . . . . . . . . . . . 115
16.1.2 SAR image . . . . . . . . . . . . . . . . . . . . . . . . 117
16.1.2.1 Calibration . . . . . . . . . . . . . . . . . . . 117
16.1.2.2 Filtering values . . . . . . . . . . . . . . . . . 118
16.1.2.3 Linear stretch . . . . . . . . . . . . . . . . . 118
16.1.2.4 Spatial resampling . . . . . . . . . . . . . . . 118
16.2 Patches . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
16.2.1 Patch position seeding . . . . . . . . . . . . . . . . . . 119
16.2.1.1 Sentinel-2 image masks . . . . . . . . . . . . 119
16.2.1.2 Merge masks . . . . . . . . . . . . . . . . . . 122
16.2.1.3 Grid generation . . . . . . . . . . . . . . . . 123
16.2.1.4 Grid filtering . . . . . . . . . . . . . . . . . . 124
16.2.1.5 Patch centroids . . . . . . . . . . . . . . . . . 126
16.2.1.6 Training and validation datasets . . . . . . . 126
16.2.2 Extraction of patches . . . . . . . . . . . . . . . . . . 127
16.3 More: automate steps with the OTB Python API . . . . . . 128
16.3.1 Build the pipeline . . . . . . . . . . . . . . . . . . . . 129
16.3.2 Run the pipeline . . . . . . . . . . . . . . . . . . . . . 132

17 Model training 133


17.1 Training from Python . . . . . . . . . . . . . . . . . . . . . . 133
17.2 Get the code . . . . . . . . . . . . . . . . . . . . . . . . . . . 134
17.3 Use the code . . . . . . . . . . . . . . . . . . . . . . . . . . . 134
17.3.1 Description . . . . . . . . . . . . . . . . . . . . . . . . 134
17.3.2 Parameters . . . . . . . . . . . . . . . . . . . . . . . . 134
17.4 Export the model . . . . . . . . . . . . . . . . . . . . . . . . 135

18 Inference 137
18.1 Inputs and outputs . . . . . . . . . . . . . . . . . . . . . . . 137
18.2 Generating the image . . . . . . . . . . . . . . . . . . . . . . 137
18.3 Postprocessing . . . . . . . . . . . . . . . . . . . . . . . . . . 139

19 Discussion 143

Bibliography 145

Index 149
Preface

The volume of earth observation data has increased in recent years, and more
and more remote sensing imagery is available today. The amount of remote
sensing data is constantly growing due to the rise of very-high-resolution
sensors and satellites with short repeat cycles. Thanks to programs like
Landsat or Sentinel, many free imagery products are available. In addition,
community-based geographic information gathering systems are enriching
existing geospatial databases: the OpenStreetMap initiative is a well-known
example. Given this amount of geospatial data, tackling the complexity of
information extraction from earth observations is emerging as a major and
exciting challenge.
Deep learning is a growing trend in big data analysis, and has had a break-
through impact in the last few years on such diverse domains as image analysis,
speech recognition, autonomous cars, and the arts. Convolutional Neural Net-
works are designed to extract features in images, enabling image recognition,
object detection, and semantic segmentation. Recurrent Neural Networks are
suited for sequential data analysis, such as speech recognition and action recog-
nition tasks. As in numerous other fields, the deep learning revolution has impacted
the remote sensing sector. In recent years, a number of studies have shown
that remote sensing benefits strongly from these new approaches, thanks to the
availability of data and computing resources. Deep learning allows researchers
and engineers in remote sensing to move beyond usual approaches and tackle
a number of problems with solid results. Various remote sensing problems
have been successfully addressed with deep learning: classification, segmen-
tation, object detection, image restoration, image enhancement, etc. Deep
learning has proven to be pertinent in many kinds of remote sensing imagery:
synthetic aperture radar, hyperspectral imagery, very-high-resolution images,
time series, etc.
A lot of the literature explains deep learning theory; this book focuses
instead on its technical application to satellite imagery. The core of this work
teaches how to apply deep learning techniques to real-world remote sensing
images using existing open source tools like QGIS, TensorFlow and Orfeo
ToolBox. The generation of land cover maps from large satellite images will
be the central topic. However, the perspectives for applying deep learning to
various earth observation sensors and purposes are broad, and we will show
that deep learning applies not only to image classification, but also to image
restoration from multimodal imagery.


After a short summary of deep learning background, the book will intro-
duce the common steps to extract samples from remote sensing images, create
and train deep networks, and use them to generate output images, e.g. land
cover maps. Various approaches and deep network architectures will be intro-
duced in different parts of the book. For each of them, all the steps enabling
the reader to perform the processing of data will be detailed. We provide an
online repository containing the ancillary data and code snippets used in
the exercises. The software used in this tutorial is open source, and
instructions to install it are provided.
Author

Rémi Cresson received an electrical engineering degree from the Ecole
Nationale Superieure de l'Energie, de l'Eau et de l'Environnement, Grenoble
Institute of Technology, France, in 2009. He is now with the French
National Institute for Agricultural Research, in the Land, Environment,
Remote Sensing and Spatial Information Joint Research Unit at the Uni-
versity of Montpellier, France. His research and engineering interests include
geospatial image processing at scale, high-performance computing, machine
learning, and geospatial data interoperability. He is involved in open source
software development, a member of the Orfeo ToolBox Project Steering Com-
mittee, and also a member of the open source geospatial foundation OSGeo.

Part I

Backgrounds
1
Deep learning background

In this section, we provide the essential principles of deep learning. After read-
ing this chapter, one should have the required theoretical basis to understand
what is involved in processing geospatial data in the rest of the book.

1.1 What is deep learning?


Deep learning is becoming increasingly important to solve a number of
image processing tasks [1]. Among common algorithms, Convolutional Neural
Network- and Recurrent Neural Network-based systems achieve state-of-the-
art results on satellite and aerial imagery in many applications: for instance,
synthetic aperture radar (SAR) interpretation with target recognition [2], clas-
sification of SAR time series [3], parameter inversion [4], hyperspectral image
classification [5], anomaly detection [6], very-high-resolution image interpre-
tation with scene classification [7, 8], object detection [9], image retrieval [10],
and classification from time series [11]. Deep learning has also addressed other
issues in remote sensing, like data fusion (see [12] for a review), e.g. multi-
modal classification [13], pansharpening [14], and 3D reconstruction.
Deep learning refers to artificial neural networks with deep neuronal layers
(i.e. a lot of layers). Artificial neurons and edges have parameters that adjust
as learning proceeds. Inspired by biology, an artificial neuron is a math-
ematical function modeling a neuron. Neuron parameters (sometimes called
weights) are values that are optimized during the training step. Equation 1.1
provides an example of a basic artificial neuron model. In this equation, X
is the input, y the output, A the values for the gains, b the offset value,
and f the activation function. In this minimal example, the parameters of
the artificial neuron are the gains and one offset value. The gains are used to
compute a scalar product with the neuron input, and the offset is added to
the scalar product. The resulting value is passed into a non-linear function,
frequently called the activation function.
y = f(AX + b) = f[ Σ_i (a_i × x_i) + b ]    (1.1)
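To make equation 1.1 concrete, here is a minimal sketch of such a neuron in plain Python. The function names and the choice of ReLU as activation are ours, not from the book:

```python
def relu(v):
    # A common activation function: f(v) = max(0, v)
    return max(0.0, v)

def neuron(x, a, b, f=relu):
    # y = f(sum_i a_i * x_i + b): scalar product of the gains a with the
    # input x, plus the offset b, passed through the activation function f
    return f(sum(ai * xi for ai, xi in zip(a, x)) + b)

# Example: two inputs, gains (0.5, -0.25), offset 0.1
y = neuron([1.0, 2.0], [0.5, -0.25], 0.1)   # 0.5 - 0.5 + 0.1 = 0.1
```

During training, an optimizer would adjust the gains a and offset b to minimize a loss; here they are fixed for illustration.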


Weights modify the strength of the signal at a connection. Artificial neu-
rons may pass their output through non-linear functions to break linearity,
for instance so that a signal is transmitted only if it crosses a given threshold.
Typically, artificial neurons are organized in layers, as shown in figure 1.1.

FIGURE 1.1
Example of a network of artificial neurons aggregated into layers. In the
artificial neuron: xi is the input, y is the neuron output, ai are the values for
the gains, b is the offset value, and f is the activation function.

Different layers may perform different kinds of transformations on their
inputs. Signals traverse the network from the first layer (the input layer)
to the last layer (the output layer), possibly traveling through some layers
multiple times (this is the case, for instance, with layers having feedback
connections). Among common networks, Convolutional Neural Networks
(CNNs) achieve state-of-the-art results on images. CNNs are designed
to extract features in images, enabling image recognition, object detection,
and semantic segmentation. Recurrent Neural Networks (RNNs) are suited
for sequential data analysis, such as speech recognition and action recognition
tasks. A review of deep learning techniques applied to remote sensing can be
found in [15].
In the following, we will briefly present how CNNs work. Basically, a CNN
processes one or multiple input multidimensional arrays through a number of
operations. Among these operations, we can enumerate convolution, pooling,
non-linear activation functions, and transposed convolution. The input space
a CNN "sees" is generally called the receptive field. Depending on the opera-
tions implemented in the net, a CNN generates an output multidimensional
array whose size, dimension, and physical spacing can differ from those of
the input.

1.2 Convolution
Equation 1.2 shows how the result Y of a convolution with kernel K on input
X is computed in dimension two (x and y are pixel coordinates). In this
equation, Y and K are 2-dimensional matrices of real-valued numbers.

Y(x, y) = (X ∗ K)(x, y) = Σ_i Σ_j X(x − i, y − j) × K(i, j)    (1.2)

Because of the nature of the convolution, one CNN layer is composed of
neurons whose edges are massively shared across the input: it can be seen
as a small densely connected neural layer applied locally over the whole input
(i.e. with connections restricted to a neighborhood of the input).
Generally, CNN layers are combined with non-linear activation functions and
pooling operators. The convolution can be performed every Nth step in the x and
y dimensions: this step is usually called the stride. The output of a convolution
involving strides is described in equation 1.3, where sx and sy are the stride
values in each dimension.

Y(x, y) = (X ∗ K)(x, y) = Σ_i Σ_j X(x × sx − i, y × sy − j) × K(i, j)    (1.3)

A convolution can be performed keeping only the valid portion of the output,
meaning that only the values of Y that are computed from every value of
K over X are kept. In this case, the resulting output is smaller than
the input. It can also be performed over a padded input: zeros are added all
around the input so that the output is computed at each location, even where
the kernel doesn't lie entirely inside the input. This is generally used to keep
the input size through the convolution. However, the borders of the output
are affected by the zeros of the padding. Figure 1.2 shows how the stride and
padding of the convolution affect the output size (the number of rows and
columns of the resulting image) and the physical spacing (the physical size of
the output pixel).

FIGURE 1.2
Left: no padding, stride=2. Center: padding, stride=2. Right: padding,
stride=1. The output is the white matrix at the top, and the input is the blue
matrix at the bottom. The output size depends on the following parameters of
the convolution: padding, kernel size, and stride. The output physical spac-
ing changes according to the stride: a stride of n scales the physical spacing
by n.

Important note

• Non-unitary strides modify the output physical spacing.


• Without padding, the output size is changed, depending on the kernel
size.
• With padding, the output size can be kept (same as the input, if no
stride) but values at the border might be contaminated by the zero
padding.
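The effect of stride and padding on the output size can be checked with a small sketch in plain Python. Note that, like most deep learning frameworks, this sketch omits the kernel flip of equation 1.2 (it computes a cross-correlation; flipping K recovers the textbook convolution). The function name and parameters are ours:

```python
def conv2d(X, K, sx=1, sy=1, pad=0):
    # Optionally zero-pad the input all around
    if pad:
        w = len(X[0]) + 2 * pad
        X = [[0] * w for _ in range(pad)] + \
            [[0] * pad + row + [0] * pad for row in X] + \
            [[0] * w for _ in range(pad)]
    kh, kw = len(K), len(K[0])
    ih, iw = len(X), len(X[0])
    # Keep only "valid" positions, i.e. where the kernel lies
    # entirely inside the (possibly padded) input
    return [[sum(X[y + j][x + i] * K[j][i]
                 for j in range(kh) for i in range(kw))
             for x in range(0, iw - kw + 1, sx)]
            for y in range(0, ih - kh + 1, sy)]

X = [[1] * 5 for _ in range(5)]   # 5x5 input, all ones
K = [[1] * 3 for _ in range(3)]   # 3x3 kernel, all ones

out = conv2d(X, K)                # no padding, stride 1 -> 3x3 output
out2 = conv2d(X, K, sx=2, sy=2)   # no padding, stride 2 -> 2x2 output
out3 = conv2d(X, K, pad=1)        # padding 1, stride 1 -> 5x5 (size kept)
# With padding, border values are contaminated by the zeros:
# out3[0][0] is 4 while interior values are 9.
```

The output sizes match the note above: without padding the size shrinks with the kernel size, padding preserves it, and a stride of n divides it by n.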

1.3 Pooling
Another important concept in CNNs is pooling, which is a form of non-linear
downsampling. Several non-linear functions can be implemented in pooling
operations, among which max pooling is the most common: it partitions the
input image and, for each sub-region, outputs the maximum (see figure 1.3).
The intuition is that the exact location of a feature is less important than
its rough location relative to other features. The pooling layer aims to pro-
gressively reduce the spatial size of the representation, to reduce the number
of parameters and the amount of computation in the network, and hence also to
control overfitting. It is common to periodically insert a pooling layer between
successive convolutional layers in a CNN architecture. The pooling operation
provides another form of translation invariance. The pooling layer operates
independently on every depth slice of the input and resizes it spatially. Like
convolutions with non-unitary stride, pooling operators also modify the
output size.

FIGURE 1.3
2 × 2 max pooling with stride 2. For each 2 × 2 sub-region of the image, the
maximum value is kept.

Important note
Pooling, and convolutions with stride, can be viewed as a subsampling
process, which does modify the output size, and the output physical spac-
ing. Depending on the implementation, it can also keep partially sampled
items at borders.
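A minimal sketch of 2 × 2 max pooling with stride 2 in plain Python (the function name is ours), reproducing the example of figure 1.3:

```python
def max_pool(X, size=2, stride=2):
    # For each size x size sub-region, keep the maximum value
    ih, iw = len(X), len(X[0])
    return [[max(X[y + j][x + i] for j in range(size) for i in range(size))
             for x in range(0, iw - size + 1, stride)]
            for y in range(0, ih - size + 1, stride)]

X = [[1, 3, 2, 1],
     [4, 6, 5, 2],
     [3, 1, 1, 0],
     [1, 2, 8, 4]]

# 4x4 input -> 2x2 output: the spatial size is halved
pooled = max_pool(X)   # [[6, 5], [3, 8]]
```

Each output value is the maximum of one 2 × 2 sub-region, so the representation is downsampled by 2 in each spatial dimension.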

1.4 Activation functions


Inspired by biology, activation functions are an abstraction representing
whether a neuron is activated (i.e. transmits its output value) or not.
Basically, activation functions control how much of the signal is actually
transmitted to the output. There are many activation functions used
in deep learning. Among these, we can enumerate the following common
ones, which we will use during this tutorial:
1. The rectified linear unit returns the maximum of 0 and x:

   f(x) = max(0, x)
2. A leaky rectified linear unit allows a small, positive gradient when
   the unit is not active. It is parametrized with α, a scalar value:

   f(x) = x if x > 0, α × x otherwise

3. Sigmoid:

   f(x) = 1 / (1 + e^(−x))

4. Hyperbolic tangent (Tanh), which is a shift-scaled sigmoid
   (tanh(x) = 2 × sigmoid(2x) − 1):

   f(x) = (e^x − e^(−x)) / (e^x + e^(−x))
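These four functions are easy to sketch in plain Python; the last line checks the tanh/sigmoid identity given above:

```python
import math

def relu(x):
    # Rectified linear unit: maximum of 0 and x
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Small, non-zero slope alpha for negative inputs
    return x if x > 0 else alpha * x

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

# tanh is a shift-scaled sigmoid: tanh(x) = 2 * sigmoid(2x) - 1
assert abs(tanh(1.5) - (2 * sigmoid(3.0) - 1)) < 1e-12
```

In practice these come ready-made in deep learning frameworks; the point here is only to make the formulas tangible.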

1.5 Challenges ahead for deep learning with remote sensing images

There are still challenges ahead in processing real-world remote sensing
images with deep learning. One crucial point is software implementation.
Entering deep learning requires familiarity with a framework. Implementing
new methods requires an extensive programming background and deep
learning knowledge. The tools must scale to the huge size of real-world
remote sensing images and datasets. In particular, the image size problem is
out of the scope of deep learning frameworks. In computer vision, popular
problems and datasets are often related to small-sized, true-color images. Many
deep learning architectures must be redesigned to suit remote sensing images.
Another point is that algorithms presented in the literature often serve one
unique purpose, and each new algorithm implementation requires new effort
and expertise to be applied to standardized geospatial data. Additionally,
recent studies have pointed out that using the features computed with deep
nets in machine learning algorithms like Support Vector Machines or Random
Forests offers great opportunities. However, even if most remote sensing image
processing software implements a machine learning framework with such algo-
rithms, only a few of them enable the combination of deep learning with these
well-known, already implemented machine learning algorithms.
In the next section, we introduce software that addresses all these
issues.
2
Software

Software is a crucial point in both remote sensing and deep learning. This
section introduces the open source software involved in the practice sessions
of the book.

2.1 Orfeo ToolBox


There are many geospatial image processing software packages, among which
many are powerful open source programs like GRASS, SAGA, GDAL, and the Orfeo
ToolBox.1 The Orfeo ToolBox, also known as OTB (figure 2.1), is an
open source library for remote sensing image processing developed by the
French space agency (CNES). It contains about a hundred user-oriented
applications including geometric and radiometric corrections, image manip-
ulation, statistics, etc. It is suited for large-scale remote sensing image pro-
cessing, offering a great number of functions and algorithms [16]. It is based
on the Insight ToolKit2 (ITK), a widely used open source library for medical
image computation, and the Geospatial Data Abstraction Library3 (GDAL).
Additional optional libraries enrich the features of OTB (OpenCV, Shark ML,
etc.).

FIGURE 2.1
The Orfeo ToolBox.
1 https://www.orfeo-toolbox.org/
2 https://itk.org/
3 https://gdal.org/


2.1.1 Applications
The Orfeo ToolBox applications provide users with a number of implemented
algorithms. These applications can be used directly through multiple interfaces:
command line, graphical user interface, and third-party software like QGIS.
For developers, applications can also be used from the Python and
C++ APIs. Each OTB application comes with a set of parameters. Parameters
of OTB applications are typed (image, vector data, file, folder, integer,
float, etc.), and this formalism makes the OTB applications usable through
all interfaces and APIs.

2.1.2 Streaming mechanism


The streaming mechanism enables an image to be processed region by region.
Instead of working on the entire data, which would require
a lot of memory, blocks are processed one by one until the output data is
produced. This mechanism keeps the memory footprint low and enables the
processing of very large data. Most of the algorithms implemented in OTB
applications and C++ classes support this mechanism. The few algorithms
that don’t implement the streaming mechanism require that all input and
output data are stored in memory at execution.
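The idea can be illustrated with a short sketch in plain Python (an illustration of the principle only, not actual OTB code): the image is split into blocks, each block is processed independently, and the assembled output matches what whole-image processing would produce.

```python
def process(block):
    # A per-pixel operation that can run on any region independently.
    return [[v * 2 for v in row] for row in block]

def process_streamed(image, block_size):
    """Process an image block by block, as OTB's streaming mechanism does,
    so only one block needs to be held in memory at a time."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y0 in range(0, h, block_size):
        for x0 in range(0, w, block_size):
            block = [row[x0:x0 + block_size]
                     for row in image[y0:y0 + block_size]]
            result = process(block)
            for dy, row in enumerate(result):
                out[y0 + dy][x0:x0 + len(row)] = row
    return out

image = [[x + 10 * y for x in range(7)] for y in range(5)]
# Block-wise processing yields exactly the whole-image result.
assert process_streamed(image, 3) == process(image)
```

The key property, preserved by OTB for all streamable algorithms, is that the output is independent of the block layout.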

2.1.3 Remote modules


For developers, OTB eases the integration and deployment of new features
thanks to remote modules. This mechanism standardizes the development
of components external to the library, such as new OTB applications.
Remote modules allow anyone to extend the functionalities
of OTB without being part of the core project repository, and they
can have different licenses than the main OTB repository. For instance, the
OTBTF remote module, which we will use in this tutorial, implements deep
learning using the TensorFlow library. It is not part of the official OTB
library core, but it provides applications that can be used like any others
through the OTB application API.

2.1.4 The Python API


The Python API of the Orfeo ToolBox enables you to build pipelines using in-
memory connections of existing OTB applications. The resulting pipeline still
benefits from the streaming mechanism. Another useful feature is the ability
to use NumPy arrays in pipelines, as input or output of the OTB applications.
The Python API is well suited for building complex pipelines from already exist-
ing algorithmic blocks like the OTB applications (including the applications
from remote modules). This book targets user-oriented interactions with OTB
applications, and the Python bindings are partially introduced in section 16.3.

We invite curious readers who want to exploit the full potential of the tool-
box through the Python API to read the official documentation along with
the tutorials available online. Additional help can be obtained from the user
community on the dedicated forum of the Orfeo ToolBox website.4

2.2 TensorFlow
TensorFlow is a high-performance numerical computation library for train-
ing and inference on deep neural networks [17], initiated by Google. It has
become a popular open source deep learning framework, with a large and
growing worldwide community of developers (figure 2.2).

FIGURE 2.2
The TensorFlow library.

2.2.1 APIs
TensorFlow has a powerful high-level Python API to build deep nets. Cur-
rently, TensorFlow uses Keras, which allows you to build networks at a
higher level of abstraction. TensorFlow also has a simple yet powerful C++
API to integrate deep nets in C++ projects.

2.2.2 Computations
TensorFlow uses symbolic programming, which distinguishes the definition of com-
putations from their actual execution. In TensorFlow, tensors are abstraction
objects for the operations and values in memory, simplifying their manipula-
tion regardless of the computing environment: for instance, Central Process-
ing Unit (CPU) or Graphical Processing Unit (GPU). In TensorFlow, the
so-called model consists of operations arranged into a graph of nodes. Each
node is an operation taking zero or more tensors as inputs, and producing one
or multiple tensors. This data flow graph defines the operations (e.g. linear
4 forum.orfeo-toolbox.org

algebra operators), and the actual computations are performed in the Tensor-
Flow session.

2.2.3 Graphs
The Python API enables the construction of TensorFlow graphs, and the
session runs the graph, delegating calculations to low-level, highly optimized
routines. Among tensors, we can distinguish concepts such as Placeholders,
Constants and Variables. A Placeholder is a symbol hosting input data, e.g.
a set of images. As its name indicates, Constants are tensors with constant
values, and Variables hold persistent, mutable values, e.g. parameters to estimate
during training. Variables must be explicitly initialized, and can be saved or
restored during a session along with the graph.
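The distinction between building a graph and running it can be made concrete with a toy sketch in plain Python. This mimics the concept only; it is not the TensorFlow API, and all names below are invented for illustration.

```python
class Node:
    """A node of a tiny symbolic computation graph."""
    def __init__(self, op, inputs=(), value=None):
        self.op, self.inputs, self.value = op, inputs, value

def placeholder():   return Node("placeholder")       # input fed at run time
def constant(v):     return Node("constant", value=v) # fixed value
def variable(init):  return Node("variable", value=init)  # trainable state
def add(a, b):       return Node("add", (a, b))
def mul(a, b):       return Node("mul", (a, b))

def run(node, feed):
    """Evaluate the graph, like a session run delegating the actual math."""
    if node.op == "placeholder":
        return feed[node]                 # data supplied at execution time
    if node.op in ("constant", "variable"):
        return node.value
    vals = [run(n, feed) for n in node.inputs]
    return vals[0] + vals[1] if node.op == "add" else vals[0] * vals[1]

x = placeholder()        # e.g. an image
w = variable(3.0)        # parameter to estimate during training
b = constant(1.0)
y = add(mul(w, x), b)    # graph built symbolically: nothing computed yet
assert run(y, {x: 2.0}) == 7.0   # computation happens only at run time
```

The point of the sketch is the separation: `y` is only a description of the computation, and values exist only once `run` is called, exactly the split TensorFlow makes between graph and session.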

2.3 Orfeo ToolBox + TensorFlow = OTBTF


Nowadays several popular open source frameworks exist, such as TensorFlow,
Theano or PyTorch, but using them requires an extensive programming back-
ground and deep learning knowledge, which limits the democratization of deep
learning in the remote sensing community. This book intends to introduce
deep learning applied to geospatial data, even for readers who are not Python
coders or signal processing experts. The OTBTF remote module of the Orfeo
ToolBox will be used to support the interaction between the remote sensing
and the deep learning worlds. Thanks to the user-friendly interface provided
by OTB, using geospatial data as input for deep networks is made easy. OTBTF
uses TensorFlow internally, bringing deep learning behind the same interface
as OTB, without preventing deep learning experts from using
their own models. The main features of OTBTF are the following:
1. It is open source and cross-platform.
2. It allows users without programming skills to use deep nets on
remote sensing images.
3. As an extension of the Orfeo ToolBox, it inherits the following
advantages:
•The powerful API of OTB (available interfaces, applications
chaining).
•Seamless interactions with the existing machine learning and
geospatial data sampling framework of OTB.
•It implements the streaming mechanism (see section 2.1.2) and
thus allows the processing of very large remote sensing images
in a computationally efficient manner (benchmarks are available in [18]).

4. Deep learning experts can still use their coding skills to build/train
their models using their favorite API (e.g. Python) and use the
SavedModel in OTBTF to produce their resulting geospatial
images.

2.3.1 Installation
The OTBTF remote module is not yet integrated in the official
OTB release. Currently, there are several ways to install it:
• Use the ready-to-use environment in the provided VirtualBox instance. The
only required software for this tutorial is Oracle VirtualBox, which
can be freely downloaded online.5 It is the easiest option if you want to
give it a try. However, please note that this setup is not optimized for performance.
You can contact the author to request the download of the virtual machine.
• Download the official OTBTF Docker image on DockerHub.6 This option
is great for Linux and Mac users since Docker efficiently uses computer
resources. If you have an NVIDIA GPU you can choose the Docker image
supporting the NVIDIA runtime.
• Compile OTBTF from the source. For this, please follow the instructions
provided on the OTBTF GitHub repository. This is the difficult way, but
you will be able to build the software with the appropriate optimization
flags for your hardware.
If any problems are encountered during the installation, we refer the reader
to the up-to-date troubleshooting section of the OTBTF repository. Further
questions can also be asked on the Orfeo ToolBox forum7 and tickets opened
on the OTBTF GitHub repository.

Recommendations
We strongly advise Linux and Mac users to use Docker images for better
performance. Another important point is the hardware: Graphical Pro-
cessing Units (GPUs) drastically speed up the processes in deep learn-
ing. In particular, for chapter 14 we recommend using a GPU to run
the model training. The OTBTF Docker image supporting nvidia-Docker
provided in the OTBTF GitHub repository greatly simplifies the use of
the software on GPU-enabled hardware. Instructions are available in the
documentation section of the OTBTF repository.

5 www.virtualbox.org
6 https://hub.docker.com/u/mdl4eo
7 forum.orfeo-toolbox.org

2.3.2 Featured applications


The remote module contains a set of new user-oriented applications. Among
others, the key applications that will be used in this tutorial are the following:
• PatchesSampling is dedicated to the extraction of patches from multiple images.

• TensorflowModelTrain can train a TensorFlow model from multiple input
images.
• TensorflowModelServe can run a TensorFlow model on multiple input
images.
There is also a set of composite applications that chain together existing OTB
applications and the above applications to perform state-of-the-art machine
learning (e.g. combining Random Forest with features from deep learning). In
the practice session, we will see how to use these new applications with remote
sensing images.

2.3.3 Principle
As explained in section 2.2, TensorFlow provides APIs allowing developers
to easily build deep nets. Once a deep net is built, it can be exported as a
SavedModel,8 serialized with Google Protobuf (a language-neutral, platform-
neutral, extensible mechanism for serializing structured data). This Saved-
Model includes the processing graph and the variables of the model. It can
then be used in the new OTB applications provided by the OTBTF remote
module. OTBTF implements mechanisms to convert images into tensors that
can feed the TensorFlow models, execute the model, and convert the
resulting tensors of the model back into output images. Thanks to the streaming
mechanism of OTB, there is no limitation on the processed image size. A Tensor-
Flow model is associated with the following intrinsic parameters, which must be
given to OTBTF:
1. The receptive field: as explained in section 1, the receptive field
is the input volume that the model “sees” in the input space.
2. The expression field: it describes the volume that the model
“creates” in the output space.
3. Scale factor: the scale factor describes the physical spacing change
between one reference input (typically, the first input) and the out-
put space. For instance, consider a CNN that transforms a single
input into a single output, implementing a total of 2 pooling opera-
tors with stride 2. Then the total scale factor is 4, meaning that the
output spacing is multiplied by a factor of 4, and hence the output
image size will be divided by 4 in each of the x and y dimensions.
8 www.tensorflow.org/guide/saved_model
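The scale factor arithmetic of the example above can be checked with a few lines. The helpers below are hypothetical, written for illustration only (not OTBTF code):

```python
from math import prod

def scale_factor(strides):
    """Total scale factor of a net: the product of its pooling strides."""
    return prod(strides)

def output_geometry(input_size, input_spacing, strides):
    """Output image size and physical spacing after applying the scale factor:
    spacing is multiplied by the factor, size is divided by it."""
    f = scale_factor(strides)
    return input_size // f, input_spacing * f

# Two pooling operators with stride 2, as in the text: total factor 4.
assert scale_factor([2, 2]) == 4
# A 1024-pixel image with 10 m spacing becomes 256 pixels at 40 m spacing.
assert output_geometry(1024, 10.0, [2, 2]) == (256, 40.0)
```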
The TensorflowModelServe application automatically computes the
region propagation in the pipeline (i.e. from input images to output), and
the output image information, which are two crucial steps to enable stream-
ing. Figure 2.3 illustrates how related the above deep nets parameters are in
these steps.

FIGURE 2.3
Deep net parameters. The figure shows the effect of deep net parameters
on region propagation and output image information. The two nets have
the same expression field and receptive field, but a different scale factor
(Fscale = 1 for the net on top, Fscale = 2 for the net on bottom).

The application computes image regions that are aligned to the expression
field, i.e. regions whose start index and size are multiples of the
expression field. When a requested output region lies across
multiple aligned sub-regions, the TensorFlow model is run over the largest
aligned region, and only the requested part is then kept (figure 2.4). This
mechanism avoids blocking artifacts and ensures that the output is indepen-
dent of the tiling layout used by the streaming manager, i.e. (i) seamless, if
allowed by the model, and (ii) reproducible.
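The alignment rule can be sketched as follows (an illustrative helper, not OTBTF code): the start index is rounded down and the end index rounded up to multiples of the expression field, and only the requested part is kept afterwards.

```python
def aligned_region(start, size, expression_field):
    """Smallest region aligned to the expression-field grid that contains
    the requested [start, start + size) region, as (start, size)."""
    a_start = (start // expression_field) * expression_field
    # Ceiling division for the aligned end index.
    a_end = -((-(start + size)) // expression_field) * expression_field
    return a_start, a_end - a_start

# A request of 10 pixels starting at index 5, with an expression field of 4:
# the model runs over [4, 16) and only the requested [5, 15) part is kept.
assert aligned_region(5, 10, 4) == (4, 12)
```

Because the processed region depends only on the expression-field grid, not on the tiling layout, the same output pixels are produced whatever block sizes the streaming manager chooses.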

FIGURE 2.4
Computed regions alignment. To guarantee that the output is seamless and
reproducible, the requested region is computed from the region which is
included in the larger region aligned to the grid formed by the expression
field of the net.

The TensorflowModelServe application can run deep nets in two modes:


• Patch based: Extract and process patches independently at regular inter-
vals. Patch sizes are equal to the receptive field sizes of inputs. For each
generated output image region, tensors of patches feed the TensorFlow
model, then the resulting tensors are translated back to the output image. This
approach is straightforward, but not efficient in terms of processing time for large
images, because of the numerous transformations between images and tensors
of patches.
• Fully convolutional: Unlike patch-based mode, it allows the processing of
an entire requested region. For each generated output image region, a tensor
composed of a single element, corresponding to the requested input image
region, is fed to the TensorFlow model. This mode requires that receptive
fields, expression fields and scale factors are consistent with operators imple-
mented in the TensorFlow model, and input images’ physical spacing and
alignment. This approach is computationally efficient since the transforma-
tions between images and tensors of patches are avoided: the network is
applied directly on the input images.
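The regular patch grid of the patch-based mode can be sketched as follows (illustrative only: patch size equals the receptive field, and the stride value is an assumption made here for the example):

```python
def patch_origins(region_size, receptive_field, stride):
    """Top-left coordinates of patches extracted at regular intervals over a
    square region, each patch being the size of the receptive field."""
    last = region_size - receptive_field
    return [(y, x)
            for y in range(0, last + 1, stride)
            for x in range(0, last + 1, stride)]

# A 16-pixel-wide region, receptive field of 8, stride of 4: 9 patches,
# which illustrates why the patch-based mode is costly on large images.
origins = patch_origins(16, 8, 4)
assert len(origins) == 9
```

Each such origin yields one image-to-tensor conversion; the fully convolutional mode avoids this multiplicity by feeding the whole region at once.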

2.3.4 Multiple input sources and outputs


The applications provided in OTBTF are designed to process any number
of input image sources. This is motivated by the fact that data fusion is a
challenge that deep learning can tackle easily, using deep nets tailored to one
specific problem. Today's revolution in remote sensing is the high availability
of optical and SAR time series, thanks to the Sentinel and Landsat programs.
In addition, each generation of very-high-resolution sensors is sharper, and
nowadays many sensors offer sub-metric pixel spacing (Pleiades,
TerraSAR, etc.). In this regard, deep learning typically allows you to
design a deep net that inputs time series and very high resolution remote sensing images