Deep Learning Tutorial
Release 0.1

LISA lab, University of Montreal

September 01, 2015


CONTENTS

1 LICENSE

2 Deep Learning Tutorials

3 Getting Started
3.1 Download
3.2 Datasets
3.3 Notation
3.4 A Primer on Supervised Optimization for Deep Learning
3.5 Theano/Python Tips

4 Classifying MNIST digits using Logistic Regression
4.1 The Model
4.2 Defining a Loss Function
4.3 Creating a LogisticRegression class
4.4 Learning the Model
4.5 Testing the model
4.6 Putting it All Together
4.7 Prediction Using a Trained Model

5 Multilayer Perceptron
5.1 The Model
5.2 Going from logistic regression to MLP
5.3 Putting it All Together
5.4 Tips and Tricks for training MLPs

6 Convolutional Neural Networks (LeNet)
6.1 Motivation
6.2 Sparse Connectivity
6.3 Shared Weights
6.4 Details and Notation
6.5 The Convolution Operator
6.6 MaxPooling
6.7 The Full Model: LeNet
6.8 Putting it All Together
6.9 Running the Code
6.10 Tips and Tricks

7 Denoising Autoencoders (dA)
7.1 Autoencoders
7.2 Denoising Autoencoders
7.3 Putting it All Together
7.4 Running the Code

8 Stacked Denoising Autoencoders (SdA)
8.1 Stacked Autoencoders
8.2 Putting it all together
8.3 Running the Code
8.4 Tips and Tricks

9 Restricted Boltzmann Machines (RBM)
9.1 Energy-Based Models (EBM)
9.2 Restricted Boltzmann Machines (RBM)
9.3 Sampling in an RBM
9.4 Implementation
9.5 Results

10 Deep Belief Networks
10.1 Deep Belief Networks
10.2 Justifying Greedy Layer-Wise Pre-Training
10.3 Implementation
10.4 Putting it all together
10.5 Running the Code
10.6 Tips and Tricks

11 Hybrid Monte-Carlo Sampling
11.1 Theory
11.2 Implementing HMC Using Theano
11.3 Testing our Sampler
11.4 References

12 Recurrent Neural Networks with Word Embeddings
12.1 Summary
12.2 Code - Citations - Contact
12.3 Task
12.4 Dataset
12.5 Recurrent Neural Network Model
12.6 Evaluation
12.7 Training
12.8 Running the Code

13 LSTM Networks for Sentiment Analysis
13.1 Summary
13.2 Data
13.3 Model
13.4 Code - Citations - Contact
13.5 References

14 Modeling and generating sequences of polyphonic music with the RNN-RBM
14.1 The RNN-RBM
14.2 Implementation
14.3 Results
14.4 How to improve this code

15 Miscellaneous
15.1 Plotting Samples and Filters

16 References

Bibliography

Index
CHAPTER

ONE

LICENSE

Copyright (c) 2008–2013, Theano Development Team. All rights reserved.


Redistribution and use in source and binary forms, with or without modification, are permitted provided that
the following conditions are met:
• Redistributions of source code must retain the above copyright notice, this list of conditions and the
following disclaimer.
• Redistributions in binary form must reproduce the above copyright notice, this list of conditions and
the following disclaimer in the documentation and/or other materials provided with the distribution.
• Neither the name of Theano nor the names of its contributors may be used to endorse or promote
products derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

CHAPTER

TWO

DEEP LEARNING TUTORIALS

Deep Learning is a new area of Machine Learning research, which has been introduced with the objective of
moving Machine Learning closer to one of its original goals: Artificial Intelligence. See these course notes
for a brief introduction to Machine Learning for AI and an introduction to Deep Learning algorithms.
Deep Learning is about learning multiple levels of representation and abstraction that help to make sense of
data such as images, sound, and text. For more about deep learning algorithms, see for example:
• The monograph or review paper Learning Deep Architectures for AI (Foundations & Trends in Machine Learning, 2009).
• The ICML 2009 Workshop on Learning Feature Hierarchies webpage has a list of references.
• The LISA public wiki has a reading list and a bibliography.
• Geoff Hinton has readings from 2009’s NIPS tutorial.
The tutorials presented here will introduce you to some of the most important deep learning algorithms and will also show you how to run them using Theano. Theano is a Python library that makes writing deep learning models easy, and gives the option of training them on a GPU.

The algorithm tutorials have some prerequisites. You should know some Python, and be familiar with numpy. Since this tutorial is about using Theano, you should read over the Theano basic tutorial first. Once you've done that, read through our Getting Started chapter – it introduces the notation, the downloadable datasets used in the algorithm tutorials, and the way we do optimization by stochastic gradient descent.
The purely supervised learning algorithms are meant to be read in order:
1. Logistic Regression - using Theano for something simple
2. Multilayer perceptron - introduction to layers
3. Deep Convolutional Network - a simplified version of LeNet5
The unsupervised and semi-supervised learning algorithms can be read in any order (the auto-encoders can
be read independently of the RBM/DBN thread):
• Auto Encoders, Denoising Autoencoders - description of autoencoders
• Stacked Denoising Auto-Encoders - easy steps into unsupervised pre-training for deep nets
• Restricted Boltzmann Machines - single layer generative RBM model
• Deep Belief Networks - unsupervised generative pre-training of stacked RBMs followed by supervised
fine-tuning


Building towards including the mcRBM model, we have a new tutorial on sampling from energy models:
• HMC Sampling - hybrid (aka Hamiltonian) Monte-Carlo sampling with scan()
Building towards including the Contractive auto-encoders tutorial, we have the code for now:
• Contractive auto-encoders code - There is some basic doc in the code.
Recurrent neural networks with word embeddings and context window:
• Semantic Parsing of Speech using Recurrent Net
LSTM network for sentiment analysis:
• LSTM network
Energy-based recurrent neural network (RNN-RBM):
• Modeling and generating sequences of polyphonic music



CHAPTER

THREE

GETTING STARTED

These tutorials do not attempt to stand in for a graduate or undergraduate course in machine learning, but we do give a rapid overview of some important concepts (and notation) to make sure that we're on the same page. You'll also need to download the datasets mentioned in this chapter in order to run the example code of the upcoming tutorials.

3.1 Download

On each learning algorithm page, you will be able to download the corresponding files. If you want to
download all of them at the same time, you can clone the git repository of the tutorial:
git clone https://github.com/lisa-lab/DeepLearningTutorials.git

3.2 Datasets

3.2.1 MNIST Dataset

(mnist.pkl.gz)
The MNIST dataset consists of handwritten digit images, divided into 60,000 examples for the training set and 10,000 examples for testing. In many papers, as well as in this tutorial, the official training set of 60,000 is further divided into an actual training set of 50,000 examples and 10,000 validation examples (for selecting hyper-parameters like the learning rate and the size of the model). All digit images have been size-normalized and centered in a fixed-size image of 28 x 28 pixels. In the original dataset each pixel of the image is represented by a value between 0 and 255, where 0 is black, 255 is white and anything in between is a different shade of grey.

Here are some examples of MNIST digits:

For convenience we pickled the dataset to make it easier to use in Python. It is available for download here. The pickled file represents a tuple of 3 lists: the training set, the validation set and the testing set. Each of the three lists is a pair formed from a list of images and a list of class labels for each of the images. An image is represented as a numpy 1-dimensional array of 784 (28 x 28) float values between 0 and 1 (0 stands for black, 1 for white). The labels are numbers between 0 and 9 indicating which digit the image represents. The code block below shows how to load the dataset.
import cPickle, gzip, numpy

# Load the dataset
f = gzip.open('mnist.pkl.gz', 'rb')
train_set, valid_set, test_set = cPickle.load(f)
f.close()

When using the dataset, we usually divide it into minibatches (see Stochastic Gradient Descent). We encourage you to store the dataset in shared variables and access it based on the minibatch index, given a fixed and known batch size. The reason behind shared variables is related to using the GPU. There is a large overhead when copying data into GPU memory. If you copied data on request (each minibatch individually, when needed), as the code will do if you do not use shared variables, then due to this overhead the GPU code will not be much faster than the CPU code (maybe even slower). If your data is in Theano shared variables, though, you give Theano the possibility to copy the entire data onto the GPU in a single call when the shared variables are constructed. Afterwards the GPU can access any minibatch by taking a slice from these shared variables, without needing to copy any information from CPU memory, thereby bypassing the overhead. Because the datapoints and their labels are usually of a different nature (labels are usually integers while datapoints are real numbers), we suggest using different variables for labels and data. We also recommend using different variables for the training set, validation set and testing set, to make the code more readable (resulting in 6 different shared variables).

Now that the data is in one variable, and a minibatch is defined as a slice of that variable, it becomes natural to define a minibatch by indicating its index and its size. In our setup the batch size stays constant throughout the execution of the code, so a function will actually require only the index to identify on which datapoints to work. The code below shows how to store your data and how to access a minibatch:
def shared_dataset(data_xy):
    """ Function that loads the dataset into shared variables

    The reason we store our dataset in shared variables is to allow
    Theano to copy it into the GPU memory (when code is run on GPU).
    Since copying data into the GPU is slow, copying a minibatch every
    time it is needed (the default behaviour if the data is not in a
    shared variable) would lead to a large decrease in performance.
    """
    data_x, data_y = data_xy
    shared_x = theano.shared(numpy.asarray(data_x, dtype=theano.config.floatX))
    shared_y = theano.shared(numpy.asarray(data_y, dtype=theano.config.floatX))
    # When storing data on the GPU it has to be stored as floats,
    # therefore we will store the labels as ``floatX`` as well
    # (``shared_y`` does exactly that). But during our computations
    # we need them as ints (we use labels as indices, and if they are
    # floats it doesn't make sense), therefore instead of returning
    # ``shared_y`` we will have to cast it to int. This little hack
    # lets us get around the issue.
    return shared_x, T.cast(shared_y, 'int32')

test_set_x, test_set_y = shared_dataset(test_set)
valid_set_x, valid_set_y = shared_dataset(valid_set)
train_set_x, train_set_y = shared_dataset(train_set)

batch_size = 500    # size of the minibatch

# accessing the third minibatch of the training set
data = train_set_x[2 * batch_size: 3 * batch_size]
label = train_set_y[2 * batch_size: 3 * batch_size]

The data has to be stored as floats on the GPU (the right dtype for storing on the GPU is given by theano.config.floatX). To get around this shortcoming for the labels, we store them as floats, and then cast them to int.
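
Since a compiled function then only needs the minibatch index, a common pattern is to let Theano substitute the shared-variable slices for the symbolic inputs through the givens argument of theano.function. The sketch below illustrates this, reusing train_set_x, train_set_y and batch_size from above; the symbolic variables x, y and the placeholder cost are hypothetical stand-ins for a real model, not part of the tutorial code:

index = T.lscalar('index')   # minibatch index
x = T.matrix('x')            # symbolic data matrix
y = T.ivector('y')           # symbolic label vector
# hypothetical placeholder expression, standing in for a real model's loss
cost = T.mean((T.sum(x, axis=1) - y) ** 2)

train_fn = theano.function(
    inputs=[index],
    outputs=cost,
    givens={
        x: train_set_x[index * batch_size: (index + 1) * batch_size],
        y: train_set_y[index * batch_size: (index + 1) * batch_size]
    })

# calling train_fn(2) evaluates the cost on the third minibatch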

Note: If you are running your code on the GPU and the dataset you are using is too large to fit in memory, the code will crash. In such a case you should not store the entire dataset in a shared variable. You can instead store a sufficiently small chunk of your data (several minibatches) in a shared variable and use that during training. Once you have gone through the chunk, update the values it stores. This way you minimize the number of data transfers between CPU memory and GPU memory.
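
A minimal sketch of this chunking scheme, assuming data_x is a large numpy array held in CPU memory and chunk_size is a hypothetical number of examples that fits comfortably in GPU memory:

import numpy
import theano

chunk_size = 5000   # hypothetical; tune to your GPU memory
buffer_x = theano.shared(numpy.asarray(data_x[:chunk_size],
                                       dtype=theano.config.floatX))

for start in xrange(0, len(data_x), chunk_size):
    chunk = numpy.asarray(data_x[start:start + chunk_size],
                          dtype=theano.config.floatX)
    # overwrite the shared variable's contents: one CPU -> GPU copy per chunk
    buffer_x.set_value(chunk)
    # ... train on all the minibatches contained in buffer_x here ...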

3.3 Notation

3.3.1 Dataset notation

We label data sets as $\mathcal{D}$. When the distinction is important, we indicate train, validation, and test sets as $\mathcal{D}_{train}$, $\mathcal{D}_{valid}$ and $\mathcal{D}_{test}$. The validation set is used to perform model selection and hyper-parameter selection, whereas the test set is used to evaluate the final generalization error and compare different algorithms in an unbiased way.

The tutorials mostly deal with classification problems, where each data set $\mathcal{D}$ is an indexed set of pairs $(x^{(i)}, y^{(i)})$. We use superscripts to distinguish training set examples: $x^{(i)} \in \mathbb{R}^D$ is thus the i-th training example of dimensionality $D$. Similarly, $y^{(i)} \in \{0, ..., L\}$ is the i-th label assigned to input $x^{(i)}$. It is straightforward to extend these examples to ones where $y^{(i)}$ has other types (e.g. Gaussian for regression, or groups of multinomials for predicting multiple symbols).

3.3.2 Math Conventions

• $W$: upper-case symbols refer to a matrix unless specified otherwise
• $W_{ij}$: element at the i-th row and j-th column of matrix $W$
• $W_{i \cdot}$, $W_i$: vector, i-th row of matrix $W$
• $W_{\cdot j}$: vector, j-th column of matrix $W$
• $b$: lower-case symbols refer to a vector unless specified otherwise
• $b_i$: i-th element of vector $b$

3.3.3 List of Symbols and acronyms

• $D$: number of input dimensions.
• $D_h^{(i)}$: number of hidden units in the i-th layer.
• $f_\theta(x)$, $f(x)$: classification function associated with a model $P(Y|x, \theta)$, defined as $\mathrm{argmax}_k P(Y=k|x, \theta)$. Note that we will often drop the $\theta$ subscript.
• $L$: number of labels.
• $\mathcal{L}(\theta, \mathcal{D})$: log-likelihood of the model defined by parameters $\theta$ on data set $\mathcal{D}$.
• $\ell(\theta, \mathcal{D})$: empirical loss of the prediction function $f$ parameterized by $\theta$ on data set $\mathcal{D}$.
• NLL: negative log-likelihood.
• $\theta$: set of all parameters for a given model.

3.3.4 Python Namespaces

Tutorial code often uses the following namespaces:


import theano
import theano.tensor as T
import numpy

3.4 A Primer on Supervised Optimization for Deep Learning

What’s exciting about Deep Learning is largely the use of unsupervised learning of deep networks. But
supervised learning also plays an important role. The utility of unsupervised pre-training is often evaluated
on the basis of what performance can be achieved after supervised fine-tuning. This chapter reviews the
basics of supervised learning for classification models, and covers the minibatch stochastic gradient descent
algorithm that is used to fine-tune many of the models in the Deep Learning Tutorials. Have a look at these
introductory course notes on gradient-based learning for more basics on the notion of optimizing a training
criterion using the gradient.

3.4.1 Learning a Classifier

Zero-One Loss

The models presented in these deep learning tutorials are mostly used for classification. The objective in training a classifier is to minimize the number of errors (zero-one loss) on unseen examples. If $f : \mathbb{R}^D \to \{0, ..., L\}$ is the prediction function, then this loss can be written as:

$$\ell_{0,1} = \sum_{i=0}^{|\mathcal{D}|} I_{f(x^{(i)}) \neq y^{(i)}}$$

where either $\mathcal{D}$ is the training set (during training) or $\mathcal{D} \cap \mathcal{D}_{train} = \emptyset$ (to avoid biasing the evaluation of validation or test error). $I$ is the indicator function defined as:

$$I_x = \begin{cases} 1 & \text{if } x \text{ is True} \\ 0 & \text{otherwise} \end{cases}$$

In this tutorial, $f$ is defined as:

$$f(x) = \mathrm{argmax}_k P(Y = k|x, \theta)$$

In Python, using Theano, this can be written as:


# zero_one_loss is a Theano variable representing a symbolic
# expression of the zero one loss ; to get the actual value this
# symbolic expression has to be compiled into a Theano function (see
# the Theano tutorial for more details)
zero_one_loss = T.sum(T.neq(T.argmax(p_y_given_x), y))

Negative Log-Likelihood Loss

Since the zero-one loss is not differentiable, optimizing it for large models (thousands or millions of parameters) is prohibitively expensive (computationally). We thus maximize the log-likelihood of our classifier given all the labels in a training set.

$$\mathcal{L}(\theta, \mathcal{D}) = \sum_{i=0}^{|\mathcal{D}|} \log P(Y = y^{(i)} | x^{(i)}, \theta)$$

The likelihood of the correct class is not the same as the number of right predictions, but from the point of view of a randomly initialized classifier they are pretty similar. Remember that likelihood and zero-one loss are different objectives; you should see that they are correlated on the validation set, but sometimes one will rise while the other falls, or vice versa.

Since we usually speak in terms of minimizing a loss function, learning will thus attempt to minimize the negative log-likelihood (NLL), defined as:

$$NLL(\theta, \mathcal{D}) = - \sum_{i=0}^{|\mathcal{D}|} \log P(Y = y^{(i)} | x^{(i)}, \theta)$$

The NLL of our classifier is a differentiable surrogate for the zero-one loss, and we use the gradient of this
function over our training data as a supervised learning signal for deep learning of a classifier.
This can be computed using the following line of code:


# NLL is a symbolic variable ; to get the actual value of NLL, this symbolic
# expression has to be compiled into a Theano function (see the Theano
# tutorial for more details)
NLL = -T.sum(T.log(p_y_given_x)[T.arange(y.shape[0]), y])
# note on syntax: T.arange(y.shape[0]) is a vector of integers [0,1,2,...,len(y)-1].
# Indexing a matrix M by the two vectors [0,1,...,K], [a,b,...,k] returns the
# elements M[0,a], M[1,b], ..., M[K,k] as a vector. Here, we use this
# syntax to retrieve the log-probability of the correct labels, y.
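
The fancy-indexing behaviour described in the comment can be checked directly in numpy (a small illustration, not part of the tutorial code):

import numpy
M = numpy.arange(12).reshape(3, 4)   # a 3 x 4 matrix: rows [0..3], [4..7], [8..11]
rows = numpy.arange(3)               # [0, 1, 2]
cols = numpy.array([1, 3, 0])        # one column index per row
print(M[rows, cols])                 # prints [1 7 8], i.e. M[0,1], M[1,3], M[2,0]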

3.4.2 Stochastic Gradient Descent

What is ordinary gradient descent? It is a simple algorithm in which we repeatedly make small steps downward on an error surface defined by a loss function of some parameters. For the purpose of ordinary gradient descent we consider that the training data is rolled into the loss function. The pseudocode of this algorithm can then be described as:

# GRADIENT DESCENT
while True:
    loss = f(params)
    d_loss_wrt_params = ... # compute gradient
    params -= learning_rate * d_loss_wrt_params
    if <stopping condition is met>:
        return params

Stochastic gradient descent (SGD) works according to the same principles as ordinary gradient descent, but
proceeds more quickly by estimating the gradient from just a few examples at a time instead of the entire
training set. In its purest form, we estimate the gradient from just a single example at a time.
# STOCHASTIC GRADIENT DESCENT
for (x_i, y_i) in training_set:
    # imagine an infinite generator
    # that may repeat examples (if there is only a finite training set)
    loss = f(params, x_i, y_i)
    d_loss_wrt_params = ... # compute gradient
    params -= learning_rate * d_loss_wrt_params
    if <stopping condition is met>:
        return params

The variant that we recommend for deep learning is a further twist on stochastic gradient descent using so-
called “minibatches”. Minibatch SGD works identically to SGD, except that we use more than one training
example to make each estimate of the gradient. This technique reduces variance in the estimate of the
gradient, and often makes better use of the hierarchical memory organization in modern computers.
for (x_batch, y_batch) in train_batches:
    # imagine an infinite generator
    # that may repeat examples
    loss = f(params, x_batch, y_batch)
    d_loss_wrt_params = ... # compute gradient using theano
    params -= learning_rate * d_loss_wrt_params
    if <stopping condition is met>:
        return params

There is a tradeoff in the choice of the minibatch size B. The reduction of variance and the use of SIMD instructions help most when increasing B from 1 to 2, but the marginal improvement fades rapidly to nothing. With large B, time is wasted in reducing the variance of the gradient estimator; that time would be better spent on additional gradient steps. An optimal B is model-, dataset-, and hardware-dependent, and can be anywhere from 1 to maybe several hundred. In the tutorial we set it to 20, but this choice is almost arbitrary (though harmless).

Note: If you are training for a fixed number of epochs, the minibatch size becomes important because it controls the number of updates made to your parameters. Training the same model for 10 epochs using a batch size of 1 yields completely different results compared to training for the same 10 epochs but with a batch size of 20: with the 50,000-example training set above, the former performs 500,000 parameter updates while the latter performs only 25,000. Keep this in mind when switching between batch sizes, and be prepared to tweak all the other parameters according to the batch size used.

All the code blocks above show pseudocode of what the algorithm looks like. Implementing such an algorithm in Theano can be done as follows:

# Minibatch Stochastic Gradient Descent

# assume loss is a symbolic description of the loss function given
# the symbolic variables params (shared variable), x_batch, y_batch;

# compute gradient of loss with respect to params
d_loss_wrt_params = T.grad(loss, params)

# compile the MSGD step into a theano function
updates = [(params, params - learning_rate * d_loss_wrt_params)]
MSGD = theano.function([x_batch, y_batch], loss, updates=updates)

for (x_batch, y_batch) in train_batches:
    # here x_batch and y_batch are elements of train_batches and
    # therefore numpy arrays; function MSGD also updates the params
    print('Current loss is ', MSGD(x_batch, y_batch))
    if stopping_condition_is_met:
        return params

3.4.3 Regularization

There is more to machine learning than optimization. When we train our model from data we are trying
to prepare it to do well on new examples, not the ones it has already seen. The training loop above for
MSGD does not take this into account, and may overfit the training examples. A way to combat overfitting
is through regularization. There are several techniques for regularization; the ones we will explain here are
L1/L2 regularization and early-stopping.


L1 and L2 regularization

L1 and L2 regularization involve adding an extra term to the loss function, which penalizes certain parameter configurations. Formally, if our loss function is:

$$NLL(\theta, \mathcal{D}) = - \sum_{i=0}^{|\mathcal{D}|} \log P(Y = y^{(i)} | x^{(i)}, \theta)$$

then the regularized loss will be:

$$E(\theta, \mathcal{D}) = NLL(\theta, \mathcal{D}) + \lambda R(\theta)$$

or, in our case

$$E(\theta, \mathcal{D}) = NLL(\theta, \mathcal{D}) + \lambda ||\theta||_p^p$$

where

$$||\theta||_p = \left( \sum_{j=0}^{|\theta|} |\theta_j|^p \right)^{\frac{1}{p}}$$

which is the $L_p$ norm of $\theta$. $\lambda$ is a hyper-parameter which controls the relative importance of the regularization term. Commonly used values for p are 1 and 2, hence the L1/L2 nomenclature. If p=2, then the regularizer is also called "weight decay".
In principle, adding a regularization term to the loss will encourage smooth network mappings in a neural
network (by penalizing large values of the parameters, which decreases the amount of nonlinearity that
the network models). More intuitively, the two terms (NLL and R(θ)) correspond to modelling the data
well (NLL) and having “simple” or “smooth” solutions (R(θ)). Thus, minimizing the sum of both will, in
theory, correspond to finding the right trade-off between the fit to the training data and the “generality” of
the solution that is found. To follow Occam’s razor principle, this minimization should find us the simplest
solution (as measured by our simplicity criterion) that fits the training data.
Note that the fact that a solution is "simple" does not mean that it will generalize well. Empirically, it was found that performing such regularization in the context of neural networks helps with generalization, especially on small datasets. The code block below shows how to compute the loss in Python when it contains both an L1 regularization term weighted by $\lambda_1$ and an L2 regularization term weighted by $\lambda_2$:
# symbolic Theano variable that represents the L1 regularization term
L1 = T.sum(abs(param))

# symbolic Theano variable that represents the squared L2 term
L2_sqr = T.sum(param ** 2)

# the regularized loss
loss = NLL + lambda_1 * L1 + lambda_2 * L2_sqr

Early-Stopping

Early-stopping combats overfitting by monitoring the model's performance on a validation set. A validation set is a set of examples that we never use for gradient descent, but which is also not a part of the test set.
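
A minimal sketch of the idea, assuming hypothetical helpers train_one_epoch and validation_error that stand in for the training and evaluation code (the full tutorial implementations refine this with a "patience" counter checked every few minibatches, but the principle is the same):

best_error = float('inf')
patience = 10                    # epochs we tolerate without improvement
epochs_without_improvement = 0

while epochs_without_improvement < patience:
    train_one_epoch()            # one pass of minibatch SGD over the training set
    error = validation_error()   # e.g. zero-one loss on the validation set
    if error < best_error:
        best_error = error
        epochs_without_improvement = 0
        # here one would also save a copy of the current best parameters
    else:
        epochs_without_improvement += 1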

12 Chapter 3. Getting Started


Discovering Diverse Content Through
Random Scribd Documents
spread for them on his right. The camels came next, and knelt down
gently, whereupon the houris sprang from their palanquins with a
lightness and grace which astonished Roland more than all. Their
feet hardly left an imprint in the sand. Like the favourites of the
harem, they also approached Roland, and, kissing the ground before
him, ranged themselves on his left. Then like a flood advanced the
troop of celestial dancers, tripping along to the sound of castanets,
flutes, theorbos, timbrels, guitars, and mandolins, amid loud singing,
accompanied by the most lively strains of music.
The animation of their movements increased or diminished
according to the rhythm, which they marked by accurate beats of
the foot and clapping of hands, in slow or quick time. Their eyes
were now filled with soft languor—now darted glances of fire.
Balancing themselves from the hips, they swung their bodies and
waved their arms with ease and grace. At times a comb, unable to
imprison such a wealth of tresses, fell out, and freed locks that were
as dark as the night.
But now the Prophet gave the signal: the dances ceased, and the
houris flew, like a flock of frightened birds, to take their position
opposite Roland, and under shelter of the elephants.
Mahomet, in his turn, drew nearer to the nephew of Charlemagne,
who immediately dismounted—an act of courtesy to age he
invariably observed.
“May Allah, who has made all things of earth and heaven, of day
and night, extend his blessing to you in this world and in the one
you inhabit! You are welcome,” said the Prophet! “I must ask your
pardon for the poverty of this reception, as our meeting has been
arranged at such short notice that I have only had time to bring as
my suite a few of my immediate followers, and the troops which
happen to be my guard of honour for the day. Besides, I feared that
in surrounding myself with too great pomp, I might seem to be
offering a defiance to a late enemy, whom I only desire to make a
friend of. If I have not treated you with more ceremony, it is because
I wish to treat you like a brother.”
Roland made a wry face, which the Prophet thought it convenient
to attribute to the glare of the sun in his eyes, and therefore made a
sign to four angels, who immediately flew off and spread a rosy
cloud before the luminary. .
“I accept your explanation,” said Roland, coolly, half doubting
whether the Prophet were not making fun of him. “I have equal
need of pardon; but if I have come without a fitting retinue, you
must attribute it to my desire to answer your invitation promptly.”
After this exchange of courtesies, Roland commenced the
conversation by saying, “You will forgive me if I beg you at once to
inform me what it is that has obtained me the honour of this
interview, as I am in a hurry to return to earth. I mount guard to-
night in the Emperor’s tent, and I never like to fail in the
performance of duty.”
“Never fear,” said Mahomet; “I’ll have the sun put back. We have
all time for our interview.”
“I am all attention.”
“There is not a more valiant knight than you living. Your single
arm is worth an army. Your judgment is sound, your decision speedy
——”
“How much do you expect for this panegyric? I warn you, before
you go any further, not to set too high a price on it, as I have a clear
estimate of my modest worth.”
“I am in the habit of giving far more than I get, so fear not, but
suffer me to proceed. In my youth I was called El Amin—‘the Safe
Man.’ I know that I possess a generous soul, and that none can be
more loyal than you.”
“This eulogy is evidently the prologue of some treason you are
going to ask of me.”
“If it be treason to leave a bad cause for a good one, to renounce
attempts which are futile, and to accept good fortune when it is
offered, I have, in effect, treason to propose to you.”
“By the Trinity! but you are putting a high price on compliments
for which nobody asked you!”
“I swear by the holy mountain—by the temple of pilgrimage—by
the vault of heaven and the depths of ocean—that the divine
vengeance is about to fall! nothing can delay it. The convulsed skies
shall totter! the uprooted mountains shall move! I swear by the
resting-place of the star—”
“Of a truth, here is plenty of fine words!” said Roland, shrugging
his shoulders. “When we gallant Christian knights make a statement,
they believe us without our having to call in the aid of the sky, and
sea, and stars.”
“As surely as I overthrew the three idols of Mecca, Lata, Aloza,
and Menât, the Christians shall be driven from Spain, and their lands
invaded. Their army shall be dispersed, and shall fly shamefully.
Their hour is come, and it will be bitter and terrible.”
“I have read all that in the Koran,” answered Roland, who felt his
patience failing him. “But that does not say what you want of me, or
why you are thus wasting my time. Since the future is revealed to
you, and you are so certain of our approaching overthrow, there can
be no obstacle to my returning to my post.”
“Yes, the future is ours. You alone delay the coming of the day of
glory. We shall conquer, but while you live it will be only at the price
of terrible sacrifices that we can purchase victory. Why persist in
returning to a world in which death awaits you? I offer you the
sovereignty of this realm, its wealth, its women, its warriors. The
inhabitants of air, earth, and water, the stars which move in the
firmament—all that is gifted with reason or instinct, essence and
matter—in one word, everything shall belong to you and owe to you
unreserved obedience. If the sun annoys you, the moon shall take
its place. Give but the sign, and rivers shall dry up to let you pass. A
population more vast than all the nations of earth put together shall
live only to serve you. These warriors are brave.”
“Of what use is their bravery if they have no enemies to contend
with?”
“These horses are more swift than the wind.”
“Of what service is their speed, since there is here no goal that I
desire to reach?”
“These women are lovely.”
“Their beauty is sheer waste, for I do not love them!”
“Durandal is famous on earth, and yet the humblest of these
soldiers could cut it in two with the edge of his poniard.”
“Enough!” interposed Roland. “I have already told you I am in a
hurry. You have not, I imagine, the impudence to suppose you are
rich enough in wonders to induce me to commit a base action—your
Allah himself would be ashamed of such a thing. You have told me I
am the bravest of living knights: should I be so if I feared the death
you threaten me with? ‘My single arm is worth a whole army,’ you
add. Have I any right, then, to deprive my comrades of its aid at
that moment, of all others, when you profess that they are in
danger? ‘My judgment is sound:’ allow me to offer you a further
proof of it by laughing at your menaces, and predicting your
complete overthrow. Mahomet and Jupiter will soon meet and shake
hands, and the crescent will be sent where the old moons go.”
“You will not listen?”
“I have heard too much already!”
“Behold these lovely creatures, who stretch out their arms towards
you!
“They but make me see how far lovelier my Aude is.”
“See the lands I offer you!”
“What is a region of wonders compared with the spot where a
man was born?”
“Roland, by the faith of Mahomet! you shall never again behold
the land of France!”
“I am a Christian, besides being a Frenchman. The native land to
which I aspire is Heaven, and that birthplace you cannot prevent me
from beholding once more.”
“Infidel hound!” said the Prophet, “I——” But the words were such
as Roland could not listen to patiently. Mahomet did not finish his
sentence, for the gauntlet of the knight smote him on the mouth.

Original Size -- Medium-Size

Original Size -- Medium-Size


CHAPTER X. WHEN ROLAND
REMEMBERS HIS LATIN, AND THE
DEVIL FORGETS HIS.

I
AM unable to tell you what followed. Even Roland had no clear
recollection. When he recovered his senses, he rose and cast his
eyes round him, to find himself in the midst of a vast sandy
plain, stretching on all sides to the horizon. The sun poured its
hostile rays upon him so fiercely, that in a few minutes his armour
became insupportably hot. The atmosphere was so charged with
electricity, that the plume of his helmet crackled, and gave out
sparks. In vain he searched the horizon for a place of shelter—there
was nothing to be seen but level plain and blue sky. Gigantic red
ants came and went busily—they were the only occupants of this
desert. All of a sudden he beheld before him in the distance white
mosques, knots of palms, and a sea-port with some vessels at
anchor, and others sailing out of the harbour. He saw, too, long
caravans, which journeyed to the city gates.
Roland felt his courage revive, and set out in the direction of the
city. But he did not appear to come any the closer to it; he took to
running until he fell down with fatigue on the burning sand. Then
the city seemed to turn of a yellow hue, the blue of the sea grew
paler, and was lost in that of the sky; the trees vanished, and the
Count of Mans found himself once more alone in the desert.
“Why come to a halt?” said he to himself. “Better move forward in
any direction at hap-hazard. I can only gain by the change.”
He rose, determined to struggle on as long as his limbs would
sustain him. What was his surprise to see, in an opposite direction to
that he had just been pursuing, a mountain covered with verdure,
on the summit of which stood a castle! Three walls of
circumvallation surrounded it. At the foot of each flowed a river
covered with vessels of war. Three hanging ladders of marvellous
workmanship united the three platforms of the fortress, and four
bastions guarded the approach to each ladder.

Original Size -- Medium-Size

Roland once more pushed on; but as he advanced, the fortress


rose into the skies, until, after about an hour’s walking, he found
himself with nothing before him save the blank horizon of the
desert. Then despair seized him. He sank on his knees, crossed
himself, and shed four tears, the first he had ever wept. They fell on
the sand, and there formed four springs for a stream of cool and
clear water. Roland received from this new vigour, and having
rendered thanks to Providence, he was preparing to move forward,
when he remarked with surprise a great stirring of the sand. Little
clouds of dust began to rise in all directions, although there was not
a breath stirring. Then the sand began to whirl round incessantly,
marking a great circle at a short distance from our hero.
As it began to whirl, it heaped itself up, drawn towards the centre
by some strange force of attraction. You would have said that some
gigantic polypus was sucking up all the sand of the desert. After a
few minutes there mounted, still eddying round, a huge column,
which grew as Roland watched it, until the summit was lost to sight
in the sky. A hot wind, like the harmattan of the Guinea coast, rose
and drove the sand before it in clouds. The sun turned red as molten
iron.
The pillar of sand at last lost its equilibrium, and fell with a
horrible rushing sound. Roland closed his eyes, but he did not recoil.
Hearing a great roar of laughter, he instinctively clutched his sword
by the hilt. What he saw next induced him to draw it from its sheath.
The sand, in falling, had reared a mound, the base of which
formed an enormous circle, in the centre of which Roland perceived,
with surprise, a huge monster buried in sand to his waist. It was
Eblis, the Devil of the East.

Original Size -- Medium-Size

His Majesty was a hundred feet in height, which is a respectable


size, even for a demon of the highest rank. His black skin, striped
with red, was covered with small scales, which made it glisten like
armour. His hair was so long and curly, a snake might have lost its
way in it. His flat nose was pierced with a ring of admirable
workmanship, as you see done to the wild bulls of the Roman
Campagna. His white teeth, set with precious stones, gave to his
smile a very variegated appearance. His small eyes assumed, one
after the other, all the prismatic colours, which made it impossible to
sustain his gaze. His ears, which exactly resembled those of an
elephant, flapped on his shoulders; but he had, to make up for it, a
tail sixty feet long, terminating in a hooked claw, which could have
wielded the Monument easily as a toothpick.
Eblis had no other covering than his wings, which were large, soft,
and marvellously pliable, and in which he delighted to wrap himself.
Conceive, further, that a phosphorescent gleam played incessantly
over the monster’s skin, and you will easily understand why Roland
unsheathed Durandal.
Eblis was writhing with laughter.
“I haven’t roared so through all eternity, upon my honour! Here, I
say, my little man, do you know you have just done a master-
stroke?”
This familiar tone displeased Roland.
“I have just met Mahomet,” continued Eblis, “and you have broken
five of his front teeth. I have seen a good many prophets in my
time, but I vow, on the faith of the accursed, I never saw one in
such a rage. I have, in honour of the blow, given three days’ holiday
in the infernal regions. There will be concerts, balls, hunts, and
theatres. I have had written, by one of our best authors, a little
comedy in the style of Apollodorus, in the last scene of which
Mahomet receives a hundred strokes of the bastinado. I have given
orders to an army of cooks; you can hear even here a rattle of stew-
pans altogether refreshing. I will undertake to let you see we are not
so backward in this respect as people pretend. You will meet with
many old friends among the guests; we have quite a crowd of
visitors just now. My wife, who is a lively one, will be delighted to
make your acquaintance. Come, let me present you to her as the
best of my friends.”
“Babbler!” exclaimed Roland, but little flattered at these marks of
friendship. “What right have you to address me in this style?”
Eblis, who was not accustomed to be treated so cavalierly, was
dumb with surprise for a moment.
“By my father’s horns!” said he, at last, “I must have
misunderstood you. Give me your hand, Roland, to disabuse me of
the error.”
He stretched out his tail to the knight, who, however, only drew
back a few steps.
“What, puny wretch!” shrieked Eblis, turning as white with rage as
it was possible for one so black to do. “I shall send you back to
earth. Do you think I am of the same stuff as Mahomet?”
“But here Roland flung his second gauntlet in the demon’s face.
“That makes the pair!” said the nephew of Charlemagne, placing
himself in an attitude of defence.
“Zacoum Zimzim Galarabak!” shouted Eblis, mad with fury. (You
must know that is the most terrible oath that can be uttered in the
Saracen tongue.) The earth shook and gaped at Roland’s feet. He
felt himself launched into space. His armour suddenly became icy
cold.
“If I get back without an attack of rheumatism I shall be lucky,”
said the knight.
He heard around him the flapping of wings; it was a troop of
afreets and djins.
“Reflect, Roland. There is yet time. Mahomet is prepared to
forgive you.”
All the answer Roland vouchsafed was the intoning of the canticle

“Sub tuum Fræsidium confugimus.”
“In a few moments your body will be dashed to pieces on earth.
Remember the wondrous things the Prophet offered to share with
you.”
“Sancta Dei genitrix; nostras deprecationes ne despicias,”
continued Roland. And now it seemed to him that, instead of falling
at hazard, he was being gently carried. The chorus of afreets and
djins was left far behind, but he still heard the sound of pinions.

Original Size -- Medium-Size

“Set your mind at rest,” said a voice so exquisitely musical that


Roland trembled to hear it. “I am the Archangel Michael. Our
Blessed Lady has sent me to preserve you. She had been touched by
your constancy and courage. Repose in safety on my wings, and we
shall soon reach earth.”
And, in truth, in a few minutes’ time the Count of Mans, to his
astonishment, found himself before Saragossa. He was at prayer in
his tent when he heard the voice of Miton.
“My dear Roland, where are you?” cried the Count of Rennes,
anxiously.
“Here I am,” said the knight, hurrying to his friend.
Original Size -- Medium-Size

“Charlemagne, who knows how punctual you are, seeing you were
ten minutes behind your time to take on your guard, has sent to
look for you in every direction. You are pale, my dear Count; what
has happened to you?”
“I will tell you all about it,” said Roland, as he hastened to his post
near the Emperor.

END OF THE SECOND BOOK.


BOOK THE THIRD — THE FORTRESS
OF FEAR.

Original Size -- Medium-Size


CHAPTER I. THE FOUR FOES OF
CROQUEMITAINE.

C
HARLEMAGNE had an excellent memory. He never omitted to
ponder over the dangers to which Mitaine was exposed at
every turn. He had the scene of the late ambush carefully
searched by his spies in the first place, and afterwards by his
soldiers. All, on their return, made the same report. They said the
forest was inhabited, and there was a good deal of talk about a
castle called “The Fortress of Fear,” which was to be found
somewhere in the neighbourhood, although nobody they met with
had seen it. None, however, doubted its existence. If a child
disappeared, or any cattle were carried off, the trembling peasants
said, “The Lord of Fear-fortress had taken them.” If a fire broke out
anywhere, it was the Lord of Fear-fortress who must have lit it. The
origin of all accidents, mishaps, catastrophes, or disasters was
traced to the mysterious owner of this invisible castle.
“I should like to have the mystery cleared up,” said Charlemagne
to himself. “I can hardly resign myself to the belief that it is Ganelon,
my old brother-in-arms.”
He called his knights together.
“My faithful champions, I need four of you for a perilous
adventure, I know not where I am sending you—I know not whether
you will return. Who will risk death for my good favour?”
All the knights at once flung themselves at his feet, each
entreating the Emperor to honour him with his choice.
“You place me in a difficult position,” said the Emperor, greatly
moved; “I see that chance must point out the four champions. I can
without fear trust to it, for you are all equally brave.”
The names of all the knights present were put into a helmet, and
Mitaine played the part of Destiny to the best of her power, little
thinking she was choosing her own champions and avengers. The
first name she called out was that of Allegrignac of Cognac, Count of
Salençon and Saintonge.
“The lot suits me admirably,” said the Emperor, giving a friendly
wave of his hand to the knight. “You know the language of the
country, and will be a safe guide for your companions.”
Mitaine next named the Baron of Mont-Rognon, Lord of
Bourglastic, Tortebesse, and elsewhere.
“This is indeed a capital choice! There is no stouter arm in the
Arvennes than yours; and if there be a postern to be burst open by
a powerful shoulder, you will be there, Mont-Rognon.”
“Porc-en-Truie, Lord of Machavoine,” cried Mitaine.
“I am in luck to-day, by St. James! You are known to be
experienced, Porc-en-Truie, and you will conduct the adventure, I
entrust to you, to a prosperous end, I feel sure. But I am curious to
know who is my fourth champion.”
“Maragougnia, Count of Rioin,” said Mitaine.
“Now we have wisdom, strength, and cunning. Maragougnia can
give the serpent points at wisdom, and beat him. If I do not succeed
with such knights I shall despair altogether.”
Charlemagne withdrew with his four champions, told them of the
perils to which his god-child had been exposed, the investigation he
had instituted, the suspicions he had entertained; and finally, he
spoke of the Fortress of Fear, winding up in these terms:—
“I am anxious to square accounts with this Croquemitaine. You will
pass through the forest till you arrive at Alagon, a little hamlet on
the banks of the Ebro. There you will inquire for the Fonda del
Caïman, or, if you prefer it, the sign of the Crocodile. You will there
rest yourselves for a short time, and then set out on your quests.
You, Allegrignac, striking off from the river, will pursue your course
towards Pampeluna. You, Mont-Rognon, will proceed in the direction
of Catalyud; and look out for the Saracens, my friend, who on that
side are disgusted enough with the trouble we have given them.
You, Porc-en-Truie, will make for Fuentes. If you are guided by me,
you will travel by night only, and conceal yourself carefully by day.
You will appreciate my counsel when once you are on the road. You,
finally, my gallant Maragougnia, will have to direct your steps
towards Lerida, but you will not go beyond the river Alcander. I have
reserved this expedition for you because it is the most hazardous—
there, you need not thank me. I understand you! Quarter the
country in every direction, and find out for me this Fortress of Fear.
He who brings me the head of its dreaded lord shall be created a
baron and peer of my realm.”
The Emperor replenished the purses of his champions, and took
leave of them with an embrace. When they’ found themselves alone
they interchanged looks of bewilderment.
“What do you think of that?” said Porc-en-Truie, with a grimace.
“That I shall be a duke,” said Allegrignac, cutting a caper. “This
adventure won’t take me a minute!”
“To think that we must set out to-night!” said Mont-Rognon, in
tones of regret; “and to think that I have ordered a splendid supper
for to-night, which my fellows will get the benefit of!”
“To think that we shall none of us ever come back again!” said
Maragougnia, in a melancholy voice, as he wiped away a tear with
the sleeve of his chain-mail.
“Pshaw! who knows?” broke in Porc-en-Truie, with a smile. “Let us
set out, and then we can see!”
They appointed to meet on the borders of the forest, and within
an hour afterwards they’ were all on the spot, equipped for war or
for travel.
Porc-en-Truie, Lord of Machavoine, was a great fellow of thirty
years of age, more skilled in avoiding blows than in dealing them. He
invariably shirked all his military duties, not because he was a
coward, but because he was incorrigibly idle. He had been known to
tramp three hours afoot to save himself the trouble of saddling his
horse, and he had killed his dearest friend in a tournament, in order
to terminate a long and fatiguing tilting match. He arrived at the
rendezvous on horseback, with no weapon but his sword.
“How imprudent!” cried Allegrignac, the moment he saw him
coming. “Are we going to a wedding only, or are you desirous of
emulating Miton’s great feat at the Tourney of Fronsac?”
“I hate a load of weapons, and I don’t mean to kill myself for this
Mitaine—for whom, between you and me, I don’t care a grain of
mustard-seed!”
Allegrignac of Cognac, Count of Salençon, was twenty-five years
of age, and six feet six high. He had an open countenance, a stout
heart, an untiring tongue, limbs of steel, a stomach of leather, and a
very slender patrimony. His hair was curly, his teeth were white. He
was as proud as a Spaniard, as brave as a Frenchman, as simple-
minded as a goose. He was possessed of a pleasant contralto voice,
a cheerful spirit, and a grey horse called Serenade.
Picture to yourself a figure clad in complete steel, and with
weapons of vast weight, like one of those armed and bandy-legged
giants you see in a procession of trades, capable of lifting enormous
weights, not to mention cattle, and any other unconsidered trifles he
could lay hands on, and you have a portrait of the Baron of Mont-
Rognon, Lord of Bourglastic, Tortebesse, and elsewhere. This huge
mass of muscle existed only to eat and drink. He was a descendant
of Esau on his father’s side, and of Gargantua on his mother’s. He
once performed a gigantic feat—he killed six hundred Saracens who
happened to get in his way as he was going to dinner. He had an
elastic stomach, and a mouth armed with four rows of teeth.
Having described his stomach and his mouth,
I need not go on with the likeness, for all that remained were
mere incidental appurtenances.
He arrived third at the place of meeting, leading by the halter a
mule laden with provisions and bottles.
“What’s this?” said Allegrignac, laughingly.
“That!” said Mont-Rognon, offended at his bluntness. “That’s
supper.”
“What’s the use of that?” said Porc-en-Truie.
[Illustration: Mont-Rognon the Monstrous.]
[Illustration: Mont-Rognon in a hurry for his dinner.]
“Charlemagne has ordered us to perish for him,” broke in the Lord
of Bourglastic, “but he did not stipulate that we should perish of
hunger.”
Maragougnia, Count of Riom, was the last to arrive. He was
equipped in the most gloomy style. His armour was of browned
steel, sprinkled with silver tears. From the coronet that surmounted
his helmet sprang a few mangy black feathers, which drooped over
his shoulders like the branches of a weeping willow, and all the rest
of his accoutrements were to match.
He had one extraordinary quality, which was his strong point—
instead of making him lose his head, fear only gave him increased
presence of mind. They related deeds of prowess of his which were,
in reality, only prodigies of cowardice. He did everything with a
profound air of melancholy. His first wife, they say, died of yawning;
the second perished of sheer weariness in three weeks.
Behind him came a page, who might be considered to have
originated the sombre livery worn nine hundred years later by the
page of the Duchess of Marlborough.*

* Vide “Malbrouck”:—
“Elle voit venir son page,
De noir tout habillé.”
(“She sees her page coming, all dressed in black.”)
This lugubrious squire bore the count’s change of arms—to wit:
two daggers of mercy; three swords, various; one lance; one
helmet; one morion; two daggers, poisoned; one battle-axe; one
flail, iron; one shield; one breastplate; one shirt of mail; two pairs of
gauntlets; three pairs of spurs.
“Good heavens!” said Allegrignac; “are we going to equip all the
nation for war? Look, Porc-en-Truie! the Count of Riom has stripped
the armouries of his ten castles.”
“I wouldn’t stir an inch,” said Porc-en-Truie, in the interval of a
couple of yawns, “to assure myself that Maragougnia has done
something silly. If you assured me to the contrary, I might perhaps
be surprised into getting up to see. And yet no! I couldn’t believe it;
so I should stay where I was.”
Porc-en-Truie, I must observe, sat himself down on the grass the
moment he arrived.
“You’re quite welcome to laugh at my prudence,” said
Maragougnia, “but I don’t forget we are going to certain death.”
“Certain death! Fiddlesticks! I mean yet to rival the Methusalems
of the period,” said Porc-en-Truie, rising. “And now let’s be off, if we
are to reach Alagon to-night.”
“To prepare for death,” said Maragougnia, dashing away a tear
with his gauntlet.
“To go to sleep,” said Porc-en-Truie, with a yawn.
“To try a throw with the dice,” said Allegrignac, jingling the money
in his purse.
“To make a good supper,” said Mont-Rognon, with a hollow voice,
gnashing his teeth like castanets.
In ten minutes the four knights had entered the wood. At sunset
Allegrignac was hammering with his fist at the door of the Fonda del
Caïman.

CHAPTER II. THE SIGN OF THE CROCODILE.

THE innkeeper was a man of middle size, half Spaniard and half
Moor, with a big body and thin legs, a brown skin and grey
eyes. He had acquired considerable reputation in the district
for his mode of dressing calves’ feet with saffron, and his
handiness in stabbing people in the right place. He made everything
a matter of trade, and used to regret that he had inherited no
religious opinions which he could have abjured at a fixed price to be
got either from the Saracens or the Christians. For the rest, he was a
most obliging host, provided your purse was well supplied; and I
believe I shall put the finishing stroke to the likeness when I say he
was the biggest robber in all Spain, from Pontevedra to Girone.
Ali Pépé opened the door. One is always forgetting something, and
I forgot to tell you his name was Ali Pépé.
“Where’s the landlady?” asked Allegrignac, twisting his moustache.
“I want a bed,” yawned Porc-en-Truie.
“Some supper!” growled Mont-Rognon.
Maragougnia said nothing. He was absorbed in studying the inn,
and the estimate he formed seemed far from satisfactory.
Ali Pépé stood on the defensive, blocking the entrance of the inn.
“Your lordships appear of too exalted a station for me to omit to
inform you that you will find the accommodation here very unsuited
to you.”
“Here’s frankness and disinterestedness! But where can we find
better accommodation?”
“My inn is the only one in the district.”
“Then make way for us,” said Mont-Rognon, catching up Ali Pépé
by the girdle, and carrying him in at arm’s length into the kitchen.
“We shall be able to converse better here!” Maragougnia entered
last. He tried all the locks, in order to see whether the doors closed
securely. He examined all the outlets, sounded the panels, and
ordered his squire to bring him his arms.
“We want four beds,” said Porc-en-Truie.
“In the same room,” said Maragougnia, who had a horror of being
solitary.
“First of all we want supper,” bellowed Mont-Rognon; “don’t let us
forget the most important of our wants.”
“A modest supper,” suggested Maragougnia, who was afraid of the
expense.
“A modest supper!” bellowed the Lord of Bourglastic. “Don’t you
do anything of the kind, landlord, or I’ll burn the place about your
ears. Empty your poultry-yard, drag your fish-ponds, uncork your
bottles; set to work—kill, pluck, draw, and broach,—in short, make
ready, to the best of your power, a feast for an emperor or a sultan!”
“You will lay for me separately,” said the Count of Riom, tearfully,
“a few radishes and some wine of first-rate——”