DevOps in Python
Infrastructure as Python
Second Edition
Moshe Zadka
DevOps in Python: Infrastructure as Python
Moshe Zadka
Belmont, CA, USA
Table of Contents

Introduction
Chapter 2: Packaging
    2.1 Virtual Environments
    2.2 pip
    2.3 Setup and Wheels
    2.4 Binary Wheels
    2.5 manylinux Wheels
        2.5.1 Self-Contained Wheels
        2.5.2 Portable Wheels
        2.5.3 manylinux Containers
        2.5.4 Installing manylinux Wheels
    2.6 tox
        2.6.1 One Environment
        2.6.2 Multiple Environments
        2.6.3 Multiple Differently Configured Environments
Chapter 5: Testing
    5.1 Unit Testing
    5.2 Mocks, Stubs, and Fakes
    5.3 Testing Files
Index
About the Author
Moshe Zadka has been involved in the Linux community
since 1998, helping in Linux “installation parties.” He has
been programming Python since 1999 and has contributed
to the core Python interpreter. Moshe has been a DevOps/
SRE since before those terms existed, caring deeply about
software reliability, build reproducibility, and more. He has
worked in companies as small as three people and as big as
tens of thousands—and usually in a position where software
meets system administration.
About the Technical Reviewer
Martyn Bristow is a software developer in the United
Kingdom. He began using Python as a researcher but is
now an experienced broad user of Python for data analysis,
software test automation, and DevOps. He currently builds
data analysis web apps in Python deployed to Kubernetes.
You can find Martyn on GitHub @martynbristow.
Acknowledgments
Thanks to my wife, Jennifer Zadka, without whose support I could not have written
this book.
Thanks to my parents, Yaacov and Pnina Zadka, who taught me how to learn.
Thanks to my advisor, Yael Karshon, who taught me how to write.
Thanks to Mahmoud Hashemi for inspiration and encouragement.
Thanks to Mark Williams for being there for me.
Thanks to Glyph Lefkowitz for teaching me about Python, programming, and being a
good person.
Thanks to Brennon Church and Andrea Ross, who supported my personal growth
and learning journey.
Introduction
Python began as a language to automate an operating system: the Amoeba. A typical
Unix shell would have been ill-suited because Amoeba exposed an API, not just textual
file representations.
The Amoeba OS is a relic now. However, Python continues to be a useful tool for
automation of operations—the heart of typical DevOps work.
Python is easy to learn, and writing readable code in it is easy, which is a necessity when
a critical part of the work is responding to a 4 a.m. alert and modifying some
misbehaving program.
It has powerful bindings to C and C++, the universal languages of the operating
system—and yet is natively memory-safe, leading to few crashes at the automation layer.
Finally, although not true when it was created, Python is one of the most popular
languages. This means that it is relatively easy to hire people with Python experience and
easy to get training materials and courses for people who need to learn on the job.
This book guides you through how to take advantage of Python to automate
operations.
To get the most out of the book, you need to be somewhat familiar with Python. If
you are new to Python, there are many great resources to learn it, including the official
Python tutorial at docs.python.org. You also need to be somewhat familiar with Unix-like
operating systems like Linux, especially how to use the command line.
CHAPTER 1
Installing Python
Before you can use Python, you need to install it. Some operating systems, such as
macOS and some Linux variants, have Python preinstalled. Those versions of Python,
colloquially called system Python, often make poor defaults for people who want to
develop in Python.
The version of Python installed is often behind the latest practices. System
integrators often patch Python in ways that can lead to surprises. For example,
Debian-based Python is often missing modules like venv and ensurepip, and macOS
Python links against a Mac shim around its native SSL library. These quirks mean that,
especially when starting out and relying on FAQs and web resources, it is better to
install Python from scratch.
This chapter covers a few ways to do so and the pros and cons of each.
1.1 OS Packages
Volunteers have built ready-to-install packages for some of the more popular operating
systems.
The most famous is the deadsnakes PPA (Personal Package Archives). The dead in
the name refers to the fact that those packages are already built—with the metaphor
that sources are “alive.” Those packages are built for Ubuntu and usually support all the
versions of Ubuntu that are still supported upstream. Getting those packages is done
with a simple pair of commands.
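The following is a sketch of the usual sequence; the specific Python version is an example, and the exact package name depends on the version needed.

$ sudo add-apt-repository ppa:deadsnakes/ppa
$ sudo apt-get update
$ sudo apt-get install python3.10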
1.2 Using pyenv
pyenv tends to offer the highest return on investment for installing Python for local
development. The initial setup does have some subtleties. However, it allows installing
many side-by-side Python versions as needed. It also allows you to manage how each is
accessed—either using a per-user default or a per-directory default.
Installing pyenv itself depends on the operating system. On macOS, the easiest
way is to install it via Homebrew. Note that in this case, pyenv itself might need to be
upgraded to install new versions of Python.
On a Unix-based operating system, such as Linux or FreeBSD, the easiest way to
install pyenv is by using the curl|bash command.
$ PROJECT=https://github.com/pyenv/pyenv-installer
$ INSTALLER_PATH=raw/master/bin/pyenv-installer  # not named PATH, to avoid shadowing the shell's search path
$ curl -L "$PROJECT/$INSTALLER_PATH" | bash
Of course, this comes with its own security issues and could be replaced with a
two-step process where you can inspect the shell script before running it, or even use a
Git checkout to pin to a specific revision.
For example, after cloning the repository and checking out a pinned revision:

$ cd pyenv-installer
$ bash pyenv-installer

After installation, add the following to the shell startup file so that pyenv is found and initialized:

export PATH="$HOME/.pyenv/bin:$PATH"
eval "$(pyenv init -)"
eval "$(pyenv virtualenv-init -)"
After installing a Python version, it can be set as the per-user default by using
pyenv global <version> or as the per-directory default by using pyenv local <version>.
Local means the version applies in a given directory. This is done by putting a
.python-version file in this directory. This is important for version-controlled repositories,
but a few different strategies are used to manage those. One strategy is to add this file to
the ignored list. This is useful for heterogeneous teams or open source projects. Another
strategy is to check in the file so that the same version of Python is used in the repository.
Note that since pyenv is designed to install side-by-side versions of Python, it has no
concept of upgrading Python. A newer Python version needs to be installed with pyenv
and then set as the default.
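For example, assuming 3.10.4 is the newer version that should become the per-user default:

$ pyenv install 3.10.4
$ pyenv global 3.10.4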
By default, pyenv installs non-optimized versions of Python. If optimized versions
are needed, enter the following.
env PYTHON_CONFIGURE_OPTS="--enable-shared \
    --enable-optimizations \
    --with-computed-gotos \
    --with-lto \
    --enable-ipv6" pyenv install

These options build a version that is pretty similar to the binary versions available
from python.org.
1.4 PyPy
The usual implementation of Python is sometimes known as CPython to distinguish
it from the language proper. The most popular alternative implementation is PyPy, a
JIT-based implementation of Python in Python. Because it dynamically compiles to
assembly just in time, it can sometimes achieve phenomenal speed-ups (three times or
even ten times) over regular Python.
There are sometimes challenges in using PyPy. Many tools and packages are
tested only with CPython. However, sometimes spending the effort to check if PyPy is
compatible with the environment is worth it if performance matters.
There are a few subtleties in installing PyPy from source. While it is theoretically
possible to translate it using CPython, in practice, the optimizations in PyPy mean that
translating using PyPy works on more reasonable machines. Even when installing from
source, it is better to first install a binary version to bootstrap.
The bootstrapping version should be PyPy, not PyPy3. PyPy is written in the
Python 2 dialect, and this is one of the few cases where worrying about Python 2's
deprecation is irrelevant. PyPy3 is the Python 3 dialect implementation, which is
usually better in production as most packages are slowly dropping support for
Python 2.
The latest PyPy3 supports Python 3.5 features, as well as f-strings. However, the
latest async features, added in Python 3.6, do not work.
1.5 Anaconda
The closest to a system Python that is still reasonable for use as a development platform
is Anaconda, a metadistribution. It is, in essence, an operating system on top of the
operating system. Anaconda has its grounding in the scientific computing community,
and so its Python comes with easy-to-install modules for many scientific applications.
Many of these modules are non-trivial to install from PyPI, requiring a complicated build
environment.
It is possible to install multiple Anaconda environments on the same machine. This
is handy when needing different Python versions or different versions of PyPI modules.
To bootstrap Anaconda, you can use the bash installer available at
https://conda.io/miniconda.html. The installer also modifies ~/.bash_profile to add the
path to conda, the Anaconda package manager.
conda environments are created using conda create --name <name> and activated
using conda activate <name>. There is no easy way to use unactivated
environments. It is possible to create a conda environment while installing packages in it:
conda create --name some-name python. You can specify the version using =, as in conda
create --name some-name python=3.5. It is also possible to install more packages into
a conda environment, using conda install package[=version], after the environment
has been activated. Anaconda has a lot of prebuilt Python packages, especially ones
that are non-trivial to build locally. This makes it a good choice if those packages are
important to your use case.
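Putting this together, a typical session might look like the following; the environment name, Python version, and numpy package are illustrative.

$ conda create --name py39-env python=3.9
$ conda activate py39-env
$ conda install numpy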
1.6 Summary
Running a Python program requires an interpreter installed on the system. Depending
on the operating system and the versions, there are several different ways to install
Python. Using the system Python is a problematic option. On macOS and Unix systems,
using pyenv is almost always the preferred option. On Windows, using the prepackaged
installers from Python.org is often a good idea.
CHAPTER 2
Packaging
One of the main strengths of Python is the ecosystem, the third-party packages on
PyPI. There are packages to do anything from running computations in parallel on GPUs
for machine learning to reducing the boilerplate needed for writing classes. This means
that a lot of the practical work with Python is handling the third-party dependencies.
The current packaging tooling is pretty good, but things have not always been this
way. It is important to understand which best practices are antiquated rituals based on
faulty assumptions, and which still have merit and are actually good ideas.
When dealing with packaging, there are two ways to interact. One is to be a
consumer wanting to use the functionality of a package. Another is to be the producer,
publishing a package. These describe, usually, different development tasks, not
different people.
It is important to have a solid understanding of the consumer side of packages before
moving on to producing them. If the goal of a package publisher is to be useful to the
package user, it is crucial to imagine the last mile before starting to write a single line of code.
2.1 Virtual Environments
Virtual environments are often misunderstood because the concept of environments is
not clear. A Python environment refers to the root of the Python installation. The reason
an environment is important is because of the lib/site-packages subdirectory of that
root. The lib/site-packages subdirectory is where third-party packages are installed.
The most popular tool to add packages to an environment is pip, which is covered
in the next section. Before using pip, it is important to understand how virtual
environments work.
A real environment is based on a Python installation, which means that to get a new
real environment, a new Python must be installed and often rebuilt. This is sometimes
an expensive proposition.
The advantage of a virtual environment is that it is cheap to set up and tear down.
Some modern Python tooling takes advantage of that, setting up and tearing down
virtual environments as a normal part of their operation. Setting up and tearing down
virtual environments, being cheap and fast, is also a common part of Python developer
workflow.
A virtual environment copies the minimum necessary out of the real environment to
mislead Python into thinking it has a new root. The precise file structure is less important
than remembering that the command to create a virtual environment is simple and fast.
Here, simple means that all the command does is copy some files and perhaps make
a few symbolic links. Because of that, there are a few failure modes—mostly when file
creation fails because of permission issues or a full disk.
There are two ways to use virtual environments: activated and inactivated. To use
an inactivated virtual environment, which is most common in scripts and automated
procedures, you explicitly call Python from the virtual environment.
This means that for a virtual environment in /home/name/venvs/my-special-env,
calling /home/name/venvs/my-special-env/bin/python starts a Python process that uses
this environment. For example, /home/name/venvs/my-special-env/bin/python -m
pip runs pip but installs into the virtual environment.
Note that entrypoint-based scripts are installed alongside Python, so running
/home/name/venvs/my-special-env/bin/pip also installs packages in the virtual
environment.
The other way to use a virtual environment is to activate it. Activating a virtual
environment in a bash-like shell means sourcing its activate script.
$ source /home/name/venvs/my-special-env/bin/activate
The sourcing sets a few environment variables, only one of which is important. The
important variable is PATH, which gets prefixed by /home/name/venvs/my-special-env/
bin. This means that commands like python or pip are found there first. Two cosmetic
variables are set. $VIRTUAL_ENV points to the root of the environment. This is useful
in management scripts that want to be aware of virtual environments. PS1 is prefixed
with (my-special-env), which is useful for visualizing the virtual environment while
working interactively in the console.
It is generally a good practice to only install third-party packages inside a virtual
environment. Combined with the fact that virtual environments are cheap, if one gets
into a bad state, it is best to remove the whole directory and start from scratch.
For example, imagine a bad package install that causes the Python start-up to fail.
Even running pip uninstall is impossible since pip fails on start-up. However, the
cheapness means you can remove the whole virtual environment and re-create it with a
good set of packages.
A modern practice is to move increasingly toward treating virtual environments as
semi-immutable. After creating them, there is a single stage for installing all required
packages. Instead of modifying the virtual environment if an upgrade is required, destroy
the environment, re-create, and reinstall.
The modern way to create virtual environments is to use the venv standard library
module. This only works on Python 3. Since Python 2 has been strongly deprecated since
the beginning of 2020, it is best avoided in any case.
venv is used as a command with python -m venv <directory>, as there is no
dedicated entrypoint. It creates the directory for the environment.
It is best if this directory does not exist before that. A best practice is to remove
it before creating the environment. There are also two options for creating the
environment: which interpreter to use and what initial packages to install.
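A minimal sketch of the full cycle, where the .venv directory name and the httpx package are illustrative:

$ rm -rf .venv                      # remove any previous environment
$ python3 -m venv .venv             # create a fresh environment
$ .venv/bin/python -m pip install httpx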
2.2 pip
The packaging tool for Python is pip. There have been other tools that have mostly been
abandoned by the community and should not be used.
Installations of Python used to not come with pip out of the box. This has changed
in recent versions, but many versions which are still supported do not have it. When
running on such a version, python -m ensurepip installs it.
Some Python installations, especially system ones, disable ensurepip. When
lacking ensurepip, there is a way of manually getting it: get-pip.py. This is a single
downloadable file that, when executed, unpacks pip.
Luckily, pip is the only package that needs these weird gyrations to install. All other
packages can, and should, be installed using pip.
For example, if sample-environment is a virtual environment, installing the glom
package can be done with the following code.
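$ sample-environment/bin/python -m pip install glom
$ sample-environment/bin/python -m glom
{}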
The last command tests that glom has been properly installed. Glom is a package
to handle deeply-nested data, and called with no arguments, outputs an empty Python
dictionary. This makes it handy for quickly testing whether a new virtual environment
can install new packages properly.
Internally, pip is also treated as a third-party package. Upgrading pip itself is done
with pip install --upgrade pip.
Depending on how Python was installed, its real environment might or might not
be modifiable by the user. Many instructions in various README files and blogs might
encourage using sudo pip install. This is almost always the wrong thing to do; it
installs the packages in the global environment.
The pip install command downloads and installs all dependencies. However,
it can fail to downgrade incompatible packages. It is always possible to install explicit
versions: pip install package-name==<version> installs this precise version. This is
also a good way, for local testing, to install explicitly non-generally-available packages,
such as release candidates, betas, or similar.
If wheel is installed, pip builds, and usually caches, wheels for packages. This is
especially useful when dealing with a high virtual environment churn since installing a
cached wheel is a fast operation. This is also highly useful when dealing with native or
binary packages that need to be compiled with a C compiler. A wheel cache eliminates
the need to build it again.
pip does allow uninstalling with pip uninstall <package>. This command, by
default, requires manual confirmation. Except for exotic circumstances, this command
is not used. If an unintended package has snuck in, the usual response is to destroy the
environment and rebuild it. For similar reasons, pip install --upgrade <package>
is not often needed; the common response is to destroy and re-create the environment.
There is one situation where upgrading in place is a good idea: pip itself, as
described earlier.
pip install supports a requirements file: pip install --requirement or pip
install -r. The requirements file simply has one package per line. This is no different
from specifying packages on the command line. However, requirements files often
specify strict dependencies. A requirements file can be generated from an environment
with pip freeze.
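A minimal round trip, assuming an activated virtual environment:

$ pip freeze > requirements.txt     # pin everything currently installed
$ pip install -r requirements.txt   # reproduce the same set elsewhere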
Installing anything that is not a strict and closed set of requirements, like most
individual packages or wheels, requires pip to decide which versions of dependencies to
install. The general problem of dependency resolution does not have an efficient and
complete solution. Different strategies are possible to approach such a solution.
The way pip resolves dependencies is by using backtracking. This means that
it optimistically tries to download the latest possible requirements recursively. If a
dependency conflict is found, it backtracks and tries a different option.
As an example, consider three packages.
• top
• middle
• base
There are two base versions: 1.0 and 2.0. The package dependencies are declared in
setup.cfg files.
The following is for top.
[metadata]
name = top
version = 1.0

[options]
install_requires =
    base
    middle
The following is for middle.

[metadata]
name = middle
version = 1.0

[options]
install_requires =
    base<2.0
The base package has two versions: 1.0 and 2.0. It does not have any dependencies.
Because top depends directly on base, pre-backtracking versions of pip would get the
latest base and then fail the resolution against middle's base<2.0 constraint. A
backtracking pip hits the same conflict, backs up, tries base 1.0, and succeeds.
This solution has the advantage of being complete, but it can take infeasible amounts
of time in certain cases. Such cases are rare, but merely taking a long time is not.
One way to increase the speed is to include >= dependencies in the loose
requirements. This is usually a good idea since packages are better at guaranteeing
backward compatibility than forward compatibility. As a side benefit, this can
dramatically reduce the solution space that pip needs to backtrack in.
In most scenarios, it is better to use strict requirements for day-to-day development
and regenerate the strict requirements from the loose requirements (which can take a
while) on a cadence that balances keeping up to date with churn.
2.3 Setup and Wheels

Python code is organized in packages, directories of modules that are imported
together. The pedantic way to refer to installable things is distributions. A distribution can
correspond to no packages (it can be a top-level single-module distribution) or multiple
packages.
It is good to keep a 1-1-1 relationship when packaging things: a single distribution
corresponding to one package and named the same. Even if there is only one file, put it
as an __init__.py file under a directory.
Packaging is an area that has seen a lot of changes. Copying and pasting from
existing packages is not a good idea; good packages are, for the most part, mature
packages, so they tend to follow older practices. Following the latest best practices
means making changes to an otherwise-working process.
Starting with setuptools version 61.0.0, it is possible to create a package with only
two files besides the code files.
• pyproject.toml
• README.rst
The README is not strictly necessary. However, most source code management
systems display it rendered, so it is best to break it out into its own file.
Even an empty pyproject.toml generates a package. However, almost all packages
need at least a few more details.
The build-system is the one mandatory section in a non-empty pyproject.toml file.
It is usually the first.
[build-system]
requires = [
    "setuptools",
]
build-backend = "setuptools.build_meta"
Many systems can be used to build valid distributions. The setuptools system,
which used to be the only possibility, is now one of several. However, it is still the most
popular one.
Most of the rest of the data can be found in the project section.
[project]
name = "awesome_package"
version = "0.0.3"
description = "A pretty awesome package"
readme = "README.rst"
authors = [{name = "My Name", email = "my.name@example.com"}]
dependencies = ["httpx"]
For most popular code organizations, this is enough for the setuptools system to
find the code and create a correct package.
There are ways to have setuptools treat the version as dynamic and take it from a
file or an attribute. An alternative is to take advantage of pyproject.toml being a
structured format and manipulate it directly.
For example, the following code uses a CalVer (calendar versioning) scheme of
YEAR.MONTH.release in a month. It uses the built-in zoneinfo module, which requires
Python 3.9 or above, and the tomlkit library, which supports roundtrip-preserving TOML
parsing and serialization.
import datetime
import pathlib
import zoneinfo

import tomlkit

# Compute the current YEAR.MONTH. prefix in UTC.
now = datetime.datetime.now(tz=zoneinfo.ZoneInfo("UTC"))
prefix = f"{now.year}.{now.month}."
pyproject = pathlib.Path("pyproject.toml")
data = tomlkit.loads(pyproject.read_text())
current = data["project"].get("version", "")
if current.startswith(prefix):
    # Same month as the previous release: bump the serial number.
    serial = int(current.split(".")[-1]) + 1
else:
    # First release this month.
    serial = 0
version = prefix + str(serial)
data["project"]["version"] = version
pyproject.write_text(tomlkit.dumps(data))
Some utilities keep the version synchronized between several files; for example,
pyproject.toml and example_package/__init__.py. The best way to use these utilities
is to not need them: instead of duplicating the version, query the installed package
metadata at runtime.
# example_package/__init__.py
from importlib import metadata

__version__ = metadata.distribution("example_package").version
del metadata  # Keep the top-level namespace clean
Packages can also declare command-line entry points in pyproject.toml. Each entry
maps a command name to a module and a function.

[project.scripts]
example-command = "example_package.commands:main"
# example_package/commands.py
def main():
    print("an example")
After the package is installed, the command is available on the path.

$ example-command
an example
You can build a distribution with pyproject.toml, a README.rst, and some Python
code. There are several formats a distribution can take, but the one covered here is
the wheel.
After installing build using pip install build, run
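$ python -m build --wheel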
This creates a wheel under dist. If the wheel needs to be in a different directory,
add --outdir <output directory> to the command.
You can do several things with the wheel, but it is important to note that one thing
you can do is pip install <wheel file>. Doing this as part of continuous integration
makes sure the wheel, as built by the current directory, is functional.
It is possible to use python -m build to create a source distribution. This is usually a
good idea to accommodate use cases that prefer to install from source. These use cases
are esoteric, but generating the source distribution is easy enough to be worth it.
It is possible to combine the --sdist and --wheel arguments into one run of
python -m build. This is also what python -m build does by default: create both a
source distribution and a wheel.
By default, python -m build installs any packages it needs to build the package in
a fresh virtual environment. When running python -m build in a tight edit-debug loop,
perhaps to debug a setup.cfg, this can get tedious. In those cases, create and activate a
virtual environment, and then run
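$ pip install build                # the build tool itself, plus its dependencies
$ python -m build --no-isolation   # build using the current environment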
This installs its dependencies in the current environment. While this is not a good fit
for production use, this is a faster way to debug packaging issues.
2.4 Binary Wheels
Python is well known and often used as an integration language. One of the ways this
integration happens is by linking to native code.
This is usually done using the C Application Binary Interface (C ABI). The C ABI is
used not only for integrating with C libraries but with other languages, such as C++, Rust,
or Swift, which can generate C ABI-compatible interfaces.
There needs to be some glue code bridging the C ABI to Python to integrate Python
with such code. It is possible to write this code by hand.
This is a tedious and error-prone process, so code generators are often used. Cython
is a popular generator that uses a Python-compatible language. Although Cython is often
used to interface to C ABI libraries, it can be used to generate extensions without such
integration. This makes examples slightly simpler, so the following Cython code is used
as a running example.
#cython: language_level=3
This code is in the binary_module.pyx file. It is short and does just enough to be
clear if it works correctly.
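A minimal body for such a module might look like the following; the get_answer function is illustrative rather than taken from any particular project.

#cython: language_level=3

def get_answer():
    # Returning a known value is enough to show that the compiled
    # extension builds and imports correctly.
    return 42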
To build code with native integration, the files that describe the build are slightly
more complicated.
The pyproject.toml file is no longer empty. It now has two lines.
[build-system]
requires = ["setuptools", "cython"]
This makes sure the cython package is installed before trying to build a wheel.
The setup.py is no longer minimal. It contains enough code to integrate
with Cython.
import setuptools
from Cython import Build

setuptools.setup(
    ext_modules=Build.cythonize("binary_module.pyx"),
)
Since *.pyx files are not included by default, it needs to be enabled explicitly in the
MANIFEST.in file.
include *.pyx
Since, in this example, there are no regular Python files, the setup.cfg does not need
to specify any.
[metadata]
name = binary_example
version = 1.0
With these files, running python -m build --wheel builds a binary wheel in dist
named something like binary_example-1.0-cp39-cp39-linux_x86_64.whl. Details of
the name depend on the platform, the architecture, and the Python version.
After installing this wheel, it can be used as follows.
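For example, assuming the illustrative get_answer function sketched earlier:

$ pip install dist/binary_example-1.0-cp39-cp39-linux_x86_64.whl
$ python -c "import binary_module; print(binary_module.get_answer())"
42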
2.5 manylinux Wheels
A binary wheel is not a pure Python wheel because at least one of its files contains native
code. On a Linux system, this native code is a shared library, a file with the .so suffix for a
shared object.
This shared library links against other libraries. For a library designed to wrap a
specific native library, as with pygtk wrapping gtk, it links with the wrapped library.
In almost all cases, whether it is designed to wrap a specific library or not, it links
against the standard C library. This is the library that has C functions like printf. Few
things can be done in native code without linking against it.
On most modern Linux systems, this linking is usually dynamic. This means that the
binary wheel does not contain the library it is linked with; it expects to load it at runtime.
If a wheel is built on a different system than the one it is installed on, a library that
is binary compatible with the one it is linked with has to be installed on the system. If a
binary compatible library is not installed, this leads to a failure at import time.
2.5.1 Self-Contained Wheels
The auditwheel tool takes binary wheels and patches them to make them more
portable. One of its functions is to grab the pieces from dynamic libraries and put them
in the wheel. This allows the wheels to be installed without requiring a different library.
For auditwheel to work correctly, the patchelf utility needs to be installed. Older
versions might produce wheels that break in strange ways. The safest way to have the
right version of patchelf is to download the latest source distribution and build it.
To make a self-contained wheel, first do a regular build. This might require
careful reading of the instructions for building the package from source. This results
in the regular binary wheel in dist/. This was the case in the example before with the
binary_example module.
After this is done, run
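A sketch, using a platform tag (discussed next) as an example; by default, the repaired wheel is written to the wheelhouse/ directory.

$ auditwheel repair --plat manylinux2014_x86_64 \
      dist/binary_example-1.0-cp39-cp39-linux_x86_64.whl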
2.5.2 Portable Wheels
The --plat flag in auditwheel is the platform tag. If it is linux_<cpu architecture>, the
wheel makes no guarantees about which GNU C Library it is compatible with.
Wheels like that should only be installed on a compatible Linux system. To avoid
mistakes, most Python package index systems, including PyPI, do not let these wheels be
uploaded.
Uploadable Python wheels must be tagged with a proper platform tag, which
shows which versions of the GNU C Library they are compatible with. Historically,
those tags relied on the CentOS release year: manylinux1 corresponded to CentOS 5,
manylinux2010 corresponded to CentOS 6, and manylinux2014 corresponded to
CentOS 7.
At the time of writing, manylinux_2_24 and manylinux_2_27 are the only post–
CentOS 7 versions. These correspond to Debian 9 and Ubuntu 18.04, respectively.
After deciding on the oldest supported platform tag, build on the newest system
that supports it. For example, if no deployment target uses GNU C Library < 2.24,
build the wheel on Debian 9. Especially for binary wheels with complicated build
dependencies, a newer system makes it easier to follow the documentation and reduces
the chances of running into unexpected issues.
2.5.3 manylinux Containers
Making sure that the patchelf tool is correctly installed and Python is built with the
correct version of the C library is a subtle and error-prone process. One way to avoid this
is to use the official manylinux container images.
These container images are available at quay.io/pypa/manylinux_<version>. There
are versions available for manylinux_2_24, manylinux2014, manylinux2010, and
manylinux1. These images contain all officially supported versions of Python and the
rest of the necessary tooling.
Note that specialized build dependencies need to be installed on those systems; for
example, when using manylinux_2_24 (a Debian-based container).
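For example, a hypothetical native dependency could be installed with apt-get inside the container; the image tag and the libffi-dev package are illustrative.

$ docker run --rm -it -v "$(pwd)":/src quay.io/pypa/manylinux_2_24_x86_64 bash
# apt-get update && apt-get install -y libffi-dev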
2.6 tox
tox is a tool to automatically manage virtual environments, usually for tests and builds.
It is used to make sure that those run in well-defined environments and is smart about
caching them to reduce churn. True to its roots as a test-running tool, tox's configuration
is organized around test environments.
tox itself is a PyPI package usually installed in a virtual environment. Because tox
creates ad hoc temporary virtual environments for testing, the virtual environment tox
is installed in can be common to many projects. A common pattern is to create a virtual
environment dedicated to tox.
In tox.ini, each test environment is configured in its own section.

[testenv:some-name]
...
Note that if the name of the environment contains pyNM (for example, py36), then tox
defaults to using CPython, the standard Python implementation, version N.M (3.6, in this
case) as the Python interpreter for that test environment.
tox also supports name-based environment guessing for more esoteric
implementations of Python. For example, PyPy, an implementation of Python in Python,
is supported with the name pypyNM.
If the name does not include one of the supported short names, or if there is a
need to override the default, a basepython field in the section can be used to indicate a
specific Python version. By default, tox looks for Python available in the path. However, if
the plug-in tox-pyenv is installed in the virtual environment that tox itself is installed in,
tox will query pyenv if it cannot find the right Python on the path.
Let’s analyze a few tox configuration files in order of increasing complexity.
2.6.1 One Environment
In this example, there is only one test environment. This test environment uses
Python 3.9.
[tox]
envlist = py39
The tox section is a global configuration. In this example, the only global
configuration is the list of environments.
[testenv]
This section configures the test environment. Since there is only one test
environment, there is no need for a separate configuration.
deps =
    flake8
The deps subsection details which packages should be installed in the virtual test
environment. Here the configuration specifies flake8 with a loose dependency. Another
option is to specify a strict dependency; for example, flake8==1.0.0.
This helps with reproducible test runs. It could also specify -r <requirements
file> and manage them separately. This is useful when there is another tool that takes
the requirements file.
commands =
    flake8 useful
In this case, the only command is to run flake8 in the useful directory. By default,
a tox test run succeeds if all commands return a successful status code. As something
designed to run from command lines, flake8 respects this convention and only exits
with a successful status code if there are no problems detected with the code.
2.6.2 Multiple Environments
In the following examples, the tox configuration runs unit tests against both Python
3.9 and Python 3.8. This is common for libraries that need to support more than one
version.
[tox]
envlist = py39,py38
[testenv]
deps =
    pytest
    hypothesis
    pyhamcrest
commands =
    pytest useful
In this environment, tox is configured to install the pytest runner and two testing
helper libraries. The tox.ini file documents the assumptions on the tools needed to run
the tests.
The command to be run is short. The pytest tool also respects the testing tools
convention and only exits successfully if there are no test failures.
2.6.3 Multiple Differently Configured Environments

In the following example, the envlist generates a matrix of environments from the
listed factors, plus two stand-alone environments.

[tox]
envlist = {py38,py39}-{unit,func},py39-wheel,docs
toxworkdir = {toxinidir}/build/.tox
This becomes more useful the more environments there are. It is also a way
to make too many environments. For example, {py37,py38,py39}-{unit,func}-
{olddeps,newdeps}-{mindeps,maxdeps} creates 3*2*2*2=24 environments, which
takes a toll when running the tests.
The numbers for a matrix test like this climb up fast; using an automated test
environment means things would either take longer or need higher parallelism.
This is a normal trade-off between the comprehensiveness of testing and resource
use. There is no magic solution other than carefully considering how many variations to
officially support.
Instead of having a separate testenv-<name> configuration section per environment,
it is possible to use one section and special-case the environments using matching. This
is a more efficient way to create many similar versions of test environments.
[testenv]
deps =
    {py38,py39}-unit: coverage
    {py38,py39}-{func,unit}: twisted
    {py38,py39}-{func,unit}: ncolony
The coverage tool is only used for unit tests. The Twisted and ncolony libraries are
needed for unit and functional tests.
commands =
    {py38,py39}-unit: python -Wall \
        -Wignore::DeprecationWarning \
        -m coverage \
        run -m twisted.trial \
        --temp-directory build/_trial_temp \
        {posargs:ncolony}
    {py38,py39}-unit: coverage report --include ncolony* \
        --omit */tests/*,*/interfaces*,*/_version* \
        --show-missing --fail-under=100
    {py38,py39}-func: python -Werror -W ignore::DeprecationWarning \
        -W ignore::ImportWarning \
        -m ncolony tests.functional_test
Configuring one big test environment means all the commands are in one bag and
selected based on patterns. This is also a more realistic test run command, including
warnings configuration, coverage, and arguments to the test runner.
While the exact complications vary, there are almost always enough things that lead
the commands to grow to a decent size.
The following environment is different enough that it makes sense to break it out into
its own section.
[testenv:py39-wheel]
skip_install = True
deps =
    build
commands =
    python -c 'import os, sys; os.makedirs(sys.argv[1])' {envtmpdir}/dist
    python -m build --outdir {envtmpdir}/dist --no-isolation
The py39-wheel section ensures that the wheel can be built. A more sophisticated
configuration might install the wheel and run the unit tests.
Finally, the docs section builds the documentation. This helps avoid syntax errors
resulting in the documentation failing to build.
[testenv:docs]
changedir = docs
deps =
    sphinx
commands =
    sphinx-build -W -b html -d {envtmpdir}/doctrees . {envtmpdir}/html
basepython = python3.9
For it to run, it needs a docs subdirectory with a conf.py and, depending on
the contents of the configuration file, more files. Note that basepython must be explicitly
declared in this case since the Python version is not part of the environment's name.
The documentation build is one of the reasons why tox shines. It only installs sphinx
in the virtual environment for building documentation. This means that an undeclared
dependency on sphinx would make the unit tests fail since sphinx is not installed there.
2.7 Pip Tools
The pip-tools PyPI package contains a dedicated command to freeze dependencies.
The pip-compile command takes a loose requirements file as an input and produces
one with strict requirements.
The usual names for the files are requirements.in for the input and
requirements.txt for the output. Sometimes, when there are a few related variations of
dependencies, the files are called requirements-<purpose>.in and
requirements-<purpose>.txt, respectively.
There are two common purposes: the main runtime dependencies and the test
dependencies.
More commonly, the loose requirements are already in a setup.cfg for the local
code that uses the libraries. In those cases, pip-compile can take this as input directly.
For example, a setup.cfg file for a web application might have a dependency on
gunicorn and a test dependency on pytest.
[options]
install_requires =
    gunicorn

[options.extras_require]
test =
    pytest
Alongside it, a minimal setup.py delegates everything to the declarative
configuration.

import setuptools

setuptools.setup()
It is usually best to also have a pyproject.toml, though it can be empty. Even though
pip-compile does not depend on it, it does help with other parts of the workflow.
In such a case, pip-compile uses the package metadata automatically.
The output, requirements.txt, contains quite a few comment lines. The only
non-comment line is the pinned gunicorn dependency. Note that the pinned version can
differ when running pip-compile again in the future, after a new version has been
released.
It is also possible to generate the requirements-test.txt file by running
pip-compile with the --extra argument.
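A sketch of both invocations; the output file name for the extra is conventional rather than required.

$ pip-compile
$ pip-compile --extra test --output-file requirements-test.txt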
This time, the pytest dependency generated more dependencies. All dependencies
are pinned.
Note that while pip-tools does try to replicate the algorithm pip uses, there are
some edge cases in which the resolution differs, or one of the tools fails to find a
resolution. One way to improve things is to add specificity to some of the dependencies,
often as >= constraints.
Note that even this relatively simple (hypothetical) Python program with two
direct dependencies had nine total dependencies. This is typical; frozen, complete
dependencies often number in the tens for simple programs and hundreds for many
code bases.
2.8 Poetry
Poetry is a package and dependency management system. It gives one tool which
handles the entire Python development process: managing dependencies, creating
virtual environments, building and publishing packages, and installing Python
applications.
2.8.1 Installing
There are several ways to install Poetry. One is by using a get-poetry.py script, which
uses Python to install Poetry locally.
This can be done by piping it straight into Python.
$ curl -sSL \
    https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py \
    | python -
It is also possible to download the script with curl -o get-poetry.py ... and
then run it.
In some circumstances, it might make sense to install Poetry into a dedicated virtual
environment, using pip install poetry in the virtual environment. One advantage of
the pip-based installation method is that it works with a local Python package index.
This is sometimes useful for compliance or security reasons.
Regardless of how it was installed, poetry self update updates Poetry to the latest
version. It is also possible to update to a specific version by passing it as a parameter to
the update command.
For shell completions, poetry completions <shell name> outputs shellcode for
completions compatible with the given shell. This can be loaded globally or per user as
appropriate for the relevant shell. Among the shells supported are bash, zsh, and fish.
2.8.2 Creating
Usually, the best way to start with Poetry is on a fresh project. It can create a skeleton for
a Poetry-based project.
$ poetry new simple_app
$ git init .
$ git add .
$ git commit -a -m 'Output of "poetry new"'
The most important file is pyproject.toml at the root of the directory. It contains the
build-system section.
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
This makes it compatible with python -m build for building wheels and source
distributions.
The pyproject.toml file also contains some Poetry-specific sections, all marked
by having tool.poetry as a prefix. The main section, tool.poetry, contains package
metadata.
[tool.poetry]
name = <name>
version = <version>
description = <description>
authors = ["<author name and e-mail>", ...]
The version field can be edited manually, but it is better to use poetry version
<bump rule> or poetry version <version> to modify the versions.
After bumping, the relevant lines in pyproject.toml look like the following.

version = "1.2.3"
description = ""
• tests/__init__.py is a file that does not contain tests. Making the tests
directory into a package is required for the way Poetry runs pytest.
2.8.3 Dependencies
Assume that the goal of simple_app is to be a Pyramid-based web application that runs
with gunicorn. The first step is to add those dependencies to Poetry. The poetry add
subcommand adds dependencies.
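$ poetry add pyramid gunicorn

This records the dependencies in pyproject.toml.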
[tool.poetry.dependencies]
python = "^3.8"
pyramid = "^2.0"
gunicorn = "^20.1.0"
By default, Poetry assumes that the dependencies are semantically versioned. This
means that potential security fixes can be missed if they are not backported to previous
versions. Most Python packages do not backport fixes, so this is something to be
careful with.
This command also creates the poetry.lock file, which has recursively-complete
pinned dependencies. These are the dependencies used by Poetry. The poetry lock
command updates the pinned dependencies.
2.8.4 Developing
Even a minimally functional Pyramid app requires a little more code.
# simple_app/web.py
from pyramid import config, response

def root(request):
    return response.Response("Useful string")

with config.Configurator() as cfg:
    cfg.add_route("root", "/")
    cfg.add_view(root, route_name='root')

application = cfg.make_wsgi_app()
Since the pyramid and gunicorn dependencies are already in Poetry, it can directly
run the code. There is no need to explicitly create a virtual environment.
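A sketch of running it directly through Poetry, using gunicorn's module:variable syntax:

$ poetry run gunicorn simple_app.web:application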
2.8.5 Building
The poetry build command generates a source distribution and a wheel under dist.
Alternatively, python -m build does the same.
Other documents randomly have
different content
MUTTON, Boiled Neck of.
Ingredients.—4 lbs. of the middle, or best end of the neck of
mutton; a little salt. Mode.—Trim off a portion of the fat, should
there be too much, and if it is to look particularly nice, the chine-
bone should be sawn down, the ribs stripped half-way down, and
the ends of the bones chopped off; this is, however, not necessary.
Put the meat into sufficient boiling water to cover it; when it boils,
add a little salt and remove all the scum. Draw the saucepan to the
side of the fire, and let the water get so cool that the finger may be
borne in it; then simmer very slowly and gently until the meat is
done, which will be in about 1½ hour, or rather more, reckoning
from the time that it begins to simmer. Serve with turnips and caper
sauce, and pour a little of it over the meat. The turnips should be
boiled with the mutton; and when at hand, a few carrots will also be
found an improvement. These, however, if very large and thick, must
be cut into long thinnish pieces, or they will not be sufficiently done
by the time the mutton is ready. Garnish the dish with carrots and
turnips, placed alternately round the mutton. Time.—4 lbs. of the
neck of mutton, about 1½ hour. Average cost, 8½d. per lb.
Sufficient for 6 or 7 persons. Seasonable at any time.
NECK OF MUTTON.
1-2. Best end. 2-3.
Scrag.
MUTTON PIE.
[Cold Meat Cookery.] Ingredients.—The remains of a cold leg, loin,
or neck of mutton, pepper and salt to taste, 2 blades of pounded
mace, 1 dessertspoonful of chopped parsley, 1 teaspoonful of
minced savoury herbs; when liked, a little minced onion or shalot; 3
or 4 potatoes, 1 teacupful of gravy; crust. Mode.—Cold mutton may
be made into very good pies if well seasoned and mixed with a few
herbs; if the leg is used, cut it into very thin slices; if the loin or
neck, into thin cutlets. Place some at the bottom of the dish; season
well with pepper, salt, mace, parsley, and herbs; then put a layer of
potatoes sliced, then more mutton, and so on till the dish is full; add
the gravy, cover with a crust, and bake for 1 hour. Time.—1 hour.
Seasonable at any time.
Note.—The remains of an underdone leg of mutton may be
converted into a very good family pudding, by cutting the meat into
slices, and putting them into a basin lined with a suet crust. It
should be seasoned well with pepper, salt, and minced shalot,
covered with a crust, and boiled for about three hours.
MUTTON PIE.
Ingredients.—2 lbs. of the neck or loin of mutton, weighed after
being boned; 2 kidneys, pepper and salt to taste, 2 teacupfuls of
gravy or water, 2 tablespoonfuls of minced parsley; when liked, a
little minced onion or shalot; puff crust. Mode.—Bone the mutton,
and cut the meat into steaks all of the same thickness, and leave but
very little fat. Cut up the kidneys, and arrange these with the meat
neatly in a pie-dish; sprinkle over them the minced parsley and a
seasoning of pepper and salt; pour in the gravy, and cover with a
tolerably good puff crust. Bake for 1½ hour, or rather longer, should
the pie be very large, and let the oven be rather brisk. A well-made
suet crust may be used instead of puff crust, and will be found
exceedingly good. Time.—1½ hour, or rather longer. Average cost,
2s. Sufficient for 5 or 6 persons. Seasonable at any time.
MUTTON PUDDING.
Ingredients.—About 2 lbs. of the chump end of the loin of
mutton, weighed after being boned; pepper and salt to taste, suet
crust made with milk, in the proportion of 6 oz. of suet to each
pound of flour; a very small quantity of minced onion (this may be
omitted when the flavour is not liked). Mode.—Cut the meat into
rather thin slices, and season them with pepper and salt; line the
pudding-dish with crust; lay in the meat, and nearly, but do not
quite, fill it up with water; when the flavour is liked, add a small
quantity of minced onion; cover with crust, and proceed in the same
manner as directed in recipe for rump steak and kidney pudding.
Time.—About 3 hours. Average cost, 1s. 9d. Sufficient for 6 persons.
Seasonable all the year, but more suitable in winter.
SHOULDER OF MUTTON.
NECTARINES, Preserved.
Ingredients.—To every lb. of sugar allow ¼ pint of water;
nectarines. Mode.—Divide the nectarines in two, take out the stones,
and make a strong syrup with sugar and water in the above
proportion. Put in the nectarines, and boil them until they have
thoroughly imbibed the sugar. Keep the fruit as whole as possible,
and turn it carefully into a pan. The next day boil it again for a few
minutes, take out the nectarines, put them into jars, boil the syrup
quickly for five minutes, pour it over the fruit, and, when cold, cover
the preserve down. The syrup and preserve must be carefully
skimmed, or it will not be clear. Time.—10 minutes to boil the sugar
and water; 20 minutes to boil the fruit the first time, 10 minutes the
second time; 5 minutes to boil the syrup. Seasonable in August and
September, but cheapest in September.
NECTAR, Welsh.
Ingredients.—1 lb. of raisins, 3 lemons, 2 lbs. of loaf sugar, 2
gallons of boiling water. Mode.—Cut the peel of the lemons very
thin, pour upon it the boiling water, and, when cool, add the strained
juice of the lemons, the sugar, and the raisins, stoned and chopped
very fine. Let it stand 4 or 5 days, stirring it every day; then strain it
through a jelly-bag, and bottle it for present use. Time.—4 or 5
days. Average cost, 1s. 9d. Sufficient to make 2 gallons.
NEGUS, to make.
Ingredients.—To every pint of port wine allow 1 quart of boiling
water, ¼ lb. of sugar, 1 lemon, grated nutmeg to taste. Mode.—As
this beverage is more usually drunk at children’s parties than at any
other, the wine need not be very old or expensive for the purpose, a
new fruity wine answering very well for it. Put the wine into a jug,
rub some lumps of sugar (equal to ¼ lb.) on the lemon-rind until all
the yellow part of the skin is absorbed, then squeeze the juice, and
strain it. Add the sugar and lemon-juice to the port-wine, with the
grated nutmeg; pour over it the boiling water, cover the jug, and,
when the beverage has cooled a little, it will be fit for use. Negus
may also be made of sherry, or any other sweet white wine, but is
more usually made of port than of any other beverage. Sufficient.—
Allow 1 pint of wine, with the other ingredients in proportion, for a
party of 9 or 10 children.
NOVEMBER—BILLS OF FARE.
Dinner for 18 persons.
First Course.
Entrées.
Second Course.
Third Course.
First Course.
Entrées.
Second Course.
Third Course.
Entremets and Removes.
Dessert.
NOYEAU CREAM.
Ingredients.—1½ oz. of isinglass, the juice of 2 lemons, noyeau
and pounded sugar to taste, 1½ pint of cream. Mode.—Dissolve the
isinglass in a little boiling water, add the lemon-juice, and strain this
to the cream, putting in sufficient noyeau and sugar to flavour and
sweeten the mixture nicely; whisk the cream well, put it into an oiled
mould, and set the mould in ice or in a cool place; turn it out, and
garnish the dish to taste. Time.—Altogether, ½ hour. Average cost,
with cream at 1s. per pint and the best isinglass, 4s. Sufficient to fill
a quart mould. Seasonable at any time.
NOYEAU, Home-made.
Ingredients.—2 oz. of bitter almonds, 1 oz. of sweet ditto, 1 lb. of
loaf sugar, the rinds of 3 lemons, 1 quart of Irish whiskey or gin, 1
tablespoonful of clarified honey, ½ pint of new milk. Mode.—Blanch
and pound the almonds, and mix with them the sugar, which should
also be pounded. Boil the milk; let it stand till quite cold; then mix all
the ingredients together, and let them remain for 10 days, shaking
them every day. Filter the mixture through blotting-paper, bottle off
for use in small bottles, and seal the corks down. This will be found
useful for flavouring many sweet dishes. A tablespoonful of the
above noyeau, added to a pint of boiled custard instead of brandy as
given in our recipe for custard, makes an exceedingly agreeable and
delicate flavour. Average cost, 2s. 9d. Sufficient to make about 2½
pints of noyeau. Seasonable.—May be made at any time.
OCTOBER—BILLS OF FARE.
Dinner for 18 persons.
[Bill of fare given as a table in the original: First Course; Entrées; Second Course; Third Course; Dessert and Ices.]
OMELET.
Ingredients.—6 eggs, 1 saltspoonful of salt, ½ saltspoonful of
pepper, ¼ lb. of butter. Mode.—Break the eggs into a basin, omitting
the whites of 3, and beat them up with the salt and pepper until
extremely light; then add 2 oz. of the butter broken into small
pieces, and stir this into the mixture. Put the other 2 oz. of butter
into a frying-pan, make it quite hot, and, as soon as it begins to
bubble, whisk the eggs, &c., very briskly for a minute or two, and
pour them into the pan; stir the omelet with a spoon one way until
the mixture thickens and becomes firm, and when the whole is set,
fold the edges over, so that the omelet assumes an oval form; and
when it is nicely brown on one side, and quite firm, it is done. To
take off the rawness on the upper side, hold the pan before the fire
for a minute or two, and brown it with a salamander or hot shovel.
Serve very expeditiously on a very hot dish, and never cook until it is
just wanted. The flavour of this omelet may be very much enhanced
by adding minced parsley, minced onion or eschalot, or grated
cheese, allowing 1 tablespoonful of the former, and half the quantity
of the latter, to the above proportion of eggs. Shrimps or oysters
may also be added: the latter should be scalded in their liquor, and
then bearded and cut into small pieces. In making an omelet, be
particularly careful that it is not too thin, and, to avoid this, do not
make it in too large a frying-pan, as the mixture would then spread
too much, and taste of the outside. It should also not be greasy,
burnt, or too much done, and should be cooked over a gentle fire,
that the whole of the substance may be heated without drying up
the outside. Omelets are sometimes served with gravy; but this
should never be poured over them, but served in a tureen, as the
liquid causes the omelet to become heavy and flat, instead of eating
light and soft. In making the gravy, the flavour should not overpower
that of the omelet, and should be thickened with arrowroot or rice
flour. Time.—With 6 eggs, in a frying-pan 18 or 20 inches round, 4
to 6 minutes. Average cost, 9d. Sufficient for 4 persons. Seasonable
at any time.
OMELET, Bachelor’s.
Ingredients.—2 or 3 eggs, 2 oz. of butter, teaspoonful of flour, ½
teacupful of milk. Mode.—Make a thin cream of the flour and milk;
then beat up the eggs, mix all together, and add a pinch of salt and
a few grains of cayenne. Melt the butter in a small frying-pan, and,
when very hot, pour in the batter. Let the pan remain for a few
minutes over a clear fire; then sprinkle upon the omelet some
chopped herbs and a few shreds of onion; double the omelet
dexterously, and shake it out of the pan on to a hot dish. A simple
sweet omelet can be made by the same process, substituting sugar
or preserve for the chopped herbs. Time.—2 minutes. Average cost,
6d. Sufficient for 2 persons. Seasonable at any time.
OMELETTE SOUFFLÉ.
Ingredients.—6 eggs, 5 oz. of pounded sugar, flavouring of
vanilla, orange-flower water, or lemon-rind, 3 oz. of butter, 1
dessertspoonful of rice-flour. Mode.—Separate the yolks from the
whites of the eggs, add to the former the sugar, the rice-flour, and
either of the above flavourings that may be preferred, and stir these
ingredients well together. Whip the whites of the eggs, mix them
lightly with the batter, and put the butter into a small frying-pan. As
soon as it begins to bubble, pour the batter into it, and set the pan
over a bright but gentle fire; and when the omelet is set, turn the
edges over to make it an oval shape, and slip it on to a silver dish,
which has been previously well buttered. Put it in the oven, and
bake from 12 to 15 minutes; sprinkle finely-powdered sugar over the
soufflé, and serve it immediately. Time.—About 4 minutes in the
pan; to bake, from 12 to 15 minutes. Average cost, 1s. Sufficient for
3 or 4 persons. Seasonable at any time.
ONION SOUP.
Ingredients.—6 large onions, 2 oz. of butter, salt and pepper to
taste, ½ pint of cream, 1 quart of stock. Mode.—Chop the onions,
put them in the butter, stir them occasionally, but do not let them
brown. When tender, put the stock to them, and season; strain the
soup, and add the boiling cream. Time.—½ hour. Average cost, 1s.
per quart. Seasonable in winter. Sufficient for 4 persons.
ONIONS, Pickled.
Ingredients.—1 gallon of pickling onions, salt and water, milk; to
each ½ gallon of vinegar, 1 oz. of bruised ginger, ¼ tablespoonful of
cayenne, 1 oz. of allspice, 1 oz. of whole black pepper, ¼ oz. of
whole nutmeg bruised, 8 cloves, ¼ oz. of mace. Mode.—Gather the
onions, which should not be too small, when they are quite dry and
ripe; wipe off the dirt, but do not pare them; make a strong solution
of salt and water, into which put the onions, and change this,
morning and night, for 3 days, and save the last brine they were put
in. Then take the outside skin off, and put them into a tin saucepan
capable of holding them all, as they are always better done together.
Now take equal quantities of milk and the last salt and water the
onions were in, and pour this to them; to this add 2 large spoonfuls
of salt, put them over the fire, and watch them very attentively.
Keep constantly turning the onions about with a wooden skimmer,
those at the bottom to the top, and vice versâ; and let the milk and
water run through the holes of the skimmer. Remember, the onions
must never boil, or, if they do, they will be good for nothing; and
they should be quite transparent. Keep the onions stirred for a few
minutes, and, in stirring them, be particular not to break them. Then
have ready a pan with a colander, into which turn the onions to
drain, covering them with a cloth to keep in the steam. Place on a
table an old cloth, 2 or 3 times double; put the onions on it when
quite hot, and over them an old piece of blanket; cover this closely
over them, to keep in the steam. Let them remain till the next day,
when they will be quite cold, and look yellow and shrivelled; take off
the shrivelled skins, when they should be as white as snow. Put
them into a pan, make a pickle of vinegar and the remaining
ingredients, boil all these up, and pour hot over the onions in the
pan. Cover very closely to keep in all the steam, and let them stand
till the following day, when they will be quite cold. Put them into jars
or bottles well bunged, and a tablespoonful of the best olive-oil on
the top of each jar or bottle. Tie them down with bladder, and let
them stand in a cool place for a month or six weeks, when they will
be fit for use. They should be beautifully white, and eat crisp,
without the least softness, and will keep good for many months.
Seasonable from the middle of July to the end of August.