Programming Large Language
Models with Azure Open AI:
Conversational programming and
prompt engineering with LLMs

Francesco Esposito
Programming Large Language Models with Azure Open AI:
Conversational programming and prompt engineering with
LLMs
Published with the authorization of Microsoft Corporation by: Pearson
Education, Inc.

Copyright © 2024 by Francesco Esposito.


All rights reserved. This publication is protected by copyright, and
permission must be obtained from the publisher prior to any prohibited
reproduction, storage in a retrieval system, or transmission in any form or by
any means, electronic, mechanical, photocopying, recording, or likewise. For
information regarding permissions, request forms, and the appropriate
contacts within the Pearson Education Global Rights & Permissions
Department, please visit www.pearson.com/permissions.
No patent liability is assumed with respect to the use of the information
contained herein. Although every precaution has been taken in the
preparation of this book, the publisher and author assume no responsibility
for errors or omissions. Nor is any liability assumed for damages resulting
from the use of the information contained herein.
ISBN-13: 978-0-13-828037-6
ISBN-10: 0-13-828037-1
Library of Congress Control Number: 2024931423

Trademarks
Microsoft and the trademarks listed at http://www.microsoft.com on the
“Trademarks” webpage are trademarks of the Microsoft group of companies.
All other marks are property of their respective owners.

Warning and Disclaimer


Every effort has been made to make this book as complete and as accurate as
possible, but no warranty or fitness is implied. The information provided is
on an “as is” basis. The author, the publisher, and Microsoft Corporation
shall have neither liability nor responsibility to any person or entity with
respect to any loss or damages arising from the information contained in this
book or from the use of the programs accompanying it.

Special Sales
For information about buying this title in bulk quantities, or for special sales
opportunities (which may include electronic versions; custom cover designs;
and content particular to your business, training goals, marketing focus, or
branding interests), please contact our corporate sales department at
[email protected] or (800) 382-3419.
For government sales inquiries, please contact
[email protected].
For questions about sales outside the U.S., please contact
[email protected].

Editor-in-Chief
Brett Bartow

Executive Editor
Loretta Yates

Associate Editor
Shourav Bose

Development Editor
Kate Shoup

Managing Editor
Sandra Schroeder

Senior Project Editor
Tracey Croom

Copy Editor
Dan Foster

Indexer
Timothy Wright

Proofreader
Donna E. Mulder

Technical Editor
Dino Esposito

Editorial Assistant
Cindy Teeters

Cover Designer
Twist Creative, Seattle

Compositor
codeMantra

Graphics
codeMantra

Figure Credits
Figure 4.1: LangChain, Inc
Figures 7.1, 7.2, 7.4: Snowflake, Inc
Figure 8.2: SmartBear Software
Figure 8.3: Postman, Inc
Dedication

A I.
Because not dedicating a book to you would have been a sacrilege.
Contents at a Glance

Introduction

CHAPTER 1 The genesis and an analysis of large language models


CHAPTER 2 Core prompt learning techniques
CHAPTER 3 Engineering advanced learning prompts
CHAPTER 4 Mastering language frameworks
CHAPTER 5 Security, privacy, and accuracy concerns
CHAPTER 6 Building a personal assistant
CHAPTER 7 Chat with your data
CHAPTER 8 Conversational UI

Appendix: Inner functioning of LLMs

Index
Contents

Acknowledgments
Introduction

Chapter 1 The genesis and an analysis of large language models


LLMs at a glance
History of LLMs
Functioning basics
Business use cases
Facts of conversational programming
The emerging power of natural language
LLM topology
Future perspective
Summary

Chapter 2 Core prompt learning techniques


What is prompt engineering?
Prompts at a glance
Alternative ways to alter output
Setting up for code execution
Basic techniques
Zero-shot scenarios
Few-shot scenarios
Chain-of-thought scenarios
Fundamental use cases
Chatbots
Translating
LLM limitations
Summary

Chapter 3 Engineering advanced learning prompts


What’s beyond prompt engineering?
Combining pieces
Fine-tuning
Function calling
Homemade-style
OpenAI-style
Talking to (separated) data
Connecting data to LLMs
Embeddings
Vector store
Retrieval augmented generation
Summary

Chapter 4 Mastering language frameworks


The need for an orchestrator
Cross-framework concepts
Points to consider
LangChain
Models, prompt templates, and chains
Agents
Data connection
Microsoft Semantic Kernel
Plug-ins
Data and planners
Microsoft Guidance
Configuration
Main features
Summary

Chapter 5 Security, privacy, and accuracy concerns


Overview
Responsible AI
Red teaming
Abuse and content filtering
Hallucination and performances
Bias and fairness
Security and privacy
Security
Privacy
Evaluation and content filtering
Evaluation
Content filtering
Summary

Chapter 6 Building a personal assistant


Overview of the chatbot web application
Scope
Tech stack
The project
Setting up the LLM
Setting up the project
Integrating the LLM
Possible extensions
Summary

Chapter 7 Chat with your data


Overview
Scope
Tech stack
What is Streamlit?
A brief introduction to Streamlit
Main UI features
Pros and cons in production
The project
Setting up the project and base UI
Data preparation
LLM integration
Progressing further
Retrieval augmented generation versus fine-tuning
Possible extensions
Summary

Chapter 8 Conversational UI
Overview
Scope
Tech stack
The project
Minimal API setup
OpenAPI
LLM integration
Possible extensions
Summary

Appendix: Inner functioning of LLMs

Index
Acknowledgments

In the spring of 2023, when I told my dad how cool Azure OpenAI was
becoming, his reply was kind of a shock: “Why don’t you write a book about
it?” He said it so naturally that it hit me as if he really thought I could do it.
In fact, he added, “Are you up for it?” Then there was no need to say more.
Loretta Yates at Microsoft Press enthusiastically accepted my proposal, and
the story of this book began in June 2023.
AI has been a hot topic for the better part of a decade, but the emergence
of new-generation large language models (LLMs) has propelled it into the
mainstream. The increasing number of people using them translates to more
ideas, more opportunities, and new developments. And this makes all the
difference.
Hence, the book you hold in your hands can’t be the ultimate and
definitive guide to AI and LLMs because the speed at which AI and LLMs
evolve is impressive and because—by design—every book is an act of
approximation, a snapshot of knowledge taken at a specific moment in time.
Approximation inevitably leads to some form of dissatisfaction, and
dissatisfaction leads us to take on new challenges. In this regard, I wish for
myself decades of dissatisfaction. And a few more years of being on the stage
presenting books written for a prestigious publisher—it does wonders for my
ego.
First, I feel somewhat indebted to all my first dates since May because
they had to endure monologues lasting at least 30 minutes on LLMs and
some weird new approach to transformers.
True thanks are a private matter, but publicly I want to thank Martina first,
who cowrote the appendix with me and always knows what to say to make
me better. My gratitude to her is keeping a promise she knows. Thank you,
Martina, for being an extraordinary human being.
To Gianfranco, who taught me the importance of discussing and
expressing, even loudly, when something doesn’t please us, and taught me to
always ask, because the worst thing that can happen is hearing a no. Every
time I engage in a discussion, I will think of you.
I also want to thank Matteo, Luciano, Gabriele, Filippo, Daniele,
Riccardo, Marco, Jacopo, Simone, Francesco, and Alessia, who worked with
me and supported me during my (hopefully not too frequent) crises. I also
have warm thoughts for Alessandro, Antonino, Sara, Andrea, and Cristian
who tolerated me whenever we weren’t like 25-year-old youngsters because I
had to study and work on this book.
To Mom and Michela, who put up with me before the book and probably
will continue after. To my grandmas. To Giorgio, Gaetano, Vito, and Roberto
for helping me to grow every day. To Elio, who taught me how to dress and
see myself in more colors.
As for my dad, Dino, he never stops teaching me new things—for
example, how to get paid for doing things you would just love to do, like
being the technical editor of this book. Thank you, both as a father and as an
editor. You bring to my mind a song you well know: “Figlio, figlio, figlio.”
Beyond Loretta, if this book came to life, it was also because of the hard
work of Shourav, Kate, and Dan. Thank you for your patience and for
trusting me so much.
This book is my best until the next one!
Introduction

This is my third book on artificial intelligence (AI), and the first I wrote on
my own, without the collaboration of a coauthor. The sequence in which my
three books have been published reflects my own learning path, motivated by
a genuine thirst to understand AI for far more than mere business
considerations. The first book, published in 2020, introduced the
mathematical concepts behind machine learning (ML) that make it possible to
classify data and make timely predictions. The second book, which focused
on the Microsoft ML.NET framework, was about concrete applications—in
other words, how to make fancy algorithms work effectively on amounts of
data hiding their complexity behind the charts and tables of a familiar web
front end.
Then came ChatGPT.
The technology behind astonishing applications like ChatGPT is called a
large language model (LLM), and LLMs are the subject of this third book.
LLMs add a crucial capability to AI: the ability to generate content in
addition to classifying and predicting. LLMs represent a paradigm shift,
raising the bar of communication between humans and computers and
opening the floodgates to new applications that for decades we could only
dream of.
And for decades, we did dream of these applications. Literature and
movies presented various supercomputers capable of crunching any sort of
data to produce human-intelligible results. An extremely popular example
was HAL 9000—the computer that governed the spaceship Discovery in the
movie 2001: A Space Odyssey (1968). Another famous one was JARVIS
(Just A Rather Very Intelligent System), the computer that served as Tony
Stark’s home assistant in Iron Man and other movies in the Marvel Comics
universe.
Often, all that the human characters in such books and movies do is
simply “load data into the machine,” whether in the form of paper
documents, digital files, or media content. Next, the machine autonomously
figures out the content, learns from it, and communicates back to humans
using natural language. But of course, those supercomputers were conceived
by authors; they were only science fiction. Today, with LLMs, it is possible
to devise and build concrete applications that not only make human–
computer interaction smooth and natural, but also turn the old dream of
simply “loading data into the machine” into a dazzling reality.
This book shows you how to build software applications using the same
type of engine that fuels ChatGPT to autonomously communicate with users
and orchestrate business tasks driven by plain textual prompts. No more, no
less—and as easy and striking as it sounds!

Who should read this book


Software architects, lead developers, and individuals with a background in
programming—particularly those familiar with languages like Python and
possibly C# (for ASP.NET Core)—will find the content in this book
accessible and valuable. In the vast realm of software professionals who
might find the book useful, I’d call out those who have an interest in ML,
especially in the context of LLMs. I’d also list cloud and IT professionals
with an interest in using cloud services (specifically Microsoft Azure) or in
sophisticated, real-world applications of human-like language in software.
While this book focuses primarily on the services available on the Microsoft
Azure platform, the concepts covered are easily applicable to analogous
platforms. At the end of the day, using an LLM involves little more than
calling a bunch of API endpoints, and, by design, APIs are completely
independent of the underlying platform.
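To make that claim concrete, here is a minimal sketch of such an endpoint call. It assumes the openai Python package (version 1.x) and placeholder values for the endpoint, API key, and deployment name; the same pattern applies, with minor changes, to analogous platforms.

from openai import AzureOpenAI

# Placeholder credentials and deployment name; replace with your own resource values.
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-api-key>",
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="<your-deployment-name>",  # the Azure OpenAI deployment name, not the raw model name
    messages=[{"role": "user", "content": "Summarize the benefits of LLMs in one sentence."}],
)
print(response.choices[0].message.content)
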
In summary, this book caters to a diverse audience, including
programmers, ML enthusiasts, cloud-computing professionals, and those
interested in natural language processing, with a specific emphasis on
leveraging Azure services to program LLMs.
Assumptions

To fully grasp the value of a programming book on LLMs, there are a couple
of prerequisites, including proficiency in foundational programming concepts
and a familiarity with ML fundamentals. Beyond these, a working knowledge
of relevant programming languages and frameworks, such as Python and
possibly ASP.NET Core, is helpful, as is an appreciation for the significance
of classic natural language processing in the context of business domains.
Overall, a blend of programming expertise, ML awareness, and linguistic
understanding is recommended for a comprehensive grasp of the book’s
content.

This book might not be for you if…

This book might not be for you if you’re just seeking a reference book to find
out in detail how to use a particular pattern or framework. Although the book
discusses advanced aspects of popular frameworks (for example, LangChain
and Semantic Kernel) and APIs (such as OpenAI and Azure OpenAI), it does
not qualify as a programming reference on any of these. The focus of the
book is on using LLMs to build useful applications in the business domains
where LLMs really fit well.

Organization of this book

This book explores the practical application of existing LLMs in developing versatile business domain applications. In essence, an LLM is an ML model
trained on extensive text data, enabling it to comprehend and generate
human-like language. To convey knowledge about these models, this book
focuses on three key aspects:
The first three chapters delve into scenarios for which an LLM is
effective and introduce essential tools for crafting sophisticated
solutions. These chapters provide insights into conversational
programming and prompting as a new, advanced, yet structured,
approach to coding.
The next two chapters emphasize patterns, frameworks, and techniques
for unlocking the potential of conversational programming. This
involves using natural language in code to define workflows, with the
LLM-based application orchestrating existing APIs.
The final three chapters present concrete, end-to-end demo examples
featuring Python and ASP.NET Core. These demos showcase
progressively advanced interactions between logic, data, and existing
business processes. In the first demo, you learn how to take text from an
email and craft a fitting draft for a reply. In the second demo, you apply
a retrieval augmented generation (RAG) pattern to formulate responses
to questions based on document content. Finally, in the third demo, you
learn how to build a hotel booking application with a chatbot that uses a
conversational interface to ascertain the user’s needs (dates, room
preferences, budget) and seamlessly places (or denies) reservations
according to the underlying system’s state, without using fixed user
interface elements or formatted data input controls.

Downloads: notebooks and samples


Python and Polyglot notebooks containing the code featured in the initial part
of the book, as well as the complete codebases for the examples tackled in the
latter part of the book, can be accessed on GitHub at:
https://github.com/Youbiquitous/programming-llm

Errata, updates, & book support


We’ve made every effort to ensure the accuracy of this book and its
companion content. You can access updates to this book—in the form of a
list of submitted errata and their related corrections—at:
MicrosoftPressStore.com/LLMAzureAI/errata
If you discover an error that is not already listed, please submit it to us at
the same page.
For additional book support and information, please visit
MicrosoftPressStore.com/Support.
Please note that product support for Microsoft software and hardware is
not offered through the previous addresses. For help with Microsoft software
or hardware, go to http://support.microsoft.com.

Stay in touch
Let’s keep the conversation going! We’re on X / Twitter:
http://twitter.com/MicrosoftPress.
Chapter 1

The genesis and an analysis of large language models

Luring someone into reading a book is never a small feat. If it’s a novel, you
must convince them that it’s a beautiful story, and if it’s a technical book,
you must assure them that they’ll learn something. In this case, we’ll try to
learn something.
Over the past two years, generative AI has become a prominent buzzword.
It refers to a field of artificial intelligence (AI) focused on creating systems
that can generate new, original content autonomously. Large language
models (LLMs) like GPT-3 and GPT-4 are notable examples of generative
AI, capable of producing human-like text based on given input.
The rapid adoption of LLMs is leading to a paradigm shift in
programming. This chapter discusses this shift, the reasons for it, and its
prospects. Its prospects include conversational programming, in which you
explain with words—rather than with code—what you want to achieve. This
type of programming will likely become very prevalent in the future.
No promises, though. As you’ll soon see, explaining with words what you
want to achieve is often as difficult as writing code.
This chapter covers topics that didn’t find a place elsewhere in this book.
It’s not necessary to read every section or follow a strict order. Take and read
what you find necessary or interesting. I expect you will come back to read
certain parts of this chapter after you finish the last one.
LLMs at a glance

To navigate the realm of LLMs as a developer or manager, it’s essential to comprehend the origins of generative AI and to discern its distinctions from
predictive AI. This chapter has one key goal: to provide insights into the
training and business relevance of LLMs, reserving the intricate mathematical
details for the appendix.
Our journey will span from the historical roots of AI to the fundamentals
of LLMs, including their training, inference, and the emergence of
multimodal models. Delving into the business landscape, we’ll also spotlight
current popular use cases of generative AI and textual models.
This introduction doesn’t aim to cover every detail. Rather, it intends to
equip you with sufficient information to address and cover any potential gaps
in knowledge, while working toward demystifying the intricacies surrounding
the evolution and implementation of LLMs.

History of LLMs
The evolution of LLMs intersects with both the history of conventional AI
(often referred to as predictive AI) and the domain of natural language
processing (NLP). NLP encompasses natural language understanding (NLU),
which attempts to reduce human speech into a structured ontology, and
natural language generation (NLG), which aims to produce text that is
understandable by humans.
LLMs are a subtype of generative AI focused on producing text based on
some kind of input, usually in the form of written text (referred to as a
prompt) but now expanding to multimodal inputs, including images, video,
and audio. At a glance, most LLMs can be seen as a very advanced form of
autocomplete, as they generate the next word. Although they specifically
generate text, LLMs do so in a manner that simulates human reasoning,
enabling them to perform a variety of intricate tasks. These tasks include
sentiment analysis, summarization, translation, entity and intent recognition,
structured information extraction, document generation, and so on.
LLMs represent a natural extension of the age-old human aspiration to
construct automatons (ancestors to contemporary robots) and imbue them
with a degree of reasoning and language. They can be seen as a brain for such
automatons, able to respond to an external input.

AI beginnings
Modern software—and AI as a vibrant part of it—represents the culmination
of an embryonic vision that has traversed the minds of great thinkers since
the 17th century. Various mathematicians, philosophers, and scientists, in
diverse ways and at varying levels of abstraction, envisioned a universal
language capable of mechanizing the acquisition and sharing of knowledge.
Gottfried Leibniz (1646–1716), in particular, contemplated the idea that at
least a portion of human reasoning could be mechanized.
The modern conceptualization of intelligent machinery took shape in the
mid-20th century, courtesy of renowned mathematicians Alan Turing and
Alonzo Church. Turing’s exploration of “intelligent machinery” in 1947,
coupled with his groundbreaking 1950 paper, “Computing Machinery and
Intelligence,” laid the cornerstone for the Turing test—a pivotal concept in
AI. This test challenged machines to exhibit human behavior
(indistinguishable by a human judge), ushering in the era of AI as a scientific
discipline.

Note
Considering recent advancements, a reevaluation of the original
Turing test may be warranted to incorporate a more precise
definition of human and rational behavior.

NLP
NLP is an interdisciplinary field within AI that aims to bridge the interaction
between computers and human language. While historically rooted in
linguistic approaches, distinguishing itself from the contemporary sense of
AI, NLP has perennially been a branch of AI in a broader sense. In fact, the
overarching goal has consistently been to artificially replicate an expression
of human intelligence—specifically, language.
The primary goal of NLP is to enable machines to understand, interpret,
and generate human-like language in a way that is both meaningful and
contextually relevant. This interdisciplinary field draws from linguistics,
computer science, and cognitive psychology to develop algorithms and
models that facilitate seamless interaction between humans and machines
through natural language.
The history of NLP spans several decades, evolving from rule-based
systems in the early stages to contemporary deep-learning approaches,
marking significant strides in the understanding and processing of human
language by computers.
Originating in the 1950s, early efforts, such as the Georgetown-IBM
experiment in 1954, aimed at machine translation from Russian to English,
laying the foundation for NLP. However, these initial endeavors were
primarily linguistic in nature. Subsequent decades witnessed the influence of
Chomskyan linguistics, shaping the field’s focus on syntactic and
grammatical structures.
The 1980s brought a shift toward statistical methods, like n-grams, using
co-occurrence frequencies of words to make predictions. An example was
IBM’s Candide system for statistical machine translation. However, rule-based
approaches struggled with the complexity of natural language. The 1990s saw
a resurgence of statistical approaches and the advent of machine learning
(ML) techniques such as hidden Markov models (HMMs) and statistical
language models. The introduction of the Penn Treebank, a 7-million word
dataset of part-of-speech tagged text, and statistical machine translation
systems marked significant milestones during this period.
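As a toy illustration of the statistical idea (an example of mine, not from the book), the following sketch builds a bigram model from co-occurrence counts and uses it to estimate the probability of the next word:

from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate".split()

# Count how often each word is followed by each other word (bigram co-occurrences).
bigrams = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    bigrams[w1][w2] += 1

def next_word_distribution(word):
    counts = bigrams[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}  # P(next word | word)

print(next_word_distribution("the"))  # {'cat': 0.666..., 'mat': 0.333...}
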
In the 2000s, the rise of data-driven approaches and the availability of
extensive textual data on the internet rejuvenated the field. Probabilistic
models, including maximum-entropy models and conditional random fields,
gained prominence. Begun in the 1980s but finalized years later, the
development of WordNet, a lexical-semantic database of English (with its
groups of synonyms, or synsets, and their relations), contributed to a
deeper understanding of word semantics.
The landscape transformed in the 2010s with the emergence of deep
learning made possible by a new generation of graphics processing units
(GPUs) and increased computing power. Neural network architectures—
particularly transformers like Bidirectional Encoder Representations from
Transformers (BERT) and Generative Pretrained Transformer (GPT)—
revolutionized NLP by capturing intricate language patterns and contextual
information. The focus shifted to data-driven and pretrained language
models, allowing for fine-tuning of specific tasks.

Predictive AI versus generative AI


Predictive AI and generative AI represent two distinct paradigms, each
deeply entwined with advancements in neural networks and deep-learning
architectures.
Predictive AI, often associated with supervised learning, traces its roots
back to classical ML approaches that emerged in the mid-20th century. Early
models, such as perceptrons, paved the way for the resurgence of neural
networks in the 1980s. However, it wasn’t until the advent of deep learning in
the 21st century—with the development of deep neural networks,
convolutional neural networks (CNNs) for image recognition, and recurrent
neural networks (RNNs) for sequential data—that predictive AI witnessed a
transformative resurgence. The introduction of long short-term memory
(LSTM) units enabled more effective modeling of sequential dependencies in
data.
Generative AI, on the other hand, has seen remarkable progress, propelled
by advancements in unsupervised learning and sophisticated neural network
architectures (the same used for predictive AI). The concept of generative
models dates to the 1990s, but the breakthrough came with the introduction
of generative adversarial networks (GANs) in 2014, showcasing the power of
adversarial training. GANs, which feature a generator for creating data and a
discriminator to distinguish between real and generated data, play a pivotal
role. The discriminator, discerning the authenticity of the generated data
during the training, contributes to the refinement of the generator, fostering
continuous enhancement in generating more realistic data, spanning from
lifelike images to coherent text.
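A compressed sketch of that adversarial loop follows, assuming PyTorch and a synthetic two-dimensional “real” distribution purely for illustration; real GANs use much larger networks and datasets.

import torch
import torch.nn as nn

# Generator maps random noise to fake samples; discriminator scores how "real" a sample looks.
G = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(64, 2) * 0.5 + 2.0  # stand-in for real training data

for step in range(200):
    # 1) Train the discriminator to separate real samples from generated ones.
    fake = G(torch.randn(64, 16)).detach()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()
    # 2) Train the generator to produce samples the discriminator labels as real.
    fake = G(torch.randn(64, 16))
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
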
Table 1-1 provides a recap of the main types of learning processes.

TABLE 1-1 Main types of learning processes

Type: Supervised
Definition: Trained on labeled data where each input has a corresponding label
Training: Adjusts parameters to minimize the prediction error
Use cases: Classification, regression

Type: Self-supervised
Definition: Unsupervised learning where the model generates its own labels
Training: Learns to fill in the blank (predict parts of input data from other parts)
Use cases: NLP, computer vision

Type: Semi-supervised
Definition: Combines labeled and unlabeled data for training
Training: Uses labeled data for supervised tasks, unlabeled data for generalizations
Use cases: Scenarios with limited labeled data—for example, image classification

Type: Unsupervised
Definition: Trained on data without explicit supervision
Training: Identifies inherent structures or relationships in the data
Use cases: Clustering, dimensionality reduction, generative modeling

The historical trajectory of predictive and generative AI underscores the symbiotic relationship with neural networks and deep learning. Predictive AI
leverages deep-learning architectures like CNNs for image processing and
RNNs/LSTMs for sequential data, achieving state-of-the-art results in tasks
ranging from image recognition to natural language understanding.
Generative AI, fueled by the capabilities of GANs and large-scale language
models, showcases the creative potential of neural networks in generating
novel content.

LLMs
An LLM, exemplified by OpenAI’s GPT series, is a generative AI system
built on advanced deep-learning architectures like the transformer (more on
this in the appendix).
These models operate on the principle of unsupervised and self-supervised
learning, training on vast text corpora to comprehend and generate coherent
and contextually relevant text. They output sequences of text (that can be in
the form of proper text but also can be protein structures, code, SVG, JSON,
XML, and so on), demonstrating a remarkable ability to continue and expand
on given prompts in a manner that emulates human language.
The architecture of these models, particularly the transformer architecture,
enables them to capture long-range dependencies and intricate patterns in
data. The concept of word embeddings, a crucial precursor, represents words
as continuous vectors (Mikolov et al. in 2013 through Word2Vec),
contributing to the model’s understanding of semantic relationships between
words. Word embeddings form the first “layer” of an LLM.
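The idea of words as continuous vectors can be demonstrated with a tiny Word2Vec model. This sketch assumes the gensim library and a toy corpus, so the resulting vectors and similarities are only illustrative:

from gensim.models import Word2Vec

# A toy corpus: each sentence is a list of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sleeps", "on", "the", "mat"],
]

# Train small 16-dimensional word vectors on the toy corpus.
model = Word2Vec(sentences, vector_size=16, window=2, min_count=1, epochs=50)

print(model.wv["king"][:4])                   # first few components of the "king" vector
print(model.wv.similarity("king", "queen"))   # cosine similarity between two word vectors
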
The generative nature of the latest models enables them to be versatile in
output, allowing for tasks such as text completion, summarization, and
creative text generation. Users can prompt the model with various queries or
partial sentences, and the model autonomously generates coherent and
contextually relevant completions, demonstrating its ability to understand and
mimic human-like language patterns.
The journey began with the introduction of word embeddings in 2013,
notably with Mikolov et al.’s Word2Vec model, revolutionizing semantic
representation. RNNs and LSTM architectures followed, addressing
challenges in sequence processing and long-range dependencies. The
transformative shift arrived with the introduction of the transformer
architecture in 2017, allowing for parallel processing and significantly
improving training times.
In 2018, Google researchers Devlin et al. introduced BERT. BERT
adopted a bidirectional context prediction approach. During pretraining,
BERT is exposed to a masked language modeling task in which a random
subset of words in a sentence is masked and the model predicts those masked
words based on both left and right context. This bidirectional training allows
BERT to capture more nuanced contextual relationships between words. This
makes it particularly effective in tasks requiring a deep understanding of
context, such as question answering and sentiment analysis.
During the same period, OpenAI’s GPT series marked a paradigm shift in
NLP, starting with GPT in 2018 and progressing through GPT-2 in 2019, to
GPT-3 in 2020, and GPT-3.5-turbo, GPT-4, and GPT-4-turbo-vision (with
multimodal inputs) in 2023. As autoregressive models, these predict the next
token (which is an atomic element of natural language as it is elaborated by
machines) or word in a sequence based on the preceding context. GPT’s
autoregressive approach, predicting one token at a time, allows it to generate
coherent and contextually relevant text, showcasing versatility and language
understanding. The size of this model is huge, however. For example, GPT-3
has a massive scale of 175 billion parameters. (Detailed information about
GPT-3.5-turbo and GPT-4 is not available at the time of this writing.) The
fact is, these models can scale and generalize, thus reducing the need for task-
specific fine-tuning.

Functioning basics
The core principle guiding the functionality of most LLMs is autoregressive
language modeling, wherein the model takes input text and systematically
predicts the subsequent token or word (more on the difference between these
two terms shortly) in the sequence. This token-by-token prediction process is
crucial for generating coherent and contextually relevant text. However, as
emphasized by Yann LeCun, this approach can accumulate errors; if the N-th
token is incorrect, the model may persist in assuming its correctness,
potentially leading to inaccuracies in the generated text.
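To make the token-by-token idea tangible, here is a minimal greedy decoding loop. It assumes the Hugging Face transformers library and the small, publicly available GPT-2 model rather than any of the GPT-3/GPT-4 models discussed here:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

ids = tokenizer("The capital of France is", return_tensors="pt").input_ids
for _ in range(10):
    with torch.no_grad():
        logits = model(ids).logits               # shape: (1, sequence_length, vocab_size)
    next_id = logits[0, -1].argmax()             # greedy choice: the most probable next token
    ids = torch.cat([ids, next_id.view(1, 1)], dim=1)  # append it and predict again

print(tokenizer.decode(ids[0]))

Production systems add sampling, temperature, and stop conditions, but the underlying mechanism is this same one-token-at-a-time prediction.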
Until 2020, fine-tuning was the predominant method for tailoring models
to specific tasks. Recent advancements, however—particularly exemplified
by larger models like GPT-3—have introduced prompt engineering. This
allows these models to achieve task-specific outcomes without conventional
fine-tuning, relying instead on precise instructions provided as prompts.
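For example, a task that once required fine-tuning can often be expressed as a plain instruction. The following prompt is a made-up illustration, not an example from the book, and would typically be sent as-is to a completion or chat endpoint:

# A zero-shot, instruction-style prompt: the task is specified entirely in
# natural language instead of being baked into fine-tuned model weights.
prompt = (
    "Classify the sentiment of the following review as Positive or Negative.\n"
    "Review: The room was spotless and the staff were wonderful.\n"
    "Sentiment:"
)
# Sending this prompt to a capable model typically yields "Positive".
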
Models such as those found in the GPT series are intricately crafted to
assimilate comprehensive knowledge about the syntax, semantics, and
underlying ontology inherent in human language corpora. While proficient at
capturing valuable linguistic information, it is imperative to acknowledge that
these models may also inherit inaccuracies and biases present in their training
corpora.

Different training approaches


An LLM can be trained with different goals, each requiring a different
approach. The three prominent methods are as follows:
Causal language modeling (CLM) This autoregressive method is used
in models like OpenAI’s GPT series. CLM trains the model to predict
the next token in a sequence based on preceding tokens. Although
effective for tasks like text generation and summarization, CLM models
possess a unidirectional context, only considering past context during
predictions. We will focus on this kind of model, as it is the most used
architecture at the moment.
Masked language modeling (MLM) This method is employed in
models like BERT, where a percentage of tokens in the input sequence
are randomly masked and the model predicts the original tokens based
on the surrounding context. This bidirectional approach is advantageous
for tasks such as text classification, sentiment analysis, and named entity
recognition. It is not suitable for pure text-generation tasks because in
those cases the model should rely only on the past, or “left part,” of the
input, without looking at the “right part,” or the future.
Sequence-to-sequence (Seq2Seq) These models, which feature an
encoder-decoder architecture, are used in tasks like machine translation
and summarization. The encoder processes the input sequence,
generating a latent representation used by the decoder to produce the
output sequence. This approach excels in handling complex tasks
involving input-output transformations, which are commonly used for
tasks where the input and output have a clear alignment during training,
such as translation tasks.
The key disparities lie in their objectives, architectures, and suitability for
specific tasks. CLM focuses on predicting the next token and excels in text
generation, MLM specializes in (bidirectional) context understanding, and
Seq2Seq is adept at generating coherent output text in the form of sequences.
And while CLM models are suitable for autoregressive tasks, MLM models
understand and embed the context, and Seq2Seq models handle input-output
transformations. Models may also be pretrained on auxiliary tasks, like next
sentence prediction (NSP), which trains them to judge whether one sentence
naturally follows another.
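The practical difference between CLM and MLM is easy to observe with the Hugging Face pipeline API, used here as an illustration with small public models rather than the GPT-family models discussed in this book:

from transformers import pipeline

# CLM: an autoregressive model continues the text from left to right.
generator = pipeline("text-generation", model="gpt2")
print(generator("Large language models are", max_new_tokens=10)[0]["generated_text"])

# MLM: a bidirectional model fills in a masked token using both left and right context.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
print(fill_mask("Large language models are [MASK] at translation.")[0]["token_str"])
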

The transformer model


The transformer architecture forms the foundation for modern LLMs.
Vaswani et al. presented the transformer model in a paper, “Attention Is All
You Need,” released in December 2017. Since then, NLP has been
completely revolutionized. Unlike previous models, which rely on sequential
processing, transformers employ an attention mechanism that allows for
parallelization and captures long-range dependencies.
The original model consists of an encoder and decoder, both articulated in
multiple self-attention processing layers. Self-attention means that the
representation of each word is computed by examining and weighing its
contextual information.
In the encoder, input sequences are embedded and processed in parallel
through the layers, thus capturing intricate relationships between words. The
decoder generates output sequences, using the encoder’s contextual
information. Throughout the training process, the decoder learns to predict
the next word by analyzing the preceding words.
The transformer incorporates multiple layers of decoders to enhance its
capacity for language generation. The transformer’s design includes a context
window, which determines the length of the sequence the model considers
during inference and training. Larger context windows offer a broader scope
but incur higher computational costs, while smaller windows risk missing
crucial long-range dependencies. The real “brain” that allows transformers to
understand context and excel in tasks like translation and summarization is
the self-attention mechanism. There’s nothing like consciousness or neuronal
learning in today’s LLMs.
The self-attention mechanism allows the LLM to selectively focus on
different parts of the input sequence instead of treating the entire input in the
same way. Because of this, it needs fewer parameters to model long-term
dependencies and can capture relationships between words placed far away
from each other in the sequence. It’s simply a matter of guessing the next
words on a statistical basis, although it really seems smart and human.
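Stripped of multiple heads, masking, and other refinements, the core computation can be sketched in a few lines of NumPy; the projection matrices below are random stand-ins for parameters that a real model learns during training.

import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product attention over a sequence of token vectors X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv                 # project tokens into query/key/value spaces
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # how strongly each token attends to every other
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ V                               # each output is a weighted mix of all tokens

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                          # 5 tokens, 8-dimensional embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)           # (5, 8)
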
While the original transformer architecture was a Seq2Seq model,
converting entire sequences from a source to a target format, nowadays the
current approach for text generation is an autoregressive approach.
Deviating from the original architecture, some models, including GPTs,
don’t include an explicit encoder part, relying only on the decoder. In this
architecture, the input is fed directly to the decoder. The decoder has more
self-attention heads and has been trained with a massive amount of data in an
unsupervised manner, just predicting the next word of existing texts.
Different models, like BERT, include only the encoder part that produces the
so-called embeddings.

Tokens and tokenization


Tokens, the elemental components in advanced language models like GPTs,
are central to the intricate process of language understanding and generation.
Unlike traditional linguistic units like words or characters, a token
encapsulates the essence of a single word, character, or subword unit. This
finer granularity is paramount for capturing the subtleties and intricacies
inherent in language.
The process of tokenization is a key facet. It involves breaking down texts
into smaller, manageable units, or tokens, which are then subjected to the
model’s analysis. The choice of tokens over words is deliberate, allowing for
a more nuanced representation of language.
OpenAI and Azure OpenAI employ a subword tokenization technique called byte pair encoding (BPE).
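The effect of BPE tokenization is easy to observe with the tiktoken library, which exposes the encodings used by OpenAI models; the exact token ids and counts depend on the encoding chosen.

import tiktoken

# cl100k_base is the encoding used by the GPT-3.5/GPT-4 family of models.
encoding = tiktoken.get_encoding("cl100k_base")

text = "Tokenization splits text into subword units."
token_ids = encoding.encode(text)
print(token_ids)                                  # a list of integer token ids
print(len(token_ids), "tokens")
print([encoding.decode([t]) for t in token_ids])  # the subword piece behind each token id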
Random documents with unrelated
content Scribd suggests to you:
THE TALKING LEAVES.[1]
An Indian Story.

BY WILLIAM O. STODDARD.

Chapter VIII.
refusal to go out with the hunters was a strange thing to come
from Red Wolf. No other young brave in that band of Apaches
had a better reputation for killing deer and buffaloes. It was a
common saying among the older squaws that when he came to
have a lodge of his own "there would always be plenty of meat
in it." He was not, therefore, "a lazy Indian," and it was
something he had on his mind that kept him in the camp that
day. It had also made him beckon to Ni-ha-be, and look very
hard after Rita when she hurried away toward the bushes with
her three magazines of "talking leaves." Red Wolf was curious.
He hardly liked to say as much to a squaw, even such a young
squaw as Ni-ha-be, and his own sister, but he had some questions to ask her
nevertheless.
He might have asked some of them of his father, but the great war chief of that band of
Apaches was now busily watching Dolores and her saucepan, and everybody knew better
than to speak to him just before supper. Ni-ha-be saw at a glance what was the matter
with her haughty brother, and she was glad enough to tell him all there was to know of
how and where the talking leaves had been found.
"Did they speak to you?"
"No; but I saw pictures."
"Pictures of what?"
"Mountains, big lodges, trees, braves, pale-face squaws, pappooses, white men's bears,
and pictures that lied. Not like anything."
"Ugh! Bad medicine. Talk too much. So blue-coat soldier throw them away."
"They talk to Rita."
"What say to her?"
"I don't know. She'll tell me. She'll tell you if you ask her."
"Ugh! No. Red Wolf is a warrior. Not want any squaw talk about pictures. You ask Rita
some things?"
"What things?"
"Make the talking leaves tell where all blue-coat soldiers go. All that camped here. Know
then whether we follow 'em."
"Maybe they won't tell."
"Burn some. The rest talk then. White man's leaves not want to tell about white man.
Rita must make them talk. Old braves in camp say they know. Many times the talking
leaves tell the pale-faces all about Indians. I Tell where go. Tell what do. Tell how to find
and kill. Bad medicine."
The "old braves" of many an Indian band have puzzled their heads over the white man's
way of learning things and sending messages to a distance, and Red Wolf's ideas had
nothing unusual in them. If the talking leaves could say anything at all, they could be
made to tell a chief and his warriors the precise things they wanted to know.
Ni-ha-be's talk with her brother lasted until he pointed to the camp fire, where Many
Bears was resting after his first attack upon the results of Mother Dolores's cookery.
"Great chief eat. Good time talk to him. Go now."
There was no intentional lack of politeness in the sharp, overbearing tone of Red Wolf. It
was only the ordinary manner of a warrior speaking to a squaw. It would therefore have
been very absurd for Ni-ha-be to get out of temper about it; but her manner and the toss
of her head as she turned away were decidedly wanting in the submissive meekness to
be expected of her age and sex.
"It won't be long before I have a lodge of my own," she said, positively. "I'll have Rita
come and live with me. Red Wolf shall not make her burn the talking leaves. Maybe she
can make them talk to me. My eyes are better than hers. She's nothing but a pale-face, if
she did get brought into my father's lodge."
A proud-spirited maiden was Ni-ha-be, and one who wanted a little more of "her own
way" than she could have under the iron rule of her great father and the watchful eyes of
Mother Dolores.
"I'll go to the bushes and see Rita. Our supper won't be ready yet for a good while."
It would be at least an hour, but Ni-ha-be had never seen a clock in her life, and knew
nothing at all about "hours." There is no word for such a thing in the Apache language.
She was as light of foot as an antelope, and her moccasins hardly made a sound upon
the grass as she parted the bushes and looked in upon Rita's hiding-place.
"Weeping? The talking leaves have been scolding her. I will burn them. They shall not say
things to make her cry."
In a moment more her arms were around the neck of her adopted sister. It was plain
enough that the two girls loved each other dearly.
"Rita, what is the matter? Have they said strong words to you?"
"No, Ni-ha-be; good words, all of them. Only I can not understand them all."
"Tell me some. See if I can understand them. I am the daughter of a great chief."
Ni-ha-be did not know how very little help the wealth of a girl's father can give her in a
quarrel with her school-books. But just such ideas as hers have filled the silly heads of
countless young white people of both sexes.
"I can tell you some of it."
"Tell me what made you cry."
"I can't find my father. He is not here. Not in any of them."
"You don't need him now. He was only a pale-face. Many Bears is a great chief. He is
your father now."
Something seemed to tell Rita that she would not be wise to arouse her friend's national
jealousy. It was better to turn to some of the pictures, and try to explain them. Very
funny explanations she gave, too, but she at least knew more than Ni-ha-be, and the
latter listened seriously enough.
"Rita, was there ever such a mule as that?—one that could carry a pack under his skin?"
It was Rita's turn now to be proud, for that was one of the pictures she had been able to
understand. She had even read enough to be able to tell Ni-ha-be a good deal about a
camel.
It was deeply interesting, but the Apache maiden suddenly turned from the page to
exclaim,
"Rita, Red Wolf says the talking leaves must tell you about the blue-coat soldiers or he
will burn them up."
"I'm going to keep them."
"I won't let him touch them."
"But, Ni-ha-be, they do tell about the soldiers. Look here."
She picked up another of the magazines, and turned over a few leaves.
"There they are. All mounted and ready to march."
Sure enough, there was a fine wood-cut of a party of cavalry moving out of camp with
wagons.
Over went the page, and there was another picture.
Ten times as many cavalry on the march, followed by an artillery force with cannon.
"Oh, Rita! Father must see that."
"Of course he must; but that is not all."
Another leaf was turned, and there was a view of a number of Indian chiefs in council at
a fort, with a strong force of both cavalry and infantry drawn up around them.
Rita had not read the printed matter on any of those pages, and did not know that it was
only an illustrated description of campaigning and treaty-making on the Western plains.
She was quite ready to agree with Ni-ha-be that Many Bears ought to hear at once what
the talking leaves had to say about so very important a matter.
It was a good time to see him now, for he was no longer very hungry, and word had
come in from the hunters that they were having good success. A fine prospect of a
second supper, better than the first, was just the thing to make the mighty chief good-
tempered, and he was chatting cozily with some of his "old braves" when Rita and Ni-ha-
be drew near.
They beckoned to Red Wolf first.
"The talking leaves have told Rita all you wanted them to. She must speak to father."
Red Wolf's curiosity was strong enough to make him arrange for that at once, and even
Many Bears himself let his face relax into a grim smile as the two girls came timidly
nearer the circle of warriors.
After all, they were the pets and favorites of the chief; they were young and pretty, and
so long as they did not presume to know more than warriors and counsellors they might
be listened to. Besides, there were the talking leaves, and Rita's white blood, bad as it
was for her, might be of some use in such a matter.
"Ugh!"
Many Bears looked at the picture of the cavalry
squad with a sudden start. "No lie this time. Camp
right here. Just so many blue-coats. Just so many
wagons. Good. Now where go?"
Rita turned the leaf, and her Indian father was yet
more deeply interested.
"Ugh! More blue-coats. Great many. No use follow.
Get all killed. Big guns. Indians no like 'em. Ugh!"
If the cavalry expedition was on its way to join a
larger force, it would indeed be of no use to follow
it, and Many Bears was a cautious leader as well
as a brave one.
Rita's news was not yet all given, however, and
when the eyes of the chief fell upon the picture of
the "treaty-making" he sprang to his feet.
"Ugh! Big talk come. Big presents. Other Apaches
"MANY BEARS LOOKED AT THE
all know—all be there—all get blanket, gun,
PICTURE."
tobacco, new axe. Nobody send us word, because
we off on hunt beyond the mountains. Now we
know, we march right along. Rest horse, kill game, then ride. Not lose our share of
presents."
Rita could not have told him his mistake, and even if she had known it, she would have
been puzzled to explain away the message of the talking leaves.
Did not every brave in the band know that that first picture told the truth about the
cavalry? Why, then, should they doubt the correctness of the rest of it?
No; a treaty there was to be, and presents were to come from the red man's "great
father at Washington," and that band of Apaches must manage to be on hand and secure
all that belonged to it, and as much more as possible.
Red Wolf had nothing more to say about burning up leaves which had talked so well, and
his manner toward Rita was almost respectful as he led her and Ni-ha-be away from the
group of great men that was now gathering around the chief. Red Wolf was too young a
brave to have any business to remain while gray heads were in council. A chief would
almost as soon take advice from a squaw as from a "boy."
Mother Dolores had heard nothing of all this, but her eyes had not missed the slightest
thing. She had even permitted a large slice of deer meat to burn to a crisp in her eager
curiosity.
"What did they say to the chief?" was her first question to Rita.
But Ni-ha-be answered her with: "Ask the warriors. If we talk too much, we shall get into
trouble."
"You must tell me."
"Not until after supper. Rita, don't let's tell her a word unless she cooks for us and gives
us all we want. She made us get our own supper last night."
"You came late. I did not tell your father. I gave you enough. I am very good to you."
"No," said Rita; "sometimes you are cross, and we don't get enough to eat. Now you
shall cook us some corn-bread and some fresh meat. I am tired of dried buffalo: it is
tough."
The curiosity of Dolores was getting hotter and hotter, and she thought again of the
wonderful leaf which had spoken to her. She wanted to ask Rita questions about that too,
and she had learned by experience that there was more to be obtained from her willful
young friends by coaxing than in any other way.
"I will get your supper now, while the chiefs are talking. It shall be a good supper—good
enough for Many Bears. Then you shall tell me all I ask."
"Of course I will," said Rita.
A fine fat deer had been deposited near that camp fire by one of the first hunters that
had returned, and Mother Dolores was free to cut and carve from it, but her first attempt
at a supper for the girls did not succeed very well. It was not on account of any fault of
hers, however, or because the venison steak she cut and spread upon the coals, while
her corn-bread was frying, did not broil beautifully.
No; the temporary disappointment of Ni-ha-be and Rita was not the fault of Mother
Dolores. Their mighty father was sitting where the odor of that cookery blew down upon
him, and it made him hungry again before the steak was done. He called Red Wolf to
help him, for the other braves were departing to their own camp fires, and in a minute or
so more there was little left of the supper intended for the two young squaws. Dolores
patiently cut and began to broil another slice, but that was Red Wolf's first supper, and it
was the third slice which found its way into the lodge, after all.
The strange part of it was that not even Ni-ha-be dreamed of complaining. It was
according to custom.
There was plenty of time to eat supper after it came, for Dolores was compelled to look
out for her own. She would not have allowed any other squaw to cook for her, any more
than she herself would have condescended to fry a cake for any one below the rank of
her own husband and his family.
Mere common braves and their squaws could take care of themselves, and it was of small
consequence to Dolores whether they had anything to eat or not. There is more
"aristocracy" among the wild red men than anywhere else, and they have plenty of white
imitators who should know better.

[to be continued.]

HAPPY AS A KING—"PAPERS ALL


SOLD."
SHADOW PANTOMIMES.
What are the boys and girls going to do Thanksgiving night when dinner is over, the nuts
and raisins all gone, the last sugar-plum eaten, and it isn't yet time to go to bed?
Suppose they try Shadow Pantomimes.
Draw a white screen across the parlor, hanging down to the floor, darken the part of the
room where the audience are, and place one strong light at the extreme end, behind the
stage, so that the shadows of the actors will be thrown on the screen when they pass or
stand behind it. The subjects have to be guessed by the audience. A Shadow Pantomime
has the advantage that all sorts of contrivances can be used, and the appearance of the
players disguised, so that the lookers-on will soon want to see what is at the other side
of the screen, where the sight of card-board cats and donkeys and paper noses and chins
would be a sad disillusion. The player should in general keep near the screen, but never
touch or shake it; and as there is no scenery except such shadows as bushes or fences,
no scene is announced, but all has to be guessed from the action of the figures. The
subjects should, of course, be easy to guess, as the audience enjoys better what is
recognized quickly. We suggest to ingenious shadow-makers as possible subjects:
Cinderella—the child and the godmother, the dance, the fitting of the shoe. The Lion and
the Unicorn—the lion's mane and tail and the unicorn's horn being the chief distinctions,
and the crown being represented on a pole in the middle while they fight; afterward the
representation of the last lines is easy: "Some gave them white bread, and some gave
them brown; some gave them plum-cake, and drummed them out of town." Punch and
Judy, with Judy's large cap and Punch's hump, pointed cap, and long nose and chin, and
of course a Toby, well cut out of mill-board or card-board. The House that Jack built, with
a constant show of the objects in succession, some of them only cut models, held at a
distance from the screen so as to enlarge the shadows: this would be necessary, for
instance, in showing the house with its bright windows, and it is well for such subjects to
draw a curtain across the lower part of the stage, and place a screen at each side, so as
to leave only a small square of light for exhibiting the shadows, while the hands are
hidden behind the screens. Sing a Song of Sixpence, the pie being the shadow of a
packed clothes-basket, the king and queen wearing crowns, and the blackbird of the last
verses being swung on the end of a thread so as to hit off a paper nose.
Most of the nursery rhymes admit of being shown in shadows, and also such ballads as
the "Mistletoe Bough." There may be, for a change at the end, a few shadow charades,
such as Snow-ball, Cox-comb, Asterisk (ass-tea-risk), Ring-let, Cat-as-(ass)-trophy, etc.,
done quickly and guessed easily.
KING HAZELNUT
King Hazelnut, of Weisnichtwo,
A jolly King was he,
And all his subjects, high and low,
Were happy as could be.

They feasted every day on pie
And pudding and plum-cake,
And never broke the law—for why?—
There was no law to break.

Oh, jolly was King Hazelnut,
Especially at noon;
Then many a caper he would cut,
And hum a merry tune.

And from his golden throne he'd hop,
And fling his sceptre down,
And on the table, like a top,
Would spin his golden crown.

Then he would slap his sides and sing
Unto his serving-man,
"That rolly-poly pudding bring
As lively as you can."
A HAPPY THANKSGIVING and a splendid time to all our boys and girls!

Glencoe, Louisiana.

Viola E. would perhaps find the names most familiar to your young Creole
subscribers in Louisiana as unaccustomed as are those of which she writes to the
ears of children outside of Virginia. In this house the young girl to whom Young
People is addressed was christened Elmire, but is known only by her petit nom of
"Fillette." Her mother's name is Gracieuse—is it not musical? An impish little ebon-
hued maid in the yard is Mariquite. Another, with gleaming ivories, is Yélie. A cousin
who comes often, and is nearly old enough to cast his vote, is yet "Bébé," despite his
sponsors having called him Édouard. And "Guisson," his brother, who would guess
his name to be Émile?
A little knowledge of creole interiors would correct the ideas so prevalent as to creole
indolence. Away down here, on a sluggish little bayou that makes its way through
the plantation to the not-far-distant Gulf, these young girls, though not perhaps
speaking so good English as their Virginia sisters of Anglo-Saxon extraction, having
learned it rather from the lips of negro servants than from their parents, are, at any
rate, their peers in womanly accomplishments, if practical knowledge of the details
of a ménage constitutes such—the ability to wash, starch, iron, straighten a room,
make a gumbo, mix a cake and bake it, etc. The very neatly made calico dresses
they wear are their own handiwork. After five hours spent in the school-room with
their institutrice, and the required time given to the practice of their piano, one of
them is amusing herself by making a quantity of under-clothing for a beloved little
filleule. A basse-cour of about six hundred turkeys, ducks, and chickens is cared for
almost wholly by the two girls and their mother. Domestic virtues these, worthy even
of Yankee girls, are they not? Just as much, though, as Yankee girls or as Virginia
girls do these young Louisianians claim their heritage as Americans and their place
among your "Young People."

L'Institutrice.
We have read this letter with great pleasure, and now we would like to hear from
somebody about our Western girls; and the New England girls too will find a corner
waiting if they choose to write.

Harper, Iowa.

I can now read all the long stories in Young People. I liked "Tim and Tip" very much,
and think the bear hunt was quite funny. I had a pair of white doves given me as a
present. One of them, in trying to fly through the screen door, broke its neck, and
the other flew away with some wild ones. So I lost my pets, and was very sorry. I
am sorry for Jimmy Brown. He makes me think of myself sometimes. My sister
teaches piano music. My two brothers play in the Cornet Band, and I am learning
music; so we have plenty of music. We all go to school.

Harper R.

Manhattan, Kansas.

I have three brothers and two sisters. This summer we all went to New Mexico. We
stopped at Las Vegas, and saw the Hot Springs, and the water in the springs was so
hot that we could not hold our hands in it. And we stopped over Sunday at Santa Fe,
and saw the Corpus Christi procession. We saw a horned toad that ran as fast as a
horse. We brought back two donkeys, and mine threw me off, and broke my two
front teeth. Uncle Henry gave us some saddles. Our baby is only two months old,
and has red hair. I liked "Toby Tyler" best of any. I am nine years old. My name is

Maggie P.

ROSA MAYFIELD'S LOSS.

Let me introduce my readers to a bright, sunny-haired girl who on a pleasant morning in July is playing in a large garden. She first sits down in a pretty little arbor,
and sews for a short time; then she puts her work away, and goes to plant some
seed which old James, the gardener, has given her. Suddenly she hears some one
calling to her from the house.
"Rosa! Rosa! come here a minute, my child."
"Yes, mamma," said Rosa; "I will come as soon as I have put away my tools."
When she reached the sitting-room, her mamma was not there, but on running to
the bedroom, she found her, all dressed to go out, and putting on her gloves. As
soon as she saw Rosa, she said: "Would you like to go to the cattle show with me,
dear, and then go to your cousins, in the country for tea? The carriage will be round
presently."
"Oh yes, indeed I should, mamma," said the little girl, as she skipped away to nurse
to be dressed.
"Oh, you darling mamma," said Rosa, as she settled herself in the carriage beside
her mother. "I always enjoy going to tea with May and Clara Haliburton so much!
and I have never been to a cattle show;" and here she clapped her hands and
laughed so loud that her mother had to tell her to be quiet, as the passers-by would
think she must be a very badly behaved little girl.
At last, they reached the cattle show. Then they got out of the carriage, and went
inside. There they saw dogs, cats, rabbits, and all sorts of animals. Rosa was greatly
delighted with a beautiful white rabbit with pink eyes.
After they had seen enough, they drove to the rectory, where the Haliburtons lived.
After Rosa had said good-afternoon to her aunt, May and Clara took her to see the
chickens and rabbits, the donkey, and all their other pets. Never had she spent such
a delightful afternoon, and was very sorry when the tea bell rang, and they had to
go in. But what a tea they had! Muffins, cakes, and preserves of all sorts, and such
delicious fresh bread and butter, and new milk from her uncle's farm. At a quarter to
nine the carriage came to take them home, and they had to say good-by.
Rosa was so tired that she fell asleep in her mamma's arms, and never woke till the
next morning, when she found herself in her own little bed.
In Mrs. Mayfield's room some parcels are waiting, addressed to Miss R. Mayfield, one
large, and the others small; and as it is Rosa's birthday, she is to open them herself.
All the small ones are opened. In one she finds a gold brooch from her mamma; in
another is a prayer-book from her father; in the others are presents from all her little
friends. At last she unties the string and draws off the paper of the large parcel, and
gives one scream of delight as she sees in a beautiful lined basket the little rabbit
she saw at the cattle show. The lady to whom it belonged, being a friend of Mrs.
Mayfield, had heard Rosa saying she would like to have it, and had sent it to her.
Rosa ran off with her new pet to feed it, and after showing it to everybody she took
it into the garden and put it into a cage close by her arbor, in a sunny corner, where
she could always see it. She kept it carefully for three months; but on going to feed
it one morning, with her hands full of lettuce leaves and clover, she found her pet
was gone. A cruel cat had come every day and watched her feeding her rabbit, and
at last, seeing her just pull the door to, and not lock it, had seized the opportunity,
and had carried off her pet.
Poor little Rosa cried herself to sleep that night, and for many nights after, and never
loved any of the pets her mamma gave her as she had loved her little white rabbit.

Gussie Tobias (aged 10 years),
Liverpool, England.

Okahumpka, Florida.

I am a little girl ten years old, and live away down in South Florida, where the sun is
always bright and the trees always green. In our quiet little home there are only
mamma, Addie, and I. Our dear father is dead. Sister Addie is six years old. We have
no school, church, nor Sunday-school. Mamma gives us our lessons daily at home,
and a kind English gentleman gives me music lessons. We do not know who sends
us the Young People, but hope our kind unknown friend will see this letter, and learn
how much we enjoy the gift and appreciate the kindness. I am suffering from sore
eyes, and not allowed to read or write, so mamma is writing for me; but when I get
well I will write myself, and tell about our pets and other things.

Rosa M. J.

Scandia, Kansas.

I have been taking your paper almost a year, and like it very much. It was papa's
Christmas present to me, so I thought I would write you a letter. I have a pet hen. I
call her Brownie. She is getting old now. She answers me in hen language when I
take her up and talk to her. I have a canary-bird. I call him Dickey. He is just learning
to sing.

Laura H.

Harlem, New York.

I have had my cat Till seven years. We think he is a very wise cat, for he sits upon
his hind-legs and begs. When I go down stairs in the morning, if I say, "Good-
morning, Till," he will shake hands with me. He is a very dainty cat. He will not eat
roast beef unless it is very rare, and he does not care at all for the heads of chickens
and turkeys; but he loves cheese and crackers, and will eat all the cake I will give
him. I am eleven years old.

Mabel M. S.

Milwaukee, Wisconsin.

I have a great many dolls, and a large doll house in the conservatory, which I enjoy
very much, so I thought you would be pleased to have a letter from me. Mrs. Love
Lee and her ten children live in the large doll house, which is a little taller than I am.
I am six. The babies Faith, Hope, and Love are triplets. I wish we had three live
babies. Cozy has two kittens. Cozy is my cat. Arthur and Arabella are twins, about in
the middle. Blanche is the young lady, and Fifine the big school-girl. Rosebud is only
six inches tall, and her eyes open and shut, and she moves her head and arms and
legs. Daffodil is just the same, only smaller, and Joe is the little boy. Ida takes care of
the children in the nursery. Dinah is the cook. She is colored very much. Chechon
sets the table, and keeps the dining-room in order. Chechon is a Chinese. The twins
have a very nice cabinet of shells and stones. I gave them some out of mine. Each of
the children have something to do to help their mamma, just as I do.
I go to Kindergarten, and once a week I speak a little piece out of Baby-Land, or St.
Nicholas, or Harper's Young People, or The Nursery. I can say all of "The Cat, the
Parrot, and the Monkey." It is just at the end of my bound Harper's Young People. It is
called "Filbert." That is the best story I know. I like "The Story of a Parrot," too, but
it would have been better if some one had carried him home at last. Papa says he
don't see why I like that story so well, but he reads it to me 'most every Sunday. He
likes "Toby Tyler" a great deal better, or even "Tim and Tip." They are pretty good
too. I don't like story boys as well as I do story animals. I like live animals too. Dogs
and cats are never afraid of me, but will come right to me in the street or anywhere.
I found a little mud-turtle at Minnehaha Falls, and brought it to papa and mamma by
its tail, and it played with me a little while, and then I carried it back to its cave at
the side of the path down the gully.
This fall I caught a live star-fish, when the tide was coming in, down on the beach at
Portland, Maine, and we brought it home to put in my cabinet when it gets dry
enough. It is sticky yet. It is out in the wood-shed drying. When we were going
there I caught a mouse. It ran into its hole in the corner of the dépôt, all but its tail.
I suppose I took hold too tightly, or else too high up, for he turned around and bit
my thumb. I wasn't going to hurt him, but just to play with him a little while. I wish
animals could talk. That was at the Montreal dépôt.
You asked about dolls. I have a doll, about a foot high, wheeling a little cart in front
of her. When I draw the cart by a string, the doll goes trot, trot, trot on behind, and
every one I meet turns around, and says, "Did you ever see anything so funny?"
Uncle Ebb found it at Manistee, Michigan, and sent it to me by express.
Blossom is my very large wax doll. I draw her around the block in her carriage every
pleasant afternoon. Sometimes Daisy, who is almost as large, rides in the front seat.
If it is too warm for Blossom to go out, Daisy will ride in the back seat, and Charity in
front. Charity is indestructible and good, but not beautiful. Cisily I took with me to
Vermont and Boston and Maine, because she had never been anywhere. She ought
to have a new dress Christmas, if Santa Claus only knew it. Joe is just as tall as
Cisily. I measure them often with my foot-rule. They are once and a half tall. They
have the same furry hair. They have a very nice carriage, and always ride out
together. I shall take Joe next. He has never been anywhere yet, but Cisily wore his
overcoat and rubbers East, and took his little knife in her pocket. He thought she
might want it to whittle in Vermont or Boston. Uncle Ebb often helps me play, and
speaks for the dolls. I am all there is here of children.
I have a good many more dolls. There is a small doll house full, and Mother Goose
with her shoe full of them, and some of the children in the doll houses have dolls for
themselves. The "log-cabin" has a family in that. The "Swiss cottage" has only
wooden people. The frame house has twelve children. I like large families. They are
more convenient for the children. Mamma reads your letters to me. I could read
them, but they are printed so fine it is hard to read. I am in the Second Reader, and
the same words are easy to read in that. I read a lesson every day in the connecting
class, after Kindergarten is over at noon. I read, spell, write, and draw about fifteen
minutes each, and am home to dinner at one. Then come the kitties and dolls.

Nellie B.

Saybrook, Connecticut.

I see you want to know whether dolls have gone out of style. No, I think not. I am
eleven years old. I was very sick when I was six years old, and have not been able to
walk since except in braces. I have a rolling-chair that I am wheeled in when out-
doors, and I have many nice times with my dolls. I have eight of them. I think Young
People is very nice. I hope this is not too long to be printed, as it is my first letter to
any paper. I have eight pets.

Belle M. I.

I want to tell you about my little dog. He is a black and tan, and is so cute. He will
speak, sit on his hind-legs and beg, and catch anything thrown to him. His name is
Bijon.
I will send twenty-five rare foreign stamps for ten gilt picture advertising cards, and
give twelve internal revenue stamps for five gilt picture cards. One $2 stamp; nine
$1; a 30 cent, 50, 25, 20, 15; two 10, two 5, and one 2 cent stamp. Please give your
full address when you send cards. My name is

Nellie Mason, P. O. Box 636,
Madison, Wisconsin.

Hill View, Kentucky.

My teacher gave me Young People as a prize for being a good scholar. Ma raised
about one hundred turkeys this year, and I raised twelve guinea-fowl with them. I
like the paper very much. I am always glad when Saturday comes.

Carrie McK.

South Norwalk, Connecticut.

I am sorry the girl in South Glastenbury does not like cats. If she knew my cat, I
think she would like him. My brother caught fifty little fish for him, each about as
long as my little finger. After he had eaten twenty-five, he could scarcely eat any
more, but would not let us take them away, as he wanted to play with them.
Sometimes he goes to the door, and asks us to let him come up stairs, when he gets
into my doll's bed, pulls the sheet off her, and gets close to her. When she sits up in
a chair, he gets in her lap. He does not like to hear the noise made by dishes, so,
when they are washed, he mews till they are done. My brother plagued him once,
and Kit ran to the door, and stopped a minute to consider, then ran back, and struck
him with his paws. He is lazy, but you need not put that in Young People.

Jessie B.
A puss that has fifty fish offered him at once is quite excusable for being lazy. We think
he is a very interesting cat.

Oakdale, Pennsylvania.

Papa gave me a male canary about two years ago, and last spring my uncle gave my
sister a female, and we thought we would try to raise some little birds. The mother
bird laid five eggs, and they all hatched and grew to be big birds, were very tame,
and we used to carry them around the room, and let them ride in our dolls' coaches.
She laid five eggs again, but we only raised three more birds. They are all singers.
We have seven cats—Polly, Beauty, Tom, Milly, Pussy, Harry, and Lottie. Polly is a
Maltese. Our dog is named Friskie. I am ten years old.

Mary E. D.

Pine Bend, Minnesota.

I thought I would tell you about some hens we had when I was four or five years
old. One would come in the pantry, if the window was left open, and lay her egg in a
pan of eggs on the shelf. Another was determined to make her nest up stairs, and
we did not dare leave the front-door open. Another hen laid three times in the wood-
box in the kitchen, in spite of being driven out many times.

Mary M.

Denver, Colorado.

I like the paper real well, and the little letters too. My mamma reads 'em to us,
'cause we can't read ourselves. Grandpapa sent it to brother and me last New-Year's.
My dolly I like so much! She has nice clothes, and the dearest little button boots and
stockings what come off; and I have lovely dishes. Grandpapa sent 'em to me. I
have lots of nice times with my things, but there are too many to tell about. We had
a nice time at a birthday party Saturday. I just started to school this fall. I will be
seven years old to-morrow. Mamma "finks" my letter pretty nearly too long now, so I
won't write any more. I'll try and not be "'spointed" if you can't print it, 'cause you
have so many letters. Mamma's writing for me. Good-by.

Nellie D.
I am Charlie, Nellie's brother. I like all the stories so well, I can't tell which I like best.
We can see the mountains from our doors and windows just as plain all the time,
only when it's stormy. My kitty got up in mamma's lap at table the other day, and
wanted to eat out of her plate. I had a live frog in a pail. One morning I went to
school, and forgot to fill up the pail, and just as I came from school kitty had him. He
killed him, and was going to eat him. I took him away, and gave him to the chickens,
and spanked Sam—that's my kitty's name; I named him for grandpapa. I will be nine
years old April 3, but it's so hard to write. Good-by.

Charles Fred D.

Brooklyn, New York.

I am eleven years old, and I save the pennies I get for doing errands to buy Harper's.
I earned four dollars this season to help papa buy me a winter suit. I have been to
Boston, and would like to live there all the time. I have only one sister, and she is my
pet. She has a little white bantam hen for her pet. I have nine aunts, and I am going
to write to them all some day, and send them one of my Harper's Magazines.
Mamma wrote this letter, but I told her what to say. Good-by, from

Daniel A.

C. Y. P. R. U.
The Postmistress is very happy to give the readers of Our Post-office Box the pleasure of
reading a description of the little yacht Toby Tyler, now cruising in Southern waters:

Dear "Young People,"—The Toby Tyler, named after the hero of Mr. Otis's most
successful story, is a very small steamer, being only about forty-five feet in length,
and drawing but three feet of water. She was built so small and of such light draught
because it is intended that she shall explore most of the rivers on the west coast of
Florida, some of which are very shallow. Perhaps she will go farther than Florida, and
explore a country that abounds in material for interesting adventures and thrilling
stories.
As the Toby is so small, she can not go away out to sea and around Cape Hatteras,
like the great steam-ships that carry passengers to Florida. She has to take what is
known as the "inland passage."
After leaving her dock at the foot of West Twenty-ninth Street, in New York, the Toby
steamed down the North or Hudson River until she passed the Battery. Then she was
in the Upper Bay. Crossing this, and turning to the westward, she steamed along the
north shore of Staten Island, through the broad river-like body of water called the
Kill Von Kull. Passing New Brighton and the Sailors' Snug Harbor and Elizabethport,
through the Arthur Kill and Staten Island Sound, both continuations of the Kill Von
Kull, the Toby reached Perth Amboy, and turned into the Raritan River, which here
empties into Raritan Bay.
The Raritan River is so shallow and so crooked that the yacht proceeded very slowly
and carefully for seventeen miles, until she reached New Brunswick. Here she
entered the Delaware and Raritan Canal, and found herself in company with great
numbers of heavy canal-boats drawn by mules or horses. The canal in which the
little Toby now sailed runs through a very beautiful portion of New Jersey, and her
passengers enjoyed travelling on it very much. They especially enjoyed going
through the locks, always in company with some other craft, which was sometimes a
canal-boat, sometimes another steamer, with sometimes a big schooner, whose tall
masts and white sails looked very funny among the trees on the canal banks.
The principal places that the Toby passed while in the canal were Bound Brook,
Princeton, Trenton, and Bordentown. At the last-named place she passed through
the last of the twelve locks, and having had forty-three miles of canal sailing,
steamed gladly out into the broad Delaware River.
A run of twenty-nine miles down this beautiful river brought her to Philadelphia,
where she rested for a few days, and gave her passengers time to get acquainted
with this dear old city, in which so many of the readers of Harper's Young People live.
On leaving Philadelphia the Toby steamed merrily down the Delaware for forty miles
to Delaware City, in the State of Delaware, where she entered the Delaware and
Chesapeake Canal, which connects the Delaware River with Chesapeake Bay. This
canal is only fourteen miles long, and has but two locks, one at each end, so that the
little yacht soon found herself at Chesapeake City, in the State of Maryland, and at
the southern end of the canal.
After an all day's run down the upper end of Chesapeake Bay, the Toby entered the
Patapsco River, and steamed up to Baltimore, where she landed her passengers in
time to witness the great Oriole Celebration.
Then she went back down the Patapsco and again into Chesapeake Bay. This bay is
so wide that it is almost as rough and stormy at times as the sea itself, and the poor
little Toby had a very hard time, and was roughly handled by the great waves before
the pleasant Wednesday morning when she turned into the broad mouth of the York
River, and dropped anchor amongst the big ships in front of Yorktown. As the little
boat ran in between two of the great war ships, they began firing guns and banging
away at such a furious rate that in a few moments not only the poor little Toby but
they themselves were completely enveloped in a dense cloud of smoke. In a few
minutes those on board the Toby learned that the government steamer Dispatch,
with President Arthur on board, had just arrived, and that all this firing of guns was
only a salute to him, as though the big ships had said, "How do you do, Mr.
President? We are very glad to welcome you to Yorktown."
After leaving this place the Toby went back down the York River into Chesapeake Bay
again, and for a short distance out into the ocean, before steaming past the grim
walls of Fortress Monroe and into Hampton Roads.
Without stopping to see the fort or the Indian schools at Hampton, the Toby hurried
on, and an hour later sailed into the quiet harbor of Norfolk, at the mouth of the
Elizabeth River.
The upper deck or cabin roof of the Toby Tyler extends nearly over her entire length,
so that, though small, she can be made very comfortable in any weather. Her cabin,
which is also dining-room and sleeping-room for four, is back of the engine-room,
and occupies the whole of the after-part of the yacht. Her engine is in the middle,
right under the smoke-stack, and forward of this is the cockpit, of which the sides
are open except when inclosed by heavy canvas storm curtains. Here, in very warm
weather, hammocks can be slung at night, in which the passengers may sleep.
On the upper deck is a light cedar canoe—the Psyche—with paddles, masts, and
sails, intended for exploring rivers and lakes that are too shallow for the Toby, and
beside the canoe is lashed a good-sized tent with its poles, so that when Mr. Otis and
his friends tire of living on board the yacht, they can, if they choose, establish a
camp on shore.
In various lockers on the yacht, besides the baggage of her passengers and crew,
and the coal, are stored four hundred pounds of canned provisions and fruits, a tool
chest, medicine chest, ammunition chest, blankets, writing and sketching materials,
books, charts, etc.

Captain C. K. M.

THE POET COWPER.

William Cowper was born November 26, 1731, in Hertfordshire, England. His mother
died before he was six years old. He was sent to a school where he suffered a great
deal from the teasing of the other boys. He had an affection of the eyes, and so he
was placed at an oculist's house, where he had smallpox, and that cured his eyes.
After that he became a clerk in a lawyer's office, and studied for admission to the
bar. The strain on his mind was too great, and he sought relief by trying to commit
suicide by hanging. In this he did not succeed. A friend placed him in the country,
where, after skillful treatment, he recovered from the fits of mental depression that
he was subject to. He was fickle and inconstant to friends, but loving and kind to his
pets. He had three leverets, or hares, given to him, and in these he found much
amusement, for he was sick, and wanted something to occupy his mind. The hares
were males, and their names were Puss, Tiney, and Bess. He built them a house,
and each had his own bedroom to sleep in. Puss lived to be eleven years old, Tiney
to be nine, and Bess died soon after Cowper received him. The poetry about the
chair is found in the "Task," and is called "The Sofa." Cowper died in the town of
East Durham, on Friday, the 25th of April, 1800, and was buried in St. Edmund's
Chapel, in the church of East Durham.

Edna L. Maynard.
This little description of the poet Cowper is very creditable to its writer, who is only
eleven years old. But the Postmistress must disagree with her in the opinion that he was
inconstant and fickle as a friend.

In this number we begin the publication of a series of articles calculated to be of especial interest to the members of the C. Y. P. R. U. They are from the pen of the popular English
novelist Mr. James Payn, and, under the head of "Perils and Privations," deal with stories
of fact relating to shipwreck more thrilling than any tales of fictitious adventure.
PUZZLES FROM YOUNG CONTRIBUTORS.
No. 1.

HISTORICAL ENIGMA.

I am a celebrated document, and am composed of eleven letters.


My first was one of the decisive battles of the world, and was fought between the Greeks
and Persians.
My second was a very great warrior, who could not govern himself, though he conquered
the world.
My third was a humane physician who invented an instrument of cruelty.
My fourth was a great philosopher and mathematician.
My fifth came over in the Mayflower.
My sixth was a young hero celebrated by an English poetess.
My seventh was a blind poet whom seven cities claimed for their own.
My eighth was one of the signers of the Declaration of Independence.
My ninth was a great artist.
My tenth is a distinguished living poet.
My eleventh met a disgraceful death in the Revolutionary war.
Susan Nipper.

No. 2.

TWO EASY DIAMONDS.


1.—Centrals.—A famous battle in the Revolution.
1. A letter. 2. A weapon. 3. A sort of knife. 4. Spectral. 5. The conclusion. 6. A letter.
W. D. M.
2.—1. A letter. 2. Devoured. 3. Orbs of light. 4. A period. 5. A letter.
E. W.

No. 3.
NUMERICAL ENIGMA.

The whole, of 14 letters, is a city in Europe.


My 8, 2, 7 is a weight.
My 14, 6, 8, 11, 10 is an American city.
My 1, 6, 3, 5, 2, 3 is a Chinese city.
My 12, 9, 4, 5, 2, 13 is a small fire-arm.
Damon and Pythias.
ANSWERS TO PUZZLES IN No. 105.
No. 1.

P I L O T
I V A N
L A C
O N
T

      D
    N E D
  D E B A R
    D A M
      R

No. 2.

          F
        S A D
      S I R E D
    S T R I P E S
  S I R E N I C A L
F A R I N A C E O U S
  D E P I C T I N G
    D E C E I V E
      S A O N E
        L U G
          S

No. 3.
Valhalla.

No. 4.
"John Burns of Gettysburg."

No. 5.

          D
        S E R
      D A T E S
    D E L E T E S
  S A L E R A T U S
D E T E R M I N E R S
  R E T A I N E R S
    S E T N E S S
      S U E R S
        S R S
          S

Correct answers to puzzles have been received from M. E. S., Willie Volckhausen, "North
Star," Frank S. Davis, Nannie Francis, Charles Beck, Emma Rose A., Lucy Cox, John D.
Smith, Kittie E. Gill, Henry E. Johnston, Jun., James R. Magoffin, Clara H. Tower, Annetta
D. Jackson, and Calvin Rufus Morgan.

[For Exchanges, see second and third pages of cover.]

THE REAL WAY TO CELEBRATE THANKSGIVING, ACCORDING TO THE VIEWS OF OUR ESTEEMED FELLOW-CITIZENS G. OBBLER, ESQ., MESSRS. T. URKEY, C. APON, D. UCK, R. OOSTER, AND MANY OTHERS.
LETTER PUZZLES.
1.

Two S's, two N's, four E's, and a T,
Put together, and pray spell the word unto me.

2.

One R and two S's, three A's and one U,
Three N's and four T's and two I's, add unto
One O and one B, and tell me, I pray,
What word they will make if put in the right way.

3.

Four S's, four I's, two P's and an M,
What word can you easily make out of them?

4.

Three E's and two M's, two R's and one B,
Put down in right order, what word shall you see?
ANSWER TO YORKTOWN PUZZLE.
BELOW will be found the answer to the Yorktown Puzzle, given in No. 103, page 816:

NAMES OF ARTICLES (19).

N egro.
I mp.
N uts.
E nsigns.
T eeth.
E lm.
E wers.
N est.
T rays.
H andle.
O tter.
F lags.
O ats.
C hairs.
T ail.
O ak.
B ats.
E ave.
R amrod.

MILITARY MEN (16).

Steuben.
Lee.
Ward.
Marion.
Stark.
Gates.
Smith.
Greene.
St. Clair.
Stevens.
Gist.
Thomas.
Poor.
Arnold.
Nash.
Lafayette.

UNHAPPY THOUGHT.

Tommy. "I mean to be an Astronomer when I grow up!"

Effie. "What on earth will you do with yourself all Day long?"

FOOTNOTES:
[1] Begun in No. 101, Harper's Young People.
*** END OF THE PROJECT GUTENBERG EBOOK HARPER'S YOUNG
PEOPLE, NOVEMBER 22, 1881 ***

Updated editions will replace the previous one—the old editions will be renamed.

Creating the works from print editions not protected by U.S. copyright law means that no one owns a United States
copyright in these works, so the Foundation (and you!) can copy
and distribute it in the United States without permission and
without paying copyright royalties. Special rules, set forth in the
General Terms of Use part of this license, apply to copying and
distributing Project Gutenberg™ electronic works to protect the
PROJECT GUTENBERG™ concept and trademark. Project
Gutenberg is a registered trademark, and may not be used if
you charge for an eBook, except by following the terms of the
trademark license, including paying royalties for use of the
Project Gutenberg trademark. If you do not charge anything for
copies of this eBook, complying with the trademark license is
very easy. You may use this eBook for nearly any purpose such
as creation of derivative works, reports, performances and
research. Project Gutenberg eBooks may be modified and
printed and given away—you may do practically ANYTHING in
the United States with eBooks not protected by U.S. copyright
law. Redistribution is subject to the trademark license, especially
commercial redistribution.

START: FULL LICENSE


THE FULL PROJECT GUTENBERG LICENSE
PLEASE READ THIS BEFORE YOU DISTRIBUTE OR USE THIS WORK

To protect the Project Gutenberg™ mission of promoting the free distribution of electronic works, by using or distributing this
work (or any other work associated in any way with the phrase
“Project Gutenberg”), you agree to comply with all the terms of
the Full Project Gutenberg™ License available with this file or
online at www.gutenberg.org/license.

Section 1. General Terms of Use and Redistributing Project Gutenberg™ electronic works
1.A. By reading or using any part of this Project Gutenberg™
electronic work, you indicate that you have read, understand,
agree to and accept all the terms of this license and intellectual
property (trademark/copyright) agreement. If you do not agree
to abide by all the terms of this agreement, you must cease
using and return or destroy all copies of Project Gutenberg™
electronic works in your possession. If you paid a fee for
obtaining a copy of or access to a Project Gutenberg™
electronic work and you do not agree to be bound by the terms
of this agreement, you may obtain a refund from the person or
entity to whom you paid the fee as set forth in paragraph 1.E.8.

1.B. “Project Gutenberg” is a registered trademark. It may only be used on or associated in any way with an electronic work by
people who agree to be bound by the terms of this agreement.
There are a few things that you can do with most Project
Gutenberg™ electronic works even without complying with the
full terms of this agreement. See paragraph 1.C below. There
are a lot of things you can do with Project Gutenberg™
electronic works if you follow the terms of this agreement and
help preserve free future access to Project Gutenberg™
electronic works. See paragraph 1.E below.
1.C. The Project Gutenberg Literary Archive Foundation (“the
Foundation” or PGLAF), owns a compilation copyright in the
collection of Project Gutenberg™ electronic works. Nearly all the
individual works in the collection are in the public domain in the
United States. If an individual work is unprotected by copyright
law in the United States and you are located in the United
States, we do not claim a right to prevent you from copying,
distributing, performing, displaying or creating derivative works
based on the work as long as all references to Project
Gutenberg are removed. Of course, we hope that you will
support the Project Gutenberg™ mission of promoting free
access to electronic works by freely sharing Project Gutenberg™
works in compliance with the terms of this agreement for
keeping the Project Gutenberg™ name associated with the
work. You can easily comply with the terms of this agreement
by keeping this work in the same format with its attached full
Project Gutenberg™ License when you share it without charge
with others.

1.D. The copyright laws of the place where you are located also
govern what you can do with this work. Copyright laws in most
countries are in a constant state of change. If you are outside
the United States, check the laws of your country in addition to
the terms of this agreement before downloading, copying,
displaying, performing, distributing or creating derivative works
based on this work or any other Project Gutenberg™ work. The
Foundation makes no representations concerning the copyright
status of any work in any country other than the United States.

1.E. Unless you have removed all references to Project Gutenberg:

1.E.1. The following sentence, with active links to, or other immediate access to, the full Project Gutenberg™ License must
appear prominently whenever any copy of a Project
Gutenberg™ work (any work on which the phrase “Project
Gutenberg” appears, or with which the phrase “Project
Gutenberg” is associated) is accessed, displayed, performed,
viewed, copied or distributed:

This eBook is for the use of anyone anywhere in the United States and most other parts of the world at no cost and
with almost no restrictions whatsoever. You may copy it,
give it away or re-use it under the terms of the Project
Gutenberg License included with this eBook or online at
www.gutenberg.org. If you are not located in the United
States, you will have to check the laws of the country
where you are located before using this eBook.

1.E.2. If an individual Project Gutenberg™ electronic work is derived from texts not protected by U.S. copyright law (does not
contain a notice indicating that it is posted with permission of
the copyright holder), the work can be copied and distributed to
anyone in the United States without paying any fees or charges.
If you are redistributing or providing access to a work with the
phrase “Project Gutenberg” associated with or appearing on the
work, you must comply either with the requirements of
paragraphs 1.E.1 through 1.E.7 or obtain permission for the use
of the work and the Project Gutenberg™ trademark as set forth
in paragraphs 1.E.8 or 1.E.9.

1.E.3. If an individual Project Gutenberg™ electronic work is posted with the permission of the copyright holder, your use and
distribution must comply with both paragraphs 1.E.1 through
1.E.7 and any additional terms imposed by the copyright holder.
Additional terms will be linked to the Project Gutenberg™
License for all works posted with the permission of the copyright
holder found at the beginning of this work.

1.E.4. Do not unlink or detach or remove the full Project Gutenberg™ License terms from this work, or any files containing a part of this work or any other work associated with Project Gutenberg™.