Programming Large Language Models with Azure OpenAI: Conversational programming and prompt engineering with LLMs
Francesco Esposito
Programming Large Language Models with Azure OpenAI: Conversational programming and prompt engineering with LLMs
Published with the authorization of Microsoft Corporation by: Pearson
Education, Inc.
Trademarks
Microsoft and the trademarks listed at http://www.microsoft.com on the “Trademarks” webpage are trademarks of the Microsoft group of companies.
All other marks are property of their respective owners.
Special Sales
For information about buying this title in bulk quantities, or for special sales
opportunities (which may include electronic versions; custom cover designs;
and content particular to your business, training goals, marketing focus, or
branding interests), please contact our corporate sales department at
[email protected] or (800) 382-3419.
For government sales inquiries, please contact
[email protected].
For questions about sales outside the U.S., please contact
[email protected].
Editor-in-Chief
Brett Bartow
Executive Editor
Loretta Yates
Associate Editor
Shourav Bose
Development Editor
Kate Shoup
Managing Editor
Sandra Schroeder
Copy Editor
Dan Foster
Indexer
Timothy Wright
Proofreader
Donna E. Mulder
Technical Editor
Dino Esposito
Editorial Assistant
Cindy Teeters
Cover Designer
Twist Creative, Seattle
Compositor
codeMantra
Graphics
codeMantra
Figure Credits
Figure 4.1: LangChain, Inc
Figures 7.1, 7.2, 7.4: Snowflake, Inc
Figure 8.2: SmartBear Software
Figure 8.3: Postman, Inc
Dedication
A I.
Because not dedicating a book to you would have been a sacrilege.
Contents at a Glance
Introduction
Index
Contents
Acknowledgments
Introduction
Chapter 8 Conversational UI
Overview
Scope
Tech stack
The project
Minimal API setup
OpenAPI
LLM integration
Possible extensions
Summary
Index
Acknowledgments
In the spring of 2023, when I told my dad how cool Azure OpenAI was
becoming, his reply was kind of a shock: “Why don’t you write a book about
it?” He said it so naturally that it hit me as if he really thought I could do it.
In fact, he added, “Are you up for it?” Then there was no need to say more.
Loretta Yates at Microsoft Press enthusiastically accepted my proposal, and
the story of this book began in June 2023.
AI has been a hot topic for the better part of a decade, but the emergence
of new-generation large language models (LLMs) has propelled it into the
mainstream. The increasing number of people using them translates to more
ideas, more opportunities, and new developments. And this makes all the
difference.
Hence, the book you hold in your hands can’t be the ultimate and
definitive guide to AI and LLMs because the speed at which AI and LLMs
evolve is impressive and because—by design—every book is an act of
approximation, a snapshot of knowledge taken at a specific moment in time.
Approximation inevitably leads to some form of dissatisfaction, and
dissatisfaction leads us to take on new challenges. In this regard, I wish for
myself decades of dissatisfaction. And a few more years of being on the stage
presenting books written for a prestigious publisher—it does wonders for my
ego.
First, I feel somewhat indebted to all my first dates since May because
they had to endure monologues lasting at least 30 minutes on LLMs and
some weird new approach to transformers.
True thanks are a private matter, but publicly I want to thank Martina first,
who cowrote the appendix with me and always knows what to say to make
me better. My gratitude to her is keeping a promise she knows. Thank you,
Martina, for being an extraordinary human being.
To Gianfranco, who taught me the importance of discussing and
expressing, even loudly, when something doesn’t please us, and taught me to
always ask, because the worst thing that can happen is hearing a no. Every
time I engage in a discussion, I will think of you.
I also want to thank Matteo, Luciano, Gabriele, Filippo, Daniele, Riccardo, Marco, Jacopo, Simone, Francesco, and Alessia, who worked with me and supported me during my (hopefully not too frequent) crises. I also have warm thoughts for Alessandro, Antonino, Sara, Andrea, and Cristian, who put up with me whenever we couldn’t act like the 25-year-olds we are because I had to study and work on this book.
To Mom and Michela, who put up with me before the book and probably
will continue after. To my grandmas. To Giorgio, Gaetano, Vito, and Roberto
for helping me to grow every day. To Elio, who taught me how to dress and
see myself in more colors.
As for my dad, Dino, he never stops teaching me new things—for
example, how to get paid for doing things you would just love to do, like
being the technical editor of this book. Thank you, both as a father and as an
editor. You bring to my mind a song you well know: “Figlio, figlio, figlio.”
Beyond Loretta, if this book came to life, it was also because of the hard
work of Shourav, Kate, and Dan. Thank you for your patience and for
trusting me so much.
This book is my best until the next one!
Introduction
This is my third book on artificial intelligence (AI), and the first I wrote on
my own, without the collaboration of a coauthor. The sequence in which my
three books have been published reflects my own learning path, motivated by
a genuine thirst to understand AI for far more than mere business
considerations. The first book, published in 2020, introduced the
mathematical concepts behind machine learning (ML) that make it possible to
classify data and make timely predictions. The second book, which focused
on the Microsoft ML.NET framework, was about concrete applications—in
other words, how to make fancy algorithms work effectively on amounts of data while hiding their complexity behind the charts and tables of a familiar web front end.
Then came ChatGPT.
The technology behind astonishing applications like ChatGPT is called a
large language model (LLM), and LLMs are the subject of this third book.
LLMs add a crucial capability to AI: the ability to generate content in
addition to classifying and predicting. LLMs represent a paradigm shift,
raising the bar of communication between humans and computers and
opening the floodgates to new applications that for decades we could only
dream of.
And for decades, we did dream of these applications. Literature and
movies presented various supercomputers capable of crunching any sort of
data to produce human-intelligible results. An extremely popular example
was HAL 9000—the computer that governed the spaceship Discovery in the
movie 2001: A Space Odyssey (1968). Another famous one was JARVIS
(Just A Rather Very Intelligent System), the computer that served as Tony Stark’s home assistant in Iron Man and other movies in the Marvel Comics universe.
Often, all that the human characters in such books and movies do is
simply “load data into the machine,” whether in the form of paper
documents, digital files, or media content. Next, the machine autonomously
figures out the content, learns from it, and communicates back to humans
using natural language. But of course, those supercomputers were conceived
by authors; they were only science fiction. Today, with LLMs, it is possible
to devise and build concrete applications that not only make human–
computer interaction smooth and natural, but also turn the old dream of
simply “loading data into the machine” into a dazzling reality.
This book shows you how to build software applications using the same
type of engine that fuels ChatGPT to autonomously communicate with users
and orchestrate business tasks driven by plain textual prompts. No more, no
less—and as easy and striking as it sounds!
To fully grasp the value of a programming book on LLMs, there are a couple
of prerequisites, including proficiency in foundational programming concepts
and a familiarity with ML fundamentals. Beyond these, a working knowledge
of relevant programming languages and frameworks, such as Python and
possibly ASP.NET Core, is helpful, as is an appreciation for the significance
of classic natural language processing in the context of business domains.
Overall, a blend of programming expertise, ML awareness, and linguistic
understanding is recommended for a comprehensive grasp of the book’s
content.
This book might not be for you if you’re just seeking a reference book to find
out in detail how to use a particular pattern or framework. Although the book
discusses advanced aspects of popular frameworks (for example, LangChain
and Semantic Kernel) and APIs (such as OpenAI and Azure OpenAI), it does
not qualify as a programming reference on any of these. The focus of the
book is on using LLMs to build useful applications in the business domains
where LLMs really fit well.
Stay in touch
Let’s keep the conversation going! We’re on X / Twitter:
http://twitter.com/MicrosoftPress.
Chapter 1
Luring someone into reading a book is never a small feat. If it’s a novel, you
must convince them that it’s a beautiful story, and if it’s a technical book,
you must assure them that they’ll learn something. In this case, we’ll try to
learn something.
Over the past two years, generative AI has become a prominent buzzword.
It refers to a field of artificial intelligence (AI) focused on creating systems
that can generate new, original content autonomously. Large language
models (LLMs) like GPT-3 and GPT-4 are notable examples of generative
AI, capable of producing human-like text based on given input.
The rapid adoption of LLMs is leading to a paradigm shift in
programming. This chapter discusses this shift, the reasons for it, and its
prospects. Its prospects include conversational programming, in which you
explain with words—rather than with code—what you want to achieve. This
type of programming will likely become very prevalent in the future.
No promises, though. As you’ll soon see, explaining with words what you
want to achieve is often as difficult as writing code.
This chapter covers topics that didn’t find a place elsewhere in this book.
It’s not necessary to read every section or follow a strict order. Take and read
what you find necessary or interesting. I expect you will come back to read
certain parts of this chapter after you finish the last one.
LLMs at a glance
History of LLMs
The evolution of LLMs intersects with both the history of conventional AI
(often referred to as predictive AI) and the domain of natural language
processing (NLP). NLP encompasses natural language understanding (NLU),
which attempts to reduce human speech into a structured ontology, and
natural language generation (NLG), which aims to produce text that is
understandable by humans.
LLMs are a subtype of generative AI focused on producing text based on
some kind of input, usually in the form of written text (referred to as a
prompt) but now expanding to multimodal inputs, including images, video,
and audio. At a glance, most LLMs can be seen as a very advanced form of
autocomplete, as they generate the next word. Although they specifically
generate text, LLMs do so in a manner that simulates human reasoning,
enabling them to perform a variety of intricate tasks. These tasks include
sentiment analysis, summarization, translation, entity and intent recognition,
structured information extraction, document generation, and so on.
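To give a concrete taste of how such tasks are driven purely by prompts, here is a minimal sketch that asks a model hosted on Azure OpenAI to classify the sentiment of a sentence. It assumes the official openai Python package (version 1.x); the endpoint, API key, API version, and deployment name are placeholders you would replace with your own values.

from openai import AzureOpenAI

# Placeholders: point the client at your own Azure OpenAI resource and deployment.
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-api-key>",
    api_version="2024-02-01",  # assumed API version; use the one your resource supports
)

response = client.chat.completions.create(
    model="<your-gpt-deployment>",  # the name of your deployment, not the model family
    messages=[
        {"role": "system", "content": "Classify the sentiment of the user text as positive, negative, or neutral."},
        {"role": "user", "content": "The onboarding flow was slow, but support was fantastic."},
    ],
    temperature=0,
)

print(response.choices[0].message.content)

The same pattern, changing only the instructions in the prompt, covers summarization, translation, entity extraction, and the other tasks listed above.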
LLMs represent a natural extension of the age-old human aspiration to
construct automatons (ancestors to contemporary robots) and imbue them
with a degree of reasoning and language. They can be seen as a brain for such
automatons, able to respond to an external input.
AI beginnings
Modern software—and AI as a vibrant part of it—represents the culmination
of an embryonic vision that has traversed the minds of great thinkers since
the 17th century. Various mathematicians, philosophers, and scientists, in
diverse ways and at varying levels of abstraction, envisioned a universal
language capable of mechanizing the acquisition and sharing of knowledge.
Gottfried Leibniz (1646–1716), in particular, contemplated the idea that at
least a portion of human reasoning could be mechanized.
The modern conceptualization of intelligent machinery took shape in the
mid-20th century, courtesy of renowned mathematicians Alan Turing and
Alonzo Church. Turing’s exploration of “intelligent machinery” in 1947,
coupled with his groundbreaking 1950 paper, “Computing Machinery and
Intelligence,” laid the cornerstone for the Turing test—a pivotal concept in
AI. This test challenged machines to exhibit human behavior
(indistinguishable by a human judge), ushering in the era of AI as a scientific
discipline.
Note
Considering recent advancements, a reevaluation of the original
Turing test may be warranted to incorporate a more precise
definition of human and rational behavior.
NLP
NLP is an interdisciplinary field within AI that aims to bridge the interaction
between computers and human language. While historically rooted in
linguistic approaches, distinguishing itself from the contemporary sense of
AI, NLP has perennially been a branch of AI in a broader sense. In fact, the
overarching goal has consistently been to artificially replicate an expression
of human intelligence—specifically, language.
The primary goal of NLP is to enable machines to understand, interpret,
and generate human-like language in a way that is both meaningful and
contextually relevant. This interdisciplinary field draws from linguistics,
computer science, and cognitive psychology to develop algorithms and
models that facilitate seamless interaction between humans and machines
through natural language.
The history of NLP spans several decades, evolving from rule-based
systems in the early stages to contemporary deep-learning approaches,
marking significant strides in the understanding and processing of human
language by computers.
Originating in the 1950s, early efforts, such as the Georgetown-IBM
experiment in 1954, aimed at machine translation from Russian to English,
laying the foundation for NLP. However, these initial endeavors were
primarily linguistic in nature. Subsequent decades witnessed the influence of
Chomskyan linguistics, shaping the field’s focus on syntactic and
grammatical structures.
The 1980s brought a shift toward statistical methods, such as n-grams, which use co-occurrence frequencies of words to predict the next one; rule-based approaches had struggled with the complexity of natural language. An example of this statistical turn was IBM’s Candide system for machine translation. The 1990s saw a resurgence of statistical approaches and the advent of machine learning (ML) techniques such as hidden Markov models (HMMs) and statistical language models. The introduction of the Penn Treebank, a 7-million-word dataset of part-of-speech-tagged text, and of statistical machine translation systems marked significant milestones during this period.
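To see the spirit of those early statistical methods, consider this toy bigram model. It is only an illustration (the corpus and names are invented), but it shows how co-occurrence counts alone can suggest a plausible next word.

from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()  # invented toy corpus

# Count how often each word follows each other word (bigram counts).
bigram_counts = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    bigram_counts[current_word][next_word] += 1

def predict_next(word):
    """Return the most frequent continuation observed after `word`, if any."""
    followers = bigram_counts.get(word)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("the"))  # 'cat' -- seen twice after 'the' in this corpus
print(predict_next("cat"))  # 'sat' -- ties are broken by insertion order

Real n-gram systems were enormously larger and smoothed their counts, but the principle is the same: predict the next word from how often words have been seen together.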
In the 2000s, the rise of data-driven approaches and the availability of
extensive textual data on the internet rejuvenated the field. Probabilistic
models, including maximum-entropy models and conditional random fields,
gained prominence. Begun in the 1980s but finalized years later, the development of WordNet, a lexical-semantic database of English (organized into groups of synonyms, or synsets, and the relations between them), contributed to a deeper understanding of word semantics.
The landscape transformed in the 2010s with the emergence of deep
learning made possible by a new generation of graphics processing units
(GPUs) and increased computing power. Neural network architectures—
particularly transformers like Bidirectional Encoder Representations from
Transformers (BERT) and Generative Pretrained Transformer (GPT)—
revolutionized NLP by capturing intricate language patterns and contextual
information. The focus shifted to data-driven, pretrained language models, which could then be fine-tuned for specific tasks.
LLMs
An LLM, exemplified by OpenAI’s GPT series, is a generative AI system
built on advanced deep-learning architectures like the transformer (more on
this in the appendix).
These models operate on the principle of unsupervised and self-supervised
learning, training on vast text corpora to comprehend and generate coherent
and contextually relevant text. They output sequences of text (which can be ordinary prose but also protein structures, code, SVG, JSON, XML, and so on), demonstrating a remarkable ability to continue and expand
on given prompts in a manner that emulates human language.
The architecture of these models, particularly the transformer architecture,
enables them to capture long-range dependencies and intricate patterns in
data. Word embeddings, a crucial precursor, represent words as continuous vectors (introduced by Mikolov et al. in 2013 with Word2Vec) and give the model a grasp of the semantic relationships between words. Word embeddings form the first “layer” of an LLM.
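The following sketch makes the idea of word embeddings tangible. It trains a tiny Word2Vec model with the gensim library (my choice here for illustration, not a requirement of LLMs) on a handful of invented sentences and compares words through the cosine similarity of their vectors.

from gensim.models import Word2Vec  # gensim is assumed to be installed, purely for illustration

# Invented toy corpus; real embeddings are trained on billions of words.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "dog", "chases", "the", "ball"],
    ["the", "cat", "chases", "the", "mouse"],
]

model = Word2Vec(sentences, vector_size=32, window=2, min_count=1, epochs=200, seed=42)

print(model.wv["king"][:5])                   # first components of a 32-dimensional vector
print(model.wv.similarity("king", "queen"))   # cosine similarity between two word vectors
print(model.wv.similarity("king", "ball"))    # on a corpus this small, the numbers are noisy

In an LLM, a learned embedding layer plays the same role: it turns each input token into a dense vector before the transformer layers process it.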
The generative nature of the latest models enables them to be versatile in
output, allowing for tasks such as text completion, summarization, and
creative text generation. Users can prompt the model with various queries or
partial sentences, and the model autonomously generates coherent and
contextually relevant completions, demonstrating its ability to understand and
mimic human-like language patterns.
The journey began with the introduction of word embeddings in 2013,
notably with Mikolov et al.’s Word2Vec model, revolutionizing semantic
representation. RNNs and LSTM architectures followed, addressing
challenges in sequence processing and long-range dependencies. The
transformative shift arrived with the introduction of the transformer
architecture in 2017, allowing for parallel processing and significantly
improving training times.
In 2018, Google researchers Devlin et al. introduced BERT. BERT
adopted a bidirectional context prediction approach. During pretraining,
BERT is exposed to a masked language modeling task in which a random
subset of words in a sentence is masked and the model predicts those masked
words based on both left and right context. This bidirectional training allows
BERT to capture more nuanced contextual relationships between words. This
makes it particularly effective in tasks requiring a deep understanding of
context, such as question answering and sentiment analysis.
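You can observe masked language modeling directly with the fill-mask pipeline of the Hugging Face transformers library (an assumption on tooling made purely for illustration): BERT fills in the blank using the words on both sides of it.

from transformers import pipeline  # assumes transformers and a backend such as PyTorch are installed

unmasker = pipeline("fill-mask", model="bert-base-uncased")

# BERT predicts the masked word from both left and right context.
for prediction in unmasker("The doctor prescribed a [MASK] for the infection.")[:3]:
    print(prediction["token_str"], round(prediction["score"], 3))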
During the same period, OpenAI’s GPT series marked a paradigm shift in
NLP, starting with GPT in 2018 and progressing through GPT-2 in 2019, to
GPT-3 in 2020, and GPT-3.5-turbo, GPT-4, and GPT-4 Turbo with Vision (which accepts multimodal inputs) in 2023. As autoregressive models, these predict the next token (an atomic element of natural language as it is processed by machines) or word in a sequence based on the preceding context. GPT’s autoregressive approach, predicting one token at a time, allows it to generate coherent and contextually relevant text, showcasing versatility and language understanding. These models are huge, however. GPT-3, for example, has 175 billion parameters. (Details about the size of GPT-3.5-turbo and GPT-4 had not been disclosed at the time of this writing.) The point is that these models scale and generalize, reducing the need for task-specific fine-tuning.
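Because GPT-style models reason over tokens rather than words, it helps to see how a sentence is actually split. This sketch uses OpenAI’s tiktoken library (assumed to be installed) with the cl100k_base encoding used by the GPT-3.5/GPT-4 family; the example sentence is my own.

import tiktoken  # assumed dependency, used here only to inspect tokenization

encoding = tiktoken.get_encoding("cl100k_base")

text = "Autoregressive models predict one token at a time."
token_ids = encoding.encode(text)

print(len(text.split()), "words vs.", len(token_ids), "tokens")
print([encoding.decode([token_id]) for token_id in token_ids])  # the text fragment behind each token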
Functioning basics
The core principle guiding the functionality of most LLMs is autoregressive
language modeling, wherein the model takes input text and systematically
predicts the subsequent token or word (more on the difference between these
two terms shortly) in the sequence. This token-by-token prediction process is
crucial for generating coherent and contextually relevant text. However, as
emphasized by Yann LeCun, this approach can accumulate errors; if the N-th
token is incorrect, the model may persist in assuming its correctness,
potentially leading to inaccuracies in the generated text.
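Written as code, the token-by-token loop is surprisingly short. The sketch below uses a small GPT-2 model from the transformers library (chosen only because it is small and freely downloadable, not because the book relies on it) and greedily appends the single most probable token at every step; once a wrong token slips in, it stays in the context, which is exactly the error-accumulation issue noted above.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer  # assumed dependencies

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

input_ids = tokenizer("Large language models generate text by", return_tensors="pt").input_ids

with torch.no_grad():
    for _ in range(15):                                   # generate 15 tokens, one at a time
        logits = model(input_ids).logits                  # scores over the whole vocabulary
        next_id = torch.argmax(logits[:, -1, :], dim=-1)  # greedy choice of the next token
        input_ids = torch.cat([input_ids, next_id.unsqueeze(-1)], dim=-1)

print(tokenizer.decode(input_ids[0]))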
Until 2020, fine-tuning was the predominant method for tailoring models
to specific tasks. Recent advancements, however—particularly exemplified
by larger models like GPT-3—have introduced prompt engineering. This
allows these models to achieve task-specific outcomes without conventional
fine-tuning, relying instead on precise instructions provided as prompts.
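In practice, those precise instructions often take the form of a few-shot prompt: a short task description followed by a couple of worked examples, with no gradient update involved. The prompt below is illustrative only; the wording and examples are invented and not taken from the book’s sample code.

# A few-shot prompt for entity extraction; the string would be sent to the model as-is.
few_shot_prompt = """Extract the person and the city from the sentence and answer as JSON.

Sentence: "Marco flew to Berlin for the conference."
Answer: {"person": "Marco", "city": "Berlin"}

Sentence: "After years in Toronto, Alessia moved back home."
Answer: {"person": "Alessia", "city": "Toronto"}

Sentence: "Filippo has never left Naples."
Answer:"""

print(few_shot_prompt)  # the model is expected to continue the text with the JSON answer

Swapping the instructions and the examples re-targets the same model to a different task, which is the essence of prompt engineering.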
Models such as those found in the GPT series are intricately crafted to
assimilate comprehensive knowledge about the syntax, semantics, and
underlying ontology inherent in human language corpora. While these models are proficient at capturing valuable linguistic information, they may also inherit inaccuracies and biases present in their training corpora.