Lecture Notes in Electrical Engineering 579
9th International
Workshop on
Spoken Dialogue
System
Technology
Lecture Notes in Electrical Engineering
Volume 579
Series Editors
Leopoldo Angrisani, Department of Electrical and Information Technologies Engineering, University of Napoli
Federico II, Naples, Italy
Marco Arteaga, Departament de Control y Robótica, Universidad Nacional Autónoma de México, Coyoacán,
Mexico
Bijaya Ketan Panigrahi, Electrical Engineering, Indian Institute of Technology Delhi, New Delhi, Delhi, India
Samarjit Chakraborty, Fakultät für Elektrotechnik und Informationstechnik, TU München, Munich, Germany
Jiming Chen, Zhejiang University, Hangzhou, Zhejiang, China
Shanben Chen, Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai, China
Tan Kay Chen, Department of Electrical and Computer Engineering, National University of Singapore,
Singapore, Singapore
Rüdiger Dillmann, Humanoids and Intelligent Systems Lab, Karlsruhe Institute for Technology, Karlsruhe,
Baden-Württemberg, Germany
Haibin Duan, Beijing University of Aeronautics and Astronautics, Beijing, China
Gianluigi Ferrari, Università di Parma, Parma, Italy
Manuel Ferre, Centre for Automation and Robotics CAR (UPM-CSIC), Universidad Politécnica de Madrid,
Madrid, Spain
Sandra Hirche, Department of Electrical Engineering and Information Science, Technische Universität
München, Munich, Germany
Faryar Jabbari, Department of Mechanical and Aerospace Engineering, University of California, Irvine, CA,
USA
Limin Jia, State Key Laboratory of Rail Traffic Control and Safety, Beijing Jiaotong University, Beijing, China
Janusz Kacprzyk, Systems Research Institute, Polish Academy of Sciences, Warsaw, Poland
Alaa Khamis, German University in Egypt El Tagamoa El Khames, New Cairo City, Egypt
Torsten Kroeger, Stanford University, Stanford, CA, USA
Qilian Liang, Department of Electrical Engineering, University of Texas at Arlington, Arlington, TX, USA
Ferran Martin, Departament d’Enginyeria Electrònica, Universitat Autònoma de Barcelona, Bellaterra,
Barcelona, Spain
Tan Cher Ming, College of Engineering, Nanyang Technological University, Singapore, Singapore
Wolfgang Minker, Institute of Information Technology, University of Ulm, Ulm, Germany
Pradeep Misra, Department of Electrical Engineering, Wright State University, Dayton, OH, USA
Sebastian Möller, Quality and Usability Lab, TU Berlin, Berlin, Germany
Subhas Mukhopadhyay, School of Engineering & Advanced Technology, Massey University, Palmerston
North, Manawatu-Wanganui, New Zealand
Cun-Zheng Ning, Electrical Engineering, Arizona State University, Tempe, AZ, USA
Toyoaki Nishida, Graduate School of Informatics, Kyoto University, Kyoto, Japan
Federica Pascucci, Dipartimento di Ingegneria, Università degli Studi “Roma Tre”, Rome, Italy
Yong Qin, State Key Laboratory of Rail Traffic Control and Safety, Beijing Jiaotong University, Beijing, China
Gan Woon Seng, School of Electrical & Electronic Engineering, Nanyang Technological University,
Singapore, Singapore
Joachim Speidel, Institute of Telecommunications, Universität Stuttgart, Stuttgart, Baden-Württemberg,
Germany
Germano Veiga, Campus da FEUP, INESC Porto, Porto, Portugal
Haitao Wu, Academy of Opto-electronics, Chinese Academy of Sciences, Beijing, China
Junjie James Zhang, Charlotte, NC, USA
The book series Lecture Notes in Electrical Engineering (LNEE) publishes the latest developments in Electrical Engineering, quickly, informally, and in high quality. While original research reported in proceedings and monographs has traditionally formed the core of LNEE, we also encourage authors to submit books devoted to supporting student education and professional training in the various fields and application areas of electrical engineering. The series covers classical and emerging topics in electrical engineering.
** Indexing: The books of this series are submitted to ISI Proceedings, EI-Compendex, SCOPUS, MetaPress, Web of Science and Springerlink **
Editors
Luis Fernando D’Haro, Universidad Politécnica de Madrid, Madrid, Spain
Rafael E. Banchs, Nanyang Technological University, Singapore, Singapore
Haizhou Li, Department of Electrical and Computer Engineering, National University of Singapore, Singapore, Singapore
This Springer imprint is published by the registered company Springer Nature Singapore Pte Ltd.
The registered company address is: 152 Beach Road, #21-01/04 Gateway East, Singapore 189721,
Singapore
Program Committee

Preface
The 9th International Workshop on Spoken Dialog Systems (IWSDS’18) was held on April 18–20, 2018, in Singapore, making it the southernmost IWSDS ever, just one degree north of the Equator! The conference allowed participants to keep track of the state of the art in spoken dialogue systems while enjoying the year-round summer paradise island that is Singapore.
The IWSDS conference series brings together, on a yearly basis, international
researchers working in the field of spoken dialogue systems and associated tech-
nologies. It provides an international forum for the presentation of current research,
applications, technological challenges, and discussions among researchers and
industrialists. The IWSDS’18 edition built on the success of the eight previous editions:
• IWSDS’09 (Irsee, Germany),
• IWSDS’10 (Gotemba Kogen Resort, Japan),
• IWSDS’11 (Granada, Spain),
• IWSDS’12 (Paris, France),
• IWSDS’14 (Napa, USA),
• IWSDS’15 (Busan, Korea),
• IWSDS’16 (Saariselkä, Finland), and
• IWSDS’17 (Farmington, PA, USA).
The IWSDS’18 conference theme was “Towards creating more human-like conversational agent technologies”, inviting and receiving paper submissions on the following topics:
• Engagement and emotion in human–robot interactions.
• Digital resources for interactive applications.
• Multi-modal and machine learning methods.
• Companions, personal assistants, and dialogue systems.
• Proactive and anticipatory interactions.
• Educational and healthcare robot applications.
• Dialogue systems and reasoning.
The WOCHAT workshop and shared task was organized by Prof. Bayan Abu Shawar, Arab Open University (Jordan), Prof. Luis Fernando D’Haro, Universidad Politécnica de Madrid (Spain), and Prof. Zhou Yu, University of California, Davis (USA). This was the fifth event of a “Workshop
and Special Session Series on Chatbots and Conversational Agents”. WOCHAT
aims at bringing together researchers working on problems related to chat-oriented
dialogue with the objective of promoting discussion and knowledge sharing about
the state-of-the-art and approaches in this field, as well as coordinating a collabo-
rative effort to collect/generate data, resources, and evaluation protocols for future
research in this area. The WOCHAT series also accommodated a Shared Task on
Data Collection and Annotation for generating resources that can be made publicly
available to the rest of the research community for further research and experi-
mentation. In this shared task, human–machine dialogues are generated by using
different online and offline chat engines, and annotations are generated following
some basic provided guidelines.
IWSDS’18 received a total of 52 submissions, where each submission was
reviewed by at least two program committee members. The committee decided to
accept a total of 37 papers: 13 long papers, 6 short papers, 4 demo papers, 4 papers
for the Empathic session, 7 papers for the WOCHAT session, 2 papers for the
Humic session, and 1 invited paper.
Finally, we would like to take this opportunity to thank the IWSDS Steering
Committee and the members of the IWSDS’18 Scientific Committee for their
timely and efficient contributions and for completing the review process on time. In
addition, we would like to express our gratitude to the members of the Local Committee, who contributed greatly to the success of the workshop, making it an unforgettable experience for all participants. Last, but not least, we also want to thank our sponsors, the Special Interest Group on Discourse and Dialogue (SIGDIAL) and the Chinese and Oriental Languages Information Processing Society (COLIPS), for their financial and logistic support; without it, we could not have held such a remarkable conference.
Contents
End-to-End Systems
An End-to-End Goal-Oriented Dialog System with a Generative
Natural Language Response Generation . . . . . . . . . . . . . . . . . . . . . . . . 209
Stefan Constantin, Jan Niehues and Alex Waibel
Enabling Spoken Dialogue Systems for Low-Resourced
Languages—End-to-End Dialect Recognition for North Sami . . . . . . . . 221
Trung Ngo Trong, Kristiina Jokinen and Ville Hautamäki
Attention Based Joint Model with Negative Sampling for New Slot Values Recognition

M. Hou et al.

1 Introduction
Task-oriented dialogue systems, which have been widely used in a variety of applications, are designed to accomplish a specific task through natural language interactions. One of their most important components is Natural Language Understanding (NLU), which aims at collecting information related to the task.
Semantic frames are commonly applied in NLU [11], each of which contains different slots. One of the goals of NLU is to fill in the slots with values extracted from the user utterances. In previous work, sequence labeling models have usually been used for slot value recognition. For example, Tur et al. [10] used a Conditional Random Field (CRF) with domain-specific features for the task. With the success of deep neural networks, Yao et al. [14] proposed an RNN model with named entities (NER) as features. They also used Long Short-Term Memory (LSTM) [13] and other deeper models. Ma et al. [4] combined a Convolutional Neural Network (CNN), an LSTM, and a CRF in a hierarchical way: features extracted by the CNN are fed to an LSTM, and a CRF at the top level labels the slot values.
Nevertheless, labeling the slot values alone is not enough in some applications. The slot values labeled in utterances should be normalized to standard values for database search. For example, in a restaurant booking system, there are standard values of the slot ‘food’ such as ‘Asian oriented’. If a user asked for a restaurant which serves ‘pan Asian’ food, the system should normalize ‘pan Asian’ in the utterance into the standard value ‘Asian oriented’ in the database. There have been two different ways of addressing this problem. One is two-stage methods. Lefèvre [3] proposed a 2+1 model: a generative model consisting of two parts, a semantic prior model and a lexicalization model, determines the best semantic structure, treating the normalized slot values as hidden variables. Yeh [15] employed fuzzy matching in the Apache Solr system for the normalization. Two-stage methods are either prone to accumulating errors or too complicated to compute. The
other way is to directly map an utterance to one of the standard values instead of identifying the values in the raw text. Many classifiers have been used for building such mappings. Bhagat et al. [1] tried several models, including a vote model, Maximum Entropy, and Support Vector Machines (SVM). Mairesse et al. [5] proposed a two-step method: a binary classifier first determines whether a slot appears in the utterance, and a series of classifiers then maps the utterance to the standard values of that slot. Mota et al. [7] built separate classifiers for different slots.
However, there is an important problem in the above classification-based methods. These models fail to deal with the situation where a slot value outside the standard value set is mentioned in an utterance. Such a value should not be classified into any existing standard value; it should be recognized as a new value. To our knowledge, there is no research on this problem in classification-based NLU.
The problem might be thought of as a type of zero-shot problem, as in word sense disambiguation or text classification. But there is a significant difference between new slot values and other zero-shot problems. The sense of a new word might be very different from that of other known words, but a new slot value is still a value of the same slot: it should share some important similarities with other known slot values. That is the starting point for us to construct training samples for unknown new values. We first distinguish two different types of samples for the standard values of a specific slot S. Utterances including any known standard value of S, or a variant of one, are positive samples; the others are negative. We further divide the negative samples into two types: the first are negative samples of S itself, i.e., samples including values of other slots or no value of any slot, and the second are negative samples of the known standard values of S. The latter can therefore be used, together with positive samples of the standard values of S, to build a classifier that identifies whether an utterance includes a known standard value or a new value of S. This paper proposes a negative-sampling-based method to construct samples of the latter type.
Meanwhile, sequence labeling is able to locate slot values in the original utterances even if they are unseen in the standard value set, and the slot values themselves are important information for classification. The paper therefore proposes a joint model of sequence labeling and classification with an attention mechanism, which focuses on important information automatically while taking advantage of the raw text. Sequence labeling here aims at slot-value detection, and classification is used to obtain the standard values directly.
Overall, we propose an attention based joint model with negative sampling. Our contributions in this work are two-fold: (1) negative sampling over the existing values of a slot S enables our model to effectively recognize new slot values; (2) the joint model, coordinated by the attention mechanism, improves performance. We evaluate our work on a public dataset, DSTC, and a dataset, Service, from an enterprise. The results demonstrate that our model achieves impressive improvements on new slot values with little damage on the other sub-datasets. The F1 score on new slot values rises to 0.8621 on DSTC and 0.7759 on Service.
This paper is organized as follows: Sect. 2 details our attention based joint model with negative sampling. We explain the experimental settings in Sect. 3, then evaluate and analyse our model in Sect. 4. Sect. 5 concludes our work.
We assume that slots are independent of each other so they can be handled separately. A vocabulary of values for slot S is defined as R_S = S_old ∪ {NEW, NULL}, where S_old = {s_0, s_1, ..., s_k} refers to the set of standard values for which there is labeled data in the training set. NEW refers to a new slot value; it is assigned to an utterance providing a value of slot S outside S_old. NULL refers to no value in the utterance. For a user input x_i, the aim of the model is to map x_i to one of the values in R_S. Since there is no training data for a new slot value (if we had training samples for a value, it would belong to S_old), classification-based models trained on the dataset are unable to address the problem, while sequence taggers need another step to normalize the labels.
We describe our attention based joint model, followed by the negative sampling
methods.
A sequence tagger and a classifier complement each other: a sequence tagger recognizes the units of a slot value in an utterance, while a classifier maps an utterance as a whole to a slot value. To benefit from both, we combine them into a joint model.
Specifically, we adopt the bi-directional LSTM [2] as the basic structure. The output at each timestep is used to predict a slot tag through a softmax over a linear layer, as shown in Eq. 1:

ŝ_t = softmax(W_s h_t + b_s)   (1)

Here h_t = (→h_t, ←h_t) refers to the hidden state at time t, obtained by concatenating the hidden states of the forward and backward directions. In each direction of the LSTM, e.g. the forward one, the hidden state →h_t is a function of the current input and the inner memory, as defined in Eq. 2:

→h_t = f(→h_{t−1}, w_t, →C_{t−1})   (2)

where w_t denotes the input word at time t and →C_{t−1} is the previous cell state. We compute the function f using the LSTM cell architecture in [16]; the backward direction is analogous.

The hidden state of the last timestep T is used to output the class label according to Eq. 3:

ŷ = softmax(W_c h_T + b_c)   (3)
H = Σ_{t=1}^{T} α_t v_t   (4)

where v_t = (e_t, h_t) concatenates the word embedding and the LSTM hidden state at time t, and α_t is defined in Eq. 5.
Fig. 1 The attention based joint model combines sequence tagging and classification, and adopts an attention mechanism for further improvement. The legend in the right corner shows the meaning of the operations
α_t = exp(q_t) / Σ_{k=1}^{T} exp(q_k)   (5)
Our model computes q_t with an align function (Eq. 6), in the same way as [9].
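A minimal NumPy sketch of Eqs. (4) and (5), assuming the scores q_t of the align function (Eq. 6, not reproduced here) have already been computed; the function name and shapes are illustrative:

```python
import numpy as np

def attention_pool(embeddings, hidden_states, q):
    """Eqs. (4)-(5): softmax the scores q over time, then take the
    weighted sum of v_t = (e_t, h_t).
    Shapes: embeddings (T, d_e), hidden_states (T, d_h), q (T,)."""
    v = np.concatenate([embeddings, hidden_states], axis=1)  # (T, d_e + d_h)
    alpha = np.exp(q - q.max())          # numerically stable softmax
    alpha = alpha / alpha.sum()          # Eq. (5)
    return alpha @ v                     # Eq. (4): H has shape (d_e + d_h,)

# toy usage with random inputs
T, d_e, d_h = 5, 4, 3
H = attention_pool(np.random.rand(T, d_e), np.random.rand(T, d_h), np.random.rand(T))
assert H.shape == (d_e + d_h,)
```

Concatenating the raw embeddings e_t into v_t is what lets the pooled vector H retain surface-word information alongside the LSTM states.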
All parameters are learned simultaneously to minimize a joint loss function (Eq. 8), i.e., the weighted sum of the two losses for sequence tagging and classification, where the classification loss is

L_classification = (1/N) Σ_{i=1}^{N} L(ŷ_i, y_i)   (10)
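The joint objective can be sketched as follows; the weight `lam` between the two loss terms and the tensor shapes are assumptions for illustration, since the full form of Eq. 8 is not reproduced in this excerpt:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(logits, gold):
    """Mean negative log-likelihood over samples, as in Eq. 10."""
    p = softmax(logits)
    return -np.mean(np.log(p[np.arange(len(gold)), gold]))

def joint_loss(tag_logits, tag_gold, cls_logits, cls_gold, lam=0.5):
    """Weighted sum of the tagging and classification losses (Eq. 8);
    `lam` is an illustrative hyperparameter, not a value from the paper."""
    n_tags = tag_logits.shape[-1]
    # tagging loss averages over every timestep of every utterance
    l_tag = cross_entropy(tag_logits.reshape(-1, n_tags), tag_gold.reshape(-1))
    l_cls = cross_entropy(cls_logits, cls_gold)
    return lam * l_tag + (1 - lam) * l_cls
```

Training both heads against one combined loss is what lets the tagger's span evidence and the classifier's utterance-level decision regularize each other.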
As mentioned before, the model fails to recognize new slot values without training data for them. If we regard all samples of new slot values of a specific slot as negative samples of the existing ones, the construction of samples for new slot values reduces to the construction of negative samples of the old ones.

As mentioned in Sect. 1, a new slot value is still a value of the same slot and should share some important similarities with other known slot values. We assume these similarities are hidden in the contexts of the value, i.e., the contexts are shared among different values of the same slot. It is therefore possible to construct a negative sample by replacing the slot value in a positive sample with a non-value. But there are many choices for the non-value; how do we choose a proper one?
Mikolov et al. [6] used negative sampling in the CBOW and Skip-gram models. They investigated a number of choices for the distribution of negative samples and found that the unigram distribution U(word) raised to the 3/4 power (i.e., U(word)^{3/4}/Z) significantly outperformed the unigram and uniform distributions, where Z is the normalization constant and U(word) is the word frequency, in other words U(word) = count(word)/|Data|. We use the same method but leave the word frequency alone. In our work a negative sample is a complete slot value that may consist of several words, unlike the single-word negative samples in [6]. We therefore repeat the sampling until a segment of the same length as the existing value is formed. Figure 2 shows the construction of a negative example for the Service dataset.
Fig. 2 Negative sampling for the Service dataset. The lower part is a translation of the example
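The construction described above can be sketched as follows; the sampler follows the U(word)^{3/4} heuristic of [6], and the function names and example sentence are illustrative, not from the paper:

```python
import random
from collections import Counter

def build_sampler(corpus_tokens):
    """Word sampler whose weights follow the U(word)^(3/4) heuristic of [6]."""
    counts = Counter(corpus_tokens)
    words = list(counts)
    weights = [counts[w] ** 0.75 for w in words]
    return lambda k: random.choices(words, weights=weights, k=k)

def make_negative(tokens, value_span, sample):
    """Replace the slot-value span [start, end) with sampled words of the
    same length, keeping the surrounding context intact."""
    start, end = value_span
    return tokens[:start] + sample(end - start) + tokens[end:]

sample = build_sampler("i want some pan asian food please".split())
pos = "i want some pan asian food".split()      # 'pan asian' is the slot value
neg = make_negative(pos, (3, 5), sample)        # a negative sample for the value
assert len(neg) == len(pos) and neg[:3] == pos[:3] and neg[5:] == pos[5:]
```

Because only the value span changes, the negative sample keeps exactly the contexts that different values of the same slot share.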
3 Experiment Settings
3.1 Dataset
We evaluate our model on two datasets: the Dialogue State Tracking Challenge (DSTC) and a dataset from an after-sale service dialogue system of an enterprise (Service). DSTC is an English dataset from a public contest [12]; we use DSTC2 and DSTC3 together. It collects 5510 dialogues about hotel and restaurant booking. Each utterance in the dialogues gives the standard slot values, from which slot tags can be assigned to the word sequence. Based on the independence assumption, we build a dataset for each slot: we keep all B- and I- tags of that slot's labels and reset the rest to ‘O’. However, we found that not all slots are suitable for the task, since some slots have too few value types. We therefore use only the dataset for the slot ‘food’ in our experiments.
Service is a Chinese dialogue dataset, mainly about consultations for cell phones, with a single slot named ‘function’. It has both sequence tags and slot values for each utterance.
We divide both datasets into training, dev, and test sets, and then add negative samples to the training set for both. All utterances corresponding to infrequent slot values in the training set are moved to the test set to form the corpus of new slot values; these values thus have no samples in the training data. Table 1 shows the statistics of the final experimental data, and Table 2 describes the diversity of slot values.
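The split described above can be sketched as follows; the frequency threshold `min_count` is a hypothetical choice, not a value reported in the paper:

```python
from collections import Counter

def split_by_value_frequency(samples, min_count=5):
    """Utterances whose slot value is infrequent are all moved to the test
    set, so those values have zero training samples and act as NEW values
    at test time. `samples` is a list of (utterance, value) pairs."""
    counts = Counter(v for _, v in samples)
    train = [(u, v) for u, v in samples if counts[v] >= min_count]
    new_value_test = [(u, v) for u, v in samples if counts[v] < min_count]
    return train, new_value_test

data = [("utt%d" % i, "thai") for i in range(2)] + \
       [("utt%d" % i, "chinese") for i in range(6)]
train, test = split_by_value_frequency(data)
assert all(v == "chinese" for _, v in train)
assert all(v == "thai" for _, v in test)
```

This construction guarantees a clean separation: no test-time NEW value ever appears in training, which is the condition the classification baselines cannot satisfy.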
with

ω_i = n_{s_i} / n,   F1_{s_i} = 2 · P_{s_i} · R_{s_i} / (P_{s_i} + R_{s_i})   (12)

where n refers to the size of the test set and n_{s_i} denotes the size of class s_i. P and R are the precision and recall scores defined in [8].

We also evaluate on the sub-dataset of old values by Eq. 13:

F1_old = Σ_{i=0}^{k} ω_i^{old} F1_{s_i}   (13)

where ω_i^{old} = n_{s_i} / n_old.
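The weighted F1 of Eqs. (12) and (13) can be computed as in this sketch; the input format (per-class counts with precision and recall) is an assumption:

```python
def weighted_f1(per_class):
    """per_class: {label: (n_i, precision, recall)}.
    Eqs. (12)-(13): per-class F1 scores combined with weights n_i / n."""
    n = sum(c[0] for c in per_class.values())
    total = 0.0
    for n_i, p, r in per_class.values():
        f1 = 2 * p * r / (p + r) if (p + r) else 0.0   # Eq. (12)
        total += (n_i / n) * f1                         # weighted sum
    return total

score = weighted_f1({"NEW": (50, 0.9, 0.8), "NULL": (50, 0.7, 0.7)})
assert abs(score - 0.5 * (2 * 0.9 * 0.8 / 1.7) - 0.5 * 0.7) < 1e-9
```

Restricting `per_class` to the old-value classes and normalizing by n_old gives F1_old of Eq. (13).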
For sequence tagging we also use the F1 score as the criterion, calculated by running the official script conlleval.pl1 of the CoNLL conference.
3.3 Baselines
There are no previous models or experimental results reported specifically on new slot value recognition. We compare our model to two existing types of NLU methods for the task.

(1) The pipeline method: labeling the words with slot-value tags first and then normalizing them into standard values. Here, a bi-directional LSTM, the same as the one in our model, is used for tagging, and fuzzy matching2 is then used to normalize the extracted tags, as in [15]. This model is denoted LSTM_FM.

(2) The classification method: classifying the utterance into standard values directly. A bi-directional LSTM is used to encode the user input, and a fully-connected layer is then used for the classification. This model is denoted LSTM_C.
1 https://www.clips.uantwerpen.be/conll2000/chunking/output.html.
2 http://chairnerd.seatgeek.com/fuzzywuzzy-fuzzy-string-matching-in-python.
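A rough sketch of the LSTM_FM normalization stage; the stdlib difflib stands in here for the fuzzywuzzy library cited in the footnote, and the standard values and threshold are hypothetical:

```python
import difflib

STANDARD_VALUES = ["asian oriented", "italian", "chinese"]  # illustrative

def fuzzy_normalize(extracted, threshold=0.6):
    """Second stage of the pipeline baseline: map an extracted slot string
    to the most similar standard value, or declare it NEW below threshold."""
    best = max(STANDARD_VALUES,
               key=lambda s: difflib.SequenceMatcher(None, extracted, s).ratio())
    ratio = difflib.SequenceMatcher(None, extracted, best).ratio()
    return best if ratio >= threshold else "NEW"

assert fuzzy_normalize("chinese food") == "chinese"
```

Surface similarity is exactly where this baseline is brittle: a genuine variant like ‘pan Asian’ shares few characters with ‘Asian oriented’ and can fall below any reasonable threshold, which is the failure mode the paper's joint model targets.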
3.4 Hyperparameters

4.1 Results

We evaluate our model on the two datasets described in Sect. 3.1. Our model outputs both the classification result for an utterance and the labeled tags within it. Tables 3 and 4 show the results of classification and labeling respectively.
As shown in Table 3, our model significantly outperforms both baseline models on the classification task. It achieves 13.44% and 16.17% improvements over the LSTM_FM and LSTM_C models on the DSTC dataset, and 8.55% and 5.85% improvements on the Service dataset. In particular, it shows a big advantage on new slot value recognition, where the F1 scores rise by at least 20% on both DSTC and Service.
Similar to its performance on classification, our model also achieves the best results in slot value tagging, as shown in Table 4. It performs significantly better than the pipeline method, especially for new values. We also give the tagging results of LSTM_FM trained with the negative samples used in our model (denoted LSTM_FM_NS in Table 4). We find that negative samples help significantly with NEW slot values, but they hurt the performance on old values. We give more details of the negative samples and the attention mechanism in our model and the baseline models in the next subsection.
4.2 Analyses
To analyze our model, we compare it to a variant without the attention mechanism and a variant without the negative samples. We refer to the former as JM_NS and the latter as AJM.
From Table 5 we find that the variant without negative samples (AJM) fails at new slot value recognition, which shows that negative sampling is the key to success on new slot values: the negative samples are what enable the model to distinguish old from new slot values. For more detail, the changes in the confusion matrices are shown in Tables 6 and 7. The left part of ‘⇒’ in each table is the confusion matrix of the model without negative samples (AJM), and the right part is from the original full model (AJM_NS). With training on negative samples, the classification results related to the NEW value improve significantly, while the other classes change little, i.e., negative samples bring little damage to the other classes.
We also add the same negative samples when training the other models. The results in Table 8 show that LSTM_C_NS (LSTM_C with negative samples) now achieves good performance in recognizing new slot values. As for LSTM_FM_NS, the F1 score drops considerably for old values while rising for new slot values. This shows that, although negative samples still work, they damage the other classes significantly
Fig. 3 Comparison between the full model (AJM_NS) and the one without the attention mechanism (JM_NS). The heatmap for the full model visualizes the weights of different words; a deeper color means a larger weight
in the pipeline model. We can also see that our model AJM_NS still beats the other models on the whole dataset even when all of them use negative samples.
When we remove the attention mechanism (JM_NS), the model is slightly inferior to the full one (AJM_NS), i.e., the attention mechanism further improves performance by focusing on important subsequences. Since it also introduces the original word embeddings, it corrects some mistakes of the model without attention (JM_NS), in which the final label is wrongly classified even with correct sequence tags. We visualize a sample of the attention weights in Fig. 3.
5 Conclusion
This paper proposes an attention based joint model with negative sampling for recognizing new slot values. The model combines a sequence tagger with a classifier through an attention mechanism, and negative sampling is used to construct negative samples for training. Experimental results on two datasets show that our model outperforms previous methods: the negative samples contribute to new slot value identification, and the attention mechanism further improves performance. In future work we may try different methods of negative sampling, such as introducing prior knowledge, to further improve performance. The scenario of multiple slots in one utterance will also be explored, as it occurs frequently in practice.
References
1. Bhagat R, Leuski A, Hovy E (2005) Statistical shallow semantic parsing despite little training
data. In: Proceedings of the Ninth international workshop on parsing technology, pp 186–187.
Association for Computational Linguistics
2. Graves A, Jaitly N, Mohamed AR (2013) Hybrid speech recognition with deep bidirec-
tional LSTM. In: 2013 IEEE Workshop on Automatic Speech Recognition and Understanding
(ASRU), pp 273–278. IEEE
3. Lefèvre F (2007) Dynamic Bayesian networks and discriminative classifiers for multi-stage
semantic interpretation. In: IEEE International Conference on Acoustics, Speech and Signal
Processing, ICASSP 2007, vol 4, pp IV–13. IEEE
4. Ma X, Hovy E (2016) End-to-end sequence labeling via bi-directional LSTM-CNNS-CRF.
arXiv:1603.01354
5. Mairesse F, Gasic M, Jurcícek F, Keizer S, Thomson B, Yu K, Young S (2009) Spoken lan-
guage understanding from unaligned data using discriminative classification models. In: IEEE
International Conference on Acoustics, Speech and Signal Processing, 2009. ICASSP 2009,
pp 4749–4752. IEEE
6. Mikolov T, Sutskever I, Chen K, Corrado GS, Dean J (2013) Distributed representations of words and phrases and their compositionality. In: Advances in neural information processing systems, pp 3111–3119
7. Mota P, Coheur L (2012) Natural language understanding as a classification process: report of initial experiments and results. In: INForum
8. Perry JW, Kent A, Berry MM (1955) Machine literature searching X. Machine language; factors underlying its design and development. J Assoc Inf Sci Technol 6(4):242–254
9. Seo M, Kembhavi A, Farhadi A, Hajishirzi H (2016) Bidirectional attention flow for machine
comprehension. arXiv:1611.01603
10. Tur G, Hakkani-Tür D, Heck L, Parthasarathy S (2011) Sentence simplification for spoken
language understanding. In: 2011 IEEE International Conference on Acoustics, Speech and
Signal Processing (ICASSP), pp 5628–5631. IEEE
11. Wang YY, Deng L, Acero A (2011) Semantic frame-based spoken language understanding.
Spoken language understanding: systems for extracting semantic information from speech, pp
41–91
12. Williams J, Raux A, Ramachandran D, Black A (2013) The dialog state tracking challenge. In:
Proceedings of the SIGDIAL 2013 conference, pp 404–413
13. Yao K, Peng B, Zhang Y, Yu D, Zweig G, Shi Y (2014) Spoken language understanding
using long short-term memory neural networks. In: 2014 IEEE Spoken Language Technology
Workshop (SLT), pp 189–194. IEEE
14. Yao K, Zweig G, Hwang MY, Shi Y, Yu D (2013) Recurrent neural networks for language
understanding. In: Interspeech, pp 2524–2528
ever wider circles, we mounted with ecstasy into the higher reaches. Lake
Espantoso, with its border of great oaks, lay below us like a bar of silver; and
the Master’s house stood like a sentinel beside the white hives which, row on
row, spread beneath us in the sun.
“That prominent knoll,” said Crip, “is a thing to remember, if you are returning
late and flying low. And remember, too, that in that window of the Master’s
house a lantern burns. This may sometimes be a guide. But, mark you, never
fly into it, though you may be tempted. Better still, get in before it is too dark.
Just there by that row of hives is a tree to remember. It is a glory in the spring
with its yellow flowers, until the cutting ants get it. They clip off the leaves and
blossoms. But it is an excellent land-mark, nevertheless. And there’s the
Master,” went on Crip, “and the Little One, and that horrid dog. That little boy
sits by for hours while the great one labors with some of us. The horrid dog
sleeps—I’d like to sting him. Things will go wrong—the Master sets them to
rights. He seems to know everything; and yet, when he took away some of our
honey, in spite of our having vast stores of it, we fought him. The little he took
harmed us not at all, and I suppose we fight him because our brothers have
done so for centuries. But I talk too much.”
After a rather long flight, and much interesting converse, we reached our
door again. Crip’s experience with the guard was still fresh in his mind, for he
clung closely to me for protection. But the guard this time passed him without a
word. He had acquired the scent and the note of the hive, and henceforth his
life and all the energies of his body would be merged with that of the colony.
CHAPTER SEVEN
Crip, the Wise
When we had returned to our cell we halted, and for a season remained
quiet. Indeed, we slept a tiny bit, as much as ever a bee can sleep at a stretch,
and then we fell into meditation. Among other things, I was wondering what the
Queen-Mother was doing when she popped her long, thin body into each cell
as she made her rounds. I could not understand and so I called on Crip to
explain.
“Why, laying eggs!” he said, right sharply, as though annoyed at my
ignorance.
“Well, what are eggs?” for I was still no wiser.
“Come with me,” he said, and off we went across the combs.
He did not stop until he reached the very spot where we had seen the
Queen. The odor of her was still strong thereabouts, but she had gone.
“Now look, stupid!” Crip said. “At the bottom of each of the cells in this
section of comb is an egg.”
I looked down into one and, sure enough, a small, thin, yellowish-white egg
was stuck squarely in the center of it. I looked into several other cells, and
each had its one egg.
I shall never forget the story which he went on to unfold. The wonderful cycle
from egg to larva, from larva to bee, he explained in fascinating detail. I saw at
once that he was a real sage, that his knowledge was boundless, and then to
crown it he told me that even the Queen-Mother herself had sprung from an
ordinary egg, having been converted through miracle into a queen ruling over
this empire. Simply by feeding and tending them differently—only the bees in
their wisdom know how—the egg which might develop into a worker or a
drone, passing through a metamorphosis, can be made to break from the dark
cover of the cell the personification of life eternal, as exemplified in the body
and the life of the Queen.
I could not quite understand all these things, but I felt sure Crip was telling
the truth; and indeed I began to look up to him with increasing admiration and
wonder on account of the worlds of things he knew.
We were silent awhile. There rose again for me the night hymn of the hive. It
penetrated me as not before; it had a new significance, a new message—I had
been visited with a revelation. The sight I had gained of the Queen-Mother
woke new and tremulous emotions within me—there was a new meaning in
life.
Crip stirred rather sharply, breaking my train of thought.
“What’s the matter?” I queried.
“I’m tired holding on. We must get another place to rest. You see, with only
five legs the load of my body grows heavy.”
With that we moved up the comb to the top of it, and there he spread himself
out with a little hum of content. And just then I developed a curiosity to know
how he had lost his leg.
“You miss your leg, but do you suffer pain on account of it? And how did it
happen?”
“That’s a short story. I was coming home late one day, well laden with honey,
when, without warning, one of those terrible black bee-hawks darted for me
and clutched me, sailing away to the nearest bush. He had quickly rolled me
up with his powerful legs and almost by the time he lit he was ready to kill me
with one thrust of his proboscis. Of course I had struggled, but when one of
those fellows gets his claws on you it’s good-by. I had about ceased to struggle
when suddenly there came a tremendous shock, and the next moment I was
rolling on the ground and shaking myself free from the mutilated hawk. He had
been torn to pieces by some mysterious force, and my leg, my bread-basket
leg, was gone. At that moment the Master approached me; in his hands he
held a long black thing which I had seen emit fire on other occasions, and
somehow I suspected at once he had saved me. The little boy came hurriedly
up, stooped over me and helped release me, and in a moment I was circling
round to get my bearings. The little boy and the Master—and even the dog—
watched my movements with an expression of satisfaction on their faces. I flew
straightway home and was thankful still to be alive.”
“Tell me more about this Master,” I begged, for I was now growing vastly
interested in his activities and in those of the Little One, and even the dog
which once I tried to sting, because he came so close to our hive.
“Some say he is good—some say that he is bad. I only know him as the
chopper of weeds about our home and as my rescuer. Many times since the
day he saved me have I heard him shooting bee-hawks. Indeed, I had heard
the little thunder of his gun before that day, but I did not understand its
meaning. They say, too, that he takes away our honey—and he did take some
of ours once—and frightens us nearly to death with the prospect of starvation.
And they fall upon him and sting him, trying to drive him away. But all this is
useless, they report, since he comes armed with fire and smoke.
“Others tell of him that in the dark, cold days, if provisions run low, he brings
honey and closes the door against blizzards. But I know nothing of this. I have
not lived through a winter and I fear I shall never know what it means.”
Thus I became infinitely interested in the Master who passed from day to
day about the yard. But I was confused in mind about him. Somehow I
instinctively feared him and I always found myself ready to attack him, as I
explained to Crip.
“There would be no use in that,” answered he. “Should you sting him, you
would achieve nothing. Instead, you would lose your life.”
“How is that?” I cried, for I did not till then know I had a life—at least I had
never thought of it.
“You can sting once, but unless you escape with your stinger, which is rare,
your life is sacrificed.”
I seemed to know this and answered him nothing.
“Is it not a strange fatality,” he continued, “that we should be given stingers
with which to defend ourselves and our homes, and yet, when we make use of
them, we lose our lives! Still, we are always ready to strike, with no thought of
death.”
“What is death?” I asked of Crip.
“I don’t know, except that once when the bee-hawk caught me I felt myself
going away. It grew dark and I heard the hum of wings that were strange and
wonderful. Somehow you go to sleep and forget.”
“I have thought of death,” he went on. “I am old and battered, my days are
as the falling flowers when the frost is upon them, and the frost soon will fall.”
I waited awhile in silence, but he spoke no more. Soon he lay in that buzzing
hive, asleep, and I was not long in following him to where the golden honey
dripped in the garden of dreams.
CHAPTER EIGHT
A Gleaner of Honey
We awakened about the same time and began to stir about. The first thing
that happened was a new experience—the wax-pickers fell upon me and raked
and scraped me for the tiny bits of wax which now, on account of my voracious
appetite, had begun to grow in each of the rings marking the under sections of
my body. They were so rude that at first I was inclined to resent their
interference, which seemed to be mere meddling. But when I looked at Crip
and saw two busy wax-pickers fumbling over him, I began to understand that
this was part of a routine, and so I stood still until they had finished.
“They won’t bother with me much longer,” said Crip, sadly. “You see, when
one becomes old the wax grows thinly—so the pickers give over. But you!
They’ll get you. I have noticed that you are rather greedy about eating honey.
This means you’ll get fat and produce lots of wax.”
“Tell me about wax and comb,” I begged of him.
“Comb, my child, is made of wax; this is comb on which you are standing. It
is everywhere about you. The cups that hold our honey and our bread are
made of it. The cell in which you were born is of wax; and, besides, it is used to
stop the holes in our house. Of course there are different kinds of comb,
depending on the use to which it is put. Why, these sheets of comb with their
six-sided cells are wonderful in their economy, in their plan and symmetry. The
cell we build is perfect. No other structure would serve our purposes,
combining such strength and capacity. The cell is indispensable to the life of
the bee!—otherwise he could not exist. So don’t let me see you make ready to
fight the next time the wax-pickers approach, and they’ll soon be after you
again.”
I answered nothing. I was wondering in what far age we had learned to build
the six-sided cell, and in what tiny brain it had been conceived. They fit so
perfectly, I stood quite still marveling at the harmony of it all and wondering
how many things there remained for me to learn. At every turn I had been
confronted with something new. And was it to be so to the end? What could the
end be, of which Crip frequently spoke?
“How old are you?” I asked.
“Two months—glorious with flowers, but ending in disaster.”
“What disaster?”
“Well, you saw the close of it—the death of our colony.”
“Yes, I remember,” I said. But he was so wise I could scarcely believe that he
was but two months old, for he seemed so tattered of wing and battered of
body!
Without thinking what we were about, we drew near the door. Groups of
workers were banked about the entrance, waiting impatiently to be away at the
first streaks of dawn. Presently a note like a bugle-call sounded, and
immediately the face of things was changed. By twos and threes and fours the
workers took wing and scurried into the fields.
A dull gray light lay on the world; the air was damp and moved lazily out of
the east; the dew had fallen thick on the flowers and now began to twinkle from
myriad angles. Crip and I had left the hive at the same instant, but once on the
wing I forgot all about him and flew like mad this way and that until I caught a
whiff of fragrance from an unexplored meadow, and thither I hastened. Strange
and thrilling sensation! I had not until now felt the joy of dipping into the flowers
and searching out their honey-pots. It was a field of late sunflowers, and all of
them had their faces toward the east, eager to look upon the sun. Joyfully they
waved in the breeze and beckoned to one another as if to say: “Good morning.
How glorious is the sun, our king!” In spite of the dew on their faces, some of
them already were wearing the brand of the hot summer, which had all but
gone and left them beseeching of autumn her tender graces.
“I am old and frayed,” I heard one say, “and these mornings chill me, but my
work is done. The heart and soul of me are here; I shall not pass; I shall
endure; my seed shall spring up to brighten the world.”
“But I am young,” a tender blossom said, “and I shall be cut off. The frost will
slay me and I shall have rattled down to dust ere my soul has developed its
immortal parts.”
At the moment I was taking honey from its lips, and I felt a quivering as if its
heart fluttered.
“Dear little flower,” I said, “you are living your life; you cannot die; you will be
swallowed up in the universal spirit of things. Your face has spread a glamour
of gold in the world; your honey has nourished a thousand winged things; your
scented breath has floated far and has carried blessings into silent places.
Memory of you will linger; it will be preserved by the things you have fed, by
the things you have gladdened. And, too, I promise that I shall remember you!”
“How can you remember me,” the flower asked, “when you, too, are
doomed?”
“What!” I cried. “Doomed! Why, I am young, I am swift, I am beautiful, I am
glorious!”
“Yes, and so am I. But we pass.”
“You are wise for so young a flower,” spoke up the elder blossom. “Both of
you are of the heavens; both have your lives before you in this tiny garden, ere
you return to the golden fields that spread out toward the sun. You are
immortal.”
Just then I saw one of the petals blow away from the face of the elder flower.
It fluttered and fluttered and finally fell to the earth. Scarcely had it struck the
ground when something with a long, thin body and active legs seized it and
began struggling to draw it through the grass, intent on some mysterious
purpose. I was quite absorbed, and from my post of vantage on the breast of
the floweret I followed the movements of the thing that tugged at the petal. I
had never seen this thing before and I was wishing for Crip, when, behold! he
appeared.
“What are you doing?” he cried at me. “How many loads have you gathered?
What are you staring at?”
He had submerged me with questions. I answered none of them. I had,
indeed, forgotten my work momentarily, so absorbed had I been in the talk of
the flowers.
“Have you a load? Let’s go,” cried he.
I was ready, truly, but I could not refrain from asking him about this strange
animal that pulled the leaf so sedulously through the grass.
“An ant!” Crip answered, rather glumly.
“Do you see what he is about?”
“Yes, he is gathering his winter stores. A time comes when he must go
indoors, and he must have food even as you and I. Come now, let’s be off.”
I looked down at the ant struggling with his burden and then at the
disheveled flower, casting a last glance at the tender face which had yielded up
honey to me, wondering at the strangeness of it all.
“Come on,” cried Crip, rising on wing.
I did not speak, but followed him. I flew at his heels until he began to fag a
bit and then I came up alongside, careful, however, not to outdistance him. I
soon saw that he had a heavier load than I, and I felt ashamed, but I knew this
had come through my having wasted a few minutes, and I resolved then and
there that the next time I should be first.
Another thing I noticed: we were flying very low, so near the earth we almost
brushed the tops of the bushes. I asked Crip the reason.
“The wind,” he answered, in better humor than could have been expected.
“Don’t you feel that heavy head current? If you should go up it would be a hard
fight home with these loads. You see, there are currents and currents,” he went
on, “and you must use your wits. Take the current that blows your way. Profit
by whatever nature bestows.”
Almost at once I saw the yard with its white hives, like dots, and the Master
with the Little One and the dog that seemed always with them. The next
moment Crip and I were dropping down to our hive. I was overjoyed when I fell
upon the alighting-board, and could not restrain my exuberance of feeling. So I
bowed my head humbly as best I might with the load I carried, uttering a hymn
of thanksgiving—the very hymn, Crip told me, that every worker for a million
years had uttered on returning to his hive with his first load of honey. I cannot
explain, but some mysterious force seized me, compelling me to bow my head
and to sing. I should have done it had it cost my life. Such is the law of the
hive, just as there is the law of the jungle. I did not know why I was so happy,
but something bubbled over in me, and the very intoxication of it finally sent me
running madly to deposit my load in a waiting cell, and once more to take wing
for the field of the flowers of the sun.
CHAPTER NINE
A Storm
On my way back the first rays of light caught the topmost branches of the
trees and gilded the flying clouds in the east. Far in the west, black and
forbidding masses of cloud were gathering, and the wind, I observed, had
shifted its course. Again I had lost Crip, and I was regretful, for there were
questions which only he could answer. But I flew all the faster for being alone,
and soon found the very place and the very flowers I had visited before.
Speedily I took my load, but I could not fail to return to the flowers I had come
to love. Other petals from the elder had fluttered away, due either to the eager
foraging of bees or to the gusty impatience of the wind. The younger had
opened wider her heart to the sun.
“I’ve been waiting for you,” she said, sweetly. “All that I have I yield up to you
gladly. This is my end. Oh, how glorious is life! How splendid to be able to give
of one’s store so that life shall go on eternally!”
“Yes, eternally,” echoed the elder blossom. “Even I, in dying, leave my seed
behind to follow the summer suns through numberless ages; and I breathe into
the world an imperishable fragrance. It shall be wafted to the utmost bounds; it
shall gladden the hearts of the lowliest. Though it be scattered by the winds, it
shall not cease to exist.”
By this time I had filled my honey-sac, and, after flying three times around
these two well-beloved blossoms, I made for home. I was depressed by the
talk which I had heard. I could not wholly comprehend it, and I wanted to
consult Crip.
I was not long reaching our hive, for the wind seemed to get under me and
literally to blow me on. I deposited my treasure, hurried out again, and once
more headed for the sunflower-field, where I quickly gathered a load. Then
straight for home. It was difficult flying now, because the wind was in my face. I
rose higher, following Crip’s advice, but still it blew and almost beat me back.
The black clouds which I remembered having seen in the west seemed almost
over me, and suddenly terrific noises crashed around. It grew dark and great
flashes of fire tore the heavens apart and blinded me.
This terrified me. I knew not its meaning, but instinctively I fled homeward.
But my progress was slow, and I had not gone far when again the whole world
seemed to tremble, shaken through and through by the most violent rumblings
conceivable. It grew so dark I almost stopped in my flight, not sure of my way.
At this moment of hesitation something struck me squarely in the back, almost
knocking me down. It had been a great drop of water, and almost immediately
others began to pelt me. Soaking wet and tossed by the gale, I was forced to
alight. As I dropped downward I saw nothing but black shadows, and presently
I was dashed into a great tree. I seized a branch that offered shelter, which
proved to be none too well protected against the blast that now drove the rain
in solid sheets. I was cold, and clambered around to the under side of the limb,
and there, feeling none too secure, I grudgingly deposited some of my honey
in a crevice. By lightening my load I was better able to keep my balance; but so
gusty was the blast that it whipped the rain all over me, and I was unable to
find a spot that was dry. I began to climb from one branch to another in the
hope of reaching a safer haven, but, alas! none was to be found.
Worse things, too, were awaiting me. I was crying for Crip when the branch
to which I clung suddenly snapped. Down and down it fell while I clung to it. I
was too cold and wet to try to take wing, and presently the branch crashed into
a swirling stream of water. At first I was entirely submerged. It seemed an
interminable time that I stayed under the water; but presently I came to the
surface and caught my breath. Cold as I was, I still clung with all the tenacity of
my being to the floating branch that was hurried onward by the raging torrent. I
was beginning to feel a little more comfortable when over went the branch
again in the seething water, and again I seemed to go down to immeasurable
depths. This time I felt my legs giving way in the rush of the waters. My head
swam and I strangled, but just as it seemed all over with me the branch again
came to the surface. I caught my breath, shifted slightly my footing, and
hurriedly emptied my honey-sac. This gave me more confidence in spite of the
numbness that had nearly overcome me from the cold and water. There I sat
shaking, awaiting the next turn of the branch, which now seemed merely to be
bobbing up and down in the waters. The wind was still whistling through the
trees, the rain was falling in torrents, and the thunder rumbled in unabated
violence.
How long I clung to the branch in desperation I do not know. But after a time
the rain ceased, the wind fell to a whimper among the bushes, and the
darkness broke along the horizon. It began to grow a little brighter. Imagine my
joy, therefore, to find that my perch was now quite clear of the flood waters, the
branch safely nestling in the top of a bush. In a short space it grew warmer,
and I took courage; I began to dry myself and to preen my wings. The light
gained, and before long, after trying out my strength, I found that I could again
mount into the air, and with one wide sweep I made for home.