Realistic Handwriting Generation Using RNN's Major Project

The document discusses using recurrent neural networks (RNNs) to generate realistic handwriting. Specifically, it proposes using long short-term memory (LSTM) RNNs, which are better able to store and access information over long periods of time compared to standard RNNs. The system would take in handwriting samples, train an LSTM model on the data, and then be able to generate new text in a style similar to the input samples. This could have applications in forensics and psychology to analyze individuals' handwriting traits.


REALISTIC HANDWRITING GENERATION USING RNNs

Presented by :
Mohammed Umar (1604-17-733-079)
Mohd Ehtesham Uddin Qureshi (1604-17-733-102)
Mohammed Abdul Wase (1604-17-733-112)

Guide: Mr. Rajesham Gajula, Asst. Prof., B.E. (CSE)
ABSTRACT
● In this ever-growing world of automated technology, people are creating
solutions that hand monotonous jobs over to computers, so that humans can
engage in more impactful, complicated and world-changing tasks.

● With respect to the above scenario, a system is proposed that will
generate custom handwriting fonts. This is made possible by using
Recurrent Neural Networks (RNNs).

● Handwriting is one of the most important biometric parameters and a
vital indicator of a person's characteristics, such as their personality.
INTRODUCTION

● Recurrent neural networks (RNNs) are a rich class of dynamic
models that have been used to generate sequences in domains as
diverse as music, text and motion-capture data.

● RNNs are ‘fuzzy’ in the sense that they do not use exact templates
from the training data to make predictions, but rather, like other
neural networks, use their internal representation to perform a
high-dimensional interpolation between training examples.

● In principle, a large enough RNN should be sufficient to generate
sequences of arbitrary complexity. In practice, however, standard
RNNs are unable to store information about past inputs for very long.

● Having a longer memory has a stabilising effect, because even if the
network cannot make sense of its recent history, it can look further
back in the past to formulate its predictions.

● We believe that a better memory is a more profound and effective
solution. Long Short-Term Memory (LSTM) is an RNN architecture
designed to be better at storing and accessing information than standard
RNNs.
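The gating idea behind LSTM can be made concrete with a minimal forward step. The sketch below is a generic single-cell LSTM update in NumPy, not code from this project; the sizes and weights are illustrative only.

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step: gates decide what to forget, store, and expose.

    W maps the concatenated [x, h_prev] to all four gate
    pre-activations at once; sizes here are illustrative.
    """
    z = np.concatenate([x, h_prev]) @ W + b
    H = h_prev.size
    i = 1 / (1 + np.exp(-z[:H]))          # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))       # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))     # output gate
    g = np.tanh(z[3*H:])                  # candidate cell values
    c = f * c_prev + i * g                # long-term memory update
    h = o * np.tanh(c)                    # exposed hidden state
    return h, c

rng = np.random.default_rng(0)
X, H = 3, 4                               # input and hidden sizes
W = rng.normal(0, 0.1, size=(X + H, 4 * H))
b = np.zeros(4 * H)
h = c = np.zeros(H)
for t in range(5):                        # run a short sequence
    h, c = lstm_step(rng.normal(size=X), h, c, W, b)
print(h.shape)  # (4,)
```

The additive cell update `c = f * c_prev + i * g` is what lets information survive many steps: the forget gate can hold `c_prev` almost unchanged, giving the longer memory described above.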
RNN Architecture
OBJECTIVE

● The main objective is to create a stable system that can generate
custom handwriting fonts that accurately reflect an individual's style.

● This system can be used in forensics departments, for drafting
official documents, and in psychology departments where handwriting
is used as a tool to understand an individual's traits.
PROBLEM STATEMENT
● Extensive use of automation is very important in today's world, so
that people can concentrate on more complicated tasks and the world
becomes a better place.

● Handwriting recognition and generation is one task that requires
automation in its entirety.

● Several indispensable departments require this automation, including
criminal, forensic and psychological departments.
EXISTING SYSTEM
● In the existing system, handwritten characters must be provided:
alphabets in upper and lower case, numbers, special symbols, etc.

● The handwritten characters are fed to the system, and a font is
generated similar to the given handwriting.

● When text is given as input, the system renders it in the
handwriting font that was created.
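The existing font-based pipeline can be reduced to a glyph lookup, which also explains the limitations noted later: every occurrence of a letter maps to the same stored shape. The glyph contents below are placeholders, not real font data.

```python
# Existing system, reduced to its essence: one fixed glyph per scanned
# character, so every occurrence of a letter renders identically.
# Glyph values are hypothetical placeholders.
glyphs = {c: f"<glyph:{c}>" for c in "abcdefghijklmnopqrstuvwxyz "}

def render(text):
    """Render text by substituting each character's stored glyph."""
    return "".join(glyphs[c] for c in text.lower())

out = render("handwriting")
print(out)
```

Because `render("aa")` always produces two identical glyphs, the output lacks the natural variation of real handwriting, which motivates the sequence-generation approach of the proposed system.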
EXISTING ARCHITECTURE
DISADVANTAGES

● The existing system gives a mechanical look to the output.

● It occasionally misinterprets letters.

● Parsing is inadequate.

● Randomisation is seen in the output.
PROPOSED SYSTEM
● Online handwriting is an attractive choice for sequence generation
due to its low dimensionality and ease of visualisation.

● The data used for this system were taken from the IAM online
handwriting database (IAM-OnDB).

● IAM-OnDB is divided into a training set, two validation sets and a
test set, containing handwritten lines taken from several writers.

● Additionally, LSTM is introduced in this system, which was not used
in the existing system.

● Text sequences are generated one character at a time.
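Generating one character at a time means sampling from the model's predicted distribution over the next character and feeding the result back in. The loop below sketches this with a seeded random stand-in for the trained LSTM's softmax output; the vocabulary and predictor are illustrative, not from the project.

```python
import numpy as np

# Toy stand-in for the trained network: the real system's LSTM would
# emit a softmax distribution over the next character; here a seeded
# random distribution plays that role so the sampling loop is clear.
VOCAB = sorted(set("handwriting "))

def next_char_probs(seq):
    # Hypothetical predictor, deterministic in the sequence so far.
    seed = sum(ord(c) for c in seq) % (2 ** 32)
    p = np.random.default_rng(seed).random(len(VOCAB))
    return p / p.sum()

def generate(seed_text, length, sample_rng):
    """Autoregressive sampling: predict, sample, append, repeat."""
    seq = list(seed_text)
    for _ in range(length):
        p = next_char_probs(seq)
        seq.append(VOCAB[sample_rng.choice(len(VOCAB), p=p)])
    return "".join(seq)

out = generate("h", 10, np.random.default_rng(42))
print(out)
```

Swapping `next_char_probs` for a trained network's output turns this same loop into the system's generation step; for stroke data the sampled item would be a pen offset rather than a character.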
Proposed System Architecture
ADVANTAGES

● It helps to achieve stable levels of generalisation.

● Synthetic training data can be created.

● Real-world writing situations can be simulated.

● Practical tasks can be accomplished.

● It gives us a way to understand the data clearly.


MODEL ARCHITECTURE

Fig 4.1 : Model Architecture


MODULES
1. Preprocess: analyses, filters and preprocesses the raw data.

2. Train: the model is trained on the preprocessed data, yielding a
trained model at the end of this process.

3. Generate: the generate module is used to synthesise handwriting; it
can also report information such as the log-loss.
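The Preprocess module's role can be illustrated with a common preparation for IAM-OnDB-style stroke data: converting absolute pen coordinates to standardised offsets. The slides do not specify the exact filtering used, so this is only a plausible sketch.

```python
import numpy as np

def preprocess(strokes):
    """Turn absolute pen coordinates into standardised offsets.

    Offsets (pen movements between consecutive samples) are a common
    representation for online handwriting; the normalisation keeps
    training numerically stable. A sketch, not the project's exact code.
    """
    offsets = np.diff(strokes, axis=0)      # movement between points
    mu = offsets.mean(axis=0)
    sigma = offsets.std(axis=0) + 1e-8      # avoid division by zero
    return (offsets - mu) / sigma

rng = np.random.default_rng(0)
strokes = np.cumsum(rng.normal(size=(100, 2)), axis=0)  # fake pen trace
norm = preprocess(strokes)
print(norm.shape)  # (99, 2)
```

The Train module would fit the LSTM to sequences like `norm`, and Generate would run the sampling loop and invert this normalisation to recover pen coordinates.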
CONCLUSION
• This project, “Realistic Handwriting Generation Using RNNs”, is a
complicated task, and mimicking a particular style is even more difficult.

• With the described model, we can achieve satisfying results. The results
depend strongly on the model's hyperparameters and biases, so tuning them
properly is necessary.

• We demonstrated the ability of Long Short-Term Memory recurrent neural
networks to generate both discrete and real-valued sequences with complex,
long-range structure using next-step prediction.

• We also introduced a novel convolutional mechanism that allows a
recurrent network to condition its predictions on an auxiliary annotation
sequence, and used this approach to synthesise diverse and realistic
samples of online handwriting.
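The slides do not detail the convolutional conditioning mechanism, but in Graves's 2013 handwriting-synthesis work it is a soft mixture-of-Gaussians window over the text being written. A sketch of that window, under the assumption this is the mechanism meant:

```python
import numpy as np

def attention_window(alpha, beta, kappa, char_onehots):
    """Mixture-of-Gaussians window over the text (after Graves, 2013).

    alpha, beta, kappa are per-component importance, width and location
    parameters; phi weights each character position, and the window is
    the weighted mix of one-hot characters the network conditions on.
    """
    U = char_onehots.shape[0]                 # annotation length
    u = np.arange(U)[None, :]                 # positions 0..U-1
    phi = (alpha[:, None] *
           np.exp(-beta[:, None] * (kappa[:, None] - u) ** 2)).sum(axis=0)
    return phi, phi @ char_onehots

# One Gaussian component centred on position 2 of a 5-character text.
phi, w = attention_window(np.array([1.0]), np.array([1.0]),
                          np.array([2.0]), np.eye(5))
print(phi.argmax())  # 2
```

Because kappa only ever increases during generation, the window slides monotonically through the text, which is what keeps the generated strokes aligned with the characters being written.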
FUTURE WORK
• This approach is not limited to handwriting data; with a few tweaks it
can be applied to any sequential data. In future, the designed model could
be applied in much more useful real-time applications.

• It would also be interesting to develop a mechanism to automatically
extract high-level annotations from sequence data. In the case of
handwriting, this could allow for more nuanced annotations than just text,
for example stylistic features, different forms of the same letter,
information about stroke order and so on.

• Another direction is to gain better insight into the internal
representation of the data and to use this to manipulate the sample
distribution directly.
BIBLIOGRAPHY
● [1] Y. Bengio, P. Simard, and P. Frasconi. Learning long-term
dependencies with gradient descent is difficult. IEEE Transactions
on Neural Networks, March 1994.

● [2] C. Bishop. Neural Networks for Pattern Recognition. Oxford
University Press, Inc., 1995.

● [3] F. Gers, N. Schraudolph, and J. Schmidhuber. Learning precise timing
with LSTM recurrent networks. Journal of Machine Learning
Research, 2002.

● [4] A. Graves, A. Mohamed, and G. Hinton. Speech recognition
with deep recurrent neural networks. In Proc. ICASSP, 2013.

● [5] T. Mikolov. Statistical Language Models based on Neural Networks.
PhD thesis, Brno University of Technology, 2012.

● [6] A. Graves. Sequence transduction with recurrent neural networks. In
ICML Representation Learning Workshop, 2012.

● [7] A. Graves. Practical variational inference for neural networks. In
Advances in Neural Information Processing Systems, volume 24, pages
2348-2356, 2011.

● [8] A. Graves and J. Schmidhuber. Framewise phoneme classification with
bidirectional LSTM and other neural network architectures. Neural
Networks, 18:602-610, 2005.

● [9] A. Graves and J. Schmidhuber. Offline handwriting recognition with
multidimensional recurrent neural networks. In Advances in Neural
Information Processing Systems, volume 21, 2008.

● [10] C. Bishop. Neural Networks for Pattern Recognition. Oxford
University Press, Inc., 1995.
