
UNIT-IV

NATURAL LANGUAGE PROCESSING (NLP)

NLP:
NLP is a field of computer science, artificial intelligence and linguistics that
studies how humans and computers interact with language.

WHAT IS NLP USED FOR?


NLP is a machine learning technology that gives computers the ability to
interpret, manipulate, and comprehend human language.

IS NLP AI or ML?
NLP is a subfield of computer science and artificial intelligence (AI) that uses
machine learning to enable computers to understand and communicate with human
language.

IS NLP THE FUTURE OF AI?


NLP stands as an advancing domain with extensive applications across diverse
industrial sectors. Its surge in popularity within these sectors can be attributed to
the exponential growth of AI-driven technology.

IS NLP A PROGRAMMING LANGUAGE?


Natural-language programming is an ontology-assisted way of programming in
terms of natural language sentences.
LAYERS OF NATURAL LANGUAGE PROCESSING(NLP).

LEXICAL ANALYSIS:
 This phase scans the text as a stream of characters and converts it
into meaningful lexemes.

 It divides the whole text into paragraphs, sentences, and words.
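The lexical phase above can be sketched with a toy tokenizer; the function name and the regular expressions are illustrative choices, not a standard API:

```python
import re

def tokenize(text):
    """Split raw text into sentences, then each sentence into word tokens."""
    # Naive sentence split on ., !, ? followed by whitespace.
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    # Word tokens: runs of letters, digits, or apostrophes.
    return [re.findall(r"[A-Za-z0-9']+", s) for s in sentences]

tokens = tokenize("The cat sat on the mat. It purred!")
print(tokens)  # [['The', 'cat', 'sat', 'on', 'the', 'mat'], ['It', 'purred']]
```

Real systems use trained tokenizers (e.g. in NLTK or spaCy), but the idea of turning a character stream into lexemes is the same.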

SYNTACTIC ANALYSIS:
 Syntactic analysis is used to check grammar and word arrangement,
and shows the relationship among the words.
SEMANTIC ANALYSIS:
 Semantic analysis is concerned with meaning
representation. It mainly focuses on the literal meaning of
words, phrases and sentences.

DISCOURSE INTEGRATION:
 Discourse integration depends upon the sentences that
precede it and also invokes the meaning of the sentences
that follow it.

PRAGMATIC ANALYSIS:
 It helps you to discover the intended effect by applying a
set of rules that characterize cooperative dialogues.

ADVANTAGES OF NLP:
 NLP helps users to ask questions about any subject and
get a direct response within seconds.

 NLP helps computers to communicate with humans in
their language.

 It is very time efficient.


DISADVANTAGES OF NLP:
 NLP is unpredictable.
 NLP may fail to capture context.
 NLP may require more keystrokes.
NLP TECHNIQUES:
NLP encompasses a wide array of techniques aimed at enabling
computers to process and understand human language.

 Text processing and preprocessing in NLP

 Tokenization: dividing text into smaller units, such as words or sentences.

 Stemming/lemmatization: reducing words to their base or root forms.

 Stop-word removal: removing common words (like “and”, “the”, “is”) that
may not carry significant meaning.
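These preprocessing steps can be chained into a small pipeline. The stemmer below is a deliberately crude suffix stripper (a sketch, not the Porter algorithm), and the stop-word list is an illustrative subset:

```python
import re

STOP_WORDS = {"and", "the", "is", "a", "an", "are", "of", "to"}

def stem(word):
    """Very crude stemmer: strip common suffixes (not a real stemming algorithm)."""
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def preprocess(text):
    tokens = re.findall(r"[a-z']+", text.lower())        # tokenization
    tokens = [t for t in tokens if t not in STOP_WORDS]  # stop-word removal
    return [stem(t) for t in tokens]                     # stemming

print(preprocess("The dogs are running and the cat is sleeping"))
# ['dog', 'runn', 'cat', 'sleep']
```

Note the over-stemming of "running" to "runn" — exactly the kind of error that motivates real stemmers and lemmatizers.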

 Syntax and parsing in NLP

 Part-of-speech tagging: assigning parts of speech to each word in a
sentence (e.g., noun, verb, adjective).

 Parsing: analyzing the grammatical structure of a sentence to identify
relationships between words.

 Chunking: breaking down a sentence into its constituent parts or phrases
(e.g., noun phrases, verb phrases).

 Semantic Analysis

 Named entity recognition: identifying and classifying entities in text,
such as names of people, organizations, locations, dates, etc.

 Word sense disambiguation: determining which meaning of a word is
used in a given context.

 Coreference resolution: identifying when different words refer to the
same entity in a text (e.g., “he” refers to “John”).

 Text classification in NLP

 Topic modeling: identifying topics or themes within a large collection
of documents.

 Spam detection: classifying text as spam or not spam.
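The spam/not-spam task can be sketched with a tiny Naive Bayes classifier over a hypothetical four-document training set (the data and labels are invented for illustration):

```python
import math
from collections import Counter

# Tiny labeled corpus (hypothetical data, for illustration only).
train = [("win cash prize now", "spam"),
         ("free prize click now", "spam"),
         ("meeting agenda for monday", "ham"),
         ("lunch with the team monday", "ham")]

counts = {"spam": Counter(), "ham": Counter()}
docs = Counter()
for text, label in train:
    docs[label] += 1
    counts[label].update(text.split())

def classify(text):
    """Multinomial Naive Bayes with add-one (Laplace) smoothing."""
    vocab = len(set(w for c in counts.values() for w in c))
    best_label, best_score = None, float("-inf")
    for label in counts:
        total = sum(counts[label].values())
        score = math.log(docs[label] / sum(docs.values()))  # class prior
        for w in text.split():
            score += math.log((counts[label][w] + 1) / (total + vocab))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(classify("free cash now"))        # spam
print(classify("team meeting monday"))  # ham
```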


 Question Answering

 Retrieval-based QA: finding and returning the most relevant text
passage in response to a query.

 Generative QA: generating an answer based on the information
available in a text corpus.

FUTURE SCOPE OF NLP

 NLP has a promising future and is expected to improve existing technologies
and make interactions with technology more natural.

 It has numerous possibilities and applications, with advancements in fields
like speech recognition, automated machine translation, sentiment analysis
and chatbots.

FUTURE ENHANCEMENTS OF NLP


 The future of NLP holds exciting possibilities across various sectors.

 Healthcare: NLP can transform patient care through improved diagnostics,
personalized treatment plans and efficient patient-doctor communication.

SYNTACTIC PROCESS IN NLP


 Syntactic processing is the process of analyzing the grammatical structure of
a sentence to understand its meaning.
 This involves identifying the different parts of speech in a sentence, such as
nouns, verbs, adjectives, adverbs and how they relate to each other in order
to give proper meaning to the sentence.

HOW SYNTACTIC PROCESSING WORKS


“The cat sat on the mat”

“cat” as a noun

“sat” as a verb

“on” as a preposition

“mat” as a noun

It would also involve understanding that “cat” is the subject of the sentence and
“mat” is the object.
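The example above can be mimicked with a toy lookup tagger and a naive subject/object rule. The lexicon and the "first noun before/after the verb" heuristic are illustrative simplifications, not how trained parsers work:

```python
# Hypothetical toy lexicon; real systems use trained POS taggers.
LEXICON = {"the": "DET", "cat": "NOUN", "sat": "VERB",
           "on": "PREP", "mat": "NOUN"}

def tag(sentence):
    return [(w, LEXICON.get(w.lower(), "UNK")) for w in sentence.split()]

def subject_object(tags):
    """First noun before the verb = subject; first noun after = object."""
    verb_idx = next(i for i, (_, t) in enumerate(tags) if t == "VERB")
    subj = next(w for w, t in tags[:verb_idx] if t == "NOUN")
    obj = next(w for w, t in tags[verb_idx:] if t == "NOUN")
    return subj, obj

tags = tag("The cat sat on the mat")
print(tags)
print(subject_object(tags))  # ('cat', 'mat')
```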

APPLICATIONS OF SYNTACTIC PROCESSING


1. Language translation:
Understanding the syntactic structure of a sentence is crucial for
accurate translation.

2. Sentiment analysis:
Identifying the relationships between words and phrases helps
determine the sentiment of a text.

3. Question answering:
Syntactic processing helps identify the relationships between
entities and actions in a text.
4. Text summarization:
Understanding the syntactic structure of a text helps identify the most
important information.

SEMANTIC ANALYSIS IN NLP


Semantic analysis is a technique used in NLP to help machines
understand the meaning of words, sentences and texts by considering the context of
the text.

EXAMPLE:

“The boy ate the apple” defines an apple as a fruit, while

“The boy went to Apple” defines Apple as a brand or store.

APPLICATIONS OF SEMANTIC ANALYSIS


1. Information Retrieval:
It improves search engines by understanding the meaning behind
user queries and document content.

2. Question Answering:
Semantic analysis helps answer complex questions by identifying
relevant information and relationships in text.

3. Text Summarization:
Semantic analysis summarizes long documents by extracting key
concepts and relationships.
4. Sentiment Analysis:
It determines the sentiment and emotional tone behind text, such
as detecting positive or negative opinions.

5. Named Entity Recognition:


Semantic analysis identifies and categorizes named entities
(people, places, organizations) in text.

6. Relationship Extraction:
Semantic analysis identifies relationships between entities, such
as "person A is a colleague of person B".

ADVANTAGES OF SEMANTIC ANALYSIS


1. Improved understanding:
Semantic analysis provides a deeper understanding of the
meaning and context of text, leading to more accurate interpretations.

2. Enhanced information retrieval:


Semantic analysis improves search results by capturing the
nuances of language and returning more relevant information.
3. Better sentiment analysis:
Semantic analysis accurately identifies sentiment and emotional
tone, helping businesses and organizations understand public opinion.

4. More accurate entity recognition:


Semantic analysis identifies and categorizes named entities,
improving data quality and facilitating data integration.

5. Increased accuracy in question answering:


It provides more accurate answers to complex questions by
understanding relationships and context.

HOW SEMANTIC ANALYSIS WORKS

ELEMENTS OF SEMANTIC ANALYSIS
 Hyponyms:

o This refers to a specific lexical entity having a relationship
with a more generic lexical entity called its hypernym.

For example: red, blue, and green are all hyponyms of color, their hypernym.

 Meronomy:

o Refers to the arrangement of words and text that
denote a minor component of something.

For example: mango is a meronym of mango tree.

 Polysemy:
o It refers to a word having more than one meaning.
However, it is represented under one entry.

For example: the term ‘dish’ is a noun. In the sentence ‘arrange the dishes
on the shelf,’ the word dishes refers to a kind of plate, while in ‘this dish is
delicious,’ it refers to the food served.

 Synonyms:

o This refers to similar-meaning words.

For example: abstract (noun) has the synonyms summary and synopsis.

 Antonyms:

o This refers to words with opposite meanings.

For example: cold has the antonyms warm and hot.


 Homonyms:

o This refers to words with the same spelling and
pronunciation but different meanings altogether.

For example: bark (of a tree) and bark (of a dog).

TASKS INVOLVED IN SEMANTIC ANALYSIS


 Word Sense Disambiguation

 Relationship extraction

WORD SENSE DISAMBIGUATION:

It refers to an automated process of determining the sense or meaning
of a word in a given context.

EXAMPLE:

‘Raspberry Pi’ can refer to a fruit, a single-board computer, or even a
company (UK-based foundation). Hence, it is critical to identify which meaning
suits the word depending on its usage.
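A classic baseline for this is Lesk-style gloss overlap: pick the sense whose definition shares the most words with the surrounding context. The sense inventory below is invented for the ‘Raspberry Pi’ example:

```python
# Hypothetical sense inventory (a simplified Lesk-style sketch).
SENSES = {
    "fruit":    {"berry", "sweet", "eat", "grow", "bush"},
    "computer": {"board", "gpio", "linux", "program", "chip"},
}

def disambiguate(context_words):
    """Pick the sense whose gloss shares the most words with the context."""
    context = set(w.lower() for w in context_words)
    return max(SENSES, key=lambda s: len(SENSES[s] & context))

print(disambiguate("I program my raspberry pi board to run linux".split()))
# computer
```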

RELATIONSHIP EXTRACTION:
It determines the semantic relationship between words in a text. Here,
relationships involve entities such as an individual's name, place, company,
designation, etc.

EXAMPLE:

Elon Musk is one of the co-founders of Tesla, which is based in Austin, Texas.

This phrase illustrates two different relationships:

Elon Musk [Person] is the co-founder of Tesla [Company]

Tesla [Company] is based in Austin, Texas [Place]
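A minimal pattern-based extractor for these two relations might look as follows. The two regular expressions are ad-hoc surface patterns written for this one sentence; production extractors use parsers or learned models:

```python
import re

# Two hypothetical surface patterns; real extractors use parsers or ML models.
PATTERNS = [
    (re.compile(r"(?P<person>[A-Z]\w+ [A-Z]\w+) is (one of )?the co-founders? of (?P<company>[A-Z]\w+)"),
     "co-founder-of"),
    (re.compile(r"(?P<company>[A-Z]\w+), which is based in (?P<place>[A-Z][\w, ]+)"),
     "based-in"),
]

def extract(text):
    triples = []
    for pattern, relation in PATTERNS:
        for m in pattern.finditer(text):
            args = {k: v for k, v in m.groupdict().items() if v}
            triples.append((relation, args))
    return triples

text = "Elon Musk is one of the co-founders of Tesla, which is based in Austin, Texas."
for t in extract(text):
    print(t)
```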

PARALLEL AND DISTRIBUTED AI PSYCHOLOGICAL MODELLING

It refers to the use of parallel and distributed computing techniques to
model human cognition and behavior. This involves:

1. Parallel processing:
Using multiple processing units to perform tasks
simultaneously, mimicking the brain's parallel processing capabilities.

2. Distributed processing:
Breaking down complex tasks into smaller sub-tasks and
distributing them across multiple processing units or agents, similar to
how cognitive tasks are distributed across different brain regions.

GOALS:

 Scalability:
Model complex cognitive phenomena that require large-scale
processing.

 Flexibility:
Accommodate individual differences and adapt to changing
environments.

 Real-time processing:
Enable real-time interaction and feedback.

APPLICATIONS:
 Cognitive architectures:
Integrate parallel and distributed processing into cognitive
models.

 Neural networks:
Use parallel and distributed computing to train and deploy
neural networks.

 Multi-agent systems:
Model social behavior and interactions using distributed AI.

 Human-computer interaction:
Develop more natural and intuitive interfaces using parallel
and distributed AI.

 Cognitive robotics:
Control and coordinate robotic behavior using distributed AI.

BENEFITS:
 Improved scalability and flexibility

 Enhanced real-time processing capabilities

 More accurate and comprehensive cognitive models

 Better human-computer interaction and human-robot interaction

 Potential applications in fields like education, healthcare, and social
sciences.

CHALLENGES:
 Complexity:
Managing and coordinating parallel and distributed processes.

 Communication:
Ensuring efficient data exchange between processing units.

 Synchronization:
Coordinating tasks and maintaining consistency across processing
units.

 Scalability:
Adapting to large-scale problems and datasets.

 Interpretability:
Understanding and explaining complex AI models and behaviors.

PARALLELISM AND DISTRIBUTION IN REASONING SYSTEMS


Parallelism and distributed processing in reasoning systems refer to
the use of multiple processing units or agents to perform reasoning tasks
simultaneously, improving efficiency and scalability.

PARALLELISM:
1. Task parallelism: Dividing a reasoning task into smaller sub-tasks and
executing them concurrently.

2. Data parallelism: Distributing data across multiple processing units and
performing the same reasoning task on each unit.

3. Pipelined parallelism: Breaking down a reasoning task into a series of
stages and executing them concurrently.
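Data parallelism, for instance, can be sketched with Python's standard `concurrent.futures` pool: the same toy "reasoning" rule is applied to separate chunks of a fact base, and the partial conclusions are merged. The rule and the data are invented for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

# Toy fact base: integers standing in for facts.
FACTS = list(range(1, 101))

def reason_over(chunk):
    """Toy reasoning task: find facts satisfying a rule (divisible by 7)."""
    return [f for f in chunk if f % 7 == 0]

# Split the data across units and run the same task on each chunk concurrently.
chunks = [FACTS[i:i + 25] for i in range(0, len(FACTS), 25)]
with ThreadPoolExecutor(max_workers=4) as pool:
    partial_results = list(pool.map(reason_over, chunks))

# Combine the partial conclusions from each unit.
conclusions = [f for part in partial_results for f in part]
print(conclusions)  # multiples of 7 up to 100
```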
DISTRIBUTED PROCESSING:
1. Decentralized reasoning: Distributing reasoning tasks across multiple
agents or nodes, each contributing to the overall solution.

2. Distributed knowledge representation: Storing knowledge across
multiple nodes, enabling efficient access and reasoning.

3. Communication and coordination: Ensuring data exchange and
synchronization between nodes.

BENEFITS:
1. Scalability: Handle large-scale reasoning tasks and knowledge bases.
2. Efficiency: Reduce processing time through concurrent execution.
3. Flexibility: Adapt to changing environments and requirements.

APPLICATIONS:
1. Artificial intelligence: Enhance reasoning capabilities in AI systems.
2. Expert systems: Improve performance and scalability in expert systems.
3. Multi-agent systems: Enable distributed reasoning and decision-making.

CHALLENGES:
1. Coordination and communication: Manage data exchange and
synchronization.

2. Consistency and coherence: Ensure consistent and coherent reasoning
results.

3. Scalability and efficiency: Balance computational resources and reasoning
performance.

LEARNING CONNECTIONIST MODELS:


 The Connectionist Model in AI is a cognitive architecture that posits that
cognitive processes can be understood in terms of the connections and
interactions between simple computational units or nodes.

 This model is inspired by the structure and function of the brain and is often
used to describe the processing of information in neural networks.

KEY FEATURES:

 Distributed Representation: Information is represented across
multiple nodes, rather than being localized in a single location.

 Parallel Processing: Multiple nodes process information
simultaneously, allowing for fast and efficient processing.

 Learning and Adaptation: Connections between nodes can be
modified based on experience, allowing the system to learn and
adapt.

 Activation and Inhibition: Nodes can be activated or inhibited
by other nodes, allowing for complex patterns of activity to emerge.
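The "learning and adaptation" feature can be demonstrated with the simplest connectionist unit, a perceptron, whose connection weights adapt with experience. This is a sketch learning the AND function, not a full network:

```python
# Minimal connectionist unit: a perceptron whose connection weights adapt
# with experience (learning the AND function).
def train_perceptron(samples, epochs=10, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0  # threshold activation
            err = target - out
            w[0] += lr * err * x1   # strengthen or weaken connections
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in AND])  # [0, 0, 0, 1]
```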

TYPES OF CONNECTIONIST MODELS:

 Feedforward Networks: Information flows only in one direction, from
input nodes to output nodes.

 Recurrent Networks: Information can flow in a loop, allowing for
feedback and recurrent processing.

 Attractor Networks: The network converges to a stable state, or
attractor, which represents the processed information.

APPLICATIONS OF CONNECTIONIST MODELS:

 Pattern Recognition: Connectionist models can learn to recognize
patterns in data, such as images or speech.

 Language Processing: Connectionist models can learn to process
and generate natural language.

 Memory and Learning: Connectionist models can learn and
remember new information.

 Decision-Making: Connectionist models can make decisions based
on patterns and associations learned from data.
BENEFITS OF CONNECTIONIST MODEL:
 Flexibility: Can be used to model a wide range of cognitive tasks and
processes.

 Scalability: Can be applied to large-scale problems and datasets.

 Biological Plausibility: Is inspired by the structure and function of the
brain.

LIMITATIONS OF CONNECTIONIST MODEL:


 Complexity: Can be difficult to understand and analyze due to the
complex interactions between nodes.

 Lack of Interpretability: Can be challenging to interpret the results of
Connectionist models.

 Training Requirements: Requires large amounts of data and
computational resources to train.

HOPFIELD NETWORKS
 Hopfield networks are a type of recurrent artificial neural network that serve
as a content-addressable ("associative") memory system with binary
threshold nodes.

 They are a simple example of a neural network that can store and recall
memories.
CHARACTERISTICS OF HOPFIELD NETWORKS:
 Recurrent: Hopfield networks have feedback connections, which
allow the network to settle into a stable state.

 Binary threshold nodes: Each node in the network has a binary
output (0 or 1) and a threshold value that determines its output.

 Symmetric weights: The weights connecting nodes are symmetric,
meaning that the weight from node A to node B is the same as the weight
from node B to node A.

HOW HOPFIELD NETWORKS WORK:

 Initialization: The network is initialized with a set of random
weights and biases.

 Training: The network is trained on a set of patterns, where each
pattern is a binary vector.

 Storage: The network stores the patterns in its weights and biases.

 Recall: When a noisy or incomplete pattern is presented to the
network, it settles into a stable state that corresponds to the closest
stored pattern.
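The store-and-recall cycle can be shown with a minimal pure-Python Hopfield network using ±1 units and Hebbian weight storage (a sketch, not an optimized implementation):

```python
# Minimal Hopfield network: +/-1 units, Hebbian storage, symmetric weights.
def train(patterns):
    n = len(patterns[0])
    # Hebbian rule: co-active units strengthen their connection; no self-loops.
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, steps=10):
    """Update units until the network settles into a stable state."""
    state = list(state)
    for _ in range(steps):
        for i in range(len(state)):
            total = sum(w[i][j] * state[j] for j in range(len(state)))
            state[i] = 1 if total >= 0 else -1
    return state

stored = [[1, 1, -1, -1], [-1, -1, 1, 1]]
w = train(stored)
noisy = [1, -1, -1, -1]   # corrupted copy of the first pattern
print(recall(w, noisy))   # [1, 1, -1, -1]
```

Presented with a one-bit-corrupted pattern, the network settles back into the closest stored memory — the content-addressable behavior described above.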
APPLICATIONS OF HOPFIELD NETWORKS:
 Associative memory: Hopfield networks can store and recall patterns,
making them useful for associative memory tasks.

 Optimization: Hopfield networks can be used to solve optimization
problems by encoding the problem into a pattern and using the network to
find the optimal solution.

 Machine learning: Hopfield networks can be used as a building block for
more complex machine learning models.

LIMITATIONS OF HOPFIELD NETWORKS:

 Capacity: Hopfield networks have a limited capacity for storing patterns.

 Convergence: The network may not always converge to a stable state.

 Sensitivity to noise: The network can be sensitive to noise in the input
patterns.

NEURAL NETWORKS:
A neural network is a machine learning model inspired by the structure and
function of the human brain. It consists of layers of interconnected nodes, or
“neurons”, which process and transmit information.

Input layer: Receives the data to be processed.

Hidden layer: Performs complex calculations and transformations.

Output layer: Generates the final prediction or result.

A NEURAL NETWORK CAN:

Learn: Adapt to new data and improve performance.

Classify: Categorize data into classes or groups.

Predict: Forecast future values or outcomes.

Generate: Create new data, like text or images.

APPLICATIONS OF NEURAL NETWORKS:


 Image recognition

 Natural language processing

 Speech recognition

 Predictive analysis.

ADVANTAGES OF NEURAL NETWORKS:


 Adaptability: Neural networks are useful for activities where the link
between inputs and outputs is complex or not well defined because they
can adapt to new situations and learn from data.

 Pattern Recognition: Their proficiency in pattern recognition renders
them efficacious in tasks such as audio and image identification, natural
language processing, and other intricate data patterns.

 Parallel Processing: Because neural networks are capable of parallel
processing by nature, they can process numerous jobs at once, which
speeds up and improves the efficiency of computations.

 Non-Linearity: Neural networks are able to model and comprehend
complicated relationships in data by virtue of the non-linear activation
functions found in neurons, which overcome the drawbacks of linear
models.

DISADVANTAGES OF NEURAL NETWORKS:

 Computational Intensity: Large neural network training can be a
laborious and computationally demanding process that demands a lot of
computing power.

 Black box Nature: As “black box” models, neural networks pose a
problem in important applications since it is difficult to understand how
they make decisions.

 Overfitting: Overfitting is a phenomenon in which neural networks
commit training material to memory rather than identifying patterns in
data. Although regularization approaches help to alleviate this, the problem
still exists.

TYPES OF NEURAL NETWORKS:


 Convolutional neural networks
 Feedforward neural networks
 Recurrent neural networks
 Generative adversarial networks
 Modular neural networks
Feedforward Neural Network:
 The feedforward neural network is one of the most basic artificial neural
networks.

 In this ANN, the data or the input provided travels in a single direction.

 It enters the ANN through the input layer and exits through the output
layer, while hidden layers may or may not exist.

 So the feedforward neural network has a front-propagated wave only and
usually does not have backpropagation.
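The one-directional flow can be seen in a tiny forward pass. The weights below are fixed illustrative numbers; a real network would learn them via backpropagation:

```python
import math

def sigmoid(x):
    """Non-linear activation squashing any value into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, hidden_w, output_w):
    # Input -> hidden: weighted sums pushed through the activation.
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs))) for ws in hidden_w]
    # Hidden -> output: data flows one direction only, with no feedback.
    return [sigmoid(sum(w * h for w, h in zip(ws, hidden))) for ws in output_w]

hidden_w = [[0.5, -0.2], [0.1, 0.8]]   # 2 inputs -> 2 hidden units
output_w = [[1.0, -1.0]]               # 2 hidden units -> 1 output unit
print(forward([1.0, 0.5], hidden_w, output_w))
```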

Convolutional Neural Network :


 A Convolutional neural network has some similarities to the feed-forward
neural network, where the connections between units have weights that
determine the influence of one unit on another unit.

 But a CNN has one or more than one convolutional layer that uses a
convolution operation on the input and then passes the result obtained in the
form of output to the next layer.

 CNN has applications in speech and image processing, which is particularly
useful in computer vision.

Modular Neural Network:


 A Modular Neural Network contains a collection of different neural
networks that work independently towards obtaining the output with no
interaction between them.

 Each of the different neural networks performs a different sub-task by
obtaining unique inputs compared to other networks.

 The advantage of this modular neural network is that it breaks down a large
and complex computational process into smaller components, thus
decreasing its complexity while still obtaining the required output.
Radial basis function Neural Network:
 Radial basis functions are those functions that consider the distance of a
point concerning the center.

 RBF networks have two layers. In the first layer, the input is mapped onto
all the radial basis functions in the hidden layer, and then the output layer
computes the output in the next step.

 Radial basis function nets are normally used to model the data that
represents any underlying trend or function.

Recurrent Neural Network:


 The Recurrent Neural Network saves the output of a layer and feeds this
output back to the input to better predict the outcome of the layer.

 The first layer in the RNN is quite similar to the feed-forward neural
network and the recurrent neural network starts once the output of the first
layer is computed.

 After this layer, each unit will remember some information from the
previous step so that it can act as a memory cell in performing computations.
