Natural Language Processing Using Python: (With NLTK, Scikit-Learn and Stanford NLP Apis)
Why Python?
A common misconception is that a Python program compiled to bytecode cannot run unmodified on every machine (Windows, Linux, and so on). In fact, Python bytecode is cross-platform (see "Is Python bytecode version-dependent? Is it platform-dependent?" on Stack Overflow). It is, however, not compatible across interpreter versions: Python 2.6 cannot execute bytecode produced by Python 2.5. So while the bytecode is cross-platform, it is not generally useful as a distribution format.
Speed. Strict interpretation is slow, so virtually every "interpreted" language actually compiles the source code into some internal representation to avoid repeatedly parsing the same code. In Python's case, this internal representation is saved to disk (as .pyc files), so the parsing/compiling step can be skipped the next time the code is needed.
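The on-disk caching described above can be observed directly with the standard-library py_compile module, which performs the parse/compile step once and writes the resulting bytecode file. A minimal sketch (the module name and its contents are made up for illustration):

```python
import os
import py_compile
import tempfile

# Write a tiny throwaway module to a temporary directory.
src = os.path.join(tempfile.mkdtemp(), "hello.py")
with open(src, "w") as f:
    f.write("GREETING = 'hello'\n")

# Compile it explicitly: the parse/compile work happens here and the
# bytecode is cached on disk, so later imports can skip that step.
pyc_path = py_compile.compile(src)
print(pyc_path)                  # path of the cached bytecode file
print(os.path.exists(pyc_path))  # True: the cache was written
```

In normal use you never call py_compile yourself; the interpreter writes these cache files automatically the first time a module is imported.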
NER and Coreference Resolution using NLTK
DEMO – Named entity chunking using NLTK
Using Stanford CoreNLP tool for Coreference Resolution
Download and installation instructions are available at
http://stanfordnlp.github.io/CoreNLP/
Python Wrapper for CoreNLP
https://github.com/dasmith/stanford-corenlp-python
• DEMO – Coreference Resolution using Stanford CoreNLP
• Scripts: coreference_resolution.py, named_entity_chunking.py
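As an alternative to the wrapper, CoreNLP ships an HTTP server that any Python code can query directly. A hedged sketch, assuming a CoreNLP server is already running locally on the default port 9000 (started per the instructions on the download page); the example sentence and the helper function coref_chains are mine:

```python
import json
import urllib.parse
import urllib.request

# Assumes a CoreNLP server was started separately, e.g.:
#   java -mx4g -cp "*" edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000
SERVER = "http://localhost:9000"

props = {"annotators": "tokenize,ssplit,pos,lemma,ner,parse,coref",
         "outputFormat": "json"}
url = SERVER + "/?properties=" + urllib.parse.quote(json.dumps(props))

text = "Barack Obama was born in Hawaii. He was elected president in 2008."

def coref_chains(reply):
    """Collect mention texts per chain from a CoreNLP JSON reply."""
    chains = []
    for chain in reply.get("corefs", {}).values():
        mentions = [m["text"] for m in chain]
        if len(mentions) > 1:   # keep only chains that actually corefer
            chains.append(mentions)
    return chains

try:
    with urllib.request.urlopen(url, data=text.encode("utf-8"),
                                timeout=30) as resp:
        reply = json.loads(resp.read().decode("utf-8"))
    print(coref_chains(reply))   # e.g. [['Barack Obama', 'He']]
except OSError:
    print("CoreNLP server not reachable; start it first.")
```

The server returns its coreference chains under the "corefs" key of the JSON reply, one list of mentions per entity.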
WordNet
DEMO - NLTK WordNet (wordnet.py)
Finding all the synonym sets (synsets) of a word across all possible POS tags.
Finding the synsets when the POS tag is known.
Finding the hypernyms and hyponyms of a synset.
Finding the similarity between two words.