A Roundup of Python NLP Libraries
A collection of important Python tools for natural language processing.
-
NLTK
NLTK is a leading platform for building Python programs to work with human language data. It provides easy-to-use interfaces to lexical resources such as WordNet, along with text processing libraries for classification, tokenization, stemming, tagging, parsing, and semantic reasoning.
Website:
https://www.nltk.org/
Installation:
Install NLTK: run sudo pip install -U nltk
Install Numpy (optional): run sudo pip install -U numpy
Test installation: run python then type import nltk
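The tasks NLTK covers, such as tokenization and stemming, can be illustrated with a naive plain-Python sketch (this is not NLTK's API, only a toy stand-in for what the library does far more robustly):

```python
import re

def tokenize(text):
    # Split text into lowercase word tokens (NLTK's tokenizers are much smarter).
    return re.findall(r"[a-z]+", text.lower())

def stem(token):
    # Naive suffix stripping, a toy stand-in for NLTK's Porter stemmer.
    for suffix in ("ing", "ed", "es", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

tokens = tokenize("The parsers parsed the parsing libraries")
stems = [stem(t) for t in tokens]
print(stems)
```

With NLTK itself, the equivalent calls would be `nltk.word_tokenize` and `nltk.stem.PorterStemmer`.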
-
Pattern
Pattern provides tools for natural language processing such as part-of-speech taggers, n-gram search, sentiment analysis, and WordNet access. It also supports machine learning with a vector space model, clustering, and SVMs.
Website:
https://github.com/clips/pattern
Installation:
pip install pattern
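The vector space model Pattern supports can be sketched in plain Python: documents become term-frequency vectors, compared by cosine similarity (an illustration of the concept, not Pattern's API):

```python
import math
from collections import Counter

def vectorize(text):
    # Bag-of-words term-frequency vector, the core of a vector space model.
    return Counter(text.lower().split())

def cosine(v1, v2):
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(v1[t] * v2[t] for t in v1)
    norm1 = math.sqrt(sum(c * c for c in v1.values()))
    norm2 = math.sqrt(sum(c * c for c in v2.values()))
    return dot / (norm1 * norm2)

a = vectorize("the cat sat on the mat")
b = vectorize("the cat lay on the rug")
print(round(cosine(a, b), 2))
```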
-
TextBlob
TextBlob is a Python library for processing textual data. It provides a simple API for diving into common natural language processing tasks such as part-of-speech tagging, noun phrase extraction, sentiment analysis, classification, translation, and more.
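The sentiment analysis TextBlob exposes can be illustrated with a toy lexicon-based scorer (the mini-lexicon below is made up for illustration; TextBlob's real `sentiment` property uses a full lexicon under the hood):

```python
# Hypothetical mini-lexicon; TextBlob ships a real, much larger one.
LEXICON = {"great": 0.8, "good": 0.7, "bad": -0.7, "awful": -0.8}

def polarity(text):
    # Average the polarity of known words; 0.0 if none are known.
    scores = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

print(polarity("a great movie with a good plot"))  # positive
print(polarity("an awful script"))                 # negative
```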
Website:
https://textblob.readthedocs.io/
-
Gensim
Gensim is a Python library for topic modelling, document indexing and similarity retrieval with large corpora. It can process input larger than RAM. According to the author, it is "the most robust, efficient and hassle-free piece of software to realize unsupervised semantic modelling from plain text".
Website:
https://radimrehurek.com/gensim/
Installation:
pip install -U gensim
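Gensim's ability to process input larger than RAM rests on streamed corpora: documents are yielded one at a time instead of being loaded into a list. A minimal plain-Python sketch of the idea (not Gensim's actual classes):

```python
from collections import Counter

def stream_docs(lines):
    # Yield one tokenized document at a time; the corpus is never in memory at once.
    for line in lines:
        yield line.lower().split()

def build_frequencies(doc_stream):
    # A single pass over the stream, constant memory per document.
    freq = Counter()
    for doc in doc_stream:
        freq.update(doc)
    return freq

corpus = ["Human machine interface", "Survey of user opinion", "Human system interface"]
freq = build_frequencies(stream_docs(corpus))
print(freq["human"], freq["interface"])
```

In Gensim proper, the same pattern appears as corpus classes that implement `__iter__` and yield one document per call.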
-
PyNLPl
Python Natural Language Processing Library (pronounced: "pineapple") is a Python library for Natural Language Processing. It is a collection of various independent or loosely interdependent modules useful for common, and less common, NLP tasks. PyNLPl can be used, for example, for the computation of n-grams, frequency lists and distributions, and language models. There are also more complex data types, such as priority queues, and search algorithms, such as beam search.
Website:
https://github.com/proycon/pynlpl
Installation:
pip install pynlpl
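The n-gram and frequency-list computations PyNLPl offers can be sketched with the standard library (an illustration of the concept, not PyNLPl's API):

```python
from collections import Counter

def ngrams(tokens, n):
    # All contiguous n-grams of a token sequence.
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "the quick brown fox".split()
bigrams = ngrams(tokens, 2)
freq = Counter(bigrams)  # a frequency list over the bigrams
print(bigrams)
```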
-
spaCy
spaCy is commercial open-source software: industrial-strength NLP with Python and Cython. It is a pipeline for fast, state-of-the-art natural language processing.
Website:
https://spacy.io/
Installation:
pip install spacy
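spaCy processes text through a pipeline of components applied in sequence (tokenizer, tagger, parser, and so on). The pipeline idea itself can be sketched in plain Python (the component names here are illustrative, not spaCy's API):

```python
def lowercase(tokens):
    # Normalize tokens to lowercase.
    return [t.lower() for t in tokens]

def strip_punct(tokens):
    # Remove leading/trailing punctuation from each token.
    return [t.strip(".,!?") for t in tokens]

def run_pipeline(text, components):
    # Each component transforms the output of the previous one.
    data = text.split()
    for component in components:
        data = component(data)
    return data

print(run_pipeline("Hello, World!", [lowercase, strip_punct]))
```

In spaCy itself, calling the loaded `nlp` object on a string runs the whole pipeline and returns a processed `Doc`.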
-
Polyglot
Polyglot is a natural language pipeline that supports massive multilingual applications. It supports tokenization for 165 languages, language detection for 196 languages, named entity recognition for 40 languages, part-of-speech tagging for 16 languages, sentiment analysis for 136 languages, word embeddings for 137 languages, morphological analysis for 135 languages, and transliteration for 69 languages.
Website:
https://pypi.python.org/pypi/polyglot
Installation:
pip install polyglot
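The language detection Polyglot offers can be illustrated with a toy profile-overlap detector (the two tiny hand-made stopword profiles below are made up for illustration; real detectors such as Polyglot's use trained models over many languages):

```python
# Tiny hand-made stopword profiles; real detectors use trained models.
PROFILES = {
    "en": {"the", "and", "is", "of"},
    "de": {"der", "und", "ist", "von"},
}

def detect_language(text):
    # Pick the profile sharing the most words with the text.
    words = set(text.lower().split())
    return max(PROFILES, key=lambda lang: len(words & PROFILES[lang]))

print(detect_language("the house is big"))
print(detect_language("der hund ist gross"))
```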
-
MontyLingua
MontyLingua is a free, commonsense-enriched, end-to-end natural language understander for English. Feed raw English text into MontyLingua, and the output will be a semantic interpretation of that text. It is well suited to information retrieval and extraction, request processing, and question answering. From English sentences, it extracts subject/verb/object tuples; extracts adjectives, noun phrases and verb phrases; and extracts people's names, places, events, dates and times, and other semantic information.
Website:
http://web.media.mit.edu/~hugo/montylingua/
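For simple sentences, the kind of subject/verb/object tuple MontyLingua extracts can be sketched naively (real extraction requires tagging and parsing; this toy assumes a bare "subject verb object" word order):

```python
def svo(sentence):
    # Toy extractor: assumes strict "subject verb object ..." word order.
    words = sentence.rstrip(".").split()
    if len(words) < 3:
        return None
    return (words[0], words[1], " ".join(words[2:]))

print(svo("Dogs chase cats."))
```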
-
BLLIP Parser
BLLIP Parser (also known as the Charniak-Johnson parser) is a statistical natural language parser that includes a generative constituent parser and a discriminative maximum-entropy reranker. It provides command-line and Python interfaces.
Website:
https://github.com/BLLIP/bllip-parser
-
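A constituent parser like BLLIP outputs bracketed trees such as `(S (NP he) (VP runs))`. A minimal standard-library reader for that bracketed format (an illustration of the output form, not BLLIP's own API):

```python
def read_tree(s):
    # Parse a Penn-Treebank-style bracketed string into nested lists.
    tokens = s.replace("(", " ( ").replace(")", " ) ").split()

    def parse(i):
        if tokens[i] == "(":
            node = []
            i += 1
            while tokens[i] != ")":
                child, i = parse(i)
                node.append(child)
            return node, i + 1
        return tokens[i], i + 1

    tree, _ = parse(0)
    return tree

print(read_tree("(S (NP he) (VP runs))"))
```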
Quepy
Quepy is a python framework to transform natural language questions to queries in a database query language. It can be easily customized to different kinds of questions in natural language and database queries. So, with little coding you can build your own system for natural language access to your database.
Website:
https://github.com/machinalis/quepy
http://quepy.machinalis.com/
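The question-to-query idea behind Quepy can be sketched with a single regex rule that maps one question shape onto a SQL template (Quepy's real rules are declarative; the table and column names below are hypothetical):

```python
import re

# One hypothetical rule: "who wrote X?" -> SQL over a made-up books table.
RULE = re.compile(r"who wrote (?P<title>.+)\?", re.IGNORECASE)

def to_query(question):
    # Return a SQL string for a matching question, or None if no rule applies.
    m = RULE.match(question.strip())
    if not m:
        return None
    return f"SELECT author FROM books WHERE title = '{m.group('title')}'"

print(to_query("Who wrote Dune?"))
```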