SnowNLP is a Python library, developed by Chinese authors, that makes it easy to process Chinese text. It was inspired by TextBlob. Since most natural language processing libraries target English, the author wrote a library geared toward Chinese; unlike TextBlob, it does not depend on NLTK. All of its algorithms are implemented from scratch, and it ships with pre-trained dictionaries. Note that the library works with unicode strings, so decode your input to unicode yourself before use. It is released under the MIT license.
Its GitHub homepage:
I modified the Python code from the link above and added some comments to make it easier to follow:
from snownlp import SnowNLP

# SnowNLP attributes:
# words:           word segmentation
# tags:            part-of-speech tags
# sentiments:      sentiment score
# pinyin:          pinyin transcription
# keywords(limit): keywords
# summary:         key sentences
# sentences:       sentence splitting
# tf:              term frequency
# idf:             inverse document frequency

s = SnowNLP(u'这个东西真心很赞')
print(s.words)       # [u'这个', u'东西', u'真心', u'很', u'赞']
s.tags               # [(u'这个', u'r'), (u'东西', u'n'), (u'真心', u'd'),
                     #  (u'很', u'd'), (u'赞', u'Vg')]
print(s.sentiments)  # 0.9769663402895832 -- probability of being positive
# s.pinyin           # [u'zhe', u'ge', u'dong', u'xi',
#                    #  u'zhen', u'xin', u'hen', u'zan']

s = SnowNLP(u'「繁體字」「繁體中文」的叫法在臺灣亦很常見。')
print(s.han)         # u'「繁体字」「繁体中文」的叫法在台湾亦很常见。'
from snownlp import SnowNLP

text = u'''自然语言处理是计算机科学领域与人工智能领域中的一个重要方向。
它研究能实现人与计算机之间用自然语言进行有效通信的各种理论和方法。
自然语言处理是一门融语言学、计算机科学、数学于一体的科学。
因此,这一领域的研究将涉及自然语言,即人们日常使用的语言,
所以它与语言学的研究有着密切的联系,但又有重要的区别。
自然语言处理并不是一般地研究自然语言,
而在于研制能有效地实现自然语言通信的计算机系统,
特别是其中的软件系统。因而它是计算机科学的一部分。'''

s = SnowNLP(text)
print(s.keywords(6))  # [u'语言', u'自然', u'计算机', ...]
                      # note: tags cannot be used to output keywords
s.summary(3)          # [u'因而它是计算机科学的一部分',
                      #  u'自然语言处理是一门融语言学、计算机科学、数学于一体的科学',
                      #  u'自然语言处理是计算机科学领域与人工智能领域中的一个重要方向']
# print(s.sentences)
print(s.sentiments)   # 1.0

s = SnowNLP([[u'这篇', u'文章'], [u'那篇', u'论文'], [u'这个']])
# print(s.tf)
# print(s.idf)
# print(s.sim([u'文章']))  # [0.3756070762985226, 0, 0]
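The tf, idf, and sim attributes in the last snippet come from standard information-retrieval ideas (snownlp's sim is, as far as I know, BM25-based). Here is a rough hand-rolled illustration of term frequency and inverse document frequency over the same toy corpus; this uses the classic textbook formulas, not snownlp's exact internal weighting:

```python
import math

# Toy corpus: three already-segmented "documents", mirroring the
# SnowNLP([[...], [...], [...]]) example above.
docs = [[u'这篇', u'文章'], [u'那篇', u'论文'], [u'这个']]

# Term frequency: how often each word appears in each document.
tf = [{w: doc.count(w) for w in doc} for doc in docs]

# Inverse document frequency: rarer words get higher weight.
# Classic idf = log(N / df); snownlp's internal formula differs.
N = len(docs)
df = {}
for doc in docs:
    for w in set(doc):
        df[w] = df.get(w, 0) + 1
idf = {w: math.log(N / df[w]) for w in df}

print(tf[0])        # {u'这篇': 1, u'文章': 1}
print(idf[u'文章'])  # log(3/1), i.e. the word appears in only one document
```

Words that occur in every document would get idf = log(1) = 0, which is why common function words contribute nothing to similarity scores.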
Before running the code, you must first install the snownlp package, followed by pandas and matplotlib (which provides the pylab plotting interface used below):
Enter the following in the VS Code terminal (View -> Integrated Terminal):
pip install snownlp
pip install matplotlib
pip install pandas
The prerequisite is that you have pip installed. If pip is not installed, you can check my previous article.
In VS Code, we can right-click a module name and go to its definition to view the module's implementation. I have to say that VS Code is very powerful; I hope Microsoft keeps going in this direction, toward open source and cross-platform!
Then I grabbed some Douban reviews of "Good Will Hunting" at random and put them in a txt file:
In fact, in most cases, the mainland translation is more flavorful than the Hong Kong translation.
It is not ur fault!
I only saw this movie occasionally on TV. It was really touching when I watched it. Why could such a genius have such a tortuous life?
I think the script is very good but it was not fully filmed :) I still have some doubts about the actors’ performances~ Haha
Good review
I just watched it a few days ago, a movie that touches my heart, I’m looking for it Real Life
This movie review is very well written, my eyes are moist
Very good film
The last step is the processing script:
from snownlp import SnowNLP
import pandas as pd
import pylab as pl

with open('F:/_analyse_Emotion.txt', encoding='utf-8') as txt:
    text = txt.readlines()
print('读入成功')  # "read in successfully"

sentences = []
senti_score = []
for i in text:
    a1 = SnowNLP(i)
    a2 = a1.sentiments
    sentences.append(i)     # keep the review text in order
    senti_score.append(a2)  # sentiment score for each review
print('doing')

table = pd.DataFrame(sentences, senti_score)  # scores used as the index
# table.to_excel('F:/_analyse_Emotion.xlsx', sheet_name='Sheet1')
# ts = pd.Series(sentences, senti_score)
# ts = ts.cumsum()
# print(table)

# One x position per review; range() instead of a hardcoded [1, ..., 8]
# keeps the plot working if the number of reviews in the file changes.
x = range(1, len(senti_score) + 1)
pl.mpl.rcParams['font.sans-serif'] = ['SimHei']  # make Chinese labels render
pl.plot(x, senti_score)
pl.title(u'心 灵 捕 手 网 评')  # "Good Will Hunting online reviews"
pl.xlabel(u'评 论 用 户')      # "reviewer"
pl.ylabel(u'情 感 程 度')      # "sentiment score"
pl.show()
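Since s.sentiments returns the probability that a text is positive, a natural follow-up to the script above is to bucket the scores into coarse labels. A minimal sketch; the 0.5 threshold and the helper name label are my own choices, not part of snownlp:

```python
def label(score, threshold=0.5):
    """Map a snownlp-style sentiment probability to a coarse label."""
    return 'positive' if score >= threshold else 'negative'

# Example scores, shaped like the senti_score list collected above
# (the values here are made up for illustration).
scores = [0.97, 0.12, 0.85, 0.40]
labels = [label(s) for s in scores]
print(labels)  # ['positive', 'negative', 'positive', 'negative']

# Share of positive reviews in this (hypothetical) batch.
print(sum(l == 'positive' for l in labels) / len(labels))  # 0.5
```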
The final result:
The results may be a little inaccurate, and I did pull the data casually, but snownlp still claims its sentiment analysis is very accurate!
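One likely reason for the inaccuracy: snownlp's default sentiment model is, to my knowledge, a naive Bayes classifier trained on product-review data, so movie reviews are out of domain. The idea behind such a classifier can be sketched in a few lines; this is an illustration of the technique with an invented toy corpus, not snownlp's actual code:

```python
import math

# Tiny labeled corpus of segmented "reviews" (invented for illustration).
pos_docs = [[u'很', u'赞'], [u'真心', u'好']]
neg_docs = [[u'很', u'差'], [u'不', u'好']]

def train(docs):
    """Count word frequencies for one class."""
    counts = {}
    for doc in docs:
        for w in doc:
            counts[w] = counts.get(w, 0) + 1
    return counts

pos, neg = train(pos_docs), train(neg_docs)
vocab = set(pos) | set(neg)

def sentiment(words):
    """P(positive | words) via naive Bayes with add-one smoothing."""
    logp = {'pos': 0.0, 'neg': 0.0}
    for cls, counts in (('pos', pos), ('neg', neg)):
        total = sum(counts.values()) + len(vocab)
        for w in words:
            logp[cls] += math.log((counts.get(w, 0) + 1) / total)
    # Convert the two log-likelihoods back to a probability.
    return 1 / (1 + math.exp(logp['neg'] - logp['pos']))

print(sentiment([u'真心', u'赞']))  # 0.8 -- both words seen only in positive docs
```

If I recall correctly, snownlp's README also shows how to retrain its sentiment model on your own positive/negative corpus, which should help a lot for movie reviews.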