python - Chinese word-segmentation search problem with Flask-whooshsqlalchemyplus
伊谢尔伦 2017-04-18 10:06:25

All replies (2)
刘奇

If you are using a PostgreSQL database, check whether its encoding is UTF-8. You can view the database information with \l in the database shell:

postgres=# \l
                                  List of databases
   Name    |  Owner   | Encoding  |   Collate   |    Ctype    |   Access privileges   
-----------+----------+-----------+-------------+-------------+-----------------------
 db1       | owner    | UTF8      | en_US.UTF-8 | en_US.UTF-8 | =Tc/owner            +
           |          |           |             |             | owner=CTc/owner
 db2       | owner    | SQL_ASCII | C           | C           | =Tc/owner            +
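
If you would rather run this check from the application side, here is a minimal sketch (not from the reply itself) that asks PostgreSQL for its server encoding through SQLAlchemy; the connection URL is a placeholder you need to replace with your own.

from sqlalchemy import create_engine, text

# Placeholder URL: point this at your own database.
engine = create_engine("postgresql://owner:password@localhost/db1")

with engine.connect() as conn:
    # SHOW server_encoding reports the encoding the database was created with,
    # e.g. 'UTF8' or 'SQL_ASCII'.
    encoding = conn.execute(text("SHOW server_encoding")).scalar()
    print(encoding)  # a database that should handle Chinese search ought to report 'UTF8'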

You can also check whether Chinese search works directly in the database shell with the following SQL:

SELECT to_tsvector('我们') @@ to_tsquery('我:*');

The db1 above is UTF-8, so it supports Chinese search:

postgres=# \c db1
db1=#
db1=# SELECT to_tsvector('我们') @@ to_tsquery('我:*');
 ?column? 
----------
 t
(1 row)

db1=#

db2 is SQL_ASCII, so it does not support Chinese search:

db1=# \c db2
db2=#
db2=# SELECT to_tsvector('我们') @@ to_tsquery('我:*');
NOTICE:  text-search query contains only stop words or doesn't contain lexemes, ignored
 ?column? 
----------
 f
(1 row)

db2=#
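
The same to_tsvector / to_tsquery probe can also be run from application code. This is only a sketch under the same assumptions as above (SQLAlchemy, placeholder connection URL); True corresponds to the 't' seen on db1, False to the 'f' plus NOTICE seen on db2.

from sqlalchemy import create_engine, text

engine = create_engine("postgresql://owner:password@localhost/db1")  # placeholder URL

with engine.connect() as conn:
    supported = conn.execute(
        text("SELECT to_tsvector('我们') @@ to_tsquery('我:*')")
    ).scalar()
    print(supported)  # True on the UTF-8 database, False on the SQL_ASCII one
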
Peter_Zhu

You can refer to this: https://www.v2ex.com/t/274600...

I used flask-whooshalchemy before, but its Chinese word segmentation was poor. I then used jieba to build a separate word-segmentation table and index, and had whooshalchemy search that table instead; the results were acceptable.
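
What that reply describes could look roughly like the sketch below. All names here (the Post model, the segmented_body column, db as the Flask-SQLAlchemy instance) are illustrative assumptions, not code from the answer: the raw Chinese text is pre-segmented with jieba into a space-joined token string, and whooshalchemy indexes that column instead of the original text.

import jieba
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()  # assumed Flask-SQLAlchemy instance, bound to your app elsewhere

def segment(raw_text):
    # cut_for_search produces fine-grained tokens; joining them with spaces
    # lets whoosh's default analyzer split them back apart when indexing.
    return " ".join(jieba.cut_for_search(raw_text))

class Post(db.Model):
    __searchable__ = ['segmented_body']   # whooshalchemy indexes the token column
    id = db.Column(db.Integer, primary_key=True)
    body = db.Column(db.Text)             # original Chinese text
    segmented_body = db.Column(db.Text)   # jieba output, e.g. "我们 喜欢 全文 搜索"

    def set_body(self, raw_text):
        self.body = raw_text
        self.segmented_body = segment(raw_text)

Keeping the original text in body and the jieba output in a separate column means the segmentation quality (and jieba's cut mode) can be changed later without touching the stored articles.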
