The Sphinx full-text search engine


Official site and documentation: http://sphinxsearch.com/docs/

Chinese-language reference: http://www.sphinxsearch.org/

Python SDK: http://pypi.python.org/pypi/sphinxapi

Delta ("incremental") indexing reference: http://blog.csdn.net/jianglei421/article/details/5431946

My schoolmate Zhang Yan has blogged about his use of sphinx: http://blog.s135.com/post/360/

Theory:

1. Sphinx speaks the MySQL protocol, so besides the APIs wrapped by the usual Sphinx SDKs, you can also connect to searchd with an ordinary mysql client and query it in SphinxQL.
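For instance, assuming searchd is up with its mysql41 listener on port 9306 (the port configured later in this post) and serving the 'spider' index, a session with a stock mysql client might look like this (a sketch; the query itself is illustrative):

```sql
-- connect first with: mysql -h 127.0.0.1 -P 9306
SELECT * FROM spider WHERE MATCH('中国') LIMIT 10;
SHOW META;  -- per-query stats: total matches, per-keyword document/hit counts
```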

2. MySQL can be recompiled with the Sphinx storage engine (SphinxSE) built in. That lets you create SphinxSE tables which talk to searchd at query time and hand back the search results directly.
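A sketch of what a SphinxSE table looks like, following the SphinxSE documentation (the table name and CONNECTION target here are illustrative, not from this post's setup); the first three columns are mandatory:

```sql
CREATE TABLE spider_se
(
    id     BIGINT UNSIGNED NOT NULL,  -- sphinx document id
    weight INTEGER NOT NULL,          -- match weight
    query  VARCHAR(3072) NOT NULL,    -- the search query is passed via this column
    INDEX(query)
) ENGINE=SPHINX CONNECTION="sphinx://localhost:9312/spider";

SELECT * FROM spider_se WHERE query='中国;mode=any';
```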

3. Command-line tools: indexer / searchd / search / ...

4. Points to note with delta indexes:
1> The next delta build overwrites the old delta index files, so before it runs, the old delta must be merged into the main index, or into some designated index that is then periodically merged into the main one.
2> Every delta build should record a marker for where it stopped, so the next delta starts from that marker, and so on. Without the marker, each delta has ever more data to process and gets ever slower, eventually losing the timeliness that makes deltas worthwhile.
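The marker bookkeeping in 2> can be sketched with a tiny simulation (sqlite3 stands in for MySQL here; the sph_counter table and column names follow the convention in the delta-indexing tutorials and are not anything this post's setup requires):

```python
import sqlite3

# In-memory DB stands in for MySQL; schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE documents (id INTEGER PRIMARY KEY, body TEXT);
CREATE TABLE sph_counter (counter_id INTEGER PRIMARY KEY, max_doc_id INTEGER);
INSERT INTO sph_counter VALUES (1, 0);
""")

def add_docs(rows):
    conn.executemany("INSERT INTO documents (id, body) VALUES (?, ?)", rows)

def build_delta():
    """Select only documents newer than the recorded marker, then advance it."""
    (marker,) = conn.execute(
        "SELECT max_doc_id FROM sph_counter WHERE counter_id = 1").fetchone()
    batch = conn.execute(
        "SELECT id, body FROM documents WHERE id > ?", (marker,)).fetchall()
    if batch:
        new_marker = max(doc_id for doc_id, _ in batch)
        conn.execute(
            "UPDATE sph_counter SET max_doc_id = ? WHERE counter_id = 1",
            (new_marker,))
    return batch

add_docs([(1, "foo"), (2, "bar")])
first = build_delta()   # picks up docs 1 and 2, marker advances to 2
add_docs([(3, "baz")])
second = build_delta()  # picks up only doc 3
print(len(first), len(second))
```

Without the UPDATE of sph_counter, every delta would re-select from id 0 and grow without bound, which is exactly the slowdown described above.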

5. Chinese tokenization: with the right configuration (shown below, adapted from http://www.sphinxsearch.org/sphinx-tutorial), sphinx can tokenize and index Chinese character by character; if you need semantic, word-level segmentation, you have to install a Chinese word-segmentation plugin (sfc or the like).

ngram_len = 1 # length-based splitting for non-letter data
ngram_chars = U+4E00..U+9FBF, U+3400..U+4DBF, U+20000..U+2A6DF, U+F900..U+FAFF, \
U+2F800..U+2FA1F, U+2E80..U+2EFF, U+2F00..U+2FDF, U+3100..U+312F, U+31A0..U+31BF, \
U+3040..U+309F, U+30A0..U+30FF, U+31F0..U+31FF, U+AC00..U+D7AF, U+1100..U+11FF, \
U+3130..U+318F, U+A000..U+A48F, U+A490..U+A4CF
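To see what ngram_len = 1 buys you, here is a rough Python emulation of unigram splitting over the basic CJK range (a simplification: the real tokenizer honors every range in ngram_chars plus the charset_table rules):

```python
def cjk_unigrams(text):
    """Naively emulate sphinx's ngram_len=1: split runs of CJK characters
    into single-character tokens, keep other runs (e.g. ASCII words) whole."""
    tokens, word = [], ""
    for ch in text:
        if "\u4e00" <= ch <= "\u9fbf":      # basic CJK range, as in ngram_chars
            if word:
                tokens.append(word)
                word = ""
            tokens.append(ch)
        elif ch.isalnum():
            word += ch
        else:
            if word:
                tokens.append(word)
                word = ""
    if word:
        tokens.append(word)
    return tokens

print(cjk_unigrams("sphinx支持中文检索"))
# ['sphinx', '支', '持', '中', '文', '检', '索']
```

A query for "中国" then matches documents containing the two unigrams, which is why single-character indexing works without a real segmenter.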

6. Indexing JSON strings: today (2013.1.18) a business requirement came up to build a full-text index over JSON strings stored in a data table. There are two cases: a UTF-8 JSON string whose payload is itself UTF-8, and a UTF-8 JSON string whose payload is unicode-escaped with backslashes (\uXXXX). The former is handled the same way as anything else; for the latter, the search term must also be unicode-escaped before searching.
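The second case can be demonstrated in a few lines: with \u-escaped JSON the stored text literally contains backslash-u sequences, so the query term must be escaped the same way to match (values here are illustrative):

```python
import json

doc = {"title": "中国新闻"}
# ensure_ascii=True (the default) stores non-ASCII as literal \uXXXX escapes
stored = json.dumps(doc)

query = "中国"
# Escape the query term the same way before searching the escaped text
escaped_query = query.encode("unicode_escape").decode("ascii")

print(escaped_query in stored)  # the escaped term matches
print(query in stored)          # the raw term does not
```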

What follows is this newbie's first hands-on run:

1. Installation (the sphinx rpm does not support a custom install path via --prefix: error: package sphinx is not relocatable)

[dongsong@bogon ~]$ sudo rpm -i sphinx-2.0.4-1.rhel6.x86_64.rpm
Sphinx installed!
Now create a full-text index, start the search daemon, and you're all set.
To manage indexes:
    editor /etc/sphinx/sphinx.conf
To rebuild all disk indexes:
    sudo -u sphinx indexer --all --rotate
To start/stop search daemon:
    service searchd start/stop
To query search daemon using MySQL client:
    mysql -h 0 -P 9306
    mysql> SELECT * FROM test1 WHERE MATCH('test');
See the manual at /usr/share/doc/sphinx-2.0.4 for details.
For commercial support please contact Sphinx Technologies Inc at
http://sphinxsearch.com/contacts.html

You can symlink the bundled HTML manual into the apache document root so the docs are viewable locally:

[dongsong@bogon python_study]$ sudo ln -sf /usr/share/doc/sphinx-2.0.4/sphinx.html ./sphinx.html
http://172.26.16.100/sphinx.html

2. Usage

[root@bogon sphinx]# indexer --config /etc/sphinx/sphinx.conf spider
Sphinx 2.0.4-id64-release (r3135)
Copyright (c) 2001-2012, Andrew Aksyonoff
Copyright (c) 2008-2012, Sphinx Technologies Inc (http://sphinxsearch.com)

using config file '/etc/sphinx/sphinx.conf'...
indexing index 'spider'...
WARNING: attribute 'id' not found - IGNORING
WARNING: Attribute count is 0: switching to none docinfo
collected 20011 docs, 115.0 MB
sorted 5.4 Mhits, 100.0% done
total 20011 docs, 115049820 bytes
total 33.003 sec, 3486001 bytes/sec, 606.33 docs/sec
total 2 reads, 0.123 sec, 25973.1 kb/call avg, 61.9 msec/call avg
total 188 writes, 2.964 sec, 585.8 kb/call avg, 15.7 msec/call avg

[root@bogon sphinx]# search -c /etc/sphinx/sphinx.conf 中国
Sphinx 2.0.4-id64-release (r3135)
Copyright (c) 2001-2012, Andrew Aksyonoff
Copyright (c) 2008-2012, Sphinx Technologies Inc (http://sphinxsearch.com)

using config file '/etc/sphinx/sphinx.conf'...
index 'spider': query '中国 ': returned 150 matches of 150 total in 0.000 sec

displaying matches:
1. document=719806, weight=2654
2. document=1397236, weight=2654
3. document=3733569, weight=1729
4. document=13384, weight=1722
5. document=3563788, weight=1705
6. document=3742995, weight=1705
7. document=17777, weight=1698
8. document=3741757, weight=1698
9. document=3888109, weight=1698
10. document=2472909, weight=1689
11. document=3741705, weight=1689
12. document=2145250, weight=1676
13. document=2600863, weight=1676
14. document=3561074, weight=1676
15. document=3737639, weight=1676
16. document=3746591, weight=1676
17. document=3805049, weight=1676
18. document=1822, weight=1654
19. document=7755, weight=1654
20. document=13399, weight=1654

words:
1. '中国': 150 documents, 237 hits
3. search finds data, but API searches fail (API searches require the searchd daemon; searchd started without reporting errors, yet it was not listening on the configured port and no actual searchd process existed)

[dongsong@bogon api]$ vpython test.py -h localhost -p 9312 -i spider 中国
query failed: connection to localhost;9312 failed ([Errno 111] Connection refused)
Check /etc/sphinx/sphinx.conf for the location of searchd's log file:

searchd
{
        listen                  = 9312
        listen                  = 9306:mysql41
        log                     = /var/log/sphinx/searchd.log
        query_log               = /var/log/sphinx/query.log
        read_timeout            = 5
        max_children            = 30
        pid_file                = /var/run/sphinx/searchd.pid
        max_matches             = 1000
        seamless_rotate         = 1
        preopen_indexes         = 1
        unlink_old              = 1
        workers                 = threads # for RT to work
        binlog_path             = /var/data
}
Opening the log file /var/log/sphinx/searchd.log reveals the root cause:
[Fri Jun 15 10:28:44.583 2012] [ 7889] listening on all interfaces, port=9312
[Fri Jun 15 10:28:44.583 2012] [ 7889] listening on all interfaces, port=9306
[Fri Jun 15 10:28:44.585 2012] [ 7889] FATAL: failed to open '/var/data/binlog.lock': 2 'No such file or directory'
[Fri Jun 15 10:28:44.585 2012] [ 7888] Child process 7889 has been forked
[Fri Jun 15 10:28:44.585 2012] [ 7888] Child process 7889 has been finished, exit code 1. Watchdog finishes also. Good bye!
[Fri Jun 15 10:29:09.968 2012] [ 7905] Child process 7906 has been forked
[Fri Jun 15 10:29:09.970 2012] [ 7906] listening on all interfaces, port=9312
[Fri Jun 15 10:29:09.970 2012] [ 7906] listening on all interfaces, port=9306
[Fri Jun 15 10:29:09.987 2012] [ 7906] FATAL: failed to open '/var/data/binlog.lock': 2 'No such file or directory'
[Fri Jun 15 10:29:09.993 2012] [ 7905] Child process 7906 has been finished, exit code 1. Watchdog finishes also. Good bye!

Commenting out the binlog_path setting fixed it (the /var/data directory did not exist; presumably creating that directory would work as well).

[dongsong@bogon api]$ vpython test.py -h localhost -p 9312 -i spider 中国
Query '中国 ' retrieved 3 of 3 matches in 0.005 sec

Query stats:
        '中国' found 4 times in 3 documents

Matches:
1. doc_id=5, weight=100
2. doc_id=80, weight=100
3. doc_id=2012, weight=100

4. For Chinese data, searches return nothing unless the following is set in the index section of the conf:

charset_table = U+FF10..U+FF19->0..9, 0..9, U+FF41..U+FF5A->a..z, U+FF21..U+FF3A->a..z, \
A..Z->a..z, a..z, U+0149, U+017F, U+0138, U+00DF, U+00FF, U+00C0..U+00D6->U+00E0..U+00F6, \
U+00E0..U+00F6, U+00D8..U+00DE->U+00F8..U+00FE, U+00F8..U+00FE, U+0100->U+0101, U+0101, \
U+0102->U+0103, U+0103, U+0104->U+0105, U+0105, U+0106->U+0107, U+0107, U+0108->U+0109, \
U+0109, U+010A->U+010B, U+010B, U+010C->U+010D, U+010D, U+010E->U+010F, U+010F, \
U+0110->U+0111, U+0111, U+0112->U+0113, U+0113, U+0114->U+0115, U+0115, \
U+0116->U+0117, U+0117, U+0118->U+0119, U+0119, U+011A->U+011B, U+011B, U+011C->U+011D, \
U+011D, U+011E->U+011F, U+011F, U+0130->U+0131, U+0131, U+0132->U+0133, U+0133, \
U+0134->U+0135, U+0135, U+0136->U+0137, U+0137, U+0139->U+013A, U+013A, U+013B->U+013C, \
U+013C, U+013D->U+013E, U+013E, U+013F->U+0140, U+0140, U+0141->U+0142, U+0142, \
U+0143->U+0144, U+0144, U+0145->U+0146, U+0146, U+0147->U+0148, U+0148, U+014A->U+014B, \
U+014B, U+014C->U+014D, U+014D, U+014E->U+014F, U+014F, U+0150->U+0151, U+0151, \
U+0152->U+0153, U+0153, U+0154->U+0155, U+0155, U+0156->U+0157, U+0157, U+0158->U+0159, \
U+0159, U+015A->U+015B, U+015B, U+015C->U+015D, U+015D, U+015E->U+015F, U+015F, \
U+0160->U+0161, U+0161, U+0162->U+0163, U+0163, U+0164->U+0165, U+0165, U+0166->U+0167, \
U+0167, U+0168->U+0169, U+0169, U+016A->U+016B, U+016B, U+016C->U+016D, U+016D, \
U+016E->U+016F, U+016F, U+0170->U+0171, U+0171, U+0172->U+0173, U+0173, U+0174->U+0175, \
U+0175, U+0176->U+0177, U+0177, U+0178->U+00FF, U+00FF, U+0179->U+017A, U+017A, \
U+017B->U+017C, U+017C, U+017D->U+017E, U+017E, U+0410..U+042F->U+0430..U+044F, \
U+0430..U+044F, U+05D0..U+05EA, U+0531..U+0556->U+0561..U+0586, U+0561..U+0587, \
U+0621..U+063A, U+01B9, U+01BF, U+0640..U+064A, U+0660..U+0669, U+066E, U+066F, \
U+0671..U+06D3, U+06F0..U+06FF, U+0904..U+0939, U+0958..U+095F, U+0960..U+0963, \
U+0966..U+096F, U+097B..U+097F, U+0985..U+09B9, U+09CE, U+09DC..U+09E3, U+09E6..U+09EF, \
U+0A05..U+0A39, U+0A59..U+0A5E, U+0A66..U+0A6F, U+0A85..U+0AB9, \
U+0AE0..U+0AE3, U+0AE6..U+0AEF, U+0B05..U+0B39, U+0B5C..U+0B61, U+0B66..U+0B6F, U+0B71, \
U+0B85..U+0BB9, U+0BE6..U+0BF2, U+0C05..U+0C39, U+0C66..U+0C6F, U+0C85..U+0CB9, \
U+0CDE..U+0CE3, U+0CE6..U+0CEF, U+0D05..U+0D39, U+0D60, U+0D61, U+0D66..U+0D6F, \
U+0D85..U+0DC6, U+1900..U+1938, U+1946..U+194F, U+A800..U+A805, U+A807..U+A822, \
U+0386->U+03B1, U+03AC->U+03B1, U+0388->U+03B5, U+03AD->U+03B5, U+0389->U+03B7, \
U+03AE->U+03B7, U+038A->U+03B9, U+0390->U+03B9, U+03AA->U+03B9, U+03AF->U+03B9, \
U+03CA->U+03B9, U+038C->U+03BF, U+03CC->U+03BF, U+038E->U+03C5, U+03AB->U+03C5, \
U+03B0->U+03C5, U+03CB->U+03C5, U+03CD->U+03C5, U+038F->U+03C9, U+03CE->U+03C9, \
U+03C2->U+03C3, U+0391..U+03A1->U+03B1..U+03C1, U+03A3..U+03A9->U+03C3..U+03C9, \
U+03B1..U+03C1, U+03C3..U+03C9, U+0E01..U+0E2E, U+0E30..U+0E3A, U+0E40..U+0E45, U+0E47, \
U+0E50..U+0E59, U+A000..U+A48F, U+4E00..U+9FBF, U+3400..U+4DBF, U+20000..U+2A6DF, \
U+F900..U+FAFF, U+2F800..U+2FA1F, U+2E80..U+2EFF, U+2F00..U+2FDF, U+3100..U+312F, \
U+31A0..U+31BF, U+3040..U+309F, U+30A0..U+30FF, U+31F0..U+31FF, U+AC00..U+D7AF, \
U+1100..U+11FF, U+3130..U+318F, U+A000..U+A48F, U+A490..U+A4CF
The character set itself needs no explanation:
charset_type            = utf-8

5. Handling deltas

[root@bogon sphinx]# indexer --config /etc/sphinx/sphinx.conf spiderinc --rotate
Sphinx 2.0.4-id64-release (r3135)
Copyright (c) 2001-2012, Andrew Aksyonoff
Copyright (c) 2008-2012, Sphinx Technologies Inc (http://sphinxsearch.com)

using config file '/etc/sphinx/sphinx.conf'...
indexing index 'spiderinc'...
WARNING: attribute 'id' not found - IGNORING
WARNING: Attribute count is 0: switching to none docinfo
collected 17 docs, 0.1 MB
sorted 0.0 Mhits, 100.0% done
total 17 docs, 87216 bytes
total 0.060 sec, 1444643 bytes/sec, 281.58 docs/sec
total 2 reads, 0.000 sec, 23.4 kb/call avg, 0.0 msec/call avg
total 6 writes, 0.008 sec, 16.9 kb/call avg, 1.4 msec/call avg
rotating indices: succesfully sent SIGHUP to searchd (pid=10459).
6. For an index that is currently serving queries (one that searchd has already taken over), pass --rotate to indexer when rebuilding so service is not interrupted (a new index is built and then swapped in for the old one; without --rotate, the build fails).

Running indexer --rotate on the very first build fails, though, because there is no older index version to replace.

7. Sample code

# Query
cl = SphinxClient()
cl.SetServer ( host, port )
cl.SetWeights ( [100, 1] )
cl.SetMatchMode ( mode )
if filtervals:
    cl.SetFilter ( filtercol, filtervals )
if groupby:
    cl.SetGroupBy ( groupby, SPH_GROUPBY_ATTR, groupsort )
if sortby:
    cl.SetSortMode ( SPH_SORT_ATTR_DESC, sortby )
#cl.SetLimits ( offset, limit, limit+offset ) # enable this line if chasing efficiency: return as soon as the required matches are collected (markbyxds)
cl.SetLimits ( offset, limit )
cl.SetConnectTimeout(60.0)
res = cl.Query ( query, index )
if not res:
    return HttpResponse(json.dumps({'page':page, 'count':count, 'total':0, 'datas':[]}))

# Fetch the actual rows from the database
ids = [match['id'] for match in res['matches']]
rawDatas = RawData.objects.filter(id__in = ids).order_by('-create_time')
response = {'page':page, 'count':count, 'total':res['total'], 'datas':[]}
response['datas'] = construct_response_data(rawDatas)

# Build highlighted excerpts for the content
'''
cl.SetConnectTimeout(5.0)
bodyDatas = [tmpData['data'] for tmpData in response['datas']]
try:
    excerpts = cl.BuildExcerpts(bodyDatas, 'spider', query,
                                {'before_match':'<span "style":"color:red;">',
                                 'after_match':'</span>',
                                 'query_mode':mode})
except Exception, e:
    import pdb
    pdb.set_trace()
    pass
listIndex = 0
for excerpt in excerpts:
    response['datas'][listIndex]['data'] = excerpt.decode('utf-8')
    listIndex += 1
'''
cl.SetConnectTimeout(0.1)
for listIndex in range(len(response['datas'])):
    tmpData = response['datas'][listIndex]['data']
    for retry in range(3):
        try:
            excerpt = cl.BuildExcerpts([tmpData],
                                       'spider', # multiple indexes ('spider;spiderinc') returned an empty list (markbyxds)
                                       query,
                                       {'before_match':'<span style="color:red;">',
                                        'after_match':'</span>',
                                        'query_mode':mode})
        except Exception, e:
            logging.error("%s:%s" % (type(e), str(e)))
            excerpt = None
            break
        else:
            if excerpt != None:
                response['datas'][listIndex]['data'] = excerpt[0].decode('utf-8')
                break
            else:
                logging.warning('BuildExcerpts returned None (timeout too small?), retrying...')
                continue
    if excerpt == None:
        snippetLen = 1024
        response['datas'][listIndex]['data'] = tmpData[0:snippetLen]
        if len(tmpData) > snippetLen:
            response['datas'][listIndex]['data'] += '...'

# JSON response
jsonStr = json.dumps(response, ensure_ascii = False)
if isinstance(jsonStr, unicode):
    jsonStr = jsonStr.encode('utf-8')
return HttpResponse(jsonStr)

8. Today (2014.2.20) I found out why the ids returned from the sphinx index built over weibo content could not be found back in mysql:

When building the sphinx index with the weibo id as the sphinx document id and the weibo text as the document body, the weibo id, being a bigint (8 bytes), gets truncated on the sphinx side (a default sphinx build only supports 4-byte document ids).
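The truncation is easy to reproduce in Python (the weibo id below is made up, but any id wider than 32 bits behaves the same way):

```python
weibo_id = 3545812980123456789        # hypothetical 64-bit weibo id
truncated = weibo_id & 0xFFFFFFFF     # what a 32-bit-docid sphinx build keeps

print(weibo_id.bit_length())          # well over 32 bits
print(truncated == weibo_id)          # False: a lookup by this id finds nothing
```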

Details: http://sphinxsearch.com/forum/view.html?id=2064
