CIKM 2016 aNMM: Ranking Short Answer Texts with Attention-Based Neural Matching Model
Summary (translated from Chinese): Current deep learning models for the answer sentence selection task, including CNN- and LSTM-based models, must be combined with additional traditional text-matching features to perform well. To address this, the paper proposes an attention-based neural matching model (aNMM). The model uses a value-shared weighting scheme and learns the importance of question terms with an attention scheme. Experiments on the standard TREC QA benchmark show that, without combining any additional text-matching features, the model achieves ranking performance comparable to or better than previous deep learning models and feature-engineering approaches. When a single simple query likelihood (QL) feature is added, the model outperforms the state-of-the-art methods on the answer sentence selection task.
Venue: CIKM'16
Abstract: As an alternative to question answering methods based on feature engineering, deep learning approaches such as convolutional neural networks (CNNs) and Long Short-Term Memory Models (LSTMs) have recently been proposed for semantic matching of questions and answers. To achieve good results, however, these models have been combined with additional features such as word overlap or BM25 scores. Without this combination, these models perform significantly worse than methods based on linguistic feature engineering. In this paper, we propose an attention based neural matching model for ranking short answer text. We adopt a value-shared weighting scheme instead of a position-shared weighting scheme for combining different matching signals, and incorporate question term importance learning using a question attention network. Using the popular benchmark TREC QA data, we show that the relatively simple aNMM model can significantly outperform other neural network models that have been used for the question answering task, and is competitive with models that are combined with additional features. When aNMM is combined with additional features, it outperforms all baselines.
PDF: http://maroo.cs.umass.edu/pub/web/getpdf.php?id=1240
Code (GitHub): https://github.com/yangliuy/aNMM-CIKM16
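The two ideas in the abstract can be illustrated with a minimal sketch. In a value-shared weighting scheme, each entry of the question-answer match matrix (e.g. a cosine similarity) is assigned to a bin by its *value*, and all signals in the same bin share one learned weight, in contrast to position-shared CNN filters where weights depend on where a signal occurs. A question attention network then combines the per-question-term scores with a softmax over the question terms. This is a simplified illustration, not the paper's exact architecture (aNMM uses fixed-size bins over [-1, 1] with a separate bin for exact matches, plus non-linearities and, in aNMM-2, multiple weight vectors); the function names and the toy weights below are invented for the example.

```python
import numpy as np

def value_shared_score(match_matrix, bin_weights, bin_edges):
    """Score each question term with value-shared weights.

    match_matrix: (q_len, a_len) similarities in [-1, 1].
    bin_weights:  one learned weight per bin (shared by value, not position).
    bin_edges:    interior bin boundaries, so digitize yields 0..num_bins-1.
    """
    bin_idx = np.digitize(match_matrix, bin_edges)  # bin of each signal
    scores = np.zeros(match_matrix.shape[0])
    for q in range(match_matrix.shape[0]):
        # count how many of this term's signals fall into each bin,
        # then weight the counts with the shared bin weights
        counts = np.bincount(bin_idx[q], minlength=len(bin_weights))
        scores[q] = counts @ bin_weights
    return scores

def question_attention(q_embeddings, v, per_term_scores):
    """Combine per-term scores with a softmax attention over question terms.

    v is a learned parameter vector projecting each question term
    embedding to an importance logit.
    """
    logits = q_embeddings @ v
    attn = np.exp(logits - logits.max())  # numerically stable softmax
    attn /= attn.sum()
    return attn @ per_term_scores
```

For example, with four bins over [-1, 1] (interior edges at -0.5, 0, 0.5), an exact match of 1.0 falls in the highest-weighted bin, so a question term with one exact match scores higher than one with only weak matches; uniform attention then averages the per-term scores into a final ranking score.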