[EMNLP2015]Effective Approaches to Attention-based Neural Machine Translation
Neural machine translation has the following advantages:
(1) it can generate long word sequences (it generalizes well to long sentences)
(2) it needs little memory, because no huge phrase table has to be stored
(3) the decoder is easy to implement
A: The paper introduces two attention models. What they share: at every decoding step t, the hidden state h_t is used to derive a context vector c_t, which is then combined with h_t to produce the attentional state h̃_t used for prediction.
(1) global attention
When generating the target word y_t, all source hidden states participate: h_t is scored against every source state to get alignment weights a_t, and the context vector c_t is the weighted average of the source states.
The figure above shows two computation paths: the upper one is the earlier approach (compute attention from the previous state, then the current state), while the lower one is this paper's path (h_t → a_t → c_t → h̃_t).
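The lower computation path can be sketched in a few lines of NumPy. This is a minimal sketch of global attention with the paper's "general" score score(h_t, h̄_s) = h_tᵀ W_a h̄_s; the variable names h_src (the stack of source hidden states) and the random driver values are illustrative, not from the paper.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def global_attention(h_t, h_src, W_a):
    """Global attention, 'general' score: every source state is scored
    against the current decoder state h_t."""
    scores = h_src @ (W_a @ h_t)   # one score per source position, shape (S,)
    a_t = softmax(scores)          # alignment weights over all source positions
    c_t = a_t @ h_src              # context vector: weighted average of source states
    return a_t, c_t

# illustrative driver with random states
rng = np.random.default_rng(0)
d, S = 4, 6
h_t = rng.standard_normal(d)
h_src = rng.standard_normal((S, d))   # S source hidden states
W_a = rng.standard_normal((d, d))
a_t, c_t = global_attention(h_t, h_src, W_a)
```

In the full model, c_t is concatenated with h_t and passed through a tanh layer to give h̃_t, which feeds the output softmax.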
(2) local attention
Only a subset of the source words participates in generating y_t: attention is restricted to a window [p_t − D, p_t + D] around an aligned position p_t.
The key point is computing p_t. In the predictive variant, p_t = S · sigmoid(v_pᵀ tanh(W_p h_t)), and the alignment weights inside the window are additionally damped by a Gaussian centered at p_t.
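A minimal sketch of the predictive local attention (local-p) described above; W_p, v_p, W_a, and D follow the paper's notation, while the windowing details (rounding p_t, clipping at sentence boundaries) are simplifications of mine.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def local_attention_p(h_t, h_src, W_a, W_p, v_p, D=2):
    """Local-p attention: predict an aligned position p_t, attend only
    inside [p_t - D, p_t + D], and weight alignments by a Gaussian
    centered at p_t with sigma = D / 2."""
    S = h_src.shape[0]
    p_t = S * sigmoid(v_p @ np.tanh(W_p @ h_t))      # predicted center, in (0, S)
    lo = max(0, int(round(p_t)) - D)                 # clip window to the sentence
    hi = min(S, int(round(p_t)) + D + 1)
    window = h_src[lo:hi]
    align = softmax(window @ (W_a @ h_t))            # weights inside the window only
    sigma = D / 2.0
    gauss = np.exp(-((np.arange(lo, hi) - p_t) ** 2) / (2 * sigma ** 2))
    a_t = align * gauss                              # favor positions near p_t
    c_t = a_t @ window
    return p_t, a_t, c_t

# illustrative driver with random states
rng = np.random.default_rng(1)
d, S = 4, 10
h_t = rng.standard_normal(d)
h_src = rng.standard_normal((S, d))
W_a = rng.standard_normal((d, d))
W_p = rng.standard_normal((d, d))
v_p = rng.standard_normal(d)
p_t, a_t, c_t = local_attention_p(h_t, h_src, W_a, W_p, v_p, D=2)
```

Compared with global attention, the cost per step no longer grows with the full source length, which matters for long inputs.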
B: input-feeding approach
In Figures 1 and 4, the blue boxes are the encoder and the red part is the decoder; the change is in the decoder's input. In Figure 1, the decoder input at the current time step is only the previous step's output, whereas in Figure 4 it additionally includes the previous step's attentional hidden state h̃_{t−1}, so the model is informed of past alignment decisions.
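The change is a one-line concatenation at the decoder input. Below is a sketch of a single decoder step with input feeding, using a plain tanh-RNN cell as a stand-in for the paper's stacked LSTM; the weight names W_in, W_rec and the random driver values are illustrative.

```python
import numpy as np

def decoder_step(emb_prev, h_tilde_prev, state, W_in, W_rec, b):
    """One decoder step with input feeding: the previous attentional
    vector h-tilde_{t-1} is concatenated with the previous target
    embedding before entering the recurrence, so the cell sees past
    alignment decisions. (tanh-RNN stand-in for the paper's LSTM.)"""
    x = np.concatenate([emb_prev, h_tilde_prev])  # the input-feeding concat
    return np.tanh(W_in @ x + W_rec @ state + b)

# illustrative driver with random values
rng = np.random.default_rng(2)
d_emb, d_hid = 3, 4
emb_prev = rng.standard_normal(d_emb)        # embedding of previous output word
h_tilde_prev = rng.standard_normal(d_hid)    # previous attentional hidden state
state = rng.standard_normal(d_hid)
W_in = rng.standard_normal((d_hid, d_emb + d_hid))
W_rec = rng.standard_normal((d_hid, d_hid))
b = rng.standard_normal(d_hid)
new_state = decoder_step(emb_prev, h_tilde_prev, state, W_in, W_rec, b)
```

Without input feeding, x would be emb_prev alone; the only architectural cost is that W_in must accept the wider concatenated input.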