@songying
2019-01-04T16:42:33.000000Z
Reading comprehension models
This post introduces a new neural network architecture, FusionNet, which extends existing attention mechanisms in three ways:
1. it puts forward a novel concept of “history of word” to characterize attention information from the lowest word-level embedding up to the highest semantic-level representation.
2. it identifies an attention scoring function that better utilizes the “history of word” concept.
3. it proposes a fully-aware multi-level attention mechanism to capture the complete information in one text (such as a question) and exploit it in its counterpart (such as context or passage) layer by layer.
Datasets: SQuAD, AddSent, AddOneSent
This section covers:
- A brief introduction to the Machine Comprehension task.
- A summary of recent advances in Machine Comprehension.
- A new concept: history-of-word. History-of-word can capture different levels of contextual information to fully understand the text.
- An implementation of history-of-word.
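The two ideas above can be sketched together: history-of-word concatenates every level of a token's representation (word embedding up through high-level contextual vectors), and fully-aware attention scores pairs of these concatenated vectors. A minimal NumPy sketch, with illustrative dimensions and random vectors standing in for real embeddings (the scoring form loosely follows the paper's symmetric ReLU(cU)ᵀD ReLU(qU) function; all names here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (illustrative, not the paper's)
seq_c, seq_q = 5, 3            # context / question lengths
d_word, d_low, d_high = 4, 6, 6

def history_of_word(word, low, high):
    """Concatenate all levels of a token's representation."""
    return np.concatenate([word, low, high], axis=-1)

# Random stand-ins for word embeddings and two contextual layers
how_c = history_of_word(rng.normal(size=(seq_c, d_word)),
                        rng.normal(size=(seq_c, d_low)),
                        rng.normal(size=(seq_c, d_high)))
how_q = history_of_word(rng.normal(size=(seq_q, d_word)),
                        rng.normal(size=(seq_q, d_low)),
                        rng.normal(size=(seq_q, d_high)))

# Fully-aware attention scoring: S_ij = ReLU(c_i U)^T D ReLU(q_j U)
k = 8
U = rng.normal(size=(how_c.shape[1], k))
D = np.diag(rng.normal(size=k))

def relu(x):
    return np.maximum(x, 0.0)

S = relu(how_c @ U) @ D @ relu(how_q @ U).T   # (seq_c, seq_q) scores

# Normalize over question positions to get attention weights
A = np.exp(S - S.max(axis=1, keepdims=True))
A /= A.sum(axis=1, keepdims=True)
```

Each context word then attends over the question using `A`; in FusionNet this is done at several layers, each with its own `U` and `D`, while the history-of-word vectors stay fixed as the attention keys.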