@songying · 2019-01-04 · 1,035 words · 1,346 reads

FusionNet: Fusing via Fully-Aware Attention with Application to Machine Comprehension

A machine reading comprehension model.


Abstract

This paper introduces a new neural architecture, FusionNet, which extends existing attention mechanisms in three ways:
1. it puts forward a novel concept of “history of word” to characterize attention information from the lowest word-level embedding up to the highest semantic-level representation.
2. it identifies an attention scoring function that better utilizes the “history of word” concept.
3. it proposes a fully-aware multi-level attention mechanism to capture the complete information in one text (such as a question) and exploit it in its counterpart (such as context or passage) layer by layer.
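The second extension, the attention scoring function, can be sketched as below. The symmetric form S(x, y) = ReLU(Ux)ᵀ D ReLU(Uy) with a diagonal D follows the paper's description; the dimensions, parameter values, and function name here are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

def fully_aware_score(x, y, U, D):
    """Fully-aware attention score S(x, y) = ReLU(U x)^T D ReLU(U y).

    x, y: history-of-word vectors of dimension d;
    U:    (k, d) shared projection matrix;
    D:    (k, k) diagonal matrix, which makes the score symmetric.
    """
    relu = lambda v: np.maximum(v, 0.0)
    return relu(U @ x) @ D @ relu(U @ y)

# Toy example with random parameters (shapes are assumptions).
rng = np.random.default_rng(0)
d, k = 8, 4
U = rng.standard_normal((k, d))
D = np.diag(rng.standard_normal(k))
x, y = rng.standard_normal(d), rng.standard_normal(d)
print(fully_aware_score(x, y, U, D))
```

Because D is diagonal, the score is symmetric in its two arguments, which keeps the parameter count low while still letting the model learn a richer similarity than a plain dot product.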

Datasets: SQuAD, AddSent, AddOneSent

Introduction

2. Machine Comprehension & Fully-Aware Attention

This section covers the following:

  1. A brief introduction to the Machine Comprehension task.
  2. A summary of recent progress in the Machine Comprehension field.
  3. A new concept: history-of-word. History-of-word can capture different levels of contextual information to fully understand the text.
  4. The implementation of history-of-word.
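The history-of-word idea (item 3 above) can be sketched as a simple concatenation: the representation of word i stacks every level's output at position i, from the word-level embedding up to the highest semantic-level states. The layer choices and dimensions below are illustrative assumptions, not the paper's exact sizes:

```python
import numpy as np

# Hypothetical per-layer outputs for a 5-word text.
seq_len = 5
glove = np.random.randn(seq_len, 300)   # word-level embedding (e.g. GloVe)
cove = np.random.randn(seq_len, 600)    # contextualized embedding (e.g. CoVe)
low_h = np.random.randn(seq_len, 250)   # low-level BiLSTM hidden states
high_h = np.random.randn(seq_len, 250)  # high-level BiLSTM hidden states

# The history-of-word at position i concatenates every level's
# representation at i, from word level up to semantic level.
how = np.concatenate([glove, cove, low_h, high_h], axis=-1)
print(how.shape)  # (5, 1400)
```

The fully-aware attention then scores these concatenated vectors against each other, so each comparison sees the complete understanding history of both words rather than only the top layer.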

2.1 Task Description

2.2 Conceptual Architecture For Machine Reading Comprehension

Fully-Aware Attention on History of Word

3. Fully-Aware Fusion Network

3.1 End-to-End Architecture

3.2 Application in Machine Comprehension
