@songying 2019-03-19

Survey: Emergent logical structure in vector representations of neural readers

Reading comprehension – the early days


This paper is a survey-style overview of reading comprehension models.

Abstract

The paper divides existing models into two classes: aggregation readers and explicit reference readers.

In an independent contribution, the authors show that adding linguistic features to the input of existing neural readers significantly boosts performance, yielding the best results to date on the Who-did-What dataset.

Introduction

Cloze-style corpora: CNN & Daily Mail, the Children's Book Test, and the Who-did-What dataset.

The paper divides current models into two classes: aggregation readers and explicit reference readers.

- Aggregation readers: these include Memory Networks, the Attentive Reader, and the Stanford Reader. They use bidirectional LSTMs or GRUs to construct a contextual embedding $h_t$ of each position $t$ in the passage, along with an embedding $q$ of the question, and then select an answer $c$ using a criterion similar to $\mathrm{argmax}_c \, \langle \sum_t \alpha_t h_t, \, e(c) \rangle$, where $\alpha_t = \mathrm{softmax}_t \langle h_t, q \rangle$ are question-sensitive attention weights and $e(c)$ is the embedding of candidate $c$. In short, aggregation readers compute a vector representation of the passage involving a question-sensitive attention, then select an answer based on that passage vector.

- Explicit reference readers: these include the Attention Sum Reader, the Gated-Attention Reader, and the Attention-over-Attention Reader. Instead of pooling the passage into a single vector, they select the answer by accumulating attention directly at the positions where each candidate occurs in the passage, i.e. $\mathrm{argmax}_c \sum_{t \in R(c,p)} \alpha_t$, where $R(c,p)$ is the set of positions where candidate $c$ occurs in $p$. A small sketch contrasting the two selection rules follows this list.
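Below is a minimal numerical sketch of the two selection rules. The shapes, random values, and the occurrence map `R` are illustrative assumptions of mine, not taken from the paper.

```python
import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

# Toy shapes and random values stand in for learned encodings.
T, d = 6, 4                        # passage length, hidden size
rng = np.random.default_rng(0)
H = rng.normal(size=(T, d))        # contextual embeddings h_t of passage positions
q = rng.normal(size=d)             # question embedding
E = rng.normal(size=(3, d))        # embeddings e(c) of 3 candidate answers
alpha = softmax(H @ q)             # question-sensitive attention over positions

# Aggregation reader: pool the passage into one vector o, then score candidates.
o = alpha @ H                      # o = sum_t alpha_t h_t
agg_answer = int(np.argmax(E @ o)) # argmax_c <o, e(c)>

# Explicit reference reader: sum attention mass at each candidate's occurrences.
R = {0: [1, 4], 1: [2], 2: [0, 5]}                     # assumed positions R(c, p)
ref_answer = max(R, key=lambda c: alpha[R[c]].sum())   # argmax_c sum_{t in R} alpha_t

print(agg_answer, ref_answer)
```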

2. A brief survey of Datasets


- q: a question, given as a sequence of words containing a special token for the “blank” to be filled in
- p: a document consisting of a sequence of words
- A: a set of possible answers
- a: the ground-truth answer, with $a \in A$

The task can be described as selecting the answer $\hat{a} = \mathrm{argmax}_{a \in A} \, P(a \mid p, q)$, i.e. the candidate $a$ that best answers q given p.

CNN & Daily Mail

Who-did-What

Children's Book Test

3. Aggregation Readers and Explicit Reference Readers

Here the paper divides all readers into two classes: aggregation readers and explicit reference readers.

3.1 Aggregation Readers

Stanford Reader (see the paper)

The Stanford Reader uses a bidirectional LSTM to encode both the passage and the question.
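A minimal sketch of the Stanford Reader's attention and answer step, assuming its bilinear attention form $\alpha_t \propto \exp(q^\top W h_t)$; shapes and random values are illustrative stand-ins for the BiLSTM encodings.

```python
import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

# Illustrative shapes only; in the real model H and q come from BiLSTM encoders.
T, d = 8, 6
rng = np.random.default_rng(1)
H = rng.normal(size=(T, d))     # BiLSTM output h_t at each passage position
q = rng.normal(size=d)          # question encoding (concatenated final states)
W = rng.normal(size=(d, d))     # learned bilinear attention matrix

alpha = softmax(H @ W.T @ q)    # alpha_t proportional to exp(q^T W h_t)
o = alpha @ H                   # attention-weighted passage representation
E = rng.normal(size=(3, d))     # output embeddings of 3 candidate answers
answer = int(np.argmax(E @ o))  # predict argmax_c e(c)^T o
```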

Memory Networks

Attentive Reader

3.2 Explicit Reference Readers

Attention-Sum Reader

Paper: “Text understanding with the attention sum reader network”. Proposed for cloze-style problems; evaluated on CNN & Daily Mail and the Children's Book Test. Its answer rule is the explicit-reference criterion sketched earlier: sum the attention weights over each candidate's occurrences in the passage.

Gated-Attention Reader

Paper: “Gated-Attention Readers for Text Comprehension”
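As a rough sketch of the gated-attention mechanism this reader is named for: over several hops, each passage token representation is multiplied elementwise by a question-attended vector specific to that token. The toy shapes below are assumptions, and the real model re-encodes the passage with a fresh BiGRU between hops, which this sketch omits.

```python
import numpy as np

def softmax(x, axis):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Illustrative shapes; D and Q stand in for BiGRU token encodings.
T, J, d, K = 7, 5, 4, 3             # passage len, question len, hidden size, hops
rng = np.random.default_rng(2)
D = rng.normal(size=(T, d))         # passage token encodings
Q = rng.normal(size=(J, d))         # question token encodings

for _ in range(K):
    A = softmax(D @ Q.T, axis=1)    # per-token attention over question words
    Q_tilde = A @ Q                 # question-aware vector for each passage token
    D = D * Q_tilde                 # gated attention: elementwise multiplication

# Final answer selection is attention-sum over candidate occurrences, as above.
```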

Attention-over-Attention Reader

Paper: “Attention-over-attention neural networks for reading comprehension”
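A minimal sketch of the attention-over-attention computation as I understand it from the cited paper: a pairwise match matrix is normalized column-wise (query-to-document) and row-wise (document-to-query), and the averaged row attention weights the column attentions. Shapes, values, and the occurrence map `R` are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Illustrative shapes; D and Q would be BiGRU encodings in the real model.
T, J, d = 7, 5, 4
rng = np.random.default_rng(3)
D = rng.normal(size=(T, d))        # passage token encodings
Q = rng.normal(size=(J, d))        # question token encodings

M = D @ Q.T                        # pairwise match scores, shape (T, J)
alpha = softmax(M, axis=0)         # query-to-document attention (over positions)
beta = softmax(M, axis=1)          # document-to-query attention (over query words)
beta_avg = beta.mean(axis=0)       # averaged document-to-query attention, shape (J,)
s = alpha @ beta_avg               # "attended attention" score per passage position

# Candidate score: sum s over the positions where the candidate occurs.
R = {0: [1, 4], 1: [2, 6]}         # assumed occurrence positions
scores = {c: float(s[idx].sum()) for c, idx in R.items()}
```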
