Neural symbolic reader: Scalable integration of distributed and symbolic representations for reading comprehension

X Chen, C Liang, AW Yu, D Zhou, D Song… - International …, 2019 - openreview.net
… This formulation is similar to the attention mechanism introduced in prior work (Bahdanau
et al., 2014). … of the passage tokens att^p, and the attention vector of the question
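The snippet above refers to additive (Bahdanau-style) attention producing an attended vector over passage tokens. A minimal sketch of that computation, with all names (`bahdanau_attention`, the projection matrices `Wq`, `Wk`, the scoring vector `v`) chosen here for illustration rather than taken from the paper:

```python
import numpy as np

def bahdanau_attention(q, H, Wq, Wk, v):
    """Additive (Bahdanau-style) attention sketch.
    q:  query vector, shape (d,)
    H:  token representations, shape (n, d)
    Wq, Wk: projection matrices, shape (k, d)
    v:  scoring vector, shape (k,)
    Returns (attention weights over tokens, attended vector)."""
    scores = np.tanh(H @ Wk.T + Wq @ q) @ v   # one score per token, shape (n,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                  # softmax over the n tokens
    context = weights @ H                     # weighted sum, shape (d,)
    return weights, context

rng = np.random.default_rng(0)
d, k, n = 8, 6, 5
q = rng.normal(size=d)                        # e.g. a question representation
H = rng.normal(size=(n, d))                   # e.g. passage token representations
Wq, Wk, v = rng.normal(size=(k, d)), rng.normal(size=(k, d)), rng.normal(size=k)
weights, att_p = bahdanau_attention(q, H, Wq, Wk, v)
print(weights.sum())                          # weights form a distribution over tokens
```

The attended vector plays the role of the `att^p` mentioned in the snippet: a single summary of the passage conditioned on the query.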

Get to the point: Summarization with pointer-generator networks

A See, PJ Liu, CD Manning - arXiv preprint arXiv:1704.04368, 2017 - arxiv.org
… are not restricted to simply selecting and rearranging passages … vector is used as extra input
to the attention mechanism, … the semi finals of the atp masters 1000 event in key biscayne . …
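The pointer-generator model cited here mixes a vocabulary (generation) distribution with a copy distribution taken from the attention weights, gated by a scalar p_gen. A minimal sketch of that mixing step (function name and variable names are illustrative, not from the paper's code):

```python
import numpy as np

def pointer_generator_dist(p_vocab, attn, src_ids, p_gen):
    """Mix generation and copy distributions, pointer-generator style.
    p_vocab: generator distribution over the vocabulary, shape (V,)
    attn:    attention weights over source tokens, shape (n,)
    src_ids: vocabulary id of each source token, shape (n,)
    p_gen:   scalar in [0, 1], probability of generating vs copying."""
    final = p_gen * p_vocab
    # scatter-add copy probability mass onto the vocabulary axis;
    # repeated ids accumulate, matching a sum over matching source positions
    np.add.at(final, src_ids, (1.0 - p_gen) * attn)
    return final

p_vocab = np.array([0.7, 0.2, 0.1, 0.0])   # toy 4-word vocabulary
attn = np.array([0.5, 0.5])                # attention over 2 source tokens
src_ids = np.array([2, 3])                 # those tokens map to vocab ids 2 and 3
out = pointer_generator_dist(p_vocab, attn, src_ids, p_gen=0.8)
print(out)                                 # ids 2 and 3 gain copy mass; still sums to 1
```

Because the copy term reuses the attention weights, out-of-vocabulary source words can receive probability even when the generator assigns them none, which is the mechanism's main point.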

[BOOK][B] Neural Generation of Open-Ended Text and Dialogue

A See - 2021 - search.proquest.com
… the same input passages repeatedly (addressing Problem 2, … a query vector q ∈ R^k, the
attention mechanism computes a … better for decoding; accordingly they find that a Transformer …