Joshua Maynez
On faithfulness and factuality in abstractive summarization
J Maynez, S Narayan, B Bohnet, R McDonald
arXiv preprint arXiv:2005.00661, 2020
Cited by 96

Morphosyntactic tagging with a meta-BiLSTM model over context sensitive token encodings
B Bohnet, R McDonald, G Simoes, D Andor, E Pitler, J Maynez
arXiv preprint arXiv:1805.08237, 2018
Cited by 77

Stepwise extractive summarization and planning with structured transformers
S Narayan, J Maynez, J Adamek, D Pighin, B Bratanič, R McDonald
arXiv preprint arXiv:2010.02744, 2020
Cited by 8

Planning with Entity Chains for Abstractive Summarization
S Narayan, Y Zhao, J Maynez, G Simoes, R McDonald
arXiv preprint arXiv:2104.07606, 2021
Cited by 4

Focus Attention: Promoting Faithfulness and Diversity in Summarization
R Aralikatte, S Narayan, J Maynez, S Rothe, R McDonald
arXiv preprint arXiv:2105.11921, 2021
Cited by 1

Shatter: An Efficient Transformer Encoder with Single-Headed Self-Attention and Relative Sequence Partitioning
R Tian, J Maynez, AP Parikh
arXiv preprint arXiv:2108.13032, 2021