[1]
Bidirectional attention flow for machine comprehension. Minjoon Seo, Aniruddha Kembhavi, Ali Farhadi, et al. Proceedings of the International Conference on Learning Representations. 2017
[2]
Read + Verify: Machine Reading Comprehension with Unanswerable Questions[J]. Minghao Hu, Furu Wei, Yuxing Peng, Zhen Huang, Nan Yang, Dongsheng Li. Proceedings of the AAAI Conference on Artificial Intelligence. 2019
[3]
Machine Comprehension Using Match-LSTM and Answer Pointer[J]. Shuohang Wang, Jing Jiang. CoRR. 2016
[4]
Dropout: a simple way to prevent neural networks from overfitting[J]. Nitish Srivastava, Geoffrey E. Hinton, Alex Krizhevsky, Ilya Sutskever, Ruslan Salakhutdinov. Journal of Machine Learning Research. 2014 (1)
[6]
FusionNet: Fusing via fully-aware attention with application to machine comprehension. Hsin-Yuan Huang, Chenguang Zhu, Yelong Shen, et al. Proceedings of the International Conference on Learning Representations. 2018
[7]
QANet: Combining local convolution with global self-attention for reading comprehension. Adams Wei Yu, David Dohan, Minh-Thang Luong, et al. Proceedings of the International Conference on Learning Representations. 2018
[8]
BERT: Pre-training of deep bidirectional transformers for language understanding. Jacob Devlin, Ming-Wei Chang, Kenton Lee, et al. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics. 2019
[9]
Stochastic answer networks for machine reading comprehension. Xiaodong Liu, Yelong Shen, Kevin Duh, et al. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. 2018
[10]
Reinforced mnemonic reader for machine reading comprehension. Minghao Hu, Yuxing Peng, Zhen Huang, et al. Proceedings of the 27th International Joint Conference on Artificial Intelligence. 2018