TAN Hongye, ZHAO Honghong, LI Ru, LIU Bei. A Pipeline Approach to Free-Description Question Answering in Chinese Gaokao Reading Comprehension[J]. Chinese Journal of Electronics, 2019, 28(1): 113-119. doi: 10.1049/cje.2018.08.001

A Pipeline Approach to Free-Description Question Answering in Chinese Gaokao Reading Comprehension

doi: 10.1049/cje.2018.08.001
Funds:  This work is supported by the National Natural Science Foundation of China (Nos.61673248 and 61772324) and the Project of Postgraduate Joint Cultivation Base of Shanxi Province (No.2018JD02).
  • Received Date: 2017-04-05
  • Revised Date: 2018-01-29
  • Publish Date: 2019-01-10
  • This study attempted to answer complicated free-description questions in Chinese Gaokao Reading comprehension (RC) tasks. We found that quite a few questions can be answered by extracting sentences from the document and combining them, so we used a pipeline approach with two components: Answer sentence extraction (ASE) and Answer sentence fusion (ASF). Semantic vector similarity and topical distribution similarity were explored for ASE. An Integer linear programming strategy was used for ASF, which combined dependencies with a language model based on word importance. As a first step towards this new challenge, we obtained some encouraging results on actual exam questions from the RC tasks of the Chinese-subject Beijing Gaokao, which gave us insights into the techniques needed to solve real-world complex questions.
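The ASE step described above ranks document sentences by combining two similarity signals against the question: cosine similarity of semantic vectors and similarity of topic distributions (the Jensen-Shannon distance, which Endres and Schindelin showed to be a metric). The paper does not publish its exact scoring formula, so the following is a minimal sketch under the assumption of a simple weighted linear combination; the vector and topic representations, the weight `alpha`, and the function names are all illustrative.

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense semantic vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def js_similarity(p, q):
    # 1 - Jensen-Shannon distance between two topic distributions
    # (JS distance is a true metric; Endres & Schindelin, 2003).
    def kl(a, b):
        return sum(x * math.log(x / y, 2) for x, y in zip(a, b) if x > 0)
    m = [(x + y) / 2 for x, y in zip(p, q)]
    jsd = 0.5 * kl(p, m) + 0.5 * kl(q, m)
    return 1.0 - math.sqrt(jsd)

def rank_answer_sentences(question, sentences, alpha=0.5, top_k=3):
    # question / sentences: dicts with 'vec' (semantic vector) and
    # 'topics' (topic distribution over the same topic set).
    # alpha weights semantic similarity against topical similarity
    # (an assumed combination scheme, not the paper's exact formula).
    scored = []
    for idx, s in enumerate(sentences):
        score = (alpha * cosine(question["vec"], s["vec"])
                 + (1 - alpha) * js_similarity(question["topics"], s["topics"]))
        scored.append((score, idx))
    scored.sort(reverse=True)
    return [idx for _, idx in scored[:top_k]]
```

The top-ranked sentences would then feed the ASF step, where an ILP selects and merges dependency-tree fragments; that stage is omitted here since it depends on a parser and solver not shown in this sketch.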
