Daichi Mochihashi


How LSTM Encodes Syntax: Exploring Context Vectors and Semi-Quantization on Natural Text
Chihiro Shibata | Kei Uchiumi | Daichi Mochihashi
Proceedings of the 28th International Conference on Computational Linguistics

The Long Short-Term Memory recurrent neural network (LSTM) is widely used and known to capture informative long-term syntactic dependencies. However, how such information is reflected in its internal vectors for natural text has not yet been sufficiently investigated. We analyze these vectors by learning a language model in which syntactic structures are implicitly given. We empirically show that the context-update vectors, i.e. the outputs of the internal gates, are approximately quantized to binary or ternary values, which helps the language model count the depth of nesting accurately, as Suzgun et al. (2019) recently showed for synthetic Dyck languages. For some dimensions of the context vector, we show that their activations are highly correlated with the depth of phrase structures such as VP and NP. Moreover, with L1 regularization, we found that whether a word is inside a phrase structure can be predicted accurately from a small number of components of the context vector. Even when learning from raw text, the context vectors are shown to correlate well with phrase structures. Finally, we show that natural clusters of function words and of the parts of speech that trigger phrases are represented in a small but principal subspace of the context-update vector of the LSTM.
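The counting behavior described above can be illustrated with a toy sketch (this is an assumption-laden simplification, not the paper's trained model): if one component of the context vector receives quantized updates of +1 for an opening bracket and -1 for a closing one, that single dimension tracks Dyck nesting depth exactly.

```python
# Toy illustration of quantized context updates tracking nesting depth.
# A trained LSTM only approximates this; here the +1/-1 updates are fixed.
def count_depth(sequence):
    c = 0.0  # one hypothetical component of the context vector
    depths = []
    for symbol in sequence:
        update = 1.0 if symbol == "(" else -1.0  # binarized gate output
        c += update
        depths.append(c)
    return depths

print(count_depth("(()(()))"))  # [1, 2, 1, 2, 3, 2, 1, 0]
```

The paper's finding is that gate outputs of an LSTM trained on natural text approach such discrete values on their own, rather than being hard-wired as here.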


MIPA: Mutual Information Based Paraphrase Acquisition via Bilingual Pivoting
Tomoyuki Kajiwara | Mamoru Komachi | Daichi Mochihashi
Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers)

We present a pointwise mutual information (PMI)-based approach to formalizing paraphrasability and propose a variant of PMI, called MIPA, for paraphrase acquisition. Our paraphrase acquisition method first acquires lexical paraphrase pairs by bilingual pivoting and then reranks them by PMI and distributional similarity. The complementary nature of information from bilingual and monolingual corpora makes the proposed method robust. Experimental results show that the proposed method substantially outperforms both bilingual pivoting and distributional similarity on their own in terms of metrics such as MRR, MAP, coverage, and Spearman’s correlation.
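For readers unfamiliar with PMI-based reranking, a minimal sketch of plain PMI over a toy corpus of word pairs may help (this is standard PMI, not the MIPA variant itself; the toy counts are invented for illustration):

```python
import math
from collections import Counter

def pmi(pair_counts, x_counts, y_counts, total):
    # PMI(x, y) = log2( p(x, y) / (p(x) * p(y)) ):
    # positive when x and y co-occur more often than chance predicts.
    scores = {}
    for (x, y), n_xy in pair_counts.items():
        p_xy = n_xy / total
        p_x = x_counts[x] / total
        p_y = y_counts[y] / total
        scores[(x, y)] = math.log2(p_xy / (p_x * p_y))
    return scores

# Toy "paraphrase candidate" pairs, e.g. from bilingual pivoting.
pairs = [("big", "large"), ("big", "large"), ("big", "small"), ("tiny", "small")]
scores = pmi(Counter(pairs),
             Counter(x for x, _ in pairs),
             Counter(y for _, y in pairs),
             len(pairs))
```

Here ("big", "large") receives a positive score while ("big", "small") is scored below chance, which is the intuition behind reranking pivoted candidates by association strength.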

Nonparametric Bayesian Semi-supervised Word Segmentation
Ryo Fujii | Ryo Domoto | Daichi Mochihashi
Transactions of the Association for Computational Linguistics, Volume 5

This paper presents a novel hybrid generative/discriminative model of word segmentation based on nonparametric Bayesian methods. Unlike ordinary discriminative word segmentation, which relies only on labeled data, our semi-supervised model also leverages a huge amount of unlabeled text to automatically learn new “words”, and further constrains them using labeled data to segment non-standard texts such as those found in social networking services. Specifically, our hybrid model combines a discriminative classifier (CRF; Lafferty et al., 2001) and unsupervised word segmentation (NPYLM; Mochihashi et al., 2009), with a transparent exchange of information between the two model structures within a semi-supervised framework (JESS-CM; Suzuki and Isozaki, 2008). We confirmed that it appropriately segments non-standard texts such as those on Twitter and Weibo and achieves nearly state-of-the-art accuracy on standard datasets in Japanese, Chinese, and Thai.

Suggesting Sentences for ESL using Kernel Embeddings
Kent Shioda | Mamoru Komachi | Rue Ikeya | Daichi Mochihashi
Proceedings of the 4th Workshop on Natural Language Processing Techniques for Educational Applications (NLPTEA 2017)

Sentence retrieval is an important NLP application for English as a Second Language (ESL) learners. ESL learners are familiar with web search engines, but generic web search results may not be adequate for composing documents in a specific domain. However, if we build our own search system specialized to a domain, it may suffer from the data sparseness problem. The recently proposed word2vec partially addresses data sparseness, but fails to retrieve sentences relevant to queries because it does not model the latent intent of the query. We therefore propose a method for retrieving example sentences using kernel embeddings and N-gram windows. This method implicitly models the latent intent of queries and sentences, and alleviates the problem of noisy alignment. Our results show that our method achieves higher precision in sentence retrieval for ESL in the domain of a university press release corpus, compared with a previous unsupervised method used for a semantic textual similarity task.
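The idea of comparing sentences through kernel embeddings can be sketched as follows (a hedged toy version, not the paper's method: the 2-d word vectors, the RBF kernel choice, and the uniform pair averaging are all invented for illustration):

```python
import math

def rbf(u, v, gamma=1.0):
    # Gaussian (RBF) kernel between two toy word vectors.
    d2 = sum((a - b) ** 2 for a, b in zip(u, v))
    return math.exp(-gamma * d2)

def mean_embedding_similarity(sent1, sent2, vectors, gamma=1.0):
    # Inner product of kernel mean embeddings of two token bags:
    # average the kernel over all cross-sentence word pairs, so
    # near-synonyms still contribute even without exact matches.
    total = sum(rbf(vectors[w1], vectors[w2], gamma)
                for w1 in sent1 for w2 in sent2)
    return total / (len(sent1) * len(sent2))

# Invented 2-d "embeddings"; a real system would use pretrained vectors.
vecs = {"press": (1.0, 0.0), "release": (0.9, 0.1),
        "news": (0.8, 0.2), "cat": (-1.0, -1.0)}
sim_related = mean_embedding_similarity(["press", "release"], ["news"], vecs)
sim_unrelated = mean_embedding_similarity(["press", "release"], ["cat"], vecs)
```

Here the query "news" scores far higher against "press release" than "cat" does, despite sharing no surface words, which is the advantage kernel embeddings offer over exact-match retrieval.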


Human-like Natural Language Generation Using Monte Carlo Tree Search
Kaori Kumagai | Ichiro Kobayashi | Daichi Mochihashi | Hideki Asoh | Tomoaki Nakamura | Takayuki Nagai
Proceedings of the INLG 2016 Workshop on Computational Creativity in Natural Language Generation


Learning Word Meanings and Grammar for Describing Everyday Activities in Smart Environments
Muhammad Attamimi | Yuji Ando | Tomoaki Nakamura | Takayuki Nagai | Daichi Mochihashi | Ichiro Kobayashi | Hideki Asoh
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing

Inducing Word and Part-of-Speech with Pitman-Yor Hidden Semi-Markov Models
Kei Uchiumi | Hiroshi Tsukahara | Daichi Mochihashi
Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)


Improvements to the Bayesian Topic N-Gram Models
Hiroshi Noji | Daichi Mochihashi | Yusuke Miyao
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing


Predicting Word Fixations in Text with a CRF Model for Capturing General Reading Strategies among Readers
Tadayoshi Hara | Daichi Mochihashi | Yoshinobu Kano | Akiko Aizawa
Proceedings of the First Workshop on Eye-tracking and Natural Language Processing


Learning Common Grammar from Multilingual Corpus
Tomoharu Iwata | Daichi Mochihashi | Hiroshi Sawada
Proceedings of the ACL 2010 Conference Short Papers


Bayesian Unsupervised Word Segmentation with Nested Pitman-Yor Language Modeling
Daichi Mochihashi | Takeshi Yamada | Naonori Ueda
Proceedings of the Joint Conference of the 47th Annual Meeting of the ACL and the 4th International Joint Conference on Natural Language Processing of the AFNLP


Learning Nonstructural Distance Metric by Minimum Cluster Distortion
Daichi Mochihashi | Genichiro Kikui | Kenji Kita
Proceedings of the 2004 Conference on Empirical Methods in Natural Language Processing