Localizing Syntactic Composition with Left-Corner Recurrent Neural Network Grammars

Description: In computational neurolinguistics, it has been demonstrated that hierarchical models such as Recurrent Neural Network Grammars (RNNGs), which jointly generate word sequences and their syntactic structures via syntactic composition, explain human brain data better than sequential models such as Long Short-Term Memory networks (LSTMs). However, while the vanilla RNNG adopted in the previous literature employs a top-down parsing strategy, the psycholinguistics literature has pointed out that top-down parsing is suboptimal for head-final/left-branching languages, and has instead proposed left-corner parsing as the psychologically plausible parsing strategy. In this paper, building on this line of inquiry, we investigate not only whether hierarchical models like RNNGs explain human brain data better than sequential models like LSTMs, but also which parsing strategy is more neurobiologically plausible, by constructing a novel fMRI corpus in which participants naturalistically read newspaper articles during the fMRI experiment in a head-final/left-branching language, namely Japanese.

For the whole-brain analysis, design matrices were created for the first-level GLM. All predictors (word rate, word length, word frequency, sentence ID, sentence position, five-gram, LSTM, surprisals estimated from RNNGs, and the distance computed from RNNGs) were included, except for head movement parameters. One-sample t-tests were performed for the second-level analysis. Related article: https://doi.org/10.1162/nol_a_00118
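The two-level analysis described above can be illustrated schematically. The sketch below is a toy stand-in, not the collection's actual pipeline: it builds a first-level design matrix from the listed predictors (here simulated as random regressors on a single toy voxel), fits it by ordinary least squares, and then runs a second-level one-sample t-test on a contrast's betas across simulated subjects. All array sizes, predictor values, and subject counts are hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# First level: one row per scan, one column per predictor from the
# description (word rate, word length, word frequency, sentence ID,
# sentence position, five-gram, LSTM, RNNG surprisal, RNNG distance),
# plus an intercept. Values here are simulated, not real regressors.
predictor_names = [
    "word_rate", "word_length", "word_frequency", "sentence_id",
    "sentence_position", "five_gram", "lstm",
    "rnng_surprisal", "rnng_distance",
]
n_scans = 200
X = np.column_stack([rng.standard_normal(n_scans) for _ in predictor_names])
X = np.column_stack([X, np.ones(n_scans)])  # intercept column

# Simulated BOLD signal for one toy voxel, then OLS fit of the GLM.
y = X @ rng.standard_normal(X.shape[1]) + rng.standard_normal(n_scans)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Second level: one-sample t-test across subjects on one predictor's
# beta (e.g. RNNG surprisal) -- subjects simulated as noisy draws here.
rnng_idx = predictor_names.index("rnng_surprisal")
subject_betas = beta[rnng_idx] + 0.1 * rng.standard_normal(20)
t_stat, p_value = stats.ttest_1samp(subject_betas, popmean=0.0)
print(X.shape, beta.shape, np.isfinite(t_stat))
```

In a real whole-brain analysis the OLS fit would be repeated per voxel and the event regressors convolved with a hemodynamic response function; the structure of the two levels is what this sketch shows.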

Compact Identifier: https://identifiers.org/neurovault.collection:14567
Add Date: June 14, 2023, 3:50 a.m.
Uploaded by: yushis
Contributors:
Related article DOI: None
Related article authors: None
Citation guidelines

If you use the data from this collection please include the following persistent identifier in the text of your manuscript:

https://identifiers.org/neurovault.collection:14567

This will help to track the use of this data in the literature.