Publication . Conference object . 2020

Priorless Recurrent Networks Learn Curiously

Jeff Mitchell; Jeffrey S. Bowers
Open Access
Published: 01 Dec 2020
Publisher: International Committee on Computational Linguistics
Country: United Kingdom
Recently, domain-general recurrent neural networks, without explicit linguistic inductive biases, have been shown to successfully reproduce a range of human language behaviours, such as accurately predicting number agreement between nouns and verbs. We show that such networks will also learn number agreement within unnatural sentence structures, i.e. structures that are not found in any natural language and which humans struggle to process. These results suggest that the models are learning from their input in a manner that is substantially different from human language acquisition, and we undertake an analysis of how the learned knowledge is stored in the weights of the network. We find that while the model has an effective understanding of singular versus plural for individual sentences, it lacks a unified concept of number agreement connecting these processes across the full range of inputs. Moreover, the weights handling natural and unnatural structures overlap substantially, in a way that underlines the non-human-like nature of the knowledge learned by the network.
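The number-agreement evaluation described in the abstract can be sketched as follows. This is a minimal illustration of the evaluation protocol only, not the authors' model or data: it uses a toy vocabulary and an untrained Elman-style recurrent network with randomly initialised weights, so its predictions are arbitrary. A trained model would be expected to score the correctly inflected verb higher.

```python
import math
import random

random.seed(0)

# Toy vocabulary; the paper's models are trained on real corpora.
VOCAB = ["the", "dog", "dogs", "near", "is", "are"]
IDX = {w: i for i, w in enumerate(VOCAB)}
HID = 8

# Randomly initialised weights: no training is performed in this sketch.
W_in = [[random.uniform(-0.5, 0.5) for _ in VOCAB] for _ in range(HID)]
W_hh = [[random.uniform(-0.5, 0.5) for _ in range(HID)] for _ in range(HID)]
W_out = [[random.uniform(-0.5, 0.5) for _ in range(HID)] for _ in VOCAB]

def rnn_scores(prefix):
    """Run an Elman RNN over the word prefix; return logits over VOCAB."""
    h = [0.0] * HID
    for word in prefix:
        x = IDX[word]
        h = [math.tanh(W_in[i][x] + sum(W_hh[i][j] * h[j] for j in range(HID)))
             for i in range(HID)]
    return [sum(W_out[v][j] * h[j] for j in range(HID)) for v in range(len(VOCAB))]

# Agreement test: after "the dogs near the dog ...", the plural subject
# "dogs" controls the verb, so a model that has learned agreement should
# score "are" above "is" despite the intervening singular noun.
scores = rnn_scores(["the", "dogs", "near", "the", "dog"])
prefers_plural = scores[IDX["are"]] > scores[IDX["is"]]
print("prefers 'are':", prefers_plural)  # arbitrary here: weights are untrained
```

The unnatural-structure condition in the paper reuses this same scoring comparison, but on sentence templates whose word order does not occur in any natural language.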
Subjects by Vocabulary

Microsoft Academic Graph classification: Range (mathematics); Process (engineering); Recurrent neural network; Agreement; Sentence; Natural language processing; Artificial intelligence; Natural language; Computer science; Noun


Keywords: Cognitive Science; Language

Research communities: Digital Humanities and Cultural Heritage
Download from: Explore Bristol Research