[ICLR24] Neural-Symbolic Recursive Machine for Systematic Generalization

Abstract

Current learning models often struggle with human-like systematic generalization, particularly in learning compositional rules from limited data and extrapolating them to novel combinations. We introduce the Neural-Symbolic Recursive Machine (NSR), whose core is a Grounded Symbol System (GSS), allowing combinatorial syntax and semantics to emerge directly from training data. The NSR employs a modular design that integrates neural perception, syntactic parsing, and semantic reasoning. These components are synergistically trained through a novel deduction-abduction algorithm. Our findings demonstrate that NSR’s design, imbued with the inductive biases of equivariance and compositionality, grants it the expressiveness to adeptly handle diverse sequence-to-sequence tasks and achieve unparalleled systematic generalization. We evaluate NSR’s efficacy across four challenging benchmarks designed to probe systematic generalization capabilities: SCAN for semantic parsing, PCFG for string manipulation, HINT for arithmetic reasoning, and a compositional machine translation task. The results affirm NSR’s superiority over contemporary neural and hybrid models in terms of generalization and transferability.
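To make the modular pipeline concrete, here is a minimal, illustrative Python sketch of the three stages on an arithmetic example, in the spirit of the HINT benchmark. All names and the hand-written parser are hypothetical placeholders: the actual NSR implements each stage with learned neural components and induces the grounded symbol system, its syntax, and its semantics from data via deduction-abduction, rather than hard-coding them as done here.

```python
# Hypothetical sketch of NSR's perception -> parsing -> reasoning pipeline.
# Every stage below is a hand-written stub; NSR learns all three from data.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Node:
    """A node in a grounded symbol tree: a symbol plus child subtrees."""
    symbol: str
    children: List["Node"] = field(default_factory=list)


def perceive(raw: str) -> List[str]:
    """Neural perception (stubbed): ground raw input into discrete symbols.
    NSR learns this grounding, e.g. from handwritten characters in HINT."""
    return raw.split()


def parse(tokens: List[str]) -> Node:
    """Syntactic parsing (stubbed): build a tree over the grounded symbols.
    This parser hard-codes '*' binding tighter than '+'; NSR induces such
    combinatorial syntax from data instead."""
    pos = 0

    def parse_term() -> Node:
        nonlocal pos
        node = Node(tokens[pos])
        pos += 1
        while pos < len(tokens) and tokens[pos] == "*":
            op = Node(tokens[pos])
            pos += 1
            rhs = Node(tokens[pos])
            pos += 1
            op.children = [node, rhs]
            node = op
        return node

    def parse_expr() -> Node:
        nonlocal pos
        node = parse_term()
        while pos < len(tokens) and tokens[pos] == "+":
            op = Node(tokens[pos])
            pos += 1
            op.children = [node, parse_term()]
            node = op
        return node

    return parse_expr()


def evaluate(node: Node) -> int:
    """Semantic reasoning (stubbed): recursively compose meanings bottom-up,
    so a node's meaning depends only on its children's (compositionality)."""
    if not node.children:
        return int(node.symbol)
    left, right = (evaluate(c) for c in node.children)
    return left + right if node.symbol == "+" else left * right


if __name__ == "__main__":
    tree = parse(perceive("3 + 4 * 2"))
    print(evaluate(tree))  # 11, since '*' composes before '+'
```

The point of the split is that each module is an independently swappable stage with a symbolic interface between stages; it is this modular, tree-structured factorization that the deduction-abduction algorithm exploits to train the components jointly.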

Publication
In Proceedings of the International Conference on Learning Representations (ICLR 2024)
Authors
Qing Li, Yixin Zhu, Yitao Liang, Ying Nian Wu, Song-Chun Zhu, Siyuan Huang