Cognitive Science Colloquium - Paul Smolensky

Thu, Feb 27, 2020, 12:00 pm to 1:00 pm
Peretsman Scully Hall - Room 101

Abstract: I will argue that new neural network architectures for deep learning of continuous structure unify the sources of power of new- and old-school theories of cognition (deep neural-network learning and rule-based symbolic structure processing) and advance the performance and interpretability of AI models. I will introduce 'continuous structure', briefly review previous work that uses neurosymbolic processing to derive new grammatical formalisms and psycholinguistic models, and then turn to more recent work deploying deep learning. I will analyze some key sources of power and critical limitations of symbolic and neural cognitive models, introduce new architectures that unify these sources of power while overcoming these limitations, and present initial results from applying these architectures to core problems in AI involving natural language.