The recent rise of large language models profoundly changes the landscape for theories of human language. I'll discuss how these models should cause us to rethink popular ideas about grammar and language acquisition. Perhaps most importantly, modern language models provide a compelling way to think about semantics and conceptual representation. I'll argue that the sense in which they possess "meanings" is likely analogous to how human words and concepts achieve meaning: both implement "conceptual role" theories, and these roles are at least partially recoverable from the statistics present in text, even without embodied contact with the world.

Video Recording