it's true that language encodes many forms of our intelligence. and it's quite magical. think about it: words need to be boundless but also structured. language is a formulation of syntax and semantics ("colourless green ideas sleep furiously"), imbued with micro-worlds and meaning only through human interaction and context. a lot of what is perceived to distinguish humans from other animals is intimately bound with our linguistic abilities.
but what i think we often get wrong when interacting with LLMs is assuming that language = intelligence. language is a tool for representing and communicating our worlds to one another. judith fan gives a good example -- describe a specific bookshelf to another person.
- angle 1: "it's a bookshelf"
- angle 2: "it's a place where you can store books"
- angle 3: "it has 8 cubic holes for storing reading material"
- angle 4: "take 30x30x30cm wooden planks, multiply that by 8, place them together in a rectangular shape, store these items that have many 30x50cm pages, all composed of words bound into binders...etc, etc."
language layers abstraction over abstraction, formatting our experiences in a way that's both efficient and mutually understood between two people.
so, while yes, language encodes 'core intelligence' in a sense, it is not intelligence itself. words are spoken and heard and thought, but what we're still lacking includes the other sensory inputs of the world: sight, touch, and even time -- the evolution and continual learning that built up our linguistic blocks, how we got from 'line' to 'bookshelf'. this bleeds into fei-fei li and yann lecun's work on world models. but that's for another day.
