How much can you know through language alone?

NOTE HXA7241 2021-05-09T13:09Z

Is telling someone how to ride a bike knowledge of how to ride a bike? Yes. If anything is transmissible with language, everything is.

You could describe GPT-3 as working by finding the structure in the language itself. Let us consider this generally ‒ this idea of learning purely by discerning structure in language ‒ and call it GPT-X.

Some knowledge is more deeply analysed by language ‒ broken down into pieces, and both those pieces and the relations between them are mirrored or represented in linguistic structure. Certain things you have to know external to language, you have to experience them directly. But other things, language can explain to you, by articulating how they are built from other things you know. Is that a good analysis?

So you would think that GPT-X's weakness is in not having those primitives of knowledge only accessible by experience. But maybe that is wrong: is not everything analysed (and interrelated) by language? Is there anything that has not been written about? Has not all articulable knowledge that can be extracted already been extracted? Not entirely, of course ‒ we do not know everything ‒ but up to our current level, yes. Is not everything we can know (currently) expressed in language?

The complaint that some things cannot be expressed in language is performatively contradictory. If you wave your hands and make a vague inarticulate suggestion, we do not know what you are talking about, and your claim fails. And if you explain clearly, we realise that you have expressed that thing in language after all.

Does that not nullify the distinction between tacit and non-tacit knowledge? It is claimed that, eg, you can learn to cook something purely from a written recipe, but you can only learn to play guitar by doing it. Yet what is ‘playing guitar’? You can only meaningfully claim it cannot be learnt if you say clearly what is not being learnt ‒ but if you do express that clearly, it would be a sufficient explanation to learn from.

The only experiences that you cannot possibly describe to other people ‒ in language ‒ are ones that are not in any way like anything else. But if that were true, they would be meaningless to you too. How could you understand an experience if it cannot be connected to anything else? How can it mean anything, if it cannot mean anything in terms of anything else? It cannot, because meaning is a relation between things. All feelings/experiences/descriptions are linked together ‒ there are no islands, because islands of understanding cannot exist.


And GPT-X need not even know the truth. To say anything understandable is to express the general structure underneath. A corpus of text is a shadow of reality. The shape of the shadow might be a word that is a lie, but the shadow tells the truth about the position of the sun and the shape of the occluder. A lie must tell a truth about the context ‒ the range of options ‒ of what it lies about: to be effective, a lie must be about something true. If you lie about a car having a particular size battery, that indicates that cars can have that size battery. You could talk nonsense, but that could be filtered out as not fitting any recognisable, repeated pattern. Can we not find out everything by assembling all the things that even lies ‒ and hence all imperfect info ‒ talk about?