This morning, I awoke to Tomer Ullman’s interesting AI-generated thought experiments, in which he had given GPT-3 (one of the most powerful AI models we have at this time) philosophical examples as input. (Careful, Tomer, there are few enough jobs to go around as it is.) My first question, on reading these very interesting, peculiar strings of sentences: could we contrapose the examples of input and generated output? That is, to see whether GPT-3 simply reformulated the syntactic structure, placing at the property points of a given proposition concepts which are, through pattern-based analysis, recognised as connected in their use. (Where, for example, thought experiment A might entail: iff a and iff b, therefore possibility paradox c, or d; where a–d are connected through consistency of use, qua linguistic analysis, in both their positions and their connectivity in use (Water (Sea (Waves))).) Or whether it was possible for GPT-3 to derive something more primitive, or, perhaps better said, something more organic, from the philosophical thought experiments it churned through.
If the former, the AI’s examples will have interesting implications for how we view the logical structure of language, and in particular philosophical language. I must admit, I do enjoy them. This would be almost a Kantian data mine, in which the ability to draw from loosely connected bundles of information (that which has a family resemblance) to generate similar strings might be a good way of reverse-engineering the silent, unwritten half of the Tractatus. That, in turn, would further suggest that AI can never be intelligent in any deep sense of the word, but is instead a tool for filtering information through derivation. If the latter, there may be something in the emergent data that is unaccountable on a logical analysis of the structures of both the input and the previous information fed to the machine. Time will tell.
Find the examples below; they are truly fascinating.
Credit to Tomer Ullman, and WhiteboardG for the images.