Recently, I've been giving more foreground treatment to my School of Tomorrow background context, in a section on GitHub, in a Jupyter Notebook markdown cell, near code cells filled with runnable Python.
I'm working in some of the newer threads, such as the Mark Fisher thread, intertwining them with older threads that start literally in Napoleonic times and before (hello, William Blake).
How should we counterpose Mark's philosophy with Peter Sloterdijk's? The equations, for both thinkers, involve consensus realities (CRs) versus non-consensus ones (NCRs) -- terms from Process Work. Both types may be pictured as bubbles, and therefore as both mass producible (as in AI foams), and poppable (as in theories of anything).
Who or what circumscribes my world, by means of language? "Circumscribe" has an ambiguous, and therefore ambivalent, meaning. Constructive or constrictive? Protective or suffocating? Liberating or bewitching?
As another instance of such interweaving, let's recall that Hugh Kenner, author of a Bucky Fuller biography (Bucky), a James Joyce scholar, and author of The Pound Era, wrote a column for Byte Magazine.
He also studied under Marshall McLuhan at the University of Toronto, a base he shared with Drs. Geoffrey Hinton and Donald Coxeter, both practitioners of n-dimensional Hilbert-space-based mathematics.
Bucky's magnum opus Synergetics (itself outside the Hilbert hypercross namespace) is dedicated to this same Donald Coxeter of Regular Polytopes fame. McLuhan and Fuller were outright co-conspirators in many dimensions.
In November 1984, Hugh Kenner's column, entitled A Travesty Generator for Micros and co-authored with Joseph O'Rourke, hit the stands, with immediate and longer-term ripple effects within the field of natural language processing (NLP).
The English professor had teamed up with a computer science professor, both at Johns Hopkins, to feature a generative language algorithm whose output conserved the statistical frequencies of the n-grams found in its input. The listed computer program, called Travesty, ran under 300 lines of Pascal.
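Not the original Pascal listing, but here is a minimal Python sketch of the same order-n idea: record which characters follow each (n-1)-character window in the source, then walk that table, so every generated n-gram also occurs somewhere in the input. The function name travesty and its parameters are my own illustrative choices, and the generation step is a simple Markov-style walk rather than a line-for-line port of Kenner and O'Rourke's program.

```python
import random
from collections import defaultdict

def travesty(source, n=4, length=300, seed=None):
    """Generate pseudo-text whose order-n character statistics mimic the source.

    Builds a table mapping each (n-1)-character window to the characters that
    follow it in the source text, then walks the table so that every emitted
    n-gram appears somewhere in the input (the frequency-conserving idea,
    re-expressed here as a plain Markov chain).
    """
    rng = random.Random(seed)
    table = defaultdict(list)
    for i in range(len(source) - n + 1):
        window, nxt = source[i:i + n - 1], source[i + n - 1]
        table[window].append(nxt)          # duplicates preserve frequencies

    window = source[:n - 1]                # start from the opening window
    out = list(window)
    for _ in range(length - len(window)):
        followers = table.get(window)
        if not followers:                  # dead end: restart at the top
            window = source[:n - 1]
            followers = table[window]
        nxt = rng.choice(followers)
        out.append(nxt)
        window = window[1:] + nxt
    return "".join(out)

if __name__ == "__main__":
    sample = ("It is a far, far better thing that I do, than I have ever done; "
              "it is a far, far better rest that I go to than I have ever known.")
    print(travesty(sample, n=4, length=200, seed=42))
```

Raise the order n and the output hews ever closer to the source; lower it and the text dissolves toward letter soup, which was much of the fun of the original Byte listing.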
Granted, Travesty is a far cry from the prediction-based generative language models of today, and yet it is a milestone along the journey, much as the Memex, imagined in Dr. Vannevar Bush's speculative 1945 Atlantic Monthly article, As We May Think, prefigured yesterday's hypertext and search engines, as well as the AI of today and tomorrow.
Donald Coxeter, we should remember, tested the waters of contemporary philosophy in England, encountering Ludwig Wittgenstein (LW) while deciding on his own path, which was geometry. LW would later prove influential within American Pragmatism, helping the leftist, yet non-Marxist, Richard Rorty break free from traditional Anglo representationalism, an umbrella term covering both nominalism and Platonic realism. Enter operationalism, within a philosophy of mathematics.
Going back to the roots of contemporary AI, we find Emerson remarking in his journal that perhaps Babbage would come up with a novel-generating machine. I'm guessing the remark was tongue-in-cheek, yet it expressed the same anxieties as were occasioned by The Turk, ostensibly a mechanical device that could win at chess against humans, including Napoleon.
The brave midget hiding within The Turk's mechanical exterior symbolizes the homunculus, the self we imagine must, pretty much by definition, awaken within, if ever our AI machines do attain general consciousness. We picture a soul as a smaller man, a third-eye viewer of an inner television, the ghost in the machine, a solipsistic cogito-spectator. These images may seem to force themselves upon us, once we choose to resist them, that is.