Gregory Bateson, the systems theorist and anthropologist, recognized the emergence of Cybernetics as one of the major landmarks in human history. However, the word "Cybernetics" itself, like "Biosphere", has not enjoyed the wide currency many expected.
That doesn't mean Gregory Bateson was wrong, only that the precise language used is highly mercurial vis-à-vis the less fickle concepts themselves. Systems that auto-tune in the presence of an environment, in order to optimize various capabilities, don't have to be identified as "cybernetic" to get their work done.
General Systems Theory (GST) actually includes an appreciation for "word meaning trajectories," meaning we track the significance of words in semantic space, not just according to their frequency (common versus esoteric).
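To make "semantic space" a little more concrete, here is a minimal sketch using made-up embedding vectors; the words and numbers are purely illustrative, not from any trained model. In practice such vectors would come from an embedding layer or a pretrained model, and a word's "trajectory" would be its movement through this space over time or across corpora.

```python
import numpy as np

# Hypothetical 4-dimensional "meaning" vectors for three words; in real use
# these would be produced by a trained embedding model, not typed by hand.
vectors = {
    "cybernetics": np.array([0.9, 0.1, 0.4, 0.2]),
    "feedback":    np.array([0.8, 0.2, 0.5, 0.1]),
    "biosphere":   np.array([0.1, 0.9, 0.2, 0.7]),
}

def cosine(a, b):
    """Direction-based similarity in semantic space, independent of how often a word occurs."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vectors["cybernetics"], vectors["feedback"]))   # close neighbors in meaning
print(cosine(vectors["cybernetics"], vectors["biosphere"]))  # farther apart
```

The point of the sketch is only that "significance" here is a position (and, over time, a path) in a vector space, which is a different measurement than a raw word count.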
The concepts to consider here: biases, weights, precession. The first two seem obvious and show up in linear algebra specifically. The last, precession, is borrowed from Synergetics, often lumped with Cybernetics (for good reason), and has to do with the curvilinear paths (the geodesics) formed in the presence of feedback loops and tensor fields, some of which may be self-reinforcing (e.g. the "vortex" pattern).
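For the first two, a minimal numpy sketch shows how weights and biases appear in plain linear algebra (the sizes and values here are arbitrary, chosen only for illustration): each output of a layer is a weighted sum of the inputs plus a bias offset.

```python
import numpy as np

# One layer of a network in linear-algebra terms: outputs = weights @ inputs + biases.
rng = np.random.default_rng(42)

inputs = rng.normal(size=3)         # a 3-dimensional input vector
weights = rng.normal(size=(2, 3))   # 2 outputs, each a weighted blend of the 3 inputs
biases = np.zeros(2)                # one learnable offset per output

outputs = weights @ inputs + biases  # shape (2,): the raw, pre-activation outputs
print(outputs)
```

Training is then the business of nudging those weights and biases, via feedback, until the outputs behave as desired.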
Planets were originally conceived of as "wanderers" because, from the standpoint of Earth, their orbits are not simply elliptical, as they are from the standpoint of a Galilean observer looking from outside the solar system. Picking the viewpoint from which bodies in motion have the simplest set of relationships is a non-trivial application of machine learning. At least metaphorically, the fixed point theorem applies: there's an identity function hiding in a forest, like a singular tree.
Machine learning is somewhat like fine-tuning an ear to hear, inside a chamber with characteristic frequencies we hope to detect. Train your ear while creating a track record, a history, of improvement, thanks to feedback loops. Then correctly categorize new sounds, as evidence that you've practiced some generic skill and aren't helpless outside the training cocoon. In today's TensorFlow tutorial we distinguished training, validation and testing data.
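For readers following along, here is a minimal sketch of that three-way split on a toy synthetic dataset; the sizes, layer widths, and epoch count are arbitrary choices for illustration, not taken from the tutorial itself.

```python
import numpy as np
import tensorflow as tf

# Toy dataset: 1000 samples, 20 features, with labels that depend on the features.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20)).astype("float32")
y = (X.sum(axis=1) > 0).astype("float32")

# Three-way split: training tunes the weights and biases, validation monitors
# generalization during training, and the test set stays untouched until the end.
X_train, X_val, X_test = X[:700], X[700:850], X[850:]
y_train, y_val, y_test = y[:700], y[700:850], y[850:]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=5, verbose=0)
test_loss, test_acc = model.evaluate(X_test, y_test, verbose=0)
print("held-out test accuracy:", test_acc)
```

The training set is the chamber we practice in, the validation set is the feedback loop we watch while practicing, and the test set is the new sound we must categorize correctly to prove the skill generalizes.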