[math-fun] "machine learning" v. "scientific method" ?
I know essentially zero about the current AI/ML hype.

I was wondering whether there is a trivial example of AI/ML with a very small number of "connections" (or whatever they call them) that is capable of learning some trivial task -- e.g., an AND gate working on 0/1 input data.

[I suppose that I could try out some of the current open-source ML software on a trivial example like the AND gate to see how difficult this problem is. Perhaps someone here has already done such a "hello world" example? Also, has anyone tried such ML software on the various *integer sequences*? Not that the ML software would "understand" the sequences, but it might provide some idea of the relative "complexity" of the different sequences.]

I'm curious whether this process can be seen, from some perspective, as a kind of "scientific method", in which some model (perhaps consisting of 1-4 real or rational numbers) is a "hypothesis" that is "tested" against some training data, which then causes some sort of "model refinement".

If AI/ML cannot be put into (or forced into) this form, then perhaps it is time to come up with a replacement for the "scientific method" that CAN be put into such an AI/ML form. My point is that if ML really has come far enough to provide a more concrete rationale for "scientific progress", then perhaps we need to revise what the phrase "scientific progress" should mean.
On Fri, Dec 28, 2018 at 1:14 PM Henry Baker <hbaker1@pipeline.com> wrote:
> I know essentially zero about the current AI/ML hype.
> I was wondering if there is a trivial example of AI/ML with a very small number of "connections" (or whatever they call them) that is capable of learning some trivial task -- e.g., an AND gate working on 0/1 input data.
It takes two neurons to learn to ride a bike: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.88.3781&rep=rep1&ty...
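And it takes only one neuron to learn AND, since AND is linearly separable. A minimal "hello world" along the lines Henry asks about might look like the sketch below: a single perceptron with two weights and a bias, trained with the classic perceptron rule. (All names, the learning rate, and the epoch count are my own choices for illustration, not from any particular library.)

```python
# One neuron (two weights + a bias) learning AND on 0/1 inputs.

def train_and_gate(epochs=20, lr=0.1):
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w = [0.0, 0.0]   # weights
    b = 0.0          # bias
    for _ in range(epochs):
        for (x1, x2), target in data:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # Perceptron update: nudge weights toward the target.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

w, b = train_and_gate()
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0)
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this terminates with a correct separating line; XOR, famously, is the trivial task a single neuron *cannot* learn.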
> I'm curious about whether this process can be seen from some perspective as a kind of "scientific method", where some model (which may consist of perhaps 1-4 real or rational numbers) is a "hypothesis" which is "tested" against some training data which then causes some sort of "model refinement".
https://arxiv.org/pdf/1509.03580.pdf

"The ability to discover physical laws and governing equations from data is one of humankind’s greatest intellectual achievements. A quantitative understanding of dynamic constraints and balances in nature has facilitated rapid development of knowledge and enabled advanced technological achievements, including aircraft, combustion engines, satellites, and electrical power. In this work, we combine sparsity-promoting techniques and machine learning with nonlinear dynamical systems to discover governing physical equations from measurement data. The only assumption about the structure of the model is that there are only a few important terms that govern the dynamics, so that the equations are sparse in the space of possible functions; this assumption holds for many physical systems. In particular, we use sparse regression to determine the fewest terms in the dynamic governing equations required to accurately represent the data. The resulting models are parsimonious, balancing model complexity with descriptive ability while avoiding overfitting. We demonstrate the algorithm on a wide range of problems, from simple canonical systems, including linear and nonlinear oscillators and the chaotic Lorenz system, to the fluid vortex shedding behind an obstacle. The fluid example illustrates the ability of this method to discover the underlying dynamics of a system that took experts in the community nearly 30 years to resolve. We also show that this method generalizes to parameterized, time-varying, or externally forced systems."

--
Mike Stay - metaweta@gmail.com
http://math.ucr.edu/~mike
https://reperiendi.wordpress.com
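The sparse-regression step that abstract describes (fit all candidate library terms, zero out the small coefficients, refit the survivors, repeat) can be sketched in miniature. This is not the paper's code: it is a pure-Python toy of sequentially thresholded least squares, applied to a hypothetical 1-D system x' = 0.5 - 2x with a polynomial library and threshold that are my own choices. With noiseless data it recovers exactly the two nonzero terms, which is the "hypothesis tested against data, then refined" loop Henry describes.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c2 in range(col, n + 1):
                M[r][c2] -= f * M[col][c2]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def lstsq(Theta, y):
    """Least squares via the normal equations Theta^T Theta c = Theta^T y."""
    m, n = len(Theta), len(Theta[0])
    A = [[sum(Theta[r][i] * Theta[r][j] for r in range(m)) for j in range(n)] for i in range(n)]
    b = [sum(Theta[r][i] * y[r] for r in range(m)) for i in range(n)]
    return solve(A, b)

def stlsq(Theta, y, threshold=0.1, iters=5):
    """Sequentially thresholded least squares: drop small terms, refit the rest."""
    c = lstsq(Theta, y)
    for _ in range(iters):
        keep = [j for j in range(len(c)) if abs(c[j]) >= threshold]
        if not keep:
            return [0.0] * len(c)
        sub = [[row[j] for j in keep] for row in Theta]
        cs = lstsq(sub, y)
        c = [0.0] * len(c)
        for idx, j in enumerate(keep):
            c[j] = cs[idx]
    return c

# Toy "measurement data": exact derivatives of x' = 0.5 - 2x.
xs = [-1.5 + 0.5 * i for i in range(7)]
dxdt = [0.5 - 2 * x for x in xs]
Theta = [[1.0, x, x * x] for x in xs]   # candidate library: 1, x, x^2

c = stlsq(Theta, dxdt)
print(c)   # the x^2 coefficient is thresholded away; 0.5 and -2 survive
```

The structure mirrors the paper's assumption: the governing equation is sparse in the library, so the "model refinement" step is simply discarding terms the data does not support and refitting.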
participants (2):
- Henry Baker
- Mike Stay