Applications
Some applications of the SLP are immediate, for example as a simple amplifier (a single cell whose function is f(a,in)=a*in, where a is the amplification and in is an incoming signal). A few cells can be configured to perform differentiation or integration. Analog versions of flip-flops (e.g. a sample-and-hold circuit) can be built from 2 cells. This alone is interesting, and an area of potentially rich research. For example, a population of analog circuits can be evolved using GAs; circuits can be built on-the-fly based on usage patterns; and so on. See the online videos for more details.
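The amplifier case can be sketched in a few lines. The `AmplifierCell` class below is a hypothetical stand-in for an SLP cell; only the transfer function f(a,in)=a*in comes from the description above.

```python
# Hypothetical sketch of an SLP amplifier cell. The Cell representation
# is an assumption for illustration; the transfer function f(a, in) = a * in
# is the one described in the text.

class AmplifierCell:
    """A single cell whose function is f(a, in) = a * in."""

    def __init__(self, a):
        self.a = a  # amplification factor

    def step(self, signal):
        # One evaluation of the cell's function on an incoming sample.
        return self.a * signal


amp = AmplifierCell(a=2.5)
samples = [0.0, 0.4, -1.0]
outputs = [amp.step(s) for s in samples]
print(outputs)  # each sample scaled by the amplification
```

Differentiation, integration, and the sample-and-hold circuit would follow the same pattern, with cells that carry a little state between samples.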
Another area of application is in composing sets of functions into a single function. For example, three cells implementing functions f1, f2 and f3 can be composed by connecting f1's output to f2's input and f2's output to f3's input. The composite behavior can then be imprinted onto a single cell g, thereby allowing “interesting” (perhaps discovered via a GA) circuits to be coded into smaller circuits, and used as building blocks in subsequent discoveries. This has potential applications to areas such as machine learning, where complex behaviors can be re-cast as low-level building blocks for the construction of even more-complex behaviors. See the videos 008- and 009-Functional Analysis here for more details related to this idea.
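The chaining described above can be sketched as ordinary function composition. Representing cells as plain callables is an assumption made for illustration; the wiring (each output feeding the next input) is as described.

```python
# Hypothetical sketch of composing three cells into one. The cell
# functions f1, f2, f3 are arbitrary stand-ins for illustration.

def f1(x): return x + 1.0       # stand-in for one cell's function
def f2(x): return 2.0 * x
def f3(x): return x - 0.5

def compose(*fns):
    """Wire each function's output to the next function's input."""
    def g(x):
        for fn in fns:
            x = fn(x)
        return x
    return g

# g behaves like a single cell imprinted with the composite circuit.
g = compose(f1, f2, f3)
print(g(3.0))  # f3(f2(f1(3.0))) = ((3.0 + 1.0) * 2.0) - 0.5 = 7.5
```

The resulting `g` can itself be passed to `compose`, which mirrors the idea of using imprinted circuits as building blocks in subsequent discoveries.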
Even more interesting applications are found by exploiting the C inputs of cells as part of a circuit's native behavior. One area which is being explored relates to pattern recognition. The idea is to detect patterns in a set of inputs, and to code those patterns in SLP cells. One way to do this is to store incoming patterns in a series of cells, and as each new input arrives, compare it in parallel with all the stored patterns. Rather than simply observing how well the pattern matches, the degree of fit can be used to drive the C input of each cell, as follows:
This behavior can be achieved by calculating the difference between the incoming- and stored-patterns (integrate their difference, for example), complementing that (normalize from 0=identical to 1=very different, and subtract from 1), and using that to drive the C input of the pattern-storing cell. The intended effect here is that, beginning with a collection of effectively-random stored patterns, incoming samples can be used to tune the collection. Stored patterns that are close to an incoming signal will be tweaked to more-closely match that signal. Incoming signals that recur frequently will thus imprint cells that are initially somewhat-close to that pattern, and the more the incoming signal is observed, the more those cells will mimic it; whereas signals that are very different won't affect those cells, but may imprint different cells whose initial (random) stored pattern was closer to that incoming signal.
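The difference-integrate-complement scheme can be sketched as follows. The specific normalization (dividing by a `max_diff` bound), and the way the C input blends the stored pattern toward the incoming one, are assumptions for illustration; the overall shape (integrated difference, complemented, driving the imprint) is as described above.

```python
# Hypothetical sketch of the C-input tuning scheme. Patterns are lists
# of samples; max_diff is an assumed upper bound on the integrated
# difference, used only for normalization.

def closeness(stored, incoming, max_diff):
    """Integrate the pattern difference, normalize, and complement."""
    diff = sum(abs(s, ) if False else abs(s - i) for s, i in zip(stored, incoming))
    normalized = min(diff / max_diff, 1.0)  # 0 = identical, 1 = very different
    return 1.0 - normalized                 # complement: 1 = identical

def imprint(stored, incoming, max_diff):
    """Drive the cell's C input with the closeness, tweaking the
    stored pattern toward signals it already resembles."""
    c = closeness(stored, incoming, max_diff)
    return [s + c * (i - s) for s, i in zip(stored, incoming)]

stored = [0.2, 0.5, 0.8]        # an effectively-random stored pattern
near = [0.25, 0.5, 0.75]        # a recurring signal close to it
far = [0.9, 0.1, 0.2]           # a very different signal

tuned_near = imprint(stored, near, max_diff=3.0)
tuned_far = imprint(stored, far, max_diff=3.0)
# tuned_near moves strongly toward the nearby signal; tuned_far barely moves.
```

Repeating `imprint` on recurring signals pulls the closest cells toward those signals, while leaving distant cells largely untouched.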
Some work has been done to explore this type of algorithm. See the Code Page for a further description and links to the simulation code.