"Neural Networks"

Hearing the term "Neural Networks", I got curious whether there was anything I should know, so I did some research to see what it was all about. I found some new terms (a small sketch after the list walks through the first three):
  1. learning, supervised and unsupervised
  2. artificial neurons
  3. back propagation
  4. units, visible and hidden
  5. and many types of models
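To pin the first three terms down for myself, here is a minimal sketch in plain Python (the names, numbers, and learning rate are my own choices, not from any particular library): a single artificial neuron trained in a supervised way, where the weight update is the one-neuron special case of back propagation.

    import math

    def neuron(weights, bias, inputs):
        # artificial neuron: weighted sum of inputs followed by a sigmoid
        total = sum(w * x for w, x in zip(weights, inputs)) + bias
        return 1.0 / (1.0 + math.exp(-total))

    def train_step(weights, bias, inputs, target, rate=0.1):
        # supervised learning: compare the output to a known target and push
        # the error back through the neuron (back propagation collapses to
        # this single gradient step when there is only one unit)
        out = neuron(weights, bias, inputs)
        grad = (out - target) * out * (1.0 - out)    # d(loss)/d(sum), squared error
        new_weights = [w - rate * grad * x for w, x in zip(weights, inputs)]
        return new_weights, bias - rate * grad

    # toy usage: teach the neuron that input (1, 0) should map to 1
    w, b = [0.5, -0.4], 0.0
    for _ in range(1000):
        w, b = train_step(w, b, [1.0, 0.0], 1.0)
    print(round(neuron(w, b, [1.0, 0.0]), 2))        # climbs toward 1.0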

What first struck me was the similarity to shift-reduce look-ahead compiler logic. The only difference I could see was deterministic versus nondeterministic behavior in the two problem-solving results: in one (look-ahead) the solution has 100% certainty of use, whereas in the other (neural networks) the use of the solution is uncertain. So the only real difference between the two systems is the expected confidence level of the result; in concept and structure they are identical.
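To make that contrast concrete, here is a toy illustration of my own (plain Python, not taken from any parser generator or network library): the look-ahead side answers from a fixed table with full certainty, while the network side returns an answer with a confidence attached.

    import math

    # Deterministic: a (state, lookahead) table always yields exactly one action.
    parse_table = {
        ("expr", "+"): "shift",
        ("expr", ")"): "reduce",
    }

    def lookahead_decision(state, token):
        return parse_table[(state, token)], 1.0          # action, 100% certainty

    # Nondeterministic in spirit: a trained network yields an answer plus a
    # probability that may fall short of certainty.
    def network_decision(scores):
        # scores: label -> raw score; softmax turns them into probabilities
        exps = {label: math.exp(s) for label, s in scores.items()}
        total = sum(exps.values())
        best = max(exps, key=exps.get)
        return best, exps[best] / total

    print(lookahead_decision("expr", "+"))                   # ('shift', 1.0)
    print(network_decision({"shift": 2.0, "reduce": 0.5}))   # ('shift', ~0.82)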

The one big advantage of the look-ahead approach is that it is well defined: it can be easily programmed from a single declaration table, gives better control over the source code, gives control over the back-propagation logic, can be nondeterministic, can be trained, and can be supervised.
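To show what I mean by a single declaration table, here is a rough sketch (the states and tokens are hypothetical choices of mine, nothing to do with Jane's actual tables). It collapses reduce actions away, so it is closer to a look-ahead state machine than a full shift-reduce parser, but the idea is the same: one table declares every (state, lookahead) decision and a small generic driver just reads it.

    # Declaration table: (current state, lookahead token) -> (action, next state)
    TABLE = {
        ("start",      "id"): ("shift",  "after_id"),
        ("after_id",   "+"):  ("shift",  "after_plus"),
        ("after_plus", "id"): ("shift",  "after_id"),
        ("after_id",   "$"):  ("accept", None),
    }

    def drive(tokens):
        # generic driver: all behavior lives in the table, not in the code
        state = "start"
        for tok in tokens + ["$"]:               # "$" marks end of input
            action, next_state = TABLE.get((state, tok), ("error", None))
            if action == "accept":
                return True
            if action == "error":
                return False
            state = next_state
        return False

    print(drive(["id", "+", "id"]))   # True:  accepted
    print(drive(["id", "+"]))         # False: no entry for ("after_plus", "$")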

My conclusion is that neural networks are the same as an already well-defined technology, just with a different name because they have a different use. So here we go again, re-inventing the wheel, five steps back.... I guess it sounds cool. For those of you who think the parallelism has been lost: no... we simply move it from theory to realistic implementation, as we should do with all logic. To implement neural networks in Jane I will use existing technology with an alternate name.

The true nature of the problem is that we are working toward having nondeterministic results become deterministic; once that happens there is no difference between the two systems. This is moving toward, to use another term I hate, "artificial intelligence", which is: stupid humans getting better at programming until nondeterministic results are "good enough" to be acceptable. Some examples are quantum numbers and image recognition: as the error rate approaches a usable level, we can say that, for practical use, the result is deterministic.
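As a rough illustration of "good enough to be acceptable" (the threshold value below is an arbitrary assumption of mine, not a recommendation), a probabilistic result gets promoted to a practically deterministic answer once its confidence clears a chosen error-rate bar.

    ACCEPT_THRESHOLD = 0.99   # example bar: tolerate at most a 1% error rate

    def practical_result(label, confidence):
        # treat the answer as deterministic for practical use only when the
        # residual error rate (1 - confidence) is below the acceptable level
        if confidence >= ACCEPT_THRESHOLD:
            return label      # usable as if it were certain
        return None           # still indeterminate: keep training or defer

    print(practical_result("cat", 0.995))   # 'cat'
    print(practical_result("cat", 0.80))    # None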