Patent Number: 6,490,571

Title: Method and apparatus for neural networking using semantic attractor architecture

Abstract: A semantic attractor memory uses an evolving neural network architecture and learning rules derived from the study of human language acquisition and change to store, process and retrieve information. The architecture is based on multiple layer channels, with random connections from one layer to the next. One or more layers are devoted to processing input information. At least one processing layer is provided. One or more layers are devoted to processing outputs, and feedback is provided from the outputs back to the processing layer or layers. Inputs from parallel channels are also provided to the one or more processing layers. With the exception of the feedback loop and central processing layers, the network is feedforward unless it is employed in a hybrid back-propagation configuration. The learning rules are based on non-stationary statistical processes, such as the Polya process or the processes leading to Bose-Einstein statistics, again derived from considerations of human language acquisition. The invention provides rapid, unsupervised processing of complex data sets, such as imagery or continuous human speech, and a means to capture successful processing or pattern classification constellations for implementation in other networks.
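
Illustrative sketch: the abstract describes multi-layer channels with sparse random connections from one layer to the next, feedback from the output layer back to the processing layer, parallel-channel inputs, and learning rules based on non-stationary processes such as the Polya process. The Python sketch below shows one possible reading of that description using only NumPy; the layer sizes, connection density, reinforcement constant, and all function and class names are assumptions chosen for illustration, not values or terms taken from the patent.

    # One possible reading of the architecture in the abstract (NumPy only).
    # All sizes and constants are illustrative assumptions, not patent values.
    import numpy as np

    rng = np.random.default_rng(0)

    def random_mask(n_in, n_out, density=0.2):
        """Sparse random connectivity from one layer to the next."""
        return (rng.random((n_in, n_out)) < density).astype(float)

    class ChannelNetwork:
        """One feedforward channel: input layer -> processing layer -> output layer,
        with feedback from the output layer back to the processing layer."""

        def __init__(self, n_in=16, n_proc=32, n_out=8):
            self.W_in = rng.normal(0, 0.1, (n_in, n_proc)) * random_mask(n_in, n_proc)
            self.W_out = rng.normal(0, 0.1, (n_proc, n_out)) * random_mask(n_proc, n_out)
            self.W_fb = rng.normal(0, 0.1, (n_out, n_proc)) * random_mask(n_out, n_proc)
            self.prev_out = np.zeros(n_out)

        def step(self, x, parallel=None):
            """One pass: external input, optional parallel-channel input, and
            feedback from the previous output all drive the processing layer."""
            proc = np.tanh(x @ self.W_in + self.prev_out @ self.W_fb
                           + (0 if parallel is None else parallel))
            out = np.tanh(proc @ self.W_out)
            self._polya_update(proc, out)
            self.prev_out = out
            return out

        def _polya_update(self, proc, out):
            """Polya-urn-style reinforcement (an interpretation of the learning
            rules named in the abstract): co-active connections gain weight in
            proportion to their current share, a 'rich get richer' rule, while
            the total weight mass of the layer is held constant."""
            total = np.abs(self.W_out).sum()
            coactivity = np.outer(np.abs(proc), np.abs(out))
            self.W_out += 0.01 * coactivity * self.W_out
            self.W_out *= total / (np.abs(self.W_out).sum() + 1e-12)

    # Usage: drive the channel with random input for a few unsupervised steps.
    net = ChannelNetwork()
    for _ in range(5):
        y = net.step(rng.random(16))
    print(y.shape)  # (8,)

The sketch keeps the network feedforward apart from the output-to-processing feedback loop, matching the abstract's description; a hybrid back-propagation configuration, also mentioned there, is not shown.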

Inventors: Cooper; David L. (Fairfax, VA)

Assignee:

International Classification: G06N 3/04 (20060101); G06N 3/00 (20060101); G06N 003/02 ()

Expiration Date: 12/02015