SBE Nuggets
Neural Networks: Simulation of Mind


Since the 1940s, American scientists attempting to understand the human brain have developed mathematical and computer models, called neural networks, that try to duplicate the computational power of the human nervous system. For every human behavior or perceptual activity, such as vision, memory, and language, the brain enlists dynamic, interacting populations of neurons (nerve cells) into coordinated activity to perform the specific task at hand. Dr. James A. Anderson is a neural network pioneer who has contributed to the understanding of mental computation using models based loosely on the architecture of the nervous system. His work over several decades, much of it funded by the National Science Foundation, has aided progress in cognitive science and neuroscience and has proven useful for building "smart" machines and other forms of artificial intelligence.

With funding from the NSF's Human Cognition and Perception Program, part of the Behavioral and Cognitive Sciences (BCS) division, Dr. Anderson, Chair of Brown University's Department of Cognitive and Linguistic Sciences, and his colleagues have recently studied neural network modeling of human reaction times. For more than a century, psychologists have studied the patterns in the time it takes a person to produce an answer to a problem, in an effort to better understand the details of mental operation. This venerable technique benefits from new approaches combining Dr. Anderson's nonlinear BSB neural network with computer simulations. A major part of this project involves examining how humans solve simple arithmetic problems, a surprisingly difficult task for both humans and artificial neural networks. Instead of computing the answers the way a digital computer would, humans (and networks) seem to learn elementary arithmetic very differently, by memorizing facts and estimating answers. Although these strategies can be error-prone and slow, natural extensions of them can give rise to the powerful but poorly understood faculty of mathematical intuition, as well as the ability to reason "intuitively" about complex systems.
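The BSB ("Brain-State-in-a-Box") model mentioned above is an autoassociative network: a state vector is repeatedly fed back through a weight matrix and clipped to the unit hypercube, so it settles into a corner representing a stored memory. The sketch below is a minimal, illustrative version of that dynamic; the function names, the feedback constant `alpha`, and the simple Hebbian training are assumptions for demonstration, not details of Anderson's actual simulations.

```python
import numpy as np

def bsb_step(x, W, alpha=0.2):
    """One BSB iteration: feed the state back through W, scale by alpha,
    then clip each component to the [-1, 1] "box"."""
    return np.clip(x + alpha * (W @ x), -1.0, 1.0)

def bsb_settle(x, W, alpha=0.2, max_iters=100):
    """Iterate until the state stops changing, i.e., it has reached a
    stable corner of the hypercube (a stored memory)."""
    for _ in range(max_iters):
        nxt = bsb_step(x, W, alpha)
        if np.allclose(nxt, x):
            break
        x = nxt
    return x

# Store one pattern with a simple Hebbian outer product.
pattern = np.array([1.0, -1.0, 1.0, -1.0])
W = np.outer(pattern, pattern) / len(pattern)

# A degraded cue (same signs, weaker magnitudes) is pulled back to the
# stored corner, which is the "fact retrieval" behavior described above.
cue = np.array([0.8, -0.6, 0.9, -0.4])
recovered = bsb_settle(cue, W)
```

Because the feedback amplifies the state along the stored direction while the clipping bounds it, the degraded cue saturates component by component until it reaches the memorized corner.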

Another NSF-funded project in which Dr. Anderson is involved is funded through the Learning and Intelligent Systems (LIS) initiative, which promotes studies of intelligence in humans, animals, and artificial systems. Anderson and colleagues are conducting a study on "Adaptive Cortical Computation in the Visual Domain," which investigates the long-range spatial interactions among neurons during visual processing. The neural network model used in this study, informally called the "network of networks," attempts to bridge the huge gap in scale between single processing elements (neurons) and entire brain regions that may contain hundreds of millions of cooperating neurons. The model suggests that neurons work together to build larger and larger functional groupings, with still larger groupings forming from the smaller ones by similar rules of formation.

In his years of NSF-funded research, James Anderson has proven repeatedly that, in his own words, "cognitive science can, in fact, be immensely practical in the right situation." Network models similar to those he and his colleagues first developed with NSF support to simulate the human nervous system have become the foundation for the artificial neural networks now used routinely in many pattern-based applications such as credit verification, medical diagnosis, and speech recognition. Anderson has also collaborated with companies such as Texas Instruments to improve military electronics. One project required a means of analyzing a confusing flood of radar signal data (see sidebar). The radar data was processed by a neural network designed to simplify the complex, as humans do, by breaking information into manageable blocks. Humans use "concepts" to simplify and understand a complex environment, and the techniques used for radar analysis were based directly on neural network models of human concept formation. Anderson and colleagues at Distributed Data Systems, Inc. have used this idea to develop a "smart" radar analysis system for the U.S. Navy. The NSF has also supported Anderson and colleagues with funds for the purchase of powerful modern computer facilities for these neural network simulations, as well as for related work on understanding the mechanics of the human mind.

For more information please see:

Anderson, J.A., and Sutton, J.P. (1997) "If We Compute Faster, Do We Understand Better?" Behavior Research Methods, Instruments, & Computers, 29 (1), 66-77.

Anderson, J.A. (1991) "Why, having so many neurons, do we have so few thoughts?", in W.E. Hockley and S. Lewandowsky (eds.), Relating Theory to Data: Essays on Human Memory. Hillsdale, NJ: Erlbaum.

Anderson, J.A., D. Bennett, and K. Spoehr (1993) "A study in numerical perversity: Teaching arithmetic to a neural network", in D.S. Levine and M. Aparicio (eds.), Neural Networks for Knowledge Representation and Inference. Hillsdale, NJ: Erlbaum.

Anderson, J.A. (1993) "The BSB Model: A simple non-linear autoassociative network", in M. Hassoun (ed.), Associative Neural Memories: Theory and Implementation, Oxford: Oxford University Press.

Anderson, J.A., Gately, M.T., Penz, P.A., and Collins, D.R. (1990) "Radar Signal Categorization Using a Neural Network", Proceedings of the IEEE, Vol. 78, No. 10, October 1990.

Anderson, J.A. (1992) "Neural networks and Mark Twain's Cat." IEEE Communications Magazine, September 1992, 16-23.

Anderson, J.A. (1995) An Introduction to Neural Networks, Cambridge, MA: MIT Press.
A Visual Demonstration of
Radar Signal Categorization Using a Neural Network

For several practical applications in complex radar environments, it is important to be able to tell quickly how many emitters are present and what their properties are. This is a simplified simulation of a system that uses a neural network to accomplish this task.
The system as a whole is referred to as the Adaptive Network Sensor Processor.

The input signals are made up of pulses from a number of different emitters, which may be difficult to identify because of noise, movement or even deception.

The first part of the system is a Feature Extractor, which processes each pulse into feature values: azimuth, elevation, signal-to-noise ratio, frequency, and pulse width.
These data are then passed to the Deinterleaver, which clusters incoming radar pulses into groups. A number of pulses are observed, and a neural network computes how many emitters are present and estimates their properties.
The Pulse Pattern Extractor uses the deinterleaved information to compute the pulse repetition pattern of the emitters, using times of arrival for the pulses contained in a given cluster.
The fourth and fifth parts of the system are the Tracker, which acts as long-term memory for the clusters found by the Deinterleaver, storing the average features of the pulses; and the Classifier, which identifies the emitters based on a database of emitter types.
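The core of the pipeline above is the Deinterleaver's clustering step, with the Tracker maintaining running averages of each cluster's features. The original ANSP used a neural network for this; the sketch below substitutes a simple leader-follower (nearest-centroid) rule as an illustrative stand-in, with the function name, feature values, and distance threshold all invented for the example.

```python
import numpy as np

def deinterleave(pulses, radius=2.0):
    """Assign each pulse feature vector to the nearest existing cluster
    centroid within `radius`; otherwise start a new cluster. Returns the
    centroids (one per inferred emitter) and a label per pulse."""
    centroids, counts, labels = [], [], []
    for p in pulses:
        if centroids:
            dists = [np.linalg.norm(p - c) for c in centroids]
            k = int(np.argmin(dists))
            if dists[k] <= radius:
                # Running average of features: the Tracker's "long-term
                # memory" role in the pipeline described above.
                counts[k] += 1
                centroids[k] += (p - centroids[k]) / counts[k]
                labels.append(k)
                continue
        centroids.append(p.astype(float))
        counts.append(1)
        labels.append(len(centroids) - 1)
    return centroids, labels

# Two simulated emitters: feature vectors of
# (azimuth, elevation, SNR, frequency, pulse width), lightly jittered.
emitter_a = np.array([10.0, 5.0, 20.0, 3.1, 0.5])
emitter_b = np.array([40.0, 2.0, 15.0, 9.7, 1.2])
pulses = [emitter_a + 0.1, emitter_b - 0.1, emitter_a - 0.05, emitter_b + 0.05]
centroids, labels = deinterleave(pulses)
```

Interleaved pulses from the two emitters are separated into two clusters, and each centroid is the averaged feature estimate the Classifier would match against its database of emitter types.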
The most important question for the ANSP to answer is:
"Who is looking at me, should I be concerned, and what can I do about it?"
This system has been developed further by Distributed Data Systems, Inc. for the U.S. Navy, where it is known as the RSC (Radar Source Classification) package.

Dr. Anderson's Web page at: http://biomedcs.biomed.brown.edu/NeuroBrochure/Faculty/Anderson.html

This research is supported by the Human Cognition and Perception Program and the Learning and Intelligent Systems Initiative.

All photos and illustrations are copyright © of their respective owners and may not be used without permission.