- Shun-ichi Amari, Stephen Grossberg, John G. Taylor:
**Editorial.**1-

- Nathan Intrator, Leon N. Cooper:
**Objective function formulation of the BCM theory of visual cortical plasticity: Statistical connections, stability conditions.**3-17

- Li Deng:
**Processing of acoustic signals in a cochlear model incorporating laterally coupled suppressive elements.**19-34

- Alan Y. K. Wong, John A. Armour:
**A neural network model of canine intrathoracic ganglia regulating the heart.**35-46

- Robert L. Coultrip, Richard H. Granger, Gary Lynch:
**A cortical model of winner-take-all competition via lateral inhibition.**47-54

- Jacob M. J. Murre, R. Hans Phaf, Gezinus Wolters:
**CALM: Categorizing and learning module.**55-82

- Michael A. Cohen:
**The construction of arbitrary stable dynamics in nonlinear neural networks.**83-103

- Yoshifusa Ito:
**Approximation of continuous functions on R^{d} by linear combinations of shifted rotations of a sigmoid function with and without scaling.**105-115

- Avrim Blum, Ronald L. Rivest:
**Training a 3-node neural network is NP-complete.**117-127

- A. Ronald Gallant, Halbert White:
**On learning the derivatives of an unknown mapping with multilayer feedforward networks.**129-138

- Frank Bärmann, Friedrich Biegler-König:
**On a class of efficient learning algorithms for neural networks.**139-144

- Tom Heskes, Stan C. A. M. Gielen:
**Retrieval of pattern sequences at variable speeds in a neural network with delays.**145-152

- Eg G. J. Eijkman:
**Neural nets tested by psychophysical methods.**153-162

- Shengwei Zhang, Anthony G. Constantinides, Li-He Zou:
**Further noise rejection in linear associative memories.**163-168

- Cleber M. Gomes, Hiroyuki Sekine, Takashi Yamazaki, Shunsuke Kobayashi:
**Bipolar optical neural networks using ferroelectric liquid crystal devices.**169-177

- Pierre Landau, Eric L. Schwartz:
**Computer simulation of cortical polymaps: A proto-column algorithm.**187-206

- Pierre Cardaliaguet, Guillaume Euvrard:
**Approximation of a function and its derivative with a neural network.**207-220

- Neil E. Cotter, Thierry J. Guillerm:
**The CMAC and a theorem of Kolmogorov.**221-228

- Kurt Hornik, Chung-Ming Kuan:
**Convergence analysis of local feature extraction algorithms.**229-240

- David H. Wolpert:
**Stacked generalization.**241-259

- Ray H. White:
**Competitive Hebbian learning: Algorithm and demonstrations.**261-275

- Olivier François, Jacques Demongeot, Thierry Hervé:
**Convergence of a self-organizing stochastic neural network.**277-282

- Harry A. C. Eaton, Tracy L. Olivier:
**Learning coefficient dependence on training set size.**283-288

- Kevin N. Gurney:
**Training nets of hardware realizable sigma-pi units.**289-303

- Guy Tabary, Isabelle Salaün:
**Control of a redundant articulated system by neural networks.**305-311

- Ichiro Tsuda:
**Dynamic link of memory--Chaotic memory map in nonequilibrium neural networks.**313-326

- Stefan Bornholdt, Dirk Graudenz:
**General asymmetric neural networks and structure design by genetic algorithms.**327-334

- Raghu Krishnapuram, Joonwhoan Lee:
**Fuzzy-set-based hierarchical networks for information fusion in computer vision.**335-350

- David Casasent:
**Multifunctional hybrid neural net.**361-370

- Thaddeus J. Marczynski, Leslie L. Burns, Christopher A. Monley:
**Empirically derived model of the role of sleep in associative learning and recuperative processes.**371-402

- Michel Kerszberg, Stanislas Dehaene, Jean-Pierre Changeux:
**Stabilization of complex input-output functions in neural clusters formed by synapse selection.**403-413

- Bard Ermentrout:
**Complex dynamics in winner-take-all neural nets with slow inhibition.**415-431

- Eduardo R. Caianiello, Antonella De Benedictis, Alfredo Petrosino, Roberto Tagliaferri:
**Neural associative memories with minimum connectivity.**433-439

- Lei Xu, Erkki Oja, Ching Y. Suen:
**Modified Hebbian learning for curve and surface fitting.**441-457

- Helmut Riedel, Detlev Schild:
**The dynamics of Hebbian synapses can be stabilized by a nonlinear decay term.**459-463

- Arjen Van Ooyen, Bernard Nienhuis:
**Improving the convergence of the back-propagation algorithm.**465-471

- Nikzad Benny Toomarian, Jacob Barhen:
**Learning a trajectory using adjoint functions and teacher forcing.**473-484

- Hua Yang, Tharam S. Dillon:
**Convergence of self-organizing neural algorithms.**485-493

- Kiyotoshi Matsuoka:
**Stability conditions for nonlinear continuous neural networks with asymmetric connection weights.**495-500

- Vera Kurková:
**Kolmogorov's theorem and multilayer neural networks.**501-506

- Baocheng Bai, Nabil H. Farhat:
**Learning networks for extrapolation and radar target identification.**507-529

- Robert O. Gjerdingen:
**Learning syntactically significant temporal patterns of chords: A masking field embedded in an ART 3 architecture.**551-564

- Bert de Vries, José Carlos Príncipe:
**The gamma model--A new neural model for temporal processing.**565-576

- Edward K. Blum, Xin Wang:
**Stability of fixed points and periodic orbits and bifurcations in analog neural networks.**577-587

- Héctor J. Sussmann:
**Uniqueness of the weights for minimal feedforward nets with a given input-output map.**589-593

- Mohamad T. Musavi, Wahid Ahmed, Khue Hiang Chan, K. B. Faris, Donald M. Hummels:
**On the training of radial basis function classifiers.**595-603

- George J. Mpitsos, Robert M. Burton Jr.:
**Convergence and divergence in neural networks: Processing of chaos and biological analogy.**605-625

- Robert M. Burton Jr., George J. Mpitsos:
**Event-dependent control of noise enhances learning in neural networks.**627-637

- Bo Zhang, Ling Zhang, Huai Zhang:
**A quantitative analysis of the behaviors of the PLN network.**639-644

- William G. Gibson, John Robinson:
**Statistical analysis of the dynamics of a sparse associative memory.**645-661

- Shigeo Abe, Junzo Kawakami, Kotaro Hirasawa:
**Solving inequality constrained combinatorial optimization problems by the Hopfield neural networks.**663-670

- Benjamin J. Hellstrom, Laveen N. Kanal:
**Asymmetric mean-field neural networks for multiprocessor scheduling.**671-686

- David Casasent, Brian Telfer:
**High capacity pattern recognition associative processors.**687-698

- Igor Grebert, David G. Stork, Ron Keesing, Steve Mims:
**Connectionist generalization for production: An example from GridFont.**699-710

- Yuval Lirov:
**Computer aided neural network engineering.**711-719

- François Chapeau-Blondeau, Gilbert A. Chauvet:
**Stable, oscillatory, and chaotic regimes in the dynamics of small neural networks with delay.**735-743

- Michael Georgiopoulos, Gregory L. Heileman, Juxin Huang:
**The *N-N-N* conjecture in ART1.**745-753

- Osamu Fujita:
**Optimization of the hidden unit function in feedforward neural networks.**755-764

- Fa-Long Luo, Bao Zheng:
**Real-time neural computation of the maximum likelihood criterion for bearing estimation problems.**765-769

- Richard Zollner, H. J. Schmitz, Fritz Wünsch, Uwe Krey:
**Fast generating algorithm for a general three-layer perceptron.**771-777

- Samir Shah, Francesco Palmieri, Michael Datum:
**Optimal filtering algorithms for fast learning in feedforward neural networks.**779-787

- Jennifer L. Raymond, Douglas A. Baxter, Dean V. Buonomano, John H. Byrne:
**A learning rule based on empirically-derived activity-dependent neuromodulation supports operant conditioning in a small network.**789-803

- Jacek M. Kowalski, Gerald L. Albert, Barry K. Rhoades, Guenter W. Gross:
**Neuronal networks with spontaneous, correlated bursting activity: Theory and simulations.**805-822

- Georg Hartmann:
**Motion induced transformations of spatial representations: Mapping 3D information onto 2D.**823-834

- Mark W. Mao, James B. Kuo:
**A coded block adaptive neural network system with a radical-partitioned structure for large-volume Chinese characters recognition.**835-841

- Michael Sabourin, Amar Mitiche:
**Optical character recognition by a neural network.**843-852

- Bruce R. Davis, Tim R. Pattison:
**Error in proof of exponential convergence.**869-

- Philippe Lefèvre, Henrietta L. Galiana:
**Dynamic feedback to the superior colliculus in a neural network model of the gaze control system.**871-890

- Bahram Nabet, Robert B. Darling, Robert B. Pinter:
**Implementation of front-end processor neural networks.**891-902

- Henrik Schiøler, Uwe Hartmann:
**Mapping neural network derived from the Parzen window estimator.**903-909

- Eduard Moser, Tiko Kameda:
**Bounds on the number of hidden units of Boltzmann machines.**911-921

- David G. Stork, James D. Allen:
**How to solve the N-bit parity problem with two hidden units.**923-926

- Erkki Oja:
**Principal components, minor components, and linear neural networks.**927-935

- Gilles Burel:
**Blind separation of sources: A nonlinear neural algorithm.**937-947

- Derek M. Wells:
**Solving degenerate optimization problems using networks of neural oscillators.**949-959

- Kanad Chakraborty, Kishan Mehrotra, Chilukuri K. Mohan, Sanjay Ranka:
**Forecasting the behavior of multivariate time series using neural networks.**961-970

- John Shawe-Taylor, Martin Anthony, Walter Kern:
**Classes of feedforward neural networks and their circuit complexity.**971-977