- Stephen Grossberg, John G. Taylor:
**The fifth anniversary of neural networks.**1

- Kurt Hornik, Maxwell B. Stinchcombe, Halbert White:
**Letters to the editor.**3

- Vladik Kreinovich:
**Letters to the editor.**3-4

- Terry Bossomaier, Natalina Isidoro, Adrian Loeff:
**Errors from grid approximation of IFS codes.**5-6

- Marc M. Van Hulle, Tom Tollenaere:
**A modular artificial neural network for texture processing.**7-32

- Kunihiko Fukushima, Taro Imagawa:
**Recognition and segmentation of connected characters with selective attention.**33-41

- Stephen Grossberg, Frank H. Guenther, Daniel Bullock, Douglas N. Greve:
**Neural representations for sensory-motor control, II: Learning a head-centered visuomotor representation of 3-D target position.**43-67

- Jeffrey Hoffman, Josef Skrzypek, Jacques J. Vidal:
**Cluster network for recognition of handwritten, cursive script characters.**69-78

- Andrew H. Gee, Sreeram V. B. Aiyer, Richard W. Prager:
**An analytical framework for optimizing neural networks.**79-97

- Sam-Kit Sin, Rui J. P. de Figueiredo:
**Efficient learning procedures for optimal interpolative nets.**99-113

- Masahiko Morita:
**Associative memory with nonmonotone dynamics.**115-126

- Friedrich Biegler-König, Frank Bärmann:
**A learning algorithm for multilayered neural networks based on linear least squares problems.**127-131

- Kevin N. Gurney:
**Training nets of stochastic units using system identification.**133-145

- Shun-ichi Amari:
**A universal theorem on learning curves.**161-166

- Shuji Yoshizawa, Masahiko Morita, Shun-ichi Amari:
**Capacity of associative memory using a nonmonotonic neuron model.**167-176

- Rüdiger W. Brause:
**The error-bounded descriptional complexity of approximation networks.**177-187

- John J. Shynk, Neil J. Bershad:
**Stationary points of a single-layer perceptron for nonseparable data models.**189-202

- Kevin T. Judd, Kazuyuki Aihara:
**Pulse propagation networks: A neural network model that uses temporal coding by action potentials.**203-215

- Koji Nakajima, Yoshihiro Hayakawa:
**Correct reaction neural network.**217-222

- Albert Y. Zomaya, Tarek M. Nabhan:
**Centralized and decentralized neuro-adaptive robot controllers.**223-244

- Haluk Ögmen:
**A neural theory of retino-cortical dynamics.**245-273

- Michael Sabourin, Amar Mitiche:
**Modeling and classification of shape using a Kohonen associative memory with selective multiresolution.**275-283

- Christopher Ting, Keng-Chee Chuang:
**An adaptive algorithm for neocognitron to recognize analog images.**285-299

- Mats Bengtsson:
**A neural system as a dynamical model for early vision.**313-325

- Haruo Kobayashi, Takashi Matsumoto, Tetsuya Yagi, Takuji Shimmi:
**Image processing regularization filters on layered architecture.**327-350

- Thierry Denoeux, Régis Lengellé:
**Initializing back propagation networks with prototypes.**351-363

- Alessandro Sperduti, Antonina Starita:
**Speed up learning and network optimization with extended back propagation.**365-383

- Charles W. Lee:
**Learning in neural networks by using tangent planes to constraint surfaces.**385-392

- Lars Kai Hansen:
**Stochastic linear learning: Exact test and training error averages.**393-396

- Mohamad T. Musavi, K. Kalantri, Wahid Ahmed, Khue Hiang Chan:
**A minimum error neural network (MNN).**397-407

- Jörgen M. Karlholm:
**Associative memories with short-range, higher order couplings.**409-421

- John G. Taylor, Stephen Coombes:
**Learning higher order correlations.**423-427

- Martin Brown, Chris J. Harris, Patrick C. Parks:
**The interpolation capabilities of the binary CMAC.**429-440

- Geoffrey J. Chappell, John G. Taylor:
**The temporal Kohonen map.**441-445

- Stephen Grossberg:
**A solution of the figure-ground problem for biological vision.**463-483

- Hiroaki Gomi, Mitsuo Kawato:
**Recognition of manipulated objects by motor learning with modular architecture networks.**485-497

- Gyöngyi Gaál:
**Population coding by simultaneous activities of neurons in intrinsic coordinate systems defined by their receptive field weighting functions.**499-515

- Wolfram Schiffmann, H. Willi Geffers:
**Adaptive control of dynamic systems by back propagation networks.**517-524

- Martin Fodslette Møller:
**A scaled conjugate gradient algorithm for fast supervised learning.**525-533

- Asim Roy, Lark Sang Kim, Somnath Mukhopadhyay:
**A polynomial time algorithm for the construction and training of a class of multilayer perceptrons.**535-545

- Ulrich Ramacher:
**Hamiltonian dynamics of neural networks.**547-557

- Giancarlo Parodi, Sandro Ridella, Rodolfo Zunino:
**Using chaos to generate keys for associative noise-like coding memories.**559-572

- Gregory Allen Kohring:
**On the *Q*-state neuron problem in attractor neural networks.**573-581

- Stevan V. Odri, Dusan P. Petrovacki, Gordana A. Krstonosic:
**Evolutional development of a multilevel neural network.**583-595

**Conferences on neural networks and related topics.**597-606

- Dale A. Brown:
**Letters to the editor.**607-608

- Alexander Korn:
**Letters to the editor.**608

- David G. Stork:
**Letter to the editor.**609

- Nico Weymaere:
**Letter to the editor.**611

- Arjen van Ooyen, B. Nienhuis:
**Response to letter by N. Weymaere.**611-612

- Terry M. Caelli, David McG. Squire, Tom P. J. Wild:
**Model-based neural networks.**613-625

- Lei Xu:
**Least mean square error reconstruction principle for self-organizing neural-nets.**627-648

- Pascal Koiran:
**On the complexity of approximating mappings using feedforward networks.**649-653

- Michel Benaïm:
**The "off line learning approximation" in continuous time neural networks: An adiabatic theorem.**655-665

- Frank E. McFadden, Yun Peng, James A. Reggia:
**Local conditions for phase transitions in neural networks with variable connection strengths.**667-676

- Theodore A. Burton:
**Averaged neural networks.**677-680

- Harald Englisch, Yegao Xiao, Kailun Yao:
**Strongly diluted networks with self-interaction.**681-688

- Haruhisa Takahashi, Etsuji Tomita, Tsutomu Kawabata:
**Separability of internal representations in multilayer perceptrons with application to learning.**689-703

- J. M. Minor:
**Parity with two layer feedforward nets.**705-707

- Katsunori Shimohara, Tadasu Uchiyama, Yukio Tokunaga:
**Subconnection neural network for event-driven temporal sequence processing.**709-718

- Youngjik Lee, Sang-Hoon Oh, Myung Won Kim:
**An analysis of premature saturation in back propagation learning.**719-728

- Shigeo Abe, Masahiro Kayama, Hiroshi Takenaga, Tadaaki Kitamura:
**Extracting algorithms from pattern classification neural networks.**729-735

- Brendan L. Rogers:
**New conditioned stimulus trace circuit for the rabbit's nictitating membrane response.**753-769

- William Finnoff, Ferdinand Hergert, Hans-Georg Zimmermann:
**Improving model selection by nonconvergent methods.**771-783

- Hubertus M. A. Andree, Gerard T. Barkema, Wim Lourens, Arie Taal, Jos C. Vermeulen:
**A comparison study of binary feedforward neural networks and digital circuits.**785-790

- Faouzi Bouslama, Akira Ichikawa:
**Application of neural networks to fuzzy control.**791-799

- Ken-ichi Funahashi, Yuichi Nakamura:
**Approximation of dynamical systems by continuous time recurrent neural networks.**801-806

- Thierry Catfolis:
**A method for improving the real-time recurrent learning algorithm.**807-821

- Mark D. Plumbley:
**Efficient information transfer and anti-Hebbian neural networks.**823-833

- Pierre Courrieu:
**A convergent generator of neural networks.**835-844

- Ali A. Minai, Ronald D. Williams:
**On the derivatives of the sigmoid.**845-853

- Masahiko Arai:
**Bounds on the number of hidden units in binary-valued three-layer neural networks.**855-860

- Moshe Leshno, Vladimir Ya. Lin, Allan Pinkus, Shimon Schocken:
**Multilayer feedforward networks with a nonpolynomial activation function can approximate any function.**861-867

- Shashank K. Mehta, Laszlo Fulop:
**An analog neural network to solve the Hamiltonian cycle problem.**869-881