Neural Networks: An Introduction - Info and Reading Options
By Berndt Müller, Joachim Reinhardt, and Michael T. Strickland

"Neural networks" was published by Springer-Verlag in 1995 - New York, it has 329 pages and the language of the book is English.
“Neural networks” Metadata:
- Title: Neural networks
- Authors: Berndt Müller, Joachim Reinhardt, Michael T. Strickland
- Language: English
- Number of Pages: 329
- Publisher: Springer-Verlag
- Publish Date: 1995
- Publish Location: New York
“Neural networks” Subjects and Themes:
- Subjects: ➤ Neural networks (Computer science) - Neurosciences - Neural computing - Science - Computers - Communications / Networking - Science / Mathematics - Physics - Computers / Neural Networks - Science / Physics
Edition Specifications:
- Pagination: xv, 329 p.
Edition Identifiers:
- The Open Library IDs: OL21151101M (edition), OL9076197W (work)
- Library of Congress Control Number (LCCN): 95024948
- ISBN-10: 3540602070
“Neural networks” Table of Contents:
- 1- Models of Neural Networks
- 2- The Structure of the Central Nervous System
- 3- The Neuron
- 4- The Cerebral Cortex
- 5- Neural Networks Introduced
- 6- A Definition
- 7- A Brief History of Neural Network Models
- 8- Why Neural Networks?
- 9- Parallel Distributed Processing
- 10- Understanding How the Brain Works
- 11- General Literature on Neural Network Models
- 12- Associative Memory
- 13- Associative Information Storage and Recall
- 14- Learning by Hebb's Rule (a short code sketch follows this table of contents)
- 15- Neurons and Spins
- 16- The "Magnetic" Connection
- 17- Parallel versus Sequential Dynamics
- 18- Neural "Motion Pictures"
- 19- Stochastic Neurons
- 20- The Mean-Field Approximation
- 21- Single Patterns
- 22- Several Patterns
- 23- Cybernetic Networks
- 24- Layered Networks
- 25- Simple Perceptrons
- 26- The Perceptron Learning Rule
- 27- "Gradient" Learning
- 28- A Counterexample: The Exclusive-OR Gate
- 29- Multilayered Perceptrons
- 30- Solution of the XOR Problem
- 31- Learning by Error Back-Propagation
- 32- Boolean Functions
- 33- Representation of Continuous Functions
- 34- Applications
- 35- Prediction of Time Series
- 36- The Logistic Map
- 37- A Nonlinear Delayed Differential Equation
- 38- Nonlinear Prediction of Noisy Time Series
- 39- Learning to Play Backgammon
- 40- Prediction of the Secondary Structure of Proteins
- 41- Net-Talk: Learning to Pronounce English Text
- 42- More Applications of Neural Networks
- 43- Neural Networks in High Energy Physics
- 44- The Search for Heavy Quarks
- 45- Triggering
- 46- Mass Reconstruction
- 47- The Jetnet Code
- 48- Pattern Recognition
- 49- Handwriting and Text Recognition
- 50- Speech Recognition
- 51- 3D Target Classification and Reconstruction
- 52- Neural Networks in Biomedicine
- 53- Neural Networks in Economics
- 54- Network Architecture and Generalization
- 55- Building the Network
- 56- Dynamic Node Creation
- 57- Learning by Adding Neurons
- 58- Can Neural Networks Generalize?
- 59- General Aspects
- 60- Generalization and Information Theory
- 61- Invariant Pattern Recognition
- 62- Higher-Order Synapses
- 63- Preprocessing the Input Patterns
- 64- Associative Memory: Advanced Learning Strategies
- 65- Storing Correlated Patterns
- 66- The Projection Rule
- 67- An Iterative Learning Scheme
- 68- Repeated Hebbian Learning
- 69- Special Learning Rules
- 70- Forgetting Improves the Memory!
- 71- Nonlinear Learning Rules
- 72- Dilution of Synapses
- 73- Networks with a Low Level of Activity
- 74- Combinatorial Optimization
- 75- NP-Complete Problems
- 76- Optimization by Simulated Annealing
- 77- Realization on a Network
- 78- The Traveling-Salesman Problem
- 79- Optical Image Processing
- 80- VLSI and Neural Networks
- 81- Hardware for Neural Networks
- 82- Networks Composed of Analog Electronic Circuits
- 83- Symmetrical Networks with Hidden Neurons
- 84- The Boltzmann Machine
- 85- The "Boltzmann" Learning Rule
- 86- Applications
- 87- Coupled Neural Networks
- 88- Stationary States
- 89- Recurrent Back-Propagation
- 90- Back-Propagation Through Time
- 91- Network Hierarchies
- 92- Unsupervised Learning
- 93- "Selfish" Neurons
- 94- Learning by Competition
- 95- "The Winner Takes All"
- 96- Structure Formation by Local Inhibition
- 97- The Kohonen Map
- 98- Implementations of Competitive Learning
- 99- A Feature-Sensitive Mapping Network
- 100- The Neocognitron
- 101- Evolutionary Algorithms for Learning
- 102- Why Do Evolutionary Algorithms Work?
- 103- Evolving Neural Networks
- 104- Feed-Forward Networks
- 105- Recurrent Networks
- 106- Evolving Back-Propagation
- 107- Statistical Physics of Neural Networks
- 108- Statistical Physics and Spin Glasses
- 109- Elements of Statistical Mechanics
- 110- Spin Glasses
- 111- Averages and Ergodicity
- 112- The Edwards-Anderson Model
- 113- The Hopfield Network for p/N → 0
- 114- Evaluation of the Partition Function
- 115- Equilibrium States of the Network
- 116- The Hopfield Network for Finite p/N
- 117- The Replica Trick
- 118- Averaging over Patterns
- 119- The Saddle-Point Approximation
- 120- Phase Diagram of the Hopfield Network
- 121- Storage Capacity of Nonlinear Neural Networks
- 122- The Dynamics of Pattern Retrieval
- 123- Asymmetric Networks
- 124- Highly Diluted Networks
- 125- The Fully Coupled Network
- 126- The Space of Interactions in Neural Networks
- 127- Replica Solution of the Spherical Model
- 128- Results
- 129- Computer Codes
- 130- Numerical Demonstrations
- 131- How to Use the Computer Programs
- 132- Notes on the Software Implementation
- 133- Asso: Associative Memory
- 134- Program Description
- 135- Numerical Experiments
- 136- Asscount: Associative Memory for Time Sequences
- 137- Program Description
- 138- Numerical Experiments
- 139- Perbool: Learning Boolean Functions with Back-Prop
- 140- Program Description
- 141- Numerical Experiments
- 142- Perfunc: Learning Continuous Functions with Back-Prop
- 143- Program Description
- 144- Numerical Experiments
- 145- Solution of the Traveling-Salesman Problem
- 146- The Hopfield-Tank Model
- 147- TSPHop: Program Description
- 148- The Potts-Glass Model
- 149- The Mean-Field Approximation of the Potts Model
- 150- TSPOtts: Program Description
- 151- TSAnneal: Simulated Annealing
- 152- Numerical Experiments
- 153- Kohomap: The Kohonen Self-Organizing Map
- 154- Program Description
- 155- Numerical Experiments
- 156- BTT: Back-Propagation Through Time
- 157- Program Description
- 158- Numerical Experiments
- 159- Neurogen: Using Genetic Algorithms to Train Networks
- 160- Program Description
- 161- References
- 162- Index
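The Hebbian associative memory that opens the book (the chapters "Learning by Hebb's Rule" and "Neurons and Spins", demonstrated by the Asso program in the Computer Codes part) is compact enough to sketch in a few lines. Below is a minimal illustration, assuming binary ±1 neurons, the standard Hebb rule w_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ, and synchronous sign-function updates; it is not the book's own Asso code, and all names in it are our own.

```python
# Minimal Hopfield-style associative memory with Hebbian learning.
# Illustrative sketch only; not the book's Asso program.
import numpy as np

def hebb_weights(patterns):
    """Hebb's rule: w_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu, no self-coupling."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, probe, max_steps=20):
    """Synchronous updates s_i <- sign(sum_j w_ij s_j) until a fixed point."""
    s = probe.copy()
    for _ in range(max_steps):
        s_new = np.sign(w @ s)
        s_new[s_new == 0] = 1          # break ties toward +1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 100))     # p = 3 patterns, N = 100 neurons
w = hebb_weights(patterns)

noisy = patterns[0].copy()
flipped = rng.choice(100, size=15, replace=False)
noisy[flipped] *= -1                              # corrupt 15% of the bits
restored = recall(w, noisy)
print("overlap with stored pattern:", restored @ patterns[0] / 100)
```

With p/N = 0.03, far below the storage capacity α_c ≈ 0.138 derived in the book's statistical-physics chapters, the network typically restores the corrupted pattern exactly (overlap 1.0).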
Read “Neural networks”:
Read “Neural networks” by choosing from the options below.
Search for “Neural networks” downloads:
Visit our Downloads Search page to see if downloads are available.
Borrow "Neural networks" Online:
Check whether the book is available for online borrowing. Note that online borrowing is subject to copyright restrictions and that ebook quality may vary.
- Is Online Borrowing Available: Yes
- Preview Status: restricted
- Check availability at: The Open Library & The Internet Archive
Find “Neural networks” in Libraries Near You:
Read or borrow “Neural networks” from your local library.
- The WorldCat Libraries Catalog: Find a copy of “Neural networks” at a library near you.
Buy “Neural networks” online:
Shop for “Neural networks” on popular online marketplaces.
- eBay: New and used books.