Authors: Xingui He, Shaohua Xu
Title: Process Neural Networks: Theory and Applications
Publisher: Springer-Verlag
ISBN: 9783540737629
Series: Advanced Topics in Science and Technology in China
Edition: 1
Price: CHF 123.50
Category: Application Software
Language: English
Pages: 240
Copy protection: Watermark/DRM
Devices: PC/MAC/eReader/Tablet
Format: PDF

For the first time, this book sets forth the concept and model for a process neural network. You'll discover how a process neural network expands the mapping relationship between the input and output of traditional neural networks and greatly enhances the expression capability of artificial neural networks. Detailed illustrations help you visualize information processing flow and the mapping relationship between inputs and outputs.
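The core idea described above is that a process neuron accepts a time-varying function as input rather than a static vector: the weight is itself a function of time, and the neuron aggregates the weighted input over the whole time interval before applying the activation. A minimal numerical sketch of that idea follows; the function names, the `tanh` activation, and the Riemann-sum integration are illustrative assumptions, not the book's notation.

```python
import numpy as np

def process_neuron(x, w, theta=0.0, dt=0.01):
    """Discrete sketch of a process neuron:
    y = f( integral over [0, T] of w(t) * x(t) dt  -  theta ),
    where x(t) is a time-varying input and w(t) a weight function,
    both sampled on a grid with step dt."""
    aggregate = np.sum(w * x) * dt        # temporal aggregation (Riemann sum)
    return np.tanh(aggregate - theta)     # activation f = tanh (an assumption)

# Example: constant input signal and constant weight on [0, 1]
t = np.arange(0.0, 1.0, 0.01)
x = np.ones_like(t)       # input process x(t) = 1
w = np.ones_like(t)       # weight function w(t) = 1
y = process_neuron(x, w)  # integral is 1.0, so y = tanh(1)
```

A conventional neuron is the special case where `x` and `w` are single numbers and the integral collapses to an ordinary weighted sum; the temporal aggregation step is what extends the input-output mapping from vectors to functions.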

Preface
Table of Contents
1 Introduction
1.1 Development of Artificial Intelligence
1.2 Characteristics of Artificial Intelligent System
1.3 Computational Intelligence
1.3.1 Fuzzy Computing
1.3.2 Neural Computing
1.3.3 Evolutionary Computing
1.3.4 Combination of the Three Branches
1.4 Process Neural Networks
References
2 Artificial Neural Networks
2.1 Biological Neuron
2.2 Mathematical Model of a Neuron
2.3 Feedforward/Feedback Neural Networks
2.3.1 Feedforward/Feedback Neural Network Model
2.3.2 Function Approximation Capability of Feedforward Neural Networks
2.3.3 Computing Capability of Feedforward Neural Networks
2.3.4 Learning Algorithm for Feedforward Neural Networks
2.3.5 Generalization Problem for Feedforward Neural Networks
2.3.6 Applications of Feedforward Neural Networks
2.4 Fuzzy Neural Networks
2.4.1 Fuzzy Neurons
2.4.2 Fuzzy Neural Networks
2.5 Nonlinear Aggregation Artificial Neural Networks
2.5.1 Structural Formula Aggregation Artificial Neural Networks
2.5.2 Maximum (or Minimum) Aggregation Artificial Neural Networks
2.5.3 Other Nonlinear Aggregation Artificial Neural Networks
2.6 Spatio-temporal Aggregation and Process Neural Networks
2.7 Classification of Artificial Neural Networks
References
3 Process Neurons
3.1 Revelation of Biological Neurons
3.2 Definition of Process Neurons
3.3 Process Neurons and Functionals
3.4 Fuzzy Process Neurons
3.4.1 Process Neuron Fuzziness
3.4.2 Fuzzy Process Neurons Constructed Using the Fuzzy Weighted Reasoning Rule
3.5 Process Neurons and Compound Functions
References
4 Feedforward Process Neural Networks
4.1 Simple Model of a Feedforward Process Neural Network
4.2 A General Model of a Feedforward Process Neural Network
4.3 A Process Neural Network Model Based on Weight Function Basis Expansion
4.4 Basic Theorems of Feedforward Process Neural Networks
4.4.1 Existence of Solutions
4.4.2 Continuity
4.4.3 Functional Approximation Property
4.4.4 Computing Capability
4.5 Structural Formula Feedforward Process Neural Networks
4.5.1 Structural Formula Process Neurons
4.5.2 Structural Formula Process Neural Network Model
4.6 Process Neural Networks with Time-varying Functions as Inputs and Outputs
4.6.1 Network Structure
4.6.2 Continuity and Approximation Capability of the Model
4.7 Continuous Process Neural Networks
4.7.1 Continuous Process Neurons
4.7.2 Continuous Process Neural Network Model
4.7.3 Continuity, Approximation Capability, and Computing Capability of the Model
4.8 Functional Neural Network
4.8.1 Functional Neuron
4.8.2 Feedforward Functional Neural Network Model
4.9 Epilogue
References
5 Learning Algorithms for Process Neural Networks
5.1 Learning Algorithms Based on the Gradient Descent Method and Newton Descent Method
5.1.1 A General Learning Algorithm Based on Gradient Descent
5.1.2 Learning Algorithm Based on Gradient-Newton Combination
5.1.3 Learning Algorithm Based on the Newton Descent Method
5.2 Learning Algorithm Based on Orthogonal Basis Expansion
5.2.1 Orthogonal Basis Expansion of Input Functions
5.2.2 Learning Algorithm Derivation
5.2.3 Algorithm Description and Complexity Analysis
5.3 Learning Algorithm Based on the Fourier Function Transformation
5.3.1 Fourier Orthogonal Basis Expansion of the Function in L2[0, 2π]
5.3.2 Learning Algorithm Derivation
5.4 Learning Algorithm Based on the Walsh Function Transformation
5.4.1 Learning Algorithm Based on Discrete Walsh Function Transformation
5.4.2 Learning Algorithm Based on Continuous Walsh Function Transformation
5.5 Learning Algorithm Based on Spline Function Fitting
5.5.1 Spline Function
5.5.2 Learning Algorithm Derivation
5.5.3 Analysis of the Adaptability and Complexity of a Learning Algorithm
5.6 Learning Algorithm Based on Rational Square Approximation and Optimal Piecewise Approximation
5.6.1 Learning Algorithm Based on Rational Square Approximation
5.6.2 Learning Algorithm Based on Optimal Piecewise Approximation
5.7 Epilogue
References
6 Feedback Process Neural Networks
6.1 A Three-Layer Feedback Process Neural Network
6.1.1 Network Structure
6.1.2 Learning Algorithm
6.1.3 Stability Analysis
6.2 Other Feedback Process Neural Networks
6.2.1 Feedback Process Neural Network with Time-varying Functions as Inputs and Outputs
6.2.2 Feedback Process Neural Network for Pattern Classification
6.2.3 Feedback Process Neural Network for Associative Memory Storage
6.3 Application Examples
References
7 Multi-aggregation Process Neural Networks
7.1 Multi-aggregation Process Neuron
7.2 Multi-aggregation Process Neural Network Model
7.2.1 A General Model of Multi-aggregation