
Neural Networks

Class at Faculty of Mathematics and Physics |
NAIL002

Syllabus

1. Introduction to the area of artificial neural networks. The biological neuron and neural networks, transmission of signals in axons and synapses, information processing in neurons, main parts of the brain. History and fundamental principles of artificial neural networks. Adaptation and learning, a formal description of patterns. Selection and ordering of pattern features, selection and ordering of training patterns.

2. Early models of artificial neural networks. The model of a formal neuron, weights, potential, transfer function. Main types of artificial neural networks. Connectionism, training and recall, supervised learning and self-organization, knowledge extraction, generalization and robustness. Perceptron and linear separability, the separating hyperplane. The perceptron training algorithm and its convergence, the pocket algorithm.
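The perceptron training rule mentioned above can be sketched in a few lines of NumPy. This is an illustrative example, not course material: the function name and the AND-function data set are chosen here only to show the weight-update rule (add the pattern to the weights whenever it is misclassified) and its convergence on linearly separable data.

```python
import numpy as np

def perceptron_train(X, y, epochs=100):
    """Classic perceptron rule for linearly separable data.
    X: (n, d) inputs, y: labels in {-1, +1}. The bias is folded in
    by appending a constant 1 to every input pattern."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # add bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:   # misclassified (or on the hyperplane)
                w += yi * xi         # move the hyperplane toward the pattern
                errors += 1
        if errors == 0:              # converged: all patterns separated
            break
    return w

# the AND function, which is linearly separable
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w = perceptron_train(X, y)
preds = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)
```

By the perceptron convergence theorem, the loop terminates after finitely many updates whenever a separating hyperplane exists; the pocket algorithm extends this to non-separable data by keeping the best weight vector seen so far.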

3. Feed-forward neural networks and error back-propagation. The back-propagation training algorithm and the derivation of the weight adjustment rules. Training, test and validation sets, various training strategies. Internal knowledge representation, generalization, over-fitting and over-sizing, the Vapnik-Chervonenkis dimension. Kolmogorov's theorem, function approximation, complexity of the learning problem. Main areas and principles for applications of feed-forward neural networks.
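The derivation of the weight adjustment rules comes from applying the chain rule layer by layer. A minimal sketch (the network shape and variable names are illustrative assumptions) for one hidden sigmoid layer with a linear output, verified against a numerical gradient:

```python
import numpy as np

def forward(W1, W2, x):
    """One hidden layer of sigmoid units, linear output."""
    h = 1.0 / (1.0 + np.exp(-W1 @ x))   # hidden activations
    return W2 @ h, h

def backprop_grads(W1, W2, x, t):
    """Gradients of the squared error E = 0.5*(y - t)^2, obtained by
    back-propagating the delta terms through the chain rule."""
    y, h = forward(W1, W2, x)
    delta_out = y - t                              # dE/dy (linear output)
    gW2 = np.outer(delta_out, h)
    delta_hid = (W2.T @ delta_out) * h * (1 - h)   # sigmoid derivative h(1-h)
    gW1 = np.outer(delta_hid, x)
    return gW1, gW2

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(3, 2)), rng.normal(size=(1, 3))
x, t = np.array([0.5, -1.0]), np.array([1.0])

# finite-difference check of a single weight's gradient
eps = 1e-6
gW1, gW2 = backprop_grads(W1, W2, x, t)
Wp = W1.copy(); Wp[0, 0] += eps
Wm = W1.copy(); Wm[0, 0] -= eps
num = (0.5 * (forward(Wp, W2, x)[0] - t) ** 2
       - 0.5 * (forward(Wm, W2, x)[0] - t) ** 2) / (2 * eps)
```

The finite-difference comparison at the end is the standard sanity check for a hand-derived back-propagation rule: the analytic and numerical gradients should agree to several decimal places.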

4. Associative networks. Recurrent neural networks, Hebbian learning, memory capacity, attractors, the energy function and convergence to stable states. Associative memories, bidirectional associative memories (BAM), the Hopfield model, the continuous Hopfield model, simulated annealing, the Boltzmann machine. Hopfield networks in the search for suboptimal solutions of NP-complete problems.
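The interplay of Hebbian storage, the energy function, and convergence to attractors can be seen in a small Hopfield network. The following sketch (pattern and sizes are illustrative assumptions) stores one pattern and recalls it from a corrupted input; asynchronous updates never increase the energy, so the state settles into a stable attractor:

```python
import numpy as np

def hopfield_weights(patterns):
    """Hebbian storage: sum of outer products of the stored patterns,
    scaled by the dimension, with a zero diagonal (no self-connections)."""
    d = patterns.shape[1]
    W = patterns.T @ patterns / d
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    """Hopfield energy E(s) = -1/2 * s^T W s."""
    return -0.5 * s @ W @ s

def recall(W, s, steps=100, seed=1):
    """Asynchronous updates of randomly chosen units; each update can
    only lower (or keep) the energy, so the state converges."""
    s = s.copy()
    rng = np.random.default_rng(seed)
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

pattern = np.array([1, 1, -1, -1, 1, -1, 1, -1], dtype=float)
W = hopfield_weights(pattern[None, :])
noisy = pattern.copy(); noisy[0] *= -1       # flip one bit
restored = recall(W, noisy)
```

With a single stored pattern the corrupted bit is repaired and the energy of the restored state is strictly lower than that of the noisy input, illustrating why stored patterns act as attractors.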

5. Self-organization and hybrid models. Unsupervised learning - Oja's algorithm for PCA. Kohonen self-organizing feature maps and algorithms for their training, lateral inhibition, topological neighborhood. Counter-propagation neural networks, RBF networks, Adaptive Resonance Theory (ART). Cascade correlation and modular neural networks - mixtures of local experts.
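Oja's algorithm is a small modification of the plain Hebbian rule whose decay term keeps the weight vector normalized, so it converges to the first principal component of the data. A minimal sketch (the learning rate, synthetic data set, and function name are illustrative assumptions):

```python
import numpy as np

def oja_pca(X, lr=0.005, epochs=20, seed=0):
    """Oja's rule: dw = lr * y * (x - y*w), with y = w.x.
    The -lr*y^2*w decay term keeps ||w|| near 1, and w converges to
    the dominant eigenvector of the covariance of the centered data."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = w @ x
            w += lr * y * (x - y * w)
    return w

# synthetic 2-D data whose dominant variance lies along a 45° axis
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 2)) * np.array([3.0, 0.3])
R = np.array([[1, -1], [1, 1]]) / np.sqrt(2)    # rotate by 45 degrees
X = X @ R.T
w = oja_pca(X - X.mean(axis=0))
```

After training, `w` points (up to sign) along the direction of greatest variance, which is exactly what batch PCA would return as the first principal component.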

6. Genetic algorithms. Coding of the optimization problem, a population of strings, the fundamental genetic operators - selection, crossover, mutation. The fitness function. Convergence analysis - the schema theorem. Applications of genetic algorithms in the field of artificial neural networks.
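The three genetic operators can be shown together on the classic OneMax toy problem (maximize the number of 1-bits in a string). Everything here - population size, tournament selection, one-point crossover, bit-flip mutation rate - is an illustrative sketch, not a prescription from the course:

```python
import numpy as np

def genetic_onemax(n_bits=20, pop_size=40, gens=60, pmut=0.02, seed=0):
    """Minimal genetic algorithm on OneMax: fitness = number of 1-bits.
    Tournament selection, one-point crossover, bit-flip mutation."""
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    fitness = lambda p: p.sum(axis=1)
    for _ in range(gens):
        f = fitness(pop)
        # tournament selection: keep the fitter of two random individuals
        a, b = rng.integers(pop_size, size=(2, pop_size))
        parents = np.where((f[a] > f[b])[:, None], pop[a], pop[b])
        # one-point crossover between consecutive parent pairs
        cut = rng.integers(1, n_bits, size=pop_size // 2)
        children = parents.copy()
        for k, c in enumerate(cut):
            i, j = 2 * k, 2 * k + 1
            children[i, c:], children[j, c:] = parents[j, c:], parents[i, c:]
        # bit-flip mutation with per-bit probability pmut
        flips = rng.random(children.shape) < pmut
        pop = np.where(flips, 1 - children, children)
    return pop, fitness(pop).max()

pop, best = genetic_onemax()
```

Selection pressure drives the population toward the all-ones string, while mutation maintains diversity; the schema theorem describes how short, low-order, above-average schemata receive exponentially increasing trials under these operators.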

Annotation

The theory of neural networks is motivated by results achieved in research on the central nervous system. These findings often serve as the origin of derived mathematical models which, despite significant simplifications of real neuro-physiological processes, exhibit some features of natural intelligence.

These models can be used in the design of non-traditional computational tools applied to the solution of many practical problems.