06CS756 - Neural Networks
PART A
UNIT 1
INTRODUCTION: What is a Neural Network?, Human Brain, Models of a Neuron, Neural Networks viewed as directed graphs, Feedback, Network Architectures, Knowledge representation, Artificial Intelligence and Neural Networks.
UNIT 2
LEARNING PROCESSES 1: Introduction, Error-correction learning, Memory-based learning, Hebbian learning, Competitive learning, Boltzmann learning, Credit Assignment problem, Learning with a Teacher, Learning without a Teacher, Learning tasks, Memory, Adaptation.
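Of the learning rules listed above, the Hebbian rule is the simplest to state: a weight grows when its input and the neuron's output are active together, Δw = η·x·y. The sketch below illustrates it for a single linear neuron; the function name, learning rate, initial weights, and input pattern are all made up for illustration.

```python
# Minimal sketch of the plain Hebbian rule, dw_i = eta * x_i * y,
# for one linear neuron (toy values, not from the syllabus).

def hebbian_update(w, x, eta=0.1):
    """One Hebbian step: strengthen weights when input and output co-activate."""
    y = sum(wi * xi for wi, xi in zip(w, x))        # linear neuron output
    return [wi + eta * xi * y for wi, xi in zip(w, x)]

w = [0.2, 0.1]                                      # small nonzero start
for _ in range(3):
    w = hebbian_update(w, [1.0, 0.5])               # repeated co-activation
# the repeatedly presented pattern is progressively reinforced
```

Note the rule has no error signal and no teacher, which is why the unbounded growth it produces is usually tamed in practice (e.g. by normalization), a point the learning-without-a-teacher topics above touch on.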
UNIT 3
LEARNING PROCESSES 2, SINGLE LAYER PERCEPTRONS: Statistical nature of the learning process, Statistical learning theory, Probably approximately correct model of learning. Single Layer Perceptrons: Introduction, Adaptive filtering problem, Unconstrained optimization techniques, Linear least-squares filters, Least-mean-square algorithm, Learning curves, Learning-rate annealing techniques, Perceptron, Perceptron convergence theorem, Relation between the Perceptron and Bayes classifier for a Gaussian environment.
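The least-mean-square (Widrow-Hoff) algorithm listed above is a one-line error-correction update, w ← w + η·e·x with e = d − wᵀx. A minimal sketch on an assumed linear target (the data, learning rate, and epoch count are illustrative choices):

```python
# Sketch of the LMS rule on a realizable linear target d = 2*x1 - x2
# (target and hyperparameters assumed for illustration).

def lms_step(w, x, d, eta=0.05):
    y = sum(wi * xi for wi, xi in zip(w, x))        # filter output
    e = d - y                                       # error drives the correction
    return [wi + eta * e * xi for wi, xi in zip(w, x)], e

data = [([1.0, 0.0], 2.0), ([0.0, 1.0], -1.0), ([1.0, 1.0], 1.0)]
w = [0.0, 0.0]
for _ in range(200):                                # repeated passes shrink the error
    for x, d in data:
        w, e = lms_step(w, x, d)
# w converges toward the generating weights [2, -1]
```

Because the target here is exactly linear in the inputs, the error contracts every epoch; the learning-curve and learning-rate-annealing topics above describe how this behaves when it is not.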
UNIT 4
MULTILAYER PERCEPTRONS 1: Introduction, Some preliminaries, Back-propagation algorithm, Summary of the back-propagation algorithm, XOR problem, Heuristics for making the back-propagation algorithm perform better, Output representation and decision rule, Computer experiment, Feature detection, Back-propagation and differentiation.
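The core of the back-propagation algorithm is one backward pass: compute a local gradient at the output, propagate it to the hidden layer through the outgoing weights, and step every weight downhill. A sketch for a tiny 2-2-1 sigmoid network on a single XOR pattern; the network size, learning rate, and random initialization are illustrative, not prescribed.

```python
import math
import random

# One back-propagation step for a 2-2-1 sigmoid network on a single
# XOR pattern (architecture and constants assumed for illustration).

def sigmoid(a):
    return 1.0 / (1.0 + math.exp(-a))

def forward(x, W1, b1, W2, b2):
    h = [sigmoid(sum(W1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(2)]
    y = sigmoid(sum(W2[j] * h[j] for j in range(2)) + b2)
    return h, y

def backprop_step(x, d, W1, b1, W2, b2, eta=0.5):
    h, y = forward(x, W1, b1, W2, b2)
    delta_out = (y - d) * y * (1 - y)               # output-layer local gradient
    # hidden local gradients use the OLD hidden->output weights
    delta_h = [delta_out * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
    for j in range(2):
        W2[j] -= eta * delta_out * h[j]             # hidden->output weights
        b1[j] -= eta * delta_h[j]
        for i in range(2):
            W1[j][i] -= eta * delta_h[j] * x[i]     # input->hidden weights
    return b2 - eta * delta_out                     # b2 is a float, return updated

random.seed(0)                                      # reproducible toy init
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
W2 = [random.uniform(-1, 1) for _ in range(2)]
b1, b2 = [0.0, 0.0], 0.0

x, d = [1.0, 0.0], 1.0                              # one XOR pattern
_, y_before = forward(x, W1, b1, W2, b2)
b2 = backprop_step(x, d, W1, b1, W2, b2)
_, y_after = forward(x, W1, b1, W2, b2)
# a single small step moves the output toward the target
```

Solving the full XOR problem requires iterating such steps over all four patterns; the heuristics topic above covers the choices (learning rate, initialization, momentum) that make that iteration behave well.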
PART B
UNIT 5
MULTILAYER PERCEPTRONS 2: Hessian matrix, Generalization, Approximation of functions, Cross-validation, Network pruning techniques, Virtues and limitations of back-propagation learning, Accelerated convergence of back-propagation learning, Supervised learning viewed as an optimization problem, Convolutional networks.
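Network pruning removes weights that contribute little, trading a small accuracy loss for better generalization and a smaller model. The sketch below uses simple magnitude-based pruning as a stand-in for the techniques the unit covers (Hessian-based criteria rank weights by estimated loss sensitivity instead of raw magnitude); the function and data are illustrative.

```python
# Illustrative magnitude-based pruning: zero out the fraction of
# weights with the smallest absolute value (toy stand-in for the
# unit's pruning techniques).

def prune_smallest(weights, fraction=0.5):
    """Return a copy with the smallest-|w| fraction of weights set to 0."""
    k = int(len(weights) * fraction)                # number to remove
    threshold = sorted(abs(w) for w in weights)[k - 1] if k else -1.0
    return [0.0 if abs(w) <= threshold else w for w in weights]

pruned = prune_smallest([0.9, -0.05, 0.4, 0.01], fraction=0.5)
# the two smallest-magnitude weights are zeroed, the rest survive
```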
UNIT 6
RADIAL-BASIS FUNCTION NETWORKS 1: Introduction, Cover's theorem on the separability of patterns, Interpolation problem, Supervised learning as an ill-posed hypersurface reconstruction problem, Regularization theory, Regularization networks, Generalized radial-basis function networks, XOR problem, Estimation of the regularization parameter.
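The interpolation problem above has a closed form: place one radial basis function on each training point and solve the linear system Φw = d, so the resulting surface passes through every sample exactly. A sketch on toy 1-D data, with an assumed Gaussian basis of width σ = 1 and a small hand-rolled linear solver:

```python
import math

# Exact RBF interpolation: one Gaussian centre per training point,
# weights from solving Phi w = d (toy data, sigma assumed to be 1).

def phi(x, c, sigma=1.0):
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]                 # pivot for stability
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):                  # back-substitution
        w[r] = (M[r][n] - sum(M[r][c] * w[c] for c in range(r + 1, n))) / M[r][r]
    return w

xs, ds = [0.0, 1.0, 2.0], [0.0, 1.0, 0.0]           # points to interpolate
Phi = [[phi(x, c) for c in xs] for x in xs]          # interpolation matrix
w = solve(Phi, ds)
f = lambda x: sum(wi * phi(x, c) for wi, c in zip(w, xs))
# f now reproduces every training value exactly
```

Fitting noisy data exactly like this is precisely the ill-posedness the unit's regularization topics address: the regularization parameter trades exact interpolation for a smoother reconstruction.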
UNIT 7
RADIAL-BASIS FUNCTION NETWORKS 2, OPTIMIZATION METHODS 1: Approximation properties of RBF networks, Comparison of RBF networks and multilayer perceptrons, Kernel regression and its relation to RBF networks, Learning strategies, Computer experiment. Optimization using Hopfield networks: Traveling salesperson problem, Solving simultaneous linear equations, Allocating documents to multiprocessors.
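The kernel-regression connection above is usually stated through the Nadaraya-Watson estimator: predict with a kernel-weighted average of the training outputs, which has the same form as a normalized RBF network. A sketch with a Gaussian kernel; the bandwidth and toy data are assumed for illustration.

```python
import math

# Nadaraya-Watson kernel regression sketch (Gaussian kernel; the
# bandwidth h and the toy data are illustrative assumptions).

def nw_estimate(x, xs, ys, h=0.5):
    """Kernel-weighted average of training outputs around query x."""
    weights = [math.exp(-((x - xi) ** 2) / (2 * h ** 2)) for xi in xs]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 1.0, 4.0, 9.0]
y_hat = nw_estimate(1.5, xs, ys)
# the estimate is a convex combination of the ys, dominated by the
# two nearest samples (y = 1 and y = 4)
```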
UNIT 8
OPTIMIZATION METHODS 2: Iterated gradient descent, Simulated annealing, Random search, Evolutionary computation: Evolutionary algorithms, Initialization, Termination criterion, Reproduction, Operators, Replacement, Schema theorem.
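Simulated annealing, listed above, escapes local minima by sometimes accepting uphill moves, with probability exp(-ΔE/T) under a falling temperature T. A sketch minimizing a made-up 1-D cost; the cooling schedule, move size, and seed are illustrative choices, not prescribed by the syllabus.

```python
import math
import random

# Simulated-annealing sketch on a toy 1-D cost function
# (schedule, neighbourhood, and seed assumed for illustration).

def anneal(cost, x0, T0=1.0, alpha=0.95, steps=500):
    random.seed(1)                                  # reproducible run
    x, T = x0, T0
    best = x
    for _ in range(steps):
        cand = x + random.uniform(-0.5, 0.5)        # random neighbour
        dE = cost(cand) - cost(x)
        # accept downhill always, uphill with probability exp(-dE/T)
        if dE < 0 or random.random() < math.exp(-dE / T):
            x = cand
        if cost(x) < cost(best):
            best = x                                # track the incumbent
        T *= alpha                                  # geometric cooling
    return best

cost = lambda x: (x - 2.0) ** 2 + 0.5 * math.sin(5 * x)
x_star = anneal(cost, x0=-3.0)
# the returned incumbent never costs more than the starting point
```

Early on, high T makes uphill acceptances common (broad exploration); as T decays the process behaves like the iterated gradient descent listed alongside it, settling into a basin.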
TEXT BOOKS:
1. Simon Haykin, "Neural Networks: A Comprehensive Foundation", 2nd Edition, Pearson Education, 1999.
2. Kishan Mehrotra, Chilkuri K. Mohan, Sanjay Ranka, "Artificial Neural Networks", Penram International Publishing, 1997.

REFERENCE BOOKS:
1. B. Yegnanarayana, "Artificial Neural Networks", PHI, 2001.