NEURAL NETWORK


BCSE0706: NEURAL NETWORK

Objective: Students will gain insight into neural network approaches and acquire the mathematical background needed to carry out optimization.

Credits: 03 L–T–P–J: 3–0–0–0


Module I (Teaching Hours: 20)

Neural Networks: Fundamentals of Artificial Neural Networks (ANN), Human Brain, Supervised vs. Unsupervised Learning, Models of ANN, ANN Models: Rosenblatt's Perceptron, McCulloch & Pitts Model, Single-Layer Perceptron, Learning Methods in the Perceptron, Linearly Separable Tasks and the XOR Problem, Multi-Layer Perceptron, ADALINE, MADALINE Network (see the perceptron sketch below).

Multilayered Neural Architecture: Activation Functions (ReLU, Sigmoid, tanh, etc.), Gradient Descent, Back-Propagation (BP) Learning, Learning Rate, Momentum, Loss Functions in Neural Networks, Optimizers in Neural Networks, Hessian Matrix, Heuristics for Making the Back-Propagation Algorithm Perform Better, Applications of Neural Networks, Radial Basis Function Neural Network.

Probabilistic Models: Probabilistic Models, Maximum Likelihood Estimation, Linear Regression, Logistic Regression.
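To make the single-layer perceptron and its learning method concrete, the following is a minimal NumPy sketch of the perceptron learning rule on a linearly separable task. The AND dataset, learning rate, and epoch count are illustrative assumptions, not part of the syllabus.

```python
import numpy as np

# Single-layer perceptron with a step activation, trained by the perceptron learning rule.
# Logical AND is linearly separable, so the rule converges; XOR is not, and would not.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate (illustrative choice)

for epoch in range(20):
    for xi, target in zip(X, y):
        pred = 1 if np.dot(w, xi) + b >= 0 else 0   # step activation
        error = target - pred
        w += lr * error * xi                         # perceptron weight update
        b += lr * error

print("weights:", w, "bias:", b)
print("predictions:", [1 if np.dot(w, xi) + b >= 0 else 0 for xi in X])
```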
Module II (Teaching Hours: 20)

Deep Learning: AI vs. ML vs. DL, Deep Learning Capabilities, Data Pre-processing, Training and Testing, Confusion Matrix, Bias and Variance, Vanishing Gradients.

Convolutional Neural Network (CNN): Basic Terminologies in CNN (Convolution, Filter/Kernel, Padding, Pooling), Architecture of CNN, Different Models of CNN, Classification of MNIST Dataset Images with CNN (see the CNN sketch below).

Recurrent Neural Networks (RNN): Sequential Processing, Stability, Gated Nets: LSTM (Long Short-Term Memory) and GRU (Gated Recurrent Unit), Examples.
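The MNIST classification topic above can be illustrated with a minimal tf.keras sketch (Keras/TensorFlow is assumed here only because the Géron reference uses it; the syllabus does not prescribe a framework). The layer sizes, optimizer, and epoch count are illustrative assumptions, not a required architecture.

```python
import tensorflow as tf

# Load MNIST, scale pixels to [0, 1], and add a channel axis for Conv2D.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

# A small CNN: convolution + pooling layers extract features, dense layers classify.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # one output per digit class
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=3, validation_split=0.1)
print(model.evaluate(x_test, y_test))
```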

Text Books:
 Simon Haykin, Neural Networks and Learning Machines, 3rd Edition, Pearson Education India, 2010.
 Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, MIT Press, 2016. http://www.deeplearningbook.org

Reference Books:
 Aurélien Géron, Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow, 2nd Edition, O'Reilly Media, 2019.
 Joel Grus, Data Science from Scratch: First Principles with Python, 2nd Edition, O'Reilly Media, 2019.
 S. Rajasekaran and G. A. Vijayalakshmi Pai, “Neural Networks, Fuzzy Logic and Genetic Algorithms: Synthesis and Applications”, 4th Edition, Prentice Hall of India, 2003.

Outcome: After completion of the course, students will be able to:


S. No. CO Objective
1 CO1 Understand the differences between networks for supervised and unsupervised learning

2 CO2 Design single- and multi-layer perceptron neural networks

3 CO3 Understand back-propagation in non-linear neural network architectures

4 CO4 Understand Convolutional Neural Networks and Recurrent Neural Networks


BCSE0736: NEURAL NETWORK LAB

Objective: Students will gain insight into neural network approaches and acquire the mathematical background needed to carry out optimization.

Credits: 01 L–T–P–J: 0–0–2–0


Module I, II (Teaching Hours: 24)

 Programming setup
 Python library introduction
 Plot activation functions and their derivatives
 Generate AND NOT function using McCulloch-Pitts neural net (see the sketch after this list)
 Generate XOR function using McCulloch-Pitts neurons
 Linear Regression, Logistic Regression
 Back-Propagation Network for XOR function
 Multi-Layer Perceptron
 Training CNN
 Training RNN
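As a starting point for the McCulloch-Pitts exercise above, here is a minimal NumPy sketch of the AND NOT function. The weight and threshold values (+1 excitatory, -1 inhibitory, threshold 1) are the usual textbook choice and are used here as an illustrative assumption.

```python
import numpy as np

def mcculloch_pitts(x, weights, threshold):
    """Fire (output 1) when the weighted sum of inputs reaches the threshold."""
    return 1 if np.dot(weights, x) >= threshold else 0

# AND NOT: y = x1 AND (NOT x2).
# Excitatory weight +1 on x1, inhibitory weight -1 on x2, threshold 1.
weights = np.array([1, -1])
threshold = 1

for x1 in (0, 1):
    for x2 in (0, 1):
        y = mcculloch_pitts(np.array([x1, x2]), weights, threshold)
        print(f"x1={x1}, x2={x2} -> AND NOT = {y}")
```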

Text Books:
 Simon Haykin, Neural Networks and Learning Machines, 3rd Edition, Pearson Education India, 2010.
 Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, MIT Press, 2016. http://www.deeplearningbook.org

Reference Books:
 Aurélien Géron, Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow, 2nd Edition, O'Reilly Media, 2019.
 Joel Grus, Data Science from Scratch: First Principles with Python, 2nd Edition, O'Reilly Media, 2019.
 S. Rajasekaran and G. A. Vijayalakshmi Pai, “Neural Networks, Fuzzy Logic and Genetic Algorithms: Synthesis and Applications”, 4th Edition, Prentice Hall of India, 2003.
