Aids Cis Final
SYLLABUS:
UNIT I: INTRODUCTION TO MACHINE LEARNING (8 hours)
Review of Linear Algebra for machine learning; Introduction and motivation for machine learning; Examples of machine learning applications, Vapnik-Chervonenkis (VC) dimension, Probably Approximately Correct (PAC) learning, Hypothesis spaces, Inductive bias, Generalization, Bias-variance trade-off.
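For the bias-variance trade-off listed in Unit I, the standard decomposition of expected squared error at a point x is a useful reference formula (assuming y = f(x) + ε with zero-mean noise of variance σ², and f̂ the learned predictor):

    \[
    \mathbb{E}\big[(y - \hat{f}(x))^2\big]
      = \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{bias}^2}
      + \underbrace{\mathbb{E}\big[\big(\hat{f}(x) - \mathbb{E}[\hat{f}(x)]\big)^2\big]}_{\text{variance}}
      + \underbrace{\sigma^2}_{\text{irreducible noise}}
    \]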
UNIT II: SUPERVISED LEARNING (11 hours)
Linear Regression Models: Least squares, single & multiple variables, Bayesian linear regression, gradient descent; Linear Classification Models: Discriminant function – Perceptron algorithm, Probabilistic discriminative model – Logistic regression, Probabilistic generative model – Naïve Bayes, Maximum margin classifier – Support vector machine, Decision Tree, Random Forests.
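As an illustrative sketch of the least-squares and gradient-descent topics in Unit II, a minimal NumPy example (the data, variable names, and learning rate are made-up choices for demonstration, not part of the syllabus):

    import numpy as np

    # Toy data: y = X @ w_true + noise (values are illustrative only)
    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(100), rng.normal(size=(100, 2))])  # intercept column + 2 features
    w_true = np.array([1.0, 2.0, -0.5])
    y = X @ w_true + 0.1 * rng.normal(size=100)

    # Closed-form least-squares fit
    w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

    # The same fit by batch gradient descent on the mean squared error
    w = np.zeros(3)
    lr = 0.1
    for _ in range(500):
        grad = (2.0 / len(y)) * X.T @ (X @ w - y)  # gradient of the MSE with respect to w
        w -= lr * grad

    print(w_ls, w)  # both estimates should be close to w_true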
UNIT III: ENSEMBLE TECHNIQUES AND UNSUPERVISED LEARNING (9 hours)
Combining multiple learners: Model combination schemes, Voting, Ensemble Learning – bagging, boosting, stacking; Unsupervised learning: K-means, Instance Based Learning: KNN, Gaussian mixture models and Expectation maximization.
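A minimal sketch of K-means clustering (Lloyd's algorithm) from the Unit III topics; the toy data and parameter defaults are assumptions made for illustration:

    import numpy as np

    def kmeans(X, k, n_iter=100, seed=0):
        """Lloyd's algorithm: assign each point to its nearest centroid, then recompute centroids."""
        rng = np.random.default_rng(seed)
        centroids = X[rng.choice(len(X), size=k, replace=False)]
        labels = np.zeros(len(X), dtype=int)
        for _ in range(n_iter):
            dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)  # (n, k) distances
            labels = dists.argmin(axis=1)                                           # hard assignment
            new_centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                      else centroids[j] for j in range(k)])
            if np.allclose(new_centroids, centroids):
                break
            centroids = new_centroids
        return centroids, labels

    # Toy usage: two well-separated Gaussian blobs
    rng = np.random.default_rng(1)
    data = np.vstack([rng.normal(loc=0.0, size=(50, 2)), rng.normal(loc=5.0, size=(50, 2))])
    centroids, labels = kmeans(data, k=2)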
UNIT IV: NEURAL NETWORKS (9 hours)
Multilayer perceptron, activation functions, network training – gradient descent optimization – stochastic gradient descent, error backpropagation, from shallow networks to deep networks – Unit saturation (aka the vanishing gradient problem) – ReLU, hyperparameter tuning, batch normalization, regularization, dropout.
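An illustrative one-hidden-layer perceptron trained with stochastic gradient descent and hand-written error backpropagation, using the ReLU activation from Unit IV; the toy task, layer sizes, and learning rate are assumptions, not prescribed material:

    import numpy as np

    # Toy binary task (XOR-like sign pattern)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

    W1 = rng.normal(scale=0.5, size=(2, 16)); b1 = np.zeros(16)   # hidden layer weights (ReLU units)
    W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)    # output layer weights (sigmoid unit)
    lr = 0.1

    for epoch in range(100):
        for i in rng.permutation(len(X)):              # stochastic gradient descent, one example at a time
            x, t = X[i:i + 1], y[i:i + 1]
            h = np.maximum(0.0, x @ W1 + b1)           # ReLU hidden activations
            p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # predicted probability
            # Error backpropagation for the cross-entropy loss
            dout = p - t
            dW2, db2 = h.T @ dout, dout.sum(axis=0)
            dh = (dout @ W2.T) * (h > 0)               # gradient flows only through active ReLU units
            dW1, db1 = x.T @ dh, dh.sum(axis=0)
            W2 -= lr * dW2; b2 -= lr * db2
            W1 -= lr * dW1; b1 -= lr * db1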
UNIT V: DESIGN AND ANALYSIS OF MACHINE LEARNING EXPERIMENTS (8 hours)
Guidelines for machine learning experiments, Cross Validation (CV) and resampling – K-fold CV, bootstrapping, measuring classifier performance, assessing a single classification algorithm and comparing two classification algorithms – t test, McNemar’s test, K-fold CV paired t test.
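A small sketch of the K-fold CV paired t test from Unit V, comparing two classifiers via their per-fold accuracies; the accuracy values are hypothetical and SciPy is assumed to be available:

    import numpy as np
    from scipy import stats

    def kfold_cv_paired_ttest(acc_a, acc_b):
        """Paired t test on per-fold accuracies of two algorithms evaluated on the same K folds."""
        d = np.asarray(acc_a) - np.asarray(acc_b)      # per-fold accuracy differences
        k = len(d)
        t = d.mean() / (d.std(ddof=1) / np.sqrt(k))    # t statistic with k - 1 degrees of freedom
        p = 2 * stats.t.sf(abs(t), df=k - 1)           # two-sided p-value
        return t, p                                    # stats.ttest_rel(acc_a, acc_b) gives the same result

    # Hypothetical per-fold accuracies from a 10-fold CV run of two classifiers
    acc_a = [0.81, 0.79, 0.84, 0.80, 0.83, 0.78, 0.82, 0.80, 0.85, 0.79]
    acc_b = [0.77, 0.78, 0.80, 0.79, 0.81, 0.76, 0.80, 0.78, 0.82, 0.77]
    print(kfold_cv_paired_ttest(acc_a, acc_b))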
TOTAL HOURS 45
TEXT/REFERENCE BOOKS:
T/R BOOK TITLE/AUTHORS/PUBLICATION
T Ethem Alpaydin, “Introduction to Machine Learning”, MIT Press, Fourth Edition, 2020.
T Stephen Marsland, “Machine Learning: An Algorithmic Perspective”, Second Edition, CRC Press, 2014.
R Christopher M. Bishop, “Pattern Recognition and Machine Learning”, Springer, 2006.
R Tom Mitchell, “Machine Learning”, McGraw Hill, 1997.
R Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar, “Foundations of Machine Learning”, Second Edition, MIT Press, 2018.
R Ian Goodfellow, Yoshua Bengio, Aaron Courville, “Deep Learning”, MIT Press, 2016.
R Sebastian Raschka, Vahid Mirjalili, “Python Machine Learning”, Packt Publishing, 3rd Edition, 2019.
COURSE PRE-REQUISITES:
C.CODE COURSE NAME DESCRIPTION SEM
- - - -
COURSE OBJECTIVES:
1 To understand the basic concepts of machine learning.
COURSE OUTCOMES (WITH PO(1..12) & PSO(1..2) MAPPING):
C212.1: Students will be able to understand the fundamental concepts of machine learning and their applications in real-world problems.
        Mapping: PO1, PO2, PO3, PO4, PO9, PO10, PO11, PO12 & PSO1, PSO2
C212.2: Students will be able to evaluate supervised learning techniques, applying them adeptly to address both regression and classification tasks with efficacy and precision.
        Mapping: PO1, PO2, PO3, PO4, PO5, PO9, PO10, PO11, PO12 & PSO1, PSO2
C212.3: Students will be able to utilize diverse model combination schemes to enhance unsupervised learning techniques for effective pattern recognition, clustering, and predictive modeling tasks.
        Mapping: PO1, PO2, PO3, PO4, PO5, PO9, PO10, PO11, PO12 & PSO1, PSO2
C212.5: Students will be able to design and analyze machine learning experiments, employing Cross Validation and resampling methods to ensure robustness and reliability in model evaluation and performance estimation.
        Mapping: PO1, PO2, PO3, PO4, PO5, PO9, PO10, PO11, PO12 & PSO1, PSO2
COURSE OUTCOMES VS POS MAPPING (DETAILED; HIGH:3; MEDIUM:2; LOW:1):
SNO      PO1  PO2  PO3  PO4  PO5  PO6  PO7  PO8  PO9  PO10 PO11 PO12 PSO1 PSO2 PSO3
C212.1   2    1    2    1    -    -    -    -    3    3    2    2    2    2    2
C212.2   2    3    3    1    2    -    -    -    2    2    2    1    3    1    1
C212.3   2    1    3    3    2    -    -    -    1    1    1    1    1    2    2
Avg      2    2    3    2    1    -    -    -    2    2    2    1    2    2    2
(Course outcome statements for C212.1 to C212.3 are as listed under COURSE OUTCOMES above.)
* For the entire course, PO/PSO mapping: 1 (Low), 2 (Medium), 3 (High) contribution to PO/PSO.
DELIVERY/INSTRUCTIONAL METHODOLOGIES:
☐ CHALK & TALK ☐ STUD. ASSIGNMENT ☐ WEB RESOURCES ☐ NPTEL/OTHERS
☐ LCD/SMART BOARDS ☐ STUD. SEMINARS ☐ ADD-ON COURSES ☐ WEBINARS
ASSESSMENT METHODOLOGIES-DIRECT
☐ ASSIGNMENTS ☐ STUD. SEMINARS ☐ TESTS/MODEL EXAMS ☐ UNIV. EXAMINATION
☐ STUD. LAB PRACTICES ☐ STUD. VIVA ☐ MINI/MAJOR PROJECTS ☐ CERTIFICATIONS
☐ ADD-ON COURSES ☐ OTHERS
ASSESSMENT METHODOLOGIES-INDIRECT
☐ ASSESSMENT OF COURSE OUTCOMES (BY FEEDBACK, ONCE) ☐ STUDENT FEEDBACK ON FACULTY (TWICE)
☐ ASSESSMENT OF MINI/MAJOR PROJECTS BY EXT. EXPERTS ☐ OTHERS
Prepared by Approved by
(Faculty) (HOD)