
DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

COURSE PLAN-THEORY

Course Code AL3451

Course Name MACHINE LEARNING

Regulation 2021
Name of the Course Instructor(s) Mrs. A. Deepika
Name of the Course Coordinator
Academic Year: 2024-2025
Branch / Year / Semester CSE(AIML) / II / IV
Date of Commencement of Class
Date of Completion of Class
Revision No 0

Prepared By,                                             Approved By,

Mrs. A. Deepika, AP/CSE                                  Dr. C. Callins Christiyana, HoD/CSE

SYLLABUS

COURSE CODE | COURSE NAME | L | T | P | C
AL3451 | MACHINE LEARNING | 3 | 0 | 0 | 3
COURSE OBJECTIVES :
 To understand the basic concepts of machine learning.
 To understand and build supervised learning models.
 To understand and build unsupervised learning models.
 To evaluate the algorithms based on the corresponding metrics identified.
UNIT I INTRODUCTION TO MACHINE LEARNING 8
Review of Linear Algebra for machine learning; Introduction and motivation for machine learning;
Examples of machine learning applications, Vapnik-Chervonenkis (VC) dimension, Probably
Approximately Correct (PAC) learning, Hypothesis spaces, Inductive bias, Generalization, Bias variance
trade-off.
UNIT II SUPERVISED LEARNING 11
Linear Regression Models: Least squares, single & multiple variables, Bayesian linear regression,
gradient descent, Linear Classification Models: Discriminant function – Perceptron algorithm,
Probabilistic discriminative model - Logistic regression, Probabilistic generative model – Naive Bayes,
Maximum margin classifier – Support vector machine, Decision Tree, Random Forests
UNIT III ENSEMBLE TECHNIQUES AND UNSUPERVISED LEARNING 9
Combining multiple learners: Model combination schemes, Voting, Ensemble Learning - bagging,
boosting, stacking, Unsupervised learning: K-means, Instance Based Learning: KNN, Gaussian mixture
models and Expectation maximization.
UNIT IV NEURAL NETWORKS 9
Multilayer perceptron, activation functions, network training – gradient descent optimization – stochastic
gradient descent, error backpropagation, from shallow networks to deep networks –Unit saturation (aka
the vanishing gradient problem) – ReLU, hyper parameter tuning, batch normalization, regularization,
dropout.
UNIT V DESIGN AND ANALYSIS OF MACHINE LEARNING EXPERIMENTS 8
Guidelines for machine learning experiments, Cross Validation (CV) and resampling – K-fold CV,
bootstrapping, measuring classifier performance, assessing a single classification algorithm and
comparing two classification algorithms – t test, McNemar’s test, K-fold CV paired t test
TOTAL: 45 Periods
CONTENT BEYOND SYLLABUS:
Advanced Ensemble Methods: Gradient Boosting Machines (GBM), CatBoost and AdaBoost

COURSE OUTCOMES:
After the successful completion of this course, the student will be able to:
CO1: Explain the basic concepts of machine learning.
CO2: Construct supervised learning models.
CO3: Construct unsupervised learning algorithms.
CO4: Evaluate and compare different models.

TEXT BOOKS:
T1: Ethem Alpaydin, “Introduction to Machine Learning”, MIT Press, Fourth Edition, 2020.
T2: Stephen Marsland, “Machine Learning: An Algorithmic Perspective”, Second Edition, CRC
Press, 2014.

REFERENCE BOOKS/LINKS:
R1: Christopher M. Bishop, “Pattern Recognition and Machine Learning”, Springer, 2006.
R2: Tom Mitchell, “Machine Learning”, McGraw Hill, 3rd Edition, 1997.
R3: Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar, “Foundations of Machine
Learning”, Second Edition, MIT Press, 2018.
R4: Ian Goodfellow, Yoshua Bengio, Aaron Courville, “Deep Learning”, MIT Press, 2016.
R5: Sebastian Raschka, Vahid Mirjalili, “Python Machine Learning”, Packt Publishing, Third
Edition, 2019.
R6: https://www.geeksforgeeks.org/general-steps-to-follow-in-a-machine-learning-problem/

PLAN OF DELIVERY

Sl. No | Topic Covered | Ref. Book Code (Page No) | Hours | Cumulative Hours | Teaching Aid | Teaching Methodology (If any)

UNIT-I INTRODUCTION TO MACHINE LEARNING

1 | Review of Linear Algebra for machine learning | T2 (64-66) | 1 | 1 | BB | -
2 | Introduction and motivation for machine learning | T2 (4-5), T1 (1-3) | 1 | 2 | BB+PPT | -
3 | Examples of machine learning applications | T2 (10-11), T1 (3-11) | 1 | 3 | BB | Think Pair Share
4 | Vapnik-Chervonenkis (VC) dimension | T1 (22-23), R2 (215-217) | 1 | 4 | BB | -
5 | Probably Approximately Correct (PAC) learning | T1 (24-25), R2 (203-205) | 1 | 5 | BB | -
6 | Hypothesis spaces, Inductive bias | T1 (32-34), R2 (60-63) | 2 | 7 | BB | -
7 | Generalization, Bias variance trade-off | T2 (35-36) | 1 | 8 | BB+PPT | Think Pair Share

UNIT-II SUPERVISED LEARNING

8 | Introduction to Machine Learning | T1 (1-4) | 1 | 9 | BB | -
9 | Linear Regression Models | T1 (36-38), R1 (137-140) | 1 | 10 | BB | -
10 | Least Squares, Single & Multiple Variables | R1 (184-186) | 1 | 11 | BB | -
11 | Bayesian Linear Regression, Gradient Descent | R1 (152-156), R4 (177-190) | 1 | 12 | BB | Think Pair Share
12 | Linear Classification Models: Discriminant Function | R1 (181-190) | 1 | 13 | BB | -
13 | Perceptron Algorithm, Probabilistic Discriminative Model | R1 (203-212), T1 (233-239) | 2 | 15 | BB | -
14 | Logistic Regression | R1 (217-218), R3 (245-247) | 1 | 16 | BB+PPT | -
15 | Probabilistic Generative Model – Naive Bayes | R1 (196-202), R2 (300-305) | 1 | 17 | BB | -
16 | Maximum Margin Classifier – Support Vector Machine | T2 (170-183), R3 (63-75) | 1 | 18 | BB+PPT | -
17 | Decision Tree, Random Forests | T2 (249-260), T2 (275-277) | 1 | 19 | BB | -

UNIT-III ENSEMBLE TECHNIQUES AND UNSUPERVISED LEARNING

18 | Combining Multiple Learners: Model Combination Schemes | R1 (653-660) | 1 | 20 | BB | -
19 | Voting | R1 (339-340), T1 (354-356) | 1 | 21 | BB | -
20 | Ensemble Learning – Bagging, Boosting, Stacking | R1 (657-661), T1 (360-364) | 2 | 23 | BB | -
21 | Unsupervised Learning: K-Means | T1 (167-170), T2 (282-290) | 2 | 25 | BB+PPT | -
22 | Instance Based Learning: KNN | T1 (194-196), R2 (231-236) | 1 | 26 | BB+PPT | Think Pair Share
23 | Gaussian Mixture Models and Expectation Maximization | T1 (171-175), R2 (191-195) | 2 | 28 | BB | -

UNIT-IV NEURAL NETWORKS

24 | Multilayer Perceptron | R2 (86-88), T2 (43-47) | 1 | 29 | BB+PPT | -
25 | Activation Functions | R4 (174-175) | 1 | 30 | BB | -
26 | Network Training – Gradient Descent Optimization | R4 (177-178) | 2 | 32 | BB | -
27 | Stochastic Gradient Descent | R4 (294-296) | 1 | 33 | BB+PPT | -
28 | Error Backpropagation, From Shallow Networks to Deep Networks | R4 (85-108), R4 (204-220) | 1 | 34 | BB | -
29 | Unit Saturation (aka the Vanishing Gradient Problem) – ReLU | R4 (193-194) | 1 | 35 | BB | -
30 | Hyperparameter Tuning | R4 (428-430) | 1 | 36 | BB | Think Pair Share
31 | Batch Normalization, Regularization, Dropout | R4 (258-268) | 1 | 37 | BB | -

UNIT-V DESIGN AND ANALYSIS OF MACHINE LEARNING EXPERIMENTS

32 | Guidelines for Machine Learning Experiments | R6 (Web source) | 1 | 38 | BB+PPT | -
33-34 | Cross Validation (CV), Resampling – K-Fold CV | T1 (330-331) | 2 | 40 | BB | -
35 | Bootstrapping | T1 (332-333) | 1 | 41 | BB | -
36 | Measuring Classifier Performance | R1 (326-331) | 1 | 42 | BB | -
37 | Assessing a Single Classification Algorithm and Comparing Two Classification Algorithms – t Test | T1 (339-341) | 1 | 43 | BB+PPT | -
38 | McNemar’s Test | T1 (342-343) | 1 | 44 | BB | -
39 | K-Fold CV Paired t Test | T1 (343-344) | 1 | 45 | BB | Think Pair Share

CONTENT BEYOND SYLLABUS

40 | Advanced Ensemble Methods | Web source | 1 | 46 | PPT | -

ASSESSMENT PLAN

ASSESSMENT SCHEDULE-TEST

TEST NO. | PORTION FOR TEST | DATE PLANNED | DATE CONDUCTED
Internal Assessment Test I | UNIT 1, 2 | |
Internal Assessment Test II | UNIT 3, 4 | |
Open Book Test I | UNIT 1 | |
Open Book Test II | UNIT 3 | |

ASSESSMENT SCHEDULE-ASSIGNMENT

Assignment No | Mode | Group/Common/Individual | Date of Submission
I | Seminar | Individual |
II | Written | Group |

ASSESSMENT PATTERN

ITEM | WEIGHTAGE
Continuous Assessment-I | 40
Internal Assessment Test – I | 60
Continuous Assessment-II | 40
Internal Assessment Test – II | 60
Internal Assessment (overall) | 40
End Semester Examination | 60
Total | 100
